Channel: PXI topics
Viewing all 3418 articles

One question about "Low-Frequency Alias Rejection"


Hi,

In "Low-Frequency Alias Rejection" function, the sampling rate is forced to be never below 25.6kS/s, and the redundant data is discarded to keep the sampling rate user selects.

However, the cut-off frequency of the digital filter also increases.

For example, with "Low-Frequency Alias Rejection" disabled, if the sampling rate is set to 10 kS/s, the cut-off frequency of the digital filter will be 5 kHz.

With "Low-Frequency Alias Rejection" enabled, the sampling rate will be advanced to 40 kS/s, and the cut-off frequency of the digital filter will becomes 20 kHz.

Therefore, signal content between 5 and 20 kHz will be let through if "Low-Frequency Alias Rejection" is enabled.

As a result, for low-frequency signal measurement, you can either enable "Low-Frequency Alias Rejection" to filter high-frequency noise while tolerating the additional 5 to 20 kHz content it admits, or disable it to block that content while tolerating the high-frequency noise.

Am I right?

Thanks.


Error using start trigger on PXI-6143


Hi All,

 

I am writing a program that should synchronize an analog input and a digital output on a PXI-6143 using a Start Trigger. The trigger signal is a software trigger generated by an arbitrary waveform generator (PXI-5412). Please see the attached file for the program.

 

Somehow, if I use either the "DAQmx Timing (SampleClock).vi" or the "DAQmx Start Trigger (Digital Edge).vi", the program returns an error saying:

Error -200452 occurred at Property Node DAQmx Channel (arg1) in ...

 

The S Series manual doesn't mention whether the PXI-6143 supports a start trigger, so I am not sure if the device is really the problem or if I am making some mistake in the code.

 

If PXI-6143 doesn't support this function, could anyone please suggest a way to implement the code so that the start timing of the analog input and the digital output would be as close as possible?

 

Thank you very much!

 

DC Powered NI PXIe-1071


hello!

I am using an NI PXIe-1071. The input source is AC, but my requirement is to power the PXIe chassis from a DC input. Can you suggest any power distribution solutions, such as DC-DC converters or modules with multiple voltage levels, that meet the power requirements of the PXIe?

Why not PCI


I am looking for a few good reasons why I should upgrade my PCI-based tester to a PXI-based tester.

 

Some of the thoughts I have -

 

1. A wider variety of cards is available in the PXI line.

2. Higher speed.

 

thanks

jis

PXI-8156B Battery Failure and Hard Drive Restore


Hi,

 

I am working on restoring a very old system built around a PXI-8156B; snapshots are attached.

 

The system has a damaged OS. I want to restore it from a ghost image that was captured just before the final system was ready for operation.

The CPU module has a battery failure, which causes "Loading defaults" and thereby boots from C. I tried setting the boot sequence to "A,C" but had no success.

 

I would like to know how to replace the battery; there doesn't seem to be a removable one.

How can I restore the ghost image if the battery fix fails?

 

Kindly advise.

 

Thanks and regards,

 

Kunal

 

 

 

Computer BSOD when reading resistance with PXI-4070


I have a PXI/SCXI system that is connected to a Windows XP desktop using the MXI-Express 8360 card. I am using the PXI-1045 chassis and a PXI-4070 DMM. The SCXI chassis interfaces through the DMM card. My software makes calls to the nidmm_32.dll. I can initialize the DMM and set it up just fine, but whenever I try to read the card (using niDMM_read()), the desktop computer will get the Blue Screen of Death (IRQL_NOT_LESS_OR_EQUAL). I am also controlling a few SCXI-1127 modules through the DMM and all of those are working fine.

 

I have had this system up and running with no issues before, using the MXI-4 interface (same software). Could there be an issue with the new MXI-Express interface? I noticed a lot of available PCI-Express settings in the desktop BIOS (such as 8-bit indexing). Are there recommended settings from NI for the PCI-Express bus?

 

I have no issues when I call "niDMM_init",  "niDMM_ConfigureMeasurement", or "niDMM_reset". The error only seems to happen when I try to read using "niDMM_Read".

 

Has anyone else had this happen or know what might be causing it?

PXI-4461 Filter Delay


Hi,

I have some questions about the Filter Delays associated with the PCI or PXI-4461 card.

I do not have this card, but am interested in it, especially the 24-bit analog output function, to control a test apparatus I have. The problem I face now is that the test apparatus and sensors are sensitive enough to see each bit change in the 16-bit resolution analog control voltage I now use to control the apparatus. I like the sensitivity of my equipment, but do not like seeing the bit changes in the control signal in the resulting data.

I am concerned about how the filter delays inherent in 4461 operation (the roughly 3 millisecond delay between instructing the 4461 to output or input a signal and that signal actually responding, as I understand the filter delay) will affect the card's usefulness as a source of instrument control signals. I do not want millisecond delays interrupting the otherwise smooth control of the test apparatus. I am interested in learning what triggers a filter delay so it can be avoided if possible.

For example, I might generate an analog output signal to control my test apparatus that consists of several monotonically increasing linear voltage ramps with different slopes, each voltage ramp slope representing a different sliding velocity for the servo mechanism in my test machine to follow.

I could do this by calculating each ramp separately and joining them together into one lengthy voltage waveform in which the rate of voltage change varies over time. That longer waveform is fed to the DAC to play back at a constant clock rate. Besides the initial start-up of the DAC, I don't think there would be any filtering delays in that example, would there?

However, I also control the machine on the fly, so to speak: changes in the control voltage ramp occur at arbitrary times without stopping the program, and the calculations for each DAC update are done right before it is sent to the DAC. Would each DAC change in this case trigger a filter delay?

Another option for generating a machine control signal is to generate a simple voltage ramp that starts at -10 V and ends at +10 V with 16,777,216 values, one for each 24-bit code. As the program runs, the clock rate is varied, which causes the DAC to generate bit changes at different rates, which in turn varies the rate at which the output voltage changes with time. Will each change of the clock rate trigger a filter delay?

I realize that there are other ways to increase the resolution of an analog output using 2 or more 12 and/or 16-bit analog outputs, but they too have their limitations. It would be nice to simply have a high resolution analog DAC without the added external hardware and added programming complications associated with the multi DAC output ‘solutions’.  

Any comments regarding the 4461 cards and how they might behave under these scenarios, or suggestions for alternative hardware/software to get a finer DAC output resolution would be appreciated.

thanks,

Brian

NI 6587 Output Powerdown - Is it possible?


This is on a FlexRIO PXI-7952R device with an NI-6587 LVDS adapter.

 

I have a need to power down the DUT entirely, and noticed that even with all LVDS channels set to "Input," and even with the "Gen_Reset" node in the serializer CLIP held at True, the NI 6587 still outputs a non-zero voltage on its LVDS lines (~0.8 V on the p channel, ~1.4 V on the n channel). This effectively winds up powering part of the DUT and is generally not a good thing.

 

  I noticed that when the Reset Invoke Node is used the NI 6587 powers down its LVDS channels for a half second before they spring back up to a steady output - so it appears that the hardware is capable of being powered down.

 

  I am wondering how National Instruments recommends that the LVDS channels on the NI 6587 be completely powered down during certain portions of a test.  Even if it means completely powering down the NI 6587 module, or even powering down the PXI-7952R – as long as I can ensure that the DUT is in a zero energy state I will be good to go.

 

Thanks!


How does NI MAX (opened on the system running our DLL) make communication faster for our DLL using the NI-CAN driver interface?


Hi,

 

We are facing a strange issue with NI MAX and our DLL application.

We are using an NI PXI CAN card for communication.

 

We developed an application in VC++.NET 2003 to flash ECUs. At the test bench, when we try to flash the ECU, it takes a long time (5 minutes) when NI MAX is not running, i.e., not open.

 

If we open NI MAX and then flash the ECU, it takes a shorter time (1 minute).

 

How does opening NI MAX make the ECU flash take less time?

 

Can you please clarify as soon as possible?

 

Thanks and Regards,

vinaya

FPGA compilation


I'm using a PXI board with LabVIEW FPGA. I'm a new user.

When I compile even a simple program on the FPGA target, it shows an error message saying compilation failed because the clock timing was exceeded. I made sure it is only a simple program, but the error occurs again.

Any help on this would be appreciated.

 

Thank you in advance.

RT OS timer interrupts problem


I have a PXI system with an RT OS. It was working pretty well until I upgraded to LabWindows/CVI 2013. It seems to be a context-switching problem; I will describe it as best I can and see if others have had a similar problem.

 

The section of my code that has the problem is where I try to measure the effect of an RC filter by injecting a DC signal into an external device that has its own signal acquisition system. The external device is communicated with via CAN. The DC signal is generated by a DAC in my PXI system. My PXI system also has a CAN device that I control in my application.

 

Here are the steps of my software:

1.) Send a CAN message to my external signal acquisition system to start acquiring a waveform of 100 samples at a sample interval of 150 us. The waveform is 15 ms long.

2.) Sleep for 5 ms.

3.) Set the DAC to a voltage.

4.) Load the signal from the external device and record it to a file.

 

So, by sleeping for a bit before setting the DAC, I will see the effect of my RC filter in the external acquisition system's generated waveform.

 

With LabWindows/CVI 2012 this worked just fine, but now it seems that the DAC output changes before my external device receives the CAN message to start acquisition.

 

Here are some of the complexities of the system.

The CAN message is sent on a separate thread, so I use PostDeferredCallToThread() to post it.

Right after that I go to sleep so the RTmain() function can allow this lower-priority thread (the CAN message) to execute.

Then I set the DAC output.

 

Here is the code which posts the CAN message and sets the DAC.

    PostDeferredCallToThread( ActOnStartAbsorptionMeasurement, &g_InputVoltage[g_InputVoltageptr], g_mainThreadID );  // send CAN message

    // 5 ms delay
    timerTracker = GetTimeUS();
    timerTracker += 5000;
    SleepUntilUS((unsigned long)timerTracker);

    // set the DAC
    (void) SetDACoutput(VoltageLevel);

 

Here is what I do in the RTmain()

    while (!RTIsShuttingDown () && !gDone)
    {
        /* Your code. */

        /* Sleep for some amount of time to give the desired loop rate */
        /* and to allow lower priority threads to run.                 */
        timerTracker = GetTimeUS();
        timerTracker += RT_MAIN_SLEEP_TIME;   /* RT_MAIN_SLEEP_TIME == 400 us */
        SleepUntilUS((unsigned long)timerTracker);
        ProcessSystemEvents ();
    }

 

I appreciate any help. Thanks in advance

PXIe - Connecting to modules in the chassis


Hi All,

 

Sorry for the fairly simple question.

 

I'm trying to set up a new PXI system. 

 

The setup is as follows: a PXIe-1078 chassis with a PXIe-8115 embedded controller and 4065, 2526, 4844, and 6220 PXI cards. Using MAX I am able to identify all the different pieces of hardware, and everything seems to be working OK.

 

I've worked with cRIOs for a few years, and so far a similar approach has worked for setting up the project and connecting to and controlling the user lights.

 

However, I can't seem to work out how to pick up the cards in the chassis.

 

Specifically, for the 4844 PXI card I've used the NI OSI Utility and been able to set up the sensors and take readings, so I am happy there is communication. I just can't work out how to use it through LabVIEW.

 

Can anyone point me toward any online examples? I've been searching to no avail, and the OSI example didn't seem to load onto my system when I installed the OSI software (v2.0).

 

Thanks,

 

Dom

Problem connecting to Realtime system to download or run in debug mode


I have updated my LabWindows/CVI to 2013. I have a PXI system with a PXIe-8101 controller, and the controller has a real-time OS installed. I have been programming this system for a while now, so I am not exactly a novice anymore. The first time I tried to install the program on my real-time target, I got a message that said "LabWindows/CVI cannot connect to the real-time target." It also said " .... The version of the LabWindows/CVI Run-Time Engine for RT must be compatible with the version of the LabWindows/CVI development environment."

 

So I figured that was the problem. My next step was to go into MAX, and sure enough I was getting all sorts of flags and errors when I used the real-time software wizard in MAX (right-clicking on the remote software and clicking Add/Remove Software). As I suspected, several flags indicated software version differences. My next step was to update the software on my real-time target to resolve the problem. After I did that, I compiled the code and installed it on my remote system. Everything seemed OK.

 

The following Monday I restarted LabWindows/CVI, made a couple of changes, and tried to build and install to my real-time target. I got the same message as in the first paragraph. I thought I might not have the real-time system booted up correctly or something like that, so I checked all of that and it looked OK. Next I opened MAX, and it could see my real-time target just fine.

 

With LabWindows/CVI 2012 I did not have this problem. I think there may be some incompatibility issues with my LabWindows/CVI 2013; has anyone had a similar problem? I am able to get around it by building the release version and copying it to the PXI with CuteFTP v9.0, but I cannot run my program in debug mode from LabWindows/CVI, and as you may have noticed, I have other problems described in another post of mine. I am hoping the problems are connected and I can kill two birds with one stone, LOL.

 

I appreciate any suggestions.

Thanks,

Don Pearce

PXI-8512 software compatibility issues


I have a PXI-8512 high speed CAN module in my system. My PXI controller has a realtime OS. I am having some problems and have been trying to identify the problem. 

 

My question:

Are there any known compatibility issues with the PXI-8512 using NI-CAN 2.7.4 software on a real-time system?

 

 

  

Generating a clock for CLKIN acquisition


Hi everyone,

 

I'm using a PXIe-6548 and trying to use an "external clock". What I mean is: I want to generate data and a clock with this device (using the onboard clock), then route that generated clock to CLKIN in order to acquire the data (I don't have another generator to create a "normal" external clock, and I need a clock I can control, so...).

First, is it even possible to do this?

Then I tried anyway, but I got errors. I get the enclosed error1, and if I run in debug mode ("Highlight Execution") I get a different error (I guess they're similar, but they're still not the same, so maybe that's important), which is error_higlight (enclosed). Do you have any idea? (It might simply be that I can't do something like this.)

 

If you need my VI, just ask.

 

Thanks in advance,

 

Alex

 

PS: Sorry for any mistakes, and I hope it is all clear ^^


PCI/PXI CAN series 1 and series 2 cards

Installing PCIe 8371 in Advantech ARK-3440


I'm trying to install a PCIe-8371 in an Advantech ARK-3440 PC. I put the PCIe card in the PCIe slot and turn on the chassis (PXIe-1062Q) first, then the PC. The lights on both cards turn green, but MAX doesn't show anything.

 

I tried the compatibility BIOS setting: the PWR LED on the 8370 lights green while the LINK LED is off, and the LED on the 8371 turns orange.

 

I have been looking for something to appear in Device Manager (an NI PCI bridge or NI SB driver), without luck.

 

I also updated the device driver software with the set the system came with (NI System Driver Set 2010.02).

 

Any help or suggestion will be appreciated.

How to control a solenoid via PXI-2520 and DaqMx Switch


I am a LabVIEW novice with Core 1 and Core 2 training, so I am familiar with the fundamentals of LabVIEW process flow; however, this is the first application I am attempting to develop that is not a simulated training VI. That said, I am having trouble ironing out the logic needed to control a 10-solenoid, 2-position valve manifold. I have developed a VI that will connect and disconnect a particular relay in the style of the "Switch Controlling Individual Relay.VI" on my PXI-2520 module, but these are single-instance events; i.e., I set the device, name the relay, tell the VI which action to take (make or break), and run the VI. What I am trying to develop is a continuously running VI in which any one or more solenoids can be energized or de-energized using simple Boolean controls, with each relay represented by a switch on the front panel.

 

It is clear that a while loop is necessary, along with a series of case structures or a state machine architecture, but all my attempts have been fruitless. The common failure mode is that a later iteration of the while loop tells a switch that is already open to reopen, or vice versa, thereby causing an error. My thought is that some feedback from the previous iteration is necessary, so that only a Boolean value that differs from the previous iteration triggers a case to connect or disconnect. What are your thoughts?

 

Eventually this VI will also need to control these solenoids automatically (I think DAQmx Create Scan List with software triggers will suffice), but each action will need to be timed. For now, though, it is most critical that I can operate the solenoids manually and dynamically.

 

Thank you,

 

DK01

Performance verification PXI 6608


Hi,

 

I have a question about the performance verification of the PXI 6608.

 

This is what I was able to perceive from the manual.

 

We have two modules:

CTR0: referenced to an external clock with an uncertainty of no more than 0.75 ppb; it generates a 400 s pulse to CTR1.

CTR1: referenced to the internal 10 MHz OCXO; it counts the number of cycles in that 400 s window: 400 s × 10 MHz = 4×10^9. Any result different from this indicates an error.

 

PXI 6608.jpg

 

Now, to my understanding, the performance verification compares the 10 MHz OCXO against the external clock. Looking at the manual, we have an uncertainty of 75 ppb when the 10 MHz OCXO is in slot 2: 75 ppb × 10 MHz = ±0.75 Hz. BUT the performance verification assumes a tolerance of 0.1 Hz (9,999,999.9 Hz < f < 10,000,000.1 Hz).

 

The calibration procedure takes the same approach as the performance verification.

 

Why does the PXI 6608 procedure use a tolerance different from the manual's specification?

Are we not testing the 10 MHz OCXO?

 

VHDCI to VHDCI DAQ cable


Hi All,

 

I need to connect our DAQ boards PXIe-6368 to a custom electronic board.

 

Due to dimensional constraints we need to use VHDCI connectors onto the custom board.

 

Is there a DAQ cable suitable for this?

We have seen that the SHC68-68EPM cable starts with VHDCI on the DAQ end and terminates with a 68-pin SCSI connector on the other end.

 

We need a straight cable in which both ends are VHDCI.

 

The SHC68-C68-D4 cable available for HS-DIO boards is in line with our needs but, if I have understood correctly, it is made specifically for HS-DIO and isn't suitable for DAQ usage.

 

Is a proper VHDCI-to-VHDCI DAQ cable available from NI?

 

Thanks for your reply


