I would like to acquire raw data from the ADC. The Python example at https://github.com/ni/nidaqmx-python/blob/master/nidaqmx_examples/ai_raw.py
failed with the following message.
Does anyone know how to read the raw ADC data?
Thanks
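For reference on what "raw" means here: ai_raw.py reads unscaled integer ADC codes, and DAQmx exposes per-channel polynomial scaling coefficients that map those codes to volts. A minimal pure-Python sketch of that mapping, using made-up placeholder coefficients rather than values read from a real device:

```python
# Hypothetical sketch: converting raw (unscaled) i16 ADC codes to volts.
# DAQmx publishes per-channel polynomial scaling coefficients; the values
# below are placeholders, not read from an actual device.

def raw_to_volts(codes, coeffs):
    """Apply polynomial scaling: V = c0 + c1*x + c2*x^2 + ..."""
    return [sum(c * code ** i for i, c in enumerate(coeffs)) for code in codes]

# Example: an ideal 16-bit bipolar +/-10 V converter would have c0 = 0 and
# c1 = 20.0 / 2**16 (again placeholder values, not a real calibration).
coeffs = [0.0, 20.0 / 2 ** 16]
print(raw_to_volts([-32768, 0, 32767], coeffs))
```

On real hardware the coefficients would come from the channel's device scaling coefficient property rather than being hard-coded.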
I am trying to figure out how to interface this Ximea xiB PCIe camera with a PXI system (I'm using the PXIe-1085 chassis) so that I can use the xiLib LabVIEW interface maintained by Ximea with the camera. Currently we interface with it through a PCIe host adapter, which eliminates the need for a frame grabber. Since the communication with the PXI system would also be over PCIe, am I correct that I still wouldn't need a frame grabber? I'm just not sure how to interface the camera with the PXI system, as we'll be performing some real-time analysis on the collected data. Would something like this PXIe-8383mc Adapter Module (manual) work, allowing the camera to be used as a PXImc device and letting me interface with it?
Question 1: Is it possible to adapt the following NI PXIe-1082 system to perform basic diagnostics that verify that all relays on the PXIe-2532B are functioning correctly (not sticking)?
Question 2: We are currently using only about 50 of the 512 available relays per PXIe-2532B. Once the first 50 relays have reached their maximum switch count, is it possible to switch over in software to a different set of relays?
NOTE: The NI PXIe-1082 System is operating on Windows 7 (64 bit) and LabVIEW 2016.
I had this idea that it would be really nice to program the relay card to open and close certain relays from a single input, instead of controlling each relay with a bunch of LabVIEW code and having my timing limited by software constraints.
I want to be able to turn on 4 relays, sequentially, within 5 ms. I don't think I could do this in software because it would take too long for the computer to make it happen. But if I could make it a hardware-timed event, that would be really cool.
In theory it should be possible to program the relay card, but I have no idea how or where to begin.
The relay card I am using is an NI PXI-2567.
Thanks!
Hello all,
I have a data station with a mix of PXI, PXIe, and PCI cards that is typically run remotely. On the Tuesday after Labor Day (Sept 8, 2020) the system would not let the user log in. I came into the lab (rare these days with COVID-19 and all) and added a keyboard, mouse, and monitor. The system was stuck in an endless Windows recovery loop. The old image was not much help, so we decided to install fresh (the data was safe on a secondary backup). I was trying to determine the culprit: the SpinCore PCI timing card or the NI PXIe system. To determine this, I downloaded (to the best of my ability) current drivers for my project on the development system, built a 64-bit CVI project, and installed it on the instrument computer. I also installed fresh SpinCore drivers. Long story short: I get a connection error from SpinCore only when the PXI chassis is active. Any time the PXI chassis is active I get a PCI-to-PCI bridge error, and twice the Intel Management Engine Interface has become corrupted.
Can someone at NI look at my NI MAX config and see if I still have an old driver dragging me down? I will attach it after I power down, connect the PXIe chassis, and run MAX.
Hi everyone,
I have what is most likely a fairly straightforward question, since I am relatively new to working with the PXIe-5171R. We have a 10 MHz external timing card that we are trying to use to improve timing stability. I understand that one can attach a 10 MHz external clock through CLK IN, but is this also phase-locked to PXIe_CLK100? And if not, is there a way to use the external clock to synchronize the Data Clock and PXIe_CLK100?
Thanks!
Hello,
I need to connect my PXI system to the host PC over the network. I have assigned a static IP to the PXI target, but I cannot connect the target to the host PC when they are on different subnets. Has anyone ever connected a PXI target to a host PC on a different subnet? If it is not possible, how do I connect the system?
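As a side note on the subnet issue: whether two statically configured addresses can reach each other directly is easy to check with Python's standard `ipaddress` module. The addresses and masks below are made-up examples, not taken from the post:

```python
import ipaddress

# Check whether two statically assigned addresses share a subnet.
# These addresses and masks are illustrative placeholders.
host = ipaddress.ip_interface("192.168.1.10/24")
target = ipaddress.ip_interface("192.168.2.20/24")

same_subnet = target.ip in host.network
print(f"Same subnet: {same_subnet}")  # prints "Same subnet: False"
```

When the subnets differ, traffic has to pass through a gateway, so both the PXI target and the host need a correct default-gateway entry, and any discovery mechanism that relies on local broadcasts will not cross the router.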
This is probably the wrong place to ask, but you know, I'm kind of desperate. I've already submitted this to the NI forums, but I'm trying everywhere I can.
"Hello, everyone. I recently started working with an NI PXIe-1078 chassis with an NI PXIe-8133 controller and a few other modules.
It was working fine until one day I tried to power it on and the "PWROK" LED started to blink green while the "DRIVE" LED remained unlit. The fans still run; it just doesn't seem to boot.
So far I have tried, as suggested in the troubleshooting guide:
Removing all the modules, leaving only the PXIe-8133, and powering on;
Clearing the CMOS.
I've checked the power supply and it looks alright, but I must admit I'm not completely sure.
Is there anything else I should try before I replace the power supply? Unfortunately, we don't have another chassis in my lab to test with.
I hope you guys can help me."
Can someone help me with a DAQmx issue that occurs when acquiring analog inputs from a PXI system and the channel list uses multiple PXI modules (i.e., a multi-device AI task with a start trigger)?
I developed the code in LabVIEW 2018 (64-bit) and am using NI-DAQmx 19.6.
I have the following PXI system:
PXIe-1071 4-Slot Chassis
Slot1: PXIe-8301 Thunderbolt 3 Remote Controller
Slot2: PXIe-4302 Voltage Input Module with TB-4302 Terminal Block
Slot3: PXIe-4340 LVDT Module with TB-4340 Terminal Block
Slot4: PXIe-4340 LVDT Module with TB-4340 Terminal Block
My AI Task consists of 9 channels:
PXI1Slot2/ai0, PXI1Slot2/ai1, PXI1Slot2/ai2, PXI1Slot2/ai3,
PXI1Slot3/ai0, PXI1Slot3/ai1, PXI1Slot3/ai2, PXI1Slot3/ai3,
PXI1Slot1/ai0
I would like to use /PXI1Slot2/PFI0 as the DAQmx Start Trigger (Start Digital Edge).
I would like to use /PXI1Slot3/PFI0 as the External Clock.
However when I run the VI, I get the error:
Error -89136 occurred at DAQmx Start Task.vi:7220001
Possible reason(s):
Specified route cannot be satisfied, because the hardware does not support it.
Property: Start.DigEdge.Src
Property: Start.DigEdge.Edge
Source Device: PXI1Slot2
Source Terminal: PFI0
Task Name: LVDT & LASER
I am enclosing a test VI (SR_DAQ_Test.vi) and additional subVIs that will reproduce this behavior. It can be reproduced even when using Simulated PXI modules in MAX.
Please see the enclosed image of the DAQmx Help for Multi-Device Tasks. According to it, the three PXI modules in my AI task are compatible with each other.
I have also enclosed a snapshot from MAX showing the PXI simulation devices, if you wish to reproduce this issue.
Thanks again for any tips!
I am driving a PXIe-5663e VSA using a PXIe-5673e VSG producing a -10 dBm CW tone at 2.265123 MHz. The VSA full-span trace shows the incoming signal at -17.5 dBm (due to cable and filter losses) but also spurious signals at 1.9055 GHz @ -17.4 dBm, 855.205 MHz @ -44 dBm and 501 MHz @ -33 dBm. Is this spurious performance typical for this instrument?
I tried using the advanced sequencing function in NI-DCPower, but I can't seem to get it working. I want to vary the pulse current level. I'm also trying to use the combined source-current sequence in TestStand, so the user input can be a ramp up, a ramp down, or a constant pulse current level.
Hi.
I'm using LabVIEW FPGA with a FlexRIO PXIe-7971R and a 5783 adapter module. My goal is to apply a window to scale my acquired signal before doing an FFT in an SCTL. In my case, something as close as possible to a rectangular window would be perfect (the energy-vs-time distribution is extremely uneven and I want a good signal-to-noise ratio).
I saw that the Scaled Window Express VI has an implementation option: Show coefficient index terminal.
I looked in the LabVIEW Help but could not find what it is.
https://zone.ni.com/reference/en-XX/help/371599P-01/lvfpga/fpga_scaled_window/
My questions are:
What does it do? How does it work?
What is the range of the coefficients?
Can I change it to be close to a rectangular window while keeping the "apodization" effect at the borders of my signal?
Thank you deeply
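For what it's worth, the shape described above (flat like a rectangular window over most of the record, with a short apodizing taper at the borders) matches a Tukey (tapered-cosine) window; this is a general DSP observation, not something specific to the Scaled Window Express VI. A small pure-Python sketch of the coefficients, where `alpha` is the fraction of the record spent tapering (0 gives a rectangular window, 1 gives a Hann window):

```python
import math

# Tukey (tapered-cosine) window: flat in the middle, cosine taper at
# the edges. alpha is the total fraction of the record that is tapered.
def tukey(n_points, alpha=0.1):
    if alpha <= 0:
        return [1.0] * n_points  # degenerates to a rectangular window
    w = []
    edge = alpha * (n_points - 1) / 2.0  # taper length on each side
    for n in range(n_points):
        if n < edge:
            w.append(0.5 * (1 + math.cos(math.pi * (n / edge - 1))))
        elif n > (n_points - 1) - edge:
            w.append(0.5 * (1 + math.cos(math.pi * ((n - (n_points - 1)) / edge + 1))))
        else:
            w.append(1.0)
    return w

w = tukey(16, alpha=0.25)
print([round(v, 3) for v in w])  # flat middle, tapered first/last samples
```

On the FPGA one would typically precompute these coefficients on the host and store them in a LUT or FIFO, indexed per sample inside the SCTL.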
I have a new Dell Precision 7750 laptop that will not properly enumerate the cards in a PXIe-1062Q chassis via Thunderbolt. The chassis and cards work when connected to a two-year-old Dell Latitude 7490.
The chassis is detected fine without any cards inserted.
When the cards are inserted, the SMBus Controller (chassis) has issues.
As one might expect, NI MAX is a mess as well.
I have seen some reports of similar issues but no solutions. It appears to be a driver issue with new Thunderbolt hardware/security?
My hardware Information:
PXIe-1062Q Chassis
PXIe-8301 Thunderbolt Remote Control Module
PXI-4461 Sound and Vibration Module
PXI-4462 Sound and Vibration Module
Dell Precision 7750 Laptop
Win 10 1909 (2004 has the same issue)
Thunderbolt Driver Version: 1.41.789.0
Kernel DMA Protection: Off
Operating System(OS) Windows 10 Professional
OS Version 10.00.18363
OS Info
Processor Intel(R) Xeon(R) W-10885M CPU @ 2.40GHz / Intel64 Family 6 Model 165 Stepping 2 / GenuineIntel / 2400 MHz
Number of Processors 16
Physical Memory 63.6 GB of RAM
Drive C:\ 803 GB of 934 GB free
National Instruments Software: Version:
NI-DAQmx Device Driver 20.1.0f0
NI-DAQmx ADE Support 20.1.0
NI-DAQmx MAX Configuration 20.1.0
NI I/O Trace 20.0.0f0
LabVIEW Runtime 2018 SP1 f4 18.0.1
Measurement & Automation Explorer 20.0.0f0
Measurement Studio Visual Studio 2012 Support - See individual versions below.
NI PXI Platform Services Configuration 20.5.0f1
NI PXI Platform Services Runtime 20.5.0f1
NI-PAL Software 20.0.0
NI System Configuration 20.0.0f0
NI-VISA 20.0
NIvisaic.exe 19.0.0.49152
NI-VISA Runtime 20.0
LabVIEW Runtime 2019 SP1 f1 19.0.1
I appreciate any thoughts you may have on this.
We have been using NI PXI DMMs for some time to measure small resistance values (typically under 30 Ω). We of course use 4-wire measurements to eliminate lead resistance, and we also ensure that the lead resistance is less than the specified 10% of the range.
Looking at the latest NI-DMM manual (http://zone.ni.com/reference/en-XX/help/370384V-01/dmm/4-wire/), I noticed that it indicates that if the maximum 4-wire lead resistance (10% of the range) is exceeded, there is a potential for damage to the DMM (specifically to the current source used to provide the 1 mA test current in our scenario). While it is clear that the voltage drop across the test leads would cause a problem, it raises a few other questions.
I have asked these questions through the appropriate channels but am interested to see if anyone in the community could help shed some light on this.
Thanks in advance.
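A quick back-of-the-envelope on the 10%-of-range rule, using the 1 mA test current mentioned above; the range value is illustrative, not taken from a specific DMM spec:

```python
# Illustrative arithmetic for the 10%-of-range lead-resistance rule.
# The 1 mA test current is from the post; the range is a made-up example.
test_current_a = 1e-3
range_ohms = 100.0                  # e.g. a 100 ohm resistance range
max_lead_ohms = 0.10 * range_ohms   # 10% rule -> 10 ohm total allowed

# Extra voltage the test-current source must drive across the leads;
# beyond this the source approaches its compliance limit.
lead_drop_v = test_current_a * max_lead_ohms
print(f"Lead drop at the limit: {lead_drop_v * 1e3:.1f} mV")
```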
Hello,
I am running into an issue with Windows 10 version 1909 and a Rubidium PXIe-3352 card when connecting the PXI chassis through a Thunderbolt cable. I have two systems: one PC with Windows 10 v1809, which works fine, and the other with v1909. Whenever I plug the Thunderbolt cable into the v1909 PC, I get the legendary blue screen of death. I've updated the drivers (3352 and Thunderbolt) and tried this same PXIe-3352 card with the v1809 PC, where it works fine. I turned off DMA protection, but the v1909 PC still crashes. Would anyone have any suggestions?
We are looking at buying a PXI system from NI. Do you know of any other supplier that makes similarly affordable, reliable equipment that would be worth a look?
Our intended NI system would consist of:
Hello,
I want to control an NI PXI-1042 via an MXI-Express card (NI PXI-8360) and a laptop.
To avoid problems due to incompatibilities, could you please provide me with a list of laptops tested with the NI PXI-8360?
Thank you in advance.
Best regards.
BELAJI
Hi everyone. I have two VST-5644s and four USRP-2922 devices. I want to build a 2x2 MIMO system. I used the example "6x6 MIMO-OFDM System with NI USRP and LabVIEW Communications" and modified it to 2x2 with the four USRP-2922s; the system works normally. It can demodulate and generate constellations well (Fig. 1).
Then I wanted to move this system to two VSTs (transmit) and two USRPs (receive), but the receiver (the two USRPs) can't demodulate the signal transmitted from the VSTs. To verify, I fed the same signal to both the VSTs and the USRPs to transmit; call the signal transmitted by the VST Sig1 and the signal transmitted by the USRP Sig2. Ideally, Sig1 should equal Sig2. The receiver can still only demodulate Sig2, not Sig1.
I am confused by this situation. I can see some differences between Sig1 and Sig2: Sig1 looks as if some low frequency is mixed with the transmitted signal, while Sig2 does not. But I think demod.vi is fully capable of handling low-frequency mixing that is visible to my eye.
Does anyone know the difference between the signals sent by the VST and the USRP?
Thanks a lot.
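One hedged hypothesis for the "low frequency mixed in": a residual carrier-frequency offset (CFO) between the VST and USRP local oscillators, which rotates the received constellation continuously and defeats a demodulator keyed to fixed constellation points. A toy pure-Python illustration with made-up numbers (not measured from the actual hardware):

```python
import cmath

# Toy illustration of carrier-frequency offset (CFO). All numbers are
# made up; they are not measurements from the VST/USRP setup.
fs = 1e6          # sample rate, Hz
cfo_hz = 500.0    # residual LO offset, Hz
symbols = [1 + 0j, -1 + 0j] * 100   # toy BPSK stream

# Each sample picks up an extra rotating phase term exp(j*2*pi*cfo*n/fs).
received = [
    s * cmath.exp(2j * cmath.pi * cfo_hz * n / fs)
    for n, s in enumerate(symbols)
]

# By the last sample, the constellation has rotated well away from the
# ideal 0/pi decision points, so hard decisions start failing.
drift_deg = cmath.phase(received[-1] / symbols[-1]) * 180 / cmath.pi
print(f"Phase rotation after {len(symbols)} samples: {drift_deg:.1f} deg")
```

If CFO is the cause, a frequency-offset estimation/correction stage before the demodulator (or locking both devices to a common reference clock) would be the usual remedy.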
Hello all,
I am using a PXIe-5170R 14-bit oscilloscope. I want to use the module for extracting a medical image (as a demodulator).
I am using the concept of an I/Q demodulation receiver, in which the received signal is multiplied by sine and cosine values (to be implemented with a DDS) and the mixed signal is passed through a low-pass filter. The output of the low-pass filter is the demodulated signal.
My query is this: the FPGA of the PXIe-5170R module runs at 125 MHz but the ADC samples at 250 MS/s, so I receive 2 samples per clock cycle. How do I configure the DDS to deal with 2 samples at a time? If 2 DDS cores are used, the first can be configured easily, but the second must generate the data to be multiplied with the 2nd input sample. What POFF value should the 2nd DDS have so that its output is correctly multiplied with the 2nd input sample?
If I implement 1 DDS with a 250 MHz clock and fill alternate values into a FIFO, the code gives an unsupported-clock error.
Please provide hints, a solution, or any resource to tackle the issue.
Thank You.
Regards.
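On the two-DDS question above: if each half-rate DDS advances by double the single-rate phase increment, and the second core is offset by exactly one single-rate increment, then interleaving their outputs reproduces the full-rate sequence. A pure-Python sketch with illustrative frequencies (not the actual design values):

```python
import math

# Two half-rate DDS cores emulating one full-rate DDS. Each core uses
# DOUBLE the per-sample phase increment; the second core is offset by
# ONE single-rate increment. Frequencies below are illustrative.
fs = 250e6          # full ADC sample rate
f_out = 10e6        # desired DDS output frequency
dphi = f_out / fs   # phase increment per full-rate sample, in cycles

n = 16
full_rate = [math.sin(2 * math.pi * dphi * k) for k in range(n)]

dds_a = [math.sin(2 * math.pi * (2 * dphi) * k) for k in range(n // 2)]
dds_b = [math.sin(2 * math.pi * ((2 * dphi) * k + dphi)) for k in range(n // 2)]

# Interleave a/b samples: should match the single full-rate DDS.
interleaved = [v for pair in zip(dds_a, dds_b) for v in pair]
print(max(abs(a - b) for a, b in zip(full_rate, interleaved)))  # ~0.0
```

Under that assumption, the second core's POFF would correspond to one single-rate increment, i.e. f_out/f_s of a full cycle scaled to the core's phase-accumulator width. The exact POFF encoding depends on the DDS IP core in use, so treat this as a starting point rather than a verified register setting.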
I use "niDigital Configure Digital Edge Start Trigger.vi" to set up the PXIe-6570 to start bursting when it receives a specific PXIe trigger.
However, I cannot wait forever for this trigger, so I have a timeout condition. In the timeout case I have to ready the device for new bursts, and therefore I use "niDigital Disable Start Trigger.vi".
This is where the problem arises. If the trigger has not arrived, calling "niDigital Disable Start Trigger.vi" returns the error:
LV code: -1074118645,
LV Status: TRUE,
LV source: niDigital Disable Start Trigger.vi<ERR>The specified operation cannot be performed during a pattern burst.
What is the correct way to disable the start trigger?