Channel: PXI topics

NI-XNET


Hello Team,

 

I would like to read and write CAN data on an NI PXI device. I am using the NI-XNET driver.

 

I have used the following steps:

1. Create an input session using nxMode_FrameInStream and set the baud rate.

2. Create an output session using nxMode_FrameOutStream and set the baud rate.

 

3. Call the frame write function:

g_Status = nxWriteFrame(g_OutputSessionRef,l_MyBuffer,l_MyNumberOfBytesForFrames,10);

 

4. Call the frame read function into a buffer:

 

//Read a frame
g_Status = nxReadFrame (g_SessionRef,l_MyBuffer,sizeof(l_MyBuffer),0,&l_NumBytes);

 

When I followed this procedure, on the first call to the write function (8 bytes of CAN data) I was not able to receive any data back, and l_NumBytes was 0.

 

If I call the write function again with another 8 bytes of CAN data, I do receive 8 bytes of data, but it is the response to the first CAN write.

 

Could you please let me know what is wrong in my code?
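
A note on the calls above: the last parameter of nxReadFrame is a timeout in seconds (f64), so passing 0 makes the read return immediately with whatever frames are already queued. On the first attempt the response has usually not arrived yet, which would explain l_NumBytes being 0; by the time of the second write, the first response is waiting in the queue. Here is a minimal sketch of the same write/read pair with a blocking read (variable names reused from the code above; the 1.0 s timeout is an arbitrary choice):

// Write the request frame, then allow up to 1 s for the response.
g_Status = nxWriteFrame(g_OutputSessionRef, l_MyBuffer,
                        l_MyNumberOfBytesForFrames, 10);
if (nxSuccess == g_Status)
{
    // Timeout of 1.0 s instead of 0: blocks until a frame arrives
    // or the timeout elapses, instead of returning immediately.
    g_Status = nxReadFrame(g_SessionRef, l_MyBuffer, sizeof(l_MyBuffer),
                           1.0, &l_NumBytes);
}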

Please let me know if you need any further information. 

 

Thanks 

Rajeswari.


I have a PXI-8109 with Real Time installed. Why does it display the USER1 indicator?


Hi all,

I have a PXI-8109 with Real Time installed.

Why does it display the USER1 indicator LED on the panel?

Thanks

PXI 6230 max and min pulse width limit


Can anyone tell me the maximum and minimum pulse width limits of the PXI-6230 card? The datasheet only gives information about the base clocks, so please post the details. A rough estimate based on the counter timebase is sketched below.
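
Since the datasheet gives only the base clocks, the limits can be estimated from them. A back-of-envelope sketch in C, not a datasheet value: the M-series 6230 has 32-bit counters and an 80 MHz maximum internal timebase, and assuming a pulse must span at least 2 ticks and at most 2^32 - 1 ticks of the selected timebase, the limits work out as follows (please verify the 2-tick minimum against the M-series user manual):

#include <stdio.h>

int main(void)
{
    const double timebaseHz = 80e6;         /* fastest internal timebase (assumed) */
    const double minTicks   = 2.0;          /* assumed hardware minimum            */
    const double maxTicks   = 4294967295.0; /* 2^32 - 1, for a 32-bit counter      */

    printf("min pulse width: %.1f ns\n", minTicks / timebaseHz * 1e9);
    printf("max pulse width: %.1f s\n",  maxTicks / timebaseHz);
    return 0;
}

This prints roughly 25.0 ns and 53.7 s; choosing a slower timebase (for example 100 kHz) extends the maximum proportionally.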

buffer for continuous acquisition

Hi everyone. I have a PXI-5124 board and I need to perform a continuous acquisition of a signal (1 MHz frequency, 10 s cycle time). I set the minimum sampling rate to 5 MHz and the minimum record length to 50 MS, and I store the data to a file. I use a queue as a buffer, but the result is not satisfying: after the digitizer runs for a while, it always reports that there is not enough memory. Can the board achieve this acquisition goal? How should I set up the buffer?
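
For scale, the sustained data rate here is modest, which suggests the "not enough memory" error comes from an unbounded queue rather than from the board. A quick arithmetic sketch (the PXI-5124 is a 12-bit digitizer, so each sample is assumed to be fetched as a 16-bit value):

#include <stdio.h>

int main(void)
{
    const double sampleRate     = 5e6;  /* 5 MS/s              */
    const double seconds        = 10.0; /* one 10 s record     */
    const double bytesPerSample = 2.0;  /* I16 fetch (assumed) */

    double bytesPerSec = sampleRate * bytesPerSample;
    double recordBytes = bytesPerSec * seconds;
    printf("sustained rate: %.0f MB/s\n", bytesPerSec / 1e6); /* ~10 MB/s */
    printf("one record:     %.0f MB\n",   recordBytes / 1e6); /* ~100 MB  */
    return 0;
}

10 MB/s is within ordinary disk bandwidth, but if the file write is slower than the fetch, an unbounded queue grows until memory runs out. A fixed-size queue (so the producer blocks when it is full) or smaller fetch chunks streamed straight to disk avoids this.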

Question for using PXIe-8135 with Linux and GNU Radio


Hello readers.

I am a beginner with NI PXIe and NI-USRP.

I have some questions for you.

 

1. Can I use the NI PXIe-8135 with GNU Radio on Linux?

2. Which of LabVIEW, LabWindows/CVI, and GNU Radio is the most suitable program for using a PXIe-8135 connected to an NI USRP-2943R?

 

I am very new to these things.

Please give me any comments.

Thanks

 

What is needed for 7813R PWM / I2C / SPI?


Hello,

We want to use the PXI-7813R 3M-gate digital RIO (160 DIO) card to generate PWM signals, test I2C communication, and use it for SPI testing/communication.

Perhaps also some conventional I/O testing...

We've been told that we can find IP for this at www.ni.com/ipnet.

My question: do we need additional software tooling to use/program this FPGA? Or is it sufficient to download some libraries via VIPM and use the VIs they provide?
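
For a sense of scale, the PWM part is tiny in FPGA terms. A plain-C model of the usual counter-compare pattern that PWM IP boils down to (illustrative only; on the 7813R this logic would live in a LabVIEW FPGA loop):

#include <stdint.h>

typedef struct {
    uint16_t period; /* counts per PWM cycle (must be > 0) */
    uint16_t duty;   /* counts the output stays high       */
    uint16_t count;  /* free-running cycle counter         */
} pwm_t;

/* One tick of the PWM clock; returns the current output level. */
static int pwm_tick(pwm_t *p)
{
    int out = (p->count < p->duty); /* high for duty/period of the cycle */
    p->count = (uint16_t)((p->count + 1) % p->period);
    return out;
}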

 

thanks.

NI PXI 1042


Hello! I have an NI PXI-1042 chassis, and I want to run voltage diagnostics via the serial port on the back of the chassis. Can I use the ADLINK chassis remote monitoring application, with a serial-to-USB converter (Wenglor AB-USB01) as the cable? Do you know whether the ADLINK application is compatible with the NI PXI-1042 chassis? Also, is that converter cable OK, or do I need an actual RS-232 port on my laptop and a standard RS-232 cable?

GPIB access using PXI-8106


I am using a trivial program to access the GPIB port on the PXI-8106 controller, connected to a temperature monitor (Lake Shore). I used to run the same code directly on a PC connected through an old GPIB interface; that worked, and my VISA I/O session probed the device properly as GPIB0::18::INSTR. Now the controller apparently does not sense the GPIB device. Through MAX, the controller otherwise works properly and correctly accesses the other two boards in the chassis (PXI-6143 and PXI-6733). MAX shows the built-in GPIB as "GPIB0" under Devices and Interfaces. Is that correct? (It sits parallel to the chassis; shouldn't it be shown under the chassis instead?) A minimal probe program is sketched below.
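
For reference, a minimal version of the kind of trivial program described above, using the NI-VISA C API to probe the instrument with an *IDN? query (the address comes from the post; the newline termination is an assumption, so adjust it to the instrument's settings). Running this directly on the PXI-8106 separates a missing GPIB0 interface (viOpen fails) from an instrument-side problem (the open succeeds but the read times out):

#include <stdio.h>
#include <visa.h>

int main(void)
{
    ViSession rm, instr;
    char idn[256] = {0};
    ViUInt32 retCount = 0;

    if (viOpenDefaultRM(&rm) < VI_SUCCESS)
        return 1;
    if (viOpen(rm, "GPIB0::18::INSTR", VI_NULL, VI_NULL, &instr) < VI_SUCCESS)
    {
        printf("viOpen failed: the controller cannot reach the device\n");
        viClose(rm);
        return 1;
    }
    viWrite(instr, (ViBuf)"*IDN?\n", 6, &retCount);
    viRead(instr, (ViBuf)idn, sizeof(idn) - 1, &retCount);
    printf("IDN: %s\n", idn);
    viClose(instr);
    viClose(rm);
    return 0;
}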


Understanding FIDL examples: Why are 4xI16 samples packed into 1xU128, and not 1xU64?


Hello,

 

I was studying the pre-built examples in FIDL 1.3, and was puzzled by one implementation detail. Acquisition Engine - Facade.lvlib: Packer.vi takes 4x I16 samples and packs them into a U128 value. Doesn't this waste 50% of the bits? Why doesn't it pack the data into a U64 instead?
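
For reference, the arithmetic behind the question: four I16 samples are exactly 4 x 16 = 64 bits, so they fit in a U64 with nothing to spare. A hypothetical packer in C (the bit layout is assumed for illustration and is not taken from Packer.vi):

#include <stdint.h>

/* Pack four 16-bit samples into one 64-bit word, sample 0 in the
   lowest 16 bits. Casting through uint16_t avoids sign extension
   smearing bits into the neighbouring fields. */
uint64_t pack4(int16_t s0, int16_t s1, int16_t s2, int16_t s3)
{
    return  ((uint64_t)(uint16_t)s0)
          | ((uint64_t)(uint16_t)s1 << 16)
          | ((uint64_t)(uint16_t)s2 << 32)
          | ((uint64_t)(uint16_t)s3 << 48);
}

Packing the same four samples into a U128 leaves the upper 64 bits unused, hence the 50% figure above.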

 

In case it's important, I'm mainly interested in Acq Engine on 5734 PXIe-7962R.lvproj. (Packer.vi looks like a generic VI that's shared across different projects though)

NI PXI 5122 YUV


Hello,

Why is it that for testing video signals like RGB and Y/C we can connect the video signal directly to the NI PXI-5122, but for a YUV signal we must use an external card that has a trigger connected to a trigger of the NI PXI-5122?

Sporadic invalid values obtained from FlexRIO card


Hello,

 

I'm having trouble acquiring data from my FlexRIO (PXIe-7962R + NI-5734). I'm writing custom code instead of using the FIDL building blocks, because FIDL requires a master-slave design for synchronized triggering, while I need every FlexRIO card to be able to fire the trigger.

 

Anyway, here is a small test case (I tried with both a 40 MHz clock and a 10 MHz clock, with the same outcome):

 

FPGA

Simple FlexRIO - FPGA Pack.png

 

Host

Simple FlexRIO - Host Unpack.png

 

While most of the output looks OK (i.e. the waveforms look like the input signals), I get many bursts of corrupted samples. I often get spikes that equal the maximum/minimum value of an I16, although there are cases of non-maximal spikes too.
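
An aside on why corruption often shows up as full-scale spikes: the I16 extremes are a single most-significant bit away from small values, so one mis-sampled bit can turn a near-zero sample into -32768 or +32767. This is an illustration of the arithmetic only, not a claim about the 5734 internals:

#include <stdio.h>
#include <stdint.h>

int main(void)
{
    int16_t sample   = 0;                          /* a quiet channel  */
    int16_t glitched = (int16_t)(sample ^ 0x8000); /* sign bit flipped */
    printf("%d -> %d\n", sample, glitched);        /* prints 0 -> -32768 */
    return 0;
}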

 

Interestingly:

  • Each FlexRIO card seems to have exactly 1 misbehaving channel.
  • The channel varies across cards (e.g. RIO1 gets spikes in AI0, while RIO2 gets spikes in AI3)
  • When I repeatedly trigger acquisition on the same card, I see a very similar "noise shape" each time. Different cards produce different "noise shapes".
  • I don't observe any spikes when I use the pre-built FIDL examples instead of my own code (perhaps I need to do some kind of synchronization?)

 

Screenshots below.

  • The orange graphs are supposed to be all zeroes.
  • The green graph is supposed to show a clean sine wave.

 

What might the problem be, and how can I fix it?

 

 

Thanks!

FlexRIO AI Corruption 1.png

 

Zoomed in

FlexRIO AI Corruption 2.png

 

 FlexRIO AI Corruption 3.png

 

Viewing PXI-8119 Controller from my Windows XP Dev Computer via MAX


Hi All

 

Should I be able to do this?

 

The PXI-8119 controller is running Windows 7 Professional SP1.

My development computer is a desktop PC running Windows XP SP3 with LabVIEW 2012 SP1 installed. NI support advised me to update the DAQmx and VISA drivers, which I have done, but I still do not see the controller.

 

I can ping the PXI-8119 controller, and it is definitely on the same subnet as my machine. Both computers' firewalls are off (we are on a closed network).

 

Why can't I see it?

 

I have also noticed that the only option for creating a PXI project on my development computer in LabVIEW is "Real-Time PXI". I am always unclear what NI means by Real-Time, as they use it interchangeably for Windows-based systems as well as proper RT targets like cRIO. So I assume any target will do?!

 

I have never used a PXI chassis before and am having difficulty finding my feet with this problem.

 

Anybody able to help?

Understanding how to set up and use a FlexRIO sample clock


Hello,

 

Following this discussion on glitching inputs, I've learnt that I need to use the Sample Clock domain for reading AI nodes on my NI 5734. So I right-clicked "FPGA Target" -> "New FPGA Base Clock" and selected "IO Module Clock 0". In general, I followed the instructions at http://www.ni.com/pdf/manuals/375653a.pdf

 

Questions:

  1. I noticed that, no matter what value I put in "Compile for single frequency", timed loops that use this clock run at 120 MHz. Is this expected?
  2. I want to acquire at 10 MHz, not 120 MHz. Is there a way to create a Derived Clock from the Sample Clock? (Right-clicking on the clock doesn't give me the "New FPGA Derived Clock" option; one possible decimation workaround is sketched after this list.)
  3. In the FIDL example Acq Engine on 5734 PXIe-7962R.lvproj, "IO Module Clock 0" is configured to compile for "100 MHz" instead of 120 MHz. Is there any significance behind this value? (From #1, I gather that the value is ignored.)
  4. In the sample code below, I get "Error -61046 occurred at Read/Write Control" unless I exclude the "Reset" node from the host VI. This issue does not occur if I use the 40 MHz Onboard Clock instead of IO Module Clock 0 (although I would then get glitched data). Have I misconfigured something?
    1. In my actual, more complex program, I get the same error even with the node excluded if I stop and restart the host VI; the next attempt then succeeds.
    2. I've attached the sample project file, VIs and bitfile, in case they're useful.
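
On question 2: if a Derived Clock can't be created from the Sample Clock, one standard workaround is to run the acquisition loop at the full sample-clock rate and keep only every Nth sample, with a counter acting as a clock enable (120 MHz / 12 = 10 MHz). This is sketched in C for clarity only; on the FPGA it would be a counter feeding a case structure inside the sample-clock loop:

#include <stdint.h>

#define DECIMATION 12  /* 120 MHz loop rate / 12 = 10 MHz effective rate */

/* Call once per sample-clock tick; returns 1 on ticks whose sample
   should be kept and 0 on ticks to discard. */
static int keep_this_sample(void)
{
    static uint8_t count = 0;
    int keep = (count == 0);
    count = (uint8_t)((count + 1) % DECIMATION);
    return keep;
}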

 

Sample Clock - FPGA.png

Sample Clock - Host.png

[FlexRIO] Getting started with synchronizing multiple sample clocks


Hello,

 

Previously, I tried reading from two different FlexRIO cards (PXIe-7962R + NI-5734) in the "40 MHz Onboard Clock" or "PXI_Clk10" clock domains. Triggering was done by simply looking for a rising edge on PXI_Trig0:

Simple FlexRIO - FPGA Pack.png

 

 

This produced glitches, but there was no skew (or at least a constant skew) between the two FlexRIOs: I sent a duplicated pulse train into both cards, and the trigger-acquired waveforms were always in phase:

Pulses - In Phase.png

 

 

To prevent glitches, I switched to the sample clocks (IO Module Clock 0). Unfortunately, the sample clocks of the two FlexRIOs had nothing in common, so the acquired waveforms were no longer in phase. Even worse, the phase difference changes with every trigger:

Pulses - Out of Phase.png

 

 

Looking at the FIDL Synchronization library's implementation, the conventional technique for synchronizing multiple FlexRIO cards seems to revolve around master-slave synchronization (is my observation correct?). I was wondering: is there a way to simply share a common sample clock between the cards (like what the 40 MHz Onboard Clock was doing before), as described in http://www.ni.com/white-paper/11369/en/ ? (I think I understand the cons associated with Sample Clock Synchronization, but I'm willing to try it for now.)

 

Thanks in advance!

PCIe-8361 is not being detected in new computer (Gigabyte GA-Z97-HD3 / Win7)


We have decided to retire the old computer we used to run a custom calibration system based around a PXI-1033, as it was getting flaky and was running Windows XP. The new system is an i7 running Windows 7 Pro (64-bit). I have had no luck getting the PCIe-8361 detected by Windows on the new computer. The Power and Link LEDs on the PXI chassis are both green. Below is a rundown of what I've tried.

 

The system already had DAQmx 14 and Measurement Studio installed when I added the PCIe-8361. There was no notification that Windows had found new hardware, and there was no standard PCI-to-PCI bridge in Device Manager.

 

  • Ran the NI update utility
  • A forum post mentioned needing NI-VISA and NI-Serial, so I installed those two components
  • Removed the PCIe serial port card and moved the 8361 to that PCIe slot (I did not try the x16 PCIe slots, as documentation on the NI website recommended against doing so)
  • Updated the motherboard BIOS to the latest version
  • Tried various "Legacy" settings in the BIOS
  • Installed the MXI-Express BIOS compatibility software

I removed the BIOS compatibility software, or at least tried to; the installer encountered an error. I can boot into Windows with the 8361 removed from the computer.

Suspecting that the newness of the system might be the problem, I put the 8361 into my desktop, an older 2012 Dell Vostro 470 that already had DAQmx installed. The 8361 was picked up: Device Manager showed the PCI-to-PCI bridge, and MAX showed the PXI-1033.

 

The original system was a Core 2 Duo with Windows XP.

 

New system specs:

  • Core i7-4790
  • Gigabyte GA-Z97-HD3 rev 1.0 motherboard (Intel 9 Series chipset)
  • 16 GB RAM
  • Samsung 840 EVO 250GB HDD
  • Windows 7 Pro 64bit

 

Dell Vostro 470

  • Core i5-3450
  • 4 GB RAM
  • Intel 7 Series chipset based motherboard
  • Normal 500GB Western Digital HDD

The PCIe-8361 part number is 195315C-01L.

 

I’ve attached screenshots of MAX and the device manager sorted by connection type.

 

Are there any known problems with the older PCIe-8361 cards and Intel 9 Series chipsets?  Anything else I can try?

The obvious solution is to just exchange my desktop for the new system, but that would require setting up the database and application and migrating all the data again. I would still have to set up the new computer for myself, and that would likely involve getting IT to change permissions around for my user account.

 

Thanks,

 

Adam


PCI/PXIe-2532B matrix expansion ribbon cable: solutions for intermittent connections due to loose cables


In our PXI matrices we have seen a lot of issues with the row cables coming loose and causing failures. Even though these are simple to fix, it is not fast, and we believe the strain relief is not doing its job.

We have a tester cabinet where the door, when closed, sits too close to the ribbon cables, so it could be pushing on them.

 

Has anybody encountered this issue, and if so, how did you solve it? We have been thinking about using some type of glue to hold the connectors in place; please advise if NI has a recommended type or brand.

Xilinx FIR Filter 5.0 does not work (RFD always false)


Hello,

I am configuring a fractional decimation filter. I calculated the taps using MATLAB and created a valid .coe file. Using the Xilinx FIR Compiler (v5.0), I configured the filter appropriately (the code is attached). When I compile and run, the FIR filter's "rfd" output is always FALSE, indicating that the core is never ready to accept new data.

 

This behavior (RFD always being false) holds true whether I run top_level.vi in simulation mode or compile it and run it on the actual hardware.

 

Filter details:

Fixed fractional decimation filter, configured as follows:

a) Interpolation rate: 10

b) Decimation rate: 13

 

I have created a dummy project just to reproduce the problem, and attached it here.

NI software: LabVIEW FPGA Module version 2014
NI hardware: FlexRIO device PXIe-7966R

LabVIEW version: 2014

 

Thanks,

Aditya

PXI VST 5645R gain/attenuation calibration, RX/TX chain


Hello,

 

I am working with the PXI VST 5645R. I am doing some research with the RF output of the transmitter connected to the RF input of the receiver.

In my application it is important to know the equivalent attenuation and gain from generation (the DAC output) to acquisition (the ADC input), i.e. the loop gain.

I looked at http://zone.ni.com/reference/en-XX/help/373680C-01/vstdevices/5645_analog_input/ and http://zone.ni.com/reference/en-XX/help/373680C-01/vstdevices/5645_analog_output/ in order to better understand how my channels are structured.

I also looked at how these parameters are controlled and set in the LabVIEW design (for example, VST Streaming (Host)). I found the transmitter gain configuration in the "LO_cal" block, but I did not see any calculation for the transmitter's various attenuations. I still have to check this data for the transmitter.

How are all these parameters normally controlled and set in the transmitter and receiver?

I guess the receiver chain attenuates the signal in order to use the maximum dynamic range of the ADCs and the correct input power of the demodulator. Does the calibration change the gain/attenuation values in the receiver chain every time I use a different gain (peak power in dBm) in the transmitter?
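
For what it's worth, the quantity in question composes simply: cascaded stages multiply in linear terms, so gains and attenuations add in dB, and the loop gain is the sum over the whole DAC-to-ADC path. A sketch with made-up stage values (all numbers are hypothetical):

#include <stdio.h>

int main(void)
{
    /* TX DAC -> TX chain -> RF out -> cable -> RF in -> RX chain -> ADC */
    const double stages_dB[] = { +20.0, -3.0, -0.5, -10.0, +15.0 }; /* hypothetical */
    const int n = (int)(sizeof stages_dB / sizeof stages_dB[0]);
    double loop_dB = 0.0;
    for (int i = 0; i < n; i++)
        loop_dB += stages_dB[i];
    printf("loop gain: %+.1f dB\n", loop_dB); /* +21.5 dB for these values */
    return 0;
}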

 

Thanks in advance

 

Best regards

 

Giuseppe

 

PXIe 8361 gives Code 12 in Device Manager or Windows does not boot


Currently I have a setup with a Dell Optiplex 7010 (an i7 desktop PC), a PXIe-8361 MXI controller and a PXI-1033 chassis. I have the PXIe card in the x1 PCIe slot (the smallest).

No problems whatsoever; the setup has been running for over a year now.

 

However, we have just received the parts for our second setup. This involves the same PXIe card and PXI-1033 chassis, but a newer-generation PC (in this case a Dell Optiplex 7020).

I did the same steps as on our first setup, but this time the PC does not boot into Windows with the PXI chassis turned on.

When I turn the PXI chassis on after Windows has booted, I get a PCI-X bridge controller in Device Manager, but it does not work properly.

The problem is reported as Code 12, not enough resources.

 

I have tried numerous things to resolve this, including the NI BIOS compatibility software, driver updates, a reinstall of the complete system, et cetera.

Eventually I got desperate and noticed that the small PCIe (x1) card also fits in the large (x16) slot. Figuring that if a connector fits, it won't break, I gave this a shot.

 

Result: my PXI chassis is detected, including all PXI cards!

 

Hope this helps you!

 

Jeroen

PXIe-1082 LabVIEW RT power control


I have a PXIe-1082 with LabVIEW RT embedded deep within a rack. The power switch is difficult to reach.

 

Can the power simply be cut and reapplied without corrupting the operating system and files? It can be assumed, of course, that I would stop any running VIs prior to removing power.

 

If there is a remote power control capability (not software based), that would also be an option. The remote inhibit function doesn't look (from reading the user manual) like it acts the same as the front power switch (safe shutdown).

 

Thanks!

 
