how to interface a pressure meter?

Hi all,

I hoped someone might be able to give me some advice about how to interface a pressure meter I have borrowed. It is a Hopkins Innovex 600D. I can't find a data sheet, but I don't think that matters, as it must operate according to some simple principles which someone here will probably be able to tell me. It just has three wires, for output, supply, and ground. The ratings written on it are: supply 12V, range 0-2 inches WC, and output 4-20 MA.

The bit I don't understand is this - when I have dealt with similar sensors before, the output has been a voltage range, but I guess that in this case MA means milliamps, i.e. a current. I connected 12V across the ground and supply, and put my voltmeter across the ground and the output. When I changed the pressure a lot, the voltage only changed between 6.13 and 6.15 V. That seems very insensitive, and given that the output is specified as a current rather than a voltage, I wonder if I should be using some other method to measure it, i.e. one that lets current flow. But I'm not sure how.

If anyone can give me some advice I would very much appreciate it!

Cheers,

Ben

Reply to
amorphia

Ben

There are pressure transmitters and pressure transducers. Transmitters are voltage devices while transducers are amp devices. The advantage of the transducer is that on longer wiring runs you do not get a voltage drop due to the increased wiring resistance as the length of the run increases. The 4-20 mA is the standard operating range for these devices. Typically, the input voltage can be 0-12 or 0-24 volts. I am not familiar with the Hopkins, but most of these units are designed for a straight-line response of amps vs. pressure - i.e. 4 mA = 0 pressure and 20 mA = top end of the rated range. Most of these units have a safety factor of 2 to 3 to prevent damage due to overpressure events. Given the model number, is it possible that the top end of the range is 600 PSI?

The other issue is whether it is a gauge pressure or absolute pressure transducer. If it is gauge pressure, you should see a plastic vent tube to the sensor so that the readings are normalized to atmospheric pressure. The result is that current atmospheric pressure conditions would read as 0 psi, or 4 mA.

You should see a change in output when you set your multimeter to read current and put it in series with the output lead.

Alternately, to get output in voltage you would need to run an appropriately sized resistor across the output and supply leads. To size the resistor we typically use a high-quality potentiometer and adjust until we get the proper voltage output. Seems to me the resistor sizes out to the 7500 Ohm or is it 7500K Ohm range. :)
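
The Ohm's law check is V = I x R, so a 250 ohm resistor would put 4-20 mA at 1-5 volts; something up in the thousands of ohms would need more compliance voltage than a 12 volt supply can give at 20 mA.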

I suspect what you are measuring is the input voltage. To verify, check your power source voltage at the source; it should read the same as what you are measuring at the sensor. Note it is very important that your ground is run to a ground common to the power source.

I hope this helps.

Steve

Reply to
WaterSteve

4-20 mA means that the span from 0 to 2 inches of water column pressure is represented, linearly, by an output current that ranges between 4 and 20 milliamps. This kind of output is an industry standard that eliminates any errors from wiring resistance. Typically, the current is converted to a voltage at the receiving end by passing it through a precision resistor, often 250 ohms, which gives a 1 to 5 volt range.

The output stage is trying to regulate the current, and you are connecting such a high resistance (the voltmeter) that it saturates at its full output voltage in the attempt to drive that current through the load. Connect a 250 ohm resistor from output to ground and measure the voltage across that load.
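
For the record, the arithmetic at the receiving end is just a linear rescaling. A rough sketch in Python, assuming the 250 ohm resistor above and the 0-2 inch WC range printed on your unit:

SENSE_R_OHMS = 250.0              # resistor from output to ground
I_MIN_MA, I_MAX_MA = 4.0, 20.0    # transmitter output span
P_MIN_WC, P_MAX_WC = 0.0, 2.0     # pressure span, inches of water column

def pressure_from_volts(v_measured):
    """Convert the voltage across the sense resistor to inches WC."""
    i_ma = v_measured / SENSE_R_OHMS * 1000.0          # volts -> milliamps
    if i_ma < 3.5:
        raise ValueError("under 4 mA - open loop or faulty sensor?")
    fraction = (i_ma - I_MIN_MA) / (I_MAX_MA - I_MIN_MA)
    return P_MIN_WC + fraction * (P_MAX_WC - P_MIN_WC)

# 3.0 V across 250 ohms is 12 mA, i.e. mid-scale, so this prints 1.0
print(pressure_from_volts(3.0))

The check against 3.5 mA is just a crude fault flag; a healthy 4-20 mA device never sits below 4 mA, so a reading down there usually means a broken wire.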

Do you want to connect this output to some computer system, or just add a remote output meter to it? If the latter, there are commercial products that include arbitrary scaling.

E.g.

formatting link
formatting link

Reply to
John Popelish

Thanks very much Steve and John for your very helpful posts!

I have often wondered what the difference between a transmitter and a transducer was, and now I know.

I've tested putting a resistor between ground and output, and measuring the voltage across it, and it is working nicely. Now I will just connect it to my A to D converter and get the reading into my PC, which is exactly what I need.
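
In case it is useful to anyone else, the PC-side conversion should only amount to a few lines. A sketch in Python, where read_adc_volts() is just a stand-in for whatever read call my A/D converter's driver actually provides:

import time

SENSE_R_OHMS = 250.0   # same resistor between output and ground

def read_adc_volts():
    # placeholder: pretend mid-scale until the real driver call goes in here
    return 3.0

while True:
    volts = read_adc_volts()                     # voltage across the resistor
    milliamps = volts / SENSE_R_OHMS * 1000.0
    inches_wc = (milliamps - 4.0) / 16.0 * 2.0   # 4-20 mA onto 0-2 inches WC
    print(f"{inches_wc:.3f} inches WC")
    time.sleep(1.0)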

Cheers!

Ben

Reply to
amorphia

Ahhh, I think you have the terminology reversed: a transducer is a sensor or other device that converts from one variable to another, typically from "something" (like pressure) to voltage. It's the raw, native conversion device. It would still be called a transducer if it converted pressure to current, or watt-seconds to coulombs, or anything to anything else.

A transmitter, on the other hand, is a circuit that has a transducer (often internally) as its input and "reformats" the output such that it is a current in the 4-20 mA range. So zero output from the transducer will give 4 mA from the transmitter output, and full-scale from the transducer will give 20 mA from the transmitter output.
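
For Ben's unit, then: 0 inches WC should read 4 mA, 1 inch WC (half scale) should read 4 + 0.5 x 16 = 12 mA, and 2 inches WC should read 20 mA - which across John's 250 ohm resistor come out to 1, 3, and 5 volts.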

You can remember the difference by considering the names themselves: A transducer is for changing the variable, while a transmitter is for sending it. The standard 4-20 mA system is used for its high noise immunity, compared to trying to send millivolt or microvolt signals directly from a transducer.

Hope this helps!

Bob Masta
DAQARTA
Data AcQuisition And Real-Time Analysis

formatting link
Scope, Spectrum, Spectrogram, Signal Generator
Science with your sound card!

Reply to
Bob Masta
