Questions about optimizing sensor outputs to DSP/microcontroller A/D inputs

Ok, when hooking up a sensor output to an A/D input we want to minimize the ratio of RMS noise to the LSB value. I would think we want a sensor with a high voltage output and an A/D with a high voltage input range, so that the LSB is well above the noise floor. Stand-alone A/Ds usually have nice high input ranges (e.g., the Maxim MAX1271 with a +/-10 V input), but many modern DSP and microcontroller A/D inputs are 0 to +5 V, and the really modern and attractive ones are in the 0 to +2.5 V range (Analog Devices ADuC7xxx series, Cypress PSoC), which initially looks unattractive as far as the A/D is concerned.

However, looking at the Maxim datasheet, although it takes a +/-10 V input, the internal reference is only 2.5 V, which is then scaled up internally to 4.096 V. Secondly, many sensors with high voltage outputs simply have an internal final-stage amp that scales the output voltage up. So my questions are:

1) Are those 2.5 V A/Ds really at a disadvantage compared to the wide-range stand-alone Maxim-type A/Ds, since those really work off 2.5 V references anyway? I think the only purpose of the wide-range inputs is to eliminate the need to externally rescale large inputs, not to decrease the RMS-noise-to-LSB ratio (which is what I thought at first glance).

2) When Maxim scales the 2.5 V reference up to 4.096 V, what is the purpose of that? I would think that the amp would amplify the noise too (unless it's a differential amp). Or maybe they just want 1 count = 1 mV?

3) Sensors with 5 V outputs are typically just scaled up to 5 V internally and, with external parts, can be rescaled to 2.5 V. If I do that, am I hurting the noise/LSB ratio? I would think that if they are scaling up the output they are also scaling up the noise (again, unless they are using differential amps).
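
For reference, a quick back-of-the-envelope sketch (Python) of what the LSB works out to at each of these full-scale ranges, assuming 12-bit converters throughout:

# LSB size for a 12-bit converter at different full-scale input
# ranges (the ranges are just the examples mentioned above).
N_BITS = 12
ranges_v = {
    "+/-10 V (MAX1271-style)": 20.0,
    "0..4.096 V": 4.096,
    "0..2.5 V (on-chip ADC)": 2.5,
}

for name, full_scale in ranges_v.items():
    lsb = full_scale / 2**N_BITS
    print(f"{name:28s} LSB = {lsb * 1e6:7.1f} uV")

Whether the bigger LSB actually helps depends on how much the front-end scaling amplifies the noise along with the signal, which is exactly the question.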

If there is an application note somewhere about this, let me know please! thanks steve

Reply to
steve

formatting link

Look at number of bits vs supply voltage.

Reply to
Ian Stirling

Hi Steve,

I'm not an expert, but you usually want to scale your analog input so that its full-scale value matches the full-scale value of the A/D. That way you use the full dynamic range of the converter.

Also, note that gain scaling at this stage (pretty much near the end of the analog processing chain) usually doesn't affect the analog SNR too much. (Rather, it is the early gain stages that degrade SNR, or noise figure, the most.) This is discussed, e.g., at

formatting link

Finally, note that if the analog SNR is worse than the digital SNR (i.e., about 6N dB, where N is the number of bits in the A/D), then that's OK - the lower bits will just be toggling with the noise. I.e., you STILL want to match the full-scale analog to the full-scale A/D input level, otherwise you'd be degrading the SNR needlessly.
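
A rough sketch of that 6N dB rule of thumb (for an ideal N-bit converter with a full-scale sine input the exact figure is 6.02N + 1.76 dB):

# Ideal quantization SNR of an N-bit ADC driven by a full-scale
# sine: SNR_q = 6.02*N + 1.76 dB.  If the analog SNR at the ADC
# input is worse than this, the noise just toggles the lower bits.
def quantization_snr_db(n_bits: int) -> float:
    return 6.02 * n_bits + 1.76

for n in (8, 12, 16):
    print(f"{n:2d} bits: ideal SNR = {quantization_snr_db(n):5.1f} dB")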

Hopefully there was something here you didn't already know.

--Randy

bungalow snipped-for-privacy@yahoo.com (steve) writes:

--
%  Randy Yates                  % "I met someone who looks alot like you,
%% Fuquay-Varina, NC            %             she does the things you do, 
Reply to
Randy Yates

Somehow your opinion seems plausible, but it is important to consider the contribution of each individual component to your accuracy. Start with the A/D itself. It has a specified accuracy, including the effect of the reference voltage scaler and other factors (temperature range), which is expressed as, for example, +/-1 LSB, which in your case would be 2 mV. Now your sensor also has a certain noise level, which is amplified by the signal conditioning. Let's say the voltage noise is 500 nV/sqrt(Hz) at the 0...4.096 V output. With a bandwidth of 10 kHz this would be 50 uVrms; the peaks might be 6 times higher, which is still only 0.3 LSB. So up- and downscaling does not have much impact, because the converter and sensor noise is dominant.
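
In Python, the same arithmetic (noise density and bandwidth as assumed above):

import math

# 500 nV/sqrt(Hz) white noise over a 10 kHz bandwidth, compared
# against a 1 mV LSB (4.096 V full scale, 12 bits).
noise_density = 500e-9            # V/sqrt(Hz)
bandwidth_hz = 10e3
lsb = 4.096 / 2**12               # = 1 mV

v_rms = noise_density * math.sqrt(bandwidth_hz)   # 50 uV rms
v_peak = 6 * v_rms                                # ~6-sigma peaks

print(f"rms noise = {v_rms * 1e6:.0f} uV")
print(f"6x peaks  = {v_peak * 1e6:.0f} uV = {v_peak / lsb:.1f} LSB")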

If you do not use standard exchangeable sensors with a +/-5 or 10 V range, you are better off getting an unconditioned sensor and building a dedicated amp + filter to drive the 0...+2.5 V analog input. This can be done from a 3.3 V supply, which has become standard by now. You can use modern low-voltage CMOS parts; they need some input protection, as shown in the datasheet.

--
ciao Ban
Bordighera, Italy
Reply to
Ban

The wide-range parts (+/-10 V) tend to be the older parts. Newer parts tend to target the lower voltages used in battery-powered or high-speed circuits. The noise-to-LSB ratio is not a particularly useful way of looking at these things. See below.

Don't worry about what they do with the reference. Worry about your error budget.

Your system design should start with a specification of the overall performance. From there you can begin choosing parts and calculating your error budget. Broadband noise is only one part of that budget. Sample rates, gain and offset drift, and linearity are also important (or not, depending on what you are trying to do). Sometimes noise > 1 LSB is a good thing (google "dither"). Don't forget to analyze the effect of any averaging or other digital filtering on your signal.
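
As a toy sketch of such a budget (every number below is a made-up placeholder; substitute your own datasheet figures), combining independent sources by root-sum-of-squares, which is a common if rough shortcut:

import math

# Toy error budget for a 12-bit, 2.5 V converter.  All source
# magnitudes are placeholders, not real datasheet values.
lsb = 2.5 / 2**12                 # ~610 uV

errors_v = {
    "ADC INL + offset (1 LSB)": 1.0 * lsb,
    "reference drift over temp": 0.5 * lsb,
    "sensor + amp noise (rms)": 50e-6,
}

total = math.sqrt(sum(e**2 for e in errors_v.values()))
for name, e in errors_v.items():
    print(f"{name:28s} {e * 1e6:7.1f} uV")
print(f"{'RSS total':28s} {total * 1e6:7.1f} uV = {total / lsb:.2f} LSB")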

There are lots of application notes. The key word here is "application". All of these issues depend on what the *application* is. So, what's your application?

Bob

Reply to
Bob

The question was application independent, namely, how to optimize the sensor-to-A/D interface for best noise performance.

I "worry" about the internal workings of a chip because it gives you insight concerning how the worse case specs were derived, and more importantly, how to exploit them to create a product that is lower in cost and higher in performance then the competition.

Yes, I am sure that if I blindly pick parts based solely on the requirement spec and then verify the overall design via a worst-case tolerance analysis, I will develop a reliable, compliant product, but it may be a product that is so costly, heavy, power hungry, or impractical that no one would buy it. For instance, the dither technique you mentioned (oversample/average), when implemented, may well satisfy an overall performance requirement, but if I have to upgrade to a 64x faster A/D and increase the DSP clock by 4x to implement it, I would consider that a poor design if I could have just fixed the cause of the problem in the first place (i.e., the A/D-to-sensor mismatch).
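
For what it's worth, the rule-of-thumb arithmetic behind that 64x figure (it assumes the noise is white and at least on the order of an LSB): averaging 4^k samples gains about k bits, so 64x oversampling buys 3 extra bits:

import math

# Oversampling/averaging rule of thumb: each 4x increase in sample
# rate (followed by averaging/decimation) gains one bit, i.e.
# extra_bits = 0.5 * log2(oversampling ratio).
def extra_bits(oversampling_ratio: float) -> float:
    return 0.5 * math.log2(oversampling_ratio)

for osr in (4, 16, 64):
    print(f"{osr:3d}x oversampling -> +{extra_bits(osr):.1f} bits")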

steve

Reply to
steve

What is the output impedance of the sensor? If both the output voltage and the impedance are high, then the scaling would have to be done with a voltage divider using high-value resistors. In extreme cases the voltage noise from the voltage divider can be significant.
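
A quick sketch of that divider noise, using the Johnson noise formula v_rms = sqrt(4kTRB) (the 100 kohm Thevenin resistance and 10 kHz bandwidth below are assumptions for illustration):

import math

# Thermal (Johnson) noise of the divider's Thevenin resistance.
k_B = 1.38e-23        # Boltzmann constant (J/K)
T = 300.0             # temperature (K)
R = 100e3             # Thevenin resistance of the divider (ohms)
B = 10e3              # noise bandwidth (Hz)

v_rms = math.sqrt(4 * k_B * T * R * B)
print(f"thermal noise = {v_rms * 1e6:.1f} uV rms")  # ~12.9 uV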

Also look at the input structure of the intended ADC. If it is some kind of switched-capacitor type, the input impedance is not constant but varies during each clock cycle, so a high-resistance voltage divider can be a problem.
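
One way to see the problem: the divider's source resistance must charge the ADC's sample capacitor to within 1/2 LSB during the acquisition window. A sketch with illustrative numbers (none of these are specs of any particular part):

import math

# Settling requirement for a switched-capacitor ADC input:
# t_needed = R * C * ln(2**(N+1)) to settle within 1/2 LSB.
n_bits = 12
r_source = 10e3       # divider Thevenin resistance (ohms)
c_sample = 15e-12     # sample capacitor (F)
t_acq = 500e-9        # acquisition time (s)

t_needed = r_source * c_sample * math.log(2**(n_bits + 1))
verdict = "OK" if t_needed < t_acq else "too slow"
print(f"needed {t_needed * 1e9:.0f} ns, "
      f"available {t_acq * 1e9:.0f} ns -> {verdict}")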

What is the needed signal bandwidth? With a high-resistance voltage divider, the stray capacitance from the ADC input to ground will attenuate high frequencies. In this case a capacitor is needed across the upper resistor, or even a proper capacitive voltage divider in parallel with the resistors (as in oscilloscope probes), to flatten the frequency response. However, a capacitive voltage divider will also load the sensor at higher frequencies, so the sensor behaviour should be checked in these cases as well.
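
The compensation condition is R1*C1 = R2*C2, the same as in a scope probe. A sketch with arbitrary assumed values:

# Frequency-compensated divider: the ratio stays flat over
# frequency when R1*C1 == R2*C2 (values below are assumptions).
r1, r2 = 300e3, 100e3     # upper / lower divider resistors (ohms)
c2 = 30e-12               # stray capacitance at the ADC input (F)
c1 = r2 * c2 / r1         # required compensation cap across R1

ratio_dc = r2 / (r1 + r2)
print(f"DC ratio = {ratio_dc:.3f}, "
      f"compensation C1 = {c1 * 1e12:.0f} pF")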

Of course, these are not very real issues if the sensor output impedance is low.

What is the distance between the sensor and the ADC? Is the distance long, or is there a risk of interference coupling into the lines, or of ground potential differences? Assuming the sensor output impedance is low and the output voltage swing is large, placing the voltage divider close to the ADC would be a good thing (compared to using a sensor with a small voltage swing), since any interfering voltage would also be attenuated. Select the voltage divider chain's total resistance to be the lowest value the sensor output supports.

Paul

Reply to
Paul Keinanen

Hi, the sensor will be within an inch of the A/D, worst case. I don't expect significant ground potential differences; the A/D is a switched-capacitor type; the only output spec I have on the sensor is that it can drive a 1000 pF load; the needed bandwidth is 1 kHz. I'll soak in your suggestions, thanks.

Reply to
steve

...

If I had to connect such a sensor to an ADC, the first step in my design would certainly be to separate the sensor driver from the ADC input stage. I would probably place some sort of op-amp at the board input, which connects to the sensor cable and provides an output signal that is independent of sensor specs like capacitive load. Then I'd design an input stage before the ADC which fits the specs of the ADC and which guarantees the necessary low-pass behaviour to avoid aliasing issues in the converted signal. Either I would insert another decoupling op-amp stage between the two parts, or (if cost or space requires) I would directly connect the two parts with a resistor, keeping the connection for further use as a test point.

If you directly couple the sensor to the ADC input, you'll probably see problems which could be avoided by my proposed design approach.

Based on a proper design, you'll certainly find (as the other posters said) that the voltage range or ADC selection doesn't have much influence on the noise behaviour.

Analogue noise can be minimized in the input stage (sensor driver), and you'll certainly not find it down at the LSB of your signal (from your mention of the MAX1271 I guess you are talking about 12-bit-class ADCs).

Nevertheless, you'll find noise of more than one LSB. There are several methods to remove or at least reduce it, most popularly by filtering the digital signal (averaging et al.). If this is not enough, and if your sensor signal doesn't have enough noise of its own (is too smooth), dithering will help to further enhance your signal.

Tektronix uses such techniques in their digital oscilloscopes: with an 8-bit ADC they achieve 14-bit precision.
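
A quick numeric demo of the dither-plus-averaging effect (the constant input and the 1-LSB Gaussian noise level are assumptions for illustration):

import random

# A constant signal sitting between two codes: without noise the
# average sticks at one code; with about an LSB of noise the
# average of many samples converges to the true value.
LSB = 1.0
true_value = 3.3 * LSB
N = 100_000

def quantize(x):
    return round(x / LSB) * LSB

no_dither = sum(quantize(true_value) for _ in range(N)) / N
dithered = sum(quantize(true_value + random.gauss(0, LSB))
               for _ in range(N)) / N

print(f"true = {true_value}, no dither -> {no_dither}, "
      f"dithered -> {dithered:.3f}")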

If there is a real (or another) issue that makes you think you need a special voltage range or special treatment to reduce RMS noise, maybe you should share the background / your experiences. I'm sure lots of ideas are lurking in some heads...

Bernhard

Reply to
Bernhard Holzmayer

The sensors I am working with (accelerometers/gyros) have buffered outputs intended to connect directly to A/D inputs, and they also have variable bandwidth options (set with external passive components), so I don't need the anti-aliasing filter. I assume the resistor you're talking about is for current limiting to protect the A/D.

Yes, that seems to be what I am hearing.

I don't think there is. steve

Reply to
steve

No. If it's a proper design and the sensor connection is permanent, there's no need to protect the A/D. If there's a socket/connector pair on a removable cable, that might be a consideration.

This resistor is only something I've become used to, and it's just for convenience: I never directly connect two parts of a design unless it's unavoidable.

If there's a resistor, it can easily be removed, which results in two isolated, testable parts of my circuit. Without a resistor, isolating the two parts might be difficult without damaging the board.

If I have the choice, I insert an 0805 SMD resistor of 0 ohms for this purpose. It can easily be removed with a soldering iron, and later on even a spot of solder would repair it.

Bernhard

Reply to
Bernhard Holzmayer
