OK, when hooking a sensor output up to an A/D input, we want to minimize the ratio of RMS noise to the LSB value. My first thought is that we want a sensor with a high-voltage output and an A/D with a high-voltage input range, so that the LSB sits well above the noise floor. Stand-alone A/Ds usually have nice wide input ranges (e.g., the Maxim MAX1271 with a +/-10 V input), but A/D inputs on modern DSPs and microcontrollers are usually +5 V, and the really modern, attractive parts are in the +2.5 V range (Analog Devices ADuC7x series, Cypress PSoC), which initially looks unattractive as far as the A/D is concerned.

However, looking at the Maxim datasheet, although the part takes a +/-10 V input, the internal reference is only 2.5 V, which is then scaled up internally to 4.096 V. Secondly, many sensors with high-voltage outputs simply have an internal final-stage amp that scales the output voltage up.
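To put numbers on these ranges, here is a quick back-of-the-envelope sketch (my own, assuming 12-bit resolution throughout, since the MAX1271 is a 12-bit part; only the spans come from the parts mentioned above). With N bits, 1 LSB = span / 2^N:

#include <stdio.h>

/* LSB size in millivolts for a given full-scale span and bit count. */
static double lsb_mv(double span_volts, unsigned bits)
{
    return 1000.0 * span_volts / (double)(1UL << bits);
}

int main(void)
{
    printf("+/-10 V span (20 V), 12 bits: %.3f mV/LSB\n", lsb_mv(20.0, 12));
    printf("0-4.096 V span,      12 bits: %.3f mV/LSB\n", lsb_mv(4.096, 12));
    printf("0-2.5 V span,        12 bits: %.3f mV/LSB\n", lsb_mv(2.5, 12));
    return 0;
}

Note that the 4.096 V span over 12 bits comes out to exactly 1.000 mV per count, which is partly what prompts question 2 below.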
So my questions are:

1) Are those 2.5 V A/Ds really at a disadvantage compared to the wide-range, stand-alone Maxim-type A/Ds, given that the Maxim parts really work off a 2.5 V reference anyway? I now think the only purpose of the wide-range inputs is to eliminate the need to externally rescale large inputs, not to decrease the RMS-noise-to-LSB ratio (which is what I thought at first glance).

2) When Maxim scales the 2.5 V reference up to 4.096 V, what is the purpose of that? I would think the amp would amplify the reference noise too (unless it's a differential amp). Or maybe they just want 1 count = 1 mV?

3) Sensors with 5 V outputs are typically just scaled up to 5 V internally and, with external parts, can be rescaled down to 2.5 V. If I do that, am I hurting the noise/LSB ratio? I would think that if they are scaling up the output, they are also scaling up the noise (again, unless they are using differential amps).

If there is an application note somewhere about this, please let me know!

thanks
steve
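P.S. In case it clarifies what I'm asking in question 3, here is the toy model I have in my head, with completely made-up numbers (the sensor figures are hypothetical, not from any datasheet): a final gain stage multiplies the sensor's signal and its noise by the same factor, so as long as the ADC span is scaled along with the signal, the noise/LSB ratio comes out identical either way.

#include <stdio.h>

int main(void)
{
    const double v_noise_rms = 100e-6; /* hypothetical noise at the sense element, V RMS */
    const unsigned bits = 12;

    /* Case A: internal amp scales the output to a 5 V span, read by a 5 V ADC.
       Case B: same sensor rescaled to a 2.5 V span, read by a 2.5 V ADC. */
    const double gain[2] = { 5.0, 2.5 };
    const double span[2] = { 5.0, 2.5 };

    for (int i = 0; i < 2; i++) {
        double lsb = span[i] / (double)(1UL << bits);
        double noise_in_lsb = (v_noise_rms * gain[i]) / lsb;
        printf("gain %.1f into %.1f V span: noise = %.3f LSB\n",
               gain[i], span[i], noise_in_lsb);
    }
    return 0;
}

Both cases print the same noise in LSBs, i.e., in this model the rescale is noise-neutral. What would actually change the ratio is any noise added by the extra amp stage itself, which is really what I'm asking about.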