I have an AD594 temperature sensor (~10 mV/°C output) connected to the 10-bit ADC on an AVR. The sensor output will be -375 mV to +1015 mV over the temperature range I'm interested in (-40 °C to +100 °C). The ADC only accepts positive analog input, so I'll need to bias the signal 'up' a little to get it above 0 V. If possible, the solution should be repeatable in production and not require individual calibration.
Specs: I'm using a TC7660 or TCM829 for the negative supply to the AD594, and the internal 2.56 V Vref of the AVR ADC, which is exposed on a pin. The AD594 can source 5 mA on its output. The ADC has 100 MΩ input resistance.
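In case it helps, here is the back-of-envelope scaling I'm working from, written as a small C program I can compile on the PC (not AVR firmware). The 10 mV/°C slope, 2.56 V reference, and 10-bit resolution come from the specs above; the rest is just arithmetic:

```c
/* Back-of-envelope scaling check (plain C, compile on a PC).
 * Figures taken from the specs above; nothing AVR-specific here. */
#include <stdio.h>

#define VREF_MV          2560.0   /* internal 2.56 V reference */
#define ADC_COUNTS       1024.0   /* 10-bit ADC                */
#define SENSOR_MV_PER_C    10.0   /* AD594 output, ~10 mV/degC */

int main(void)
{
    double lsb_mv = VREF_MV / ADC_COUNTS;            /* one ADC step       */
    double degc_per_lsb = lsb_mv / SENSOR_MV_PER_C;  /* temp resolution    */
    double span_mv = 1015.0 - (-375.0);              /* sensor output span */

    printf("1 LSB       = %.2f mV\n", lsb_mv);                 /* 2.50 mV   */
    printf("resolution  = %.2f degC per LSB\n", degc_per_lsb); /* 0.25 degC */
    printf("sensor span = %.0f mV of the %.0f mV ADC range\n",
           span_mv, VREF_MV);                                  /* 1390/2560 */
    return 0;
}
```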
Googling came up with the following suggestions: #1) Add a low-voltage reference in series with the temperature sensor.
#2) Connect the output of the temp sensor to the ADC through a 1k resistor, and connect another 1k resistor between the reference voltage and the ADC input. This may not work if the temp sensor will not sink enough current. (I've worked the resulting voltages through in the sketch after this list.)
#3) Many ADCs have voltage reference outputs that can be used to bias the analog input(s) of the ADC. This is done by connecting a resistor from the ADC reference output to the analog input. The voltage reference output can be bypassed to analog ground with a small capacitor to improve ripple rejection. The bias resistor value can be selected based on the ADC input leakage current: choose a value such that the maximum ADC input leakage current alone causes less than 1 LSB of voltage drop across the bias resistor. That way the "offset code" of the ADC will not be overly dependent on the input bias current.
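To compare #2 and #3 with actual numbers, here is my attempt at the math, again as a quick PC-side C sketch rather than firmware. For #2 I've assumed the two equal 1k resistors as suggested; for #3 the 1 µA worst-case ADC input leakage is purely a guess on my part until I dig it out of the AVR datasheet:

```c
/* My attempt at the numbers for suggestions #2 and #3 (PC-side check).
 * Assumptions: equal 1k resistors for #2; a guessed 1 uA worst-case
 * ADC input leakage for #3 -- needs checking against the AVR datasheet. */
#include <stdio.h>

#define VREF_MV        2560.0              /* internal 2.56 V reference  */
#define LSB_MV         (VREF_MV / 1024.0)  /* 2.5 mV per count           */
#define SENSOR_MIN_MV  (-375.0)
#define SENSOR_MAX_MV  1015.0
#define LEAK_UA_MAX       1.0              /* assumed worst-case leakage */

int main(void)
{
    /* #2: equal resistors from the sensor and from Vref to the ADC pin.
     * With the ADC input at 100M, the pin sits halfway between the two
     * sources, so the temperature gain is halved. */
    double v2_min = (SENSOR_MIN_MV + VREF_MV) / 2.0;   /* ~1093 mV  */
    double v2_max = (SENSOR_MAX_MV + VREF_MV) / 2.0;   /* ~1788 mV  */
    double mv_per_c = 10.0 / 2.0;                      /* 5 mV/degC */

    printf("#2: ADC sees %.0f..%.0f mV, %.1f mV/degC (%.2f degC per LSB)\n",
           v2_min, v2_max, mv_per_c, LSB_MV / mv_per_c);

    /* #3: pick the bias resistor so worst-case leakage drops < 1 LSB. */
    double r_max_ohm = (LSB_MV / 1000.0) / (LEAK_UA_MAX * 1e-6);
    printf("#3: R_bias < %.0f ohm keeps the leakage offset under 1 LSB\n",
           r_max_ohm);
    return 0;
}
```

If the #2 numbers are right, the halved gain costs me resolution (about 0.5 °C per LSB instead of 0.25 °C), which is part of why I'm asking whether one or two resistors is the better approach.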
#1 seems straightforward enough; however, #2 only requires two resistors and #3 only one.
- Is one resistor or two the better solution?
- #3 gives specifics on selecting the resistor value, but I'm not sure I've got it right (my attempt is in the sketch above). Any help?