Noise created by resistor used to reduce op amp input offset

Hello--

To preserve bandwidth and slew rate I've selected a high-speed op amp with low wideband voltage noise and current noise. The 1/f voltage noise is only 7 nV/Hz^(1/2) at 10 Hz, which makes the part suitable for low frequency signals extending down to DC.

However, the input offset voltage is typically 40 microvolts (300 microvolts max), which may need to be nulled since the op amp is being used in a non-inverting configuration with high voltage gain.

On my schematic, the op amp is wired in a non-inverting configuration: the feedback resistor runs from the output to the inverting (negative) input, and a second resistor runs from that input to ground. I've also connected a 1 Megaohm resistor to the inverting input; its other terminal goes to the output of a 16-bit buffered DAC.

The idea is that the DAC will be used to adjust the voltage at the negative input, thereby nulling the offset voltage without using a trim pot. After shunting the positive input of the op amp to GND via a CMOS switch and load resistor, I'll use a microcontroller to sample an ADC connected to the op amp output and a minimization algorithm to select the proper voltage required to reduce the input offset.
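
Roughly, the nulling routine I have in mind looks like this (just a sketch; dac_write() and adc_read_avg() are placeholders for whatever parts I end up with, and the search direction assumes the output rises with DAC code):

  /* Sketch of the offset-nulling search over the 16-bit DAC codes. */
  #include <stdint.h>

  extern void    dac_write(uint16_t code);   /* buffered 16-bit DAC             */
  extern int32_t adc_read_avg(void);         /* averaged op amp output, signed  */
                                             /* counts about the ideal zero     */

  uint16_t null_offset(void)                 /* call with the + input shorted   */
  {
      int32_t lo = 0, hi = 65535;
      int32_t best = 32768, best_err = INT32_MAX;

      while (lo <= hi) {                     /* binary search; assumes the      */
          int32_t mid = (lo + hi) / 2;       /* response is monotonic in code   */
          dac_write((uint16_t)mid);
          int32_t err = adc_read_avg();
          int32_t mag = (err < 0) ? -err : err;

          if (mag < best_err) { best_err = mag; best = mid; }
          if (err > 0)                       /* flip if wired the other way     */
              lo = mid + 1;
          else if (err < 0)
              hi = mid - 1;
          else
              break;
      }
      return (uint16_t)best;                 /* code that best nulls the offset */
  }

In practice I'd let the output settle and average a number of ADC readings at each step.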

However, I'm concerned that the 1 Megaohm resistor will inject significant noise into the feedback loop. I know how to calculate the noise density of a 1 Megaohm resistor, but I'm wondering whether this noise will be amplified by the op amp feedback.

If so, is there a way to reduce this noise? Would an RC filter be a reasonable way to go? How do you reduce the noise created by a mechanical trim pot?

Reply to
Nicholas Kinar

I assume the resistor from opamp minus to ground (call it R1) is much lower than 1M... ballpark 100 ohms maybe. In that case, the 1M adds no significant Johnson noise. It's essentially in parallel with R1 and as such doesn't change its value much.

Of course, any dac noise gets to the output with a gain of R2/1M, which should be pretty small. If dac noise is an issue, split the resistor and bypass the heck out of the junction.
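
To put rough numbers on it, assuming R1 = 100 ohms (a throwaway sketch):

  /* Johnson noise of R1 alone vs. R1 in parallel with the 1M. */
  #include <stdio.h>
  #include <math.h>

  int main(void)
  {
      const double kT4 = 1.6e-20;               /* 4kT at room temp, joules */
      double r1 = 100.0, roff = 1.0e6;
      double rpar = r1 * roff / (r1 + roff);    /* about 99.99 ohms */

      printf("en(R1)       = %.4f nV/rtHz\n", sqrt(kT4 * r1)   * 1e9);
      printf("en(R1 || 1M) = %.4f nV/rtHz\n", sqrt(kT4 * rpar) * 1e9);
      return 0;
  }

Both come out around 1.26 nV/rtHz, i.e. the 1M changes nothing worth mentioning.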

But why not do the zero fix in software? Short the input, digitize the result, subtract that from future measurements.

John

Reply to
John Larkin

I don't quite follow your description but if you have a series R to help cancel input bias current effects at DC, simply bypass it with a C or R-C series network to reduce the noise.

Graham


Reply to
Eeyore

Nick, try this. Instead of resistor voltage-noise, it's often useful to think of the noise current generated by a resistor. The familiar e_n = sqrt(4kTR) for voltage-noise density is modified by moving the R downstairs, giving us its current noise i_n = sqrt(4kT / R). Also, memorize 4kT = 1.6 x 10^-20 at room temp.

Taking your case, but now with R in the denominator, it's clear that higher R means less current noise into your summing junction.
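
For the 1 Meg in question, i_n = sqrt(1.6 x 10^-20 / 10^6), or roughly 0.13 pA/rtHz, versus about 13 pA/rtHz for a 100 ohm resistor.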

Reply to
Winfield Hill

Yes, the resistor is close to 100 ohms. It's a good observation that since the resistor is in parallel, the change is minimal.

I was thinking of using cascaded RC or LC filters to get rid of noise, but this might prevent the DAC from changing the voltage quickly.

That's a great idea, John! Thank you for suggesting this. Would this also be effective for cascaded op amps (i.e. when the output of one op amp is fed into another op amp input)? I would think that it would be.

Reply to
Nicholas Kinar

Hi Winfield,

Thank you so much for your response! That's an interesting way of looking at noise in the system.

Memorizing the value of 4kT is useful for calculations at room temperature.

It's interesting to note that with higher R, the current noise drops. Since I work with electronics that must operate at low temperatures (down to -40 deg C), the reduced temperature will also lower the thermal current noise.

Your reply prompted me to take a look in that book that you've written. There's a really good section on noise and noise sources!

Reply to
Nicholas Kinar

Thanks, Graham. The noise can easily be reduced with an RC filter. I tried an LC filter, but some very preliminary SPICE analysis showed that it could distort the output signal. A simple RC filter is probably the way to go.

Reply to
Nicholas Kinar

You can also just remember that a 60 ohm resistor has 1 nV/rtHz of noise, and figure everything else from that.
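
Noise goes as the square root of R, so 6K works out to 10 nV/rtHz and a 1 Meg to roughly 130 nV/rtHz.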

John

Reply to
John Larkin

Sure. Short the input with your cmos switch and digitize the output of the entire chain. We call this "software autozero."

In some of our products, like thermocouple scanners, we have, say, an 8-input mux, and one of the inputs is a high-quality ground. All 8 inputs are scanned on a rotating basis. The "ground" value is regularly digitized in its turn and software lowpass filtered to make the internal variable ZOFF that's subtracted from all other channel measurements. The filtering removes most of the noise from the ground-measurement data.

The filter is just

ZOFF = ZOFF + (GNDSAMPLE - ZOFF)/2^N

where the divide by 2^N is a right-shift. If you have the luxury of working in floats, you can do

ZOFF = ZOFF + F * (GNDSAMPLE - ZOFF)

where F is a small number, 0.02 or some such.

This just simulates a 1st order RC lowpass filter.
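
In C, the integer version might look something like this (a sketch; pick N to suit your scan rate, and note the shift assumes your compiler does an arithmetic right-shift on signed values):

  /* First-order software lowpass for the zero-offset channel. */
  #include <stdint.h>

  #define ZOFF_SHIFT 4                    /* N; divide by 2^4 = 16 */

  static int32_t zoff;                    /* filtered zero offset, ADC counts */

  void zoff_update(int32_t gndsample)     /* call each time the ground channel is scanned */
  {
      zoff += (gndsample - zoff) >> ZOFF_SHIFT;
  }

  int32_t measure(int32_t rawsample)      /* any other channel */
  {
      return rawsample - zoff;            /* autozeroed reading */
  }

Bigger N means a longer time constant and a quieter ZOFF, at the cost of slower tracking.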

The ZOFF value is also useful as a gross error check.

If you have multiple gain ranges, it may be prudent to have a zero factor for each.

John

Reply to
John Larkin

John's idea works IF you have enough headroom in your analog string. In my latest 10-bit ADC I capture the input offset voltage on a capacitor such that the natural sequence of events (SAR) subtracts out the _input_ offset before any gain. (I only have 2.7V minimum VDD to work with.)

...Jim Thompson

Reply to
Jim Thompson

Thanks, John!

It's much, much better than trying to reduce the offset by adjusting trim pots, which could take *forever*. And if the temperature changes after you've finally done the adjustment, you have to repeat the whole process.

If the measurement takes only 1 second (max), would it be safe to assume that the offset stays the same over the measurement time? I imagine the sampling process as the following steps:

(1) close the switch and measure the offset voltage; (2) open the switch and measure the transducer signal; (3) apply the software autozero filter
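
In firmware I'd picture the cycle something like this (a sketch only; zero_switch() and adc_read() are placeholders, and zoff_update()/measure() are the filter routines John sketched above):

  /* One measurement cycle with software autozero. */
  #include <stdint.h>

  extern void    zero_switch(int closed);          /* CMOS switch shorting the + input */
  extern int32_t adc_read(void);                   /* op amp output via the ADC        */
  extern void    zoff_update(int32_t gndsample);   /* lowpass the offset samples       */
  extern int32_t measure(int32_t rawsample);       /* subtract the filtered ZOFF       */

  int32_t sample_transducer(void)
  {
      zero_switch(1);                  /* (1) short the input and grab the offset    */
      zoff_update(adc_read());         /*     (allow settling time before reading)   */

      zero_switch(0);                  /* (2) release the short, read the transducer */
      int32_t raw = adc_read();

      return measure(raw);             /* (3) apply the autozero correction */
  }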

I would suppose that with a MUX, you would not have to worry about this, since for every sample that you take, you also have a ZOFF value.

What's a high quality ground? Could this be created using an RC filter? I'm thinking of using an RC filter tied to GND.

So the ZOFF value will show the maximum error in the analog signal processing chain?

That's a really good idea! How would I separate out the contributions from each gain range?

Reply to
Nicholas Kinar

That's an interesting idea, Jim. How do you deal with the input impedance of the SAR ADC, which may change with frequency? Did you use a network analyzer and an impedance matching network on the input of your ADC?

Reply to
Nicholas Kinar

Let me clarify: Does the input impedance of your SAR ADC change with sampling frequency?

Did you use a network analyzer when developing an input matching network for the ADC?

Thanks, Jim.

Reply to
Nicholas Kinar

I auto-correct offset at _every_ SAR step. (I'm using the old flying capacitor stunt ;-)

The real stunt is how I get a 10-bit monotonic DAC with a crap process, but that has to remain a trade secret for the moment ;-)

...Jim Thompson

Reply to
Jim Thompson

Which is why continuous auto-zeroing is good; it keeps adjusting for any drifts.

You can do that. The az filter only makes sense if the autozeroing is done as part of a continuous repeated process.

In general, one prefers to not have the noise of the zero offset add to the noise of each data measurement. So signal-average or filter a stream of zero measurements.

No, just a short. But a short that's not trashed by ground loops or thermals. We did one thermocouple-based temperature controller where the short was a couple of inches away - on a solid ground plane - from where the t/c was grounded. Not far away was a 150 watt PWM heater driver. Voltage gradients in the ground plane made big temperature errors. Moving the ground sense point about an inch fixed it.

Yes. Check it for reasonableness.

Do the zero experiment at each gain setting and keep multiple ZOFF things.

John

Reply to
John Larkin

Ah yes, the capacitor is connected at the input of the SAR ADC.

Trade secrets could be traded. ;-)

Reply to
Nicholas Kinar

Thanks, John.

Sure, what I'll do is simply average the zero-offset readings for a few milliseconds, or wait until the cumulative signal variance levels off with respect to time.
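
For the variance check, something like Welford's running mean/variance would do the job (a sketch; zero-initialize a stats_t before use, and the "levels off" threshold is up to me):

  /* Running mean and variance of the zero-offset samples (Welford's method). */
  #include <stdint.h>

  typedef struct { double mean, m2; uint32_t n; } stats_t;

  static void stats_add(stats_t *s, double x)
  {
      s->n++;
      double d = x - s->mean;
      s->mean += d / s->n;
      s->m2   += d * (x - s->mean);
  }

  static double stats_var(const stats_t *s)
  {
      return (s->n > 1) ? s->m2 / (s->n - 1) : 0.0;   /* sample variance */
  }

I'd keep adding offset samples and stop once stats_var() changes by less than some small fraction between checks.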

It's interesting how moving a single trace can fix so many problems.

I suppose that if the system has a fixed gain, then I would require only one ZOFF.

Reply to
Nicholas Kinar
