I was thinking of using an instrumentation amp instead of a normal op amp for a x10 gain stage, which has to have low voltage noise, because I want both inputs to be high impedance (I also considered a simple voltage follower on one input). I've noticed something about the noise specifications that I don't understand and hoped someone here could advise what is going on.
The normal op amp I originally considered is the LT6202 from Linear Tech, which I've found to be a very stable, easy-to-use device in the past. It has a 100MHz gain-bandwidth product and an input noise spec of 2.8nV/rt Hz. At an (inverting) gain of 10, this presumably becomes roughly 28nV/rt Hz at the output (a bit more, strictly, since the noise gain of an inverting x10 stage is 11) with a bandwidth of about 10MHz.
Comparing this with Analog Devices' AD8253 in-amp, I see its noise is specified as 45nV/rt Hz at a gain of 1 (alas at a lower frequency, so one can't truly compare directly!) but only 12nV/rt Hz RTI at a gain of 10. Does this mean the in-amp's noise is LOWER at higher gain (presumably due to some feedback trick) and thus better than the simple op amp's? Or does it mean the voltage noise at a gain of 10 is 120nV/rt Hz at the output, and hence much worse than the normal op amp's?
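For my own sanity, here's the back-of-envelope arithmetic behind the two readings above, using the datasheet densities I quoted. I'm assuming the inverting x10 stage has a noise gain of 1 + Rf/Rg = 11, and that "RTI" means "multiply by the gain to get output noise" — which is the interpretation I'm asking about:

```python
# All densities in V/sqrt(Hz), taken from the figures quoted above.
noise_gain = 11                        # inverting gain of -10 => noise gain 11
lt6202_rto = 2.8e-9 * noise_gain       # op amp noise referred to the output

ad8253_rti_g10 = 12e-9                 # in-amp RTI spec at G = 10
ad8253_rto_g10 = ad8253_rti_g10 * 10   # output noise IF "RTI x gain" is right

print(lt6202_rto)     # ~31 nV/rt Hz at the op amp's output
print(ad8253_rto_g10) # ~120 nV/rt Hz at the in-amp's output, on that reading
```

So on one reading the in-amp is roughly 4x noisier at the output; on the other it's competitive. That's the ambiguity I'd like cleared up.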
It also strikes me that the normal op amp's noise spec probably doesn't include the thermal noise of the feedback resistors, so the 2.8nV/rt Hz figure is optimistic, whereas the in-amp's noise figure will be a truer reflection of what's achievable.
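To put a number on that, the Johnson noise density of a resistor is sqrt(4kTR), so even modest feedback resistors are in the same ballpark as the op amp itself (the 1k value below is just an illustrative choice, not a proposed design):

```python
import math

k = 1.380649e-23   # Boltzmann constant, J/K
T = 300            # roughly room temperature, K

def resistor_noise(R):
    """Thermal noise density in V/sqrt(Hz) of a resistance R (ohms)."""
    return math.sqrt(4 * k * T * R)

print(resistor_noise(1e3))   # a 1k resistor alone is ~4.1 nV/rt Hz
```

That's already bigger than the LT6202's 2.8nV/rt Hz, which is why I suspect the op amp spec flatters the achievable stage noise.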
Since the noise mix varies with frequency, it's relevant to note that the circuit needs to pass signals from DC up to 300kHz, maybe a little higher. So I'm concerned about low-frequency (DC) drift as well as the overall AC fuzz emerging from the stage. Input levels will be about 20mV max.
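For the AC fuzz, my rough estimate is just density x sqrt(bandwidth), assuming a flat (white) density and a brick-wall 300kHz cutoff; this ignores 1/f noise, so it understates the low-frequency end that I'm also worried about:

```python
import math

bw = 300e3   # Hz, the signal bandwidth stated above

def rms_noise(density):
    """Integrated RMS noise (V) for a flat density in V/sqrt(Hz)."""
    return density * math.sqrt(bw)

print(rms_noise(31e-9))    # op amp route, ~17 uV RMS at the output
print(rms_noise(120e-9))   # in-amp on the pessimistic reading, ~66 uV RMS
```

Against a 20mV max input (200mV at the output), either figure may be fine, but I'd still like to know which one is real.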