I was just checking out various opamp datasheets. I encountered this opamp:
In the first few lines of text, it says that the input noise voltage is 9 nV/sqrt(Hz) at 10 kHz. But in the table at the bottom of the page, it says that the input noise voltage is 18 nV/sqrt(Hz) at 1 kHz.
How did they calculate this?
I'm curious because if I were to use an opamp (not necessarily this one), I would like to know how much input noise voltage it contributes over a given bandwidth, for instance the audio band from 20 Hz to 20 kHz. In another discussion I saw (I don't remember the link), it was mentioned that in one particular design with 40 dB of gain, the input noise was amplified to about 100 mV at the output.
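For what it's worth, here is the back-of-envelope calculation I've been trying: assuming the noise density is flat (white) over the band of interest, the input-referred RMS noise is the density times the square root of the bandwidth, and the output noise is that times the voltage gain. This is only a sketch; it ignores 1/f noise below the corner frequency, current noise, and resistor noise, so real circuits can be noisier:

```python
import math

def integrated_noise(density_nv, f_low, f_high):
    """RMS noise voltage (in V) from a flat noise density over a bandwidth.

    density_nv: input noise voltage density in nV/sqrt(Hz)
    Assumes white noise only (ignores the 1/f region at low frequencies).
    """
    return density_nv * 1e-9 * math.sqrt(f_high - f_low)

# Example: 18 nV/sqrt(Hz) over the 20 Hz to 20 kHz audio band
en = integrated_noise(18, 20, 20e3)
gain = 10 ** (40 / 20)  # 40 dB voltage gain = x100

print(f"input-referred RMS noise: {en * 1e6:.2f} uV")
print(f"output noise at 40 dB gain: {en * gain * 1e3:.3f} mV")
```

With these numbers I get a couple of microvolts at the input and a fraction of a millivolt at the output, far below the 100 mV figure I mentioned, which is partly why I'm asking how these datasheet numbers are meant to be used.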