I'm trying to sort out whether this is a circuit-simulation artifact or something that may be real.
Does the noise come up (because the OpAmp's feedback can no longer reduce it) when the OpAmp is driven too fast at too large a signal level?
The circuit is a simple LT1028 in an inverting configuration with +/-12 VDC supplies and a grounded non-inverting input. Rinput = 1k, Rfdbk = 1k, Cfdbk = 20pF. The load is a simple 10k in parallel with 10pF. A VERY simple inverter. The drive is a 1 MHz sinusoid of 7 Vpk, which the LT1028 can't slew fast enough to follow, so the output is a 'triangular' signal of about 2 Vpk.
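In case a netlist helps, here is a minimal LTspice sketch of the circuit as described (node names are arbitrary, and the pin order on the XU1 line assumes the stock LT1028 macromodel that ships with LTspice, so verify it against your own model). The triangular output is consistent with slew limiting: a 7 Vpk sine at 1 MHz demands a slew rate of 2*pi*f*Vpk, roughly 44 V/us, well beyond what the LT1028 can deliver.

    * LT1028 inverting amp, gain -1 (sketch; node names are arbitrary)
    Vp   vcc 0   12                    ; +12 V supply
    Vn   vee 0   -12                   ; -12 V supply
    Vin  in  0   SINE(0 7 1Meg) AC 1   ; 7 Vpk, 1 MHz drive; AC 1 for small-signal sweeps
    Rin  in  inv 1k                    ; input resistor
    Rf   inv out 1k                    ; feedback resistor
    Cf   inv out 20p                   ; feedback capacitor
    RL   out 0   10k                   ; load resistor
    CL   out 0   10p                   ; load capacitor
    XU1  0 inv vcc vee out LT1028      ; pins: +in, -in, V+, V-, out (check your model)
    .tran 0 20u 10u                    ; run a few cycles at 1 MHz
    .end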
Now I wish to consider noise. Using LTspice, I get an 'expected' noise density function over the range of ?? to 1 MHz that seems reasonable for the LT1028. With almost no input signal, doing what I call '.tranoise' analyses (where .tran and .noise analyses are combined), I get approximately the same noise density function. HOWEVER, when the sine wave above is applied, which obviously overdrives the LT1028, the noise density function increases by about 10 dB.
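For the small-signal baseline, the sweep I have in mind is a standard .noise directive along the lines below (names follow the netlist sketch above; the 10 Hz start frequency is only a placeholder, since the actual lower bound is left unspecified). Note that .noise is a linearized analysis about the DC operating point, so any overdrive effect can only show up in the .tran-based estimate, not here.

    * Small-signal output noise density up to 1 MHz (10 Hz start is a placeholder)
    .noise V(out) Vin dec 100 10 1Meg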
Hence my question: in real life, should I expect the noise density function of an overdriven OpAmp to increase?