Hi All,
I'm having a very hard time understanding the full picture on amplifier stability, even though I'm an engineer. For instance, suppose we select a transistor that is unconditionally stable over our frequency band of interest (say 900 MHz to 1000 MHz) but only conditionally stable at all other frequencies. We bias, match, and resistively stabilize the transistor until the linear simulator finally shows K greater than 1 and B1 greater than 0. That should mean the amplifier will not oscillate under any input/output impedance conditions.

But what happens when you place a filter at the input or output port of our newly stabilized amplifier? In the filter's stopbands, the amplifier looks out into an impedance that is anything *but* 50 ohms, so wouldn't there still be a chance that this "unconditionally stable" amplifier could oscillate, since it is built around a transistor that is only conditionally stable out of band?
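Just to make the check concrete, here is roughly the kind of K/B1 sweep I'm doing (a minimal Python/numpy sketch; the S-parameter arrays below are placeholders I made up, not a real device, and `stability_factors` is just my own helper, not anything from my simulator):

```python
import numpy as np

def stability_factors(s11, s12, s21, s22):
    """Rollett K factor and B1 from 2-port S-parameters (complex arrays over frequency)."""
    delta = s11 * s22 - s12 * s21
    k = (1 - np.abs(s11)**2 - np.abs(s22)**2 + np.abs(delta)**2) / (2 * np.abs(s12 * s21))
    b1 = 1 + np.abs(s11)**2 - np.abs(s22)**2 - np.abs(delta)**2
    return k, b1

# Placeholder data: in practice these would come from the simulator or a .s2p file.
freq = np.linspace(0.1e9, 6e9, 60)                 # sweep well beyond the 900-1000 MHz band
s11 = 0.6 * np.exp(-1j * 2 * np.pi * freq / 6e9)
s12 = 0.05 * np.ones_like(freq, dtype=complex)
s21 = 4.0 * np.exp(-1j * np.pi * freq / 6e9)
s22 = 0.5 * np.exp(-1j * np.pi * freq / 3e9)

k, b1 = stability_factors(s11, s12, s21, s22)

in_band = (freq >= 900e6) & (freq <= 1000e6)
print("In-band only: K_min = %.2f, B1_min = %.2f" % (k[in_band].min(), b1[in_band].min()))
print("Whole sweep:  K_min = %.2f, B1_min = %.2f" % (k.min(), b1.min()))
# As I understand it, unconditional stability requires K > 1 AND B1 > 0 at every
# frequency in the sweep, not just across the intended passband.
```

So my question is really whether it's enough to satisfy those conditions in-band, or whether the out-of-band behavior (e.g. into a filter's stopband) can still bite you.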
Thanks!
-Bill