Mostly at higher frequencies, above the useful range of the feedback loop's frequency response. I saw a board design that used an op amp and an N-channel FET as a linear regulator. The power source for the pass FET was a switcher, to prevent excessive dissipation, and the op amp was powered from the 12 V input, so there was plenty of drive for the gate. The reference voltage was the same as the output voltage, so with the FET wired as a source follower (output taken at the source) there was no voltage gain other than the op amp, which has a 1 MHz gain-bandwidth product.
So in the frequency range above, say, 100 kHz, I would expect the control loop to have minimal impact on noise from the switcher. However, even though the FET configuration has no gain from gate to source, it should still act to suppress transmission of noise from the drain to the source. Raising the drain voltage will increase the current, raising the source voltage. Of course, that reduces the gate-source voltage, which acts to oppose the source rise... however, that is without considering the FET capacitances.
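To put a rough number on that mechanism: with the gate held at AC ground, drain noise only reaches the source through the channel's output resistance, and the follower's transconductance shunts it. A minimal sketch, using assumed (not measured) device values for a power FET:

```python
import math

# Hedged sketch: low-frequency drain-to-source noise rejection of a
# source follower whose gate is held at AC ground by the op amp.
# All values below are illustrative assumptions, not from the actual board.
gm  = 1.0    # transconductance, S (assumed, typical for a power FET at load current)
rds = 1e3    # drain-source output resistance, ohms (assumed)
RL  = 10.0   # load resistance at the source, ohms (assumed)

# KCL at the source node with the gate fixed (vg = 0):
#   vs/RL = (vd - vs)/rds + gm*(0 - vs)
# => vs/vd = (1/rds) / (gm + 1/rds + 1/RL)
atten = (1 / rds) / (gm + 1 / rds + 1 / RL)
print(f"vs/vd = {atten:.2e}  ({20 * math.log10(atten):.1f} dB)")
# roughly 1/(gm*rds): on the order of -60 dB with these assumed values
```

So with the gate genuinely pinned, the follower alone gives substantial drain-to-source rejection; the question is what the gate actually does at high frequency.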
The gate-source and gate-drain capacitances will pull the gate up, allowing the drain noise to transfer through to the source. Is there a reasonable way to mitigate this effect? Someone suggested adding capacitance to the gate to swamp out the effect of the gate-drain and gate-source capacitances. If that can be done without impacting the response of the control loop, it seems like it might work. Or is this doomed to fail because it *will* impact the control loop at a level before it does what is intended?
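Both effects can be estimated on the back of an envelope. At frequencies where the op amp output looks high-impedance, the gate sits on a capacitive divider driven by the drain, and the added capacitor both shrinks that divider and forms a pole with the op amp's output resistance. A sketch with assumed datasheet-order values (capacitor placed gate-to-ground, which is one interpretation of the suggestion):

```python
import math

# Hedged sketch: drain->gate coupling through Cgd at high frequency, and the
# side effect of the proposed added gate capacitance. Capacitances are
# typical-datasheet-order assumptions for a power MOSFET, not board values.
Cgd  = 100e-12   # gate-drain (reverse transfer) capacitance, F (assumed)
Cgs  = 1e-9      # gate-source capacitance, F (assumed)
Cadd = 100e-9    # proposed added capacitance, gate to ground, F (assumed)
Ro   = 100.0     # op amp open-loop output resistance, ohms (assumed)

# Capacitive divider from drain to gate when the op amp output is
# effectively out of the picture at these frequencies:
div_before = Cgd / (Cgd + Cgs)          # without the added capacitor
div_after  = Cgd / (Cgd + Cgs + Cadd)   # with Cadd swamping the divider

# Side effect: Cadd forms a low-pass pole with the op amp's output resistance.
f_pole = 1 / (2 * math.pi * Ro * Cadd)

print(f"gate pickup before: {div_before:.3f}, after: {div_after:.2e}")
print(f"added pole at ~{f_pole/1e3:.1f} kHz")
```

With these assumed numbers the divider improvement is dramatic (roughly 40 dB), but the pole lands in the tens of kHz, well inside a loop closed around a 1 MHz gain-bandwidth op amp, which is exactly the trade-off in question: how much gate capacitance you can add before the pole it creates with the op amp's output impedance eats into the loop response.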