Hi,
I'm trying to build a high-speed enable/disable control for an analogue signal. I found some SD5400 DMOS ICs (quad N-channel DMOS FETs) from Linear Systems that switch quickly (2ns), and I've put together a simple SPICE circuit to test the idea.
The test circuit is a 2V p-p sinewave (with 1V DC offset) applied to the drain, a DC control voltage applied to the gate, and a 1MOhm load (modelling the input impedance of a buffer) at the source, i.e.:
Vsin in GND SIN(1 1 1000k 0 0)
M1 in gate out GND sd5400cy
Vdc gate GND DC 1
R1 out GND 1000k
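To make this runnable on its own I'm adding a model card and a transient analysis. I don't have Linear Systems' official SD5400 SPICE model to hand, so the .model line below is a generic level-1 NMOS placeholder with Vto set to the 1.2V threshold; the other parameters are just guesses:

* generic NMOS placeholder -- not the vendor SD5400 model card
.model sd5400cy NMOS(Vto=1.2 Kp=0.1)
* 5 cycles of the 1MHz sinewave
.tran 1n 5u
.end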
However, the circuit doesn't simulate the way I expected. With Vg = 1V the FET is off and the output is pulled to ground through the load resistance, which is as expected.
But when I apply a 5V gate voltage (so that the gate-source voltage is always greater than the FET's Vth = 1.2V), it turns on, but the output looks distorted, as shown here:
Shouldn't the output be following the input? The FET is operating as an SPST switch, and its Rds(on) is insignificant compared to the 1MOhm load.
Also, I've found that increasing Vg decreases the deviation of the output from the input, e.g. with Vg = 15V and Vg = 100V:
For completeness, Vg=1V is also included:
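In case it helps anyone reproduce this, all four gate voltages can be compared in a single run by stepping the gate supply instead of editing it by hand (this is LTspice-style .step syntax; other simulators may differ):

* replace the fixed gate supply with a stepped parameter
Vdc gate GND DC {Vg}
.step param Vg list 1 5 15 100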
Can anyone give me any hints as to what is going on here?
Is there something wrong with my idea of using a FET as a switch directly? Am I missing anything?
Cheers,
Steve