What's everybody's favourite method for protecting analogue output drivers? Say I have a nice beefy op amp driving a 50 ohm output from
+-15 V supplies. The output is series-terminated, so that there are no cable reflection funnies from more or less open-circuited patch cord connections. For small output signals it'll drive a 50 ohm load fine, but I really don't want to have to deal with 4-5 W of dissipation if somebody shorts the output and leaves it like that.

My current best guess is a 100 mA I_trip polyfuse in each supply lead, with a series resistor in the input to protect the op amp from death when the polyfuse trips, and a Schottky diode to ground to protect it from supply reversal. That's five parts per output device, totalling probably $1 or thereabouts.
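(For the record, the 4-5 W figure comes from something like the following back-of-envelope, here in Python, with an assumed ~1.5 V of headroom lost in the output stage and a hard short right at the connector -- adjust to taste for the actual op amp:

    # Rough worst-case numbers for a shorted, back-terminated output.
    # Assumed values: +/-15 V rails, ~1.5 V lost in the output stage,
    # 50 ohm series (back-termination) resistor.

    V_RAIL = 15.0      # supply voltage, V
    V_DROP = 1.5       # assumed headroom lost in the output stage, V
    R_SERIES = 50.0    # back-termination resistor, ohms

    v_out = V_RAIL - V_DROP       # output pinned near the rail
    i_fault = v_out / R_SERIES    # current into a hard short

    p_resistor = i_fault**2 * R_SERIES    # heat in the 50 ohm resistor
    p_opamp = (V_RAIL - v_out) * i_fault  # heat in the op amp output stage

    print(f"fault current:        {i_fault*1e3:.0f} mA")
    print(f"series-R dissipation: {p_resistor:.1f} W")
    print(f"op amp dissipation:   {p_opamp:.1f} W")
    print(f"total:                {p_resistor + p_opamp:.1f} W")

That puts the fault current around 270 mA, so a 100 mA trip point ought to be comfortably exceeded by a sustained short while staying out of the way of normal signal currents.)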
Alternatively I suppose I could put a single polyfuse in series with the output, but with a heavy load, that would lead to weird gradual degradation rather than a nice obvious and sudden refusal to continue.
Any better suggestions?
Thanks
Phil Hobbs