So, I need to bias a sensor with a few mA, and in my exceeding cleverness I came up with the following:
+18V o-------o---------------.
             |               |
             |               |
            .-.             .-.
            | | 499         | | 49.9
            | |             | |
            '-'             '-'
             |               |
             |               |
              >|           |<   MMBT3906
               |----o------|    MMBT3906
              /|    |      |\
             |      |        |
             |      |        |
             o------'        |
             |               |
             |               |
            .-.              |
      18.2k | |              | ~10mA
            | |              |
            '-'              V
             |               |
             |               |
            ===             ===
            GND             GND

(created by AACircuit v1.28.6 beta 04/19/05)

The only problem is that while _I_ know that I'm a f***ing genius, the circuit seems to think I'm a dolt. Both my on-paper calculations and my SPICE simulations tell me that with a transistor output admittance of
200uS or so, the output admittance of the circuit should be on the order of 10uS. But the actual circuit is showing me something around 160uS. This is not good. This is not right. Moreover, I'm not sure it'll work with the sensor I'm using.
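For what it's worth, here's the back-of-the-envelope version of my on-paper number, treating the output device as a common-emitter stage with emitter degeneration. It's just a quick Python sketch; the beta of ~150 and VT of 26mV are ballpark assumptions on my part, and the 200uS is the hoe figure mentioned above:

# Quick sanity check on the expected output admittance of the output
# transistor, treated as a common-emitter stage with emitter degeneration.
# beta and VT below are ballpark guesses, not measured values.

IC   = 10e-3      # output collector current, ~10 mA
VT   = 0.026      # thermal voltage, V (assumed room temperature)
beta = 150.0      # assumed small-signal beta for the MMBT3906
RE   = 49.9       # emitter degeneration resistor, ohms
hoe  = 200e-6     # transistor output admittance, S

ro     = 1.0 / hoe                    # ~5 kohm
gm     = IC / VT                      # ~0.38 S
rpi    = beta / gm                    # ~390 ohm
re_eff = RE * rpi / (RE + rpi)        # RE || rpi, ~44 ohm

rout = ro * (1.0 + gm * re_eff)       # standard degenerated-stage result
print("Rout ~ %.0f kohm  ->  %.1f uS" % (rout / 1e3, 1e6 / rout))
# prints roughly "Rout ~ 90 kohm  ->  11.1 uS"

That ~90k (call it 11uS) is where my "order of 10uS" expectation comes from; the measured 160uS is barely better than the bare hoe of the transistor, as if the degeneration were doing nothing at all.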
So, before I go spinning conspiracy theories involving bad transistors or dirty boards (I've got three identical channels, with essentially the same characteristics), does anyone have a suggestion for where I'm going wrong here?
Thanks....