I have been playing with my shiny new photodiode amplifier (BW = 75MHz, A = 35k) and my trusty spectrum analyzer set to 'noise' mode. I'm dissecting the thing to try to find areas where some simple tweaks might improve the SNR.
The photodiode is a 500um PIN device with a NEP of 0.008pW/RtHz and a dark current under -5V bias (on the anode) of about 1.25nA.
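For scale, here's my quick sanity check (in Python, since that's what I had open) on what the diode itself should contribute. I've assumed a responsivity of roughly 0.5 A/W to convert the NEP into a current noise, and taken A = 35k to be 35 kV/A of transimpedance:

import math

q  = 1.602e-19      # electron charge [C]
Id = 1.25e-9        # dark current at -5V bias [A]
Rf = 35e3           # transimpedance [V/A]

# Shot noise of the dark current alone
i_shot = math.sqrt(2 * q * Id)   # ~20 fA/rtHz
print(f"dark-current shot noise: {i_shot*1e15:.0f} fA/rtHz "
      f"-> {i_shot*Rf*1e9:.2f} nV/rtHz at the output")

# NEP expressed as an equivalent input current noise
NEP  = 0.008e-12    # [W/rtHz]
resp = 0.5          # assumed responsivity [A/W]
print(f"NEP as current noise: {NEP*resp*1e15:.0f} fA/rtHz")

Both come out well under a nanovolt at the output, so the diode itself should be invisible next to the amplifier's own floor.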
The amplifier circuit under examination takes its input from the photodiode cathode through a common-base amplifier (BFG25A/X) and from there into a low-noise op-amp transimpedance amp. The common-base amp is biased with 12k resistors to +/-5V to get the required bandwidth from the circuit. It's basically the circuit from Phil Hobbs' book, from the 'Photodiode Amplifiers' section.
Could someone please explain the following effect?
- When the photodiode is removed (i.e. just a biased cascode into the TIA), the output noise is 80nV/RtHz at 50MHz. This matches my back-of-the-envelope calculation of what it should be (sketched just after this list).
- When the photodiode is connected, I get 160nV/RtHz!! Note that I have covered the photodiode window with copper tape to block any light (and hence any DC photocurrent), cut the device leads short, etc. -- I think all I should be getting at this point is dark current, correct?
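For reference, the back-of-the-envelope budget I mentioned for the no-diode case -- a rough sketch with some assumed values, treating the 12k emitter bias resistor's Johnson noise as a current injected straight into the summing junction through the cascode, and lumping everything else (op-amp en times noise gain, base resistance, etc.) into the remainder:

import math

k, T = 1.381e-23, 298.0   # Boltzmann's constant, room temp
Rf   = 35e3               # feedback (transimpedance) resistor
Rb   = 12e3               # cascode emitter bias resistor

# Johnson noise of the bias resistor, flowing through the cascode
i_Rb = math.sqrt(4 * k * T / Rb)   # ~1.2 pA/rtHz
# Johnson noise of the feedback resistor, directly at the output
e_Rf = math.sqrt(4 * k * T * Rf)   # ~24 nV/rtHz

e_out = math.sqrt((i_Rb * Rf)**2 + e_Rf**2)
print(f"resistor terms alone: {e_out*1e9:.0f} nV/rtHz")
# the rest of the ~80 nV/rtHz: op-amp en * noise gain, which
# rises with frequency and isn't broken out here

That lands close enough to the measured 80nV/RtHz that I trust the no-diode number.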
This extra noise seems excessive (if it adds in quadrature with the amplifier's 80nV/RtHz floor, the new source is ~140nV/RtHz) -- though the system still meets spec, I would like some 'margin' in there and wouldn't mind reducing the noise if possible.
The only source of 'noise' current that I can think of would be from the bias supply; however, I am using a low-noise LT1964 LDO for this, and am supplying the bias through two 10k resistors with a 1uF and a 1000pF ceramic cap arranged in a 'pi' configuration. This -ve supply IS shared with the -ve rail of the TIA. Before I tear into the bias of the PIN (building a second -ve bias supply from an LDO), is there any other source I have missed, or could this really be all from the bias voltage?
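For what it's worth, here's my rough estimate of what that bias network can actually inject at 50MHz, assuming ~3pF of diode capacitance at -5V (a guess for a 500um device -- I haven't measured it):

import math

k, T = 1.381e-23, 298.0
f    = 50e6           # frequency of interest [Hz]
w    = 2 * math.pi * f
Rf   = 35e3           # transimpedance [V/A]
R    = 10e3           # final series resistor in the pi filter
C    = 1000e-12       # final shunt cap
Cpd  = 3e-12          # assumed diode capacitance at -5V bias

# Thermal noise at the filtered bias node: sqrt(4kT*Re(Z)),
# with Z = R || 1/(jwC); the 1000pF swamps the 10k at 50 MHz
ReZ    = R / (1 + (w * R * C)**2)    # ~1 milliohm
e_node = math.sqrt(4 * k * T * ReZ)  # ~4 pV/rtHz
# That voltage pushes current through the diode capacitance
i_inj  = e_node * w * Cpd
print(f"node noise {e_node*1e12:.1f} pV/rtHz -> "
      f"{i_inj*Rf*1e9:.3f} nV/rtHz at the output")
# LDO noise is attenuated by ~w*R*C per RC section, i.e. a
# factor of ~3000 at 50 MHz for the last section alone

If that's right, the thermal/filter path looks far too small to explain the extra noise, which is exactly why I'm asking before I build a second supply.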
Thanks for any help you can offer