Okay, so I have this big nasty laser. It produces 20-ps pulses tunable over a wide range (420 nm to 10 um), which is very nice. Two not-so-nice features: the rep rate is 20 Hz, and the pulse-to-pulse amplitude variation is +/-5% or worse, which means it's close to 2:1 p-p. Using that with a sampling scope translates to a 25.6-second sweep time for
512 points, plus lots and lots of averaging to get the noise down to something reasonable. Say an hour per trace--frustrating. A streak camera would be a beautiful solution--I could get a whole trace per pulse and would need many fewer averages--but they're too expensive and take up too much room.
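Just to show my arithmetic, here's the back-of-envelope in Python (the sqrt(N)-averaging assumption and the 0.5% residual-noise target are mine):

```python
# Back-of-envelope for the acquisition time quoted above.  Assumes the
# sampling scope takes one sample per laser shot, and that averaging N
# traces knocks the amplitude noise down by sqrt(N).

rep_rate = 20.0                      # Hz, laser rep rate
points = 512                         # samples per trace
sweep_time = points / rep_rate       # one laser shot per sample
print("single sweep: %.1f s" % sweep_time)

sigma = 0.05                         # +-5% pulse-to-pulse amplitude noise
target = 0.005                       # say we want ~0.5% residual noise
n_avg = (sigma / target) ** 2        # sqrt(N) averaging -> N = (5%/0.5%)^2
total = n_avg * sweep_time
print("averages: %.0f, total: %.0f min" % (n_avg, total / 60.0))
```

That comes out around three-quarters of an hour for one trace, which is the right ballpark for what I'm seeing.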
I have a nice Tek TDS7704 7 GHz scope coming, which will help some, although it's really still too slow--the budget didn't stretch as far as those 12-GHz monsters from Agilent.
The worst problem is how to trigger properly from a pulse train in the face of the amplitude noise. Right now, I use a ~2 GHz photodetector (*) feeding a digital delay generator and a mixer to pick off the 8th pulse in the sequence, and stuff that into the trigger input of my 11801B sampling scope. This produces about 10 ps of jitter, which is uncomfortably large given the pulse width. Small pulses also get sampled later, which makes the amplitude variation worse.
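To put numbers on the time-walk problem, here's a little Python sketch--the Gaussian pulse shape, 75-ps width, and 0.5 threshold are made-up illustrative numbers, not my actual detector. A fixed trigger threshold crosses earlier on big pulses; a constant-fraction threshold (50% of that pulse's own height) doesn't care:

```python
import numpy as np

# Illustrative time-walk calculation: a Gaussian-ish detector pulse,
# comparing a fixed trigger threshold against a constant-fraction one.
# Pulse width and threshold are made-up numbers, not my real setup.

def pulse(t, amp, sigma=75e-12):
    return amp * np.exp(-t ** 2 / (2.0 * sigma ** 2))

t = np.linspace(-400e-12, 400e-12, 80001)    # 10 fs time steps

def crossing(v, thresh):
    return t[np.argmax(v >= thresh)]         # first sample above threshold

times = []
for amp in (0.95, 1.05):                     # +-5% amplitude spread
    v = pulse(t, amp)
    t_fixed = crossing(v, 0.5)               # fixed absolute threshold
    t_cf = crossing(v, 0.5 * amp)            # 50% of *this* pulse's height
    times.append((t_fixed, t_cf))

fixed_walk = (times[1][0] - times[0][0]) * 1e12   # ps; <0: big pulse fires early
cf_walk = (times[1][1] - times[0][1]) * 1e12      # ps
print("fixed-threshold walk:   %.1f ps" % fixed_walk)
print("constant-fraction walk: %.1f ps" % cf_walk)
```

Several ps of walk from just the +-5% amplitude spread, versus essentially none for the constant-fraction case--hence the small-pulses-sampled-later problem.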
I can look backward in time by about 50 ns due to optical path delay, so it might be possible to normalize the pulse height somehow before the trigger.
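The dumb-but-maybe-workable fallback would be to do the normalization in software instead: record one energy reading per shot from a slow monitor photodiode and divide each sampled point by it before averaging. A sketch--the ideal, noiseless monitor is an assumption; a real one adds its own noise floor:

```python
import numpy as np

# Per-shot software normalization: one monitor-photodiode energy
# reading per laser shot.  With an ideal monitor the +-5% amplitude
# noise divides out exactly; a real monitor limits how well this works.

rng = np.random.default_rng(0)
true_trace = np.exp(-np.linspace(-3, 3, 512) ** 2)  # stand-in pulse shape

shots = rng.normal(1.0, 0.05, size=512)   # +-5% shot-to-shot energies
raw = true_trace * shots                  # sampling scope: one shot per point
normalized = raw / shots                  # divide by the monitor reading

print("raw rms error:        %.4f" % np.std(raw - true_trace))
print("normalized rms error: %.4f" % np.std(normalized - true_trace))
```

That fixes the amplitude noise but not the trigger walk, which is why I'd rather normalize before the trigger.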
Any bright ideas?
Thanks,
Phil Hobbs
(*) It's a Thorlabs InGaAs photodiode running straight into a Mini-Circuits MMIC amp with no coupling cap.