Hi, all,
I don't have much to contribute to the LCR MOSFET argument except possibly some ethological insight. But I digress.
I'm on a trip to SoCal to debug the pre-production models of my spectrometer. The SNR is pretty good, well over 60 dB, which is respectable for a SWIR spectrometer.
Looks like the last remaining problem is that the spectrum swims around a bit: the gross shape stays roughly the same and the small-scale noise is low, but there are small systematic variations on scales of 1/10 to 1/2 of the scan range, so multiple spectra don't quite line up with each other.
One thing I noticed is that the Hall effect shaft encoders are quite unstable with temperature, something like 3 arc min per kelvin. Since the grating only has to rotate about 8 degrees for the whole measurement, that works out to roughly 3 nm/K of wavelength drift, which is _horrible_. (They're US Digital type MAE3-P12-125-500-7-1: 12-bit PWM output.)
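For the record, here's the back-of-the-envelope arithmetic as a Python snippet. It assumes a roughly linear angle-to-wavelength mapping and a scan range of about 500 nm; both are my assumptions, so plug in your own numbers:

# Back-of-the-envelope check: encoder tempco -> wavelength drift.
# Assumptions (mine, not measured): wavelength vs. grating angle is
# roughly linear over the 8 degree throw, and the scan covers ~500 nm.

tempco_arcmin_per_K = 3.0          # observed encoder drift
scan_angle_arcmin   = 8.0 * 60.0   # 8 degrees of rotation = 480 arcmin
scan_range_nm       = 500.0        # assumed spectral coverage of the scan

drift_nm_per_K = scan_range_nm * tempco_arcmin_per_K / scan_angle_arcmin
print(f"{drift_nm_per_K:.1f} nm/K")   # -> 3.1 nm/K, about the figure quoted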
The RC airplane servo I used in the proto appeared much more stable for the medium-scale motion, although it didn't do small steps very repeatably. It used a pot for the encoder, of course.
Right now we're working on putting a temperature controller on the encoder, mostly by thermally grounding the leads to a controlled plate, since the case is plastic and there's no thermal contact between the encoder body and the shaft.
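In case it's useful to anyone, here's a minimal sketch of the sort of control loop we have in mind for the plate. The read_plate_temp_C() and set_heater_fraction() calls are hypothetical stand-ins for whatever sensor readout and heater driver you actually have; the gains and setpoint are placeholders, not tuned values:

import time

# Minimal PI loop to hold the encoder's thermal-ground plate at a setpoint.
SETPOINT_C = 35.0     # a bit above ambient so the heater can always regulate
KP, KI     = 0.5, 0.02
DT         = 1.0      # control period, seconds

def pi_loop(read_plate_temp_C, set_heater_fraction):
    integral = 0.0
    while True:
        error = SETPOINT_C - read_plate_temp_C()
        integral += error * DT
        integral = max(min(integral, 50.0), -50.0)      # anti-windup clamp
        drive = KP * error + KI * integral
        set_heater_fraction(max(min(drive, 1.0), 0.0))  # heat-only, 0..1 duty
        time.sleep(DT)

Heat-only drive keeps it simple: pick a setpoint comfortably above the hottest expected ambient and let the plate's leakage to ambient do the cooling.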
Is the tempco usually this bad?
Any additional wisdom on these things? We can rip out the encoder and bodge in a pot if we have to, but it'll limit the instrument lifetime fairly severely.
Thanks
Phil Hobbs