Gentlemen,
I'm working on the concept for an instrument that needs to take a number of current sources (photodiodes, 44 off) and find the ratio of each of the 43 smaller currents to the 44th (the largest). The currents are in the range of, say, 100 nA to 10 uA.
Electronics design is a bit out of my skill set but I'd like to give it a try.
I could try to A/D-convert as soon as possible and divide digitally but I suspect that it is more sound to do the following:
Mirror the 44th current with BJTs and use Hobbs' laser noise canceller to form the differences of the logarithms of the currents, then A/D-convert these.
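(For scale: the log-ratio idea rests on the BJT's exponential I-V law, so an ideally matched pair carrying I_k and I_44 develops delta-V_BE = (kT/q)*ln(I_k/I_44). A quick sanity check of the output swing over my stated current range; this is just my own back-of-envelope sketch, the function name is made up:)

```python
import math

K_B = 1.380649e-23      # Boltzmann constant, J/K
Q_E = 1.602176634e-19   # electron charge, C

def log_ratio_vbe(i_k, i_ref, temp_k=300.0):
    """Delta-V_BE of an ideally matched BJT pair carrying i_k and i_ref."""
    vt = K_B * temp_k / Q_E  # thermal voltage, ~25.9 mV at 300 K
    return vt * math.log(i_k / i_ref)

# Worst case in the stated range: 100 nA measured against a 10 uA reference,
# i.e. a 100:1 ratio -> roughly -119 mV at room temperature.
dv = log_ratio_vbe(100e-9, 10e-6)
```

(Note the kT/q scale factor, about 59 mV per decade at 300 K, is itself temperature dependent, which any calibration scheme would have to absorb.)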
Now, my question is: is it at all feasible to replicate a current so many times with any kind of bandwidth and precision?
I will have to do this with discrete components so matching is going to be a problem. I anticipate using small signal RF transistors to get a reasonable beta, and having a calibration procedure to get rid of remaining errors.
That said, I don't want calibrations to be a crutch for doing things wrong in the first place.
The measurement bandwidth is going to be quite low (not fixed yet but probably less than 200 Hz) but it feels right to do the ratioing out to a few MHz.
Comments and suggestions are most welcome.
Chris Egernet