circuit to measure signal level / log amp / log math

I want to accurately (within 1 dB or so) determine the strength of a signal coming into an amplifier circuit. The input signal level can vary dramatically, and the circuit includes AGC. I am thinking that I need to do the following:

The attenuation level of the AGC circuit is logarithmically related to a current output from the AGC circuit. To accurately convert this current to a voltage that represents attenuation, and to provide temperature compensation of the log conversion, I would use a Maxim MAX4206 log amp.
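Roughly, the conversion I have in mind looks like this (a Python sketch; the volts-per-decade scale of the log amp and the dB-per-decade slope of the attenuator are placeholders that would come from the MAX4206 datasheet and a calibration run):

# Hypothetical calibration constants -- real values would come from the
# MAX4206 datasheet (output volts per decade of input current) and from
# a two-point calibration of the AGC attenuator itself.
K_VOLTS_PER_DECADE = 0.25   # log amp output scale (assumed)
V_LOGAMP_OFFSET = 0.0       # log amp output when I_agc equals the reference current
DB_PER_DECADE = 20.0        # attenuation change per decade of AGC control current (assumed)
ATTEN_AT_REF_DB = 0.0       # attenuation at the reference current (assumed)

def agc_attenuation_db(v_logamp):
    """Recover AGC attenuation in dB from the log amp output voltage.

    The log amp output is proportional to log10(I_agc / I_ref), so
    dividing out the volts-per-decade scale gives decades of control
    current, and the calibrated slope converts decades to dB.
    """
    decades = (v_logamp - V_LOGAMP_OFFSET) / K_VOLTS_PER_DECADE
    return ATTEN_AT_REF_DB + DB_PER_DECADE * decades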

I would then convert the output signal to DC, take its log, and add it to the AGC voltage obtained above.
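The rest is just dB arithmetic, continuing the sketch above (reusing agc_attenuation_db(), and assuming the detected DC value is proportional to output amplitude):

import math

def output_level_db(v_detected, v_ref=1.0):
    """Convert the detected DC output amplitude to dB relative to an
    arbitrary reference level (v_ref is a placeholder)."""
    return 20.0 * math.log10(v_detected / v_ref)

def input_level_db(v_logamp, v_detected):
    """Estimated input level = measured output level plus the
    attenuation the AGC inserted ahead of the output."""
    return output_level_db(v_detected) + agc_attenuation_db(v_logamp)

Whether that holds to 1 dB then comes down to how well the decades-to-dB slope of the attenuator stays put over temperature and signal level.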

What all am I overlooking?

Is there a simpler way?

Thanks, Scott Kelley

Reply to
Scott Kelley

For +/-1 dB accuracy, I'd suggest measuring the signal level before the amplifier. I realize that would require a separate amplifier, but that's what I would explore. Of course, if this is a "one-off" design problem, that's a different matter; then I'd try working with the amplifier and the AGC voltage. However, it might not be easy to meet your spec.

Reply to
Charles Schuler

The signal level can vary pretty dramatically: at least 60 dB, possibly a good deal more.

What would be the advantage of doing it this way, and how would you deal with the large dynamic range?

Reply to
Scott Kelley
