I want to accurately (within 1 dB or so) determine the signal strength of a signal coming into an amplifier circuit. The input signal level can vary dramatically, and the circuit includes AGC. My plan is as follows:
The attenuation of the AGC circuit is logarithmically related to a current output from the AGC circuit. To accurately convert this current to a voltage representing attenuation, and to temperature-compensate the log conversion, I would use a Maxim MAX4206 log amp.
I would then convert the output signal to DC, convert that to a log (dB) value, and add it to the AGC attenuation value obtained above to recover the input level.
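To check my arithmetic, here is a minimal sketch of the math in Python. The current-to-attenuation constants (`i_ref_ma`, `db_per_decade`) are made-up placeholders for whatever my AGC's log characteristic turns out to be; the MAX4206 would do this conversion in hardware, temperature-compensated.

```python
import math

def agc_current_to_atten_db(i_agc_ma, i_ref_ma=1.0, db_per_decade=20.0):
    """Convert AGC control current (mA) to attenuation (dB).

    The log relation here (reference current and dB-per-decade slope)
    is hypothetical -- it stands in for the measured characteristic of
    the actual AGC, which the log amp would linearize in hardware.
    """
    return db_per_decade * math.log10(i_agc_ma / i_ref_ma)

def input_level_dbm(output_dbm, atten_db):
    # Input level = measured output level + attenuation the AGC applied.
    return output_dbm + atten_db

# Example: 10 mA of AGC current -> 20 dB attenuation with these constants,
# so a -10 dBm output implies a +10 dBm input.
atten = agc_current_to_atten_db(10.0)
level = input_level_dbm(-10.0, atten)
```

In hardware the addition would just be an op-amp summer on the two log-domain voltages, so no actual computation is needed.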
What am I overlooking?
Is there a simpler way?
Thanks, Scott Kelley