Hi,

What are the practical limits (in terms of signal-to-noise ratio) of extracting a known signal (say 1 kHz) from noise and other signals using digital signal processing, when the goal is to measure that signal's amplitude accurately?

I'm working on a system that has to measure very small resistances (of the order of 1 mOhm, for measuring battery resistance) to a good accuracy (better than 1%). The system injects a sine-wave current using a FET and then measures the resulting sine-wave voltage across the resistance using an instrumentation amplifier and an ADC (12-bit; I may need to go to 16-bit). It then uses synchronous averaging to build up one cycle of the sine wave (averaging 1000 cycles), calculates the RMS of that cycle, and uses that to calculate the resistance (based on the known level of the injected current sine wave).

The filtering needs to be very good because, in our application, the resistance being measured may have a large ripple current flowing through it (UPSes often use the batteries as capacitors, so they put quite large currents through the battery). This works pretty well, and the synchronous averaging does a very good job of filtering the signal (it's an exceptional bandpass filter), but I am struggling to get the accuracy I need.

The injected current needs to be fairly low to stop the FET overheating, so I'm using around 1 to 4 A (peak), meaning that I am trying to extract a 1 to 4 mV (p-p) sine wave from within a 45 mV p-p signal (with a predominant frequency of 300 Hz, plus components at 50 Hz, 100 Hz and 600 Hz). For a 10 mOhm resistance the measurement signal is 10 mV and the noise/ripple signal is around 140 mV. The accuracy I get is typically no better than 2%.

This doesn't seem too hard for DSP if you just want to detect the signal (e.g. for digital communication), but I need to accurately measure levels, so I'm looking for a reality check to make sure that I haven't already reached the limits of what's possible.

TIA
Charles
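For concreteness, here is a minimal numeric sketch of the scheme described above: fold a record into one excitation cycle by synchronous averaging over 1000 cycles, take the RMS, and divide by the known drive current RMS. The sample rate (32 kHz), the 2 A peak drive, the 1 mOhm DUT and the noise levels are illustrative assumptions, not the actual system's values. Because 1000 cycles of the 1 kHz excitation span exactly 1 s, the 300 Hz ripple completes a whole number of periods over the record and folds coherently to zero, which is why the averaging behaves like a very narrow bandpass filter:

```python
import numpy as np

FS = 32_000               # sample rate in Hz (assumed; integer samples per cycle)
F_SIG = 1_000             # injected excitation frequency in Hz
N_PER_CYCLE = FS // F_SIG # 32 samples per excitation cycle
N_CYCLES = 1_000          # number of cycles folded into the average

rng = np.random.default_rng(0)
t = np.arange(N_CYCLES * N_PER_CYCLE) / FS

# Simulated instrumentation-amp output: a 2 mV peak signal (2 A peak through
# a 1 mOhm DUT) buried under 45 mV p-p of 300 Hz ripple plus broadband noise.
v_sig = 0.002 * np.sin(2 * np.pi * F_SIG * t)
v_ripple = 0.0225 * np.sin(2 * np.pi * 300 * t)
v_noise = 0.0005 * rng.standard_normal(t.size)
v = v_sig + v_ripple + v_noise

# Synchronous averaging: reshape the record into (cycle, sample) and average
# down the cycle axis, leaving one cleaned-up cycle of the 1 kHz waveform.
cycle = v.reshape(N_CYCLES, N_PER_CYCLE).mean(axis=0)

# Remove residual DC offset, take the RMS, and infer the resistance from the
# known RMS of the injected current (2 A peak -> 2/sqrt(2) A RMS, assumed).
cycle -= cycle.mean()
v_rms = np.sqrt(np.mean(cycle ** 2))
i_rms = 2.0 / np.sqrt(2)
r_est = v_rms / i_rms
print(f"estimated R = {r_est * 1e3:.4f} mOhm")  # should land close to 1.0 mOhm
```

One caveat this sketch illustrates: the perfect ripple rejection depends on the averaging window containing a whole number of ripple periods. If the excitation frequency, sample clock, and mains-derived ripple drift relative to each other, the rejection degrades, which is one place measurement error can creep in.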