Suppose one (actually, me) were firing an ADC at some fixed rate, ballpark 12 kHz in this case, and the input to the ADC was a sine wave of fixed but unknown frequency. The ADC takes a bunch of samples, ballpark 1000 maybe, and I want to compute the mean (i.e., the DC value) and the mean of the absolute value of the samples (i.e., the AC value). That works, but sometimes the input frequency aliases against the sample rate and messes up the data, e.g., gives a big average DC value when there's really none there.
(This is not a Nyquist issue; the sample rate may be above or below the sinewave frequency.)
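A quick numerical sketch of the failure mode (Python/NumPy; the 12 kHz rate and 1000-sample record are from above, the 1 Hz offset and phase are assumed for illustration). When the input sits 1 Hz away from the sample rate, it aliases to 1 Hz, and a 1000-sample record at 12 kHz only covers about 0.08 of an alias cycle, so the "DC" estimate is badly biased:

```python
import numpy as np

fs = 12_000.0           # sample rate, Hz
N = 1000                # record length, samples (~83 ms)
f = 12_001.0            # input frequency: 1 Hz off the sample rate
n = np.arange(N)

x = np.sin(2 * np.pi * f * n / fs + 1.0)   # arbitrary phase of 1.0 rad

# The alias barely moves over the record, so the mean is far from zero
# and the mean of |x| is far from the true 2/pi of a full-scale sine.
print(x.mean(), np.abs(x).mean())
```

With these numbers the apparent DC value comes out near full scale even though the true mean of the sine is zero, and the AC estimate is wrong too.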
So maybe I can fuzz up the sample rate so that it can't alias against any constant sinewave frequency.
I could add a pseudo-random delay after every ADC sample shot; that wouldn't change the mean sample rate a lot. Or I could add successively increasing delays, essentially sweeping the sample rate down.
Any ideas?
John