anti-aliasing

As addressed elsewhere, I have legit reasons for seeking a general solution, not to mention that it's an interesting problem. I'm also planning a couple of future measurement products that have absolutely no control over the signal frequency and need accurate signal analysis from an ensemble of samples, so it's doubly interesting.

John

Reply to
John Larkin

You could not be more wrong. The phrase I don't like, and what you really mean when you say "chaotic," is AD HOC. There are many really bad features that go along with the descriptor AD HOC, the primary one of which is you don't really know what you're doing. I use 'you' here in the general sense and not in reference to you specifically. The only thing I know for sure is, whatever you come up with, it will be ugly. LOL

Reply to
Fred Bloggs

Actually Fred, you perpetually pontificate, but you don't win, place or even show. SHOW us a product of your design.

...Jim Thompson

--
|  James E.Thompson, P.E.                           |    mens     |
|  Analog Innovations, Inc.                         |     et      |
|  Analog/Mixed-Signal ASICs and Discrete Systems   |    manus    |
|  Phoenix, Arizona            Voice:(480)460-2350  |             |
|  E-mail Address at Website     Fax:(480)460-2142  |  Brass Rat  |
|       http://www.analog-innovations.com           |    1962     |
             
         America: Land of the Free, Because of the Brave
Reply to
Jim Thompson

You take it as pontificating because you don't know anything. Testing to achieve reasonable coverage and characterization of ad hoc algorithm performance is an ongoing and important area of state-of-the-art research.

Reply to
Fred Bloggs

How about some positive suggestions instead of negative BS ??

...Jim Thompson

Reply to
Jim Thompson

Of course I know what I'm doing. I sell heaps of stuff to the biggest aerospace, scientific, and instrumentation people on the planet. Some serious fraction of all the submicron ICs made in the world are exposed by excimer lasers that my gear fires. My gear is also in close to half the analytical NMR systems made, and tests - at production and maintenance levels - jet engines and APUs that, probably, everyone in this ng depends on to keep them alive.

What do you do?

But new ideas have to come from somewhere, and that somewhere isn't in equations or textbooks. Of course ideas must be filtered for quality, in a rigorous manner, but you've got to have ideas first. Of course the process must be ad hoc on the front end, and the goofier the better. When you hope to do something you've never done before, and maybe nobody has ever done before, of course you really don't know what you're doing; that's the fun part. Especially when you've already committed to shipping the thing in five weeks.

And our stuff isn't ugly; we regularly get written compliments on its elegance and quality, and on the quality of the documentation and support. Show us something you've done.

John

Reply to
John Larkin

"John Larkin" wrote in message news: snipped-for-privacy@4ax.com...

Why not simply measure the sine wave's frequency and choose the sample rate accordingly? As long as you don't know that frequency, it's hard to predict the influence of any measures you take.

petrus bitbyter

Reply to
petrus bitbyter

It doesn't have to move it far--the aliased frequency goes from 0 to 1 cycle per N samples, which gives you a beautiful plot of the waveform. What's not to like?

If you do it three times, at least two will always have similar offsets. Majority rules.
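The majority-vote idea is easy to sketch numerically. This is not from the thread; the signal frequency, phase, and the three sample rates below are all assumed for illustration. Sample the same sine at three slightly different rates, estimate the offset as the mean of each burst, and take the median:

```python
import numpy as np

# Hypothetical numbers: 1 kHz sine, +-10 V swing, 100 mV offset to find.
f_sig, amp, dc, N = 1000.0, 10.0, 0.1, 10_000

def offset_estimate(fs):
    """DC-offset estimate: plain mean of N samples taken at rate fs."""
    t = np.arange(N) / fs
    return np.mean(dc + amp * np.sin(2 * np.pi * f_sig * t + 1.0))

# Three slightly different rates; the first is pathologically synchronous
# with the signal, so its estimate is wrecked by aliasing.
estimates = sorted(offset_estimate(fs) for fs in (1000.0, 1011.0, 1023.0))
median = estimates[1]   # the two good estimates outvote the bad one
```

Even with one rate landing dead on the signal frequency, the median stays near the true 100 mV, since a single clock would have to be pathological three different ways at once.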

Cheers,

Phil Hobbs


Reply to
Phil Hobbs

Blather.

John

Reply to
John Larkin

Yes; underline filtering is very powerful, as it puts emphasis on whatever it operates upon (usually typed characters).

Reply to
Robert Baer

Hmmm....what about some external hardware? Add a +peak detector and a -peak detector and feed their outputs to the ADC?

Reply to
Robert Baer

I only have about a square inch for the BIST stuff, and the board is done. And I want to measure both the AC voltage (not just its peak) and any DC offset. The biggest problem is to resolve a small DC offset (say, 100 mV) by averaging samples of a big (+-10 volt swing) sine wave.

Firmware is free in production, so there ought to be a way to do this. Besides, it's an interesting problem.
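It's worth quantifying why plain averaging struggles here. In this sketch (the signal and sample rates are assumed, not from the post), the residual of an N-sample average of a sine is bounded by amp / (N * |sin(pi * f / fs)|), a standard geometric-sum bound; even 10,000 samples of a +-10 V wave can leave tens of millivolts of residual:

```python
import numpy as np

amp, dc = 10.0, 0.1          # +-10 V sine riding on a 100 mV offset
f_sig, fs = 997.3, 50_000.0  # assumed signal and sample rates

errs, bounds = [], []
for N in (100, 1_000, 10_000):
    t = np.arange(N) / fs
    est = np.mean(dc + amp * np.sin(2 * np.pi * f_sig * t + 0.7))
    errs.append(abs(est - dc))
    # Geometric-sum bound on the leftover sine contribution to the mean
    bounds.append(amp / (N * abs(np.sin(np.pi * f_sig / fs))))
```

The bound shrinks only like 1/N, which is exactly why the thread keeps reaching for something cleverer than brute-force averaging.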

Frequency-domain artifacts in fast ADCs are often zapped by adding a couple of LSBs of wideband noise to the signal, to smear out quantization and DNL errors. This kills specific spectral artifacts at the expense of raising the wideband noise floor.
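The dither trick is easy to demonstrate on an idealized quantizer (a sketch of the principle, not anyone's hardware): a DC level sitting between codes averages to the wrong code forever, but with about one LSB of triangular dither the average converges to the true level:

```python
import numpy as np

rng = np.random.default_rng(1)
lsb, x, n = 1.0, 0.3, 200_000          # input stuck 0.3 LSB above code 0

quantize = lambda v: np.round(v / lsb) * lsb

no_dither = np.mean(quantize(np.full(n, x)))     # pinned to code 0 forever
# Triangular-PDF dither spanning +-1 LSB (sum of two uniforms): the
# classic choice that makes the average quantization error independent of x.
d = rng.uniform(-lsb/2, lsb/2, n) + rng.uniform(-lsb/2, lsb/2, n)
with_dither = np.mean(quantize(x + d))
```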

Nuclear detector histogramming ADCs also benefit from adding noise, to improve differential linearity and equalize bin size. Some clever guy figured out that you can add analog noise (with a dac) and then subtract out the digital equivalent, best of both worlds.
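The subtract-it-back-out trick (subtractive dither) can be sketched the same way; the assumption here is that the firmware knows the dither sequence exactly, as it would if it drove the DAC itself:

```python
import numpy as np

rng = np.random.default_rng(2)
lsb, x, n = 1.0, 0.3, 100_000

d = rng.uniform(-lsb/2, lsb/2, n)          # dither the firmware also knows
codes = np.round((x + d) / lsb) * lsb      # what the ADC reports
recovered = codes - d                      # subtract the digital equivalent
# Each recovered sample is x plus a uniform +-LSB/2 error, independent of
# where x sits between codes, so the mean converges to x with no noise
# penalty beyond the quantizer's own LSB/sqrt(12).
```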

A delta-sigma DAC zaps a horrible PWM component by spreading it out across the spectrum. A delta-sigma ADC turns a useless 1-bit quantizer into a nice 24-bit converter, same idea.
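The 1-bit-to-many-bits idea can be shown with a toy first-order delta-sigma loop (a sketch of the principle only, not any particular converter):

```python
def delta_sigma(x, n):
    """First-order delta-sigma modulator: a +-1 bitstream whose running
    average tracks a DC input x in (-1, 1)."""
    acc, fb, bits = 0.0, 0.0, []
    for _ in range(n):
        acc += x - fb                 # integrate the error vs. the feedback
        fb = 1.0 if acc >= 0 else -1.0
        bits.append(fb)
    return bits

bits = delta_sigma(0.3, 10_000)
avg = sum(bits) / len(bits)           # many coarse samples -> one fine value
```

Because the accumulator stays bounded, the running average of the bitstream differs from x by at most a constant over n, so resolution grows with averaging length.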

So if a Nyquist-scoffing sampling adc has aliasing artifacts in the signal spectrum, it seems like time-dithering the triggers can do the same thing, pulverize an alias and sprinkle it lightly across the noise floor.
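A quick numerical check of the pulverizing claim (the frequencies and jitter magnitude are assumed): a sine at 0.9*fs aliases to a clean tone at 0.1*fs, and Gaussian trigger jitter knocks that line down while spreading its energy into the floor:

```python
import numpy as np

rng = np.random.default_rng(3)
N, fs = 4000, 1.0
f_sig = 0.9 * fs                     # above Nyquist: aliases to 0.1*fs
t = np.arange(N) / fs

tone = lambda x: np.max(np.abs(np.fft.rfft(x))) / N   # biggest spectral line

clean = np.sin(2 * np.pi * f_sig * t)
jittered = np.sin(2 * np.pi * f_sig * (t + rng.normal(0, 0.25 / fs, N)))
# The alias line is attenuated by roughly exp(-(2*pi*f_sig*sigma)**2 / 2)
# and the lost energy reappears as a raised wideband noise floor.
```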

John

Reply to
John Larkin

I don't believe a word you say. Be sure and burn off a copy of that post to take to your doctor so he/she can recommend the best therapist for your megalomania...

Reply to
Fred Bloggs

You think that my web site is a personal fantasy, that nobody actually buys the stuff?

Lately we're selling a lot of tach/overspeed and thermocouple stuff into Russia, for the pipeline business. They're ramping us fast and are rolling in money. They pay us in advance, by bank transfer, in dollars. Think we should charge euros?

What do you do, Fred?

John

Reply to
John Larkin

Suppose you take samples at a fixed rate. If aliasing is giving a false value for the offset, then wouldn't looking at even and odd samples (without resampling), or every third sample, give different values for the offset, by effectively shifting the point in the wave that is sampled? Only if that check showed a big discrepancy would you have to resample at a different rate, so the test would run longer only infrequently.
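The subset check can be sketched like this (the rates, phase, and the deliberately pathological case are assumed; the function names are mine):

```python
import numpy as np

amp, dc, fs, N = 10.0, 0.1, 50_000.0, 30_000

def burst(f_sig, phase=0.7):
    """Simulated capture: N samples of a sine plus offset at rate fs."""
    n = np.arange(N)
    return dc + amp * np.sin(2 * np.pi * f_sig * n / fs + phase)

def subsets_agree(x, k, tol=0.1):
    """Offset estimates from every k-th sample should match unless the
    signal is (near-)synchronous with the sample clock."""
    means = [np.mean(x[i::k]) for i in range(k)]
    return max(means) - min(means) < tol

benign = burst(997.3)            # incommensurate: all subsets agree
aliased = burst(fs / 2)          # even and odd samples hit opposite phases
```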

-- John

Reply to
John O'Flaherty

Thinking again, that wouldn't work if the input were _exactly_ at the sampling frequency, since all samples would be at the same point in the wave. So take the first 500 samples at one sample rate, and the second 500 at a slightly different rate. If there's a difference between the two data sets, or among subsets within each set, only then do you have to resample; otherwise, average the two results.
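The two-rate version, sketched with assumed numbers, including the worst case this post worries about (signal exactly at the first sample rate):

```python
import numpy as np

amp, dc, N, f_sig = 10.0, 0.1, 10_000, 1000.0

def offset_at(fs, phase=0.7):
    """Offset estimate: mean of N samples of the sine taken at rate fs."""
    n = np.arange(N)
    return np.mean(dc + amp * np.sin(2 * np.pi * f_sig * n / fs + phase))

e1 = offset_at(1000.0)        # exactly synchronous: every sample, same phase
e2 = offset_at(1017.0)        # slightly different rate: near the true offset
suspect = abs(e1 - e2) > 0.05  # disagreement -> resample before trusting
```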

-- John

Reply to
John O'Flaherty

I was thinking along similar lines. There really is no need to change the sample rate unless there is *perfect* synchronization between the sampler and the samplee. The real problem is knowing how many samples to take in order to get accurate results.

He (Larkin) could probably set up tests to empirically determine the number of samples needed (for a given amount of accuracy), but there would always be a level of doubt, since there could be cases (as a function of production variances in the two clock sources) that give bogus results. Your technique, however, would reduce this chance (at the expense of time).

Either way, I'm glad it's not my problem. ;-}

Bob

Reply to
BobW

My first guess is that sampling with well distributed (Gaussian?) noise should do it. But it probably takes some math to prove it right or wrong.
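The guess is easy to poke at by Monte Carlo (a sketch with an assumed jitter magnitude of one signal period; this proves nothing, but it supports the intuition): with the signal exactly at the sample rate, uniform sampling is hopeless, while Gaussian trigger jitter averages the sine away:

```python
import numpy as np

rng = np.random.default_rng(4)
amp, dc, f_sig, fs, N = 10.0, 0.1, 1000.0, 1000.0, 200_000

t = np.arange(N) / fs
uniform_est = np.mean(dc + amp * np.sin(2 * np.pi * f_sig * t + 0.7))
# Gaussian jitter with sigma = one signal period: the expected sine term
# shrinks by exp(-(2*pi*f_sig*sigma)**2 / 2), essentially to zero here.
tj = t + rng.normal(0, 1.0 / f_sig, N)
jitter_est = np.mean(dc + amp * np.sin(2 * np.pi * f_sig * tj + 0.7))
```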

--
Reply to nico@nctdevpuntnl (punt=.)
Find companies and shops at www.adresboekje.nl
Reply to
Nico Coesel

If you treat each sample event as an impulse, there is probably an optimum resulting spectrum, and a corresponding time-between-samples scatter algorithm that may not be just bounded random delay.

Some really smart person should work this out.
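One way to probe this without the full math (a sketch; the jitter magnitudes are assumed): treat the sample times as an impulse train and evaluate its spectral line at the sample rate. Independent Gaussian delays only attenuate that line, while accumulated (random-walk) jitter suppresses it much harder, which hints that the optimum scatter really may not be simple bounded random delay:

```python
import numpy as np

rng = np.random.default_rng(5)
N, T = 20_000, 1.0

def line(times, f):
    """Normalized spectral line of the sampling impulse train at frequency f."""
    return abs(np.sum(np.exp(-2j * np.pi * f * times))) / len(times)

periodic = np.arange(N) * T                        # ideal clock: full line at 1/T
iid = periodic + rng.normal(0, 0.25 * T, N)        # independent Gaussian jitter
walk = np.cumsum(T + rng.normal(0, 0.05 * T, N))   # accumulated (random-walk) jitter
```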

John

Reply to
John Larkin

What I was thinking is, there's not much time burden in looking at subsets of the data, at least compared to resampling. Then, in the infrequent case that there is a bad result, more time to resample would separate the cases of coincidental aliasing from actual failure of the test.

-- John

Reply to
John O'Flaherty
