Audio oversampling

I'm curious about how high sampling rates go nowadays with standard ADC chips.

I'm starting a project that requires sampling of 15 kHz analog audio and I need as high a sampling rate as possible without spending lots of money on the ADC chip.

BTW, this application has nothing to do with audio quality. The digital audio stream will be used for other purposes. It won't be converted back to analog. But the sampling rate needs to be as high as possible.

Thanks for any input.

Reply to
George

It can go into the gigasamples per second. Please be more vague.

Reply to
a7yvm109gf5d1

Define "standard". Figure that you can easily get 100 ksps with a 16-bit SAR ADC (but maybe not 16 _good_ bits); there are plenty of good audio chips out there, and you can go up to insane sampling rates with 8-bit devices.

Start looking. If you get specific with what you really want, someone who's recently done a search may suggest some manufacturers.

--
Tim Wescott
Control systems and communications consulting
http://www.wescottdesign.com

Need to learn how to apply control theory in your embedded system?
"Applied Control Theory for Embedded Systems" by Tim Wescott
Elsevier/Newnes, http://www.wescottdesign.com/actfes/actfes.html
Reply to
Tim Wescott

OK, let me be more vague ... :o)

I'm taking a 15 kHz random audio source (music) and converting it to digital in an ADC. The streaming output from the ADC is tested on-the-fly against a specific data pattern of a few (less than a dozen) samples. If there's a match, the time it occurs is reported. I'm assuming 16-bit accuracy.
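Roughly, the matching step I have in mind looks something like the Python sketch below. The pattern values, tolerance, and 48 kHz rate are placeholders, not the real numbers:

    import numpy as np

    # Placeholder pattern: a handful of 16-bit sample values (not the real one).
    PATTERN = np.array([1203, -842, 977, -15, 3301, -2760], dtype=np.int16)
    TOLERANCE = 8          # allowed per-sample deviation in ADC counts (assumed)
    SAMPLE_RATE = 48_000   # Hz; whatever rate the ADC actually runs at

    def find_pattern_times(samples):
        """Return the times (in seconds) at which PATTERN appears in the stream."""
        samples = np.asarray(samples, dtype=np.int32)
        n = len(PATTERN)
        hits = []
        for i in range(len(samples) - n + 1):
            window = samples[i:i + n]
            if np.all(np.abs(window - PATTERN) <= TOLERANCE):
                hits.append(i / SAMPLE_RATE)  # time of the first matching sample
        return hits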

This same process is occurring in TWO devices looking at the same analog source. The devices are operating asynchronously with no common clock, though the clock frequencies are reasonably close to one another.

Bottom line, the accuracy of the time-of-occurrence measurement in each device has to be such that the two independent measurements agree closely in time with one another. This depends on several factors including ADC sample rate. Less time between samples means less uncertainty in the time that each device recognizes the desired data pattern in the common source.
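As a quick worked example of that dependence (the rates are just for illustration): if each device can be off by up to roughly one sample period, then

    # Worst-case timing disagreement from sampling alone, assuming roughly
    # one sample period of uncertainty per device (illustrative rates only).
    for fs in (48_000, 192_000, 1_000_000):
        print(f"{fs:>9} sps -> sample period {1e6 / fs:6.2f} us")
    #     48000 sps -> sample period  20.83 us
    #    192000 sps -> sample period   5.21 us
    #   1000000 sps -> sample period   1.00 us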

So I'm still after the same thing: how high do sampling rates go nowadays with standard ADC chips?

Reply to
George

It sounds like you are going to synchronize these streams by tossing out samples. [It is a form of phase locking.] So the worst case is that the two systems would be sampling half a cycle apart. That would be your 0.25 µs, or a 4 MHz sample rate. This is the minimum, so a video ADC would do the trick.

Hopefully I haven't oversimplified the problem.

Reply to
miso

This is not a good idea for a number of reasons. It is not going to work this way.

That does not imply fast sampling. If your signal is bandlimited to 15 kHz, sampling faster than 30 kHz doesn't buy any extra accuracy. BTW, time measurement of the 15 kHz signal to 0.5 µs accuracy may require quite a high SNR and sophisticated processing. However, if you insist on your mistakes, there are a number of reasonable 16-bit ADCs which can sample at 1 MHz or so. Check the PulSAR line from AD, and TI/BB too.
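For illustration, one common way to get sub-sample timing out of a slowly sampled, bandlimited stream is to cross-correlate against the known pattern and interpolate around the correlation peak. A rough sketch only, with nothing tuned to your actual signal:

    import numpy as np

    def subsample_match_time(stream, pattern, fs):
        """Estimate when `pattern` occurs in `stream` with sub-sample resolution:
        find the cross-correlation peak, then refine it with a parabolic fit.
        Rough sketch; real data needs enough SNR for the fit to mean anything."""
        corr = np.correlate(stream.astype(float), pattern.astype(float), mode="valid")
        k = int(np.argmax(corr))
        frac = 0.0
        if 0 < k < len(corr) - 1:                  # parabolic peak interpolation
            y0, y1, y2 = corr[k - 1], corr[k], corr[k + 1]
            denom = y0 - 2.0 * y1 + y2
            if denom != 0.0:
                frac = 0.5 * (y0 - y2) / denom
        return (k + frac) / fs                     # match start time, in seconds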

Vladimir Vassilevsky DSP and Mixed Signal Consultant

Reply to
Vladimir Vassilevsky

Less than a microsecond of match-detection uncertainty? That will nominally require 10 Msps. Hard to get 16 bits at that speed.

Reasonably priced equipment (under US$500) will do 24 bits at 192 ksps, about 1/50 of the speed you are talking about.
--
 JosephKK
 Against stupidity the gods themselves contend in vain.
  --Schiller
Reply to
joseph2k
