How do you test a 24-bit ADC?

Can anyone recommend a way to test the resolution and accuracy of a 24-bit converter? Does this require expensive, calibrated, super-precision equipment, or is there a simpler (cheaper) way to do this?

much thanks!

Reply to
Fibo

You need a voltmeter that is certified linear to at least the precision of the ADC. A secondhand Solartron 7061 might hack it if you don't need traceability back to a certified reference. Drift will be an issue.

Expect to see quadratic droop and certain bits systematically wrong.

Testing an ADC or calibrating it to ideal behaviour against a high quality standard reference is sometimes done for mass spectrometers.

You could probably hire something certified for a month if needed.

--
Regards, 
Martin Brown
Reply to
Martin Brown


On 12.06.2014 05:43, Fibo wrote:

It could be worth reading Linear Technology Application Notes 86 and 120.

Jorgen

Reply to
Lund-Nielsen, Jorgen


There have been a couple of published schemes. The one published in Measurement Science and Technology a decade or so ago is rubbish - I refereed it and recommended against publication.

There was a rather better scheme published in one of the IEEE journals, a year or so later, which dealt with all the issues that I'd raised in my review.

IIRR - and it's a long time ago - the second system depended on generating a very high quality sine wave.

Jim Williams tested an 18-bit ADC that way [1].

[1] Jim Williams and Guy Hoover, "Test 18-bit ADCs with an ultrapure sine-wave oscillator", EDN, August 11, 2011, pages 19-23.
formatting link

The National Semiconductor/Texas Instruments LME49710 integrated circuit operational amplifier introduces quite a lot less distortion than the Linear Technology parts that Jim Williams was constrained to use, and has been used in a Wien bridge oscillator to make a sine wave whose harmonic content was some 140dB below the fundamental, good enough to test a 24-bit ADC.
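To put a number on "good enough": the quantisation-noise-limited SNR of an ideal N-bit converter is about 6.02*N + 1.76 dB, so harmonics sitting some 140dB down are roughly at the 24-bit level. A quick, purely illustrative check in Python:

# The test sine wave's harmonics need to sit at or below the converter's own
# ideal quantisation-noise floor, which for an N-bit ADC is 6.02*N + 1.76 dB.
def ideal_snr_db(bits):
    return 6.02 * bits + 1.76

for bits in (18, 20, 24):
    print(f"{bits}-bit ideal SNR ~ {ideal_snr_db(bits):.1f} dB")
# 18-bit ~ 110.1 dB, 20-bit ~ 122.2 dB, 24-bit ~ 146.2 dB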

--
Bill Sloman, Sydney
Reply to
Bill Sloman


This might help

formatting link

They are talking about a 20-bit converter, but it's very likely that your "24-bit" converter is really only good for 20 bits. The slower sigma-delta ADCs all pump out three 8-bit bytes of data, but the last 4 bits tend to be noise rather than signal. Marketing is happy to ignore this inconvenient fact, but the data sheets usually spell it out, albeit in small print.
--
Bill Sloman, Sydney
Reply to
Bill Sloman

There are several articles in the Jim Williams Analog Design books, 1 & 2. One article is entitled "Get all the Bits You Paid For". I mention it since it addresses the issue of "orphan bits," the LSBs.

I have seen a test whereby you just run the ADC many times and look at the distribution of outputs. Could a mercury battery provide the test signal? Those standard tests are done to test the noise in the feeder circuit more than the ADC itself. BUT... it does bring up the issue of whether you want to test the ADC in isolation or in a practical circuit.
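A minimal sketch of that "run it many times" test, assuming you can log the raw output codes to a PC; the function name and the numbers in the final comment are made up for illustration. As noted, it measures the total noise of the source, wiring and ADC together, not the ADC alone.

import math
import statistics

def effective_bits_from_dc_test(codes, n_bits=24):
    """Estimate usable resolution from repeated conversions of a quiet DC input.

    codes is a list of raw ADC output codes taken with a stable source (e.g. a
    battery through a low-pass filter) on the input.  The spread of the codes
    reflects the total noise of source + wiring + ADC, not the ADC by itself.
    """
    sigma_codes = statistics.pstdev(codes)        # RMS noise, in LSBs
    full_scale_codes = 2 ** n_bits
    return math.log2(full_scale_codes / max(sigma_codes, 1e-12))

# e.g. if 10000 readings of a battery show a standard deviation of 12 codes,
# the effective resolution is about log2(2**24 / 12) ~ 20.4 bits.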

Just my 2 bits worth, an admittedly non-expert opinion.

Reply to
haiticare2011


You really want to test all the possible input voltage levels, with a more or less static signal and when the input is slewing at a realistic rate.

The low-distortion sine wave isn't too bad, particularly if you can get away with over-driving the input by about 50%, so that the sine wave can have an appreciable slew rate as it crosses into the out-of-range regions.

Amplifying random noise and just seeing how even the histogram of the various output values is would be useful - though less useful, since you won't know which input voltages caused the bad outputs. The histogram ought to be a Gaussian distribution, or the central chunk out of a Gaussian distribution.

It doesn't directly test for slew rate problems - though if you low-pass filtered the random noise and boosted the amplitude enough to keep it more or less filling the full-scale range you could get some kind of handle on slew-rate-related imperfections.
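For what it's worth, here is a rough outline (Python, numpy assumed) of the code-density version of the over-driven sine test: the histogram of output codes from a pure, over-ranged sine should follow an arcsine density, and the deviation from it gives the DNL of each code. It's only a sketch of the standard histogram technique, not a finished tool, and at a genuine 24 bits the required record length gets enormous.

import numpy as np

def dnl_from_sine_histogram(codes, n_bits=18, overdrive=1.5):
    """Code-density test with a very pure sine over-driving the input.

    For a sine of amplitude overdrive x full scale, the expected count in each
    code bin follows an arcsine density; the ratio of observed to expected
    counts gives the differential non-linearity (DNL) in LSBs.  In practice
    the amplitude and offset would be fitted from the data rather than assumed.
    """
    n_codes = 2 ** n_bits
    hist = np.bincount(np.asarray(codes), minlength=n_codes).astype(float)
    hist = hist[1:-1]                              # drop the clipped end bins
    # Code-bin edges in units of the sine amplitude (full scale = -1 .. +1).
    edges = np.linspace(-1.0, 1.0, n_codes + 1)[1:-1] / overdrive
    ideal = np.diff(np.arcsin(edges)) / np.pi      # ideal interior bin weights
    ideal *= hist.sum() / ideal.sum()              # scale to the observed counts
    return hist / ideal - 1.0                      # DNL per code, in LSBs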

--
Bill Sloman, Sydney
Reply to
Bill Sloman


And none of this says what his application is or the circuits he is using it in. In fact, what do people use 24-bit ADCs for? Isn't there an information theory or a quantum limit there somewhere?

Reply to
haiticare2011


I've used ADCs of that sort of accuracy exactly twice. Once was in a very precise weighing head, where the progressively increasing weight of a single crystal of GaAs, as it was being drawn from a pool of molten GaAs, was the primary control signal for the induction heaters that determined exactly where the freezing point lay within the - rotating - crucible. I didn't have anything to do with the ADC - I just fed it the signal that got digitised.

The other time is written up in

Sloman A.W., Buggs P., Molloy J., and Stewart D. "A micro-controller-based driver to stabilise the temperature of an optical stage to 1mK in the range 4C to 38C, using a Peltier heat pump and a thermistor sensor" Measurement Science and Technology, 7 1653-64 (1996)

We used a 20-bit ADC as one half of a Wheatstone bridge, where the other half was a thermistor and a 78k7 +/-0.01% 15ppm/K metal-film resistor.

The quantisation noise from the ADC limited temperature resolution to +/-100uK.

Local lowish-frequency noise - 0.0025Hz, which probably came from the lab air-conditioning unit - limited us to a stability of +/-1mK, so 20 bits was about right.

If we'd wanted to use a Pt-1000 temperature sensor we would have had to work a bit harder.

The Johnson noise limit in that circuit was around +/-7.5uK peak-to-peak.
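A back-of-envelope check of that quantisation limit, with an ASSUMED thermistor sensitivity of roughly 4%/K near balance - the real component values are in the paper; these numbers are only illustrative:

# Quantisation-limited temperature resolution for a half-bridge with a
# thermistor whose resistance changes by an ASSUMED ~4 % per kelvin.
adc_bits = 20
thermistor_sens = 0.04                  # fractional resistance change per K
bridge_sens = thermistor_sens / 4.0     # near balance, dV/Vref per kelvin
lsb = 2.0 ** -adc_bits                  # quantisation step, as a fraction of Vref
print(f"temperature per LSB ~ {lsb / bridge_sens * 1e6:.0f} uK")
# ~95 uK, consistent with the +/-100uK quantisation limit quoted above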

I once worked with a much faster 18-bit DAC system (in an electron beam microfabricator) where we had to rework the output amplifier to get the input amplifier noise down below the quantisation noise, which was a unique experience.

The problem didn't show up until we were asked to write our patterns in a new, and appreciably more sensitive, electron-beam-sensitive resist, and could consequently move the beam a lot faster, so the thermal noise from the amplifier input showed up as a bobbling on the written track as the beam went from moving faster than it should have done to slower, and back again. Ugly, but fixable.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

The link didn't work for me.

Reply to
RobertMacy

There are practical applications, especially in the audio range.

I use 24 bits to extend and improve the interpretation of sensor information, in order to run more effective algorithms on the data sets from NDE eddy-current sensors. Overall it lowered the noise floor to less than 1/100th of what it was previously, which in eddy-current systems enables 'viewing' to twice the depth. It's a high cost to squeak a bit deeper, but a 1/4 inch diameter air-core probe can 'see' through 1 inch of aircraft aluminum. Cleaner data means you can do more with the information.

I'm overstating and giving something away here, but the trick is to work with fundamental tones, which by the perversity of the electronics ALWAYS get distorted; don't pay attention to any of their upper harmonics, throw them away, and by doing so you can make the system 'appear' to be almost perfect. 24 bits will get you down close to 0.1 ppm error.

You're balking at 24 bits? At low frequencies there are 32-bit ADCs specifically for processing seismic sensor activity - take a look at TI's ADC and app note. As you go up in frequency, 18 bits will give you the equivalent of 24 bits - think 'dithering'.
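Putting rough numbers on those two remarks - what one part in 2^24 is in ppm, and how much oversampling the 'dithering' route needs - purely as an illustration:

# The "close to 0.1 ppm" remark: one part in 2**24 of full scale.
print(f"1 LSB at 24 bits = {1 / 2**24 * 1e6:.3f} ppm of full scale")   # ~0.060 ppm

# The dithering/oversampling remark: with enough dither, every factor-of-4
# increase in sample rate buys roughly one extra bit after averaging, so
# getting 24-bit-equivalent results from an 18-bit converter needs about:
extra_bits = 24 - 18
print(f"oversampling ratio ~ {4 ** extra_bits}x")                      # 4096x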

Reply to
RobertMacy

Thanks for the responses. I had read some of that Jim Williams stuff and was wondering if there was an easier way - maybe this is just not an easy thing.

The application is a large-dynamic-range current sensor, from 1mA to 100A, taking a reading across a 1 milliohm shunt.

It's a DC current, but I would like to do the same for AC in the future.

I don't need all 24-bits, but I'd like to figure out a way to know how many I'm getting.

If this gets too tricky, I may do two separate ADCs with two different gain stages to get my range.
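As a rough sizing exercise for that shunt (illustrative numbers only, assuming you want some resolution left at the 1mA end):

import math

# 1 milliohm shunt, 1 mA to 100 A: the tap voltage runs from 1 uV to 100 mV.
shunt_ohms = 1e-3
i_min, i_max = 1e-3, 100.0                       # amps
v_min, v_max = i_min * shunt_ohms, i_max * shunt_ohms

span_bits = math.log2(i_max / i_min)             # ~16.6 bits just to put 1 mA at 1 LSB
needed_bits = span_bits + math.log2(100)         # plus ~6.6 bits for 1 % resolution at 1 mA
print(f"span {span_bits:.1f} bits, ~{needed_bits:.1f} bits for 1% at 1 mA")
# which is why splitting the range across two gain stages looks attractive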

much thanks!

Reply to
Fibo

If it's a delta-sigma, which it likely is, a sine wave test won't work, because the ADC will have a wide integration window.

You can test it with just a string of stable resistors, and a lot of patience.

Imagine a series string of very stable (Vishay?) resistors, with various values. Apply Vref to the ADC and to the string, and digitize all the taps. Flip the string, and remeasure. Something like that.

The numbers that come out of the ADC are dimensionless (fraction of Vref) so no external artifacts are needed to do calibration. It's like balancing weights.

An old GR decade voltage divider box would be interesting. They were pretty good.

Expect something like 18-20 bits of real accuracy from a "24-bit" ADC.
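A minimal sketch of one way to use that resistor-string idea, assuming you can log the raw codes at each tap; the resistor values and function names below are made up for illustration, and the flip check only exposes the symmetric part of the non-linearity:

NOMINAL_OHMS = [10_000, 10_000, 20_000, 50_000, 100_000]   # stable series string
N_BITS = 24

def expected_fractions(resistors):
    """Tap voltages as fractions of Vref, from the nominal resistor values."""
    total = sum(resistors)
    acc, fracs = 0.0, []
    for r in resistors[:-1]:
        acc += r
        fracs.append(acc / total)
    return fracs

def flip_check(codes_normal, codes_flipped, n_bits=N_BITS):
    """Consistency check that does not depend on the resistor tolerances.

    With the string reversed (swap which end sits at Vref), a tap that was at
    a fraction f of Vref is now at 1 - f, so for a linear ADC the two codes
    for each physical tap should sum to full scale.  The residuals expose the
    symmetric part of the integral non-linearity.
    """
    full_scale = 2 ** n_bits - 1
    return [a + b - full_scale for a, b in zip(codes_normal, codes_flipped)]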

--

John Larkin                  Highland Technology Inc 
www.highlandtechnology.com   jlarkin at highlandtechnology dot com    

Precision electronic instrumentation
Reply to
John Larkin

Oops. Sorry.

formatting link

They seem to have been fiddling with their web-site again.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

As a bonus the lower bits of this "32-bit" ADC probably make a pretty good random number generator.

--sp

Reply to
Spehro Pefhany

You should watch the lower DIGITS of the Geometrics magnetic field meter; its display shows 12 digits, with the last ones little more than a blur.

Hmmm, 12 digits is what, 40 bits? So sensitive that you can see the 'blur' pattern changing because a bus was moving around 2 blocks away.

Reply to
RobertMacy

Thanks, got it!

Reply to
RobertMacy

Yes, almost 40 bits. That's pretty incredible.

What's the measurement principle? Something involving time, I assume.

--sp

Best regards, Spehro Pefhany

--
"it's the network..."                          "The Journey is the reward" 
speff@interlog.com             Info for manufacturers: http://www.trexon.com 
Embedded software/hardware/analog  Info for designers:  http://www.speff.com
Reply to
Spehro Pefhany

Righteo.

I missed the explanation. Something to do with microwave frequencies, kicking electrons into a higher orbit and watching them fall back down. Sounded like 'super' precession on a tiny level. But the frequency measurement yields a lot of 'accuracy'.

Reply to
RobertMacy
