triangle

It has a lot of parts, but they're parts that we have

How can you possibly know that? Have you been peeking in on him?

Reply to
John S

"Tom Del Rosso" wrote in message news:ng08jr$9bd$ snipped-for-privacy@dont-email.me...

Overshoot -- the op-amp doesn't react instantaneously. You get three errors:

  1. Finite op-amp gain: it's wired as an integrator, but the loop gain drops at some point, so the input voltage becomes nonzero.
  2. The op-amp itself is an integrator, but not an ideal one (it has additional poles and delays near fT), and it integrates at a different rate than the circuit is designed for.
  3. Feed-forward, because the op-amp output pin isn't an ideal voltage source, even for good bipolar parts. This becomes a gross error for most R2R types.

Simply limiting the input bandwidth to fT or less will more or less take care of this. Arguably it's kind of hacky, because you're just rounding it off with an RC (or better), but it isn't arbitrary: the cutoff is some ratio of fT.
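As a rough numeric sketch of that last point (the fT, ratio, and capacitor values here are assumed for illustration, not from the thread): pick a cutoff some ratio below fT and solve the single-pole RC for it.

```python
from math import pi

# Assumed example values -- not from the thread.
f_T = 10e6           # op-amp gain-bandwidth product, Hz
f_c = f_T / 10       # put the input cutoff a decade below fT (illustrative ratio)
C = 1e-9             # pick a convenient capacitor, farads

# Single-pole RC lowpass: f_c = 1 / (2*pi*R*C), solved for R
R = 1 / (2 * pi * f_c * C)
print(f"R = {R:.1f} ohms for f_c = {f_c/1e6:.1f} MHz")  # R = 159.2 ohms
```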

Tim

--
Seven Transistor Labs, LLC 
Electrical Engineering Consultation and Contract Design 
Website: http://seventransistorlabs.com
Reply to
Tim Williams

OK, given the triangle, I can sweep the ADC range and analyze the codes.

Given a random ADC sample, any selected ADC bit should have a 50% probability of being true, a coin toss. So if I take N samples and get X ones, I can decide if a bit is broken by seeing if X is reasonable. That involves the cumulative binomial distribution.

There are lots of online calculators for the cumulative binomial distribution, but most seem to use single-precision math and break for practical values of N and X. The Wolfram one seems to do huge-precision math

formatting link

and handles N=5000, which most others won't.

Looks like N=500, K=330 is good. The probability of getting more than 330 1's (or fewer than 170) is about 1e-13.
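Those numbers are easy to sanity-check without an online calculator, since Python's big integers make the binomial sum exact (a sketch; the 500/330 values are from the post above):

```python
from math import comb

def tail_ge(n, k):
    """P(X >= k) for X ~ Binomial(n, p=0.5), computed exactly with big ints."""
    return sum(comb(n, i) for i in range(k, n + 1)) / 2**n

n, k = 500, 330
p_one_sided = tail_ge(n, k)      # probability of >= 330 ones
p_two_sided = 2 * p_one_sided    # symmetric tail also covers <= 170 ones
print(p_two_sided)               # on the order of 1e-12 or below
```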

Wolfram has a lot of cool stuff.

--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

How much certainty are you going for? The traditional five 9's = 99.999%?

Reply to
bloggs.fredbloggs.fred

I'd rather have no false errors (below 1% probability) on, say, 1000 production units, each with a 12-bit ADC, which is 12,000 bits to test. So something south of 1 PPM probability would be OK. But the way the binomial thing works, just a small increase in X takes you from parts-per-thousand to parts-per-trillion.

The data from the ADC to the FPGA is LVDS, and I'm guessing that an open or a short anywhere on the chip or the board, or some clocking error, will really trash the bit statistics [1]. We can do gain tests to check the first two or three MSBs, but for the LSBs we'll use the triangle statistics thing.

The interface to the box is serial and slow, so we want to keep N down.

We could get compulsive and do bit-bit correlations or something, but that's probably not necessary to catch part and production errors. If the signals were single-ended, that might be more reasonable.
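For what it's worth, a minimal bit-bit correlation check is only a few lines (a sketch with hypothetical names and synthetic data; a real test would use captured ADC samples):

```python
import random

def bit_agreement(samples, i, j):
    """Fraction of samples where ADC bits i and j agree: ~0.5 for
    independent fair bits, 1.0 if the two lines are shorted together."""
    agree = sum(((s >> i) & 1) == ((s >> j) & 1) for s in samples)
    return agree / len(samples)

random.seed(1)
good = [random.getrandbits(12) for _ in range(10000)]   # healthy 12-bit codes
print(bit_agreement(good, 0, 1))                        # near 0.5

# Simulate a line-line short: force bit 1 to copy bit 0
shorted = [(s & ~0b10) | ((s & 1) << 1) for s in good]
print(bit_agreement(shorted, 0, 1))                     # exactly 1.0
```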

It's a 250 MHz ADC, so the timing window is ±2 ns. The guy who did the FPGA initially used the wrong clock edge, and it actually worked at room temperature, usually, but it was very delicate.

[1] unless a line-line short has no effect on the data at all! That's possible, I think.
--

John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin
