Data Acquisition Accuracy & Clock Jitter

Having a bit of difficulty sorting out the effect on signals when there is some clock jitter at the sampling point.

It appears that Clock Jitter DESTROYS the signal!

And, it is NOT a simple relationship. For example, low speed signals are barely affected, but high speed signals...

And when both are present it's another matter. Which suggests that the 'stamp' clock jitter places on your ADC's performance depends on the time-domain signal being sampled, not just on the converter itself.

So how does one determine the clock jitter allowable?

And, what is the minimum clock jitter I could expect in a well-designed 200 MS/s system? Is there some way to get better?
Reply to
RobertMacy

Once the signal is sampled all you have is amplitude, so a timing error translates into an amplitude error.

It is basically: jitter (s) * slew rate (V/s) ~= error (V)
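
A quick back-of-the-envelope sketch of that relation in Python; the numbers below (1 V peak, 10 MHz sine, 500 fs rms jitter) are just assumed example values:

import math

# Amplitude error from clock jitter on a full-scale sine.
# All numbers are assumptions for illustration only.
f_sig = 10e6        # signal frequency, Hz
a_pk = 1.0          # peak amplitude, V
t_jitter = 500e-15  # rms clock jitter, s

slew_max = 2 * math.pi * f_sig * a_pk   # worst-case slew rate of a_pk*sin(2*pi*f*t), V/s
v_err = t_jitter * slew_max             # error (V) ~ jitter (s) * slew rate (V/s)

print(f"worst-case slew rate: {slew_max:.3g} V/s")
print(f"amplitude error     : {v_err*1e6:.3g} uV")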

I'm sure google can find a more formal calculation of what you need for a given requirement

-Lasse

Reply to
Lasse Langwadt Christensen

For sample clock jitter there are three things to consider: the number of digitizing levels, the frequency of the highest acquired signal, and the sample clock rate.

The way to think about this is in terms of slew rate, i.e., how much time does it take to create 1/2 LSB of error?
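
A minimal sketch of that calculation, with assumed numbers (12 bits, 100 MHz full-scale sine, 2 V p-p range):

import math

# How long does a full-scale sine take to slew 1/2 LSB?
n_bits = 12          # ADC resolution (assumed)
f_sig = 100e6        # highest signal frequency, Hz (assumed)
fs_range = 2.0       # full-scale range, V p-p (assumed)

v_half_lsb = fs_range / 2**n_bits / 2     # 1/2 LSB, V
slew_max = math.pi * f_sig * fs_range     # max slew of a full-scale sine, V/s
t_half_lsb = v_half_lsb / slew_max        # time to move 1/2 LSB at max slew, s

print(f"1/2 LSB        : {v_half_lsb*1e6:.3g} uV")
print(f"time to slew it: {t_half_lsb*1e15:.3g} fs")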

gl

Reply to
jdc

Does that imply that oversampling would improve accuracy by the square root of the number of 'over' samples, effectively improving the ADC system?

Reply to
RobertMacy

It's easy enough: The error in a sample is proportional to the product of the timing error and the rate of change of the input signal.

The allowable jitter depends on the resolution of your ADC. To give some idea, to digitize a 100 MHz sine to 12 effective bits, you need clock jitter to be in the fractional-ps ballpark. Your results suggest that your clock jitter is very bad indeed.
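
A rough check of that ballpark, using the usual jitter-limited SNR relation for a full-scale sine, SNR = -20*log10(2*pi*f*t_j), inverted for a 12-effective-bit target:

import math

f_sig = 100e6                     # input frequency, Hz
enob = 12                         # target effective bits
snr_db = 6.02 * enob + 1.76       # SNR needed for 12 effective bits, dB

# Invert SNR = -20*log10(2*pi*f*t_j) to get the maximum allowed rms jitter.
t_j = 10 ** (-snr_db / 20) / (2 * math.pi * f_sig)

print(f"SNR needed    : {snr_db:.1f} dB")
print(f"max rms jitter: {t_j*1e15:.0f} fs")   # lands around 300 fs, i.e. fractional ps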

You get good jitter specs by choosing your sampling clock source wisely: quiet, stable quartz oscillators, good layout and decoupling. *Don't* put your clock through FPGAs or other shared logic. Treat your clock as if it were a sensitive analog signal.

Jeroen Belleman

Reply to
Jeroen Belleman

If you oversample and then filter the data downstream somehow, that will effectively reduce jitter. But it's complex, because "jitter" is complex. An FFT or bandpass filter will reduce effective bandwidth, if that's what you are doing.
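
A small Monte Carlo sketch of that idea (all numbers assumed: 200 MS/s, 500 fs rms white jitter, a 1 MHz sine, averaging by 16). For a signal well inside the decimated band, the jitter-induced error drops by roughly sqrt(N):

import numpy as np

rng = np.random.default_rng(0)
fs = 200e6          # sample rate, Hz
f_sig = 1e6         # signal frequency, Hz (well inside the decimated band)
t_j = 500e-15       # rms clock jitter, s
n_avg = 16          # oversampling / averaging factor
n_samp = 1 << 16

t_ideal = np.arange(n_samp) / fs
t_real = t_ideal + rng.normal(0.0, t_j, n_samp)   # jittered sample instants

x_ideal = np.sin(2 * np.pi * f_sig * t_ideal)
x_real = np.sin(2 * np.pi * f_sig * t_real)

def block_avg(x, n):
    # average non-overlapping blocks of n samples (simple decimating filter)
    return x[: len(x) - len(x) % n].reshape(-1, n).mean(axis=1)

err_raw = np.std(x_real - x_ideal)
err_avg = np.std(block_avg(x_real, n_avg) - block_avg(x_ideal, n_avg))

print(f"raw rms error     : {err_raw:.3e}")
print(f"averaged rms error: {err_avg:.3e}  (raw/sqrt({n_avg}) = {err_raw/np.sqrt(n_avg):.3e})")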

Got any details of the application?

--

John Larkin                  Highland Technology Inc 
www.highlandtechnology.com   jlarkin at highlandtechnology dot com    

Precision electronic instrumentation 
Picosecond-resolution Digital Delay and Pulse generators 
Custom timing and laser controllers 
Photonics and fiberoptic TTL data links 
VME  analog, thermocouple, LVDT, synchro, tachometer 
Multichannel arbitrary waveform generators
Reply to
John Larkin

We always use a jitter attenuator/clock distribution chip to drive our high-speed ADCs, using low-skew drivers to independently clock the ADCs and FPGA.

Silicon Labs does a good range, as does TI. For example, the Si5317 has an input clock frequency range of 1 - 700 MHz and an output jitter of

Reply to
Andy Bartlett

This application note is apposite:

formatting link

--
John
Reply to
quiasmox

You can turn this into jitter * (some constant * frequency) * (amplitude), which would make it easier to determine the resulting noise if you have the spectrum of the input.

I'm pretty sure that "some constant" is 2 * pi, but I'd have to actually _work_ to figure it out.
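
(For a sine A*sin(2*pi*f*t) the derivative is 2*pi*f*A*cos(2*pi*f*t), so the worst-case slew rate is 2*pi*f*A and the constant is indeed 2*pi; averaged as an rms over a full cycle it works out to 2*pi/sqrt(2).)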

--

Tim Wescott 
Wescott Design Services 
http://www.wescottdesign.com
Reply to
Tim Wescott

DON'T DO THAT!!!

--

John Larkin         Highland Technology, Inc 

jlarkin at highlandtechnology dot com 
http://www.highlandtechnology.com 

Precision electronic instrumentation 
Picosecond-resolution Digital Delay and Pulse generators 
Custom laser drivers and controllers 
Photonics and fiberoptic TTL data links 
VME thermocouple, LVDT, synchro   acquisition and simulation
Reply to
John Larkin

Thank you for Linear's app note. The app note has curves starting at 200 fs rms; I've been trying to live with 500 fs and can almost make the system work. The app note then goes on to mention the LTC2209, which has 70 fs rms aperture jitter!! That'll do it. Less than 100 fs rms aperture jitter is possible. Since my application already reduces the jitter by 10:1 [I think], that makes the 'effective' jitter 10 fs, and that almost works! Thanks.

Reply to
RobertMacy

Thank you for the URL; I'd forgotten about SiLabs as a source. I can live with more than 10% clock jitter everywhere BUT the data acquisition aperture. 'Rattling' around there wreaks havoc on the quality of digitizing the input signal.

Reply to
RobertMacy
