While I have only a 40 MHz clock connected to a 10-bit ADC and nothing connected
to the analog input sockets (the chip is an Analog Devices AD9218 on an ADC EVM
board), on the LA the sampled outputs are 0~16 for one channel and -32~0 for the
other channel (400 MHz timing mode). Is my board faulty?
According to my simulation of the digital design, my sampled inputs (10 bits
each) cannot have noise higher than 1 LSB (flipping even a small number of
low-order bits causes errors downstream).
Is the setup I have workable or croaked?
I took a look at the data sheet
For the noise question:
1. The inputs need a low impedance source. Someone already noted trying
to tie the inputs to ground.
2. The Sample and hold (as with all sample and holds) will mix any
extraneous noise, regardless of the precautions taken in the silicon.
3. A 10 bit converter usually has something of the order of 8.5 ENOB
(effective number of bits).
4. The output will definitely be flipping codes if the input sits at a
transition between two codes. I would not expect the samples to be
varying up and down too far, though.
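Point 4 is easy to see in simulation. Here is a minimal sketch (my own numbers, assuming a 1 V range and an ideal quantizer, not the AD9218 specifically) of a 10-bit converter with its DC input parked exactly on a code boundary; even noise well under 1 LSB toggles the output between the two adjacent codes:

```python
import random

# Hypothetical sketch: an ideal 10-bit quantizer with a DC input sitting
# exactly on a code transition.  Even sub-LSB noise makes the output
# toggle between the two adjacent codes.
FULL_SCALE = 1.0            # assumed 1 V input range
BITS = 10
LSB = FULL_SCALE / 2**BITS

def quantize(v):
    """Ideal 10-bit quantizer, clamped to the valid code range."""
    code = int(v / LSB)
    return max(0, min(2**BITS - 1, code))

random.seed(0)
boundary = 512 * LSB        # input parked on the 511/512 transition
samples = [quantize(boundary + random.gauss(0, 0.1 * LSB))
           for _ in range(1000)]
print(sorted(set(samples)))  # only the two adjacent codes appear
```

The noise here is only 0.1 LSB RMS, yet the output still flips; that is the code-edge effect, not a fault.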
The datasheet shows pretty superior performance, but you still have a
limited differential input range of either 1 or 2V (depending on
device). So 1 LSB equates to just under a millivolt for the 1V range.
It's not hard to get a few millivolts of noise onto an analog signal.
Signal shielding and low impedance drive are two of the various tricks
we have up our sleeves.
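A back-of-the-envelope check of the LSB size quoted above (my arithmetic, assuming the 1 V differential range):

```python
# LSB size for a 10-bit converter over an assumed 1 V range, and how many
# codes a few millivolts of pickup spans.
full_scale_v = 1.0
bits = 10
lsb_mv = full_scale_v / 2**bits * 1000
print(f"1 LSB = {lsb_mv:.3f} mV")       # just under a millivolt

noise_mv = 3.0                          # a few millivolts of pickup
print(f"{noise_mv} mV of noise spans {noise_mv / lsb_mv:.1f} LSB")
```

So a mere 3 mV of pickup already dances across three codes, which is consistent with the spread the original poster is seeing.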
Keep in mind (you say you are demodulating a signal) that the various
errors on the A-D section can get you +/-18LSB (worst case ) offset
error, up to 8% fullscale (for 1V, that's 80mV) gain error and
differential nonlinearity (variances between the encoding levels) of up
to 1 LSB. This is all in addition to the irreducible (well, at the Nyquist
limit) +/-0.5 LSB quantization error. Note that differential
nonlinearity acts as a curvature on the conversion curve, which we would
like to be a perfectly straight line.
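As a sanity check on the error budget above, here is the worst-case arithmetic (my figures are just the ones quoted in the paragraph, for the assumed 1 V range; these are bounds, not typical performance):

```python
# Rough worst-case static error budget for an assumed 1 V range, 10 bits,
# using the figures quoted above.
full_scale_mv = 1000.0
lsb_mv = full_scale_mv / 2**10          # ~0.977 mV

offset_mv = 18 * lsb_mv                 # +/-18 LSB offset error
gain_mv = 0.08 * full_scale_mv          # 8% full-scale gain error (80 mV)
dnl_mv = 1 * lsb_mv                     # up to 1 LSB DNL
quant_mv = 0.5 * lsb_mv                 # irreducible quantization error

total_mv = offset_mv + gain_mv + dnl_mv + quant_mv
print(f"worst-case static error ~ {total_mv:.1f} mV "
      f"({total_mv / lsb_mv:.1f} LSB)")
```

Gain error dominates; if the demodulator cares about absolute levels, it will need calibration rather than a better board.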
For the immediate question, for an unterminated input in a noisy
environment (and most things are noisy at the millivolt level), I don't
think you are seeing anything really unexpected. It can be dealt with.
Two quick demonstrations:
1) Turn on an autoranging digital multimeter with the test leads
dangling. You will probably see the reading bounce around by +/-5 to
10 mV. Now twist the leads together and short the ends. You probably
won't see more than 1 mV that way.
2) Put your finger on your sound system's input terminal with nothing
else connected to it. Set the volume control to a normal level.
Either procedure will demonstrate the need for thought and care with
analog signals. I'm sure you knew that, but it's easy to overlook.
Engineering is the art of making what you want from things you can get.
ENOB is relative to the maximum signal level and isn't much help if a
measurement of system noise is desired.
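To make that concrete, here is a small sketch (my numbers, assuming the 1 V range and the standard ENOB/SINAD relation, not a measurement of this part) of what ~8.5 effective bits implies about input-referred noise:

```python
import math

# 8.5 ENOB via the standard relation SINAD = 6.02*ENOB + 1.76 dB, and the
# equivalent input-referred RMS noise (ideal N-bit quantizer noise is
# LSB / sqrt(12)).  1 V range assumed.
enob = 8.5
sinad_db = 6.02 * enob + 1.76

full_scale_v = 1.0
eff_noise_rms_v = full_scale_v / 2**enob / math.sqrt(12)
lsb_10bit_v = full_scale_v / 2**10

print(f"SINAD ~ {sinad_db:.1f} dB")
print(f"effective RMS noise ~ {eff_noise_rms_v * 1000:.2f} mV "
      f"= {eff_noise_rms_v / lsb_10bit_v:.2f} x one 10-bit LSB")
```

Note this is a full-scale, dynamic figure; as the poster says, it tells you little about the static noise floor you would measure with grounded inputs.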
I might tie the inputs of my A/D converter to ground through a resistor that
approximates the source impedance the converter would see in normal operation.
This should give a better idea of what to expect. I would not leave the
terminals open in any case.
Yeah, it seems a good idea to terminate the ADC input with a 50 Ohm resistor;
I will try that with one 50 Ohm termination.
My DAC output is said to have a 50 Ohm output termination, but when I connect
the output with a plain cable into an oscilloscope with 50 Ohm termination,
the scope shows an amplitude of only 100 mV when the digital outputs are at
full swing. When the outputs are all zeros, the oscilloscope shows a 50 mV
peak-to-peak random noise.
That doesn't sound very promising. You've set it up for 20 mA peak output, so
your outputs at full swing should be varying between +/-20 mA? Through a
1:1 output transformer and into a 50 ohm termination I would have expected
close to 2 V peak to peak.
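The expected swing follows from Ohm's law. A quick check of the arithmetic (assuming, as above, 20 mA full-scale output current, a 1:1 transformer, and a 50 ohm termination):

```python
# Expected DAC output swing under the assumptions stated above:
# 20 mA full-scale current, 1:1 transformer, 50 ohm termination.
i_fs_a = 0.020                 # 20 mA peak output current
r_load = 50.0                  # ohm termination
v_peak = i_fs_a * r_load       # 1 V peak per polarity
v_pp = 2 * v_peak              # full swing through the 1:1 transformer
print(f"expected swing ~ {v_pp:.1f} V peak to peak")
```

Seeing 100 mV instead of ~2 V is a factor of 20 low, which points at the output path (current setting, transformer, or termination) rather than the digital side.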