Target: XC3S400 (Spartan-3). Tool: ISE 8.2i.
If I understand correctly, the DCM does not use a PLL to multiply up the input frequency; it's a DLL that generates all the required frequencies and phases by selecting outputs from a tapped delay line. I've heard these taps are only tens of picoseconds apart. Is that so? If it is, why is the peak-to-peak jitter calculated by the DCM wizard so large, e.g. hundreds of ps?
For Spartan-3 designs, the tools do not automatically account for DCM jitter. To get it included, I've manually added INPUT_JITTER 0.82 to the end of my external clock constraint:
TIMESPEC "TS_EXT_CLK" = PERIOD "EXT_CLK" 20 ns HIGH 50 % INPUT_JITTER 0.82;
Half of this figure then appears as "clock uncertainty" in the timing analysis report, and PAR has to work that much harder to achieve closure. I don't like including DCM jitter this way, since the input clock itself is clean. Is there a neater way to specify the jitter on the DCM outputs, where it actually belongs?
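For illustration, this is roughly the kind of constraint I'd like to write instead (the net name "dcm_clk0" and time-group name "DCM_CLK0_GRP" are made up, and I don't know whether ISE actually honors INPUT_JITTER on a DCM output constrained this way):

# Hypothetical sketch: group the DCM CLK0 output net and give it
# its own PERIOD constraint carrying the DCM's output jitter,
# leaving the clean external clock constraint untouched.
NET "dcm_clk0" TNM_NET = "DCM_CLK0_GRP";
TIMESPEC "TS_DCM_CLK0" = PERIOD "DCM_CLK0_GRP" 20 ns HIGH 50 % INPUT_JITTER 0.82;

If something like this works, the uncertainty would only be applied to paths clocked by the DCM output, rather than being charged against the external clock.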