TRACE and Modelsim Timing Help

Dear Gurus,

I hope someone can help me interpret the following timing parameters. Thank you very much in advance.

I have to interface an ADC to a Virtex-II-4 board: 16-bit data + 1 clk, both data and clk LVPECL. I have successfully implemented such an interface at a frequency of 300 MHz and now I would like to increase the frequency to 500 MHz. On each rising edge of the 500 MHz clk a new data sample is transmitted by the ADC. The Virtex-II side of the interface basically consists of DDR registers in a single IOB, and the 250 MHz clock for these registers is derived from the incoming 500 MHz clock by means of a DCM. I have imposed the following OFFSET IN constraint in the UCF file:

TIMEGRP "InputData" OFFSET = IN 0 ps before "clk" ;
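For reference, the OFFSET IN above only has meaning together with a time group definition and a period constraint on "clk". A sketch of how the full constraint set might look (the net names here are placeholders, not taken from the original post):

```
# Hypothetical pad net names -- adjust to the actual design
NET "adc_data<*>" TNM = "InputData" ;
NET "clk" TNM_NET = "clk" ;
# 500 MHz input clock: one data beat every 2 ns
TIMESPEC "TS_clk" = PERIOD "clk" 2 ns HIGH 50 % ;
TIMEGRP "InputData" OFFSET = IN 0 ps BEFORE "clk" ;
```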

The output of TRACE is :

- Data Input Delay 1.745 ns (which matches the IOB setup time + LVPECL adjustment, as given in the Virtex-II data sheet)

- Clock Delay 0.598 ns

- Minimum OFFSET IN 1.147 ns

In this way, by selecting a proper value for the initial phase of the 250 MHz clk, I could meet the IOB setup time.

Question: if I have understood correctly, the minimum 250 MHz clk arrival time I should set is 1.147 ns (so that clk delay + clk arrival time = 1.745 ns); likewise, the maximum clk arrival time I could set should be 1.391 ns (in fact 0.598 + 1.391 = 1.989 ns), to avoid going beyond the 500 MHz clk period and finding the next data value at the pad inputs. But TRACE reports passing values for any 250 MHz clk arrival time below 2 ns, so it seems to me that the clk delay of 0.598 ns is not taken into account. What have I missed?

I also ran a post-P&R simulation to verify the values given by TRACE, and I discovered that the clk arrival time of the 250 MHz clk is not the expected 0.598 ns + 1.265 ns (supposing that value was set in the DCM GUI) but 0.847 ns. So it seems that the value given by the simulator is 481 ps smaller than the value set in the DCM GUI, without taking the 0.598 ns clk delay into account. I am missing something, but what?
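To spell out the arithmetic behind my question, here is the budget as I read the TRACE report (a sketch only: the delay values come from TRACE above, but the window derivation is my own reasoning, and it ignores any hold-time margin, which would trim the upper bound slightly):

```python
# Input timing budget for the 250 MHz capture clock (all values in ns).
# Delay numbers are from the TRACE report; the window derivation is my
# own reading of it, not an official TRACE formula.
data_input_delay = 1.745   # pad-to-IOB-register data path + setup (TRACE)
clock_delay      = 0.598   # pad-to-IOB-register clock delay (TRACE)
period_500mhz    = 2.0     # one data beat at 500 MHz

# Earliest usable clock arrival at the pad: the clock (after its own
# 0.598 ns delay) must not reach the register before the data does.
min_arrival = data_input_delay - clock_delay   # 1.147 ns

# Latest usable clock arrival: clock delay + arrival time must stay
# inside the 2 ns data window, otherwise the next sample is already
# at the pad (hold margin would pull this in a little further).
max_arrival = period_500mhz - clock_delay      # 1.402 ns

print(f"arrival window: {min_arrival:.3f} ns .. {max_arrival:.3f} ns")
# prints "arrival window: 1.147 ns .. 1.402 ns"
```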

Thank you very much for your help.

GianniG


GianniG,

I know this doesn't really answer your question, but why are you dividing down a 500 MHz clock via the DCM to run at 250 MHz internally? Distributing a 500 MHz clock on the board wastes power, not to mention the EMI issues. In addition, I would imagine the Xilinx DCM adds jitter via the divide. Why not use a 250 MHz reference, or better yet, multiply up a 125 MHz reference?

John


Because that's what his ADC provides: a 500 MHz sampling rate.

Because he doesn't want a sampling rate of 125 MHz; he wants 500 MHz. He's doing exactly the right thing. Maybe XAPP685 would help? Or XAPP268?

Cheers, Syms.


Thank you all for the answers. I have already read the XAPPs you mentioned and of course they help, but my feeling is that I am missing something about basic delay computations and the use of the design tools, and that is why I asked the forum for help.

Thanks,
GianniG

