set_input_delay min and max (TimeQuest)

Hello,

I have some trouble understanding the set_input_delay min and max constraints. Assume you have an interface connected to an FPGA. This interface has a clock (Clk) and a data bus (DB). The data-valid window is centered around the rising edge of Clk. Assume a setup time of 1 ns and a hold time of 2 ns. The Clk period is 10 ns. How should I constrain this with the set_input_delay command?

Thanks for helping me, Karel


Hi, is this interface an input or an output for the FPGA? I will consider both cases.

In the first case (input), you put set_input_delay -max A+B, where A is the external device's clock-to-Q time (Tco) and B is the delay of the board interconnect. This tells the FPGA input that it has clock_period - (A+B) left to capture the data. If you don't know these values, you can put about 60% of the clock period, i.e. about 6 ns here.
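To make this concrete, here is a minimal .sdc sketch for the input case using Karel's numbers (10 ns period, data valid 1 ns before and 2 ns after the rising edge of Clk). The port names and the DB* wildcard are illustrative, not from the original post:

    # Clock driven by the external device: 10 ns period on port Clk.
    create_clock -name Clk -period 10.0 [get_ports Clk]

    # Latest arrival: data becomes valid tSU = 1 ns before the capture
    # edge, i.e. 10 - 1 = 9 ns after the launch edge.
    set_input_delay -clock Clk -max 9.0 [get_ports DB*]

    # Earliest change: data stays valid tH = 2 ns after the edge, so a
    # new value cannot appear sooner than 2 ns after the launch edge.
    set_input_delay -clock Clk -min 2.0 [get_ports DB*]

This is the data-valid-window form; it is equivalent to the Tco-based form above, with -max being the maximum of Tco + board delay and -min the minimum of the same sum.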

In the second case (output), if I understood your setup correctly, you put set_output_delay -min -2ns, so that the FPGA does not change the data bus until 2 ns after the clock edge and the external hold time is met (note that the hold time enters with a negative sign in SDC), and set_output_delay -max (Tsetup + Tinterconnect), which leaves the FPGA clock_period - Tsetup - Tinterconnect to drive the data out. If you don't know the interconnect delay, I think 60% of the clock period is a safe value for the -max constraint.
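And a matching sketch for the output case, assuming the receiving device needs 1 ns setup and 2 ns hold; Tboard_max and Tboard_min are placeholder trace delays, and the hold time is subtracted per the sign convention mentioned above:

    # Setup at the receiver: output data must arrive Tsu + board delay
    # before the capture edge at the external device.
    set Tsu 1.0
    set Tboard_max 0.5   ;# placeholder maximum trace delay
    set_output_delay -clock Clk -max [expr {$Tsu + $Tboard_max}] [get_ports DB*]

    # Hold at the receiver: output data must stay stable Th after the
    # edge, so the hold time enters with a negative sign.
    set Th 2.0
    set Tboard_min 0.3   ;# placeholder minimum trace delay
    set_output_delay -clock Clk -min [expr {$Tboard_min - $Th}] [get_ports DB*]

With -max set this way, TimeQuest requires the FPGA's clock-to-output plus internal routing to fit within clock_period - Tsu - Tboard_max, which is exactly the budget described above.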

have fun.

Gianluigi
