Hello,
I have some trouble understanding the min and max values of the set_input_delay constraint. Assume you have an interface connected to an FPGA. The interface has a clock (Clk) and a data bus (DB). The data-valid window is centered around the rising edge of Clk. Assume a setup time of 1 ns and a hold time of 2 ns. The Clk period is 10 ns. How should I constrain this with the set_input_delay command?
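For reference, my current guess is below. I derived it from the rule of thumb max = period - setup and min = hold (the port names Clk and DB are just the ones from my design), but I am not sure this is correct for a centered data-valid window:

```tcl
# Clock at the interface: 10 ns period
create_clock -name Clk -period 10.0 [get_ports Clk]

# Latest the data can arrive after the launching edge:
# period - setup = 10 - 1 = 9 ns
set_input_delay -clock Clk -max 9.0 [get_ports DB]

# Earliest the data can change after the edge:
# hold = 2 ns
set_input_delay -clock Clk -min 2.0 [get_ports DB]
```

Is this the right way to map the setup/hold window onto -max and -min, or do the values need to be referenced differently when the window is centered on the edge?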
Thanks for helping me,
Karel