To all,
I am currently working from a textbook example that codes a simple FSM; the code is below. I simulate this FSM under Quartus 4.0 with the "in1" signal driven low for exactly one clock cycle, starting at a falling clock edge. During that cycle the state machine is in the "START" state and the output changes (sequence = "continue"), doing so almost immediately, with only a small delay (less than half a clock period). Below I have sketched the timing diagram as it appears in the book's simulation.

For some reason, when I simulate the same state machine in Quartus with the same 10 ns clock period, targeting a Stratix device at speed grade -6, "out1" comes out delayed by more than half the clock period. That seems like an unbelievably long propagation delay. Does this sound right to anyone?
(my ASCII sketch of the book's waveforms for clk, in1, and out1 did not survive posting)
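For reference, here is a minimal sketch of the kind of FSM I mean. This is not the book's exact listing; the reset port, the two-state skeleton, and the Mealy-style combinational output are just my assumptions for illustration:

library ieee;
use ieee.std_logic_1164.all;

entity fsm is
    port (
        clk, reset, in1 : in  std_logic;
        out1            : out std_logic
    );
end entity fsm;

architecture rtl of fsm is
    type state_type is (START, CONTINUE);
    signal state : state_type;
begin
    -- State register: leave START when in1 is sampled low at the clock edge.
    process (clk, reset)
    begin
        if reset = '1' then
            state <= START;
        elsif rising_edge(clk) then
            case state is
                when START =>
                    if in1 = '0' then
                        state <= CONTINUE;
                    end if;
                when CONTINUE =>
                    state <= START;
            end case;
        end if;
    end process;

    -- Mealy-style output: in a functional simulation this follows in1
    -- almost instantly, but in a timing simulation it is pushed out by
    -- the cell and routing delays of the target device.
    out1 <= '1' when (state = START and in1 = '0') else '0';
end architecture rtl;

With a Mealy-style assignment like this, out1 responds within the same cycle that in1 goes low, which matches the book's waveform; a registered (Moore) output would instead change only after the next clock edge.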