I have an input signal that must remain high for a certain length of time before a halt request is issued to a processor. To detect this condition, three solutions came to mind.
1- A shift register with all bits ANDed together. The output is taken from the AND gate and will not go high unless the signal has passed completely through the pipeline without ever dropping to zero (except for any glitches that weren't captured by the first FF).

2- A shift register with the active-low reset line of every FF tied to the signal. The output is taken from the last FF. In this case the last FF can never be 1 unless the signal stayed high for the whole period while propagating through the entire pipeline; if it drops to zero at any point before that, all FFs (and hence the last FF) are cleared. This solution may be more elegant than the first, since it doesn't matter how long the pipeline is, whereas in the first the AND gate grows with the number of stages. It would also catch any glitches on the signal.

3- Divide the master clock with a binary counter until the desired time length is reached, then use a variation of the previous two schemes with a much shorter shift register.

Which solution would you choose? Are there any better approaches? I would personally choose the third solution combined with the second, since it uses fewer FFs for longer required times, and any glitch will invalidate the pipeline.
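To make the intended behavior of the second scheme concrete, here is a small behavioral sketch in Python (my own illustration, not from the question): an N-stage shift register whose flip-flops are all cleared whenever the input goes low, so the last stage only asserts after N consecutive high samples. Note this clock-by-clock model can't show the one real-hardware advantage mentioned above, namely that an asynchronous reset also catches glitches between clock edges.

```python
def qualify(samples, n):
    """Yield the last-FF output for each clock, given a list of input samples.

    Models scheme 2: the input drives the (active-low) reset of all n FFs,
    and a constant 1 is shifted in while the input is high.
    """
    ffs = [0] * n
    for s in samples:
        if s == 0:
            ffs = [0] * n          # input low: reset clears the whole pipeline
        else:
            ffs = [1] + ffs[:-1]   # input high: shift a 1 in on each clock
        yield ffs[-1]              # output is the last FF

# Input high for 6 clocks, 4-stage pipeline: output asserts on the 4th clock.
print(list(qualify([1, 1, 1, 1, 1, 1], 4)))     # [0, 0, 0, 1, 1, 1]

# A single low sample restarts the qualification from zero.
print(list(qualify([1, 1, 0, 1, 1, 1, 1], 4)))  # [0, 0, 0, 0, 0, 0, 1]
```

The same function also illustrates the combined third-plus-second choice: dividing the clock first just means `samples` arrive at the slower rate, so a much smaller `n` covers the same wall-clock interval.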
Thank you!