How critical is keeping the timing relatively stable in synchronous communications? Obviously the slave is synced to the master's clock, but are there ever any problems if the clock timing is extremely unstable? Say, varying up to 10% on average, but peaking at 100% or more in rare cases.
I'm writing a Windows app which sends data to the parallel port, but because Windows scheduling is pre-emptive there can be extreme latency in the timing. The data is always synced with the clock, so that part isn't an issue; my concern is just the clock's frequency varying a great deal. I imagine that since it's synchronous communication it shouldn't matter, but I'm wondering if there are instances where it could?
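To illustrate what I mean by the data always being synced to the clock, here's a minimal bit-banging sketch (this assumes the widely used inpout32.dll and its Out32() call for port I/O; the 0x378 base address and the D0 = data, D1 = clock pin roles are just illustrative):

```c
/* Minimal bit-bang sketch: assumes inpout32.dll is available and the
 * parallel port data register sits at 0x378. Pin assignments
 * (D0 = data, D1 = clock) are hypothetical. The point is that the
 * data line is set before each clock edge, so even if Windows
 * pre-empts the thread mid-loop, the clock simply stalls and no bit
 * is ever sampled while the data line is changing. */

#define LPT_DATA 0x378  /* base address of the port's data register */

/* Provided by inpout32.dll (link against inpout32.lib) */
extern void __stdcall Out32(short port, short value);

static void send_byte(unsigned char b)
{
    for (int i = 7; i >= 0; i--) {
        short bit = (short)((b >> i) & 1);  /* MSB first */
        Out32(LPT_DATA, bit);               /* set data, clock low */
        Out32(LPT_DATA, bit | 0x02);        /* rising edge: slave samples */
        /* A pre-emption here just holds the clock high for a while;
         * the data is already stable, so nothing is corrupted. */
        Out32(LPT_DATA, bit);               /* clock back low */
    }
}
```

So each bit is only ever presented together with its clock edge; the only thing the OS latency changes is how long each clock phase lasts.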
Let's say, for example, that the clock runs at 1 MHz but then stops completely for several ms. Could this cause a problem with any device?
I know it's the nature of synchronous devices to only send/receive data on a clock transition, so it would seem that frequency variations wouldn't matter, but I just want to be sure it's not going to be an issue.
Thanks, Jon