Hi All,
I'm sampling high-fidelity analog audio at 44.1 kHz with a 16-bit ADC. The analog audio is noise-free for the purposes of this question.
The ADC data stream goes to a microprocessor that compares it, in blocks of multiple samples, against a previously stored reference. When the incoming data matches the stored reference, the microprocessor generates an output pulse. Some amount of processing time "X" is needed to recognize a match and generate the pulse.
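To make the matching step concrete, here is a minimal sketch of what I mean by the block comparison. The block size, the reference values, and the exact sample-for-sample comparison are all illustrative assumptions, not a description of my actual firmware:

```python
def find_match(stream, reference):
    """Return the sample index at which `reference` first matches a
    block of the incoming `stream`, or None if it never matches.
    (Hypothetical sketch: assumes an exact, sample-for-sample compare.)"""
    n = len(reference)
    for i in range(len(stream) - n + 1):
        if stream[i:i + n] == reference:
            # A real system would raise the output pulse here, after
            # some processing delay "X".
            return i
    return None

# Illustrative 16-bit sample values; the reference block starts at index 3.
stream = [0, 1, 2, 100, 101, 102, 103, 4, 5]
reference = [100, 101, 102, 103]
print(find_match(stream, reference))  # -> 3
```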
My question, again assuming zero analog noise, is: what is the time uncertainty of the output pulse? In other words, if I feed the same analog audio into two of these circuits running in parallel, by how much could their output pulses differ in time? Is the answer simply the clock-frequency accuracy?
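For scale, here is the one piece of arithmetic behind the question, the duration of a single sample period at the stated 44.1 kHz rate (the rate is from my setup above; the calculation itself is just a sanity check):

```python
f_s = 44_100          # sampling rate in Hz, as stated above
period_s = 1.0 / f_s  # duration of one sample period in seconds
print(f"One sample period: {period_s * 1e6:.2f} microseconds")  # ~22.68 us
```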
Thanks in advance.