Can someone check my logic on this? It sounds simple, but I'd appreciate someone confirming that my approach makes sense.
I have CD-quality streaming audio digitized at 44.1 ksamples per second, 16-bit samples. I also have an approx 10 msec recorded segment of the same audio (about 441 samples at that rate), in the same digital format as above.
I'm shifting the digitized streaming audio in real time into a comparator and looking for a data match between my recorded segment and the streaming audio.
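In case it helps, here's roughly what I mean by "shifting into a comparator" (a minimal C sketch; the names are mine, and a real implementation would probably use a ring buffer instead of shifting the whole buffer on every sample):

#include <stdint.h>
#include <string.h>

#define SEG_LEN 441                       /* ~10 msec at 44.1 ksamples/s */

static int16_t history[SEG_LEN];          /* last SEG_LEN stream samples, oldest first */

/* Push one new stream sample, then test for an exact bit-for-bit match
   against the stored reference segment.  Returns 1 on a match. */
int push_and_compare(int16_t sample, const int16_t *reference)
{
    memmove(history, history + 1, (SEG_LEN - 1) * sizeof history[0]);
    history[SEG_LEN - 1] = sample;
    return memcmp(history, reference, sizeof history) == 0;
}

So as I understand it, a match can only be declared on a sample boundary, once per incoming sample.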
Question: What is the maximum theoretical time lag before a match can be detected, ignoring quantization noise and processing time? Is it one-half the sample period, or some other value?
Thanks for your help.