Why do we do a quadratic add of the jitter components when calculating total jitter? For example, with an input jitter of +/- 100 ps and a DCM jitter of
+/- 100 ps, will there not be an instant (with very low probability) when the input hits -100 ps and the DCM follows it with another -100 ps, leading to a total period narrowing of 200 ps?
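To put numbers on it, here is a quick sketch (Python, using the +/- 100 ps figures above and assuming the two sources are independent) comparing the worst-case linear sum with the quadratic (root-sum-square) combination:

```python
import math

# Peak jitter contributions from the question (assumed independent sources)
input_jitter_ps = 100.0   # +/- 100 ps on the incoming clock
dcm_jitter_ps   = 100.0   # +/- 100 ps added by the DCM

# Worst-case (linear) sum: both sources hit their extreme in the same direction
worst_case_ps = input_jitter_ps + dcm_jitter_ps            # 200 ps

# Quadratic (root-sum-square) combination
rss_ps = math.sqrt(input_jitter_ps**2 + dcm_jitter_ps**2)  # ~141 ps

print(f"worst-case sum: {worst_case_ps:.0f} ps, RSS: {rss_ps:.0f} ps")
```

So the worst case really is 200 ps of period narrowing, while the quadratic add only gives about 141 ps, which is what prompts the question.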