We have a product with a 12-bit ADC clocked at around 60 MHz, with the data pouring into an FPGA. We want to do a good production test.
We can route a big triangle wave, spanning essentially the full input range, into the ADC and take a lot of samples. Our thinking is that each bit should then be high about 50% of the time, and we can set limits on that. This will catch bits stuck high or low, bits shorted to other bits, opens, logic races, and general misbehavior.
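A minimal sketch of that check, assuming `samples` is a list of raw 12-bit codes already pulled out of the hardware; the function name and the 49..51% limits are placeholders:

```python
# Sketch of the per-bit duty check. Assumes a full-scale triangle,
# so every bit of a healthy ADC should be high about half the time.
def check_bit_duty(samples, n_bits=12, lo=0.49, hi=0.51):
    n = len(samples)
    for bit in range(n_bits):
        ones = sum((code >> bit) & 1 for code in samples)
        duty = ones / n
        status = "ok" if lo <= duty <= hi else "FAIL"
        print(f"bit {bit:2d}: {duty:8.4%} high  {status}")
```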
We already have a "logic analyzer" in the FPGA. It snapshots 1024 ADC samples, plus some other state, into a big RAM. We could read that out (many times over) and analyze each bit in our Python test program, but that would be slow. So the kids added 13 new 32-bit registers: a total sample count, plus one register per ADC bit that counts how many 1's that bit produced. Now we can get 60 million coin tosses per bit line in one second, and read the totals out over ethernet in a millisecond.
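The test-side readout then reduces to a handful of register reads. A sketch, where `read_reg()` and the register offsets are hypothetical stand-ins for whatever the real ethernet register interface looks like:

```python
# Hypothetical register map: total sample count at offset 0, then
# one 32-bit ones-counter per ADC bit at consecutive word offsets.
REG_TOTAL = 0x00
REG_ONES_BASE = 0x04            # bit k's counter at REG_ONES_BASE + 4*k

def read_bit_duties(read_reg, n_bits=12):
    """Return the fraction of samples in which each bit read as 1."""
    total = read_reg(REG_TOTAL)
    return [read_reg(REG_ONES_BASE + 4 * k) / total for k in range(n_bits)]
```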
Turns out that 60 million coin tosses have some pretty powerful statistical behavior: one standard deviation of the heads fraction is sqrt(0.25/60e6), about 0.0065%, so a 49..51% limit sits roughly 150 sigma out. The probability of a healthy bit landing outside that window is effectively zero.
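The arithmetic, treating each bit as an independent fair coin (successive samples of a slow triangle are correlated, so this is a scale check rather than an exact model):

```python
import math

n = 60_000_000                           # one second of samples at 60 MHz
p = 0.5                                  # ideal high fraction per bit
sigma = math.sqrt(n * p * (1 - p)) / n   # sigma of heads fraction, ~6.5e-5

z = 0.01 / sigma                         # the 1% limit in sigmas, ~155
print(f"sigma = {sigma:.2e}, 49..51% limit = {z:.0f} sigma out")

# Two-sided Gaussian tail; this underflows to exactly 0.0 in doubles,
# which is the point: a good bit essentially never trips the limit.
print("P(outside 49..51%) =", math.erfc(z / math.sqrt(2)))
```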