Since A/D converters are not ideal, they can produce slightly different digital values even from a steady analog voltage, so a constant input sitting near a code boundary can give you a square wave at least 1 LSB high. That's why adding sufficient dither is critical.
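To illustrate the point (a minimal Python sketch with made-up numbers, not anything from the post): with a constant input stuck between two codes, averaging raw samples gains nothing, but about 1 LSB of uniform dither makes the toggling output's duty cycle encode the input, so the average converges on the true value.

```python
import random

LSB = 1.0

def adc(v):
    # ideal quantizer: round the input to the nearest code
    return round(v / LSB) * LSB

v_in = 0.3  # constant input sitting between codes 0 and 1

# without dither every conversion returns the same code,
# so averaging cannot recover the missing fraction of an LSB
plain = [adc(v_in) for _ in range(10000)]

# with +/- 0.5 LSB of uniform dither, the output toggles between
# codes 0 and 1 with a duty cycle of ~30%, and the mean converges
# to the true 0.3 LSB input
random.seed(0)
dithered = [adc(v_in + random.uniform(-0.5, 0.5)) for _ in range(10000)]
mean = sum(dithered) / len(dithered)
```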
Yeah, flash converters sounded great, but because of the spread in comparator response times (caused partly by the resistor string feeding the comparators' other inputs), their aperture jitter wasn't too good.
The first and last one I used was a TRW TDC1038, circa 1994. (I picked it mostly for fun--I didn't need anything that fast, but the extra cost didn't matter in a POC proto.)
This is solvable by calibration, at the expense of a small fraction of the full scale (say, twice the worst-case error you want to correct). I did something similar to get integral nonlinearity from 2-3% down to below an LSB for a 13-bit converter... nearly 30 years ago... (can't be true that much time has passed, can it?).
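A toy sketch of that kind of code-level calibration (my own illustration with an invented transfer function, not the original setup): model a 13-bit converter with a bow error peaking around 2% of full scale, sweep a known input to build a correction table, then map raw codes back through it at run time.

```python
FS = 8192  # 2**13 codes for a 13-bit converter

def nonlinear_adc(ideal):
    # toy transfer function: a bow error peaking at ~2% of full scale
    x = ideal / FS
    return int(ideal + 0.02 * FS * 4 * x * (1 - x))

# calibration sweep: drive the converter with each known input level
# and record the raw output it produced (a real rig would use a
# precision DAC or voltage reference as the stimulus)
correction = {}
for ideal in range(FS):
    correction[nonlinear_adc(ideal)] = ideal

def corrected(raw):
    # look up the calibrated value; step down to the nearest known
    # raw code if this one never appeared in the sweep
    while raw not in correction and raw > 0:
        raw -= 1
    return correction.get(raw, 0)
```

With a lookup table like this, the residual error drops to the LSB level, at the cost of a little full-scale range wherever raw codes collide.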
The real issue is the one Rick talked about: sampling window/accuracy and, of course, clock jitter. And budget... :).
We also used a 20 MSPS 8-bit flash ADC made by TRW to digitize and average ultrasonic signals at 6 MHz for measuring the inner hull in nuclear power plants. Scopes were analog then and could not average.
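That averaging trick is plain coherent (trigger-synchronous) averaging: repeated captures of a repetitive echo are summed sample by sample, so uncorrelated noise falls by roughly sqrt(N) while the signal is preserved. A sketch with invented numbers (6 MHz tone, 20 MSPS, unit-variance Gaussian noise):

```python
import math
import random

random.seed(1)
F_SIG = 6e6     # 6 MHz ultrasonic echo (below the 10 MHz Nyquist limit)
F_S = 20e6      # 20 MSPS flash ADC
N_SAMP = 64     # samples per trigger
N_AVG = 256     # repeated captures

def capture():
    # one trigger-synchronous record: clean tone plus unit-variance noise
    return [math.sin(2 * math.pi * F_SIG * n / F_S) + random.gauss(0, 1)
            for n in range(N_SAMP)]

# sum captures sample by sample, then divide: noise RMS drops by
# about sqrt(N_AVG) (here ~16x) while the echo stays put
acc = [0.0] * N_SAMP
for _ in range(N_AVG):
    for i, s in enumerate(capture()):
        acc[i] += s
avg = [a / N_AVG for a in acc]
```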
Since we were one of the first customers, we were given an extra ADC in a plastic cube. You could see the resistor ladder to the comparators with the nekkid eye. The chip was the size of a thumbnail.
It was the black chip nearly the size of a 68000 package, with the white TRW logo at the bottom right.
I also did the top board that replaced the entire 19" crate in the blurred part some years later, from concept to FPGAs, layout, soldering and software driver, and at 200 MSPS. The computation loop pipeline was 23 stages deep. The Siemens ECL to 8-lane parallel CMOS chips seemed made just for us. :-)