I don't know where to ask this, so I'll try here.
Are analog-to-digital converters fundamentally, in their core inner design, tiny systems that operate like 555 timers? That is, a series of resistors and capacitors arranged to sample voltage ranges, essentially counting ticks per fixed unit of time, with the resulting count used as an index to look up the output bit pattern from whichever inner sampler is in range? With then some tail logic to prevent jitter beyond an expected operating range / frequency?
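To make it concrete, here is a hypothetical sketch (my own illustration, not any real ADC's implementation) of the counting scheme I have in mind, modeled on a single-slope (ramp/counter) converter: a capacitor charged at a constant rate gives a linearly rising ramp, a counter counts clock ticks until a comparator sees the ramp cross the input voltage, and the tick count is the digital code. The function name and parameters are made up for the example.

```python
def ramp_adc(vin, vref=5.0, bits=8):
    """Return a digital code for vin by counting ramp ticks (single-slope ADC model)."""
    steps = 1 << bits              # 2^bits ticks span 0..vref
    ramp_per_tick = vref / steps   # ramp rises this much each clock tick
    ramp = 0.0
    for count in range(steps):
        if ramp >= vin:            # comparator trips: ramp has crossed the input
            return count           # tick count is the output code
        ramp += ramp_per_tick
    return steps - 1               # input at or above full scale: clamp to max code

print(ramp_adc(2.5))  # half of the 5 V full scale -> code 128
```

Is something like this, in spirit, what the inner circuitry is doing?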
Thank you, Rick C. Hodgin