I've been doing analog work since the 1970s (I used to design HV power supplies stable to a few ppm), but it is still amazing to see what can be achieved, e.g.
How can this possibly work? 24 bits of noise-free performance, never mind linearity, never mind any sort of absolute accuracy, seems completely unachievable in the analog world.
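For reference, "noise-free" resolution is normally quoted from how much of the full-scale range is left standing above the peak-to-peak input-referred noise. Here's a minimal sketch of that arithmetic; the voltage and noise numbers below are made-up illustrations, not figures from any particular part:

```c
#include <math.h>
#include <stdio.h>

/* "Noise-free" bits are usually reckoned against the peak-to-peak noise
 * (roughly 6.6x the RMS noise for Gaussian noise), i.e. how many codes
 * actually hold still. The numbers in main() are invented examples. */
static double noise_free_bits(double full_scale_v, double vpp_noise_v)
{
    return log2(full_scale_v / vpp_noise_v);
}

int main(void)
{
    /* A 5 V span with 0.3 uV p-p of noise is ~24 noise-free bits;
     * 30 uV p-p on the same span only gets you ~17 bits. */
    printf("%.1f noise-free bits\n", noise_free_bits(5.0, 0.3e-6));
    printf("%.1f noise-free bits\n", noise_free_bits(5.0, 30e-6));
    return 0;
}
```

Looked at that way, a true 24 noise-free bits on a 5 V span means sub-microvolt peak-to-peak noise referred to the input, which is what makes the claim so hard to swallow.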
I find that a 12-bit ADC inside a microcontroller might just about give you 12 bits, though usually you get 10 useful bits, and to get the other 2 you have to take, say, 30 readings and average them.
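The averaging itself is trivial in firmware; for uncorrelated noise, averaging N samples improves the RMS noise by roughly sqrt(N), so ~16 samples buy about 2 bits, which squares with the 30-reading figure with some margin. A rough sketch, where `adc_read_raw()` is a stand-in for whatever your MCU's ADC driver provides:

```c
#include <stdint.h>

/* Hypothetical HAL call; substitute your MCU's actual ADC read routine. */
extern uint16_t adc_read_raw(void);   /* one 12-bit conversion, right-justified */

/* Average n readings to knock down uncorrelated noise by ~sqrt(n).
 * The 32-bit accumulator is plenty for any sane n with a 12-bit result. */
static uint16_t adc_read_averaged(unsigned n)
{
    uint32_t sum = 0;
    for (unsigned i = 0; i < n; i++)
        sum += adc_read_raw();
    return (uint16_t)(sum / n);       /* still scaled as a 12-bit code */
}
```

Of course this only helps with noise that really is random; it does nothing for drift, offset, or anything correlated with the sampling.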
Once away from a microcontroller but still on the same PCB, with careful use of grounding and ground planes, and perhaps even a shield over the relevant bits (and a ground plane on the back of the PCB), one can use a 16-bit ADC and end up with only 1 or 2 bits of noise.
I would imagine that to get better than 16 bits one would need to put the ADC in a shielded box, well away from any logic etc., but even the clock and data lines are going to radiate noise inside. Maybe one is supposed to bring them in over fibre. Maybe the timing is such that no transitions on the control signals are necessary during the actual conversion cycle?
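That last idea is straightforward to arrange in firmware: start the conversion, leave the bus completely idle while the converter is working, and only clock the result out afterwards. A rough sketch of the ordering, with the glue functions and the start-conversion command byte being assumptions rather than any specific part's interface:

```c
#include <stdint.h>
#include <stdbool.h>

/* Hypothetical glue functions; the point is only the ordering, i.e. keep
 * the SPI bus completely quiet while the converter is actually converting. */
extern void    spi_write_byte(uint8_t b);   /* clocks out one byte        */
extern uint8_t spi_read_byte(void);         /* clocks in one byte         */
extern bool    adc_drdy_low(void);          /* DRDY / BUSY pin state      */
extern void    cs_assert(void);
extern void    cs_release(void);

/* One conversion with no bus activity during the conversion window. */
static uint32_t adc_read_quiet(void)
{
    cs_assert();
    spi_write_byte(0x08);          /* hypothetical "start single conversion" */
    cs_release();                  /* bus now idles; no edges near the ADC   */

    while (!adc_drdy_low())        /* wait for the converter to finish       */
        ;                          /* (or sleep; just don't toggle anything) */

    cs_assert();                   /* only now clock the 24-bit result out   */
    uint32_t code  = (uint32_t)spi_read_byte() << 16;
    code          |= (uint32_t)spi_read_byte() << 8;
    code          |= (uint32_t)spi_read_byte();
    cs_release();
    return code;
}
```

Many delta-sigma parts are set up to encourage exactly this: a data-ready line so the host can stay silent until the sample is done.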
At work we make a product which uses a 12-bit ADC (ADS7828), which we calibrate to < 0.05% using a precision voltage source, 0.1% 15 ppm 0805 resistors, and calibration coefficients stored in an EEPROM (I saw the other thread here on 0.01% resistors), and we get pretty well the full 12 bits out of that. I'd like to go to a 16-bit ADC one day, but I am very sure it won't give us more than maybe 2 extra bits that mean anything...
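For what it's worth, the correction we apply at run time is nothing exotic: a gain and offset pair per channel, written at calibration time against the precision source and pulled out of EEPROM at power-up. The sketch below assumes a simple fixed-point layout of my own invention; the storage format and the `eeprom_read()` call are illustrative, not our actual code:

```c
#include <stdint.h>

/* Assumed storage layout: one gain/offset pair per channel, written at
 * calibration time and kept in EEPROM. The Q15 scaling is just an example. */
struct cal_coeff {
    uint16_t gain_q15;    /* gain correction, Q15: 32768 == 1.000 */
    int16_t  offset_lsb;  /* offset correction in raw ADC counts  */
};

/* Assumed EEPROM driver API; use whatever your board support provides.
 * e.g. at power-up: eeprom_read(CAL_BASE_ADDR, cal, sizeof cal); */
extern void eeprom_read(uint16_t addr, void *dst, uint16_t len);

/* Apply the stored two-point calibration to a raw 12-bit ADS7828 code. */
static uint16_t apply_cal(uint16_t raw, const struct cal_coeff *c)
{
    int32_t corrected = ((int32_t)raw * c->gain_q15) >> 15;
    corrected += c->offset_lsb;

    if (corrected < 0)    corrected = 0;      /* clamp to the 12-bit range */
    if (corrected > 4095) corrected = 4095;
    return (uint16_t)corrected;
}
```

With decent reference resistors and that sort of per-unit correction, the remaining error is dominated by the reference and the resistors' drift rather than the ADC itself, which is exactly why I doubt a 16-bit part would buy us much in practice.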