In news: snipped-for-privacy@sdf.lonestar.org timestamped Wed, 6 Jun
2007 20:34:07 +0000 (UTC), Andrew Smallshaw posted: "[..] Often the requirement is simply for resolution, absolute accuracy isn't too important. This is the case for audio for instance. [..]"
Hello,
I do not understand the distinction. I agree that absolute accuracy is not always important, that the ten most significant bits of a low quality sixteen bit analog to digital converter might not be as faithful as those of a high quality ten bit analog to digital converter, and that the least significant bits of an analog to digital converter are less likely to be as faithful as the most significant bits. However, I do not believe that a sixteen bit ADC is equivalent to nothing more than a ten bit ADC whose output is left-shifted by six bits: that would yield a datatype which has a resolution of sixteen bits but clearly no more accuracy than a ten bit reading. I believe Andrew Smallshaw was talking about something else, but I do not understand what. Would you care to explain?
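To make the hypothetical device I mean concrete, here is a minimal C sketch of my own (it is purely illustrative and not anything from Andrew Smallshaw's post): the padded word has sixteen bit resolution on paper, yet its smallest possible change is 64 codes, so it carries no more information than the original ten bit reading.

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* A full-scale reading from a hypothetical ten bit ADC. */
        uint16_t ten_bit_sample = 0x3FF;

        /* Pad it out to a sixteen bit word by left-shifting six bits. */
        uint16_t padded = (uint16_t)(ten_bit_sample << 6);

        /* The padded word can only change in steps of 2^6 = 64 codes,
         * whereas a genuine sixteen bit conversion changes in steps of 1. */
        printf("padded word: 0x%04X, step size: %u codes\n",
               (unsigned)padded, 1u << 6);
        return 0;
    }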
Andrew Smallshaw also posted: "[..] the eye is only good for eight bits [..]"

I do not know what the limit is, but I believe that it is significantly above sixteen bits and below 33 bits. I believe that much true color graphical work is done at 24 bits.
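As an aside on what "24 bits" conventionally means in true color work: it is normally three independent eight bit channels (red, green and blue) packed into one word, rather than a single 24 bit brightness scale. A minimal C sketch of my own, purely illustrative:

    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        /* Three eight bit channels... */
        uint8_t red = 200, green = 120, blue = 40;

        /* ...packed into one 24 bit true color pixel value. */
        uint32_t pixel = ((uint32_t)red << 16)
                       | ((uint32_t)green << 8)
                       |  (uint32_t)blue;

        printf("pixel: 0x%06X\n", (unsigned)pixel);  /* prints 0xC87828 */
        return 0;
    }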
Regards, Colin Paul Gloster