How come DVD players boast of 10-bit resolution per channel? Presumably a lot of engineering effort went into achieving that.
People ask for it and pay a premium for it. Meanwhile, PC video cards output 8 bits per channel.
Then they connect it to an LCD panel where you need insider information just to find out that it's a 6-bit-per-channel display — which most are.
So why are companies on one side lying and hiding the true performance of their LCD products, while companies on the other side build specs and hardware that can never, ever be fully used?
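To put numbers on the mismatch, here's a back-of-the-envelope sketch (plain Python, nothing panel-specific assumed) of what each per-channel bit depth actually buys you:

```python
# Distinct levels per channel and total displayable colors
# for the bit depths mentioned above (6-bit panel, 8-bit
# video card, 10-bit DVD player output).
for bits in (6, 8, 10):
    per_channel = 2 ** bits          # levels per R/G/B channel
    total = per_channel ** 3         # combinations across 3 channels
    print(f"{bits}-bit: {per_channel} levels/channel, {total:,} colors")
# 6-bit:  64 levels,   262,144 colors
# 8-bit:  256 levels,  16,777,216 colors
# 10-bit: 1024 levels, 1,073,741,824 colors
```

(6-bit panels typically fake the missing two bits with FRC temporal dithering — that's how a spec sheet gets to claim "16.2 million colors" without admitting the panel only has 64 native levels per channel.)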
And what the hell is the deal with "HD" panels with oddball resolutions like 1366x768 or 1440x900? Is there like a backlog of 12 trillion gate drivers that can only handle 1366x3 pixels that people want to use up?
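For what it's worth, the usual explanation for 1366 is arithmetic rather than a driver-chip backlog: take XGA's 768 lines, stretch to a 16:9 aspect ratio, and round 1365.33 up to the even 1366. A quick sanity check (just fraction arithmetic, no display-industry insider info assumed):

```python
from fractions import Fraction

# 768 lines at 16:9 gives a fractional width, bumped up to even 1366
print(768 * 16 / 9)            # 1365.333...

# So 1366x768 isn't exactly 16:9...
print(Fraction(1366, 768))     # 683/384, slightly wider than 16/9

# ...and 1440x900 isn't 16:9 at all; it's 16:10
print(Fraction(1440, 900))     # 8/5
```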
Whhyyyyyyyyyyyy????
It's like if CD players only played back 12 bits per channel at a 22 kHz sample rate, but all the studios released 24-bit/96 kHz DVD-Audio discs, and everyone claimed they could tell the difference.