Video ADC

Why make a 16-bit ADC that only has circa 8-10-bit linearity? I mean granted, video doesn't need much linearity, but then video doesn't need 16 bits either.
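Back of the envelope (with a made-up INL figure, since the exact number is on the sheet): an N-bit converter with peak INL of ±e LSB is only linear to roughly N - log2(2e) bits. So, hypothetically, ±64 LSB of INL on a 16-bit part gives 16 - log2(128) = 9 bits, i.e. right in that 8-10-bit ballpark.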

See page 8, "Differential/integral non-linearity".

Or am I reading the data sheet wrongly?

There has to be some quid pro quo for a sub-$20 chip that claims 70 MSPS and 16 bits, given that the nearest comparable credible chips are over $100 from TI or Linear/AD. But still...

I'm a bit shy of Cirrus anyway, given the propensity of their audio ADCs to randomly melt after 12 months' continuous use, unrelated to any actual abuse. We had some in a fleet of SoftRock SDRs and got tired of replacing the chips after the third time. The entire fleet died at around 12-14 months.

Clifford Heath.

Reply to
Clifford Heath

Ahh wait, I maligned Cirrus unnecessarily. It's the C-Media ADCs that melt.

Is AKM back from the dead yet?

Reply to
Clifford Heath

My #2 daughter works for Cirrus in Austin. I'm generally a fan of theirs.

It looks like the strange ADC is meant to be run mostly in 10-bit mode, which, as you say, is generally lots for video.

It also runs CID chips, though, which are mostly used in machine vision and other such applications, where there are lots of bright glints and where blooming is very bad news. (CCDs tend to dump charge into a whole line of pixels when one gets overdriven.)

CIDs work oppositely to CCDs: the wells start out full, and the photocurrent empties them. Thus there's no blooming.

So the extra bits may be intended for things such as flat-fielding, where you want to do a bunch of averaging to get lower noise. (You compute the gain and offset frames occasionally, and then use them for many data frames, so calibration errors give rise to fixed-pattern noise.)
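In NumPy terms, the correction is roughly this (just a sketch; the frame stacks, sizes, and noise figures are all invented):

    import numpy as np

    rng = np.random.default_rng(0)

    # Invented calibration stacks: many dark frames (shutter closed) and
    # many flat frames (uniform illumination target).
    dark_frames = rng.normal(100, 5, size=(16, 240, 320))
    flat_frames = rng.normal(1000, 5, size=(16, 240, 320))

    # Averaging N frames beats random noise down by sqrt(N), which is
    # where extra ADC bits could pay off: the averaged calibration frames
    # carry more precision than any single frame does.
    dark = dark_frames.mean(axis=0)          # offset frame
    gain = flat_frames.mean(axis=0) - dark   # gain frame, offset removed
    gain /= gain.mean()                      # normalise to unity mean gain

    # The same (dark, gain) pair is reused for many data frames, so any
    # residual calibration error repeats identically frame after frame --
    # it shows up as fixed-pattern noise, not random noise.
    raw = rng.normal(700, 5, size=(240, 320))    # one data frame
    corrected = (raw - dark) / gain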

Or it may have been a design blunder that didn't affect the main operation of the chip and so wasn't worth fixing.

I agree that it's weird, though.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

Thanks for your thoughtful response, Phil.

I guess I could ask Cirrus. But I wonder if your daughter might be more likely to get an honest answer, if you feel you could ask her?

Clifford Heath

Reply to
Clifford Heath

10-bit processing isn't peculiar at all for standard 8-bit video signals (ITU Rec. 709, etc.), since LSB errors can accumulate through subsequent signal processing (of which there is plenty in typical consumer video products). Moreover, HDR (high dynamic range) video products and content are in the marketplace, associated with "4K" UHD TV, where 10-bit video is the standard (ITU Rec. 2020). So >10-bit digital video is internal to the product.
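A toy illustration of the accumulation (the gain values and stage count are made up, not any real pipeline): requantising to 8 bits after each stage builds up LSB error, while two extra headroom bits keep it well under a level.

    import numpy as np

    rng = np.random.default_rng(1)
    x = rng.integers(16, 236, size=100_000).astype(float)  # 8-bit video levels
    gains = (1.19, 0.84, 1.07)       # invented processing stages

    exact = x * np.prod(gains)       # ideal result, no requantisation

    for bits in (8, 10):
        scale = 2.0 ** (bits - 8)    # headroom relative to 8 bits
        v = np.round(x * scale)      # promote to the working word length
        for g in gains:
            v = np.round(v * g)      # requantise after every stage
        err = v / scale - exact      # error measured in 8-bit LSBs
        print(f"{bits}-bit pipeline: RMS error = "
              f"{np.sqrt(np.mean(err**2)):.3f} LSB")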

Almost all 4K UHD TVs claim HDR capability (10-bit displays, >1000:1 contrast ratios) but still fall short, mostly because of limited luminance output range rather than visible D/A artifacts.

regards, RS

Reply to
Rich S

Yes. The real question is why there are 16 bits, and how many of them you can actually use. I'm not interested in video, but in inexpensive SDR.

Clifford Heath

Reply to
Clifford Heath
