screen DPI and picture resolution

Seems the bigger the monitor, the worse the DPI.

CRT, aka boob tube (Dell E551):
* 800x600 gives approximately 75 DPI (depends somewhat on H/V size adjustment).
* 1024x768 gives approximately 98 DPI (depends somewhat on H/V size adjustment).

Flat panel monitor (Dell E171FPb, screen is 13.3" x 10.75"):
* 1280x1024 gives approximately 96 DPI.

Flat panel multi-input TV (Samsung LN32AA450C1D, screen is 27.5" x 15.5"):
* HDMI input at 1920x1080 gives approximately 70 DPI.
* The PC input is substantially worse.

So, the old "obsolete" CRT monitor surpasses virtually all of the modern, "sexy" viewing thingies.

That being said, how come a video (say, Star Trek) looks SOOOoo spiffy on a 50-inch-wide monster (where the calculated spec would be roughly 38 DPI)?
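
For what it's worth, the DPI arithmetic here is just horizontal pixels divided by visible screen width. A minimal Python sketch using the widths quoted above (the last case assumes a literal 50-inch-wide 1080p screen):

    # DPI = horizontal pixel count / visible width in inches.
    # Widths are the approximate figures quoted in this thread.
    def dpi(h_pixels, width_inches):
        return h_pixels / width_inches

    for name, px, width in [
        ("Dell E171FPb, 1280x1024", 1280, 13.3),
        ("Samsung LN32AA450C1D, 1920x1080 HDMI", 1920, 27.5),
        ("50-inch-wide screen, 1920x1080", 1920, 50.0),
    ]:
        print(f"{name}: ~{dpi(px, width):.0f} DPI")  # 96, 70, 38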

--
Robert Baer

You tend to stand further away from a big screen, so the number of dots per steradian ends up higher. It's all got to be processed on your retina, after all.
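
A rough back-of-the-envelope check of this (the viewing distances here are assumed typical values, not figures from the thread): what matters perceptually is pixels per degree of visual angle, and a low-DPI screen at ten feet can beat a high-DPI one at two.

    import math

    def pixels_per_degree(dpi, viewing_distance_inches):
        # Angle subtended by one pixel, inverted to give pixels per degree.
        pitch = 1.0 / dpi  # inches per pixel
        theta = 2 * math.atan(pitch / (2 * viewing_distance_inches))
        return 1.0 / math.degrees(theta)

    # A ~96 DPI desktop LCD at 2 ft vs. the ~38 DPI monster at 10 ft:
    print(pixels_per_degree(96.0, 24.0))   # ~40 pixels/degree
    print(pixels_per_degree(38.4, 120.0))  # ~80 pixels/degree

At those distances the big screen delivers roughly twice the angular pixel density.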

--
Bill Sloman, Sydney

On Sun, 24 May 2015 03:13:53 -0700, Robert Baer Gave us:

Get a 4k. Even the smaller form factors are sweet.

--
DecadentLinuxUserNumeroUno

On Sun, 24 May 2015 03:51:45 -0700 (PDT), Bill Sloman Gave us:

I had a flat-faced ViewSonic with a Sony tube. It had a 0.21 mm pitch, and that was about the top of the line before LCDs puked all over the CRT industry.

CRTs are still used in medical imaging scenarios, but the newer AMOLED stuff is amazing.

Someone needs to catch up with reality (again) and it isn't me.

--
DecadentLinuxUserNumeroUno

In the early days, when the CRT monitor drive was more 'analog', you could take a really lousy monitor resolution, even as poor as something like 200 lines, and, because the monitor could faithfully reproduce the 'phase' of the signal, obtain the perceived equivalent of over 1200! I know I had great difficulty getting digital monitors, even with 640 by 480 fixed pixels and NO phase capability, to achieve the equivalent 'eye' acceptance.

--
RobertMacy

CRTs went up to 2048x1536 or so, give or take possible oddball versions. That's pretty sharp, out of 21" or so.

Laptops are surpassing HD now, so the DPI is finally climbing again. Not to mention Apple's things with excessively high resolutions. I find those screens rather annoying, because very few PC programs know how to scale correctly, so you have to set Windows to "100% desktop size", and either run at native resolution and squint, or throw away the pixels and run at a lower resolution.

Tim

--
Seven Transistor Labs, LLC
Electrical Engineering Consultation and Contract Design
Website:

formatting link


The problem with shadow-mask and similar tubes (including Trinitrons in the horizontal direction) is that you can't really address any single "RGB pixel" through the shadow mask. In practice, you have to observe the Nyquist sampling theorem and use a "fat" (low-pass) electron beam to avoid moiré and other aliasing issues.

For computer-generated graphics rendered at a specific LCD/plasma panel's native resolution, you do not have to worry about this. It is only when displaying real images at a different resolution that you really need to pay attention to sampling theory.
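
A minimal one-dimensional illustration of that point (a sketch assuming numpy, not display-specific code): resampling fine detail onto a coarser grid without low-pass filtering (the equivalent of a too-sharp beam) folds it into a spurious low frequency, while averaging first (the "fat beam") suppresses it.

    import numpy as np

    # Fine-grained pattern at 0.45 cycles/sample, resampled onto a grid
    # with 1/4 the sample rate (whose Nyquist limit is 0.125 in fine-grid units).
    fine = np.cos(2 * np.pi * 0.45 * np.arange(1000))
    step = 4

    naive = fine[::step]                            # point sampling: aliases to |0.45*4 - 2| = 0.2 cycles/sample
    filtered = fine.reshape(-1, step).mean(axis=1)  # box low-pass ("fat beam"), then decimate

    # Peak spectral magnitude of each resampled signal:
    print(np.abs(np.fft.rfft(naive)).max())     # ~125: full-strength alias (moire)
    print(np.abs(np.fft.rfft(filtered)).max())  # ~19: heavily attenuated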

--
upsidedown

I'm using a Sony GDM-FW900 CRT, a 24-inch, 42 kg monster claimed by some of its owners to be the best monitor *ever* made. The ~23-inch viewable area with 16:10 aspect ratio has a maximum resolution of 2304x1440 at ~120 DPI. Personally, I wouldn't be so bold as to claim that it's "the best ever made" but, apart from the high pixel density, it also displays a superb tonal range that puts any LCD monitor I've seen to shame.

--
Pimpom
