Observations on a UPS - follow up to a previous post

Recently I asked about suggestions regarding a UPS. I ended up getting an 875 VA 525 Watt "Geek Squad" model from Best Buy - yeah, yeah, everyone says Geek Squad stuff is overhyped junk, but at $69 on sale, the price seemed right.

It seems to handle my 2 computers fine - a PIV 2.4 GHz and a PIII 933 MHz sharing a monitor. With both machines and the monitor on, the onboard readout shows them well below the unit's max capacity, drawing about 0.250 - 0.260 kW (i.e. 250 - 260 watts), with an estimated run time of 9 minutes with both computers. More than enough to get me through short outages with both machines running.

Interesting to note how much of a difference the monitor makes. Without the monitor - a 17" MAG CRT - the draw for both computers drops under 200 watts, and the estimated run time for the 2 computers goes from 9 mins to 15 mins. It's over 20 mins with just one computer running and no monitor.
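For what it's worth, the arithmetic behind those runtime figures is easy to sketch. Here's a rough Python back-of-the-envelope, assuming the UPS simply divides usable battery energy by the measured load; the 38 Wh figure is an assumption backed out of the readout, not a spec, and real batteries deliver more energy at lighter loads, which is why the measured 15 minutes beats the linear estimate:

# Naive linear runtime model: minutes = usable energy / load power.
# usable_wh is an assumed figure inferred from the 9 min @ ~255 W readout.
def runtime_minutes(load_watts, usable_wh=38.0):
    return usable_wh / load_watts * 60

print(runtime_minutes(255))  # ~8.9 min - close to the 9 min the UPS reports
print(runtime_minutes(200))  # ~11.4 min linear; the UPS says ~15, since
                             # batteries do better at lighter discharge rates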

Since this thing has a built-in watt usage meter, any reason I couldn't hook it up to say a refrigerator or TV to check how much wattage they're using?

Reply to
Doc

Yeah, CRT monitors take a lot of juice - it's one of the reasons flat panels are so popular - though I still prefer a good CRT, as it looks slightly better to my eyes.

Sure, you can plug in other items, though motorized appliances probably won't like the modified sine wave those things put out when running on battery.

Your best bet for that is to buy a Kill A Watt or similar device. They're only about 25 bucks and will do so much more: you get accurate measurements of watts, volts, amps, volt-amps, power factor, and accumulated kWh, and you can plug in anything you want.
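And once you have a reading, turning it into something useful is simple arithmetic. A quick Python sketch (the 150 W fridge figure and $0.12/kWh rate are made-up example values):

# Convert an average power draw into monthly energy use and cost.
def monthly_cost(avg_watts, rate_per_kwh=0.12, hours=24 * 30):
    kwh = avg_watts / 1000 * hours          # energy in kilowatt-hours
    return kwh, kwh * rate_per_kwh

kwh, cost = monthly_cost(150)               # e.g. a fridge averaging 150 W
print(f"{kwh:.0f} kWh, ${cost:.2f}")        # 108 kWh, $12.96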

Reply to
James Sweet

I don't like the way LCD/flat panel monitors look. The image isn't as sharp, and they lose brightness unless you're sitting dead-center in front of them, even on the models that supposedly have a wider viewing angle.

Reply to
Doc

I'd have to disagree on this: if there is anything that is a clear win for LCD, it is image sharpness... it doesn't get any sharper than one-to-one pixel representation. That can be offset in perception if the particular LCD has poor contrast, but it is still just as "sharp", though not as vibrant.

True, but it's a computer monitor - how many positions do you really need to be in while viewing it?

Reply to
kony

Agreed, but only at native resolution, as others have said, and only if what you are viewing is standing still. Even with the fastest LCD panels, motion blur is still a problem, although it is probably more noticeable at the lower resolutions of 'standard' TV transmissions displayed on standard LCD TV sets. Much as I like the picture on my (expensive) widescreen HP LCD monitor, and though I do agree with you about sharpness, I still have to come down just on James' side: I too think there is something fundamentally 'better' looking about a good fine-dot-pitch CRT monitor, but if you asked me to define "good" in this context, I don't think that I could ...

Arfa

Reply to
Arfa Daily

Have you seen significant motion blur on a current generation smaller LCD? It seems everyone is upsizing, which offsets the improvements being made.

Reply to
kony

Well, certainly on LCD TV sets, yes. Oddly enough, I was looking at just that in a store last night. They all had a studio-based news broadcast on them, and it was superb as long as they were in the studio, where everything was basically standing still and properly lit. As soon as it cut to an outside broadcast, on the other hand, there was motion blur on them ALL. Some were worse than others, in that the blur was not just a function of panel speed but also of drive artifacts. These were not cheap sets either; many were from big-name houses. The larger-screen sets actually seemed to fare somewhat better than the small ones in my opinion, and the plasmas were a little better again, but none of them produced what I would describe as a 'good' picture in this respect, compared to a CRT set of any size or vintage - even my 10-year-old large-screen Toshiba. There's a world of difference between LCD pixel switching times in the millisecond bracket, and fast phosphor responses down in the microsecond range.
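To put rough numbers on that difference, here's a back-of-the-envelope Python sketch (assumed example figures, and it ignores sample-and-hold blur, which adds further smear on LCDs regardless of switching speed): a moving edge smears across roughly speed times response-time pixels.

# How far (in pixels) a moving edge smears during one panel response time.
def blur_pixels(speed_px_per_s, response_s):
    return speed_px_per_s * response_s

# An object crossing a 1280-pixel-wide screen in one second:
print(blur_pixels(1280, 0.016))    # ~20 px smear on a 16 ms LCD panel
print(blur_pixels(1280, 0.00005))  # ~0.06 px with a 50 microsecond phosphor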

I actually think that at the moment, digital display technology - without wishing to open up *that* can of worms again - lags behind CRT display technology, by a significant amount. Next time you go to the cinema, look up at the booth window and see if you can see film looping around the ceiling. If you can't, then it uses one of those new-fangled DLP video projectors. Sit back comfortably with your popcorn, and wonder what's happened to your eyes, when the first car drives across the screen ... d;~}

Arfa

Reply to
Arfa Daily

Many people, in daily use, cannot see any lag or ghosting on 19" and smaller LCD computer monitors.

If you can't actually see it, does it matter if it exists? I can play 50 FPS video, or games running at over 50 FPS, on a 19" LCD computer monitor and not see any problems except the obvious lack of contrast (but with CRT I am spoiled in this respect, having bought Diamondtron tube based monitors for the last few I used myself before switching to primarily LCD usage).

Reply to
kony

I sure can. Maybe my eyes are just better than average - there's those "golden ear" audiophools I always thought were nuts, but maybe some of them aren't as nutty as I thought. I've got a high end 20" flat panel on my desk at work, and it looks really good, but still not as good as the 22" flat Trinitron CRT I have at home. The geometry is flawless, but the picture doesn't look as smooth and clean as the CRT; it looks more "digital".

Reply to
James Sweet

I did not write "some LCD", I wrote about current generation 19" and lower.

It doesn't matter if you see ghosting on 20"+, for the purposes of the discussion, which is whether the smaller, comparable resolutions exhibit it.

If we were talking about higher resolutions than native to 19", then CRTs lose on another front, because their refresh rate drops and their pixel boundaries get so blurred that the output is no longer accurate.

Looking more "digital" is not necessarily a flaw. A video card does not transmit an infinitely high-res, flawless image; it transmits pixels. Accurately representing those pixels is the monitor's job, not blurring them so they look more lifelike.
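One way to see the point about pixels is a tiny Python illustration (the linear interpolation here is a stand-in for what a panel's scaler does, and the numbers are made up): map a row of pixels 1:1 and every value survives; scale by a non-integer factor and the scaler has to invent in-between values.

# One row of source pixels: a hard black-to-white edge.
row = [0, 0, 0, 255, 255, 255]

# At native resolution the mapping is 1:1 and every value survives exactly.

def upscale(src, factor):
    """Non-integer upscale with linear interpolation (what a scaler does)."""
    out = []
    for i in range(round(len(src) * factor)):
        pos = i / factor
        lo = int(pos)
        hi = min(lo + 1, len(src) - 1)
        frac = pos - lo
        out.append(round(src[lo] * (1 - frac) + src[hi] * frac))
    return out

print(upscale(row, 1.25))  # [0, 0, 0, 102, 255, 255, 255, 255]
                           # the hard edge now passes through a grey value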

Reply to
kony

Hi James, goes it well? Yes, that about says it. Perhaps it is just the level of discernment, and it *is* just us, but that doesn't explain why my wife thinks that the pictures are 'fuzzy' when anything is moving on them, but makes no such comment when watching our 34" CRT Tosh TV, or at the cinema when it's projected film stock rather than a DLP video projector. She has no technical axe to grind, as it were, and is interested in the picture only for its entertainment value.

Since I have had this high-end HP widescreen LCD on the computer, which she also uses, she has made little comment other than that it looks "nice", which is true of the typically stationary pictures that are normally displayed on it. I have, however, heard her comment that the pictures on my son's (equally high-end) HP LCD are "out of focus", typically when he is playing a game. Being non-technical, "out of focus" is the best description she can come up with for 'motion blur'.

Arfa

Reply to
Arfa Daily

Only if it has DVI output, and you are making use of it. Many video cards still in common use output three analogue waveforms, created by high-speed DACs with at least 16-bit inputs, via the VGA socket, which the monitor - CRT or LCD - displays via pixels made up either of phosphor triads or of LC cells. As we live in an analogue world, I fail to see how you can contend that something which looks "more digital" is not flawed. If the display looks in any way different from how the real world looks, then it is an inaccurate representation, which by definition makes it flawed. If the CRT display does anything to make the picture look closer to reality, then that must make it more accurate, and thus less flawed.

I'm not too sure why you feel that a CRT monitor's refresh rate has any impact on the accuracy of the displayed rendition of the input data. High refresh rates are a necessity to facilitate high resolutions. The response times of the phosphors are plenty short enough for this to not represent a problem. I do not understand what you mean by a CRT's pixel boundaries (?) getting blurred, and how that fits in with refresh rate.
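On the refresh-rate and resolution relationship, the bandwidth side is easy to estimate: the pixel clock is roughly total pixels per frame times refresh rate, plus blanking overhead. A rough Python sketch (the ~25% blanking figure is an assumed typical value, not a spec):

# Rough pixel clock required for an analogue CRT video mode.
def pixel_clock_mhz(h_pixels, v_pixels, refresh_hz, blanking=1.25):
    """blanking=1.25 assumes ~25% overhead for horizontal/vertical retrace."""
    return h_pixels * v_pixels * refresh_hz * blanking / 1e6

print(pixel_clock_mhz(1600, 1200, 85))  # ~204 MHz - why video DACs must be fast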

The last thing that you say is a very odd statement. If the CRT monitor does anything to make the image more lifelike, how do you make that out to be a bad thing? By logical deduction, if any display technology reproduces the data being sent to it more accurately than any other, and this actually looks less lifelike than reality, then the data being sent must be inaccurate, and thus flawed ...

Arfa

Reply to
Arfa Daily

Well, the LCD I have at work runs 1600x1200 native, the same as I run my CRT at. This whole discussion is really moot: the CRT looks better to *me*, and that's all that matters; I don't care what the specs say or what others claim. *I* see and notice the disadvantages of LCD panels, they bother *me*, and therefore *I* prefer a good CRT. If you prefer a flat panel, then get one - but this is a personal preference.

I want the image to look lifelike, and the CRT does a good job of that - what do I care if that's not the "monitor's job"?

Reply to
James Sweet

False. While DVI certainly helps more at higher resolutions, it is a separate factor.

It's pretty easy to understand once you realize that the picture the video card is attempting to display, which was generated by the OS, IS DIGITAL. Anyone knows that conversion back and forth between digital and analog causes loss (to whatever extent - which must be a large extent, if you deem the conversion to change the image enough that you feel it's somehow better).
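To make the loss concrete, here's a pure illustration in Python - 8-bit quantisation of a smooth signal, nothing specific to any particular video card:

import math

def quantize(x, bits=8):
    """Round x in [-1, 1] to the nearest of 2**bits levels and back."""
    levels = 2 ** bits - 1
    return round((x + 1) / 2 * levels) / levels * 2 - 1

signal = [math.sin(2 * math.pi * i / 100) for i in range(100)]
worst = max(abs(s - quantize(s)) for s in signal)
print(worst)  # ~0.004: half a quantisation step, lost for good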

WRONG. An accurate representation is one that preserves as much of the input information as possible, not one that blurs it so that it becomes somehow closer to smooth while simultaneously losing information in the process, becoming less detailed.

If all you want is blurry, smear some bacon grease on your screen!

Sorry but you are 100% wrong.

Reply to
kony

Well, I'm sorry too, but it is you who is wrong. You would be right if we were talking about a signal that was being converted back and forth between types or standards, but in the case of a computer-generated picture, we are not. We are talking about a digitally created image of something that needs to be an analogue one for our eyes to see. Whether the conversion from digital to analogue takes place at the video card or at the face of the monitor, it is still a necessity that it takes place. The ultimate goal is to make it look as lifelike as possible. If you think that making it look sharper or in some way different (or, in your opinion, better) than real life constitutes accuracy, then you have a very odd understanding of what the word 'accuracy' means in this context.

Bacon grease ?? What a silly thing to throw into a discussion.

And what does your declaration of "false" about DVI mean? If you want to talk about card-outputted 'pixels', then you need to be talking digital, which is what a DVI output is. Otherwise, it's analogue - as close as doesn't matter - from the VGA socket.

And there's no need to shout by capitalization. I am neither deaf nor stupid ... d;~}

Arfa

Reply to
Arfa Daily

Just grouchy, some days? (Like the rest of us.) ;-)

--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
Reply to
Michael A. Terrell

It is your goal to blur the information, which is what the grease would do.

Pixel data is output by a computer to a video card. Since human vision has far finer granularity, the image is not expected to look like reality except to the depth of detail that the pixel data - the resolution - allows. If the pixel data is not preserved but rather smoothed to reduce your perception of the pixels, that is also removing "data" from the image; it is less accurate than the output was intended to be. Monitor manufacturers strive to accurately reproduce the image, not to make it aesthetically pleasing.

The goal is accuracy, not "lifelike". Lifelike and accurate can coexist, but that will come from higher resolution, not from degradation of the signal upon output as you propose.
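The information-loss point can be shown directly: after a blur, two images that started out quite different can become identical, so no amount of processing can recover the original. A minimal Python sketch with a 3-tap box filter:

# Two very different rows of pixels...
sharp = [0, 255, 0]    # a bright one-pixel line on black
flat  = [85, 85, 85]   # uniform grey

def box_blur(row):
    """3-tap mean filter with edge clamping."""
    return [round((row[max(i - 1, 0)] + row[i] + row[min(i + 1, len(row) - 1)]) / 3)
            for i in range(len(row))]

print(box_blur(sharp))  # [85, 85, 85]
print(box_blur(flat))   # [85, 85, 85] - identical, so the detail is gone for good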

Reply to
kony

Capitalization is also used in text for emphasis, not just shouting.

Reply to
kony

Yeah, OK. I give in. You are right. I couldn't be more wrong if I tried. 35 years down the pan. Just as an experiment, I wiped goose grease all over my LCD monitor, and guess what? YOU ARE RIGHT !!!! (that's for excited emphasis - I'm not shouting at you). My picture is now so blurred that it looks just like the real world when I don't have my specs on. Accuracy or what?!!! Have you thought of marketing this idea? You could put it in tubs and sell it on the net as "Kony's patent image enhancing compound (blended with REAL snake oil)".

My next plan is to see if I can drop a couple of bits on the input to the video card's DAC. That should increase the 'granularity' no end. This is another idea that could be put forward to monitor manufacturers to help them in their goal of making the reproduced image anything but lifelike, and better yet - *less* aesthetically pleasing !!

Boy, you're a lad ! All these wickedly good ideas ! If you don't market them yourself, *I'm* gonna, and get really rich. Then you'll be sorry ! ;-)

Arfa

Reply to
Arfa Daily

The problem with goose grease is the high number of geese it would take to treat all monitors. :-)

Ok, I never did think capitalizing as shouting worked very well anyway, since the person has to read the text either way for it to matter.

I'm not the one who wants to end up with less than the computer was designed to output. Yes, the grease idea is crazy and has no merit, but it is the same type of degradation (albeit to a greater extent) that causes your more lifelike image. CRT manufacturers didn't aim for that; it was just a result of the coating and the thick glass. If LCD manufacturers wanted this, they could put a thick diffuser sheet on the front.

Ok!

Reply to
kony
