I've got a Dell P1130 (Sony G520 in disguise) that is annoying me a little. I run it at 1600x1200, and am looking at a test picture that consists of alternating (1 pixel wide) black and white vertical lines. At a 60 Hz refresh rate, the picture is very crisp (ignoring the nausea-inducing flicker) - the individual black and white lines are quite distinguishable. At 85 Hz it becomes a little more blurred: the lines are still distinguishable, but the contrast between them is reduced. Increasing the rate to 100 Hz, which is my target (I'm flicker-sensitive enough to be able to spot 85 Hz without using my peripheral vision), almost turns it into a uniform grey mush. Obviously, there's a bandwidth issue here somewhere ...
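(Worth spelling out why this particular pattern is so harsh: alternating 1-pixel lines make the video signal swing from black to white and back on every single pixel, i.e. a square wave at half the pixel clock. No other test picture demands more analogue bandwidth from the card, cable and monitor.)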
The video card is a PowerColor 9800 SE, which is identical to pretty much every other vanilla or Pro 9800 (presumably they all use the ATI reference design). The cable is ... I'm not sure. It came with the monitor.
The question is, is it possible/likely that the VGA cable is the bottleneck? At a 100 Hz refresh rate, my calculations put the pixel clock at about 270 MHz. The video card has a 350 MHz RAMDAC (which unfortunately says nothing about the analogue stages after the RAMDAC), and the monitor is rated up to 370 MHz, so although the pixel clock is getting up there, it's not really pushing the limits of either component.
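For anyone who wants to check the arithmetic: the standard VESA 1600x1200 modes use a 2160x1250 total raster (visible area plus blanking), and assuming the 100 Hz mode keeps the same totals (there's no official VESA mode at 100 Hz, so this is a guess the driver may not match exactly), the numbers fall out like this:

    # Pixel clock = total raster size x refresh rate.
    # 2160x1250 is the VESA total for 1600x1200 (visible pixels
    # plus horizontal/vertical blanking); assuming the driver
    # keeps the same totals at 100 Hz.
    H_TOTAL, V_TOTAL = 2160, 1250

    for refresh_hz in (60, 85, 100):
        mhz = H_TOTAL * V_TOTAL * refresh_hz / 1e6
        print(f"{refresh_hz} Hz -> {mhz:.1f} MHz pixel clock")

The 60 and 85 Hz figures come out at 162.0 and 229.5 MHz, matching the official VESA modes, and 100 Hz lands right on the 270 MHz I quoted.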
On the other hand (and this is the only bit that's really relevant to this newsgroup :) ), how good is your average VGA cable as a low-pass filter? And more importantly, is there any difference between an average VGA cable and a really good one? The extra layers of shielding obviously help against interference, but interference isn't the problem in this case. Finally, where in Oz can you get a good VGA cable? There seems to be a lot of hyped-up stuff around, which makes finding the genuinely good stuff harder ...
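To put some rough numbers on the low-pass question: treating the cable as a simple first-order roll-off (a crude model - a real cable is a transmission line with skin-effect losses, and the cutoff figures below are made-up illustrations, not measurements of any actual cable), the alternating-line pattern's 135 MHz fundamental gets attenuated like this:

    import math

    # Crude model: cable as a first-order low-pass filter.
    # Relative amplitude of a sine at f through a filter with
    # -3 dB point fc is 1 / sqrt(1 + (f/fc)^2).
    def relative_amplitude(f_mhz, cutoff_mhz):
        return 1 / math.sqrt(1 + (f_mhz / cutoff_mhz) ** 2)

    # Alternating 1-pixel lines = square wave at half the pixel
    # clock: 270 MHz / 2 = 135 MHz fundamental at 100 Hz refresh.
    f = 135.0

    # Hypothetical cable -3 dB points, in MHz.
    for cutoff in (100, 200, 400):
        print(f"{cutoff} MHz cutoff -> "
              f"{relative_amplitude(f, cutoff):.0%} of full contrast")

That gives roughly 60%, 83% and 95% of full black-to-white contrast for the three hypothetical cutoffs. So a cable that looks perfectly fine at 60 Hz (81 MHz fundamental) could plausibly smear the 100 Hz pattern into grey, which would fit what I'm seeing - the question is whether a better cable actually moves that cutoff much.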