LCD VGA input circuit

How do LCD monitors get the correct dot clock frequency to sample the incoming VGA signal? If all you have is an HSYNC frequency, do they do a lookup or something more fancy? What chips are typically used to recover the clock, or is it built into some huge SoC?

Reply to
a7yvm109gf5d1

Something more fancy: A phase-locked loop, derived from HSync.

Since the LCD "knows" it has, e.g., 1280 pixels to display, a PLL is configured such that it generates 1280 pulses (pixel clocks) between the active edges of HSync. Take a look at the data sheet for a digitizer meant for LCDs, e.g., the Analog Devices AD9884.
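
Concretely, the relation is just f_dot = N x f_H, where N is the programmed clocks-per-line count. A minimal sketch of that arithmetic, using illustrative VESA 1280x1024@60 numbers rather than anything from this thread:

#include <stdio.h>

int main(void)
{
    /* Illustrative numbers (VESA 1280x1024@60), not from this thread. */
    double f_hsync = 63981.0;   /* measured line rate, Hz            */
    int    plldiv  = 1688;      /* clocks per line, set by firmware  */

    /* A locked divide-by-N PLL forces f_vco = N * f_hsync. */
    double f_dot = plldiv * f_hsync;
    printf("recovered dot clock: %.2f MHz\n", f_dot / 1e6);  /* ~108 MHz */
    return 0;
}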

---Joel

Reply to
Joel Kolstad

I appreciate how the clock is generated; I want to know how the thing knows _which_ dot clock to generate. I can think of many situations where the same HSYNC would have a different dot clock. Look at it this way. You are in a cardboard box. All you have is a tiny slot through which I slip you a piece of paper with 15743 Hz written on it. Now guess which video mode I'm in, and what the correct dot clock is?

How?

Yup, but I want to know how the thing knows it needs 1280 dot clocks per line. It must have a timer that measures the HSYNC, looks up which video mode is closest, then programs the genlock?

"A Voltage Controlled Oscillator (VCO) generates a much higher pixel clock frequency. This pixel clock is divided by the value PLLDIV programmed into the AD9884A, and phase compared with the HSYNC input."

You see, you still need to program the value yourself via the I2C interface. You don't just toss in a HSYNC and it magically determines the dot clock.
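
For what it's worth, that is roughly the usual approach: firmware measures the HSYNC rate (and lines per frame), matches it against a table of standard timings, and programs PLLDIV with the matched mode's horizontal total. A rough sketch of the idea follows; the table entries are real VESA timings, but the function names, error weighting, and structure are illustrative assumptions, not the AD9884A's API:

#include <stdio.h>
#include <stdlib.h>
#include <math.h>

struct mode {
    const char *name;
    double f_hsync;   /* line rate, Hz                   */
    int    vtotal;    /* total lines per frame           */
    int    htotal;    /* clocks per line -> PLLDIV value */
};

static const struct mode table[] = {
    { "640x480@60",   31469.0,  525,  800 },
    { "800x600@60",   37879.0,  628, 1056 },
    { "1024x768@60",  48363.0,  806, 1344 },
    { "1280x1024@60", 63981.0, 1066, 1688 },
};

/* Pick the table entry closest to the measured sync numbers. The
 * relative-error metric here is an arbitrary choice. */
static const struct mode *guess_mode(double f_h, int lines)
{
    const struct mode *best = NULL;
    double best_err = 0.0;
    for (size_t i = 0; i < sizeof table / sizeof table[0]; i++) {
        double err = fabs(table[i].f_hsync - f_h) / table[i].f_hsync
                   + (double)abs(table[i].vtotal - lines) / table[i].vtotal;
        if (!best || err < best_err) { best_err = err; best = &table[i]; }
    }
    return best;
}

int main(void)
{
    /* e.g., measured ~31.5 kHz and 525 lines/frame -> 640x480@60 */
    const struct mode *m = guess_mode(31500.0, 525);
    printf("best guess: %s -> program PLLDIV = %d\n", m->name, m->htotal);
    return 0;
}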

I'm asking because of a personal project that's been on hold for a long time.

Reply to
a7yvm109gf5d1

The guy that built the display told the PLL designer that that's how many pixels the display has. Actually, if you had 1280 pixels, you probably wouldn't use 1280 * Fh for a dot clock, because then you'd be displaying horizontal retrace at the edges, so you'd up it a little and gate the actual pixel address counter with the horizontal blanking pulse.

See above. It doesn't have to "know" anything - the circuit is designed to work for however many pixels across that the designer put into it.

I think you're stuck on the idea that the signal has to somehow "know" something about what mode it's being displayed in, but it couldn't give a shit less. If you get a chance, take a look at TV video with the horizontal sweep set to give about one line across; you get sort of a top view of the edge of the picture, where height = brightness. Sync it to a whole frame/field and you get something like an edge-on view.

Cheers! Rich

Hope This Helps! Rich

Reply to
Rich Grise

The only problem is that'd give the wrong result.

"1280x960" 108.00 1280 1376 1488 1800 960 961 964 1000 +hsync +vsync A B C D E F G H clock# action

0 first pixel 1279 last pixel 1280 right-hand overscan (aka border) 1376 overscan end/horizontal sync start (retrace period) 1488 horizontal sync end/left overscan start 1800/0 first pixel 1279 last pixel there's more clocks per line than there are pixels.
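
To put numbers on that modeline, the dot clock, line rate, and refresh rate are tied together by the totals; a small calculation, with values taken directly from the modeline above:

#include <stdio.h>

int main(void)
{
    /* Numbers straight from the "1280x960" modeline above. */
    double dotclk = 108.00e6;
    int htotal = 1800, vtotal = 1000, hactive = 1280;

    double f_h = dotclk / htotal;   /* 60.0 kHz line rate   */
    double f_v = f_h / vtotal;      /* 60.0 Hz refresh rate */
    printf("line rate %.1f kHz, refresh %.1f Hz, %d of %d clocks active\n",
           f_h / 1e3, f_v, hactive, htotal);
    return 0;
}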
--

Bye.
   Jasen
Reply to
jasen

Those are trivial details next to finding out what mode you're in. I'm thinking of counting the hsyncs per frame so at least I know the vertical resolution. From there I can guess what mode I'm in. Problem is I'm dealing with an entirely programmable video chip, and there's no fixed H-V relationship. Ugh.
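
A rough sketch of that line-counting idea, assuming a polled microcontroller; read_hsync()/read_vsync() are hypothetical pin-read functions standing in for whatever the part actually provides:

#include <stdbool.h>

extern bool read_hsync(void);   /* hypothetical: level of the HSYNC pin */
extern bool read_vsync(void);   /* hypothetical: level of the VSYNC pin */

static bool rising(bool now, bool *prev)
{
    bool edge = now && !*prev;
    *prev = now;
    return edge;
}

/* Count HSYNC edges between consecutive VSYNC edges. The result is the
 * total lines per frame (active lines plus vertical blanking), so the
 * active resolution still has to be estimated by subtracting blanking. */
int lines_per_frame(void)
{
    bool ph = false, pv = false;
    int count = 0, frames = 0;

    while (frames < 2) {            /* arm on one VSYNC, measure to the next */
        if (rising(read_vsync(), &pv)) {
            if (++frames == 2)
                return count;
            count = 0;
        }
        if (frames == 1 && rising(read_hsync(), &ph))
            count++;
    }
    return count;
}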

Reply to
a7yvm109gf5d1

Not the signal, the monitor.

So the monitor doesn't need to know that the incoming signal is 640x480 VGA, and it magically appears on the RSDS lines to the gate drivers on the panel as 1280x1024 native resolution?

Holy crap.

Reply to
a7yvm109gf5d1

Computer video cards have been like that since forever. CGA didn't offer the opportunity to change the pixel clock, but all else could be tweaked (sometimes not real good for the monitor).

I can dial up pretty much any mode I want by editing a text file for XFree86; Windows users can do the same by editing the registry.

Bye. Jasen

Reply to
jasen

I don't mean in software on the PC, I mean a separate piece of hardware connected only to the analog VGA port. Just from the signals present on the sub-D15 connector, how do you figure out what the correct video mode is? I've got a microcontroller set up right now to give me a /1016 ratio on my PLL, but what can I do when the source video chip is fully programmable with respect to syncs?

Reply to
a7yvm109gf5d1

LCDs have a physical native resolution. Just the same, most sync signals are set up for physical CRTs. CRTs require retrace time, both horizontally and vertically. Read up on NTSC signal composition, especially vertical blanking; it is the basis of all display timings. Next, when you have assimilated that, read up on X-Windows modelines. Then you will know for yourself.

--
 JosephKK
 Against stupidity, the gods themselves contend in vain.
  --Schiller
Reply to
joseph2k

The 1280-dot count is hard-coded in the firmware. The firmware programs the PLL so that 1280 pixel clock cycles (or so) are generated between horizontal sync pulses.

That is, if the display directly digitises the input signal to the display elements.

Reply to
Gary Tait
