Questions about video frequency?

Hi:

I have some questions regarding video frequencies.

formatting link

  1. Why is the horizontal frequency [15.734 kHz] higher than the vertical frequency [60 Hz]?

  1a. What would happen if the horizontal frequency were 60 Hz and the vertical frequency were 15.734 kHz? What would the video look like? [Assuming all devices in the video equipment could handle the frequencies without any damage.]

  2. What would happen if the horizontal frequency were 15.734 THz [instead of kHz] and the vertical frequency were 60 GHz [instead of 60 Hz]? What would the video look like? [Again, assuming all devices in the video equipment could handle the frequencies without any damage.]

Thanks,

Radium

Reply to
Radium

Kid, go and read the books. Once you grasp what is going on inside a TV set, you'll understand the sync frequencies.

Especially look into Farnsworth's youth, when he was plowing the fields.

Michael

Reply to
Michael Black

The scanning is from left to right, one line at a time, so the horizontal frequency is higher. The scanning (line by line) could be from top to bottom and, if it was, the vertical frequency would be higher.

When all of the lines for one picture are completed, then one frame has been sent ... the vertical frequency is the frame rate or picture rate.

Higher frequencies allow both more lines per frame and more frames per second. The appearance would be higher resolution and lack of any flicker effects (very smooth motion).
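The line-rate/frame-rate relationship described above can be checked with a little arithmetic. A quick sketch using the standard NTSC figures (which are where the thread's 15.734 kHz / 60 Hz numbers come from):

```python
# NTSC raster timing: the horizontal (line) rate equals the vertical (field)
# rate multiplied by the number of lines scanned per field.
line_rate_hz = 15_734.26   # horizontal frequency, ~15.734 kHz
field_rate_hz = 59.94      # vertical (field) frequency, ~60 Hz

lines_per_field = line_rate_hz / field_rate_hz
print(round(lines_per_field, 1))    # 262.5 lines per field
print(round(2 * lines_per_field))   # 525 lines per interlaced frame
```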

Reply to
Charles Schuler

You are replying to a not very bright troll.

Reply to
Don Bowey

Sigh ... so that makes me less bright than him/her? Or simply naive? Or eternally positive about education and information transfer?

I taught electrical engineering technology for 33 years and fit into the third category above.

Consider that others beside you and me and the OP read this stuff.

Reply to
Charles Schuler

yes some of us, not very bright lurkers/readers, do read this stuff and are thankful

Reply to
robb

60 Hz was chosen to match the environmental magnetic fields (e.g. those caused by nearby appliances), so they would not make the image dance.

No great difference.

MUCH less flicker, but environmental 60 Hz magnetic fields may blur the picture. Also, production cost would be astronomical.

--

Bye.
   Jasen
Reply to
jasen

And that is why I'll keep doing my thing until the worms get me. Thanks.

Reply to
Charles Schuler

Thanks for clearing this up.

Reply to
Radium

> appliances so as to not

How can 60 Hz magnetic fields cause "blurring" or "dancing" on a frame rate that is 60 GHz? Doesn't the magnetic interference have to be at the same frequency as the frame rate [the vertical frequency] in order to resonate with -- and hence cause disturbance on -- the receiving device [i.e. the video equipment]?

Reply to
Radium

Greetings Charles, Thanks for that clear explanation. Cheers, Eric R Snow

Reply to
Eric R Snow

You are most welcome.

Reply to
Charles Schuler

Good for you, but PLEASE make sure to point out the flaws in the stupid questions as well. Video scan rates in the terahertz range? There is no way to do it in the foreseeable future; the resulting "video" signal would be far up in the infrared. How would it be displayed? Even at 1% of current pixel size, just how big would that screen have to be? How much power would be needed for this insane concept? Radium likes to ask trick questions, to make fools of people.

formatting link
has a chart showing the relationship between frequency and where it falls in the spectrum. Radium's "Idea" would be well off the right side of the chart.

--
Service to my country? Been there, done that, and I've got my DD214 to prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida
Reply to
Michael A. Terrell

But forget about whether it could be done or not, there'd be no purpose.

It was precisely the exaggerated frequencies that made the post so outrageous. It wasn't about "what if the frequencies were a little bit different" but a really stupid figure.

Michael

Reply to
Michael Black

Because in a raster-scan system, one axis is the "fast scan" axis and the other is the slow axis. In standard video, it's simply been the norm for the fast axis to be horizontal, primarily because TV is based around the notion of "landscape format" (wider than tall) images. So you wind up scanning lots of lines (horizontal rate) for every single field or frame (vertical rate).

You'd have a TV system in which the scan lines ran vertically.

Not a damn thing in theory, in terms of the image format. What you have described is a system in which the ratio between the H and V rates is still the same, and so the image format would be unchanged - it's just operating at an outrageously (impossibly, actually) high frame rate.
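The point that the image format would be unchanged comes down to the H/V ratio, which fixes the number of lines per vertical period. A quick sketch of the arithmetic:

```python
# Scaling both rates by the same factor (kHz -> THz and Hz -> GHz is a
# factor of 1e9 in each case) leaves the lines-per-vertical-period ratio,
# and hence the image format, unchanged.
normal_ratio = 15.734e3 / 60.0      # 15.734 kHz / 60 Hz
absurd_ratio = 15.734e12 / 60.0e9   # 15.734 THz / 60 GHz

print(normal_ratio)   # ~262.2 lines per vertical period
print(absurd_ratio)   # the same ratio -- only the frame rate is absurd
```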

Bob M.

Reply to
Bob Myers

Let's first realize that "60 GHz" is clearly an absurd figure, and instead ask the question "what happens if the vertical rate is something other than the local power line rate?"

An earlier poster was correct in noting that the 60 Hz rate was chosen (at least in N. America) because it matched the power-line frequency, and this was important in order to minimize the visible effects of local magnetic fields. As long as the video field rate and the local power rate are exactly the same, the image will appear stationary; if these rates differ, then the local magnetic field created by power lines or electrically powered equipment will, in effect, "beat" with the field rate and cause the displayed image to move or distort at that "beat" frequency. For instance, if a CRT display is operated at a 65 Hz refresh rate, and there is a sufficiently strong field in the vicinity from a 60 Hz power line, there will be movement or distortion of the displayed image occurring at a 5 Hz rate.
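The "beat" behaviour is just the difference between the two rates. A minimal sketch, using the 65 Hz / 60 Hz figures from the example above:

```python
def beat_frequency_hz(refresh_hz: float, mains_hz: float) -> float:
    """Rate at which mains-field interference drifts across a CRT picture."""
    return abs(refresh_hz - mains_hz)

print(beat_frequency_hz(65, 60))  # 5 Hz: visible wobble/distortion at 5 Hz
print(beat_frequency_hz(60, 60))  # 0 Hz: interference pattern stands still
```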

Bob M.

Reply to
Bob Myers

The TRS-80 used a shitty B&W TV with no tuner/IF strip as a monitor. When you ran them on 50 Hz mains they were ugly, with black rolling bars. I figured out a hack, changing the divider chain and the crystal, which brought the refresh to 50 Hz and made them tolerable.
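A divider-chain hack like this works because the vertical rate is derived from a master crystal: divide the crystal down to the line rate, then divide by the line count to get the refresh rate. A hypothetical sketch (the crystal and divider values below are illustrative round numbers, not the TRS-80's actual timing chain):

```python
# Hypothetical timing chain: crystal -> line rate -> vertical (refresh) rate.
def vertical_rate_hz(crystal_hz: float, line_divider: int,
                     lines_per_frame: float) -> float:
    line_rate = crystal_hz / line_divider
    return line_rate / lines_per_frame

# 60 Hz design: 12.6 MHz / 800 = 15,750 Hz line rate, 262.5 lines per frame.
print(vertical_rate_hz(12_600_000, 800, 262.5))  # 60.0

# Swap the crystal and line count and the refresh lands on 50 Hz instead.
print(vertical_rate_hz(12_500_000, 800, 312.5))  # 50.0
```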

Reply to
Homer J Simpson

Well, what if the frequency of the power lines -- and all electrical equipment in the vicinity -- were 60 GHz?

This is true for CRTs, not for plasma displays or LCD screens. I have noticed that CRTs sometimes distort or shift the displayed image when another CRT or another computer is switched on or off.

Plasma screens and LCDs are immune to such interference.

Reply to
Radium

That probably explains why computer screens featured in movies tend to flicker.

Reply to
Radium

You wouldn't need a microwave to heat your coffee.

Reply to
Homer J Simpson
