Why is AC 60 Hz?

Hi,

Why is AC 60 Hz and not 30 Hz? What is so significant about 60 or 50 Hz? Can't bulbs or electric motors run at 20 Hz or so?

Dave

Reply to
Dave

A light bulb flickering at 20Hz would drive you nuts within 2 minutes.

50 Hz is a frequency that most humans perceive as steady light; 60 Hz is better. The frequency is generated by - um - the generators that run at your local power plant. They are kept strictly in sync over the day (in the so-called "modern" world), because many clocks depend on the correct frequency to keep time.
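To see why the sync matters, here's a minimal sketch (with made-up numbers) of how a cycle-counting mains clock drifts when the line frequency is off:

```python
# Sketch: a mains-synchronized clock simply counts line cycles, so any
# sustained frequency error accumulates directly as time error.

def clock_error_seconds(nominal_hz, actual_hz, hours):
    """Seconds gained (+) or lost (-) by a cycle-counting clock."""
    cycles = actual_hz * hours * 3600   # cycles actually delivered
    indicated = cycles / nominal_hz     # time the clock displays
    return indicated - hours * 3600

# Running 0.1% fast (60.06 Hz instead of 60 Hz) for a day:
print(clock_error_seconds(60, 60.06, 24))   # about +86.4 seconds
```

That's why utilities deliberately speed up or slow down overnight to zero out the accumulated cycle count.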

"Dave" wrote in the news message news: snipped-for-privacy@g43g2000cwa.googlegroups.com...

Reply to
Matthias Melcher

In response to what Matthias Melcher posted in news:d652ig$t11$05$ snipped-for-privacy@news.t-online.com:

but only because the flicker frequency is actually 100Hz - the lamp brightens and dims twice per cycle.
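A quick numeric check of the twice-per-cycle point: the filament responds to power, p = v^2/R, and the identity sin^2(wt) = (1 - cos(2wt))/2 puts the ripple at double the line frequency:

```python
# Filament power goes as v^2, and sin^2(wt) = (1 - cos(2wt))/2 --
# i.e. the ripple sits at twice the line frequency.
import math

w = 2 * math.pi * 50                # 50 Hz line
for t in [0.0013, 0.0071, 0.0144]:  # a few arbitrary instants
    lhs = math.sin(w * t) ** 2
    rhs = (1 - math.cos(2 * w * t)) / 2
    assert abs(lhs - rhs) < 1e-12   # identity holds numerically

print("power ripple is at 2x the line frequency: 100 Hz on 50 Hz mains")
```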

--
Joe Soap.
JUNK is stuff that you keep for 20 years,
then throw away a week before you need it.
Reply to
Joe Soap

Higher frequencies have higher losses. Your body has more resistance to higher frequencies. Your eyes can perceive light flicker at 40 hertz. Your nervous system runs at ~40 hertz.

So low frequency is bad for lighting and electrocution. High frequencies are bad for motors. Thus the compromise.

Dwayne

Reply to
Dwayne

That doesn't happen to be true. The so-called flicker rate for the eye is 24 Hz. Anything faster than that is perceived as steady light. Witness the standard TV set with a frame rate of 30 Hz (two interlaced fields at 60 Hz). No flicker there that I can see, nor anybody else for that matter.

Jim

Reply to
RST Engineering (jw)

You can't "see" it, but you can perceive it. Low-refresh-rate light sources are known to cause eye strain or even headaches. Anything below 100 to 200 Hz *will* be picked up by your neurons. You may not be directly aware of it, but it has some effect. To your brain, it is a kind of "noise" that has to be sorted out at low levels, hence the possible headaches and attention-span troubles.

As for TV and CRTs in general, your assertion is pretty much off, because another problem with CRTs is not just the frame rate but the fact that the image itself is not steady within each frame. So between a 30 Hz and a 100 Hz refresh rate on a CRT, I guarantee you can actually tell the difference.

Reply to
Guillaume

In response to what RST Engineering (jw) posted in news: snipped-for-privacy@corp.supernews.com:

So try feeding a lamp via a diode, and report back on the flicker.

--
Joe Soap.
JUNK is stuff that you keep for 20 years,
then throw away a week before you need it.
Reply to
Joe Soap

The other thing to take into account is that the lamp won't dim until the filament has cooled enough to emit a different-colored light. So a higher frequency also means it has less time to cool down...

--
MVH,
Vidar

www.bitsex.net
Reply to
Vidar Løkken

>> but only because the flicker frequency is actually 100Hz - the lamp
>> brightens and dims twice per cycle.
>
> So try feeding a lamp via a diode, and report back on the flicker.

If you feed an incandescent lamp with a 50 Hz signal, it'll flicker at 100 Hz. It won't flicker much, however, as the filament won't have enough time to really heat up and cool down.

If instead you feed it through a diode, it'll flicker at 50 Hz, and it will be much more noticeable. During half of the 50 Hz cycle it'll get no power at all, whereas without the diode its power would go up and down again during that part of the cycle.

Now, if you fed it through a four-diode bridge rectifier, the flicker would be back at the level without any diodes, assuming that the voltage drop across the diodes was small compared to the total voltage.

Fluorescent lights and LEDs and other lights flicker more than incandescent lights because they don't rely on something getting hot to create light, and so the light emitted is a function of how much power they're getting *now*, not of how much power they got in the immediate past. (The filament of the bulb cannot cool off much each cycle, so that decreases the amplitude, but not the frequency, of each `flicker'.)
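For what it's worth, the three cases above can be checked numerically. This is a rough sketch with idealized, zero-drop diodes; the "flicker rate" here is just the number of power peaks per second:

```python
# Instantaneous power into the filament for a plain 50 Hz feed, a single
# series diode (half-wave), and a four-diode bridge (full-wave).
import math

F, N = 50.0, 100000                # 50 Hz line, one-second window
dt = 1.0 / N
v = [math.sin(2 * math.pi * F * i * dt) for i in range(N)]

waveforms = {
    "no diode":  [x * x for x in v],             # p = v^2
    "half-wave": [max(x, 0.0) ** 2 for x in v],  # diode blocks negative half
    "full-wave": [abs(x) ** 2 for x in v],       # bridge flips negative half
}

for name, p in waveforms.items():
    peaks = sum(1 for i in range(1, N - 1) if p[i - 1] < p[i] > p[i + 1])
    print(name, peaks)   # no diode: 100, half-wave: 50, full-wave: 100
```

The bridge restores the 100 Hz flicker rate, exactly as described: only the series diode drops it to 50 Hz.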

I'm not sure if that's what you were getting at, Mr Soap, or not.

--
Doug McLaren, dougmc@frenzy.com
Death before dishonor. But neither before breakfast.
Reply to
Doug McLaren

I'm not sure we have evidence yet that Mr. Soap knows what he was getting at...

Reply to
Noah Little

In response to what Doug McLaren posted in news:uduhe.87076$ snipped-for-privacy@tornado.texas.rr.com:

A diode in the feed is the easiest way of getting 50 Hz flicker; full-wave rectification is irrelevant to the discussion. If you feed a 50 Hz sine wave from an oscillator into a power amplifier driving a bulb, you will also get noticeable flicker. The flicker effect will vanish at about 70 Hz.

--
Joe Soap.
JUNK is stuff that you keep for 20 years,
then throw away a week before you need it.
Reply to
Joe Soap

The reason for the flicker at 50 Hz with a diode is more a function of the 50% off time than of the frequency per se.
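In numbers (again with an idealized, zero-drop diode), the half-wave feed is dark half of every cycle, and the average power drops to half as well:

```python
# Compare one full cycle of plain-AC power against half-wave power.
import math

N = 100000
v = [math.sin(2 * math.pi * i / N) for i in range(N)]  # one full cycle

p_full = [x * x for x in v]
p_half = [max(x, 0.0) ** 2 for x in v]   # diode blocks the negative half

off_fraction = sum(1 for p in p_half if p == 0.0) / N
avg_ratio = sum(p_half) / sum(p_full)

print(round(off_fraction, 3), round(avg_ratio, 3))  # about 0.5 and 0.5
```

So the filament sees 50% dead time per cycle on top of getting only half the average power, which is why the diode trick makes the flicker so obvious.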
--
John Miller
email domain: n4vu.com; username: jsm(@)
Surplus (For sale or trade):
Besson International Trumpet by Kanstul
Reply to
John Miller

Not true generally.

I believe this figure comes from movies running at 24 frames per second. But with movies the dark period is short and the light period is long, and I am told that projectors interrupt the light twice per frame to raise the flicker rate to 48 Hz.

60 Hz - I occasionally see flicker. 56 Hz - used by some monitors in 800x600 mode - I often see flicker.

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

Electrocution has nothing to do with the choice of 50-60 Hz, since these frequencies are actually close to as bad as it gets for electrocution. The way I hear it, 50 and 60 Hz are good for motors. Otherwise the best frequency overall is higher.

- Don Klipstein ( snipped-for-privacy@misty.com)

Reply to
Don Klipstein

In most cases projectors expose each frame twice, making the fundamental of the flicker frequency 48 Hz (and incidentally meaning that the pulldown has to be complete in 1/96th of a second).

Even with this, a common indication that the screen brightness is too high is the perception of flicker, as the flicker-fusion frequency is higher at greater light levels. Note that this is distinct from the lower-frequency beat effects you can get between a poorly filtered lamp supply and the second harmonic of the shutter.

Try turning the brightness down; flicker is MUCH more apparent at high brightness.
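The projector arithmetic above, spelled out:

```python
# 24 fps film, two shutter flashes per frame.
fps = 24                 # film frames per second
flashes_per_frame = 2    # shutter exposes each frame twice

flicker_hz = fps * flashes_per_frame            # 48 Hz fundamental

# The pulldown must happen while the shutter is closed, i.e. within
# one quarter of the frame period (open-closed-open-closed):
pulldown_s = 1 / (fps * flashes_per_frame * 2)  # 1/96 s

print(flicker_hz, pulldown_s)   # 48 Hz, about 0.0104 s
```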

Regards, Dan.

Reply to
Dan Mills

Hmm this site:

formatting link
says: "It is generally accepted that Nikola Tesla chose 60 hertz as the lowest frequency that would not cause street lighting to flicker visibly. The origin of the 50 hertz frequency used in other parts of the world is open to debate but seems likely to be a rounding off of 60 Hz to the 1-2-5-10 structure popular with metric standards."

Actually, early silent movies were projected at 16 frames per second, as that was felt to be the slowest frame rate that looked smooth to the human eye. I feel we can sense when something is running even at 24 fps. Which is what Douglas Trumbull was addressing when he developed Showscan, which is 65 mm film running at 60 fps. I have seen this in "Brainstorm" and the simulation of reality is very unsettling. Showscan has been used in some theme park rides and simulators. Has anyone experienced any of these?

Richard

Reply to
spudnuty

Just thought I'd throw this observation in: I was recently breadboarding a counter/frequency-divider chip using a 32.768 kHz TTL oscillator source.

The counter chip train was set to divide by n multiples of 32768 (where n would be switch-selectable), then roll over to 0 while the "carry out" pin on the last chip in the train went from high to low for about one clock cycle (~30.5 microseconds). I had an LED hooked up to the carry-out pin. I set it for 8-second intervals and powered it up. To my surprise, I could "see" the ~30-microsecond flicker every 8 seconds! Maybe it should be worded as "perceive" rather than "see".

Got me to thinking: how low could I go on the pulse width and still perceive the flicker? What if I put an inverter between carry out and the LED, so that the LED transition would be from low to high? But I was in a hurry to get the project done, so I didn't go any further.

I'm sure this has been extensively studied, investigating all the variables (light source, rest interval, light intensity, spectral output, EEGs, age/sex/ethnic groups, etc.). Anybody know of any studies that have been done, or can refer me to a source?
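For reference, the divider arithmetic works out like this (assuming an ideal 32.768 kHz clock):

```python
# Divider chain: carry out pulses once per n * 32768 clock cycles.
f_clk = 32768                      # Hz, watch-crystal oscillator

n = 8                              # switch-selected multiple
interval_s = (n * 32768) / f_clk   # time between carry-out pulses
pulse_s = 1 / f_clk                # carry-out is low for ~one clock cycle

print(interval_s)                  # 8.0 seconds between blinks
print(round(pulse_s * 1e6, 1))     # about 30.5 microseconds wide
```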

Reply to
Charles Jean

True.

75 Hz - I often see flicker (CRT screen).
Reply to
Ken
