Standard mains voltage in the US?

When I started out in electronics some 40 years ago, the few books and magazine articles I could get my hands on - mostly of American origin - often cited 117V as the mains voltage. Later on, I've come across various figures like 110, 115 and 120 volts. What, if any, is the standardized domestic supply voltage there, notwithstanding variations in what you actually get in your home?

(Over here, during a period of a decade and a half when the government-run power agency often called on me for help with technical matters, I advised them to aim for 230V at the consumer outlet).

Reply to
Pimpom

That should be 'some *50* years ago'.

Reply to
Pimpom

Pimpom wrote in news:UZCAC.158590$%B3.44000@fx12.ams1:

In the US I quote this...

"Mains power is sometimes spoken of as 110 V; however, 120 V is the nominal voltage."

"In 2000, Australia converted to 230 V as the nominal standard with a tolerance of +10%/-6%, this superseding the old 240 V standard, AS2926-1987."
Reply to
DecadentLinuxUserNumeroUno

On Sunday, 15 April 2018 08:20:33 UTC+1, snipped-for-privacy@decadence.org wrote:

UK nominally converted to 230v with asymmetric limits too, but IRL the target voltage is still 240v. It remains the same so that old equipment is still compatible, plus no need to spend on making any significant distribution changes. 230v can mean 220, 230 or 240v.
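The tolerance arithmetic can be checked with a short script (my own sketch, not from the standard — the +10%/-6% figures are the ones quoted above):

```python
# Quick check of the asymmetric 230 V limits (+10 % / -6 %) against
# the legacy 220 V and 240 V nominals.

def band(nominal, plus_pct, minus_pct):
    """Return the (low, high) limits for a nominal voltage."""
    return (nominal * (1 - minus_pct / 100),
            nominal * (1 + plus_pct / 100))

low, high = band(230, 10, 6)
print(f"230 V band: {low:.1f} .. {high:.1f} V")   # 216.2 .. 253.0 V

for legacy in (220, 240):
    print(f"{legacy} V nominal inside band: {low <= legacy <= high}")
```

Both legacy nominals sit comfortably inside the 216.2-253.0 V band, which is what lets a country keep "shooting" for 240 V while declaring 230 V.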

NT

Reply to
tabbypurr

target voltage is still 240v. It remains the same so that old equipment is still compatible, plus no need to spend on making any significant distribution changes. 230v can mean 220, 230 or 240v.

Mains voltages always vary significantly. So why would they need to "shoot" for 240 V vs 230 V to maintain compatibility? If the spec says the voltage can sag to X which is out of spec for some older equipment, what difference does it make if they were "shooting" for 240 or 230 when it sagged to X and the equipment fried or stopped running?

Reply to
gnuarm.deletethisbit

I'm kind of struggling to think of any (UK) mains powered equipment which has any problem operating at any of those voltages. In fact I've found most stuff specced for 230V will work perfectly fine at anything over about 190VAC. Where I am it's 240 and that's generally surprisingly constant.

--
This message may be freely reproduced without limit or charge only via  
the Usenet protocol. Reproduction in whole or part through other  
Reply to
Cursitor Doom

It regularly exceeded 127V at our house and we had light bulbs blow in the evening. Only some stern complaining including scope shots convinced the utility to have someone come out. He revealed that a capacitor bank gets "slammed on" at fixed times, by a simple timer. They corrected a transformer tap and no more bulbs blew.

There it can exceed 260V.

Long story short, if you design something make sure it can handle voltages that exceed "standard" tolerance levels by a lot. In Australia by a whole lot. BTDT.

--
Regards, Joerg 

http://www.analogconsultants.com/
Reply to
Joerg

120 is the sort-of standard. I usually see 120 to maybe 122 RMS, but that will vary a lot in different places.
--
John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

-

The issue isn't that the voltage varies a lot with time (although that can happen on the end of a line when heating and cooling are switching on and off), but that there can be a lot of variation between locations. Again, end of run vs. start of run. Power lines do drop voltage.
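The end-of-run effect is just resistive drop along the feeder. A minimal sketch, with illustrative numbers of my own choosing (not from the thread):

```python
# Voltage drop along a feeder: V_drop = I * R_loop, where R_loop
# covers both conductors (out and back) of a single-phase run.

def end_of_run_voltage(v_source, load_amps, ohms_per_m, run_m):
    """Voltage at the far end of a single-phase resistive feeder."""
    r_loop = ohms_per_m * run_m * 2   # out and back
    return v_source - load_amps * r_loop

# Illustrative: 120 V source, 50 A load, 0.0005 ohm/m wire, 200 m run.
v_end = end_of_run_voltage(120.0, 50.0, 0.0005, 200.0)
print(f"{v_end:.1f} V at the end of the run")   # 110.0 V
```

A 10 V drop on a 120 V service, from perfectly plausible numbers — which is why the start and end of the same run can both be "in spec" yet behave very differently.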

Rick C.

Reply to
gnuarm.deletethisbit

I had a soft neutral at our old house, and sometimes got over 130 on one phase. I had a hard time convincing the service crew that it was real. There was a bad connection on a pole across the street.

--
John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

I found this 1999 document, published by PG&E (Pacific Gas and Electric):

"Voltage Tolerance Boundary"

Nominal is 120v. There are two ranges:

Range A = +5% to -5% (Normal)
Range B = +6% to -13% (Not so normal but might happen)

The same limits apply to 208, 240, 277, and 480v. Also see the comments on conflicting standards.

There is some talk about better controlling the voltage (or just dropping the voltage) to save energy: "Typically, every 10 percent increase in voltage increases total energy use by six to nine percent."
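Those Range A / Range B percentages can be turned into absolute limits for each service voltage with a few lines (my own sketch, applying the percentages quoted above):

```python
# Apply the quoted Range A / Range B limits to each nominal
# service voltage listed in the PG&E document.
RANGES = {"A": (5, 5), "B": (6, 13)}   # (swell %, sag %)

def service_limits(nominal):
    """Map range name -> (low, high) volts for a given nominal."""
    return {name: (nominal * (1 - sag / 100), nominal * (1 + swell / 100))
            for name, (swell, sag) in RANGES.items()}

for nominal in (120, 208, 240, 277, 480):
    a = service_limits(nominal)["A"]
    b = service_limits(nominal)["B"]
    print(f"{nominal:3d} V  A: {a[0]:.1f}-{a[1]:.1f}  B: {b[0]:.1f}-{b[1]:.1f}")
```

For 120 V that works out to 114-126 V (Range A) and 104.4-127.2 V (Range B) — so the 127 V+ readings reported elsewhere in this thread were right at the edge of even the "not so normal" band.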

--
Jeff Liebermann     jeffl@cruzio.com 
150 Felker St #D    http://www.LearnByDestroying.com 
Reply to
Jeff Liebermann

This is due to the EU wanting to have a common mains voltage.

In 220 V countries the voltage tolerances were quite narrow, and likewise in 240 V countries. The EU-wide nominal voltage was standardized to 230 V, but at the same time the tolerances were greatly widened, so that both the lowest allowed 220 V voltage and the highest allowed 240 V voltage fit within the wide 230 V tolerance band.

There is a small problem with electric heaters, in which the power of a resistive heater rated for nominal 230 V will vary between 81 % and 112 % within the 230 V tolerance range.

The situation is even worse for incandescent lamps, in which the amount of usable visible light is temperature dependent. Using the cube relation, the amount of visible light would vary between 73 % and 119 %.

OTOH, bulb lifetime is strongly dependent on voltage, dropping steeply as voltage rises. For applications needing high reliability, or places where it is hard to replace bulbs, bulbs rated for 260 V are sometimes used.

If you still can buy incandescent lamps, select bulbs matching your actual voltage (220 or 240 V) for expected performance and lifetime.
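The square and cube relations above can be verified directly (a sketch assuming excursions of -10 % and +6 % around 230 V, which is what reproduces the quoted percentages):

```python
# Relative heater power (proportional to V^2) and incandescent light
# output (proportional to V^3, the cube relation) at -10 % and +6 %
# of a 230 V nominal.
NOMINAL = 230.0

def relative(volts, exponent):
    """Output relative to nominal for a quantity scaling as V**exponent."""
    return (volts / NOMINAL) ** exponent

for volts in (NOMINAL * 0.90, NOMINAL * 1.06):
    print(f"{volts:5.1f} V: heater power {relative(volts, 2):.0%}, "
          f"light output {relative(volts, 3):.0%}")
```

This prints 81 % / 73 % at 207 V and 112 % / 119 % at 243.8 V, matching the figures in the post.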

Reply to
upsidedown

Historically there were tubes with 117V heaters for straight across the incoming mains, could be just a rectifier or a dual element unit.

The UK system was universally 250V at some point, but somehow drifted down to 240, and now tends to be somewhere around 220 - 230.

Our tolerances were fudged a while back to 'comply' with the standard on mainland Europe.

Before the National Grid, just after WW2, most decent-size towns had their own power station - I think DC supplies were done away with nearer WW1, but there may have been a few still lurking about.

Reply to
Ian Field

Joerg wrote in news:fjhe0dFt6tkU1@mid.individual.net:

The target these days, since linear supplies with switched taps are not made much any more, is "auto-switched" SMPS. We designed ours to work from about 85 VAC thru about 265 VAC. That way it worked even at a poorly fed tail end of a Japanese residential feed, all the way to a poorly adjusted Australian hot line.
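Whether that 85-265 VAC window really covers the world's mains can be sanity-checked with a script (my own sketch; the 15 % sag / 10 % swell margins are illustrative assumptions, not from any standard):

```python
# Check whether an 85-265 VAC universal-input range covers common
# nominal mains voltages with a 15 % sag and 10 % swell margin.
SMPS_MIN, SMPS_MAX = 85.0, 265.0

def covered(nominal, sag_pct=15, swell_pct=10):
    """True if nominal minus sag and plus swell both stay in range."""
    low = nominal * (100 - sag_pct) / 100
    high = nominal * (100 + swell_pct) / 100
    return SMPS_MIN <= low and high <= SMPS_MAX

for nominal in (100, 120, 220, 230, 240):   # Japan through UK/Australia
    print(f"{nominal} V: {'OK' if covered(nominal) else 'out of range'}")
```

Japan's 100 V sagging 15 % lands exactly on the 85 V floor, and 240 V swelling 10 % reaches 264 V, one volt under the ceiling — which is presumably why 85-265 became the conventional universal-input range.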

Reply to
DecadentLinuxUserNumeroUno

mostly incorrect

Reply to
tabbypurr

target voltage is still 240v. It remains the same so that old equipment is still compatible, plus no need to spend on making any significant distribution changes. 230v can mean 220, 230 or 240v.

So why would they need to "shoot" for 240 V vs 230 V to maintain compatibility? If the spec says the voltage can sag to X which is out of spec for some older equipment, what difference does it make if they were "shooting" for 240 or 230 when it sagged to X and the equipment fried or stopped running?

the answer seems too obvious. Some old 220v kit doesn't run ok on 240v & vice versa.

NT

Reply to
tabbypurr

formatting link

Reply to
bloggs.fredbloggs.fred

Many of the pumping stations in New Orleans still use 25 Hz power.

--
John Larkin         Highland Technology, Inc 

lunatic fringe electronics
Reply to
John Larkin

My measurements have always been 118 to 125. Usually it's 120. Once I saw 135 for an hour or so, then it was corrected.

Reply to
Tom Del Rosso

There is a lot of residential solar, and that can make the voltage rise at the end of a line, above what it is at the transformer. The household solar inverters must by law be programmed to switch off at or above a certain fixed voltage, to avoid damaging things due to excessive voltage.

If the electricity company wanted, for some reason (political or financial), to prevent households from feeding in solar power and perhaps even prevent some of them from generating and consuming it themselves, one effective method of achieving that might be to set the transformer tap to produce just under the legal maximum voltage at the transformer, when at minimum load but with no solar generation, e.g. during the night in mild weather, so that the inverters trip due to overvoltage at times when it is sunny and there is little load. I don't have any evidence that they have deliberately done that, but it might sometimes happen accidentally, or "accidentally".

For consumers, where possible it might be better to aim for equal margin on the minimum voltage at maximum load and maximum voltage with maximum solar generation, at the end of the line. In some cases that would give no margin at all, and require thicker or shorter power lines, or changing to bundled conductors (with lower inductance).
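The trip behaviour described above can be sketched as a simple fixed threshold (illustrative only — real inverters follow standards such as AS/NZS 4777 with averaging windows and staged responses; the 253 V figure assumes a 230 V + 10 % limit):

```python
# Sketch of a grid-tie inverter's overvoltage lockout with a single
# fixed threshold. Real inverters use averaging windows and ramps;
# this only illustrates the mechanism discussed above.
TRIP_VOLTS = 253.0   # assumed limit: 230 V + 10 %

def may_export(grid_volts):
    """True if the inverter is allowed to keep exporting."""
    return grid_volts < TRIP_VOLTS

# A high transformer tap plus local solar back-feed pushes the
# end-of-line voltage over the limit and the inverter drops out:
for volts in (245.0, 251.0, 254.5):
    print(f"{volts:.1f} V -> {'exporting' if may_export(volts) else 'tripped'}")
```

The point is that the headroom between the tap setting and TRIP_VOLTS is exactly the margin available for the solar-induced voltage rise at the end of the line.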

Reply to
Chris Jones
