Mains power voltage drop to reduce usage? (2023 Update)

Instead of rolling blackouts when there's a power shortage, why don't we just allow (or deliberately cause) the voltage and frequency to drop? Wouldn't that make a lot of devices use less power?

Reply to
Commander Kinsey

In the US that's a common technique the utilities use, called a "brownout" [a]. They'll drop the voltage by five or even ten percent.

Whether this makes any difference to power demand these days, given the types of loads, is another story (some rough figures below the footnote).

[a] that term is often misused to refer to "shedding load", where a power company will black out some sections of the grid to keep everything else up and running.
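
For a purely resistive load, power goes as V^2/R, so the effect of a 5 or 10 percent brownout is easy to put numbers on. A minimal Python sketch, with an assumed 120 V nominal:

    # A fixed resistance draws P = V^2 / R, so a brownout cuts resistive
    # demand roughly twice as fast (in percent) as it cuts the voltage.
    nominal_v = 120.0              # assumed US nominal voltage
    for drop in (0.05, 0.10):      # the 5% and 10% brownouts mentioned above
        v = nominal_v * (1 - drop)
        ratio = (v / nominal_v) ** 2
        print(f"{drop:.0%} voltage drop -> resistive load at {ratio:.1%} of nominal power")
    # 5% drop  -> ~90.2% of nominal;  10% drop -> ~81.0% of nominal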
Reply to
danny burstein

No.

Reducing the frequency will cause mains-referenced clocks to run slow. That includes many clocks with digital displays (which I thought until recently used quartz crystals like a watch), not just older clocks with synchronous motors. Reducing the frequency too far may also affect the efficiency of transformers: I think a lower frequency makes it more likely that the magnetic core will saturate (though I may have got that the wrong way round!). (*)
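
That saturation worry can be checked against the standard transformer EMF equation, B_peak = V / (4.44 f N A): at constant applied voltage, lowering the frequency raises the peak core flux. A rough Python illustration with made-up winding figures:

    # Peak core flux density from the transformer EMF equation.
    # At constant voltage, lower frequency -> higher peak flux,
    # which pushes the core toward saturation.
    def peak_flux_density(v_rms, freq_hz, turns, core_area_m2):
        return v_rms / (4.44 * freq_hz * turns * core_area_m2)

    V, N, A = 240.0, 500, 0.002    # illustrative volts, primary turns, core area (m^2)
    for f in (60.0, 50.0, 45.0):
        print(f"{f:g} Hz -> peak flux ~{peak_flux_density(V, f, N, A):.2f} T")
    # 60 Hz -> ~0.90 T, 50 Hz -> ~1.08 T, 45 Hz -> ~1.20 T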

Reducing the voltage will reduce the power consumption of resistive loads such as immersion heaters and cooker hobs/ovens, but may make little difference to switched-mode power supplies as used in electronic equipment, because they will draw proportionally more current to maintain the rated output (e.g. 5 V for a phone or 20 V for a laptop).
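
In other words a kettle and a laptop charger respond to a brownout in opposite ways: the resistive load's power drops, while the SMPS keeps its power and draws more current. A small sketch with assumed ratings:

    # Resistive load vs. regulated (roughly constant-power) SMPS under a
    # ~10% voltage drop. All ratings below are assumed for illustration.
    V_nom, V_brown = 230.0, 207.0

    R = V_nom ** 2 / 3000.0                   # element of a "3 kW" kettle
    print(f"kettle power: {V_brown ** 2 / R:.0f} W")   # ~2430 W, down from 3000 W

    P_charger = 60.0                          # a 60 W laptop charger
    print(f"charger current: {P_charger / V_nom:.2f} A -> {P_charger / V_brown:.2f} A")
    # still 60 W either way; only the input current changes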

And if the temperature of an oven reduces, or the power output of a kettle reduces, the appliance will need to be on longer to cook the food or boil the same amount of water, so there will be no saving. In most cases you are interested in the transfer of a certain amount of energy to do a given job, and it doesn't matter whether that is transferred as high power for a small amount of time or a lower power for a longer period of time.
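
Putting numbers on the kettle example (the specific heat of water is the only physical constant needed; the rest are assumed figures):

    # Boiling 1 litre of water from 20 C to 100 C needs the same energy
    # whatever the kettle's power; a brownout only stretches the time.
    energy_j = 1.0 * 4186 * (100 - 20)        # mass (kg) * c (J/kg.K) * delta T
    for power_w in (3000.0, 2430.0):          # nominal vs browned-out kettle
        print(f"{power_w:.0f} W -> boils in ~{energy_j / power_w:.0f} s")
    # 3000 W -> ~112 s;  2430 W -> ~138 s; grid energy delivered is identical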

(*) I've heard it said that transformer-driven devices from the US don't necessarily work efficiently in Europe (even if you correct for the different voltage) whereas European devices run OK in the US (having corrected for voltage) because of the difference between 60 Hz (US) and 50 Hz (UK). Or maybe it's the opposite way round.

Reply to
NY

That sounds more likely to be right than wrong in an extreme case.

Applying a high-enough-voltage DC supply to the primary side of a transformer can eventually overheat the coil and burn off the winding wire's insulation coating. Reducing the frequency isn't the same as supplying DC, but the lower the frequency, the more like DC it becomes.

Charging units work more or less just as well when transplanted from the USA to the UK and vice-versa. Reducing frequency by about a fifth or a sixth shouldn't have a disastrous effect.

We like induction hobs. There's no power wasted heating the environment - just the cooking vessel - and they are so efficient that they can work off a 13 A power point. Same point as above regarding Hz change, though.

See above re phone and laptop chargers.

Reply to
JNugent

No, you're spot on. It has been discussed in this group before. In general, 60 Hz transformers can be either more efficient or smaller than their 50 Hz cousins.
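
The size difference follows from the same EMF equation as above: holding voltage, turns and allowed peak flux constant, the required core area goes as 1/f, so a 50 Hz core needs roughly 20% more iron than a 60 Hz one. A sketch with assumed figures:

    # Required core area A = V / (4.44 * f * N * B_max): inversely
    # proportional to frequency for the same voltage, turns and flux limit.
    def required_core_area(v_rms, freq_hz, turns, b_max):
        return v_rms / (4.44 * freq_hz * turns * b_max)

    a50 = required_core_area(230.0, 50.0, 500, 1.2)
    a60 = required_core_area(230.0, 60.0, 500, 1.2)
    print(f"50 Hz core area is ~{a50 / a60:.0%} of the 60 Hz core")   # ~120%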

Reply to
Fredxx

NY wrote:

We had a brand new US-manufactured mass spectrometer quite literally catch fire because it didn't like running on 50 Hz instead of 60 Hz. The main transformer overheated and caught fire.

Nope, you have it the right way round.

Reply to
Rod Speed

Are you aware that the UK uses 240 volts, instead of the 110 volts used in the US? I managed to blow up the SMPS of a large HDD that way. Wim

Reply to
Wim Ton

120, actually. Usually measures about 122.

Residences have some 240 V outlets and boxes too, for big loads like A/C and clothes dryers.

The power supplies that we buy are agnostic.

Reply to
John Larkin

Hence the very tiny transformers in SMPSes, which run at kHz frequencies.

Reply to
Commander Kinsey

Does this mean the power station generators spin faster over there? Or just have more coils?

Reply to
Commander Kinsey

Most loads are energy-based with negative feedback (anything with a sensor, like a thermostat), not power-based. Countries like India and Pakistan have always lived with brownouts, which stimulated the mass sale of "voltage stabilizers" - negative-incremental-impedance converters that cumulatively add more instability.

As the local voltage falls, demand current increases; each stabilizer tap-changes to raise its output voltage, which draws still more current. Accumulated across many users, that extra current drops the distribution voltage even further, leading to unstable, frequent daily blackouts.

So end-users of stabilizers increase grid instability.
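
"Negative incremental impedance" here just means a constant-power load: the stabilizer holds its output (and hence its load power) steady, so on the grid side I = P/V and the current rises as the voltage sags. Illustrative numbers:

    # A stabilizer feeding a fixed load looks like constant power to the
    # grid, so dI/dV is negative: a sagging supply gets loaded even harder.
    P = 1000.0                                # assumed 1 kW behind the stabilizer
    for v in (230.0, 220.0, 210.0, 200.0):
        print(f"{v:.0f} V -> {P / v:.2f} A drawn from the grid")
    # 4.35 A, 4.55 A, 4.76 A, 5.00 A: more current as the voltage falls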

Current (pun intended) customer outages are:
USA: 60k (south-east + west coast)
Canada: 14k
UK: 1.5k
Ukraine: massive outages from the Russian destruction

Reply to
Anthony Stewart

Are you sure they cause instability? They don't increase the power used, they increase the current to compensate for the voltage drop. Instability would only occur if the generators stalled, which happens if the power drawn exceeds what they can provide. If they're anything like a car alternator, the same power can come out at any revs.

Reply to
Commander Kinsey

Think.

The generator(s) cannot cope with the load, so the voltage drops. The stabilizers at the clients raise the current to keep the power constant instead of letting it fall, so the generators drop speed instead (because they cannot cope). Normally, when the speed drops below a threshold, the generator disconnects. Cascade failure. General blackout, unless some areas disconnect from the overloaded areas and isolate themselves.
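
A toy sketch of that cascade (every figure is made up, but the mechanism is the one described above): constant-power demand exceeds capacity, the frequency sags, under-frequency protection trips a generator, and the survivors are even worse off.

    # Toy cascade: generators trip one by one once demand exceeds capacity.
    gens = [500.0, 500.0, 500.0]      # MW capacity of each generator (assumed)
    load = 1600.0                     # MW of constant-power (stabilized) demand

    while gens and load > sum(gens):
        print(f"load {load:.0f} MW > capacity {sum(gens):.0f} MW: frequency sags, one unit trips")
        gens.pop()                    # under-frequency protection disconnects it
    print("general blackout" if not gens else "system survives")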

Reply to
Carlos E.R.

I was assuming a voltage drop caused on purpose further downstream, not at the generator. I was originally asking about reducing peak usage problems for short periods. The electric company could drop a winding or two somewhere to lower everyone's voltage. Resistive loads would then use less, and the loads you're talking about would use precisely the same (as far as the generator is concerned, since its output voltage and current remain the same).

Reply to
Commander Kinsey

They spin a bit faster in the USA, and have a little less metal in them.
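
The relationship behind "spin faster or more coils" is the synchronous-speed formula N = 120 f / p, with p the number of poles:

    # Synchronous speed in rpm for a machine with `poles` magnetic poles.
    def sync_rpm(freq_hz, poles):
        return 120 * freq_hz / poles

    for f in (50, 60):
        print(f"{f} Hz: 2-pole {sync_rpm(f, 2):.0f} rpm, 4-pole {sync_rpm(f, 4):.0f} rpm")
    # 50 Hz: 3000 / 1500 rpm;  60 Hz: 3600 / 1800 rpm -- faster, or more poles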

Reply to
Carlos E.R.

Hum.

Dunno.

Maybe.

But if this strategy "works", the current increase causes more stress on the hardware: it heats more. Maybe destructively.

Reply to
Carlos E.R.

Stuff shouldn't be running so close to the limit and without cutouts. What happens if you try to overrev your car engine?

Reply to
Commander Kinsey

I thought they just powered up the generator as a motor off the grid, so it's precisely in phase, then started up the fuel input.

Reply to
Commander Kinsey

On the other hand, of course, there are nice ways to tell somebody they are a dickhead and bad ones designed to set off a chain reaction, and managers need to learn the difference. Brian

Reply to
Brian Gaff

It gets worse.

A double whammy. Now that so-called renewable energy is all the rage, together with battery packs and undersea DC links, all that lot is not rotating synchronous mass; it is electronic inverters that are designed *to disconnect themselves* if the frequency drops.

So the net result of lowering the frequency is to remove not load, but generating capacity.

At which point whole areas of the country trip out.
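
As pseudocode, the behaviour being described is an under-frequency protection window: the inverter disconnects below a set frequency rather than riding through (the trip value below is purely illustrative, not a real grid-code setting).

    # Grid-tied inverter protection, sketched: outside the frequency window
    # it stops exporting, so dropping the frequency sheds generation, not load.
    UNDER_FREQ_TRIP_HZ = 47.5             # assumed setting, for illustration only

    def inverter_online(grid_freq_hz):
        return grid_freq_hz >= UNDER_FREQ_TRIP_HZ

    for f in (50.0, 49.0, 47.0):
        print(f"{f} Hz -> {'exporting' if inverter_online(f) else 'tripped off'}")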

Reply to
The Natural Philosopher
