Switching Power Supplies - Low Output Voltages - Efficiency

In a step-down converter, what is the mechanism that causes efficiency to be so poor when stepping down to low voltages? The typical case I'm dealing with is 12V in to 1V out at around 5A. Most canned buck converter designs quote high-80s to mid-90s efficiency when going from 12V to 3.3V, but change the output to 1.0V and the efficiency drops by 10% or more. It's generally worse at lower output voltages.

What are the properties of the regulator that cause this? How does one go about designing a power supply to provide good efficiency for 12V to 1V @ 5A?

I'm relatively new to understanding the details of switching regulators, but I'm trying to understand the mechanisms behind some issues. The target is power for processors and FPGAs, which require around 1V for their core voltages; more and more stuff on my boards needs 1V or similar low voltages and I only have a 12V input to work with.

Thanks,

Chris

Reply to
kmaryan

Usually it's the catch diode that burns power. Synchronous switchers are much more efficient at low voltages.

John

Reply to
John Larkin

1) If a catch diode is used, even if its drop is only, say, 0.3-0.4V, that's a huge chunk relative to 1V, whereas it's not necessarily that bad out of 3.3V (...and of course nothing at, say, 12V).
2) If instead a FET is used (a "synchronous rectifier"), it acts as a resistor and the loss is (mostly) I^2*R. If you drop from 3.3V to 1V but want the same output *power*, the *current* will increase by a factor of 3.3 and hence the loss will increase by a factor of 3.3^2 = 10.9. Ouch!
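The two loss mechanisms above can be put into rough numbers. This is just a back-of-envelope sketch with assumed component values (0.4V Schottky drop, 10 mOhm FET on-resistance), not figures from any particular part:

```python
# Rough loss arithmetic for a 12V-in buck delivering 5 W.
# All component values below are assumptions for illustration.

P_OUT = 5.0  # W of output power (e.g. 1 V @ 5 A)

# Case 1: catch (freewheel) diode. The full output current flows through
# the diode during the off-time; for 12 V -> 1 V the off-time fraction is
# roughly 1 - Vout/Vin = 1 - 1/12.
v_diode = 0.4                 # V forward drop (assumed Schottky)
i_out = 5.0                   # A
d_off = 1 - 1.0 / 12.0
p_diode = v_diode * i_out * d_off
print(f"diode loss: {p_diode:.2f} W ({p_diode / P_OUT:.0%} of output power)")
# -> diode loss: 1.83 W (37% of output power)

# Case 2: synchronous FET at constant output *power*. Going from 3.3 V
# to 1.0 V multiplies the current by 3.3 and the I^2*R loss by 3.3^2.
r_fet = 0.010                 # ohm on-resistance (assumed)
for v_out in (3.3, 1.0):
    i = P_OUT / v_out
    p_cond = i**2 * r_fet
    print(f"{v_out} V out: I = {i:.2f} A, conduction loss ~ {p_cond * 1000:.0f} mW")
# -> 3.3 V out: I = 1.52 A, conduction loss ~ 23 mW
# -> 1.0 V out: I = 5.00 A, conduction loss ~ 250 mW
```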

I don't think there's any real panacea. One trade-off you might be able to make is in "making things bigger": if you aren't trying to build cell phones or similarly tiny devices, choose a slower switcher. The switching losses will be lower, the inductors will be less lossy, and EMI will be less of a problem.

Linear Tech provides LTSpice models for all of their switcher ICs, and they make it very easy to poke around at parts and see how much each one is dissipating. Hence you can spend your time attacking the "worst offenders" in your quest for higher efficiency...

Note that even 75% efficiency for 1V @ 5A is still a large improvement upon trying to build a linear power supply with the same specs!
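To put the linear-regulator comparison in numbers (a small sketch, using the thread's 12V-to-1V @ 5A figures): a linear regulator dissipates (Vin - Vout) * I, so its best-case efficiency is just Vout/Vin, independent of load current.

```python
# Best-case linear regulator performance for 12 V in, 1 V out, 5 A.
v_in, v_out, i_out = 12.0, 1.0, 5.0

eta_linear = v_out / v_in            # ideal efficiency, ignoring quiescent current
p_dissipated = (v_in - v_out) * i_out  # heat the pass element must dump

print(f"linear efficiency: {eta_linear:.1%}, heat: {p_dissipated:.0f} W")
# -> linear efficiency: 8.3%, heat: 55 W
```

So even a mediocre 75%-efficient buck dissipates well under 2 W for the same load, versus 55 W for the linear approach.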

---Joel

Reply to
Joel Koltner

In the FET case (synchronous), why does the efficiency still drop so substantially even at the same current? i.e. all else being equal.

Consider for example the figures in the datasheet for the LTC3850 (formatting link - bottom of page 5). The peak efficiency going from 12V to 3.3V at 2A is around 94% (middle graph); the same configuration but to 1.8V lists about 90% at 2A (left graph).

I guess my question really should be: if it's possible to design a 12V-to-3V, 5A regulator with mid-90s efficiency, why does it seem impossible to design a 12V-to-1V, 5A regulator with mid-90s efficiency? I can't find any designs that meet this spec, and I can't come up with any good reasons why not. Even the CPU power supply designs that I've been able to find (i.e. the kind that supply 1V at 100A for the processor core) only seem to have efficiency ratings around the low-to-mid 80s. My only guess is something about transistor operating mode issues.

Chris

Reply to
kmaryan

Isn't it obvious yet? For a given output current, the losses are more or less constant, while the output power scales with the output voltage. So Pout/Ploss goes down with output voltage.
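That scaling can be checked directly: hold the load current and total loss fixed and sweep only the output voltage, since efficiency is Pout / (Pout + Ploss). The 0.4 W loss figure below is an assumption chosen for illustration, though it happens to land close to the LTC3850 numbers quoted earlier in the thread:

```python
# Efficiency vs. output voltage at constant current and constant loss.
I_OUT = 2.0    # A, fixed load current
P_LOSS = 0.4   # W, assumed roughly constant at a fixed current

for v_out in (3.3, 1.8, 1.0):
    p_out = v_out * I_OUT
    eta = p_out / (p_out + P_LOSS)
    print(f"{v_out} V out: eta = {eta:.1%}")
# -> 3.3 V out: eta = 94.3%
# -> 1.8 V out: eta = 90.0%
# -> 1.0 V out: eta = 83.3%
```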

Jeroen Belleman

Reply to
Jeroen Belleman

Got it, I was a bit dense there for a moment. I ran through the numbers of a couple designs and everything agrees.

Thanks,

Chris

Reply to
kmaryan
