In a step-down converter, what is the mechanism that causes efficiency to be so poor when stepping down to low voltages? The typical case I'm dealing with is 12V in to 1V out at around 5A. Most canned buck converter designs quote high-80s to mid-90s efficiency when going from 12V to 3.3V, but change the output to 1.0V and the efficiency drops by 10% or more. It's generally worse at lower output voltages.

What properties of the regulator cause this? And how does one go about designing a power supply that gives good efficiency for 12V to 1V @ 5A?
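To put rough numbers on it (the 1W loss figure here is purely illustrative, not from any datasheet): at 3.3V and 5A the converter delivers 16.5W, so 1W of loss gives 16.5/17.5 ≈ 94% efficiency; at 1.0V and 5A it delivers only 5W, so the same 1W of loss gives 5/6 ≈ 83%. I suspect some fixed, output-independent loss like this is part of the answer, but I don't know which losses actually behave that way and which scale with the output.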
I'm relatively new to the details of switching regulators, but I'm trying to understand the mechanisms behind this. The target is power for processors and FPGAs, which need around 1V for their core supplies; more and more parts on my boards need 1V or similar low voltages, and I only have a 12V input to work with.
Thanks,
Chris