I'm trying to understand why the efficiency of a switching regulator drops dramatically as the load current decreases, while it stays high and relatively flat when the load current is large. The initial reason I cooked up was this:

efficiency = Pout/Pin, where Pout = Vout × Iout and Pin = Vin × Iin. If Vout and Vin are relatively constant, and the average input current is also relatively constant (it's actually pulsing, I guess, but I'm ignoring that), then the efficiency should increase linearly with Iout.
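To make that assumption concrete, here's a quick numerical sketch (the voltage and current values are made up purely for illustration):

```python
# Naive model: assume Vin, Vout, and the average input current Iin
# are all fixed, independent of the load current.
VIN, VOUT = 12.0, 5.0   # volts (made-up values)
IIN = 1.0               # amps, assumed-constant average input current

def naive_efficiency(iout):
    """Efficiency under the constant-Iin assumption: Pout / Pin."""
    return (VOUT * iout) / (VIN * IIN)

# Under this assumption efficiency scales linearly with load current:
for iout in (0.2, 0.4, 0.8):
    print(f"{iout:.1f} A -> {100 * naive_efficiency(iout):.1f} %")
```

Doubling the load current doubles the predicted efficiency, which is exactly the linear behavior I expected but don't see in real efficiency curves.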

The problem is this: the efficiency doesn't depend linearly on the load current at all. Most graphs I've seen look like a log function on linear axes (i.e. growing quickly as the load current increases from zero, then flattening out at higher currents).

So can anyone suggest reasons why the curve looks like this? Here is the explanation currently brewing in my head: I believe the switching losses in the FET depend largely on the switching frequency and are rather independent of the load current. I've also heard that this switching loss comprises most of the regulator's loss. Is that true? If so, I would assume the other losses become more and more significant as the output power decreases. At higher output power, the dominant switching loss would make the curve flat, while at lower and lower output power, the other losses would start becoming dominant. Have I hit the nail on the head here? If not, can someone please clarify this issue... Thanks!
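Here's a sketch of the loss model I have in mind, with a fixed (load-independent) switching loss plus an I²R conduction loss. The loss values are assumptions I picked for illustration, not from any datasheet:

```python
# Simple loss model: total loss = fixed switching/quiescent loss
# plus I^2 * R conduction loss (FET Rds_on + inductor DCR, lumped).
VOUT = 5.0      # output voltage, volts (assumed)
P_SW = 0.1      # fixed switching + gate-drive + quiescent loss, watts (assumed)
R_LOSS = 0.05   # lumped series resistance, ohms (assumed)

def efficiency(iout):
    """Efficiency = Pout / (Pout + fixed loss + conduction loss)."""
    p_out = VOUT * iout
    p_loss = P_SW + iout**2 * R_LOSS
    return p_out / (p_out + p_loss)

for iout in (0.01, 0.1, 0.5, 1.0, 3.0):
    print(f"{iout:5.2f} A -> {100 * efficiency(iout):5.1f} %")
```

Running this, the efficiency climbs steeply at light load (where the fixed loss dwarfs Pout) and then flattens out at higher currents, which is exactly the log-like shape in the graphs, so the model seems at least plausible to me.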