I'm making a 24V to 5V @ 12A DC-DC converter with accurate constant-current limiting down to zero volts. After spending time designing a prototype using TI's UCC2541 synchronous buck controller, I discovered this IC has a serious problem at low duty cycles. When the PWM on period is between 0 and 100ns, the high-side FET fails to turn off correctly, causing catastrophic shoot-through currents through the high-side and low-side MOSFETs. I am now looking at different synchronous buck controller ICs.
It seems that all available ICs specify some minimum PWM on period (typically 50 to 150ns), thereby limiting the lowest controllable output voltage. I don't mind the control loop going a little chaotic at extremely low output voltages, but it must be capable of averaging out to a constant current into a short circuit.
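For context, the output-voltage floor that a given minimum on-time imposes is easy to estimate: if the controller fires its minimum pulse every cycle, the average output is Vin × t_on(min) × f_sw. A minimal sketch (the 300 kHz switching frequency is an assumed example, not taken from any particular controller):

```python
# Estimate the lowest average output voltage a buck can regulate to,
# given the controller's minimum on-time. Assumes the minimum pulse
# fires every switching cycle (no pulse skipping).

def min_avg_vout(v_in, t_on_min, f_sw):
    """Average output voltage when every cycle is the minimum-width pulse."""
    return v_in * t_on_min * f_sw

V_IN = 24.0   # input voltage, from the post
F_SW = 300e3  # switching frequency -- assumed example value

for t_on in (50e-9, 100e-9, 150e-9):
    vout = min_avg_vout(V_IN, t_on, F_SW)
    print(f"t_on(min) = {t_on*1e9:.0f} ns -> Vout floor ~ {vout*1000:.0f} mV")
```

If the controller is allowed to skip pulses, the average output can drop below this floor, which is essentially the "chaotic but averaging out" behaviour described above.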
Does anyone know what is meant by a controller's minimum on period? Does it mean:
1) The shortest on period, regardless of how low the programmed duty cycle is.
2) The shortest on period while maintaining reliable control, where shorter periods are considered uncontrollable due to noise.
None of the buck controller datasheets clearly state what happens.
Adam