While messing with buck/boost circuits I've had some ideas which don't hold up in real tests...
A simple example (ignoring losses) is that if you transfer the energy from, say, a 10uF capacitor charged to 10V into a 1uF capacitor, the voltage must rise to conserve the energy.
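The capacitor-to-capacitor arithmetic can be sketched like this (a minimal Python sketch, assuming an ideal lossless transfer where energy, not charge, is conserved):

```python
import math

def voltage_after_energy_transfer(c1, v1, c2):
    """Ideal lossless transfer: all energy E = 0.5*C*V^2 moves from C1 to C2."""
    energy = 0.5 * c1 * v1**2          # joules stored in the source capacitor
    return math.sqrt(2 * energy / c2)  # solve E = 0.5*C2*V2^2 for V2

# 10uF charged to 10V, dumped into 1uF:
v2 = voltage_after_energy_transfer(10e-6, 10.0, 1e-6)
print(round(v2, 1))  # about 31.6 V
```

So the smaller capacitor ends up at a higher voltage, but not by the charge ratio: conserving energy gives sqrt(C1/C2) times the original voltage.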
Now, in my circuit, I charge a 25V 1,000uF capacitor and use it to charge a 10uH inductor. When the switch turns off, all the energy should be in the 10uH inductance. So if I dump that into a 100pF capacitor, the voltage should be something like 100,000 volts by my working.
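Running the same ideal-transfer arithmetic on these values (a sketch only, no losses accounted for):

```python
import math

c_source, v_source = 1000e-6, 25.0   # 1,000uF charged to 25V
c_target = 100e-12                   # 100pF output capacitor

energy = 0.5 * c_source * v_source**2        # 0.3125 J in the big capacitor
v_target = math.sqrt(2 * energy / c_target)  # ideal voltage on the 100pF
print(round(energy, 4), round(v_target))     # 0.3125 J, ~79057 V
```

So the ideal figure is around 79kV, which matches the "like 100,000 volts" ballpark.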
Now I ran a computer simulation of this, and at best I can only obtain 12kV on the 100pF. So I assume most of the energy is lost in switching losses.
In real-world tests, I end up with less voltage than I started with, so I am trying to find out why.
I know that charging a 22uH inductor at 100kHz can be used as a buck/boost supply; I built a simple 12V to 30V converter that can switch 10 amps easily. Though here I am not running at 100kHz, only 100Hz, and the current pulse rises to something like 500 amps over 500uS.
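As a sanity check on that current pulse, V = L*di/dt with the 22uH coil (my assumption as to which inductor those pulse numbers refer to):

```python
L = 22e-6        # 22uH inductor
di = 500.0       # current rise in amps
dt = 500e-6      # over 500uS
v_across = L * di / dt
print(round(v_across, 1))  # ~22 V across the coil during the ramp
```

That works out to roughly the supply voltage, so a 500A rise over 500uS is at least self-consistent for an inductor of that size.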
I am not sure I follow all this exactly, or even if it will work. AFAIK, the longer an inductor has current pumped through it, the more energy it stores over time (up to the limit set by the circuit resistance), so at turn-off all the energy given to the coil is recovered. That works well, even in my simple buck/boost circuit.
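The energy at turn-off is set entirely by the peak current, since the coil holds E = 0.5*L*I^2. A quick sketch of what the peak current would have to be if the whole 0.3125 J from the 25V/1,000uF capacitor really ended up in the 10uH coil:

```python
import math

e_cap = 0.5 * 1000e-6 * 25.0**2    # 0.3125 J available from the capacitor
L = 10e-6                          # 10uH inductor
i_peak = math.sqrt(2 * e_cap / L)  # solve E = 0.5*L*I^2 for I
print(round(i_peak, 1))            # 250.0 A peak for full energy transfer
```

Note that 500A in 22uH would correspond to roughly 2.75 J, far more than the capacitor holds in one shot, so I suspect the 500A figure belongs to a different operating condition than this single-pulse circuit.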
So I am slightly confused as to why pushing 500A into a coil has so little effect. I can only assume I have a huge loss somewhere, or I do not understand the idea correctly.