How do you calculate the REQUIRED voltage from a transformer secondary?
Ok, here's the plan. A friend of mine installed a bunch of LED fixtures to run off a solar panel and a 12V marine battery. They draw about 15 amps at 12 VDC.
It works fine when the sun is shining, but it drains the battery when there is no sun. He says he gets tired of having to hook a battery charger to that battery on cloudy days. (I would think that's hard on the battery too.)
I suggested that he build a 12 VDC power supply that connects to a regular 120V outlet, with a switch so he can switch off the battery and switch on this power supply on cloudy days. Now we need to build this power supply. It does NOT need to be regulated for this use: just a basic transformer, bridge rectifier, and a capacitor, capable of outputting 12 to 14 VDC at 15 amps or greater.
He ordered a large bridge rectifier from eBay (with heat sink), rated at 50V 20A, for around $2 from China. (More than enough capacity.)
But now comes the question. I know that when using a bridge rectifier, the output voltage is "close" to the output of the transformer, but there is still some voltage drop from the rectifier.
The ideal voltage would be 13.8 VDC (same as a charging battery), but anything from 12V to 14V would be fine. How does someone determine the secondary voltage of the transformer that is needed to accomplish this? (I'm guessing 16V would work, but that's only a guess.)
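For a rough back-of-the-envelope check, the usual estimate is: the filter cap charges to the secondary's peak voltage (Vrms × √2) minus two diode drops in the bridge (roughly 0.7 V each, more at high current), and the DC output then sags by about half the peak-to-peak ripple under load. Here is a sketch of that arithmetic in Python; the diode-drop and ripple figures are assumptions, and transformer sag under a 15 A load is ignored, so treat the numbers as a starting point only:

```python
import math

def dc_output_estimate(v_secondary_rms, v_diode=0.7, v_ripple=1.0):
    """Rough unregulated-supply estimate: secondary peak minus two
    bridge diode drops, minus half the peak-to-peak ripple."""
    v_peak = v_secondary_rms * math.sqrt(2) - 2 * v_diode
    return v_peak - v_ripple / 2

def secondary_needed(v_dc_target, v_diode=0.7, v_ripple=1.0):
    """Invert the estimate to find the required secondary RMS voltage."""
    v_peak = v_dc_target + v_ripple / 2 + 2 * v_diode
    return v_peak / math.sqrt(2)

for v_sec in (12.0, 16.0):
    print(f"{v_sec:4.1f} V secondary -> ~{dc_output_estimate(v_sec):.1f} V DC")
print(f"13.8 V DC target -> ~{secondary_needed(13.8):.1f} V secondary RMS")
```

By this estimate a 16V secondary would land around 20 V DC, well above the 12-14V window, while something in the 11-12V range looks much closer to the 13.8V target. Real-world transformer regulation at full load will pull the output down somewhat, which is why unregulated supplies are usually sized with a little headroom.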
One other thing: he plans to put a capacitor across the output. I know this cap needs to be rated at 25 volts or higher, but what capacitance should be used? From looking at some schematics that use bridge rectifiers, it appears they use fairly large caps, such as 1000uF or thereabouts.
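For sizing the filter cap there is a standard rule of thumb: C ≈ I / (ΔV × f_ripple), where ΔV is the acceptable peak-to-peak ripple and f_ripple is twice the mains frequency for a full-wave bridge (120 Hz on 60 Hz mains). A quick sketch of what that implies at 15 A (60 Hz mains and the ripple targets are my assumptions):

```python
def filter_cap_farads(i_load, v_ripple_pp, f_mains=60.0):
    """Rule-of-thumb filter cap for a capacitor-input full-wave bridge:
    C = I / (dV * 2 * f_mains). Full-wave rectification doubles the
    ripple frequency, so the cap only has to hold up for half a cycle."""
    return i_load / (v_ripple_pp * 2 * f_mains)

for ripple in (1.0, 2.0):
    c = filter_cap_farads(15.0, ripple)
    print(f"{ripple} V p-p ripple at 15 A -> {c * 1e6:,.0f} uF")
```

The takeaway is that at 15 A, 1000uF is nowhere near enough: holding the ripple to about 1 V peak-to-peak works out to roughly 125,000 uF, so in practice this would mean a bank of large electrolytics in parallel, or accepting considerably more ripple.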
For this use, the power supply does not need to be precision or filtered to the extreme, but a fairly constant voltage with enough filtering to avoid flicker is desired.
Suggestions?
Thanks