Hello,
I have some 220V tools that I need to run from 110V through a transformer. I want a nice permanent installation in my workshop with mounted sockets and a breaker to protect the transformer. The transformer is rated at 3 kVA continuous. I'm not sure how to calculate the rating of the breaker on the 110V side of the transformer to protect it from having too big a load plugged into it.
I think it should be 3000 VA / 110 V ≈ 27 A, so a 27A breaker. But I don't think I should be using VA in the calculation, rather watts. So how do you get from VA to watts? I know it has to do with power factor, but that depends on what is plugged in. Do you use an average pf, like 0.8?
So now: 3000 × 0.8 / 110 ≈ 22 A breaker.
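To keep my numbers straight, here's the arithmetic above as a quick Python sketch (the 0.8 power factor is just my assumed average, not a measured value):

```python
# Sanity check of the breaker-sizing arithmetic.
# Assumed values: 3 kVA transformer, 110 V primary, 0.8 average power factor.
rating_va = 3000       # transformer continuous rating, volt-amps
voltage = 110          # supply voltage on the 110V side
power_factor = 0.8     # assumed average pf of the plugged-in load

# Sizing straight from the VA rating:
current_from_va = rating_va / voltage                     # ~27.3 A

# Sizing after converting VA to watts via the assumed pf:
current_from_watts = rating_va * power_factor / voltage   # ~21.8 A

print(f"From VA rating:  {current_from_va:.1f} A")
print(f"Using pf = {power_factor}:  {current_from_watts:.1f} A")
```

That's where my 27A and 22A figures come from, depending on which way you do it.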
So by my calculation, 22A is the most the transformer should ever see. But a 22A breaker carrying 22A will trip, won't it? So how much bigger do you have to go to keep the transformer safe but still be able to get the most out of it?
Also, you get different curve ratings on breakers. Something to do with the way they "integrate" the load over time until they trip. I think these types are for protecting specific equipment; for example, an inductive load will need a different type of breaker than a resistive load of the same wattage. Is this correct?
Quite a few questions here. Maybe I need a circuit breaker tutorial.
Thanks for any input.