direct current loss per foot of wire

I can't seem to find an answer to this question anywhere, so I've come to the experts. I need to power a device that requires 500 mA at 12 V DC, but it has to be 200 feet away from the transformer. Will a 1 amp power supply provide adequate current over 14-gauge wire? Is there a formula to figure this out? Thanks in advance.

Nick (servozoom)

Reply to
servozoom

Thanks so much to everyone, outstanding info. John's chart is a keeper, and Figaro's calculation made my job much less painful. You guys are the best.

Reply to
purplefloyd

You can find many tables of wire data through Google, like this one:

formatting link

It lists the ohms per 1000 feet (ohms/kft) and feet per ohm for many wire gauges. You should be able to apply Ohm's law to some possibilities and figure out how much of your 12 volts will get lost in the wire, out and back.
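
If you'd rather script this than read it off a chart, here is a minimal Python sketch of the same Ohm's-law check. The ohms-per-1000-feet values in the little table are typical room-temperature copper figures filled in for illustration; verify them against a real wire chart before trusting them.

# Rough round-trip voltage drop from a wire table. Table values are
# typical copper figures at room temperature; confirm against a chart.
OHMS_PER_KFT = {12: 1.588, 14: 2.525, 16: 4.016, 18: 6.385}

def voltage_drop(gauge, one_way_feet, amps):
    loop_feet = 2 * one_way_feet  # the current flows out and back
    return amps * OHMS_PER_KFT[gauge] * loop_feet / 1000.0

print(voltage_drop(14, 200, 0.5))  # about 0.5 V for the case above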

Reply to
John Popelish

---------------------------------------------------------------------------

See

formatting link
section on Current Ratings

V = D x I x R / 1000

where
D = 400 feet (200 feet x 2, out and back)
I = 0.5 amps
R = 2.575 ohms per 1000 feet

then V = 0.52 volts of drop

Your power supply must put out 12.52 volts so that the voltage at the far-end device will be 12.0 volts.
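
To sanity-check the arithmetic, here is the same formula as a short Python function. The function name is mine, and the 2.575 ohms per 1000 feet figure is simply the one quoted above; substitute the value from your own wire table.

# V = D * I * R / 1000, with D the round-trip distance in feet and
# R the wire resistance in ohms per 1000 feet.
def required_supply(load_volts, one_way_feet, amps, ohms_per_kft):
    drop = (2 * one_way_feet) * amps * ohms_per_kft / 1000.0
    return load_volts + drop

# 200 ft one way, 0.5 A, 2.575 ohms per 1000 ft:
print(required_supply(12.0, 200, 0.5, 2.575))  # -> 12.515, about 12.52 V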

Reply to
Figaro

Thanks for all the info, I understand the problem much better.

Reply to
servozoom

Oh boy, where do we even begin:

This question is impossible to answer as asked because you don't provide enough information. First there's the question of your power source, and then there's the question of your load.

You claim you are 200 feet away from the transformer and then talk about a 1 amp power supply, so do you mean a 1 amp wall wart? You can't put DC into a transformer, so you'd have to rectify the transformer's output, and it would also be a good idea to regulate it. You also don't state what voltage is acceptable at the load: 12 V plus or minus what? Bear in mind that wall wart voltage tends to go all over the place depending on the load, and they are unregulated.

14-gauge wire has a resistance of about 0.00297 ohms per foot (assuming copper wire at room temperature; resistance goes up with temperature), so 200 feet of it times 500 mA gives you about 0.3 V of drop, and roughly double that once you count the return conductor, as Figaro's round-trip figure does. That means the voltage at the output of the transformer must be at least that much above what you need at the load.
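
If you want to turn that into a wire-sizing rule, here is a sketch of the inverse problem: pick the thinnest wire that keeps the round-trip drop inside a budget. The table repeats the illustrative copper values from earlier, and the 0.6 V budget is only an example, not something the original poster specified.

# Smallest wire that keeps the round-trip drop within a chosen budget.
# Table values are illustrative room-temperature copper figures.
OHMS_PER_KFT = {12: 1.588, 14: 2.525, 16: 4.016, 18: 6.385}

def smallest_adequate_gauge(max_drop_volts, one_way_feet, amps):
    loop_feet = 2 * one_way_feet
    # Try the thinnest wire (highest AWG number) first.
    for gauge in sorted(OHMS_PER_KFT, reverse=True):
        drop = amps * OHMS_PER_KFT[gauge] * loop_feet / 1000.0
        if drop <= max_drop_volts:
            return gauge, drop
    return None  # nothing in the table is heavy enough

# Allow 0.6 V of drop, 200 ft one way, 0.5 A load:
print(smallest_adequate_gauge(0.6, 200, 0.5))  # -> (14, ~0.505)
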
Reply to
me
