This falls outside the realm of small voltages on small circuits, so redirects appreciated.
If I use an amp probe to sense the current flowing through wires in the breaker box and I read 8.33 amps on a 120 V circuit (about 1000 watts), then I believe I can deduce that the load draws 1 kilowatt, so each hour of runtime consumes 1 kilowatt-hour and costs me approximately $0.09 at my residential rate.
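To be explicit, here is the arithmetic I'm relying on as a quick sketch (the rate is my own; it also assumes a purely resistive load, i.e., power factor of 1, which a clamp-on amp reading can't confirm):

```python
# Power, energy, and cost math for the 120 V reading above.
VOLTS = 120.0        # single-pole circuit
AMPS = 8.33          # clamp-meter reading
RATE = 0.09          # my residential rate, $/kWh

watts = VOLTS * AMPS              # ~1000 W
kwh_per_hour = watts / 1000.0     # 1 kW sustained for an hour = 1 kWh
cost_per_hour = kwh_per_hour * RATE

print(f"{watts:.0f} W -> {kwh_per_hour:.2f} kWh each hour -> ${cost_per_hour:.2f}/hour")
```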
As I understand North American 240 V service, each pole runs 180 degrees off the other, and the current on one side "should" equal the other. If I probe 8.33 amps on one wire of a 240 V circuit (two-pole breaker), is that circuit also drawing 1 kilowatt (1 kWh per hour), or do I need to double my 8.33 amp reading to 16.66?
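To make the two candidate answers concrete, here's a sketch of the ambiguity (not a resolution; one thing it does show is that "double the amps at 120 V" and "keep 8.33 A but use the full 240 V" are the same number):

```python
# 240 V circuit (two-pole breaker); clamp meter reads 8.33 A on one leg.
AMPS_ONE_LEG = 8.33
RATE = 0.09   # $/kWh

# Candidate 1: treat it exactly like the 120 V case -- ~1 kW.
watts_not_doubled = 120.0 * AMPS_ONE_LEG            # ~1000 W

# Candidate 2: double the reading (equivalently, use the full 240 V) -- ~2 kW.
watts_doubled = 240.0 * AMPS_ONE_LEG                # ~2000 W
assert abs(watts_doubled - 120.0 * (2 * AMPS_ONE_LEG)) < 1e-9

for label, w in (("not doubled", watts_not_doubled),
                 ("doubled", watts_doubled)):
    print(f"{label}: {w:.0f} W -> ${w / 1000 * RATE:.3f}/hour")
```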
I'm trying to audit my energy bills against the ambient electricity consumption I can measure, and I'm not certain I'm doing the math properly on the 220 V circuits. When a 220 V breaker is rated for 30 amps, is that 15 + 15 amps or 30 + 30 amps?
Thank you.
(Very tempted to get an RS-485 adapter, run a 100-foot cable, and start datalogging my thermostat, graphing HVAC 1st-, 2nd-, and 3rd-stage runtimes and energy costs. A rough sketch of what I have in mind is below.)
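Entirely hypothetical, since I haven't looked at my thermostat's actual protocol: this assumes a pyserial-visible USB RS-485 adapter at /dev/ttyUSB0 and a thermostat that emits one ASCII status line per poll. The real polling command and framing would replace the read loop.

```python
import csv
import time
import serial  # pyserial

# Hypothetical setup: USB RS-485 adapter on /dev/ttyUSB0, 9600 8N1.
port = serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2)

with open("hvac_log.csv", "a", newline="") as f:
    writer = csv.writer(f)
    while True:
        line = port.readline().decode("ascii", errors="replace").strip()
        if line:
            # Log the raw status with a timestamp; stage runtimes and
            # energy costs get computed later, at graphing time.
            writer.writerow([time.strftime("%Y-%m-%d %H:%M:%S"), line])
            f.flush()
        time.sleep(1)
```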