Spent a day recently with a colleague on his farm, tracking down shorts and leakage on electric fencing. There are a couple of natty devices, including a little handheld box that you just hook onto the wire, and it indicates the current and voltage. My colleague was using the magnitude of the current as we worked our way along the wire to infer the distance to the fault; his theory was that more current means closer. The current did indeed seem to vary as we moved along, though I wasn't watching closely enough to see whether it related clearly to the fault location.
My understanding is that electric fences are energised in pulses of a few kV, with a repetition period of a second or so (a PRF of roughly 1 Hz). Assuming a pulse width in the low milliseconds without a lot of high-frequency content, then for a fencing setup spanning not too many km, transmission line effects should be fairly negligible, and the line current should be fairly uniform along the fence. Is this correct, or do transmission line effects actually play a part in what you measure? What sort of pulse waveforms and pulse lengths do fence controllers typically deliver?
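To make my "electrically short" assumption concrete, here is the back-of-envelope check I have in mind. The numbers are my own guesses, not measured values: a velocity factor of ~0.9 for a bare wire over an earth return, and the question of whether the relevant timescale is the ~1 ms pulse width or a much faster rising edge (which would shrink the spatial extent considerably).

```python
# Back-of-envelope check: is a fence "electrically short" for a given pulse?
# Assumptions (mine, not measured): velocity factor ~0.9 for a single bare
# wire over an earth return; the relevant timescale is the pulse's fastest
# edge, not necessarily its overall width.

C = 3.0e8               # free-space speed of light, m/s
VELOCITY_FACTOR = 0.9   # assumed for wire-over-ground propagation


def edge_extent_km(rise_time_s, velocity_factor=VELOCITY_FACTOR):
    """Spatial extent of the pulse edge along the wire, in km."""
    return C * velocity_factor * rise_time_s / 1000.0


def is_electrically_short(fence_km, rise_time_s, margin=10.0):
    """Lumped-circuit behaviour (roughly uniform current) is a fair
    approximation when the fence is much shorter than the edge extent."""
    return fence_km * margin < edge_extent_km(rise_time_s)


if __name__ == "__main__":
    # A 1 ms feature spans ~270 km of wire: a few-km fence is clearly short.
    print(edge_extent_km(1e-3))
    # But a 50 us edge spans only ~13.5 km, which is getting comparable to
    # a large farm's fencing, so the answer depends on the real waveform.
    print(edge_extent_km(50e-6))
    print(is_electrically_short(5.0, 1e-3))
```

If the edge really is in the tens-of-microseconds range, the fence starts to look like a non-trivial fraction of the edge's extent, which is exactly why I'm asking what waveforms real controllers deliver.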