A few weeks ago, there was some talk about how interference leaks into coaxial cables, and I promised to dig up some measurements. (Thread "Are ferrite cable shields on coax a good idea?")
The results are at: . I measured only a few cable types, chosen from those that are commonly used around here and that would fit into my test jig.
I've been trying to come up with a SPICE model for this, with mixed results. The low-frequency end is easy enough: it just depends on the DC screen resistance. But the drop in Zt at medium frequencies is apparently caused by the skin effect, and not, as I formerly believed, by the common-mode inductance of the cable under test.
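For a homogeneous tubular shield, this behaviour (flat at the DC resistance, then dropping once the skin depth shrinks below the wall) is captured by Schelkunoff's tube formula, Zt = Rdc * kt / sinh(kt) with k = (1+j)/delta. Here is a small sketch of that formula; the copper resistivity is standard, but `r_dc` and the 0.2 mm wall thickness in the comments are illustrative assumptions, not values from my measurements, and a real braid will deviate from the solid-tube model at high frequencies:

```python
import cmath
import math

RHO_CU = 1.68e-8        # copper resistivity, ohm*m
MU0 = 4e-7 * math.pi    # vacuum permeability, H/m

def skin_depth(f):
    """Skin depth in copper at frequency f (Hz), in metres."""
    return math.sqrt(RHO_CU / (math.pi * f * MU0))

def zt_tube(f, r_dc, t):
    """|Zt| of a solid tubular shield (Schelkunoff):
    Zt = R_dc * k*t / sinh(k*t), with k = (1+j)/delta.
    r_dc: DC screen resistance per unit length (ohm/m),
    t: shield wall thickness (m)."""
    kt = (1 + 1j) * t / skin_depth(f)
    return abs(r_dc * kt / cmath.sinh(kt))

# Illustrative numbers: 10 mohm/m screen, 0.2 mm wall.
for f in (1e3, 1e5, 1e7):
    print(f, zt_tube(f, 0.01, 0.2e-3))
```

At 1 kHz this returns essentially the 10 mohm/m DC value; by 10 MHz it has collapsed by orders of magnitude, which matches the measured roll-off shape (though not the braid-coupling rise that follows in real cables).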
With just the cable geometry and a measurement of the screen resistance, I can't predict at which frequency the breakpoint will be, and I don't readily know how to model the skin effect in SPICE either. Anyone care to comment?
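Under the solid-tube assumption, at least, a rough breakpoint estimate does follow from the geometry alone: the roll-off corner sits roughly where the skin depth equals the shield wall thickness, i.e. f ~ rho / (pi * mu0 * t^2). A sketch (the 0.2 mm copper wall is again an illustrative guess, and a braided screen's effective thickness is harder to pin down):

```python
import math

RHO_CU = 1.68e-8        # copper resistivity, ohm*m
MU0 = 4e-7 * math.pi    # vacuum permeability, H/m

def zt_corner_freq(t):
    """Frequency (Hz) at which the skin depth in copper equals the
    shield wall thickness t (m): solve delta(f) = t for f."""
    return RHO_CU / (math.pi * MU0 * t * t)

print(zt_corner_freq(0.2e-3))  # on the order of 100 kHz for a 0.2 mm wall
```

That puts the corner around 100 kHz for a 0.2 mm copper wall, in the region where measured Zt curves typically start to drop. It does not answer the SPICE question, since sinh((1+j)t/delta) has no exact lumped-element equivalent, but it at least locates the breakpoint.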