Just sorting through my vast collection of obsolete test equipment looking for a milliohm meter (found one!) and came across a *lot* of test equipment with 600 Ohm outputs. It was obviously some sort of standard back in the day, but I was just wondering how it was arrived at. Is this a throwback to the valve (toob) gear days, or is there some other reason behind it? I'm just thinking Zo=0 would be much more adaptable. ;-)
That was the nominal impedance of telephone lines, including dedicated leased pairs. At RF a twisted pair really has a Zo closer to 100 ohms, but long phone lines are dominated by series resistance, which pushes the characteristic impedance up toward 600 ohms over the audio range.
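Here's a minimal sketch of where that figure comes from, using the standard line formula Zo = sqrt((R + jwL)/(G + jwC)) and some ballpark constants I've assumed for a 22 AWG pair (roughly 106 ohms/km loop resistance, 0.6 mH/km, 52 nF/km); treat the numbers as illustrative only:

import cmath, math

# Assumed, ballpark constants for a 22 AWG telephone pair, per km of loop:
R = 106.0      # ohms/km series resistance
L = 0.6e-3     # H/km series inductance
C = 52e-9      # F/km shunt capacitance
G = 0.0        # shunt leakage, negligible here

for f in (300.0, 1e3, 3e3, 100e3, 1e6):
    w = 2 * math.pi * f
    Z0 = cmath.sqrt((R + 1j * w * L) / (G + 1j * w * C))
    print("%9.0f Hz   |Zo| = %4.0f ohms  at %5.1f deg"
          % (f, abs(Z0), math.degrees(cmath.phase(Z0))))

At voice frequencies R swamps wL and |Zo| comes out in the several-hundred-ohm region (at roughly -45 degrees), while at RF it falls toward sqrt(L/C), around 100 ohms, which is why the pair itself measures nearer 100 ohms to a fast edge.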
Lots of audio gear was rated 600 ohms, although it usually wasn't.
--
John Larkin Highland Technology, Inc
lunatic fringe electronics
I think it also had something to do with being the optimal impedance for a long line used at baseband. The lines were equalized using 88 mH loading coils (Pupinisation), and somehow the best results for audio baseband were obtained with a 600 ohm system. It then became a standard impedance, much like 50 ohms for most RF systems.
Now you come to mention it, I do remember goofing around making electric fences with surplus Post Office relays half a century ago and they were all AFAICR 600 Ohms. But like you say, I can't see the connection between relay solenoids and transmission lines. :-/
I have a few 600-ohm HP voltmeters: an HP 400EL and a couple of HP 3403As. Their dB scales are labelled for 600 ohms, but the inputs are really 10 megohms and a few picofarads.
600-ohm open wire line is (or used to be) a common type of feed line used for ham antennas--it has super low loss, and due to its 6-inch spacing, it's pretty well immune to rain, snow, and ice.
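The geometry works out to roughly that impedance, too: for parallel open-wire line, Zo is about 276 * log10(2D/d). Assuming something like #14 wire (d around 1.6 mm) on 6-inch (152 mm) spacing, 2D/d is about 190, so Zo comes out near 276 * 2.28, roughly 630 ohms. The wire gauge is my assumption, but anything in that region lands close to 600 ohms.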
FM radio stations and recording studios, mixers, etc., used 600-ohm balanced lines as a standard. Connections were made with transformers to avoid hum loops and the like, and the lines expected a final 600-ohm termination. The bridging transformers typically had 10k impedances.
Almost all old moving-needle meters had logarithmic AC-voltage scales with a 'dBm' designation, referenced to 0 dB at one milliwatt into a 600 ohm load. Though the meters themselves were high-impedance, it meant you could probe the input and output of a driven, loaded amplifier and subtract the readings to find its gain in decibels.
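A minimal sketch of that convention (the helper and the readings below are made up for illustration): 0 dBm across 600 ohms is sqrt(0.001 * 600), about 0.775 V RMS, and the dBm scale is just 20*log10 of the measured voltage against that reference, so gain falls out by subtraction.

import math

DBM_REF = math.sqrt(0.001 * 600)         # 0 dBm across 600 ohms ~= 0.775 V RMS

def dbm_600(v_rms):
    # What the meter's dBm scale indicates for a given RMS voltage.
    return 20 * math.log10(v_rms / DBM_REF)

v_in, v_out = 0.245, 7.75                 # made-up readings at amp input/output
print(dbm_600(v_in))                      # about -10 dBm
print(dbm_600(v_out))                     # about +20 dBm
print(dbm_600(v_out) - dbm_600(v_in))     # gain: about 30 dB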
Anyone who taught electricity or electronics, or lab courses with meters, would eventually hear the question. Until about the late eighties...
Short mismatched cables inside the switchboard didn't matter compared with the long overhead 600-ohm pairs that the standard was originally designed for.
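For a quick sense of scale, assuming round numbers: even at full propagation speed a 3.4 kHz signal has a wavelength approaching 100 km, and still tens of km at the slower speeds on real pairs, so a few metres of mismatched switchboard wiring is an electrically negligible fraction of a wavelength.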
It is relatively easy to check the characteristic impedance of a drum of cable: connect a square wave signal generator, with an output impedance of 600 ohms or more, to an oscilloscope. Generate a signal somewhere between 100kc/s and 1Mc/s and view the voltage waveform; it should be nice and square. Connect one end of the cable to the signal generator, and the reflections from the other end will be visible as distortion of the square wave (ripples, overshoots etc.).
Now connect a non-inductive resistance box to the far end of the cable and adjust the resistors for minimum distortion of the square wave. Check by disconnecting and reconnecting the cable at the signal generator: the amplitude of the square wave will change but, if the terminating resistance equals the characteristic impedance of the cable, its shape should not.
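For anyone wanting the arithmetic behind why the distortion nulls out, a minimal sketch (the 110 ohms is just an assumed value for ordinary hook-up wire): each edge reaching the far end reflects with coefficient (Rterm - Zo)/(Rterm + Zo), and the ripples disappear when that goes to zero, i.e. when the resistance box matches Zo.

Z0 = 110.0                                        # assumed cable impedance, ohms
for Rterm in (47.0, 100.0, 110.0, 150.0, 1e9):    # 1e9 ~ far end left open
    gamma = (Rterm - Z0) / (Rterm + Z0)
    print("Rterm = %10.0f ohms   reflection coefficient = %+.2f" % (Rterm, gamma))

# The box setting that makes the reflection (and hence the visible ripple)
# vanish is numerically equal to the cable's characteristic impedance.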
Most ordinary equipment wire will be found to have a characteristic impedance of around 100 to 120 ohms.
--
~ Adrian Tuddenham ~
(Remove the ".invalid"s and add ".co.uk" to reply)