Why 75 Ohms not 50 Ohms?

Does anyone know why video connectors have an impedance of 75 Ohms and not 50 or any other number of Ohms?

Why is 75 Ohms better than 50 Ohms for video signals?

-- Harry

Reply to
Harry

73.2 ohms is the radiation resistance of a half-wave dipole. A folded dipole presents about four times that, roughly 293 ohms. These numbers are rounded off to 75 and 300 ohms respectively.
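A quick sanity check of that arithmetic, as a Python sketch (73.2 ohms is the usual free-space figure for a thin half-wave dipole):

    # Free-space radiation resistance of a thin half-wave dipole, in ohms.
    dipole = 73.2
    # A folded dipole presents roughly four times that impedance.
    folded = 4 * dipole
    print(dipole, folded)   # 73.2 and 292.8 -> rounded in practice to 75 and 300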

Norm Strong

Reply to
<normanstrong

I once saw this explained (maybe in the ARRL handbook): the higher impedance gives better voltage transfer, lower (35 ohms?) gives a better current match. 50 is a compromise. Actually, some missile and radar systems use 95 ohms.

Reply to
sdeyoreo

Isn't it related to the FM/TV dipole antenna which has 300 ohm impedance? I'm not quite sure, but the relation may have something to do with 300 / 4 giving 75.

Jerry

Reply to
J. Davidson


When the video guys needed some coax, they just grabbed what was handy.

--
Paul Hovnanian     mailto:Paul@Hovnanian.com
------------------------------------------------------------------
I could get a new lease on life but I need the first and last month
in advance.
Reply to
Paul Hovnanian P.E.

It isn't.

Reply to
Reg Edwards

Hello Reg,

Right, technically it isn't. But then again 75 Ohms -> less copper for the center conductor -> more plastic -> lower cost -> more profit...

That could all change with the current oil price trend.

Regards, Joerg

Reply to
Joerg

What frequency would you pick as the low end of RF?

Reply to
Richard Henry

You get more voltage for the same power.

Tam

Reply to
Tam/WB2TT

Hello Jerry,

This ratio had to be 4:1 because of baluns. However, it depends on which part of the world you are looking at. In much of Europe the flat cable and thus the antennas were set at 240 Ohms, and consequently the coax used to be 60 Ohms. I believe they have also gone to 75 Ohms now, probably on the assumption that the old 240 Ohm antennas have all corroded away. Most of the late models had preamps anyway.
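As a sketch of where the 4:1 figure comes from: an ideal transformer (or balun) scales impedance by the square of its turns ratio, so a 2:1 turns ratio gives the 4:1 impedance step. The numbers below are just the standard cases mentioned above, nothing measured:

    # Ideal balun: impedance transforms by the square of the turns ratio,
    # so a 2:1 turns ratio gives a 4:1 impedance ratio.
    def unbalanced_side(z_balanced, turns_ratio=2):
        return z_balanced / turns_ratio ** 2

    print(unbalanced_side(300))   # 75.0 ohms (300-ohm twin lead)
    print(unbalanced_side(240))   # 60.0 ohms (older European 240-ohm practice)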

Regards, Joerg

Reply to
Joerg

Coaxial cable impedance has nothing to do with radio antennas or video. It's a matter of engineering economics.

50-ohm cable maximises power handling capability for a given amount of expensive copper.

75-ohm cable minimises attenuation for a given amount of expensive copper.

If the price of copper had been different in the 1920s, when coax was developed, we might have had two different standard impedances.
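For what it's worth, those two optima drop straight out of the standard air-dielectric coax formulas. A rough numerical sketch (idealised: copper loss only, fixed outer diameter, breakdown field at the inner conductor as the power limit):

    import math

    # Characteristic impedance of air-dielectric coax; x = D/d (shield ID over inner OD).
    def z0(x):
        return 60.0 * math.log(x)

    xs = [1.2 + 0.001 * i for i in range(6000)]
    # Conductor loss for fixed D is proportional to (x + 1) / ln(x): minimise it.
    x_loss = min(xs, key=lambda x: (x + 1) / math.log(x))
    # Breakdown-limited power for fixed D is proportional to ln(x) / x**2: maximise it.
    x_power = max(xs, key=lambda x: math.log(x) / x ** 2)

    print(round(z0(x_loss), 1))    # ~76.7 ohms: lowest attenuation
    print(round(z0(x_power), 1))   # ~30.0 ohms: highest power handling

Depending on exactly what is held constant, the power optimum is quoted anywhere from 30 to 32 ohms and the loss optimum from 75 to 77 ohms.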

--
Reg.
Reply to
Reg Edwards

If the source, line, and load are matched, for the same power out of the source, a higher impedance system will develop a higher voltage across the load.
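A one-line check of that, assuming a hypothetical 1 mW delivered into a matched load:

    import math
    p = 1e-3                       # 1 mW into a matched load (assumed for illustration)
    for z in (50, 75):
        print(z, round(math.sqrt(p * z) * 1e3, 1), "mV rms")
    # 50 ohms -> ~223.6 mV, 75 ohms -> ~273.9 mV: same power, more voltage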
Reply to
John Fields

Because 50-ohm coax maximises power handling capacity, it is most appropriate for feeding radio transmitting antennas and hence for other radio applications.

Because 75-ohm coax minimises attenuation versus distance, it is most appropriate for long and short distance communications purposes such as the telephone system.

Far more millions of miles of 75-ohm coax have been manufactured than 50-ohm coax since the 1930's.
--
Reg
Reply to
Reg Edwards

It was pretty mature, even then.

Reply to
Arny Krueger

"On December 8, 1931 Lloyd Espenschied and H.A. Affel from AT&T received their first patent No. 1,835,031 for their "concentric conducting system", aka the coaxial cable. Their invention was not intended for amateur transmissions but rather for the first television signals that required a line broadband enough to transmit a range of frequencies compatible with television image. Espenschied and Affel 's invention involved placing a central conductor inside a hollow tube and holding it in place with washers spaced equally along the length of the tube. The low-loss dielectric was air."

Reply to
Arny Krueger

I read in sci.electronics.design that Harry wrote (in ) about 'Why 75 Ohms not 50 Ohms?', on Mon, 12 Sep 2005:

Better colour rendering: 75 ohms is violet-green-brown, while 50 ohms is only green-black-brown. Violet and green are complementary, so you can get any colour along that axis by mixing. Brown is low-brightness yellow (yellow would have given 750 kohms, which is obviously too high), thus introducing the missing red element.

Or perhaps it's just a historical accident, based on early designs of coaxial cables in different countries. In Germany, 60 ohms was used.

--
Regards, John Woodgate, OOO - Own Opinions Only.
If everything has been designed, a god designed evolution by natural selection.
http://www.jmwa.demon.co.uk Also see http://www.isce.org.uk
Reply to
John Woodgate

Man said video, not RF. These seem to be spec'd at around 1 volt. We also used 75 Ohm cable for high speed data; the controlling parameter was the roughly 0.65 V needed to turn on a transistor. The ~100 Ohm cable was too fat or too fragile.

Also, you have to look at this in historical perspective. Of the cable available in the 1940s, 75 Ohm cable had less loss.

Tam

Reply to
Tam/WB2TT

Hi,

Usually 75-ohm cable has less capacitance per unit length than 50-ohm and so, at video frequencies where you can't have a match over the complete range (which with short cables is not so important anyway), roll-off at the high end will be less pronounced.
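A rough sketch of that effect, treating a short unterminated run as a lumped capacitance; the pF-per-foot numbers are ballpark figures for typical solid-PE cables, and the 1 kohm source impedance is just an assumed value for illustration:

    import math

    # Ballpark capacitance per foot: ~30 pF/ft for 50-ohm types, ~20 pF/ft for 75-ohm types.
    c_per_ft = {"50-ohm": 30e-12, "75-ohm": 20e-12}
    r_source = 1e3        # assumed source impedance, ohms
    length_ft = 10        # assumed short, unterminated run

    for name, c in c_per_ft.items():
        f3db = 1 / (2 * math.pi * r_source * c * length_ft)
        print(name, round(f3db / 1e6, 2), "MHz -3 dB corner")
    # ~0.53 MHz vs ~0.80 MHz: the higher-impedance cable rolls off later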

Cheers - Joe

Reply to
Joe McElvenney

I'm not seeing the fallacy in his statement, Reg.

BTW, 50 ohms is NOT the max power handler for a given size of coax. 32 ohms claims this distinction. 50 ohms may have been chosen for a lot of reasons; the most plausible is that it is a good compromise between highest power (32 ohms) and lowest loss (75 ohms).
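Splitting the difference does land near 50; a quick check, taking roughly 30 ohms for best power handling and 77 ohms for lowest loss:

    import math
    z_power, z_loss = 30.0, 77.0
    print(round(math.sqrt(z_power * z_loss), 1))    # geometric mean ~48.1
    print(round((z_power + z_loss) / 2, 1))         # arithmetic mean ~53.5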

Jim

Reply to
RST Engineering (jw)
