NTSC versus PAL

"Die, die, my darling!"

As both PAL and NTSC are basically dead systems (NTSC in the US, at least), there is little point in discussing their differences. But as Mr. Allison insists on displaying his ignorance in public, I'm going to, anyhow.

The first color TV system approved by the FCC was a field-sequential (or frame-sequential -- I forget which) system proposed by CBS. It was developed by Peter Goldmark, the same man given credit for the modern LP phonograph record. (I say "given credit for", because there have been questions as to whether he was the principal designer.)

The CBS system is a classic example of a design botched from the get-go. At that time (not long after WWII), there was no practical way to display three color images simultaneously with a single CRT. So Goldmark went with a spinning color wheel, a system that had been tried 25 years earlier for color motion pictures, and found wanting.

The problems with such a system are obvious, but I'll describe them. One problem is that it requires three times as much film (or in the case of TV, three times the bandwidth). Another is that moving objects show color fringing.

Then there was the problem of the spinning color-filter disk. A 10" TV would require one at least 2' in diameter. Imagine the disk needed for a 21" set! (Not to mention the noise, and the possibility, however remote, of decapitating the cat.)

These obvious (and lethal) deficiencies didn't deter Goldmark or CBS, because they were in competition with RCA/NBC. The CBS argument was... Why limit TV to B&W? Why not /start/ with a color system, and be done with it? CBS pressed the FCC (as one writer pointed out, every sale of an RCA B&W TV would be another nail in the coffin of the CBS color system), and in 1950 the CBS system was approved, despite the fact it was wholly incompatible with the 480i system already in use. *

David Sarnoff ("the most-nasty name in electronics") was naturally upset. RCA had to make CBS look bad, while completing development of their own color system. Sarnoff gleefully pointed out that the CBS system was "mechanical", and subject to all the limitations accruing thereto. Though this was literally true, it overlooked the fact that one can have all-electronic field-sequential color. But -- on the other hand -- CBS had nothing other than a mechanical system to offer.

RCA was working on a "dot-sequential" system. Each line of the image was divided into 300 (or so) pixels **, with red, green, and blue samples alternating. This system worked fairly well -- it produced an acceptable picture on B&W sets. But (for reasons I don't remember) color receivers had problems displaying B&W images. As color receivers would (initially) be used mostly for B&W viewing, this was not acceptable.

The breakthrough came when engineers at Hazeltine and GE remembered Monsieur Fourier, and recognized that sampling the colors was equivalent to modulating a "continuous" subcarrier at the sampling frequency. They "slipped a note under RCA's door" (so to speak), and NTSC/PAL came into existence. The color information was transmitted on a subcarrier whose sidebands were interleaved with the luminance sidebands, to minimize interaction. ***
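To make the interleaving concrete, here's a minimal sketch (my own, not from the original post) of the arithmetic behind the NTSC subcarrier choice. The 4.5 MHz sound-carrier offset and the 455 and 286 ratios are the values actually used in the color standard:

```python
# Luminance energy clusters at harmonics of the line frequency fH.
# Choosing the color subcarrier fsc as an odd multiple (455) of half
# the line rate drops the chroma sidebands midway between those
# clusters -- the "frequency interleaving" described above.

F_SOUND = 4.5e6          # aural carrier offset, Hz (fixed by the B&W standard)
fH = F_SOUND / 286       # color line rate, ~15734.27 Hz
fsc = 455 * fH / 2       # subcarrier: ~3.579545 MHz, the famous "3.58"

print(f"line rate  fH  = {fH:.2f} Hz")
print(f"subcarrier fsc = {fsc:.0f} Hz")

# fsc/fH = 227.5, so chroma components land exactly between luma harmonics
assert abs(fsc / fH - 227.5) < 1e-9
```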

"...complete with bad commercials that repeat all night, both in compatible color and black and white." -- Stan Freberg

The brilliance of NTSC/PAL is that their signals produce as good (or better) an image on B&W sets, and display excellent color on a color set -- without making any existing equipment obsolete, and without requiring additional bandwidth.

So... why is NTSC "better" than PAL? For one thing, it has "better" and "more" color. Although the original NTSC proposal used red and blue color signals of equal bandwidth, it was recognized that this didn't fit with the way the eye actually sees color.

It turns out that for a 480-line system displayed on a 21" tube, the eye sees full color (red/green/blue) only to about 0.5MHz. From 0.5MHz to 1.5MHz, the eye sees only those colors that can be matched with red-orange and blue-green primaries. **** The system was therefore changed to the red-orange/blue-green and yellow-purple primaries, the former of 1.5MHz bandwidth, the latter of 0.5MHz bandwidth.

PAL uses equal-bandwidth (1.0 MHz) red and blue primaries. If an NTSC set fully demodulates the 1.5MHz color signal (most limit it to 0.5MHz to make the set cheaper), more of the original image's color detail will be displayed (though this will be visible mostly in graphics).
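A small sketch of the difference in color axes (my own illustration, not from the post; the matrix coefficients are the standard YUV/YIQ values, and the 33-degree rotation is the published relationship between the two sets of axes):

```python
import math

def rgb_to_yuv(r, g, b):
    """PAL-style axes: equal-bandwidth U and V color-difference signals."""
    y = 0.299 * r + 0.587 * g + 0.114 * b
    u = 0.492 * (b - y)
    v = 0.877 * (r - y)
    return y, u, v

def rgb_to_yiq(r, g, b):
    """NTSC axes: I (orange/cyan, 1.5 MHz) and Q (purple/green, 0.5 MHz)
    are the U and V axes rotated by 33 degrees."""
    y, u, v = rgb_to_yuv(r, g, b)
    a = math.radians(33)
    i = -u * math.sin(a) + v * math.cos(a)
    q =  u * math.cos(a) + v * math.sin(a)
    return y, i, q

# Saturated red: most of its chroma lands on the wideband I axis,
# which is why the orange/cyan axis got the 1.5 MHz allocation.
y, i, q = rgb_to_yiq(1.0, 0.0, 0.0)
print(f"Y={y:.3f}  I={i:.3f}  Q={q:.3f}")
```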

Much has been made of PAL's phase alternation, especially its supposed ability to eliminate the need for a tint [sic] control. (It should be hue control.) When was the last time you adjusted the hue control on an NTSC receiver? 30 years ago?

This issue is confused by two factors -- the differences between European and American distribution systems, and their studio standards.

If the transmission network has constant group delay, the hue setting should be set 'n forget, and never need to be changed. The American system had good group-delay characteristics -- the European did not. So switching channels could require twisting the hue knob. But that's not all there is to it.

Non-linear group delay changes the colors in a way that cannot be corrected simply by adjusting the hue control. All the colors cannot be "correct" at the same time. The advantage of PAL is that these color errors "flip" with the phase, and are complementary -- the eye "averages" them to the correct color.

So what's wrong with that? Well, the averaging also reduces saturation. (Mixing an additive primary with its complement pushes it toward white.) With severe group-phase error, the image shows bands of varying saturation. (In NTSC, there are bands of varying hue.)
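The saturation loss from the averaging can be shown with a couple of lines of complex arithmetic (my own sketch of the effect described above, treating the chroma signal as a phasor whose angle is hue and whose magnitude is saturation):

```python
import cmath, math

def pal_average(saturation, hue_deg, phase_err_deg):
    """Two successive PAL lines carry the same chroma with opposite
    phase errors; averaging them cancels the hue error but scales
    saturation by cos(error)."""
    hue = math.radians(hue_deg)
    err = math.radians(phase_err_deg)
    line1 = saturation * cmath.exp(1j * (hue + err))
    line2 = saturation * cmath.exp(1j * (hue - err))
    avg = (line1 + line2) / 2
    return abs(avg), math.degrees(cmath.phase(avg))

# A 20-degree differential phase error: hue comes back correct,
# but saturation drops to cos(20 deg), about 94% -- the push
# toward white mentioned above.
sat, hue = pal_average(1.0, 100.0, 20.0)
print(f"hue = {hue:.1f} deg, saturation = {sat:.3f}")
```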

The other point of confusion is that, for many years, US broadcasters didn't pay much attention to signal quality. Cameras weren't set up properly, and burst phase wasn't properly monitored. So when you changed channels, you sometimes had to change the hue setting. Broadcasters finally got their acts together, and color quality has, for some time, been pretty consistent from channel to channel.

In short, PAL's phase alternation is an advantage with transmission systems having poor group-delay characteristics -- a problem that did not exist in the US. In every other respect, it is inferior to NTSC.

All of this is true, to the best of my knowledge. Corrections and additions are welcome.

* Some dishonest manufacturers sold B&W TVs with a "color converter" jack on the back. It wouldn't have worked, because these sets didn't have the required IF bandwidth (AFAIK).

** No, the term didn't exist at the time.

*** Some interaction is visible with objects having fine B&W detail. The set "misinterprets" this detail as color information.

**** This is why two-primary color-movie systems (such as the original Technicolor) could give acceptable -- though hardly great -- results.

--
"We already know the answers -- we just haven't asked the right
questions." -- Edwin Land
Reply to
William Sommerwerck

What about PAL and NTSC videos, DVD/BluRay? When did they die?

Reply to
Meat Plow

I meant as broadcast systems. I have plenty of NTSC DVDs, and analog cable signals are still NTSC.

Blu-ray is its own format (1080p/24 or 1080i/60).

Reply to
William Sommerwerck

Technically video tapes are not NTSC or PAL. They have separate tracks for luminance and chroma. The recorders all stripped them apart before recording them and put them back together when playing them.

There is no technical reason not to build a video player with a digital output, which digitizes the signals and presents them as a digital data stream, without any actual NTSC or PAL encoding. The field/frame rate would be the same as the source material, but that's not the same thing.

The same goes for DVDs and Blu-ray. The data is encoded using MPEG compression, which has separate information for luminance and chroma. It can be rebuilt as red-green-blue pixels without ever going through NTSC or PAL.

As red-green-blue cameras become more common, I expect that there will be an eventual shift to RGB-encoded data, but that's a long way off.

Geoff.

--
Geoffrey S. Mendelson, Jerusalem, Israel gsm@mendelson.com  N3OWJ/4X1GM
New word I coined 12/13/09, "Sub-Wikipedia" adj, describing knowledge or
understanding, as in he has a sub-wikipedia understanding of the situation. 
i.e possessing less facts or information than can be found in the Wikipedia.
Reply to
Geoffrey S. Mendelson

Ahh should have clarified

Oh? So BluRay will play on a 50 or 60 Hz system and the audio will be in sync?

Reply to
Meat Plow

Good question. I haven't looked to see whether a Blu-ray player can be set to deliver an SD signal. I don't think it can.

Reply to
William Sommerwerck

I don't think that is actually true. It's been a lot of years since I studied PAL decoding at college, but as far as I recall, the averaging is done totally electronically, courtesy of the PAL delay line. This is a glass-block delay line of one scan-line period, so if you run a direct and a delayed path side by side in the chrominance channel, and then sum the outputs of both, you arrive at an electronically averaged result of two sequential lines, with any phase errors balanced to zero. This has nil effect on the overall colour saturation, as that is controlled by a) the ACC circuit, and b) the user saturation control.

Reply to
Arfa Daily


Not so fast there. 2" quadruplex, invented by Ampex in 1956, was a composite recorder that recorded the composite signal as analog FM with no processing at all. It was replaced by SMPTE C 1" helical recorders, which also recorded the composite analog FM signal with no processing.

Sony Betacam was the first recorder that used separate luma/chroma channels in analog FM. This is a bit of a problem when supplied with a composite input, as the luma and chroma have to be separated, but when mounted directly on a camera, where the luma and chroma originate, it's very good.

D1 digital is a component format with 13.5 MHz luma sampling and 6.75 MHz chroma channels, with no data compression, on 19 mm tape. Commonly called 4:2:2. Digital Betacam uses the same sample rates as D1 on 1/2" tape with 3-to-1 data compression. This is one of the most common formats.

D-2 digital, invented by Ampex for commercial play, is a composite digital machine sampled at 14.3 MHz (aka 4x subcarrier) on 19 mm (3/4") tape. The Sony D-2 machines were much more successful. Panasonic D-3 digital is 4x-subcarrier composite digital on 1/2" tape. Ampex DCT on 19 mm tape is a component digital 4:2:2 machine with 2-to-1 compression.
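As a back-of-envelope check on the numbers above (my own sketch; the sample rates are the ones quoted in the post, and the 8-bit sample depth is the original CCIR 601 figure):

```python
# "4x subcarrier" composite sampling (D-2, D-3) versus
# D1 component "4:2:2" sampling and its raw data rate.

FSC_NTSC = 3.579545e6             # NTSC color subcarrier, Hz
four_fsc = 4 * FSC_NTSC           # ~14.318 MHz, the "14.3 MHz" above
print(f"4 x fsc = {four_fsc / 1e6:.3f} MHz")

# D1: 13.5 MHz luma plus two 6.75 MHz chroma channels, 8 bits/sample
d1_rate = (13.5e6 + 2 * 6.75e6) * 8
print(f"D1 raw video data rate = {d1_rate / 1e6:.0f} Mbit/s")
```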

So, PAL / NTSC refers to the subcarrier encoding so composite machines are indeed PAL or NTSC while component machines are not. They are however referred to as 525 or 625. And yes, all those formats are still in daily use, the 1" and 2" primarily for dubbing to modern formats - often digital Beta.

HD machines have options to record all 3 channels at full bandwidth. This becomes important when compositing images using blue screen or green screen.

G²

Reply to
stratus46

I think you'll find that was the intent. However, if the phase error is too great, the eye averaging doesn't work so well, hence the introduction of the delay line.

At which point you wonder why bother sending two colour signals in quadrature if you're just going to average them with the next scan line anyway. SECAM avoids that complexity by just going straight to the delay line. I lived in Paris for 18 months; if there was a quality difference between a SECAM and a PAL picture, it was far from obvious.

Sylvia.

Reply to
Sylvia Else

In * REALITY * the NTSC broadcast signal is massively compromised in

PAL has plenty wrong with it and is 'massively compromised' the same ways as NTSC.

** More INSANE CRAPOLOGY !!!!!!!!!!

Editing in composite PAL .....

** More f****it, OFF TOPIC CRAPOLOGY !!

See the words " broadcast signal " - f*****ad ???

Even know what it means ???

..... Phil

Reply to
Phil Allison

The company I worked for in the UK was using component recording (Panasonic MII -- high band, like Beta SP) in the early '80s, and low band component was around for quite some time before that, although it didn't meet UK broadcast spec for most things, unlike high band. Within a couple of years it was the main format, with 1" relegated to archive use. The next change was to DigiBeta.

--
*Give me ambiguity or give me something else.

    Dave Plowman        dave@davenoise.co.uk           London SW
                  To e-mail, change noise into sound.
Reply to
Dave Plowman (News)

The averaging can be done electronically, but there is also some visual averaging.

I'm not sure you can remove the phase distortion without reducing the saturation -- all the stuff I've read on PAL says otherwise -- but I won't press the issue because I haven't thought it through carefully.

Reply to
William Sommerwerck

But you don't have to average them. NTSC doesn't. And the delay line can be used for comb filtering.

The problem is, SECAM /requires/ the delay line because the system transmits only the red or blue color-difference signal at any time. This is what I was talking about -- it keeps the transmission side cheap, while making the user pay more for their TV.

For most images, you won't see a difference. But in an image with strong vertical color transitions, you'll see aliasing, especially when the image moves vertically.
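The vertical smearing can be sketched in a few lines (my own toy model, not from the post: each SECAM line carries only one colour-difference value, and the receiver's one-line delay supplies the missing one from the previous line):

```python
def secam_receive(lines):
    """lines: list of (kind, value) pairs, kind alternating 'R-Y'/'B-Y'.
    Returns the (R-Y, B-Y) pair the receiver reconstructs per line,
    reusing the previous line's value for the missing component."""
    out, prev = [], {}
    for kind, value in lines:
        prev[kind] = value
        if 'R-Y' in prev and 'B-Y' in prev:
            out.append((prev['R-Y'], prev['B-Y']))
    return out

# A sharp vertical colour transition: the line at the boundary mixes
# the new R-Y with the old B-Y, halving vertical colour resolution.
tx = [('R-Y', 0.0), ('B-Y', 0.0), ('R-Y', 1.0), ('B-Y', 1.0)]
print(secam_receive(tx))   # middle entry is the mixed (1.0, 0.0) line
```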

Reply to
William Sommerwerck

If we were building an analogue colour TV transmission infrastructure now, then maybe we'd go the NTSC route, since it eliminates the delay line. But it's undoubtedly true that, for whatever reasons, in earlier times, NTSC didn't perform that well, whereas those whose systems were PAL or SECAM got good colour pictures from day one.

Sylvia.

Reply to
Sylvia Else

They still do some composite D-2 editing at CBS network. Or don't they count as broadcast?

G²

Reply to
stratus46

They still do some composite D-2 editing at CBS network. Or don't they count as broadcast?

** Hey f****it.

In relation to television transmission - where does one find the " broadcast signal " ???

Don't strain your tiny brain thinking too hard.

..... Phil

Reply to
Phil Allison

And had high-brightness flicker for just as long...

Isaac

Reply to
isw

It's not clear to me why that wasn't the case anyway. Whatever phase error was introduced to the colour signal by the transmission system would also affect the colour burst. If the problem could be addressed by means of a tint control with a setting that remained stable even over the duration of a program, it rather seems to imply that a phase error between the colour burst and the colour subcarrier was built into the signal at the studio.

Sylvia.

Reply to
Sylvia Else

NTSC? No delay line? Moron. The luminance data had to be delayed to allow time to process the chroma data. An open delay line in an NTSC video display caused a very dark image with moving blotches of color. I found and replaced several, in NTSC TVs and video monitors.

--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
Reply to
Michael A. Terrell

In which case you'd know that a PAL TV contains two delay lines. One provides a short delay and addresses the difference in delay between the chroma path and the luminance path. The other provides a full scan line delay to allow averaging of the chrominance signal.

It should be obvious from context that "the" delay line that I was referring to was the latter.

But I suppose calling people morons is easier than doing your own thinking.

Sylvia.

Reply to
Sylvia Else
