NTSC versus PAL

Many years back, Bush in the UK produced a colour decoder which was 'revolutionary' compared to other manufacturers' efforts, in that the subcarrier was regenerated in the decoder directly from the burst, rather than being a free-running oscillator just locked to the burst with a PLL. They did this by deriving a phase-adjustable pulse from the H-flyback and using this to 'notch out' the burst from the back porch period. The 10 cycles of burst thus recovered were then applied directly to the 4.43 MHz crystal, which caused it to ring at exactly the same frequency and in exactly the same phase as the original subcarrier. It always seemed to work pretty well, and they continued to use this system over a period of probably 10 years or more, covering three chassis designs / revisions.
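
For anyone who wants to play with the idea, here is a rough numerical sketch (my own toy model, nothing to do with the actual Bush circuit): a high-Q digital resonator stands in for the 4.43 MHz crystal, gets 'rung' by a roughly 10-cycle gated burst, and keeps oscillating between bursts. Shifting the burst phase shifts the ringing phase by the same amount. All the timing and Q numbers are only illustrative.

import numpy as np

FSC = 4.433618e6              # PAL colour subcarrier, Hz
FS = 20 * FSC                 # simulation sample rate (arbitrary choice)
T_LINE = 64e-6                # one line period, s
t = np.arange(0.0, T_LINE, 1.0 / FS)

def ring_crystal(burst_phase_deg):
    """Ring the 'crystal' with a ~10-cycle burst; return the phase of the
    ringing measured long after the burst has finished."""
    gate = (t > 5.6e-6) & (t < 5.6e-6 + 10 / FSC)   # burst on the back porch
    x = np.where(gate,
                 np.cos(2 * np.pi * FSC * t + np.deg2rad(burst_phase_deg)),
                 0.0)

    # Second-order resonator tuned to FSC; r close to 1 means high Q.
    w0, r = 2 * np.pi * FSC / FS, 0.9995
    y = np.zeros_like(x)
    for n in range(2, len(x)):
        y[n] = 2 * r * np.cos(w0) * y[n - 1] - r * r * y[n - 2] + x[n]

    tail = t > 40e-6                                # well after the burst
    z = np.sum(y[tail] * np.exp(-1j * 2 * np.pi * FSC * t[tail]))
    return np.degrees(np.angle(z))

for ph in (0.0, 45.0, 135.0):
    print(f"burst at {ph:5.1f} deg -> ringing at {ring_crystal(ph):7.1f} deg")

The printed ringing phases differ from the burst phases by a constant offset, but they track the burst degree for degree, which is the point; the fixed offset is what the phase-adjustable gating pulse takes care of.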

Arfa

Reply to
Arfa Daily

I would guess that you would never see such an effect, as all of the decoders that I can remember working on had ACC circuits which worked very well ...

Arfa

Reply to
Arfa Daily

This was first done by GE, circa 1966, in the Portacolor set, mostly because it was cheaper.

Another way of looking at this system is that the crystal was an extremely narrow-band filter that removed the "Fourier sidebands" around the subcarrier frequency created by transmitting the 10-cycle burst only once on each scanning line.
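
Putting rough numbers on that view (my own back-of-envelope figures, PAL values, and ignoring the ±45° burst swing): the burst repeats once per line, so its spectrum is a comb of components spaced at the line rate around the subcarrier frequency, and the "filter" only has to be narrow compared with that spacing.

# Back-of-envelope: how narrow does the "filter" have to be?
fsc = 4.433618e6        # PAL subcarrier, Hz (NTSC: 3.579545e6)
f_line = 15625.0        # burst repetition rate = line rate, Hz (NTSC: 15734)

q_min = fsc / f_line    # Q giving a bandwidth of one sideband spacing
print(f"sideband spacing: {f_line / 1e3:.3f} kHz")
print(f"Q for a bandwidth of one line-rate spacing: ~{q_min:.0f}")
# A quartz crystal's loaded Q is typically in the tens of thousands,
# so it sits comfortably inside the central component.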

Reply to
William Sommerwerck

It is, for people who consider one as zero.

--
Service to my country? Been there, Done that, and I've got my DD214 to
prove it.
Member of DAV #85.

Michael A. Terrell
Central Florida

http://www.flickr.com/photos/materrell/
Reply to
Michael A. Terrell

I'm left wondering what exactly was the *real* problem that PAL was intended to fix. It appears that the NTSC tint control could only address a fixed phase offset between the colour burst and the subcarrier, with both transmitters and TV sets able to maintain that offset sufficiently closely that the hue wouldn't vary from left to right of the picture.

Other issues, such as non-linear phase shift, would have been a problem for NTSC viewers regardless of the tint control.
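
To make concrete what I mean by a fixed offset (purely illustrative numbers, not broadcast specs): a reference that is a few degrees out of phase with the burst just rotates every demodulated chroma vector by that angle, and one fixed counter-rotation, i.e. the tint control, undoes it, provided the error really is the same everywhere in the picture and along the whole path.

import numpy as np

def rotate_iq(i, q, err_deg):
    """Recovered (I, Q) when the receiver's subcarrier reference is err_deg
    out of phase with the burst: the chroma vector is rotated by err_deg."""
    e = np.deg2rad(err_deg)
    return (i * np.cos(e) - q * np.sin(e), i * np.sin(e) + q * np.cos(e))

i0, q0 = 0.34, 0.11        # roughly a flesh-tone chroma vector (illustrative)
for err in (0, 10, 20, 40):
    i1, q1 = rotate_iq(i0, q0, err)
    hue = np.degrees(np.arctan2(q1, i1) - np.arctan2(q0, i0))
    print(f"{err:2d} deg reference error -> hue rotated {hue:+5.1f} deg")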

So were NTSC viewers tolerating colour pictures that couldn't be set right even with the tint control? Or is there something else that I've missed?

Sylvia.

Reply to
Sylvia Else

--snippety-snip--

Political. The Europeans didn't want US companies selling sets there.

Isaac

Reply to
isw

Didn't stop the Japanese, etc. But US companies would have had to do other mods to their products for European sales anyway, like mains voltage and frequency. Most couldn't be bothered - even when that was all that had to be changed.

--
*Letting a cat out of the bag is easier than putting it back in *

    Dave Plowman        dave@davenoise.co.uk           London SW
                  To e-mail, change noise into sound.
Reply to
Dave Plowman (News)

Correct.

Also correct.

You /have/ missed something, which I explained "long ago and far away".

The US TV-distribution system DID NOT generally suffer from non-linear group-delay problems, whereas the European system DID. That's it.

Even without the extra delay line, there is some degree of visual color averaging, which tends to mitigate the phase error.
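
To put numbers on that averaging (my own sketch, illustrative values only): with the V-axis switching, a constant demodulator phase error appears as +err on one line and -err on the next, so averaging a pair of lines - by eye on a simple set, or in the delay line on a full PAL-D decoder - leaves the hue correct and merely scales the saturation by cos(err).

import numpy as np

def pal_two_line_average(u, v, err_deg):
    """Average two successive PAL lines decoded with a constant phase error."""
    e = np.deg2rad(err_deg)
    # line n: V sent normally, whole vector rotated by the error
    u1, v1 = u * np.cos(e) - v * np.sin(e), u * np.sin(e) + v * np.cos(e)
    # line n+1: V sent inverted, rotated by the error, re-inverted in the set
    u2 = u * np.cos(e) + v * np.sin(e)
    v2 = -(u * np.sin(e) - v * np.cos(e))
    return (u1 + u2) / 2.0, (v1 + v2) / 2.0

u0, v0 = 0.3, 0.2                      # an arbitrary chroma vector
for err in (0, 10, 20, 40):
    u, v = pal_two_line_average(u0, v0, err)
    hue = np.degrees(np.arctan2(v, u) - np.arctan2(v0, u0))
    sat = np.hypot(u, v) / np.hypot(u0, v0)
    print(f"{err:2d} deg error -> hue error {hue:+.1f} deg, saturation x {sat:.2f}")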

Reply to
William Sommerwerck

I don't buy that. US sets would have been fairly expensive in Europe, even in the mid-60s. Not to mention the strong competition from Thomson, Philips, etc.

Reply to
William Sommerwerck

Part of the difficulty in understanding is that perhaps you don't have experience with early American color televisions... I certainly remember how in the 60s we had to adjust the tint control on a regular (show-by-show) basis, because of the lack of consistency.

Today, with predominantly digital systems, it has been so long since I've touched a tint control that I wonder if they still exist!

Anyone who had one of those old tube (valve) color sets with the 21" round color CRT will remember seeing green skies and blue grass while having skin colors set to the proper shade. Get the sky blue, and the skin turned red, or blue, or green!

Reply to
PeterD

Yes -- a lack of consistency. That was not the fault of NTSC, but of the broadcasters.

I don't think that's correct. The cameras (and/or encoders) would have had to be very badly set up for that to happen.

On a related subject... I remember reading long, long ago that the first RCA color TV had /four/ controls for adjusting the color, which the author described as a "combination lock"! Anyone know anything about this?

Reply to
William Sommerwerck

OK, I vaguely remember your saying that now.

In the UK, colour was only transmitted on a new 625-line service (newish, in the case of BBC2), running in parallel for a long time with a monochrome 405-line service (except for BBC2), and I'd have thought the new transmission infrastructure could have been built to obviate the non-linear group delay, given that such infrastructure evidently existed in the USA.

And, as I commented before, the Sony Trinitron sets, which didn't implement PAL, performed acceptably according to my memory.

Sylvia.

Reply to
Sylvia Else

I have to wonder what the broadcasters were doing to achieve that. Contriving to get the colour burst phase consistent amongst cameras in a studio (so that the tint stayed the same for a show), but inconsistent with the actual colour subcarrier, would take some doing.

Sylvia.

Reply to
Sylvia Else

You're probably correct.

Reply to
William Sommerwerck

There is no subcarrier or burst signal in the cameras. They aren't needed at that point, and are added during the encoding process.

Setting them up is another matter. The early episodes of "Barney Miller" provide a good example of poor setup, with inconsistent color and poor convergence.
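
For anyone following along, here is a much-simplified sketch of the encoding step (my own illustration; a real encoder also adds sync, blanking, filtering and, for PAL, the V-switch). The point is that the burst and the modulated chroma are both taken from the same subcarrier oscillator, so keeping them in phase is essentially automatic. The NTSC-ish timing and level numbers are only illustrative.

import numpy as np

FSC = 3.579545e6                  # NTSC subcarrier, Hz
FS = 8 * FSC                      # sample rate (arbitrary)
T_LINE = 63.5e-6                  # one line period, s
t = np.arange(0.0, T_LINE, 1.0 / FS)
subcarrier = 2 * np.pi * FSC * t  # one oscillator feeds both burst and chroma

def encode_line(y, i, q):
    """Encode one line with constant luma y and constant chroma (i, q)."""
    burst_gate = (t > 5.3e-6) & (t < 5.3e-6 + 9 / FSC)  # burst on the back porch
    active = t > 9.4e-6                                 # active picture region
    # burst nominally opposite the chroma reference axis here (illustrative)
    burst = np.where(burst_gate, 0.3 * np.cos(subcarrier + np.pi), 0.0)
    chroma = np.where(active,
                      i * np.cos(subcarrier) + q * np.sin(subcarrier),
                      0.0)
    return np.where(active, y, 0.0) + burst + chroma

composite = encode_line(0.5, 0.2, 0.1)
print(f"{len(composite)} samples, peak level {composite.max():.2f}")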

Reply to
William Sommerwerck

Ok, so the separate colour signals (and luminance?) are sent from the cameras. Still, at some point the colour signals have to be encoded using the colour subcarrier, and a bit of the latter has to be included as the burst. Failing to keep them in phase would require a considerable amount of indifference.

Which I think you've also said ;)

Poor convergence? The mind boggles.

Sylvia.

Reply to
Sylvia Else

And AT&T who provided the coaxial cables that fed the video to all the stations on a network. The tint and chroma level could be adjusted at every facility in the system. I knew someone who worked for AT&T at the time, and he told me what a pain it was to compensate for the cable. When the network switched to a different studio or city for a show, it threw everything out of calibration.

He may be talking about the three 'drive' controls that set the gain for each channel. These are set up to provide equal gain to get a white line during setup. They are service adjustments on TVs, but on an early design they may have been easier to get to. Some TVs still had hollow plastic shaft extenders that passed through the rear of floor-model cabinets to adjust these and other pots.

The fourth would be the actual color intensity control.

--
Lead free solder is Belgium's version of 'Hold my beer and watch this!'
Reply to
Michael A. Terrell

Oh, yes. The pickups had to be aligned. The "modern" system, in which solid-state sensors are attached to a prism/beamsplitter, was not practical with vidicons and Plumbicons.

Reply to
William Sommerwerck

No, these were supposedly user controls. Anybody got a photo of the user controls for a CT-100?

Reply to
William Sommerwerck

So camera setup was poor - as were the later stages of transmission?

This certainly wasn't the case in the UK - despite the transmitters being fed with land lines.

--
*Where do forest rangers go to "get away from it all?"

    Dave Plowman        dave@davenoise.co.uk           London SW
                  To e-mail, change noise into sound.
Reply to
Dave Plowman (News)
