Audio Precision System One Dual Domain Measurement Systems

I don't see how we could harmonize system I channels with the French 819 line channels!

Other western European countries[1] used system B in a 7MHz channel at VHF and system G in an 8MHz channel at UHF.

To use the same channels we would have needed to devise a system X with a truncated vestigial side-band to fit our 6MHz sound-vision spacing into 7MHz - in reality, I don't think it would have fitted!
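Back-of-envelope sketch (the 0.25MHz guard above the sound carrier is my assumption; the rest are system I figures):

    # Rough channel-fit check for a hypothetical "system X": system I's
    # 6MHz sound-vision spacing squeezed into a 7MHz channel.
    # The guard figure is an illustrative assumption, not from any standard.
    sound_vision = 6.0        # MHz, system I sound-vision spacing
    guard = 0.25              # MHz, assumed margin above the sound carrier

    for vsb in (1.25, 0.75):  # full system I VSB vs a truncated one
        needed = vsb + sound_vision + guard
        verdict = "fits" if needed <= 7.0 else "does not fit"
        print(f"VSB {vsb} MHz -> {needed} MHz needed: {verdict} in 7MHz")

Even with the VSB truncated to 0.75MHz it only just scrapes in, with nothing to spare.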

In practice, if we had decided to carry on using VHF for 625 line broadcasting, I think we would have harmonised with the Irish 8MHz channel plan - not least because of the proximity of NI transmitters to those in the republic.

[1] Belgium also had its own variant of the French 819 line system crammed into a standard 7MHz channel - it must have looked truly appalling in comparison to 625!
--

Terry
Reply to
Terry Casey

Umm... no. You've both got it the wrong way round. With -ve polarity, sync pulses are more affected by noise bursts than with +ve polarity. And white flecks are far more obvious than black. Part of the reason is that impulse interference could greatly exceed the 100% vision carrier level, saturating the video amplifier and, with +ve modulation, the CRT.

This was why US TVs, where -ve modulation was used from the beginning, employed flywheel sync very early on, whilst UK TVs didn't. On the other hand UK TVs needed peak-white limiters to prevent the CRT defocusing on the "whiter-than-white" interference specks.

The real benefit of -ve modulation was AGC. With -ve modulation sync tips correspond to 100% modulation and make an easy source for the AGC bias. With +ve modulation sync tips are at zero carrier, which obviously is useless for AGC. Instead the back porch has to be used, and many different weird and wonderful circuits were devised to "gate out" the signal voltage during the back porch.

Due to the need to keep costs down, manufacturers increasingly turned to "mean-level AGC", in which the video signal itself was simply low-pass filtered to form the AGC bias. This led to receiver gain being varied by the video content, so the blacks in low-key scenes were boosted whilst the whites in high-key scenes were reduced, leading to a general greyness to everything. To me it looked awful, but as the Great British Public kept buying these sets (and they were cheaper to build), mean-level AGC became the norm for B&W UK domestic TV receivers. One great advantage of colour was that mean-level AGC could not be used: to give correct colour values, colour sets *had* to display a picture with a stable black-level.
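A toy model of the mean-level mechanism, if anyone wants to see the greyness fall out of the arithmetic (signal levels invented; 0 = black, 1 = peak white):

    # Mean-level AGC: the AGC bias is just the (low-pass) average of the
    # video, so receiver gain tracks scene content instead of staying fixed.
    def mean_level_agc(video, target_mean=0.4):
        gain = target_mean / (sum(video) / len(video))
        return [min(v * gain, 1.0) for v in video]

    low_key  = [0.05] * 9 + [0.9]    # mostly black, one highlight
    high_key = [0.9] * 9 + [0.05]    # mostly white, one shadow

    for name, scene in (("low-key", low_key), ("high-key", high_key)):
        out = mean_level_agc(scene)
        print(f"{name}: darkest {min(out):.2f}, brightest {max(out):.2f}")

The low-key blacks come out around 0.15 (boosted to grey) and the high-key whites around 0.44 (knocked down to grey) - exactly that general greyness.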

David.

Reply to
David Looser

We have a PAL TV set that displays bright white as black. :-)

Geoff.

--
Geoffrey S. Mendelson,  N3OWJ/4X1GM
My high blood pressure medicine reduces my midichlorian count. :-(
Reply to
Geoffrey S. Mendelson

In message , David Looser writes

Even with negative video modulation, it didn't seem to take the Americans long to realise that they could cut costs by using AC coupling in the video amplifier between the video detector and the CRT. [I've got some old US monochrome TV circuits which definitely show AC coupling.] As a result, the benefits of having an AGC line which didn't vary (much) with video content would be essentially lost.

Regarding using the back porch as the signal reference, and deriving the AGC from it, I recall a Wireless World article in around 1967, describing a simple add-on circuit (which I made) which partly did this. It worked on both 405 and 625-line signals. It wasn't intended to improve the horrible mean-level AGC but, at the start of each video line, it did clamp the video drive (to the cathode of the CRT) to the black reference of the back porch. As a result, you still got the contrast varying with video content (maybe not so much on 625), but at least the black stayed (more-or-less) black.
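In software terms the clamp amounts to this (a minimal sketch with invented sample values - the real thing was of course a gated diode clamp, not code):

    # Line-by-line black-level clamp: at the start of each line, shift the
    # whole line so the back-porch sample sits at black (0). This restores
    # the DC component lost to AC coupling; gain still varies, black doesn't.
    def clamp_lines(lines, porch_index=0):
        clamped = []
        for line in lines:
            offset = line[porch_index]       # back-porch sample = black reference
            clamped.append([v - offset for v in line])
        return clamped

    # Two AC-coupled lines whose DC levels have drifted differently
    lines = [[0.2, 0.2, 0.7, 0.9], [-0.1, -0.1, 0.3, 0.5]]
    print(clamp_lines(lines))                # back porch forced to 0.0 each line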

--
Ian
Reply to
Ian Jackson

In message , Terry Casey writes

Of course, both the British and the Irish could have simply adopted the European systems B and G (5.5MHz sound-vision - plus the horrendous group delay pre-correction curve). If I remember correctly, the only difference between systems B and G is the 7 vs 8 MHz channel spacing. Even the VSBs are the same (0.75MHz).

Again, IIRC, the RoI VHF 625-line channels were the same frequencies as the 'lettered' 625-line channels already used on many VHF cable TV systems.

I think that these had gone well before I got involved!

--
Ian
Reply to
Ian Jackson

I have often been puzzled by this requirement. What is the reason - just identification of the earth wire, or something else?

--
J. P. Gilliver. UMRA: 1960/
Reply to
J. P. Gilliver (John)

In message , Arny Krueger writes: []

I think the voltage in use probably has about as much relevance as the wiring system - at twice the voltage, the same power will require half the current anyway. Doubling the wire obviously increases the capacity too (or allows the same capacity with thinner wire - though I'm not as convinced by that argument as some).
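The sums, for what they're worth (load figure purely illustrative):

    # P = V * I: the same power at double the voltage needs half the current.
    power = 2300                       # W, an illustrative load
    for volts in (115, 230):
        print(volts, "V ->", power / volts, "A")   # 20.0 A vs 10.0 A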

--
J. P. Gilliver. UMRA: 1960/
Reply to
J. P. Gilliver (John)

In message , David Looser writes: []

[] ? - one on band I and at least one on band III, surely? I lived in (West) Germany in the 1960s and '70s, and I'm sure we could get at least two channels on band III (yes, I know B and G channels are narrower, but not that much).
--
J. P. Gilliver. UMRA: 1960/
Reply to
J. P. Gilliver (John)

It's just belt and braces - slightly less chance of a short if wires get trapped by careless assembly.

--
*If horrific means to make horrible, does terrific mean to make terrible? 

    Dave Plowman        dave@davenoise.co.uk           London SW
                  To e-mail, change noise into sound.
Reply to
Dave Plowman (News)

In the UK Band 1 was divided into 5 channels which, with care, could just about support one national TV network. (A few low-power fill-in transmitters for 405-line BBC1 had to operate in Band 3.)

With 8MHz channels that would reduce to 3 which I suggest is not enough for one national network.

Of course if you are only looking for local coverage you could run several networks in the available spectrum. But the argument was that VHF gave better national coverage than UHF. If that is the aim then, I suggest, you'd need both Bands 1 and 3 to give truly national coverage of just one network. It's probable that it would be possible to add a second network that only covered the main population centres, as analogue Channel 5 did on UHF.
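The arithmetic (taking Band 1 as roughly 41-68MHz):

    # How many channels fit in UK Band 1 (roughly 41-68 MHz)?
    band_width = 68 - 41    # MHz, approximate extent of Band 1
    for ch in (5, 8):       # 405-line channels were 5MHz wide, 625-line 8MHz
        print(f"{ch}MHz channels: {band_width // ch}")   # -> 5 and 3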

David.

Reply to
David Looser

I don't know how well UK sets worked in the 1960's, but at one time US TV sets were not capable of receiving adjacent channels, so adjacent channels were not used. For example, channel 2 was used in New York City, while the nearest channel 3 station was in Philadelphia, 90 miles away and too far to be received without a large antenna.

I think the next one up was 5 in NYC and 6 in Philly.

When the US started UHF TV in the mid 1960's (all 1965 models had to have VHF/UHF tuners), they spaced the channels far apart; Philadelphia, for example, had three: 17, 29 and 48.

Geoff.

--
Geoffrey S. Mendelson,  N3OWJ/4X1GM
My high blood pressure medicine reduces my midichlorian count. :-(
Reply to
Geoffrey S. Mendelson

Generally, UK (and even European) TV sets had a hard time with adjacent channels. Like the USA, the off-air broadcast channels were arranged so that, within the normal service area, there would never be an adjacent channel which was anything like as strong as the channel(s) intended for that area.

The same was true of cable systems. As TV sets were incapable of operating with adjacent channels, they carried only alternate channels.

However, things changed with the advent of cable set-top boxes. These were specifically designed to be capable of receiving a level(ish) spectrum of maybe 30+ channels. The tuned channel was converted to a single output channel in Band 1 (selected to be a vacant off-air channel in the area where the STB was to be used). Essentially, all the adjacent channel filtering was done on the output channel, so the TV set was presented with only a single channel, thereby eliminating any problems with poor adjacent channel selectivity.

Early STBs covered only non-off-air channels, eg 'midband' (between Bands 2 and 3) and 'superband' (above Band 3 to around 300MHz). As a result, large cable TV systems would carry alternate channels in Bands 1 and 3 (so that they could be received directly by the TV set), and adjacent channels elsewhere (which could normally only be received via the STB).

Later on, when multi-channel cable TV was recognised as 'the way to go' by the TV set manufacturers, TV sets themselves started being equipped with wideband tuners - typically providing virtually continuous coverage from 50 to 300MHz and beyond, plus the UHF TV broadcast band. At the same time, TV set adjacent channel selectivity was improved, as they had to be capable of receiving the adjacent cable channels.

In the 1980s, SAW filters became widely available for use in domestic TV sets, and these virtually eliminated the problems of interference from adjacent channels. Of course, eventually, cable TV set-top boxes also developed further, providing not only continuous wideband coverage from 50 to 870MHz, but also becoming descramblers/decoders for pay-TV services.

IIRC, at first, UHF was not very popular in the USA. Tuners were pretty rudimentary - consisting of virtually nothing except a triode variable frequency oscillator and a crystal diode mixer (techniques essentially borrowed from WW2 radar technology) - and this fed the input of the existing VHF tuner. UHF transmitter powers were low, and as receiver sensitivity was not much better than a crystal set, coverage was minimal, so virtually no one bothered much with UHF TV. As a result, TV sets continued to be manufactured fitted with only the traditional 12-channel lowband/highband VHF tuner.

Eventually, because of total congestion in the VHF TV bands, I believe the FCC stepped in and more or less forced TV manufacturers to fit the additional UHF tuner. I understand that they did this rather indirectly - not by requiring manufacturers to fit UHF tuners per se, but by making it illegal for them to ship TV sets across a state border if they did not have one.

--
Ian
Reply to
Ian Jackson

As the UHF bands had been engineered by international agreement for 8MHz channels to accommodate all European 625 line systems (with the vision frequency being common to all of them), it made sense to make better use of the bandwidth available - in fact, as we were starting from scratch, I've often wondered why we didn't adopt the eastern European OIRT standard with its 6MHz vision bandwidth.

As for group delay, I suppose it made sense to pre-correct the transmission to suit the average receiver group delay response. Were the system I parameters, without group delay correction, determined in the belief that UK manufacturers were so much better at designing IF strips than their continental counterparts? ;-)

Group delay was something I never thought about - until a rude awakening doing early experimental work on Teletext - but the introduction of SAW filters resolved the problem ...

Yes, but don't forget the Belgian system H with 1.25MHz vsb ...

Chicken and egg situation? RTE was broadcasting using VHF 625-line channels at least two years before BBC2 came along. I think you meant: many VHF cable TV systems used the 'lettered' 625-line channels already used by RTE ...

Continental systems, of course, used the CCIR broadcast channels, as well as filling up the gaps in between ...

--

Terry
Reply to
Terry Casey

It's also been changed twice!

Originally, earth wires were bare, then the requirement to cover them with green sleeving was brought in.

Finally, the sleeving was changed from green to green/yellow ...

--

Terry
Reply to
Terry Casey

Were both channels available nationwide or just in densely populated areas?

Or is German topography more amenable to providing large area coverage with fewer transmitters?

(I'm thinking here of the German plains that we were told for many years provided ease of access for Soviet tanks ...)

--

Terry
Reply to
Terry Casey

In message , Ian Jackson writes

I have a book about faults in American NTSC sets and was surprised to see just how cut down they were. Instead of the "I" subcarrier bandwidth being 1.5MHz, it was reduced in the sets to 1.0MHz because that was the same as the "Q" subcarrier, which made the sets cheaper to produce - with, of course, lower colour accuracy, but that came second to price.
--
Clive
Reply to
Clive

The UK UHF band plan specifically avoided the use of channels n, n + 5 and n + 9 in any transmitter group (n + 5 = n + IF; n + 9 = n + 2*IF) to prevent interference.
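For anyone who wants to see where those offsets come from (assuming the usual UK IFs of 39.5MHz vision and 33.5MHz sound - a sketch only, the real planning rules were more involved):

    # Why channels n+5 and n+9 were avoided with 8MHz channels.
    # Channel n+k covers vision-carrier offsets of roughly 8k-1.25 to 8k+6.75 MHz.
    def channel_offset(f_mhz):
        return int((f_mhz + 1.25) // 8)

    lo = 39.5                          # LO sits one vision IF above the carrier
    print(channel_offset(lo))          # IF beat lands in channel n+5
    print(channel_offset(lo + 33.5))   # image band (LO + sound IF) starts in n+9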

I was quite surprised not to find any problems with a cable system I started work on in 1969 which used 22 adjacent VHF channels (45 - 228MHz). As the system provided financial information only, there were no sound carriers.

All the receivers used were modified domestic receivers using the ITT/KB VC100 chassis. This chassis was effectively the old dual standard chassis that had gone through at least five iterations that I can remember - VC1, VC2, VC3, VC51, VC52 - in the previous four or five years, with all the 405-line bits left out. Consequently it was really quite an old design.

The GPO (which was just starting to transform itself into BT) were responsible for the RF generation and trunk distribution and had chosen a non-standard 8.3MHz channel spacing to ensure that the local oscillator never clashed with a vision channel. This was possibly inherited from the ILEA schools CCTV system they'd run, because the tuning errors they'd allowed for were a joke: our receiver tuning always had to be spot-on because of the high-frequency component of the video - think CEEFAX in vision, but with 48-character[1] lines.
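A quick check of why 8.3MHz works (the 39.5MHz IF is my assumption - whatever the actual IF was, the principle is the same):

    # With vision carriers on an 8.3MHz grid, where does a receiver LO land?
    spacing, IF = 8.3, 39.5                    # MHz; IF assumed, not documented
    offset = IF % spacing                      # LO position within the grid
    clearance = min(offset, spacing - offset)  # gap to nearest vision carrier
    print(f"LO always {clearance:.1f}MHz clear of any vision carrier")
    # -> about 2.0MHz; with a plain 8MHz grid it would be only ~0.5MHz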

Despite the adjacent channel traps in the receivers still being aligned for 8MHz spacing(!) we never encountered any problems.

All later (broadcast) CATV and SMATV systems I've encountered, though, have always used alternate channels, as described by Ian, for channels intended for direct reception by a domestic receiver (i.e.: without first being received by an STB).

[1] The worst characters in the special set used in these pre-decimal days were 10 and 11 (for tenpence and elevenpence). Of these, ten was the worst, producing a 10101 pixel sequence for most of its height - tuning really had to be spot on for this!
--

Terry
Reply to
Terry Casey

Are you talking about channels or stations?

In the analog days of television in the Bundesrepublik Deutschland, the three public networks ARD Das Erste, ZDF, and die Dritten Fernsehprogramme (regional TV stations) were available nationwide but, as is the case in all countries (except Netherlands and Vlaanderen), transmitter coverage was not 100%.

In the late 1980s, two commercial networks were allowed to start terrestrial broadcasts -- RTL and Sat Eins, but these were low power and only available in major urban markets.

With the switch off of analog TV, all TV transmissions in Germany are now on UHF channels. In Western Europe, only Danmark and Letzebuerg have transmitters with DVB-T on VHF Band III.

If you want to see which stations are available in the nation's capital and surrounding region (Berlin-Brandenburg) then take a look at the tables at

Note that in order to provide a good quality SD picture with rock solid reception, the modulation is 16-QAM 8k FFT, and only four TV stations per multiplex. Meanwhile SDN crams 12 video streams into 64-QAM 8k FFT with reduced FEC, because commercial dross and profits are more important than picture quality and reception stability in a free-market light-touch regulatory broadcast framework.
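To put rough numbers on that trade-off, using the bitrate formula from EN 300 744 (the guard intervals and code rates below are my assumptions - only the constellations are given above):

    # Approximate DVB-T net bitrate for an 8MHz channel in 8k mode.
    def dvbt_mbps(bits_per_carrier, code_rate, guard):
        data_carriers = 6048                   # payload carriers, 8k mode
        tu = 896e-6                            # useful symbol period, 8MHz channel
        ts = tu * (1 + guard)                  # symbol period including guard
        raw = data_carriers * bits_per_carrier / ts
        return raw * code_rate * 188 / 204 / 1e6   # outer RS(204,188) overhead

    print(round(dvbt_mbps(4, 2/3, 1/4), 1))    # 16-QAM: ~13.3 Mbps for 4 stations
    print(round(dvbt_mbps(6, 3/4, 1/32), 1))   # 64-QAM: ~27.1 Mbps for 12 streams

Roughly 3.3 Mbps per SD stream in the first case against about 2.3 Mbps in the second, before you even consider the ruggedness of the modulation.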

Reply to
J G Miller

In article , Terry Casey writes

IIRC, it was to make it easier for colour-blind people to identify.

There's also been another change: the cores in T&E (=romex) used to be red and black for phase and neutral, now it's been harmonised with Europe to brown and blue.

Three-phase wiring has been harmonised from red, blue and yellow for the phases and black neutral to brown, black, black and blue neutral. Yeah, I know...

--
(\__/)   
(='.'=) 
(")_(")
Reply to
Mike Tomlinson

You mean that two of the phases are the *same* colour? Surely not: how do you know whether it's safe to connect two wires if they could be on different phases? And if you connect brown, black and black to a three phase motor and get the two blacks the wrong way round it will run backwards.

Reply to
Mortimer
