Analog TV and Video standards
Introduction
Analogue TV signals use a bandwidth compression scheme
(invented in 1932) called interlacing. This involves sending
two half-resolution pictures sequentially, and arranging them
so that the lines of one sit half way between the lines of the
other. The two half pictures are called 'fields', and a whole
picture, in keeping with cinema parlance, is called a 'frame'.
A video standard is usually described by giving the number of lines per
frame, the number of fields per second, and the method used for encoding
the colour information. A full television standard also needs to state
the modulation polarity used, where the sound channel lies relative to
the vision signal, and what type of sound modulation is used (i.e., AM
or FM). There may also be extensions to the TV standard, for text or
subtitling services, and stereo sound.
Countries that have a 60 Hz mains supply (USA, Canada, Japan,
etc) generally use a video standard based on 525 lines, 60 fields/sec.
Countries that have a 50 Hz mains supply (Europe, Australia,
Russia, etc) generally use a standard based on 625 lines, 50
fields/sec. There are many variants however, particularly in
the colour encoding method, the sound channel spacing, and the
channels used for transmission. The principal colour encoding
methods are called
NTSC,
PAL,
and
SECAM.
Monochrome Television Standards
The table below shows the basic monochrome standards on which all
analog TV services were based.
System | Lines | Fields/sec | Video bandwidth /MHz | Mod. polarity | Sound channel /MHz | Sound mod. | Channel width /MHz | Broadcast bands | Main countries
-------|-------|------------|----------------------|---------------|--------------------|------------|--------------------|-----------------|---------------
A (3)  | 405   | 50         | 3                    | pos           | -3.5               | AM         | 5                  | VHF bands I & III | UK, Eire (obsolete)
B      | 625   | 50         | 5                    | neg           | +5.5               | FM         | 7                  | VHF             | EU, Aus, NZ
C      | 625   | 50         | 5                    | pos           | +5.5               | AM         | 7                  | VHF             | Luxembourg (obsolete)
D      | 625   | 50         | 6                    | neg           | +6.5               | FM         | 8                  | VHF             | Russia, China
E (4)  | 819   | 50         | 10                   | pos           | ±11.15             | AM         | 14                 | VHF             | France (obsolete)
F      | 819   | 50         | 5                    | pos           | +5.5               | AM         | 7                  |                 | France (obsolete)
G, H   | 625   | 50         | 5                    | neg           | +5.5               | FM         | 8                  | UHF             | EU
I      | 625   | 50         | 5.5                  | neg           | +6                 | FM         | 8                  | UHF             | UK, Eire
K, K'  | 625   | 50         | 6                    | neg           | +6.5               | FM         | 8                  | UHF             | Russia
L      | 625   | 50         | 6                    | pos           | +6.5               | AM         | 8                  | UHF             | France
M      | 525   | 60 (1)     | 4.2                  | neg           | +4.5               | FM         | 6                  | VHF, UHF        | 60 Hz America
N      | 625   | 50         | 4.2 (2)              | neg           | +4.5               | FM         | 6                  | VHF             | 50 Hz S America
Notes:
1) The field rate of system M is modified to 59.94 fields/sec for the
introduction of NTSC colour.
2) 625 line systems B-D, G-L are designed to fit into a 7-8 MHz wide
broadcast channel, according to the European scheme. Systems M and N are
designed to fit into 6 MHz wide channels, according to the American
scheme. The loss of horizontal resolution from squeezing 625 line TV
into an American channel (system N) was more theoretical than serious at
the time of adoption, since early colour TVs couldn't reproduce fine
detail. Standard VHS and 8 mm recorders have even less bandwidth than
system N allows.
3) Also included, for amusement, are the now defunct 405 and 819 line
systems. Britain came within a hair's breadth of having NTSC colour
added to its 405 line system (with a 2.6578125 MHz colour subcarrier),
but in the end, it was decided that a 625 line colour system based on
the European CCIR standard would be introduced. The first experimental
colour service, broadcast unannounced in the London area, was 625 NTSC-I
(BBC experimental specification, March 1963, 4.4296875 MHz subcarrier,
very impressive chroma bandwidth), but this was finally replaced by the
Telefunken 625 PAL system (which had European decoder patents that were
used in various attempts to exclude Japanese TV manufacturers from the
European markets).
Once all three of the British TV
programmes (BBC1,
BBC2 and ITV) were available as 625-line UHF services, the 405-line VHF
BBC1 and ITV services were derived at the transmitter site by standards
conversion. The original method was that of pointing a camera at a
monitor, but this gave way to analog framestore
interpolation using ultrasonic delay lines, and then, sometime after
1971, to a digital method called DICE (for the amazing story of DICE,
see IBA Technical Review, book 8, 1976).
4) The French 819 line system was HDTV before its time, but it was
bandwidth hungry; and the compromise bandwidth restriction of system F
completely defeated the object. As far as the author knows, no 819 line
colour experiments were ever conducted.
Why the peculiar numbers of lines?
An interlaced picture must always have an odd number of lines per
frame, because a field must have a half-integer number of lines in
order for the lines of one field to lie halfway between the lines of
the next. It also follows that the line and field frequencies must be
derived from a common reference in order for this exact relationship to
be maintained. The common reference is twice the line frequency (2fH),
which is divided by 2 to get the line sync, and by the line number (N)
to get the field sync. The
development work on electronic TV was done in the 1930s (interlacing
was invented in 1932), and the circuits used had to be as simple as
possible to avoid the need for enormous numbers of valves (vacuum
tubes). Division by 2 was reliably accomplished by using the
Eccles-Jordan circuit (flip-flop) as it is today, but division by an
arbitrary integer had to be done by means of a critically adjusted
non-retriggerable monostable multivibrator. If the monostable's
operating parameters drifted, the circuit was likely to jump to a new
division ratio, so the early system designers liked to stick to schemes
that depended on divisions by low numbers, i.e., 3, 5, or 7, which
allowed for considerable drift without risk of jumping. Thus the
preferred TV standards (in approximate historical order) came out like
this:
Line number | Division scheme | Usage
------------|-----------------|------
243 | 3 × 3 × 3 × 3 × 3 | UK Experimental 1936
405 | 3 × 3 × 3 × 3 × 5 | UK Experimental 1936. UK, Eire, 1939-1984.
441 | 3 × 3 × 7 × 7 | Germany 1939, USA (Empire State) 1939.
525 | 3 × 5 × 5 × 7 | USA (EIA)
625 | 5 × 5 × 5 × 5 | Europe (CCIR)
819 | 7 × 9 × 13 | France (obsolete)
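As a quick check on the table, the factor chains all multiply out to the
line numbers, and the frequency arithmetic follows directly from the 2fH
reference described above. A minimal Python sketch (the variable names
are mine, not part of any standard):

```python
# Verify the divider chains and derive line/field frequencies.
# From the text: the master reference is 2*fH; dividing by 2 gives the
# line sync (fH), and dividing by the line number N gives the field rate.

from functools import reduce

schemes = {
    243: [3, 3, 3, 3, 3],
    405: [3, 3, 3, 3, 5],
    441: [3, 3, 7, 7],
    525: [3, 5, 5, 7],
    625: [5, 5, 5, 5],
    819: [7, 9, 13],
}

for lines, factors in schemes.items():
    assert reduce(lambda a, b: a * b, factors) == lines

# Example: the 625-line, 50 fields/sec system.
field_rate = 50.0            # fields per second
N = 625                      # lines per frame
fH = N * field_rate / 2      # N lines per frame, 2 fields per frame
print(f"2*fH = {2*fH:.0f} Hz, fH = {fH:.0f} Hz, field = {2*fH/N:.0f} Hz")
# -> 2*fH = 31250 Hz, fH = 15625 Hz, field = 50 Hz
```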
Field rate: Interlaced
TV systems were never locked to the mains, as is claimed by numerous
sources. Field rates were however chosen to be the same as the nominal
mains frequency so that any 'hum bar' on the picture (due to bad power
supply smoothing) remained approximately stationary. This,
unfortunately, results in a rather low overall refresh rate in 50 Hz
countries, and forces flicker-sensitive individuals to stare at the
screen to avoid being irritated by it. A solution to this problem only
arrived towards the end of the analog broadcasting era; in the form of
TV sets that performed an internal
standards conversion to 100 fields/sec.
The EIA and CCIR monochrome video standards
The television standards include country specific details relevant to
the broadcasting
of video signals by radio and the inclusion of sound. The
underlying video standards however are common to large numbers of
countries. The two main standards that survived up to the
introduction of digital services are EIA 525 and CCIR 625, and these
are still used off air.
Standard | EIA | CCIR
---------|-----|-----
Lines / field | 262.5 | 312.5
Fields / sec | 59.94 | 50
Line frequency | 15734 Hz | 15625 Hz
Line period | 63.55 μs | 64 μs
Active line period | 52.6 μs | 52 μs
Active lines / field | 240 | 288
Active lines / frame | 480 | 576
Aspect ratio w : h | 4:3 | 4:3
Optimum bandwidth | 6.11 MHz | 7.36 MHz
Transmitted bandwidth | 4.2 MHz | ~5 MHz
Active periods:
Only part of each line carries picture information, and not all of the
lines are used for picture information. The dead-times are called the
blanking intervals, and are used to transmit synchronising pulses,
black level reference, colour reference burst, and (optionally) test
signals and data. Thus a "625" line picture really only has 576 lines,
and "525" has 480.
Video bandwidth: There
are no hard and fast rules regarding video bandwidth but, for a given
line standard, there is a point above which the law of diminishing
returns sets in, and below which horizontal blurring starts to occur.
Nowadays we might call this point the 'square pixel bandwidth', and
work it out by determining what is needed to display the same number of
pixels per unit length in both the horizontal and vertical directions.
Taking the CCIR system as an example, there are 576 lines, and the
aspect ratio is 4:3, which gives equal H and V resolution if the number
of horizontal pixels is 768. Those 768 pixels must be transmitted in
52 μs, and the highest
frequency video component corresponds to the case when adjacent pixels
are turned on and off alternately. Thus the worst case signal must
change polarity 384 times in 52 μs,
i.e., it is 7.38 MHz.
Note that in the EIA case, the 'ideal' number of pixels is 640 × 480,
which you may recognise as the basic resolution of a Personal Computer.
TV resolution is sometimes quoted in lines. To display vertical lines,
you must turn adjacent pixels on and off, so 768 pixels gives 384 lines
of horizontal resolution, and 576 TV lines gives apparent sharpness
equivalent to 288 lines of vertical resolution (neglecting aliasing
effects).
All of the broadcast TV standards have slightly sub-optimal video
bandwidth, because the founding fathers were worried about the
visibility of the TV line structure and the problem of aliasing. The
point is that for a given bandwidth, you can trade horizontal
resolution for vertical resolution (horizontal resolution is the
ability to display vertical lines, and vice versa). The problem with TV
is that the horizontal and vertical
sampling methods are not equivalent; i.e., you can put a change in
brightness anywhere you like as you move horizontally in the picture,
but as you move vertically, the positions at which brightness changes
can
occur are constrained by the line structure. Thus, if you put up a test
card of horizontal lines of spacing close to the system resolution, the
lines do not display properly, and have superimposed on them an
undulation in brightness at the difference in spatial frequency between
the test card and the TV raster. The problem is called aliasing, and is
exactly the same as that encountered in digital sampling. Nowadays we
would say that the TV has no protection against vertical spatial
frequencies that exceed the Nyquist limit, and the pragmatic solution
is to use more lines than the available bandwidth would appear to
merit. By further convoluted argument, we get to the fact that the
broadcast video bandwidth used is approximately equal to the 'square
pixel' bandwidth divided by √2 (i.e., 5.2 MHz for
625/50, 4.3 MHz for 525/60). For closed circuit video work, there is
usually not much point in increasing the video bandwidth beyond the
square pixel bandwidth multiplied by √2.
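The square-pixel argument is easy to reproduce. The sketch below uses
the active line counts and active line periods from the table above;
the small differences from the tabulated 6.11/7.36 MHz figures come
down to rounding of the active periods:

```python
# Square-pixel bandwidth, following the argument in the text: with A
# active lines and 4:3 aspect, equal H and V resolution needs (4/3)*A
# pixels per line; the worst case is alternate pixels on/off, i.e.
# pixels/2 cycles in one active line period.

import math

def square_pixel_bw(active_lines, active_period_us, aspect=4/3):
    pixels = active_lines * aspect
    return (pixels / 2) / (active_period_us * 1e-6)   # Hz

ccir = square_pixel_bw(576, 52.0)
eia  = square_pixel_bw(480, 52.6)
print(f"CCIR: {ccir/1e6:.2f} MHz -> broadcast ~ {ccir/1e6/math.sqrt(2):.2f} MHz")
print(f"EIA : {eia/1e6:.2f} MHz -> broadcast ~ {eia/1e6/math.sqrt(2):.2f} MHz")
# CCIR: 7.38 MHz -> ~5.2 MHz; EIA: 6.08 MHz -> ~4.3 MHz
```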
Colour TV standards
525 line colour systems
System M | Subcarrier /MHz | Notes
---------|-----------------|------
NTSC | 3.579545* | N America, Japan
M-PAL | 3.58 | Brazil only.
NTSC 4.43 | 4.433619 | Sony hybrid 625 VTR playback of 525 tapes.
Hybrid PAL | 4.433619 | 625 VTR playback of 525 NTSC tapes.

* 315/88 MHz
625 line colour systems
Systems B-L, N | Subcarrier /MHz | Notes
---------------|-----------------|------
PAL | 4.43361875 | W Europe (not France), M East, Aus, NZ.
N-PAL | 3.58 | S America - to squeeze into American channels.
SECAM V/H | FM | France, Russia, M East.
NTSC-N | ~3.58 | 50 Hz countries using American channels.
Hybrid NTSC | 3.579545 | 525 VTR playback of 625 tapes.
Acronyms
NTSC = National Television Systems Committee (USA).
PAL = Phase Alternating Line.
SECAM = Systeme Electronique Couleur Avec Memoire.
It is traditional to give facetious translations for the colour system
acronyms:
NTSC = Never Twice the Same Color.
PAL = Pictures At Last (refers to the amount of time it took for Europe
to get colour TV), or Pay for Added Luxury.
SECAM = System Essentially (Entirely?) Contrary to the American Method.
NTSC
All analog colour TV systems are based on the NTSC system, which for
its day, was a brilliant feat of engineering. The idea behind it was
that to transmit color TV, it wasn't necessary to transmit separate
channels of Red, Green, and Blue. Three channels are necessary, because
our eyes have three types of colour receptor, but these can be made up
of combinations of R, G, and B, such that one channel corresponds to a
monochrome picture, and the other channels tell the TV how to deviate
from monochrome in order to recreate colour. We take it for granted now
that a black and white TV could tune to a colour signal, but the
systems proposed before this invention were not black and white
compatible.
The signals used are called Luminance (Y), where
Y = 0.3R + 0.59G + 0.11B (the combination which simulates white light),
and the colour difference signals R-Y and B-Y. The colour difference
signals together are known as Chrominance or Chroma.
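For illustration, here is the matrixing arithmetic in Python form; a
minimal sketch using the weights quoted above (real encoders also
band-limit and scale the colour difference signals, which is omitted
here):

```python
# Forming Y, R-Y, B-Y from gamma-corrected R, G, B (values 0..1),
# using the weights given above: Y = 0.3R + 0.59G + 0.11B.

def encode(r, g, b):
    y = 0.3 * r + 0.59 * g + 0.11 * b
    return y, r - y, b - y            # luminance, R-Y, B-Y

def decode(y, ry, by):
    r, b = ry + y, by + y
    g = (y - 0.3 * r - 0.11 * b) / 0.59   # recover G from the Y definition
    return r, g, b

print(encode(1.0, 1.0, 1.0))              # white: Y = 1, both differences = 0
print(decode(*encode(0.2, 0.5, 0.8)))     # round-trips to (0.2, 0.5, 0.8)
```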
The Y-channel is simply modulated onto a monochrome TV waveform, to
provide a black and white compatible TV signal, and the chroma has to
go somewhere else. The obvious thing to do with the chroma, is to put
it into a couple of spare TV channels next to the monochrome one, but a
few clever tricks make such a bandwidth-wasteful approach unnecessary.
The first observation is that the eye is less sensitive to detail in
colour (cone cell vision) than it is in monochrome (rod cell vision),
so the two colour difference signals can be reduced in bandwidth so
now, we've fitted the colour signal into only two monochrome TV
channels, but better still; two signals can be squashed into the space
of one by using a scheme called Quadrature Amplitude Modulation (QAM) -
the method used nowadays to get large(ish) amounts of data to travel
through telephone lines. So now we're down to one and a half TV
channels, but then the ghost of old Jean Baptiste Fourier comes to show
how to get it all into one.
If you look at the spectrum of a TV signal, you find that it is not
continuous, but is made up of spikes, at multiples of the line scanning
frequency. Each of the spikes has sidebands on it, spaced off at
multiples of the frame rate, but the frame sidebands are weak compared
to the line harmonics; and so the spectrum essentially has hundreds of
large gaps in it, each easily big enough to fit an AM radio station.
Chroma signals are also TV signals, so they have the same type of
spectrum, so the trick is to modulate the chroma onto a subcarrier that
sits exactly half way between two of the spikes of the luminance
spectrum. In this way, the chroma signal can be placed right in the
video band, and all of the spikes of its spectrum fit between the
luminance spikes. This technique is called interleaving and, in
combination with interlacing, results in a TV waveform in which the
subcarrier phase gets back to its starting point after four fields (2
frames). Modern video-tape recording was in the process of being
invented at the time by a certain A M Poniatoff (whose initials are in
the first three letters of 'Ampex'), with the financial backing of Bing
Crosby. The RCA engineers may not have known it, but NTSC would turn
out to be 'electronic editing friendly'.
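The half-line offset is easy to verify numerically. In system M the
line frequency is tied to the 4.5 MHz sound intercarrier (4.5 MHz =
286 fH, which is where 59.94 fields/sec comes from), and the subcarrier
is the standard odd multiple of half the line frequency, 455/2 = 227.5.
A short sketch:

```python
fH = 4.5e6 / 286     # system M colour line frequency, ~15734.27 Hz
fsc = 227.5 * fH     # odd multiple of half the line frequency (455/2)
print(fsc)           # 3579545.45... Hz = 315/88 MHz, as footnoted above

# The half-line offset also sets the 4-field sequence: 227.5 cycles per
# line times 525 lines = 119437.5 cycles per frame - a half-integer -
# so the subcarrier phase inverts each frame and repeats after 2 frames
# (4 fields).
print(227.5 * 525)   # 119437.5
```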
To get the chrominance signals back out of the TV signal, the system
designers resorted to a trick called synchronous demodulation - a
method used by spectroscopists and others to
recover signals buried in noise. There was one small problem however,
which was that the subcarrier was visible as tiny dots on the screen,
and while colour tubes of the day were too crude to reproduce them,
they could be seen on black and white sets. The solution was to use a
trick developed for short-wave radio communication, which was to use
suppressed-carrier amplitude-modulation for the chroma. It may sound
surprising, but the carrier signal of an AM radio transmission carries
no information. A lot of transmitter power can be saved by leaving it
out, as long as it is re-inserted in the receiver prior to
demodulation. A short-wave radio had a carrier insertion oscillator or
beat-frequency oscillator (BFO) for this purpose. The operator simply
tweaked the tuning to get the oscillator in the right place, et voila -
the signal became intelligible. For a QAM signal however, and for
synchronous detection, both the frequency and phase of the carrier must
be re-established, so in NTSC, a small reference burst of carrier is
sent just before the beginning of each line. Using suppressed-carrier
QAM was a brilliant idea, because it meant that in areas of the picture
where there was no colour, there was no colour signal either. The dot
interference was thus greatly reduced, and effectively confined to
areas of high colour saturation only.
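The following is a minimal numpy sketch of suppressed-carrier QAM and
synchronous demodulation; the frequencies are arbitrary stand-ins
chosen to keep the simulation short, not the real TV values:

```python
import numpy as np

fs, fc = 1e6, 100e3                  # sample rate and subcarrier (arbitrary)
t = np.arange(0, 0.01, 1/fs)         # 10 ms of signal
u = 0.3 * np.cos(2*np.pi*200*t)      # stand-ins for B-Y and R-Y
v = 0.5 * np.sin(2*np.pi*120*t)

# Suppressed-carrier QAM: two signals share one carrier, in quadrature.
qam = u*np.cos(2*np.pi*fc*t) + v*np.sin(2*np.pi*fc*t)

def lowpass(x, taps=501):            # crude moving-average low-pass filter
    return np.convolve(x, np.ones(taps)/taps, mode='same')

# Synchronous demodulation: multiply by a re-inserted carrier of the
# correct frequency AND phase, then low-pass to remove the 2*fc products.
u_rec = 2 * lowpass(qam * np.cos(2*np.pi*fc*t))
v_rec = 2 * lowpass(qam * np.sin(2*np.pi*fc*t))

print(np.max(np.abs(u_rec - u)[600:-600]))   # small; mostly filter droop
# Demodulating with a phase-skewed carrier mixes u and v into each
# other - the mechanism behind NTSC hue errors, discussed below.
```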
NTSC achieved a respectable 3:1 bandwidth compression, in an age
when vacuum tubes (valves) were the dominant technology, and no one had
yet
daring, using every analog signal processing trick in the book; and to
cap it all, it worked. It is not perfect however, and suffers from two
noticeable defects:
1) When the video signal is rich in frequencies that lie in the colour
channel, luminance leaks into the colour decoder and gives rise to
psychedelic effects. For this reason, checks and pin-stripes will
always be out of fashion in NTSC TV studios (and PAL is inherently
worse). The effect is called 'color fire' or 'cross color', and can be
eliminated by modern signal processing techniques. The TV set has a
'color killer' circuit, to prevent cross-colour from appearing on
monochrome pictures, although TV companies tended to sabotage black and
white films by leaving the colour burst switched on.
2) When the composite NTSC signal suffers from distortion in the
transmission chain, the QAM signal is skewed in phase, and hue shifts
occur. An NTSC TV needs to have a Hue control, to get flesh tones
looking right, but even this cannot fix the brightness-dependent
differential phase distortion that sometimes occurs. The NTSC knew
about this, and an alternative scheme called Chroma Phase Alternation
(CPA) was suggested as a solution. CPA was based
on the observation that if one of the colour difference signals (e.g.,
R-Y) was inverted on alternate fields, then any hue errors on alternate
lines of an interlaced frame would be equal and opposite, and if you
stood back from the screen, pairs of incorrectly coloured lines would
average to the correct colour. The problem was, that if phase errors
were bad, they gave the picture a flickering 'Venetian blind' effect,
which could look a lot nastier than a straightforward hue error. The
NTSC decided that the marginal benefit of CPA did not warrant the added
complexity.
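The averaging argument is neatly shown with complex phasors; a short
sketch (the angles are arbitrary examples):

```python
# The idea behind CPA (and, later, PAL-D): represent chroma as a complex
# phasor. A transmission phase error rotates it by +phi on normal lines;
# with R-Y inverted on alternate lines, the same error appears as -phi
# after re-inversion, so the average of two adjacent lines has the
# correct phase (hue) and only a cos(phi) loss of amplitude (saturation).

import cmath, math

chroma = cmath.rect(1.0, math.radians(60))   # true chroma phasor, hue 60 deg
phi = math.radians(20)                       # transmission phase error

line_a = chroma * cmath.exp(1j*phi)          # error as seen on one line
line_b = chroma * cmath.exp(-1j*phi)         # equal and opposite on the next
avg = (line_a + line_b) / 2                  # stand back, or use a delay line

print(math.degrees(cmath.phase(avg)))        # 60.0 - the hue is restored
print(abs(avg))                              # cos(20 deg) ~ 0.94: a small
                                             # loss of saturation
```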
PAL
As the American NTSC system reached the marketplace, other countries,
notably in Europe, were working on their systems. In particular, a team
at Telefunken, under the direction of Dr Walter Bruch, was
working on an ingenious modification to the NTSC system that involved
inverting the phase of one of the colour difference signals on
alternate lines. They called the system PAL, which stood for Phase
Alternating Line, or something like that. The problem with the PAL
method was that, if the chroma phase errors were bad, they gave the
picture a revolting 'Venetian Blind' effect, which they called 'Hanover
Bars', after the town in which the effect was 'first' discovered. (The
NTSC almost certainly considered both line and field CPA - but would
have rejected the line version on the grounds that, over a whole frame,
it exacerbates the Venetian blind effect by producing a pair of lines
of one hue followed by a pair of lines of another.) The solution was to
average the hue errors electronically, by taking the TV line coming in
off air and combining it with the previous line stored in an analog
memory (un memoire). The original memory was a device called a 'delay
line' (line as in wire or cable, not TV line, even though it stored
almost exactly one TV line), a cumbersome and lossy collection of coils
and capacitors designed to simulate the time delay of a very long
cable. This was soon replaced by a small block of glass with two
piezo-transducers glued to it - an ultrasonic delay line.
The PAL variant of NTSC needed a few tweaks to turn it into a viable
standard. In particular, the dot interference with a half-line colour
subcarrier offset was exacerbated by the phase inversion process, which
caused the dots to line-up vertically. The solution was to move the
subcarrier to a position a quarter of the line frequency away from one
of the line harmonics (actually 15625 x 283.75 + 25 Hz = 4.43361875
MHz). This is something of a compromise, because the interleaving is
not so good. This reduces the signal to noise ratio of the synchronous
demodulation process, exacerbates colour fire, and gives highly
saturated parts of the picture a crawling appearance. The quarter-line
offset, with interlacing, also results in a subcarrier that returns to
its original phase after 8 fields (4 frames), which precludes precise
electronic editing. This was a small price to pay however, for the
opportunity to take out patents on top of the NTSC system and use them
to control the European marketplace. The point was not to patent the
transmission standard however, which was in any case just NTSC-CPA-H,
but to patent the technology used in the receiver.
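The quarter-line arithmetic can be checked directly from the figures
quoted above:

```python
fH = 15625.0               # CCIR line frequency
fsc = 283.75 * fH + 25     # quarter-line offset plus a 25 Hz shift
print(fsc)                 # 4433618.75 Hz = 4.43361875 MHz

frame_period = 625 / fH    # 0.04 s (25 frames per second)
print(fsc * frame_period)  # 177344.75 cycles per frame: the 3/4 cycle
                           # remainder means the subcarrier phase only
                           # repeats after 4 frames (8 fields)
```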
The Telefunken team described three decoding methods for HCPA (sorry,
PAL), which they called PAL-S, PAL-D, and PAL-N (the N in this case
stands for 'new' and is nothing to do with TV system N used in South
America). PAL-S (simple PAL), was the "let them stand back until the
Hanover bars aren't noticeable" approach, which couldn't be patented
because of the NTSC prior art. PAL-D was the basic delay-line method,
and PAL-N or 'Chrominance Lock' was a more sophisticated delay-line
method that could track and cancel differential phase distortion,
without the loss of colour saturation that occurs with the basic D
method. Telefunken patented the
delay-line methods, and used these patents vigorously in an attempt to
exclude Japanese TV manufacturers from the European marketplace.
Consequently, until the PAL patents expired in the mid 1970s, all
Japanese TV sets in Europe either used the disgusting PAL-S, or were
made by Sony.
In the early 1970s, Sony introduced a range of PAL Trinitron TV sets
that had a Hue control like an NTSC set. These were a breath of fresh
air in comparison to the dreadful Shadow-Mask TVs of the day, and it
was quite a status symbol to own one. The colour decoder contained a
delay line. Telefunken sued - and lost. The Sony designers had hit
upon a third delay-line method, which used the memoire to store a line
so that it could
throw away alternate lines and treat the signal as though it was NTSC.
If NTSC was as bad as it was claimed to be, Sony should have been
inundated with complaints; but as it was, if you owned a Trinitron set
in those days, people came round to your house to watch it with you,
and the TV companies adopted the video-monitor versions as studio
monitors (despite the EIA tube phosphors - it was the brightness they
wanted). The irony was that the most discerning TV owners were watching
PAL as NTSC.
The Sony method was known as 'gated NTSC' and came in two versions:
PAL-H and PAL-K. PAL-K attempted to ameliorate the diff-phase problem by
averaging over several NTSC lines, but it gave pictures in which the
colour appeared to be displaced downwards, and (in the author's
opinion) the disarmingly simple system-H gave best results. Diff phase
was never a problem unless there was bad ghosting on the signal, and it
was never a problem with colour video playback either. In practice,
with the modification mentioned below, the decoder hue control was set
once and never touched again.
In order to get as far away from Telefunken's patents as possible, Sony
eschewed the 'official' method for extracting colour line
identification information (the swinging burst), and instead used an
obscure feature of the PAL signal called 'Bruch blanking'. The
problem was that Bruch blanking was optional (ish), and some
TV stations didn't always use it (it varied from day to day at one
point). The Sony decoder didn't actually care whether it used all
+(R-Y) lines or all -(R-Y) lines, but a different setting of the hue
control was required in each case, and the line ident was required to
make the choice consistent. If there was no line-ident, there was a 50%
chance that the hue control would have to be readjusted on first
locking on to a signal; and the (commercial) stations emitting the
non-standard signals also tended to put breaks in the sync-pulse train
at programme changeovers, which threw the system out of lock and made
the Sony users keep on getting up to adjust the set every few minutes
(there were no infrared remote controls in those days).
Modifying the decoder, to use swinging burst ident, involved a little
circuit using two diodes and a transistor (a phase-bridge to compare
the reference burst against the subcarrier oscillator, and a transistor
to reset the PAL flip-flop), which could be tacked on to the underside
of the circuit board.
Sony changed to the PAL-D method when the Telefunken
patents expired, and felt obliged to devise a hue control for that, to
keep up the tradition. The control didn't do anything useful; it
basically gave the user the choice of whether or not to have Hanover
bars, and they dropped the idea fairly quickly.
SECAM
The French system results from a highly pertinent observation by Henri
de France, its inventor: that if you're going to use an expensive
memoire to decode the signal, then you might as well dispense with the
troublesome QAM and simply send R-Y and B-Y on alternate lines. He thus
came close to a scheme that might have given a pronounced improvement
in any environment (studio, editing, and transmission), but the devil
is always in the details. There were two technically feasible methods,
at the time, for extracting signals buried beneath other signals: one
was synchronous demodulation, and the other was the FM capture effect.
It is well known that FM radio stations are immune to impulse
interference, and the idea was to use this trick to make the colour
channel immune to the luminance channel. So much for cross colour, but
unfortunately, the immunity is not reciprocal. You can't suppress an FM
carrier, so an FM-SECAM system has dots in parts of the picture where
there is no colour, and the dots are not related to the line
structure. Consequently, a SECAM signal makes for very poor viewing on
a black-and-white TV set (some would say flatly that it is not
black-and-white compatible), and there are further problems in
processing the signal in the studio.
Studios working in NTSC or PAL can lock all their cameras to a
subcarrier reference. PAL studios also need to lock their cameras to a
line identification reference, so that they all produce +(R-Y) or
-(R-Y) lines at the same time. When this is done, it is possible to
cross-fade between different sources almost as easily as if they were
monochrome. This is fundamental studio practice, but it can't be done
if the subcarriers are FM. If you mix two FM signals together, you get
horrible interference. The obvious solution was to work with a separate
baseband chrominance channel (you only need one with
SECAM), but the pragmatic solution adopted by many TV companies was to
buy PAL equipment, and transcode to SECAM for final distribution. This
is not the cop-out that it might seem however, because SECAM signals
are very robust in transmission. (Most TV companies, of course, now use
digital systems internally.)
Both the PAL and SECAM systems need to transmit a line identification
reference, to tell the TV what type of chrominance information is
coming next. In the PAL case, this is done by shifting the phase of the
subcarrier reference burst. In the SECAM case, this is done by sending
a reference burst in the vertical blanking interval (SECAM-V) or in the
horizontal blanking interval (SECAM-H). SECAM-V is the older of the two
systems, and the signal can carry both types of line ident for
transitional compatibility with older sets. The V-ident signal has to
go however, if the TV station wants to transmit subtitles or Teletext.
S-Video
The point about all of the colour TV standards is that they were
actually conceived as transmission standards. When you add the colour
information to the TV signal, it always degrades the quality of the
basic monochrome picture, so there is really no need to do it unless
you have to send the signal by radio. It took the video equipment
manufacturers a while to grasp this point, but when they did, they came
up with S-Video.
Prior to that, we had to work with composite CVBS (Chroma, Video,
Blanking, and Sync)*, or separate RGB. S-Video (Separated) is just the
C and the VBS in separate cables, but otherwise exactly as they would
have been in composite form. If you want to use a monochrome video
monitor with a colour camera, feed it with the VBS part of the S-Video,
rather than composite, and you will get a picture free from subcarrier
dots.
* CVBS originally stood for
'Composite Video Blanking and Sync.', but the C came to stand for
Chroma by consensus at some point.
VHS Video Recording methods.
When attempting to play or transcribe domestic-format videotapes, the
platform of interest is usually VHS. The
following points may therefore be of relevance:
1) All 525 line NTSC machines use the same recording format.
2) All 625 line PAL machines use the same recording format.
3) 525 line and 625 line VHS machines use the same scanning geometry,
they just rotate the heads and feed the tape at different speeds; so
they can be made to play back alien tapes if the manufacturer has
decided to include the facility. This has led to the development of
special
hybrid colour signals (see below), which can fool a TV into
working at the wrong line standard.
4) SECAM recordings can be made using several different encoding
methods.
Video recorders convert the luminance signal into FM, and record it as
diagonal stripes on the tape, one field at a time. The amount of tape
that is fed forward as each stripe is written depends on the thickness
of the head and the speed at which the drum rotates, which is why 625
(E) and 525 (T) cassettes have different lengths of tape for a given
time duration. The chrominance is separated off for recording, and is
moved to a sub-band below the luminance, at around 650 kHz. PAL and NTSC
recorders use a straightforward heterodyning system to shift the chroma
down, and shift it back on playback by using a fast VCO (voltage
controlled oscillator), which is adjusted by comparing the burst
signals against a local 3.58 or 4.43 MHz reference. The VCO system thus
gives time-base correction to the chroma, and protects the delicate
phase information against the vagaries of a mechanical system (i.e.,
wow and flutter). There is usually no corresponding timebase correction
for the luminance however, and so diagonal recordings always have
slightly wobbly edges on any verticals in the picture. This problem can
be cured by feeding the video signal through a box called a Timebase
Corrector (TBC). Some up-market S-VHS players have a TBC built in.
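As a sketch of why the VCO arrangement corrects the chroma timebase,
consider the frequency arithmetic of the colour-under process. The
~650 kHz sub-band figure is the approximate value given above; exact
colour-under frequencies vary by format, so the numbers here are
illustrative only:

```python
f_sc  = 4.433619e6   # PAL subcarrier (Hz)
f_sub = 650e3        # colour-under sub-band, approximate value from the text
f_lo  = f_sc + f_sub # record-side heterodyne oscillator

# Record: difference mixing shifts the chroma down to the sub-band.
print(f_lo - f_sc)                   # 650000.0 Hz

# Playback with a tape speed error eps: everything off tape is scaled.
eps = 1e-3                           # 0.1% wow/flutter
f_off_tape = f_sub * (1 + eps)

# The up-converting VCO is steered by comparing the recovered burst with
# a stable 4.43 MHz reference, so it settles at f_sc + f_off_tape, and
# the mixer difference is then exactly f_sc again: the tape speed error
# cancels in the chroma (but not in the luminance, as noted above).
f_vco = f_sc + f_off_tape
print((f_vco - f_off_tape) - f_sc)   # 0.0 - chroma timebase corrected
```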
SECAM can be recorded onto standard VHS in one of two ways. It can
either be heterodyned down and back; or since it is FM, it can be
treated as a string of pulses, divided by four to get it down to the
sub-band, and multiplied by four to get it back. The divide-by-four
method is most common. The heterodyne method is called MESECAM (which I
think stands for 'Middle-East').
S-VHS recorders don't use either of these methods however; they
transcode to PAL for recording, and transcode back to SECAM for
playback, which means that S-VHS is compatible across all 625-line PAL
and SECAM countries (but unfortunately was not well established as a
domestic VTR format).
Hybrid Playback Standards.
NTSC-4.43, PAL-525, and NTSC-625.
These are not transmission standards, although they do come out of RF
modulators. They are used to enable some VCRs to play back tapes with
the wrong line standard. They all exploit the fact that the 625 and 525
line systems have similar line frequencies (15625 vs 15734Hz) Thus a
monitor or TV can usually sync to either, with a small tweak of the
vertical hold to make up the difference between 50 and 59.94Hz. The
purpose of the hybrid standard is to get the colour to work as well.
NTSC-4.43 appeared
in the 1970s, as a way of enabling Sony U-Matic PAL VCRs to play back
525 line NTSC tapes. The reproduction quality was excellent, but the
system required a special type of monitor.
PAL-525 (Mitsubishi & others) involves recoding the NTSC signal as PAL,
on a 4.43 MHz subcarrier. This works with almost any 625 line monitor
or TV,
but the decoder delay line is 0.44 microseconds longer than the actual
lines, and this causes decoding errors at colour boundaries in the
picture. The results are generally acceptable nonetheless.
NTSC-625 is a simple matter of unscrambling the PAL signals and
re-coding as NTSC-3.58. There are no inherent problems other than that
the chroma interleaving is not optimal - which doesn't matter at all
provided that S-Video is used for the link between the VTR and the
monitor.
© David W Knight 2000, 2002, 2018, 2021.