27 Jun 2019

Widespread HD should have happened much, much sooner

  • Written by Phil Rhodes
Hitachi's SK-UHD8000 Series 8K (SHV) Studio and Field Production Camera

Famously, NHK are setting up to broadcast the 2020 Olympics in 8K. The broadcaster has shown impressive cameras and, just as importantly, other bits of broadcast infrastructure such as point-to-point radio links, at recent trade shows. This might encourage us to believe that 8K is now an inevitability and the irresistible tug of progress means that it’ll soon be universally deployed...

The thing is, NHK also recorded parts of the 1984 Summer Olympics, in Los Angeles, in HD. Fascinating as this historical footnote is, the Hi-Vision system used there is not the HD we know today. The term has been used rather interchangeably for many things, including Super Hi-Vision to refer to the upcoming 8K experiments, but the HD systems of 1984 didn’t have much in common with the 1080i broadcasts of today. Anyone who still needs convincing that the broadcasters’ keenness to push the envelope is not inevitably a harbinger of things to come might also consider HD coverage of the 1990 FIFA World Cup in Italy, which reached a total of ten cinemas.

Two of the most prominent systems intended to allow HD broadcast to the home, with 1980s technology, were MUSE and HD-MAC. They worked, but neither was ever widely adopted, at least if we allow that, for MUSE, “widely” means “outside Japan”: MUSE was broadcast there until September 2007. HD-MAC didn’t even make it that big; it was used for a while for satellite links between broadcast infrastructure, but never really as a broadcast format.

How they worked

The idea behind MAC was to send the colour information and brightness information sequentially, with each line of the image containing a burst of digital information at the start (used for sound), followed by the chrominance information and then the luminance. The acronym stands for “multiplexed analogue components”, and the system began life in the UK, at the Independent Broadcasting Authority, as a way of improving the performance of broadcast television in general. It was a component analogue broadcasting format which could, and did, achieve very significantly better images than PAL or SECAM because it avoided composite encoding, which led to visible artefacts – the infamous crawling ants – in areas of high saturation.
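That line structure can be sketched in a few lines of Python. The sample counts and component proportions below are illustrative assumptions for a single 64 µs line, not the actual D-MAC specification:

```python
# A minimal sketch of one MAC scan line, with illustrative (not
# standard-exact) proportions: a digital burst carrying sound/data,
# then time-compressed chrominance, then time-compressed luminance.
# All sample counts here are assumptions for illustration only.

SAMPLES_PER_LINE = 1296   # hypothetical samples in one 64 us line
DATA_SAMPLES = 200        # digital sound/data burst
CHROMA_SAMPLES = 350      # colour-difference component, time-compressed
LUMA_SAMPLES = SAMPLES_PER_LINE - DATA_SAMPLES - CHROMA_SAMPLES

def compress(samples, n):
    """Time-compress a component into n samples by nearest-neighbour
    resampling -- the analogue hardware did this with delay lines."""
    step = len(samples) / n
    return [samples[int(i * step)] for i in range(n)]

def build_mac_line(luma, chroma, data_bits):
    """Assemble one line: components follow each other in time, so
    chrominance can never interfere with luminance the way a
    composite subcarrier does."""
    burst = [1.0 if data_bits[i % len(data_bits)] else -1.0
             for i in range(DATA_SAMPLES)]
    return burst + compress(chroma, CHROMA_SAMPLES) + compress(luma, LUMA_SAMPLES)

# A hypothetical line: a ramp for luminance, flat mid-level chrominance
line = build_mac_line([i / 700 for i in range(700)], [0.5] * 700, [1, 0, 1, 1])
```

The point the layout makes is that there is no subcarrier anywhere: the colour can't crawl over the brightness because the two never occupy the same instant in time.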

The cost was a more complex receiver, albeit one that could be made at a consumer price point with late-80s technology. There were several MAC variants, perhaps most notably D-MAC, which was adopted by the EBU as the preferred standard for satellite broadcasting, and the slightly reduced-bandwidth D2-MAC, which found favour in cable TV and was used until it was displaced by digital options in the early years of the twenty-first century. The high definition version was called, with stunning predictability, HD-MAC, and was developed from 1986. It used a lot of bandwidth, and by 1993 it had become clear that digital encoding could do a better job much more easily.

MUSE

The Japanese option, MUSE, was used in broadcasts starting in 1989, based on a design effort that went all the way back to the late 70s. It can fairly be described as crafty or intricate, and possibly quite fairly as Machiavellian. A complete technical description of how it works is significantly beyond the scope of this article, but anyone interested in the details (and boy, are there a few details) can look at the ITU’s description of MUSE over on its site. It is possibly the earliest system deployed on any scale to use motion compensation; it is not a codec in the modern sense of block transforms, but it uses knowledge about motion in the image to do a better job of leaving out unnecessary detail. It uses active equalisation. It specifies variable noise reduction. It uses a four-field dot interlacing scheme. It is splendid and wonderful and majestically complicated.
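The motion-compensation idea at the heart of MUSE can at least be caricatured in code. This is a toy sketch of motion-adaptive reconstruction, nothing like the real algorithm; the function names, threshold and field handling are invented for illustration:

```python
# A toy sketch of MUSE's central trick (motion-adaptive processing),
# not the real algorithm: static areas are rebuilt at full detail from
# several fields, while moving areas fall back to a softer single-field
# estimate. The threshold and all names are illustrative assumptions.

MOTION_THRESHOLD = 10  # assumed difference level that counts as "motion"

def reconstruct_pixel(current_field, previous_fields, x):
    """Choose a reconstruction path per pixel, much as MUSE decoders
    did per area: temporal (sharp, needs a static scene) versus
    spatial (soft, but safe under motion)."""
    motion = abs(current_field[x] - previous_fields[-1][x])
    if motion < MOTION_THRESHOLD:
        # Static: combine samples accumulated across fields to recover
        # detail spread over the multi-field transmission cycle
        stack = [f[x] for f in previous_fields] + [current_field[x]]
        return sum(stack) / len(stack)
    # Moving: interpolate from neighbours within the current field only
    left = current_field[max(x - 1, 0)]
    right = current_field[min(x + 1, len(current_field) - 1)]
    return (left + right) / 2

static = reconstruct_pixel([100, 104, 100], [[100, 100, 100]], 1)  # -> 102.0
moving = reconstruct_pixel([60, 200, 100], [[100, 100, 100]], 1)   # -> 80.0
```

The real system made this decision per region using transmitted motion information, and combined it with the four-field dot interlace; the sketch only shows the trade it was making: detail where nothing moves, graceful softness where something does.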

What’s perhaps most interesting about both MUSE and HD-MAC is the circumstances for which they were designed. Early thinking about HD video essentially involved doing much the same as for standard definition: composite video, only with more lines and faster. Composite video is really the lowest common denominator, as anyone who’s ever compared the composite and component outputs from a DVD player will know. Composite video involves taking a black and white video signal and putting little tiny wiggles on it. Roughly speaking, the position of the little tiny wiggles indicates the hue and their size indicates the saturation. It was developed as a crafty way of cramming a colour image down a pipe intended for a monochrome one, but all too often it becomes obvious that we’re watching a picture that’s got little tiny wiggles all over it, hence those crawling ants. Nobody really wanted to resort to that with HD images, and in any case, that straightforward approach demanded way more radio bandwidth than anyone wanted to dedicate to the project.
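The wiggles can be sketched directly: hue becomes the phase of a chroma subcarrier, saturation its amplitude, and the whole thing rides on top of the luminance. The subcarrier frequency and sample rate below are loose, PAL-ish assumptions, not an exact PAL or NTSC specification:

```python
import math

# A bare-bones sketch of the "little tiny wiggles": composite colour
# rides on the monochrome signal as a subcarrier whose phase carries
# hue and whose amplitude carries saturation. The frequency and sample
# rate are illustrative assumptions, not a real PAL/NTSC spec.

SUBCARRIER_HZ = 4.43e6   # roughly PAL-ish, for illustration
SAMPLE_RATE = 20e6

def composite_sample(luma, saturation, hue_degrees, n):
    """One sample of luma plus a chroma subcarrier: hue sets the
    wiggle's phase, saturation sets its size."""
    t = n / SAMPLE_RATE
    phase = 2 * math.pi * SUBCARRIER_HZ * t + math.radians(hue_degrees)
    return luma + saturation * math.sin(phase)

# A saturated colour produces big wiggles around the luma level;
# zero saturation collapses to plain black-and-white video.
colour_line = [composite_sample(0.5, 0.3, 120, n) for n in range(64)]
mono_line = [composite_sample(0.5, 0.0, 0, n) for n in range(64)]
```

A monochrome receiver simply displays the whole signal, wiggles and all, which is exactly why heavily saturated areas crawl: the decoder can't always tell colour wiggles from genuine fine brightness detail.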

Bandwidth hungry

Naturally, HD would need more bandwidth than standard definition, but nobody wanted it to take up more than was absolutely necessary. HD-MAC was never used for UHF broadcast; it was too hungry for bandwidth, and better suited to satellite broadcasting. MUSE was more or less designed around the specific capabilities of available satellite technology, particularly the amount of transmitter power available. Most satellites, being powered by solar arrays, make a few hundred watts available for a television channel, which isn’t much when we consider that most of them cover continent-sized areas from more than twenty-two thousand miles away. The people behind MUSE made sure it could be broadcast with a mere couple of hundred watts and still deliver enough signal strength for decent pictures. It was also distributed on LaserDisc. Yes, LaserDisc did, at least sometimes, contain digital information, and there was an HD version of LaserDisc that wouldn’t be bettered until Blu-ray and HD-DVD.
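The satellite arithmetic is worth a back-of-the-envelope sketch. Assuming a Ku-band downlink and typical direct-broadcast antenna gains (not MUSE's actual link budget), the standard free-space path loss formula shows why every watt mattered:

```python
import math

# Back-of-the-envelope arithmetic behind the "couple of hundred watts"
# claim. The frequency, antenna gains and distance are assumptions
# typical of 1980s-90s direct broadcast satellites, not figures from
# MUSE's actual link budget.

def fspl_db(distance_km, freq_ghz):
    """Free-space path loss in dB (standard formula)."""
    return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

DISTANCE_KM = 36_000   # roughly geostationary altitude
FREQ_GHZ = 12.0        # Ku-band broadcast downlink (assumed)

TX_POWER_DBW = 10 * math.log10(200)   # a 200 W transponder
TX_GAIN_DBI = 35                      # shaped continent beam (assumed)
RX_GAIN_DBI = 33                      # small home dish (assumed)

received_dbw = (TX_POWER_DBW + TX_GAIN_DBI + RX_GAIN_DBI
                - fspl_db(DISTANCE_KM, FREQ_GHZ))
received_watts = 10 ** (received_dbw / 10)
# Path loss alone is around 205 dB, so even hundreds of watts arrive
# at the dish as only a few picowatts.
```

With so little signal to play with, a format that tolerated a weak carrier was worth almost any amount of encoder complexity, which goes some way to explaining how baroque MUSE was allowed to become.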

So HD could have happened twenty years before it did. The reason it didn’t is twofold. First, it quickly became clear that these analogue, hybrid and very early digital systems would soon be eclipsed by the digital systems we know now, with familiar codecs from the MPEG family; they were a long-winded, expensive technological cul-de-sac. Second, the push perhaps came too early, before the technology base was really ready. Perhaps as a result, the equipment could be swingeingly expensive; the longevity of MUSE broadcasting might have something to do with the desperation of early adopters to get a reasonable amount of value out of their five-figure TVs.

Perhaps the moral of the story is not to be an early adopter. That 8K Olympic broadcast may not, after all, really represent the shape of things to come.


Phil Rhodes

Phil Rhodes is a Cinematographer, Technologist, Writer and above all Communicator. Never afraid to speak his mind, and always worth listening to, he's a frequent contributor to RedShark.
