HDMI 2.0 and 2.0a: Technology made for 4K and HDR

Written by Phil Rhodes

RedShark News

The more data images contain, the harder they become to move around effectively, which is where HDMI 2.0 and its recently added, HDR-friendly 2.0a revision come in.

"More!" screams the film industry, almost incessantly. More pixels, more frames per second, more stereoscopy, more bits per pixel. While there are arguments that this progression is reaching a natural conclusion, there's already plenty of material in existence that won't comfortably fit down some of the pipes that are commonly used to transport it. The ever-active Douglas Trumbull, creator of the ambitious Showscan system – 70mm film at 60 frames per second – is still at it, talking about 120fps 4K stereoscopic material. To put that in perspective, assuming that means each frame is 3996 by 2160 and 12 bits per channel, each (uncompressed) frame is about 37 megabytes.

For 120fps stereo, that's nearly nine gigabytes per second of data, enough to empty a dual-layer Blu-ray disc in around five and a half seconds, if a Blu-ray could be read that fast. At recent trade shows, Quantel has shown enormously capable implementations of its Pablo colour correction system that can handle movies with specifications such as these, a development made possible by the massive parallelism of modern information technology. Naturally, it'll never be stored like that – the maximum bitrate even for digital cinema packages is only 250 megabits per second, which would need expanding to handle material like this meaningfully – but once the pictures have been unpacked, they need to be transported to the display device.
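
Those figures are easy to sanity-check. A quick script, using the 3996x2160 frame size and 12 bits per channel from the paragraphs above, and assuming a dual-layer Blu-ray holds 50GB:

```python
# Back-of-the-envelope data rates for 120fps 4K stereoscopic material.
WIDTH, HEIGHT = 3996, 2160        # DCI 4K frame
BITS_PER_CHANNEL = 12
CHANNELS = 3                      # R, G, B
FPS = 120
EYES = 2                          # stereoscopic: two full streams

bytes_per_frame = WIDTH * HEIGHT * CHANNELS * BITS_PER_CHANNEL // 8
megabytes_per_frame = bytes_per_frame / 2**20

bytes_per_second = bytes_per_frame * FPS * EYES
gigabytes_per_second = bytes_per_second / 2**30

BLURAY_BYTES = 50 * 10**9         # dual-layer disc, 50GB
seconds_to_empty = BLURAY_BYTES / bytes_per_second

print(f"{megabytes_per_frame:.1f} MB per frame")      # ~37 MB
print(f"{gigabytes_per_second:.1f} GB/s sustained")   # ~8.7 GB/s
print(f"disc emptied in {seconds_to_empty:.1f} s")    # ~5.4 s
```

With consistent binary units the drain time comes out a shade under five and a half seconds; the exact figure depends on which flavour of megabyte you mix in, but the order of magnitude is the point.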



As we've discussed before, even version 1.4 of HDMI, which probably represents most of the installed base of products, is already a pretty capable standard. It can handle 10.2 gigabits (about 1.3 gigabytes) per second at 3.4 gigabits per channel, using hardware that is, in many ways, simpler than that used for three-gigabit SDI and is therefore more suitable for consumer equipment. SDI (at least the original, 270-megabit standard definition version) was designed with the constraint that it should use existing 75-ohm BNC cable installations. Because of this, all the pixels have to fit down one wire, meaning SDI has to be fast – three times faster, pixel for pixel, than HDMI, which sends pixels down three separate channels simultaneously.
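
The factor of three falls straight out of the lane counts. A rough sketch, assuming standard 1080p60 timing (2200x1125 total raster, 148.5MHz pixel clock) and HDMI's 10-bit TMDS symbols; the single-wire figure here is just the three lanes concatenated for comparison, not a real SDI mode:

```python
# Compare per-wire symbol rates for the same pixel stream.
PIXEL_CLOCK = 148.5e6      # 1080p60: 2200 x 1125 total raster at 60Hz
TMDS_BITS = 10             # HDMI sends each 8-bit value as a 10-bit symbol

hdmi_lane_rate = PIXEL_CLOCK * TMDS_BITS      # one colour channel per lane
single_wire_rate = hdmi_lane_rate * 3         # all three channels on one wire

print(f"HDMI per-lane rate: {hdmi_lane_rate / 1e9:.3f} Gbps")   # 1.485 Gbps
print(f"single-wire equivalent: {single_wire_rate / 1e9:.3f} Gbps")
print(f"ratio: {single_wire_rate / hdmi_lane_rate:.0f}x")       # 3x
```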

The comparison of HDMI and SDI can tell us something about why faster is harder. Cables and connectors for both formats are specified for impedance and capacitance, properties which determine how much a signal degrades on its passage down a conductor. The details are outside the scope of an article like this, but these things are easier to control with the coaxial cable and BNC connectors of the more professional standard. The mitigating factor for HDMI is that each channel is sent over two wires, one carrying the signal and the other an inverted version of it. Because distortions and interference are imposed equally on both wires, inverting one of them at the receiving end and then summing the two removes the interference. This technique, differential signalling, is widely applied in data communications: USB, ethernet, SATA SSDs and CFast cards, among others. It is also used for AES digital audio, mainly because AES, like SDI, was made to replace an existing standard which happened to offer enough connector pins to do it. Given more signal lines, SDI would probably have used the same approach and the higher-bitrate versions would be more reliable over long distances.
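
The cancellation at the heart of differential signalling is easy to demonstrate numerically. A minimal sketch, idealised in that the noise is assumed to land perfectly equally on both wires, which real links only approximate:

```python
import random

def transmit(bits, noise_amp=0.5):
    """Send bits over an idealised differential pair and recover them."""
    recovered = []
    for b in bits:
        s = 1.0 if b else -1.0
        # Common-mode noise lands identically on both wires.
        n = random.uniform(-noise_amp, noise_amp)
        wire_pos = s + n       # the signal
        wire_neg = -s + n      # its inverted twin
        # Receiver inverts the second wire and sums:
        # (s + n) + -(-s + n) = 2s, so the noise vanishes.
        recovered.append(wire_pos + (-wire_neg) > 0)
    return recovered

bits = [random.random() > 0.5 for _ in range(1000)]
assert transmit(bits, noise_amp=5.0) == bits
```

Even with noise five times the signal amplitude, every bit survives in this model; in practice the two wires pick up slightly different noise, so the cancellation is very good rather than perfect.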

Slow transition

HDMI is a cost-constrained consumer standard that's being expected to perform some fairly significant electronic gymnastics. While the high-speed requirement is mitigated by the multiple channels, the complexity is significant and, up to HDMI 1.4, there was not sufficient bandwidth, for instance, to handle beyond-HD resolutions at more than 24 frames per second. This is inadequate for full 4K stereoscopy or for 4K material at the higher frame rates that might be used for sports. Achieving more speed has relied on general advances in semiconductor technology and has resulted in HDMI 2.0, which was released as a standard in 2013 but has only recently become common on displays and other devices intended to create or handle 4K material. Because of the comparatively slow takeup of HDMI 2.0, in fact, some UHD displays suffer the unfortunate effect of being unable to feed their very capable TFT panels at maximum capacity over HDMI, whereas they can, for instance, via DisplayPort.

Encouraging numbers

The numbers associated with HDMI 2.0 are encouraging. The key improvement is that the 3.4Gbps per-channel data rate of 1.4 is almost doubled to 6Gbps, providing for 60-frame 4K formats and a total throughput of 18Gbps. HDMI 2.0 also supports the Rec. 2020 colourspace (which could be unofficially transmitted over previous standards, requiring manual display configuration for correct behaviour), the cinemascope-esque 21:9 aspect ratio, and various improvements to multichannel audio and 3D handling. This also brings improved capability in HD modes, with 1080p60 stereoscopy (for a total of 120 complete frames per second) on offer.
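
That 6Gbps figure isn't arbitrary: it falls out of standard 4K60 timing. A sketch, assuming the common 4K60 total raster of 4400x2250 (a 594MHz pixel clock) and HDMI's 10-bit TMDS symbols:

```python
# Derive HDMI 2.0's per-channel rate from 4K60 timing.
TOTAL_W, TOTAL_H = 4400, 2250   # active 3840 x 2160 plus blanking
REFRESH = 60                    # Hz
TMDS_BITS = 10                  # 8 data bits per 10-bit line symbol

pixel_clock = TOTAL_W * TOTAL_H * REFRESH   # 594 MHz
per_channel = pixel_clock * TMDS_BITS       # 5.94 Gbps, marketed as "6 Gbps"
total = per_channel * 3                     # 17.82 Gbps, marketed as "18 Gbps"

print(f"pixel clock: {pixel_clock / 1e6:.0f} MHz")     # 594 MHz
print(f"per channel: {per_channel / 1e9:.2f} Gbps")    # 5.94 Gbps
print(f"total:       {total / 1e9:.2f} Gbps")          # 17.82 Gbps
```

The headline 18Gbps is the raw symbol rate; because of the 10-bit encoding, only eight of every ten bits are picture data.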

HDMI is a complex standard with a wide variety of capabilities and some devices choose to implement only a selection of them, so users will still need to research carefully, pore over manuals and be quite cautious to ensure that everything works as well as it should. The tendency of HDMI devices to invisibly negotiate a picture that looks reasonable without actually being technically ideal is a common problem. A lot of devices don't offer easy ways to specify exactly how things should behave. Even devices using HDMI interface components which are capable of advanced features may not actually present them to the user.

These concerns aside, the improved capability of HDMI 2.0 is naturally welcome. The increased speed may limit cable lengths to less than those achievable under HDMI 1.4 and the connectors are no more robust, so there will still be some need to convert HDMI to SDI (or potentially several runs of SDI). The HDMI 2.0a addition of high dynamic range support, potentially facilitating Dolby's fascinating Dolby Vision technology, only arrived in April 2015 and will trickle down the distribution chain in due course. Again, 48-bit colour support would probably be sufficient to implement HDR anyway, given manual configuration of a display to expect HDR material, but official support should ease the configuration workload. In the end, HDMI 2.0 still isn't fast enough to support Trumbull's ambitious 120-frame 4K formats, but that's an extreme situation. Neither, for that matter, is DisplayPort, recent versions of which are almost twice as fast as any version of HDMI. It would take four HDMI 2.0 cables to transport 2160p120 in 3D, so there are clearly places to go – notwithstanding, of course, the debate about how worthwhile it is to go there.
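
The four-cable figure can be checked with the same arithmetic. A rough estimate, assuming 3840x2160 at 8 bits per channel, ignoring blanking, and allowing for the 20% cost of TMDS encoding:

```python
import math

# How many HDMI 2.0 links would 2160p120 stereoscopic need?
WIDTH, HEIGHT = 3840, 2160
BITS_PER_PIXEL = 24             # 8 bits x 3 channels
FPS = 120
EYES = 2                        # stereoscopic

payload = WIDTH * HEIGHT * BITS_PER_PIXEL * FPS * EYES   # ~47.8 Gbps of pixels
HDMI20_EFFECTIVE = 18e9 * 8 / 10                         # 18 Gbps raw, minus TMDS overhead

cables = math.ceil(payload / HDMI20_EFFECTIVE)
print(f"payload: {payload / 1e9:.1f} Gbps, cables needed: {cables}")  # 47.8 Gbps, 4 cables
```

Higher bit depths or DCI-width frames would push the requirement up further still, which is rather the article's point.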
