
HDR: de-confusing the standards


Dolby rendering of the effects of HDR

RedShark Replay: For a new technology that looks set to be deployed without any damaging format wars, there is still plenty of confusion regarding HDR standards and what you should be looking for when thinking of purchasing a new TV set. Here’s the information you need.

HDR is most certainly the next ‘big thing’. Unlike resolution increases, HDR promises a very tangible improvement in viewable image quality, at least on the right monitor and in the right viewing circumstances. But HDR is not always what it seems at first glance, especially when it comes to choosing a monitor or television. There are various competing standards and specifications on offer, which drastically affect the viewing experience you will have. It is enough to give even the most ardent video experts a headache.

If you are in the market for an HDR display, I will attempt to clear up some of the confusion here in a way that I hope mere mortals can understand. So forgive me for not delving fully into the deep technical minutiae of the EBU and SMPTE standards; I will leave that to our technical guru, Phil Rhodes, at some point. What I want to give here is a nutshell overview of what HDR is really all about when it comes to choosing a display.

The Brightest Whites Doorstep Challenge

In order to understand the effect our choice of display will have, we need to understand a few basic points about HDR. One popular conception is that HDR simply offers much more screen brightness, the theory being that a brighter display means the brightest whites can be shown in a much more realistic way. This is correct, sort of, but it glosses over a lot of the reasons behind it, as well as what the brightest whites really are, or even whether we are really talking about white tones at all! HDR is not, as it might seem, about showing a conventional picture, only brighter. And this is an important point to bear in mind when it comes to your choice of display.

Regardless of the HDR standard being used, one of its main purposes is to give you greater displayed dynamic range. As we know, the main problem we have with image acquisition is highlights. On a standard display, reference white is specified at 100 nits. So far, so simple, and you might therefore assume that on a good HDR display this same white would now be shown at well over 1,000 nits.

Not so. And this is the crux of HDR as it stands. That same white level in HDR will still be roughly the same brightness as it is on a standard display, as in fact will the average picture brightness overall. This is because HDR is designed to handle the detail in highlights much better. Take a look around you. Even in bright sunshine, is the white piece of paper on your desk as bright as the sun, or as bright as the specular highlight reflecting off a polished metal object? Of course not. What HDR gives us is the ability to replicate that difference in brightness, and the detail contained within it, much more realistically and accurately.

In other words, the average brightness of subjects such as human faces, room illumination and so on won't be that much different in HDR to SDR (Standard Dynamic Range), when graded well. In fact, most of the tonal space in the specified transfer function curves is still given over to these parts of the picture. What HDR does have is much, much more headroom above the conventional 100 nit level, which gives greater creative freedom in the grade and makes textures, for example sea water glinting in sunshine or the surface of polished copper, much more realistic. Bright light shining on a human face, too, finally becomes a creative possibility without losing skin detail or colour.
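To put a rough number on how much of the signal is reserved for that headroom, here is a quick back-of-the-envelope sketch using the published ST-2084 (PQ) constants. It simply converts a few absolute luminance values into normalised PQ signal levels; the specific nit values chosen are only illustrative.

```python
# Rough sketch: how much of the PQ (ST-2084) signal range sits above SDR reference white?
# Constants are the published ST-2084 values; the sample luminances are just illustrative.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_encode(nits: float) -> float:
    """Convert absolute luminance (cd/m2, up to 10,000) to a normalised PQ signal level (0-1)."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y**M1) / (1 + C3 * y**M1)) ** M2

for nits in (0.1, 1, 10, 100, 1000, 4000, 10000):
    print(f"{nits:>7} nits -> PQ signal {pq_encode(nits):.3f}")

# 100 nits (SDR reference white) lands at roughly 0.51, i.e. around half of the
# PQ signal range is reserved for highlights brighter than SDR white.
```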

Much like having greater dynamic range in sound, with more headroom for loud effects such as explosions and the ability to keep quieter passages intelligible, HDR gives the opportunity to be much more subtle, as well as ‘in your face’ when required.

But this very fact causes a bit of a headache when it comes to choosing the right display and sorting through the different capabilities. Current display technology is still quite limited: we still have to choose between OLED and LCD. As we will see later, the various competing HDR standards have very different approaches to displaying such images, and not all of them will be suitable for a general viewing environment.

Because of this, when you choose a display to view HDR on, you are faced with a trade-off, at least with current display technology: a lower maximum luminance but better blacks (OLED), or a higher luminance capability but slightly raised blacks (LCD). Yer pays yer money, yer takes yer choice, and it is for this reason that the UHD Alliance specifies two different display luminance tiers to account for it.


Elephant in the room

When it comes to displays, it is therefore very important to be aware of which standards a given set supports. The screen you choose really should be based upon what you view and, importantly, where.

We are all used to being able to adjust our television's brightness to account for viewing conditions. In bright sunshine you may well want to turn the brightness up. Unfortunately, some HDR standards come with a fairly large limitation. Both HDR10 and Dolby Vision are displayed on ST-2084 (PQ) compliant monitors; these are the two best known HDR standards and are seen as the current pinnacle of quality viewing. What you might not know about them is that, whilst both are designed to adapt to the capabilities of the monitor or television being viewed, they are also designed to have the maximum available dynamic range of that display on tap at all times.

This means that if you want to adjust the brightness of the monitor to account for ambient light, you'll be out of luck. Viewed in a bright, sunny room, some scenes may appear very dark, whilst in a dark viewing environment they could appear overly bright and cause eye strain. Furthermore, while both standards are designed to adapt to different monitors' dynamic range capabilities, there is no single standardised way to perform this remapping of the transfer curve, so from monitor to monitor the results could be very different indeed, as the sketch below illustrates.
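To see why that matters, here is a deliberately simplified sketch of two plausible ways a 1,000 nit display might squeeze content mastered for a 4,000 nit peak into its own range: a hard clip versus a soft highlight roll-off. Neither is any manufacturer's actual algorithm, and the knee point is an arbitrary choice; they are hypothetical examples of the kind of divergence you get when no single remapping method is mandated.

```python
# Two hypothetical ways a 1,000 nit display could render luminances from content
# mastered for a 4,000 nit peak (after decoding the PQ signal to nits).
# Neither is a real manufacturer's tone mapper; they simply show how the same
# scene luminance can end up at very different brightness on different sets.

DISPLAY_PEAK = 1000.0   # nits this display can actually produce
KNEE_START = 400.0      # nits where the soft roll-off begins (arbitrary choice)
MASTER_PEAK = 4000.0    # nits the content was mastered for

def hard_clip(nits: float) -> float:
    """Reproduce the signal exactly, then clip anything above the display peak."""
    return min(nits, DISPLAY_PEAK)

def soft_rolloff(nits: float) -> float:
    """Track the signal up to a knee, then compress highlights towards the peak."""
    if nits <= KNEE_START:
        return nits
    # Compress the region above the knee so the mastering peak just reaches the display peak.
    excess = (nits - KNEE_START) / (MASTER_PEAK - KNEE_START)
    return KNEE_START + (DISPLAY_PEAK - KNEE_START) * min(excess, 1.0) ** 0.5

for scene_nits in (100, 500, 1000, 2000, 4000):
    print(f"{scene_nits:>5} nits mastered -> "
          f"clip: {hard_clip(scene_nits):6.0f} nits, "
          f"roll-off: {soft_rolloff(scene_nits):6.0f} nits")
```

A 1,000 nit highlight, for instance, comes out at 1,000 nits on one hypothetical display and around 645 nits on the other, from exactly the same signal.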

As a result, if you are looking to view Dolby Vision or HDR10 material, you generally need to have a good amount of control over your viewing environment and the ambient light.

This isn't the case with displays that cater for HLG (Hybrid Log Gamma). HLG was developed by the BBC and NHK as an open standard. Unlike HDR10 and Dolby Vision, HLG-capable displays can adjust their system gamma to suit their own peak brightness and the ambient light, so the viewing environment is not so critical.
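As a concrete example of that adaptability, ITU-R BT.2100 gives a simple formula for the HLG system gamma as a function of the display's nominal peak luminance, sketched below. The further adjustment for bright or dark surroundings is left to the related guidance documents and, ultimately, the manufacturer, so treat this purely as an illustration.

```python
import math

def hlg_system_gamma(peak_nits: float) -> float:
    """HLG system gamma from ITU-R BT.2100 for a display's nominal peak luminance.

    The reference value is 1.2 at a 1,000 nit peak; brighter displays use a
    higher gamma, dimmer ones a lower one. (The formula is quoted for roughly
    the 400-2,000 nit range; values outside that are extrapolation here.)
    """
    return 1.2 + 0.42 * math.log10(peak_nits / 1000.0)

for peak in (400, 600, 1000, 2000, 4000):
    print(f"{peak:>5} nit display -> system gamma {hlg_system_gamma(peak):.2f}")

# Broadcasters' guidance also allows nudging this gamma for the ambient light
# level; the precise adjustment is left to the display implementer.
```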

Complexity unravelled. Sort of...

One of the difficulties of explaining HDR in terms that mortals can understand is that many of the terms used are not really relevant to the vast majority of monitor and television customers. The various specifications and standards have a lot of crossover, too. In addition, there is plenty of complexity around the viewing environment and the effect HDR has on your eyes. That latter point will be of great importance to the creatives who shoot and grade productions, but for the average consumer what matters is that the display you buy does what you need it to do.

HLG is already supported by online services such as YouTube, as well as by HDMI 2.0b, the HEVC encoding standard and VP9. Dolby Vision and HDR10 are supported by most of the big television manufacturers; LG, for example, has announced that all of its HDR capable televisions will support Dolby Vision, HDR10 and HLG. In fact, any HDR capable television that can show internet-based services such as YouTube, as the majority can these days, will likely have HLG compatibility going forward.

It would seem, then, that we may not have the format wars mess that has plagued new technology in the past. Instead, monitor and television manufacturers are hedging their bets rather than placing all their eggs in one basket. But even if your display can show all standards of content, the caveats about the viewing environment still apply to Dolby Vision and HDR10 output.

Therefore, much more important than any of the various standards is where you will be viewing the display. More than anything else, this will either ruin your HDR experience and render it pointless, or give you the best viewing experience you have ever had.

If I were to hedge my bets on a personal level, I feel that HLG has good legs, simply because it is more accessible to a wider variety of content makers, adapts to different displays, and is backwards compatible with SDR systems. In other words, it offers a convenience that is not provided elsewhere. But don't ask me to place money on it! I hope, at least, that I have simplified things enough for most people to understand.

I realise that there are many different points to make about the different specifications and standards, so I will not have satisfied the hardened technical nuts. But as anyone who has attempted to read through an HDR white paper knows, such topics can run for many pages, with much of the information irrelevant for the majority of television and monitor customers. Right, I’m off to take some paracetamol!

