
HDR technology and technique: what you need to know, part one

4 minute read

We can't demonstrate HDR on your standard-dynamic-range monitor, so compare to the extra-low-dynamic-range image at right.

In this introductory article, Phil Rhodes kicks off a new series on HDR by taking a step back and explaining high dynamic range display tech and what it really means to display HDR images.

There's currently little debate that high dynamic range pictures look great. Of course, to some extent we've been used to HDR pictures for quite a while, with several cameras now capable of shooting pictures where the brightest whites are thirteen or fourteen stops brighter than the darkest blacks. Until recently, though, the only way to view these images was by cramming all of that range onto a conventional display, with the result that everything looks terribly grey and murky. We might even conclude that what we're really seeing right now is a revolution in display technology, since the camera technology needed to shoot HDR had already been developed for other reasons. Arriving at a coherent approach to all this is difficult under circumstances where the techniques we'll be using for many years to come haven't yet been formalised.

HDR display primer

In the interest of minimising the prerequisites for this article, let's cover the basics. The HDR display demos that have been drawing appreciative glances at trade shows for the past few years do a lot more than just make the picture brighter. For reasons that are difficult to fathom, this is a common misconception: a true HDR display does not simply make the backlight on an LCD brighter. The idea is to increase the peak brightness while leaving the shadow detail where it is, so that the image overall has more absolute contrast. Simply winding up the backlight would make the shadows brighter as well.
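If you like numbers, here's a rough sketch of that distinction. The nit values below are illustrative assumptions, not measurements of any real display; the point is that turning the backlight up moves black and white together, so the contrast, measured in stops, doesn't change.

```python
import math

def contrast_in_stops(peak_nits, black_nits):
    """Contrast between peak white and black, expressed in stops (doublings)."""
    return math.log2(peak_nits / black_nits)

# A conventional LCD: backlight leakage means black is never quite zero.
sdr_peak, sdr_black = 100.0, 0.1
print(contrast_in_stops(sdr_peak, sdr_black))            # ~10 stops

# "Just turn the backlight up": peak and black rise together,
# so the contrast in stops stays exactly where it was.
print(contrast_in_stops(sdr_peak * 10, sdr_black * 10))  # still ~10 stops

# True HDR: peak rises while black stays put, so the contrast genuinely grows.
print(contrast_in_stops(1000.0, 0.1))                    # ~13.3 stops
```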

Atomos promotes its new recorder with HDR display at NAB 2016.

Manufacturing displays which will do this is not necessarily very easy, although it'll become easier with the greater penetration of OLEDs. LCD displays suffer from the issue that black is never quite black. The backlight is a big panel of white light produced by (generally) some LEDs, and we filter it to create colours and shades. Creating black requires all of that light to be filtered out. This works okay, but there's a limit to how much light an LCD pixel can block, and some light still escapes in areas which should properly be entirely black. Even so, Canon has shown TFT-LCD monitors with convincing HDR performance.

Compare this with an OLED display, where each pixel is – broadly – a light-emitting diode. It's not a light-emitting diode like the 'on' light in your computer, but it is certainly something which can be switched off. When it's off, it's off. It's black, entirely so – to the point where some people deliberately set OLED monitors to emit a very tiny amount of light in black areas, because otherwise they're too high in contrast compared to the displays that most people will actually be watching. Sony's BVM-X300, possibly the world's finest video monitor at the time of writing, uses this technology.
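To see why those true blacks matter so much, here's another back-of-the-envelope sketch; the blocking ratio and black levels are assumptions picked purely for illustration.

```python
peak_nits = 400.0

# LCD: even a fully 'closed' pixel passes a small fraction of the backlight.
lcd_leakage = 1 / 1500.0                 # assumed limit on how well the panel blocks light
lcd_black = peak_nits * lcd_leakage      # roughly 0.27 nits in a 'black' area

# OLED: a switched-off pixel emits essentially nothing.
oled_black = 0.0005                      # nits, effectively black

print(f"LCD contrast:  {peak_nits / lcd_black:,.0f}:1")   # capped by leakage
print(f"OLED contrast: {peak_nits / oled_black:,.0f}:1")  # orders of magnitude higher
```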

The third way is a TFT display with a backlight made out of a matrix of conventional LEDs, whether white or some combination of colours, which can be switched on or off depending on how bright the picture needs to be at that location. Each LED backlights a small area of pixels, so it isn't possible to switch alternate pixels to the absolute brightest and absolute darkest values simultaneously. For most picture content, though, it works pretty well. This is the approach used by Dolby in many of the demos they've given.
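A heavily simplified sketch of that zoned-backlight idea is below. It is not Dolby's (or any manufacturer's) actual algorithm, and the zone size is an arbitrary assumption, but it shows the principle of one LED serving a block of pixels.

```python
import numpy as np

def local_dimming(image, zone=8):
    """image: 2D array of target brightness in [0, 1]; returns (backlight, pixel_drive)."""
    image = np.asarray(image, dtype=float)
    backlight = np.zeros_like(image)
    h, w = image.shape
    for y in range(0, h, zone):
        for x in range(0, w, zone):
            block = image[y:y + zone, x:x + zone]
            # One LED per zone: drive it to the brightest target in that zone.
            backlight[y:y + zone, x:x + zone] = block.max()
    # The LCD pixels then attenuate the zone's light down to the target level;
    # zones that are entirely dark get a dim (or off) LED, so their blacks stay black.
    pixel_drive = np.divide(image, backlight,
                            out=np.zeros_like(image), where=backlight > 0)
    return backlight, pixel_drive
```

Because one LED serves a whole block, a single bright highlight lifts the backlight for every pixel in its zone, which is exactly the limitation described above: neighbouring pixels can't sit at the absolute brightest and darkest values at once.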

Displaying HDR images

So, those are the display technologies, and all of them are capable of creating soaring heights of brightness and deep, inky black shadows simultaneously if we engineer and drive them correctly. HDR displays aren't just brighter, they're higher in contrast. The question for a lot of people is how this material should be shot. It's currently quite normal for very upscale productions to produce an HDR finish, especially via online distribution where it's arguably a bit easier to roll out new technologies.

Good-quality cameras, such as this Panasonic Varicam 35, have been capable of shooting HDR for some time.

The short answer is that well-exposed, properly-lit material shot on a camera capable of more than twelve stops of dynamic range is quite often already suitable for HDR finishing. Cameras have been capable of shooting high dynamic range images for some time; the problem has been displays. Part of the reason that unprocessed images from modern cameras look so flat on a conventional display is that they often encompass up to fourteen stops of dynamic range, whereas a Rec. 709 display is only capable of actually outputting somewhere between seven and nine stops, depending on who you ask and what their predilections are. The easy way to understand this is to think of a movie containing a shot of a sunset. The sun on the screen does not cast our shadow on the back wall of the cinema: the real object was massively brighter than the projector can achieve. The projector has much, much lower dynamic range than the real scene. The camera probably did much better.
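To put rough numbers on that gap (each stop being a doubling of brightness, and taking the middle of the seven-to-nine figure for Rec. 709):

```python
camera_stops = 14      # what a modern camera can capture, as above
display_stops = 8      # a Rec. 709 display, splitting the "seven to nine" figure

print(f"Scene contrast captured:    {2 ** camera_stops:>6,}:1")   # 16,384:1
print(f"Display contrast available: {2 ** display_stops:>6,}:1")  # 256:1
# Squeezing the first range into the second is why ungraded footage looks flat.
```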

SmallHD's range now includes LCD-based HDR displays.

HDR, fundamentally, is designed to put that sun closer to its real-world brightness, to expand the gap between black shadows and the sun. Matching reality would be unpleasantly dazzling; that's one of the problems that's been identified with HDR. Many systems suggest a maximum brightness of 1,000 nits, which is achievable by the BVM-X300. Things like the domestic LG OLED55E6 achieve perhaps 600 nits, which is still six times the brightness of most Rec. 709 displays. High power can cause fatigue in the viewer, which is a cautious way of saying that it's difficult to watch large extremes of contrast for long periods. This should curtail the tendency of producers to wind new techniques up to 11 in order to justify the cost of using them, although there's an uncomfortable parallel with stereo 3D: subtlety is crucial to avoiding viewer fatigue, but it is still very rarely practised. HDR is probably less risky, as domestic TVs will lack the power to really dazzle and, in any case, viewers are likely to reach for the contrast control if things become genuinely uncomfortable.
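As a quick sanity check on those figures, using the peak brightnesses quoted above and the conventional 100-nit reference white of a Rec. 709 grade:

```python
import math

rec709_peak = 100       # nits; conventional reference white for an SDR grade
displays = {"LG OLED55E6": 600, "Sony BVM-X300": 1000}   # peak nits, as quoted

for name, peak in displays.items():
    ratio = peak / rec709_peak
    print(f"{name}: {ratio:.0f}x Rec. 709 peak, "
          f"about {math.log2(ratio):.1f} stops of extra highlight headroom")
```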

This promotion of Dolby's Vision standard at the corner of Hollywood Boulevard and Highland Avenue in Los Angeles cost a lot of money.

And a final word: yes, this is all driven by the desire of the TV manufacturers to sell more TVs. The fact that a few technical people happen to quite like the way HDR looks is secondary. There's a serious question to answer about whether the public will continue to tolerate these rolling TV upgrades, and what the manufacturers will do once they have sold everyone a 4K, 120fps, 3D, HDR OLED. There's only so far things can go. So, much as radio did not replace books, and cinema did not replace radio, and TV did not replace cinema, we might even assume we're approaching something of a steady state, like the one the audio part of the industry has been in for a decade or more.

 
