
This is a wonderful explanation of how the Dolby Vision grading process works


HDR is a whole new world for consumers and professionals alike. In this interview, senior colourist Toby Tomkins of Cheat sheds light on the current state of play when grading HDR, and how the Dolby Vision system affects grades.

Let’s start by grossly oversimplifying the process of colour grading.

There are numbers in the file created by the camera which represent the brightness of the scene.

There are numbers leaving the grading application which usually represent the brightness of the display.


Grading, grading everywhere

Both are subject to lots of mathematical manipulation, which we account for when we set up the camera, monitor and grading software. The difference between SDR and HDR lies in how that mathematical manipulation works and what capabilities it allows the whole system to have – how far the peaks of brightness and colour saturation can go. The job of the colourist is to make the subjective choices that take the camera data and turn it into an acceptable picture on a monitor, regardless of the capabilities of that monitor. In theory, grading HDR is, for most formats, not technically much different to grading SDR. The goals are just different, and HDR might even be less work.
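
To make that “mathematical manipulation” concrete, here is a minimal Python sketch comparing how the same normalised code value becomes light on a display-referred SDR path (a plain 2.4 gamma to a nominal 100 nit peak, an illustrative assumption) and on an HDR path using the PQ curve of SMPTE ST 2084, which maps code values to absolute luminance up to 10,000 nits. The function names and sample values are invented for illustration; the PQ constants come from the standard.

```
# Sketch: how one normalised code value (0.0 - 1.0) becomes light on an
# SDR display (simple 2.4 gamma, nominal 100 nit peak - an illustrative
# assumption) versus an HDR display using the PQ EOTF of SMPTE ST 2084.

# PQ constants as defined in SMPTE ST 2084
M1 = 2610 / 16384          # 0.1593017578125
M2 = 2523 / 4096 * 128     # 78.84375
C1 = 3424 / 4096           # 0.8359375
C2 = 2413 / 4096 * 32      # 18.8515625
C3 = 2392 / 4096 * 32      # 18.6875


def sdr_nits(code: float, peak: float = 100.0) -> float:
    """Display-referred SDR: a plain 2.4 power function scaled to the peak."""
    return peak * code ** 2.4


def pq_nits(code: float) -> float:
    """PQ EOTF: normalised code value to absolute luminance in nits."""
    e = code ** (1 / M2)
    return 10000.0 * (max(e - C1, 0.0) / (C2 - C3 * e)) ** (1 / M1)


for code in (0.1, 0.5, 0.75, 1.0):
    print(f"code {code:4.2f}: SDR {sdr_nits(code):7.1f} nits, PQ {pq_nits(code):8.1f} nits")
```

At half scale the SDR path gives roughly 19 nits while PQ gives roughly 92; at full scale it is 100 nits versus 10,000, which is the gulf the rest of this article is about.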

Toby Tomkins is a senior colourist at Cheat, a company which became certified to produce Dolby Vision material in early 2019. He describes the process in terms of the differences between HDR and SDR. “So much of what we were doing in SDR was the remapping of tone, especially in highlights. Traditionally, going back to the film days, you'd have the digital intermediate which would get printed onto a positive stock and that stock had a fairly strong S-curve of contrast, compressing the shadows, compressing the highlights… then it became digital, with print LUTs and colourists creating their own curves. But with HDR you don't necessarily want to compress those highlights.”


Film has some inbuilt characteristics which deal with a certain amount of brightness control

So, in HDR, a colourist is, as Tomkins puts it, “doing a lot less work in the highlights.” In any case, he goes on, much of an HDR scene exists in the traditional brightness ranges: “All the stuff you learned as a colourist doesn’t go out the window. Anything with diffuse light will tend to sit in the 100 nit range. You're using the eye you trained before with SDR. The only difference is the highlights - neon, anything that's bright and saturated. You don't have to compromise saturation in the highlights to create a pleasing image.”

Making things easier

So HDR might actually make things easier, at least until we hit Dolby Vision. The idea of Dolby’s process is that less capable displays have a decoder that knows the capability of the display and will process the image appropriately so that it looks the best it can. Yes, that means the display is reinterpreting the grading decisions made earlier, and yes, that changes things for the person making those decisions, because Dolby’s algorithm makes changes based on a complex interpretation of what’s in the scene. The relationship between the numbers coming out of the grading system and the light coming out of the screen is no longer intended to be fixed; instead, the subjective experience is intended to be similar.

It’s a reaction to a longstanding problem: we have long accepted that consumer displays are not always (or often) likely to precisely (or even vaguely) match a reference display, and that’s doubly true for HDR, where even at the high end it is currently a struggle to find monitors that really match the paper specifications. Dolby Vision makes a pragmatic choice by allowing the display to modify the image to best suit its capabilities, which requires the colourist to develop an understanding of how the process will treat things. A preview is available, and the user interface allows the colourist to look at either the HDR grade or an SDR downconversion of it performed by Dolby’s algorithm, the idea being that any compatible consumer TV should be able to display something between those extremes.
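
One way to picture “something between those extremes”: given the luminance a pixel was graded to in HDR and the value the SDR downconversion would give it, a display whose peak lies between the two targets lands somewhere in between. The sketch below is a generic illustration of that idea only, not Dolby’s published method; the log-space blend, the function name and the sample values are all assumptions.

```
import math

# Generic illustration only, NOT Dolby's published method: a display
# whose peak sits between the 100 nit SDR trim and the 1000 nit HDR
# master gets a picture blended between the two grades, weighted by
# where its peak falls between the two targets on a log scale.

def blend_for_display(hdr_nits: float, sdr_nits: float,
                      display_peak: float,
                      sdr_peak: float = 100.0,
                      hdr_peak: float = 1000.0) -> float:
    """Blend a pixel's HDR-grade and SDR-trim luminance for a given display."""
    # How far the display sits between the SDR and HDR targets (0..1).
    t = (math.log10(display_peak) - math.log10(sdr_peak)) / (
        math.log10(hdr_peak) - math.log10(sdr_peak))
    t = min(max(t, 0.0), 1.0)
    # Blend the two grades, also in log space.
    log_out = (1.0 - t) * math.log10(sdr_nits) + t * math.log10(hdr_nits)
    return 10.0 ** log_out


# A highlight graded to 800 nits in HDR but trimmed to 95 nits in SDR:
for peak in (100, 400, 1000):
    print(peak, "nit display ->", round(blend_for_display(800.0, 95.0, peak), 1), "nits")
```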


Looking at two versions of an image simultaneously is... interesting

Consistency

Tomkins describes it as “an algorithm doing the tone mapping of those highlights back to SDR, so I'm creating something in front of another virtual colourist who's doing the tone mapping for me. It's quite a head-scratcher. It analyses the whole shot, looking at the brightest thing that occurs in the frame and then it'll look at the average luminance across the shot and it will create an SDR trim of that. If you have a shot with a practical lamp in and you grade that in HDR and it looks natural and lovely, and then there's a closeup of two people lying in a bed, there's a lamp in the shot in the wide but not in the CU [and] the analysis will react differently. Whether or not this is an issue or a blessing I’ve yet to find out.”
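
What he is describing is a per-shot analysis whose results drive the SDR trim. Here is a minimal sketch of that idea, in the same spirit rather than any real implementation; the class, the function and the sample values are all invented for illustration.

```
# Sketch of the kind of per-shot analysis described above: scan every
# frame in a shot, record the brightest value and the average luminance,
# and let a downstream trim react to those numbers. This mirrors the
# spirit of shot-level metadata, not any real implementation.

from dataclasses import dataclass
from typing import Iterable, Sequence


@dataclass
class ShotAnalysis:
    max_nits: float   # brightest thing that occurs anywhere in the shot
    avg_nits: float   # average luminance across the whole shot


def analyse_shot(frames: Iterable[Sequence[float]]) -> ShotAnalysis:
    """`frames` is an iterable of flat per-pixel luminance arrays in nits."""
    peak = 0.0
    total = 0.0
    count = 0
    for frame in frames:
        peak = max(peak, max(frame))
        total += sum(frame)
        count += len(frame)
    return ShotAnalysis(max_nits=peak, avg_nits=total / count)


# Why the wide and the close-up behave differently: the practical lamp
# pushes the wide shot's peak (and average) up, so its SDR trim is driven
# by very different numbers than the lamp-free close-up.
wide = [[2.0, 40.0, 600.0, 80.0]]       # lamp in frame
closeup = [[2.0, 40.0, 55.0, 80.0]]     # no lamp
print(analyse_shot(wide))     # ShotAnalysis(max_nits=600.0, avg_nits=180.5)
print(analyse_shot(closeup))  # ShotAnalysis(max_nits=80.0, avg_nits=44.25)
```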

The awkwardness, though, is worthwhile. “The conversion from HDR to SDR isn't constant, so you never really know how it's going to be transformed. Having said that, the way in which it does the conversion from HDR to SDR is amazing. The algorithm is fantastic… the nice thing is that the tone mapping will reflect the monitors. So if the monitor has a peak luminance at 500 nits it will adjust the picture to have the same emotional experience as on a 1000 nit reference monitor. When it comes to tone it does a bloody good job of putting an HDR source into a small bucket.”
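
Putting an HDR source into a smaller bucket is, at its core, a highlight roll-off. The sketch below is a generic knee-and-shoulder curve, not Dolby’s actual mapping, and every number in it is an illustrative assumption: everything below a knee passes through untouched (diffuse scene content stays where the colourist put it), and the rest of the master range is compressed smoothly into whatever headroom a 500 nit panel has left.

```
# Generic knee-and-shoulder roll-off - NOT Dolby's actual mapping.
# Compresses a graded HDR value (in nits) into a less capable display.

def tone_map_to_display(nits: float,
                        master_peak: float = 1000.0,
                        display_peak: float = 500.0,
                        knee: float = 0.75) -> float:
    """Everything below `knee * display_peak` passes through untouched;
    the rest of the master range rolls off into the remaining headroom."""
    if master_peak <= display_peak:
        return min(nits, display_peak)
    threshold = knee * display_peak
    if nits <= threshold:
        return nits
    # Position within the compressed range: 0 at the knee, 1 at master peak.
    x = (min(nits, master_peak) - threshold) / (master_peak - threshold)
    # Shoulder strength chosen so the curve leaves the knee with slope 1.
    a = (master_peak - threshold) / (display_peak - threshold) - 1.0
    shoulder = x * (1.0 + a) / (1.0 + a * x)   # reaches 1.0 at the master peak
    return threshold + (display_peak - threshold) * shoulder


for value in (80, 400, 700, 1000):
    print(value, "->", round(tone_map_to_display(value), 1), "nits on a 500 nit panel")
```

With these illustrative numbers, 80 nits stays at 80, while 700 nits lands around 480 and the 1000 nit master peak arrives exactly at the panel’s 500 nit ceiling.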


Calibration creates just one of several layers of modification between a grading application and photons coming out of a display

Part of Cheat’s Dolby certification involves a Flanders Scientific XM310K, a 31” 4K LCD mastering display. Capable of 3000 nit peak brightness, the display relies on an LED backlight which Flanders describe as having “over 2000” zones. HDR monitoring is, as most people would concede, a difficult area, and Tomkins admits: “I think I will forever miss RGB reference OLEDs. They were perfect. They could hit the reference specifications. LCDs weren't saturated enough in the shadows and they had poor blacks. RGB OLEDs could literally hit the spec perfectly, with delta-Es below one across the range and perfect gamma. Every other monitor in HDR is a compromise.”
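
The “delta-Es below one” he mentions refer to a standard colour-difference metric: a ΔE in the region of 1 is commonly treated as the threshold of a just-noticeable difference. The original CIE76 formulation is simply the Euclidean distance between two colours in CIELAB (calibration tools usually report the more elaborate ΔE2000, but the idea is the same); the sample values below are invented for illustration.

```
import math

def delta_e_cie76(lab1: tuple[float, float, float],
                  lab2: tuple[float, float, float]) -> float:
    """CIE76 colour difference: Euclidean distance in CIELAB space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))


# A measured patch versus its reference value; below roughly 1.0 the error
# is generally considered invisible to a trained observer.
reference = (50.0, 20.0, -10.0)
measured = (50.3, 19.8, -10.4)
print(round(delta_e_cie76(reference, measured), 2))  # 0.54
```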

Either way, industry enthusiasm for HDR seems unabated, suggesting that it will eventually filter down to enough home users that, in time, it could be called a mainstream technology in the hands of users (it is in many ways already mainstream in postproduction). Tomkins describes consumer displays as “WOLEDs,” referring to the use of a white-emitting subpixel to boost peak brightness, with the result that they “don't have a great colour volume in the highlights. When people start to try to hit Rec. 2020 you'll get metamerism, where different people see things differently.”


Colour charts forever

So, much as progress in camera equipment seems to be decelerating toward a point of complete adequacy for current applications, there are still huge demands on display technology, and perhaps some fundamental limits on what can really be achieved. Perhaps that’s a good thing for some people, because it means that intelligent processing systems like Dolby Vision remain necessary in a way they might not if every HDR display in every lounge on the planet were trivially able to achieve exactly the same performance.
