RedShark News

11 Dec

High Dynamic Range (HDR) Video

HDR Beach. Image: Shutterstock

HDR, or High Dynamic Range, is a way to record the massive distance between light and dark that exists in nature. Used well, it can produce amazing images. Overdone, it can make nature look like a video game.

Even with normal HD video, electronic systems began to approach the resolution of a 35mm release print, and modern cameras such as Sony's F65 offer resolution at least as good as many negative stocks while simultaneously expanding colorimetry beyond the common, rather limited Rec. 709 range. What remains, then, is dynamic range, and that quirk of highlight handling that makes film so flattering; that's something electronic sensors have historically struggled to match.

Attempts to mitigate this problem involve a lot of clever engineering and advancements in semiconductor manufacturing, but impatient users of current technology have nonetheless come up with ways to get more out of the existing hardware.

Multiple exposures

The simplest approach is to take several pictures of a scene with the camera set up to vary the exposure of each image. There's even a feature to do exactly that automatically on Canon DSLRs, referred to as “exposure bracketing”, and HDR enthusiasts often use up to five exposures per frame, providing a huge luminance range. Of course, the camera and the scene must both be completely unmoving between shots, which makes the technique difficult to use for motion picture photography. This didn't stop Red from trying, though, and their HDRx feature is effectively this – two successive exposures, which are recorded as two separate video streams. It can be effective on scenes with little or no motion, although movement in the scene can cause visible problems and it remains a feature for very particular circumstances only.
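The bracketed exposures can be combined by letting each frame vote for the scene's true brightness, weighted so that clipped or noise-dominated pixels count for little. Here's a minimal sketch of that idea; the function names and the hat-shaped weighting are illustrative choices, not any camera's actual pipeline, and pixel values are assumed to be linear in the range 0 to 1 with known exposure times.

```python
# Sketch of merging bracketed exposures into one radiance estimate.
# Assumes linear pixel values in [0, 1] and known exposure times.
import numpy as np

def weight(p):
    """Hat-shaped weight: trust mid-tones, distrust clipped pixels."""
    return 1.0 - np.abs(2.0 * p - 1.0)

def merge_exposures(images, times):
    """Estimate scene radiance from a set of bracketed exposures.

    images: list of arrays with linear values in [0, 1]
    times:  matching exposure times in seconds
    """
    num = np.zeros_like(images[0], dtype=np.float64)
    den = np.zeros_like(images[0], dtype=np.float64)
    for img, t in zip(images, times):
        w = weight(img)
        num += w * img / t   # each frame votes for radiance = value / time
        den += w
    return num / np.maximum(den, 1e-8)

# A pixel clipped at 1/50s is still recoverable from the 1/400s frame.
bright = np.array([1.0, 0.5])       # 1/50 s: first pixel clipped
dark = np.array([0.9, 0.0625])      # 1/400 s: first pixel usable
radiance = merge_exposures([bright, dark], [1 / 50, 1 / 400])
```

The zero weight at full white is what lets the shorter exposure take over wherever the longer one has clipped, which is exactly the benefit bracketing is after.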

This is not a new technique. Multiple exposures for scenes containing a lot of contrast were particularly relevant in the earliest days of photography, when emulsions were primitive and would struggle to equal the dynamic range of even quite everyday modern digital cameras. As long ago as the 1850s, the French photographer Gustave Le Gray was, at the suggestion of his contemporary Hippolyte Bayard, taking twin exposures of seascapes to capture detail in both water and sky, then combining the two when printing his negatives in a process called combination printing (which is important because it was among the first techniques to demonstrate that the camera could lie, and lie convincingly). More recently, news camera operators have long been used to gradual iris pulls when tilting down from a bright sky to a dimmer foreground scene, which you could argue is a high dynamic range technique.

Digital Signal Processing (DSP)

The principal modern development is in the use of digital signal processing (read: Photoshop) to intelligently combine the exposures. Future techniques might involve the use of optical flow interpolation to mitigate the problems of sequential exposure in moving-image applications. The total exposure range in an HDR image is potentially massive, and it is more accurate to think of the information as a data set describing the luminance of a scene from which an image can be derived, as opposed to being an image in and of itself.

Since no displays are available which provide enough luminance range to accurately display an image of, say, the sun, we must compress the range into something that's viewable on common (often Rec. 709) equipment. This can be done simply by dividing the luminance values down to an appropriate range (that is, simply turning down the contrast), or by using curves, creative grading tools, or a lookup table, which may involve more or less creative grading decisions as the situation demands. It is for this reason, among others, that we would pursue HDR imaging for motion picture work: to provide huge, unprecedented flexibility in the grade. HDR processing for stills frequently uses tone mapping, an approach which takes account of local contrast in various areas of the image in order to make best use of the available information overall, and that's how the slightly eerie, almost glowing HDR demonstration images we see are produced.
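One simple way to compress an unbounded luminance data set into a displayable range is a global tone-mapping curve. The sketch below uses Reinhard's well-known global operator, L_out = L / (1 + L), as an example; it is just one of many possible curves, and real grading would more likely involve LUTs or local operators, but it shows how six decades of scene luminance can be squeezed into a 0-to-1 display range.

```python
# Global tone mapping with Reinhard's operator: L_out = L / (1 + L).
# One common curve among many; shown here purely as an illustration.
import numpy as np

def reinhard_tonemap(luminance, key=0.18):
    """Map unbounded scene luminance into [0, 1) for display.

    key: the target mid-grey; the scene's log-average luminance
    is scaled to sit at this value before compression.
    """
    eps = 1e-8
    log_avg = np.exp(np.mean(np.log(luminance + eps)))  # geometric mean
    scaled = key * luminance / log_avg                  # expose for mid-grey
    return scaled / (1.0 + scaled)                      # compress highlights

hdr = np.array([0.01, 0.18, 1.0, 50.0, 5000.0])  # several decades of range
ldr = reinhard_tonemap(hdr)
```

Note that the curve never actually reaches white: ever-brighter highlights are compressed ever harder rather than clipped, which is the subjectively film-like rolloff the article describes.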

HDR through effective noise reduction

Making these approaches practical and easy for everyday photography is something that the industry has struggled to achieve. Rather imperfect experiments with sequential-exposure HDR aside, various techniques have been proposed and tested, including sensors that group photosites of varying sensitivity, much as current sensors group photosites sensitive to different colours. Proper postprocessing might allow such a device to produce unprecedentedly high dynamic range in a single exposure and solve the problem, although if there were no problems with this approach I suspect we would have seen it more widely deployed by now. Perhaps recent developments in CMOS image sensor fabrication, with attendant increases in resolution and integrated peripherals, will make these approaches more practical. That aside, most of the last couple of stops' worth of technological advancement seems to have been gleaned from noise reduction techniques, making shadow detail more usable, and making the sensor faster into the bargain.
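The connection between noise reduction and dynamic range is easy to demonstrate: random sensor noise averages away as roughly the square root of the number of samples, so cleaner shadows mean more usable stops below mid-grey. The toy simulation below (not any camera's actual pipeline) stacks 16 noisy readings of a dim level and shows the noise floor dropping by about a factor of four.

```python
# Toy demonstration: temporal averaging of N frames reduces random
# sensor noise by roughly sqrt(N), extending usable shadow range.
import numpy as np

rng = np.random.default_rng(0)
true_shadow = 0.02                         # a dim, noise-dominated signal level
sigma = 0.01                               # per-frame read noise (illustrative)
frames = true_shadow + rng.normal(0.0, sigma, size=(16, 10000))

single_noise = frames[0].std()             # noise floor of one frame
stacked_noise = frames.mean(axis=0).std()  # noise floor after averaging 16
```

Averaging 16 frames should cut the noise by about sqrt(16) = 4, which is two stops of extra usable shadow range, broadly the scale of improvement the article attributes to in-camera noise reduction.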

Ultimately, there is no standard defining what HDR is, in terms of the number of stops range that an image has, and modern cameras such as Sony's F65 are beginning to provide per-frame dynamic range that would have required HDR techniques only a few years ago. Even so, it remains the case that high dynamic range, particularly the handling of highlights in a way that's subjectively attractive, may be the last bastion of film.



  • I can attest to the high dynamic range that the F65 is capable of. Seriously impressive - I graded an interior shot that had a window in the background. Monitoring with Sony's standard Rec-709 gamma the window was completely blown out, but all of the exterior detail was still there in the raw file and ready to be pulled back in. I'm talking 3 - 4 stops left to pull back. It's a very good sensor. I don't know if it's the same sensor that is now being used in the F55 and F5 but nevertheless Sony seem to be leading the way currently. No doubt Red have something up their sleeve as well.
    I agree with what Phil alludes to in the article that multiple [simultaneous] exposures do not look like the way forward for HDR, at least for video. The high-end sensors are just about good enough these days and will only get better over the coming years. For me, the main thing holding everything back is the Rec-709 display standard, which unfortunately probably won't change any time soon, so projects destined for anything smaller than a cinema are never going to show the true potential of these fabulous cameras.

    In terms of post-processing, it's becoming reasonably common practice amongst colourists to blend two images taken from the same shot together - one under-exposed, one over-exposed - in order to maximise the dynamic range from these cameras. This is particularly the case when not working from raw files for the grade (which admittedly is becoming less common, although there are many advantages). This process is essentially the same as the traditional HDR process, the difference being that the HDR blending comes from the same original source rather than two separate exposures.

    Anyway, I'm rambling. The point is that this is a very good time to be a colourist or DP.

Phil Rhodes

Phil Rhodes is a Cinematographer, Technologist, Writer and above all Communicator. Never afraid to speak his mind, and always worth listening to, he's a frequent contributor to RedShark.

EditShare 2014 © All rights reserved.
