
High Dynamic Range (HDR) Video


Image: HDR beach (Shutterstock)

HDR, or High Dynamic Range, is a way to record the massive distance between light and dark that exists in nature. Used well, it can produce amazing images. Overdone, it can make nature look like a videogame.

Even with normal HD video, electronic systems began to approach the resolution of a 35mm release print, and modern cameras such as Sony's F65 offer resolution at least as good as many negative stocks while simultaneously expanding colorimetry beyond the common, rather limited Rec. 709 range. What remains, then, is dynamic range, along with that quirk of highlight handling which makes film so flattering, and that's something electronic sensors have historically struggled to match.

Attempts to mitigate this problem involve a lot of clever engineering and advancements in semiconductor manufacturing, but impatient users of current technology have nonetheless come up with ways to get more out of the existing hardware.

Multiple exposures

The simplest approach is to take several pictures of a scene with the camera set up to vary the exposure of each image. There's even a feature to do exactly that automatically on Canon DSLRs, referred to as “exposure bracketing”, and HDR enthusiasts often use up to five exposures per frame, providing a huge luminance range. Of course, the camera and the scene must both be completely unmoving between shots, which makes the technique difficult to use for motion picture photography. This didn't stop Red from trying, though, and their HDRx feature is effectively this – two successive exposures, which are recorded as two separate video streams. It can be effective on scenes with little or no motion, although movement in the scene can cause visible problems and it remains a feature for very particular circumstances only.
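For anyone who wants to experiment with the stills version of this, the sketch below merges a set of bracketed exposures into a single floating-point radiance map using OpenCV's implementation of the Debevec technique. The file names and exposure times are placeholder assumptions; any set of registered, bracketed frames of the same scene will do.

# A minimal sketch of merging bracketed exposures into one HDR radiance map.
# Assumes OpenCV and NumPy are installed; the file names and shutter times
# below are hypothetical examples, not taken from the article.
import cv2
import numpy as np

files = ["under.jpg", "mid.jpg", "over.jpg"]               # bracketed frames, identical framing
times = np.array([1/500, 1/125, 1/30], dtype=np.float32)   # exposure times in seconds

images = [cv2.imread(f) for f in files]

# Recover the camera's response curve, then merge into a linear float radiance map.
calibrate = cv2.createCalibrateDebevec()
response = calibrate.process(images, times)
merge = cv2.createMergeDebevec()
hdr = merge.process(images, times, response)               # 32-bit float, not yet viewable

cv2.imwrite("merged.hdr", hdr)                             # Radiance .hdr preserves the range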

This is not a new technique. Multiple exposures for scenes containing a lot of contrast were particularly relevant in the earliest days of photography, when emulsions were primitive and would struggle to equal the dynamic range of even quite everyday modern digital cameras. As long ago as the 1850s, French photographer Gustave Le Gray was, at the suggestion of his contemporary Hippolyte Bayard, taking twin exposures of seascapes, to capture detail in both water and sky, and putting the result together when printing his negatives in a process called combination printing (which is important because it was among the first techniques to demonstrate that the camera could lie, and lie convincingly). More recently, news camera operators are used to gradual iris pulls when tilting down from a bright sky to a dimmer foreground scene, which you could argue is a high dynamic range technique.

Digital Signal Processing (DSP)

The principal modern development is in the use of digital signal processing (read: Photoshop) to intelligently combine the exposures. Future techniques might involve the use of optical flow interpolation to mitigate the problems of sequential exposure in moving-image applications. The total exposure range in an HDR image is potentially massive, and it is more accurate to think of the information as a data set describing the luminance of a scene from which an image can be derived, rather than as an image in and of itself.

Since no displays are available which provide enough luminance range to accurately display an image of, say, the sun, we must compress the range into something that's viewable on common (often Rec. 709) equipment. This can be done simply by dividing the luminance values down to an appropriate range (that is, simply turning down the contrast), or by using curves, grading tools or a lookup table, which may involve more or less creative decision-making as the situation demands. It is for this reason, among others, that we would pursue HDR imaging for motion picture work: it provides huge, unprecedented flexibility in the grade. HDR processing for stills frequently uses tone mapping, an approach which takes account of local contrast in various areas of the image in order to make best use of the available information overall, and that's how the slightly eerie, almost glowing HDR demonstration images we see are produced.
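As a rough illustration of what "dividing the luminance values down" can look like in practice, here is a minimal sketch of a simple global tone-mapping step, assuming a linear, floating-point radiance map like the one produced by the bracketing example above. This is the basic global Reinhard operator rather than the local, contrast-aware mapping that produces the glowing stills look.

# A minimal global tone-mapping sketch (Reinhard), assuming "hdr" is a linear
# float32 image in OpenCV's BGR channel order. Not a production-grade mapper.
import numpy as np

def tonemap_reinhard(hdr, gamma=2.2):
    # Rec. 709 luminance weights; channels are B, G, R in OpenCV images.
    lum = 0.0722 * hdr[..., 0] + 0.7152 * hdr[..., 1] + 0.2126 * hdr[..., 2]
    lum_out = lum / (1.0 + lum)                      # compress the huge range into 0..1
    scale = lum_out / np.maximum(lum, 1e-8)          # per-pixel scaling applied to all channels
    ldr = np.clip(hdr * scale[..., None], 0.0, 1.0)
    return (ldr ** (1.0 / gamma) * 255.0).astype(np.uint8)  # gamma-encode for display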

HDR through effective noise reduction

Making these approaches practical and easy for everyday photography is something that the industry has struggled to achieve. Rather imperfect experiments with sequential-exposure HDR aside, various techniques have been proposed and tested, including sensors grouping photosites of varying sensitivity, much as current sensors group photosites sensitive to different colours. Proper postprocessing might allow such a device to produce unprecedentedly high dynamic range in a single exposure and solve the problem, although if there were no problems with this approach I suspect we would have seen it more widely deployed by now. Perhaps recent developments in CMOS image sensor fabrication, with attendant increases in resolution and integrated peripherals, will make these approaches more practical. That aside, most of the last couple of stops' worth of technological advancement seems to have been gleaned from noise reduction techniques, making shadow detail more usable, and making the sensor faster into the bargain.
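Purely as an illustration of the idea, and emphatically not any manufacturer's actual pipeline, the sketch below shows how two readouts of differing sensitivity might be combined in post: the more sensitive readout supplies clean shadow detail, and the less sensitive one takes over where the first clips. The sensitivity ratio and clip threshold are arbitrary assumptions.

# Hypothetical combination of a high-sensitivity and a low-sensitivity readout
# of the same scene, both as linear floats in the range 0..1. The result can
# exceed 1.0, which is the extra dynamic range gained from the pairing.
import numpy as np

def combine_dual_sensitivity(high_sens, low_sens, ratio=16.0, clip=0.95):
    low_scaled = low_sens * ratio                           # put both readouts on one scale
    blend = np.clip((high_sens - clip) / (1.0 - clip), 0.0, 1.0)
    return (1.0 - blend) * high_sens + blend * low_scaled   # crossfade near clipping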

Ultimately, there is no standard defining what HDR is in terms of the number of stops of range an image must have, and modern cameras such as Sony's F65 are beginning to provide per-frame dynamic range that would have required HDR techniques only a few years ago. Even so, it remains the case that high dynamic range, particularly the handling of highlights in a way that's subjectively attractive, may be the last bastion of film.

Tags: Technology
