
Are we too sensitive about image noise?

How much noise is acceptable, and is it a good thing?

Replay: Feature films of days gone by were really quite grainy compared to the pictures produced by a modern digital camera. We didn't complain too much then and just accepted it as part of the look. Should we be placing less emphasis on noise if it means a more emotionally resonant picture is produced?

If people would tolerate the amount of noise from video cameras that they tolerate from film, every camera made in the last five years would be considered perfectly usable at 1600 ISO.

This is a sensitive camera. It's far better than most film stocks, comparing the three factors of noise, sharpness and sensitivity

This idea comes from the realisation that a lot of expensive and well-shot films are really very grainy. It's always easiest to see grain in mid-tones and the brighter end of shadows, and it's also far easier to see in the cinema and on Blu-ray than it is on YouTube. The compression used for a lot of video distribution has an effect on grain that's very similar to a deliberate grain-reduction algorithm. As a result, we need to be very careful about making evaluations based on poor-quality footage and we also need to be careful about how we prepare things for distribution whether they're afflicted by grain or noise.
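To make that concrete, here's a minimal sketch of why lossy compression behaves like a grain-reduction pass. It isn't H.264 itself – just a toy frequency-domain model in Python with NumPy, using an FFT where a real codec uses a DCT, and a crude "keep only the lowest frequencies" mask standing in for coarse quantisation:

```python
import numpy as np

rng = np.random.default_rng(0)

# A hypothetical 8x8 "macroblock": a flat mid-grey wall plus fine grain.
block = 128.0 + rng.normal(0, 8, (8, 8))

# Transform to the frequency domain. Grain, being fine detail,
# lives almost entirely in the high-frequency coefficients.
coeffs = np.fft.fft2(block)

# Crude stand-in for coarse quantisation: discard everything except
# the lowest-frequency coefficients, as an encoder effectively does
# when it decides a block isn't worth the bandwidth.
fy = np.abs(np.fft.fftfreq(8))[:, None]
fx = np.abs(np.fft.fftfreq(8))[None, :]
mask = (fy <= 0.125) & (fx <= 0.125)
decoded = np.fft.ifft2(np.where(mask, coeffs, 0)).real

# The decoded block keeps its average level but loses most of its
# grain - exactly what a deliberate denoiser would do.
print(f"grain before: {block.std():.2f}  after: {decoded.std():.2f}")
```

Run it and the standard deviation (a rough proxy for visible grain) drops sharply while the mean brightness is untouched, which is why grain only "gets through" on frames where the encoder spends the bits.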

As an example, consider this sequence from Man of Steel.

Inset - 100% crop of background. It's not a very quiet film

That's a single flash of grain that's visible behind Amy Adams' character, in the soft mid-tones of the wall. That's how grainy Man of Steel really is and that clip was picked specifically because it represents what happens when this sort of thing isn't handled with enough care. On that one frame, the H.264 compression algorithm decided there was enough bandwidth to include the high-frequency detail in at least some of the macroblocks. On almost all of the others, it didn't.

Bite

But issues of compression are concerns for the distributor. Man of Steel is far from the most famous film to have been, well, really grainy. Look at Aliens, which has some of the same issues even in a trailer posted by something describing itself as “the official Alien YouTube channel.”

A 100% crop of Aliens. Sigourney doesn't have a skin condition - it's grain. Sort of a lot of grain

Again, only occasionally does the grain get through the H.264 compression. But are we complaining that there's anything wrong with the photography on James Cameron's famously wonderful Aliens? Of course, we aren't, notwithstanding the fact that Cameron didn't get on with the late, great Dick Bush, who was originally hired to shoot the film. It was finished by Adrian Biddle and despite this interruption, it's a moody, atmospheric film that looks fantastic and has an entirely coherent visual style.

What it has – and the reason for talking about it – is something that might loosely be called... well... bite. The grain is part of that and it's often described in glowing – if subjective – terms like grittiness or texture. It's possible to achieve that look in the digital world, of course, but we're likely to be criticised for shooting noisy images.

Contrast

All of this interacts, interestingly, with something else that films like Aliens have in spades – contrast. Aliens has lots of dark scenes with deep, black shadows. A lot of modern pictures lack that bite because of the drive for lots of dynamic range, which lets digital cameras see into the shadows better than film. That costs us richness and a dark contrast reference if we're not willing or able to light so that the shadows are really dense. Film clips blacks automatically, to some degree. Digital doesn't. So, using cameras at higher ISO settings is directly counter-revolutionary: it makes them see even deeper into the shadows, delivering yet more dynamic range and, arguably, even less of this dimensionless factor I've called bite.

To some extent that can be solved with an aggressive shadow crush in grading, but the better solution is, inevitably, the same technique that was used on eighties masterpieces like Aliens and Blade Runner. Big lights to create contrast, to overcome the ambient light. That contrast lets us leverage and showcase the improved highlight handling of a camera run at a higher ISO. So, in the end, we might have gained a couple of stops of sensitivity over film, but the underlying techniques remain amazingly similar. Create contrast, crank up the gain and hang the noise; it's what we've been doing for decades.
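The shadow-crush idea can be sketched very simply. The curve below is purely illustrative – a hypothetical tone function, not any grading tool's actual formula – but it shows the basic move: values below a chosen pivot are pushed hard toward black with a power curve, while everything above passes through untouched:

```python
import numpy as np

def crush_shadows(x, pivot=0.15, strength=2.5):
    """Illustrative shadow-crush curve (hypothetical parameters).

    Values below `pivot` (on a 0-1 scale) are driven toward black
    with a power curve; values at or above `pivot` are unchanged,
    so mid-tones and highlights keep their grading.
    """
    x = np.asarray(x, dtype=float)
    crushed = np.clip(x / pivot, 0.0, 1.0) ** strength * pivot
    return np.where(x < pivot, crushed, x)

# A flat-looking shadow ramp gains density: near-blacks sink to
# black, while anything from the pivot upward is left alone.
ramp = np.linspace(0.0, 0.3, 7)
print(crush_shadows(ramp).round(3))
```

The curve is continuous at the pivot (both sides evaluate to `pivot` there), which matters in practice: a visible kink in the tone curve reads as banding, not density.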

Title image courtesy of Shutterstock - Daxiao Productions.
