10 Aug 2018

'Film grain' has no place in video [opinion]

  • Written by  Chris Foreman

Emotional grain? Grunging up the image is often seen as a way of packing more emotional punch into an image. This is something Chris Foreman has a lot to say about!

I’m going to be unpopular here, I know it, I just am. You see, I’m going to talk about something I alluded to in a previous piece… grain.

A lot of people seem to love film grain, I get it, I do. It’s evocative, it can be atmospheric and nostalgic, but it can be a very personal thing and there are a few things to consider. We will come back to film grain, but first, let’s talk about something related, and that’s noise.

You see, most noise added to a clean image to evoke nostalgia or emotion is, in my opinion, just plain wrong. Artistic intent is all very well, but misused noise has the opposite effect on me. When I see it, I just think ‘oh, they’ve spoilt that image by adding noise’, and I have to sit through the film or TV programme being continually distracted by it. That is not the intent of the director or editor, but it is my reaction. What makes me react emotionally is mainly the acting and the story, followed by the images. If I can see the emotion in someone’s eyes, the tear falling down their cheek and the hair standing up on the back of their neck, then this is what pulls me in.

Noise hides emotional content

Noise literally hides this. That’s not to say film grain spoils my enjoyment; in context it’s fine, although a lot of the time I wish I could get rid of it (and not through crude noise reduction). I really have difficulty believing that if I went back in time and offered the director and cinematographer the option of shooting with reduced or zero grain, they wouldn’t jump at the chance. Grain was a limit of what was available at the time of shooting, influenced by film stocks and lighting as much as anything else.

You see, film grain is a thing of its time and we can’t change that. What we can change is the addition of digital noise in post production to emulate film grain, in the mistaken belief that it adds a certain something that evokes film. What are we doing here? Are we trying to evoke the feeling of film, or the feeling of the time when film was dominant?

There’s also a related issue at play, and it’s down to the characteristics of film grain. Film doesn’t tend to show grain in the blacks, unlike video, because of the nature of the negative; where you see film grain is in the mid-tones and highlights. The same is not true of digital noise produced by the camera. Excessive noise in the blacks used to be common years ago, though newer sensors, electronics and compression have led to a dramatic reduction here.
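
The difference described above can be sketched in a few lines of Python. This is a toy model, not a calibrated film response: the film-style grain is weighted by luminance so it peaks in the mid-tones and nearly vanishes in the blacks, while flat ‘digital’ noise has the same amplitude everywhere. The function names and the parabolic weight are my own illustrative assumptions.

```python
import random

def film_grain_weight(luma):
    # Grain is most visible in the mid-tones and nearly absent in deep
    # blacks: a simple parabolic weight peaking at mid-grey (luma = 0.5).
    # (Illustrative model, not a measured film response.)
    return 4.0 * luma * (1.0 - luma)

def add_film_style_grain(pixels, strength=0.05, seed=42):
    """Add luminance-weighted noise to a list of 0..1 luma values."""
    rng = random.Random(seed)
    out = []
    for p in pixels:
        noise = rng.gauss(0.0, strength) * film_grain_weight(p)
        out.append(min(1.0, max(0.0, p + noise)))
    return out

def add_digital_noise(pixels, strength=0.05, seed=42):
    """Flat sensor-style noise: same amplitude everywhere, blacks included."""
    rng = random.Random(seed)
    return [min(1.0, max(0.0, p + rng.gauss(0.0, strength))) for p in pixels]

# Deep blacks stay almost clean under the film-style model,
# but not under flat digital noise.
blacks = [0.02] * 1000
film = add_film_style_grain(blacks)
digital = add_digital_noise(blacks)
spread = lambda xs: max(xs) - min(xs)
print(spread(film) < spread(digital))  # film-style grain barely touches blacks
```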

Clean doesn't have to mean clinical

A clean image doesn’t have to be clinical. Use of lighting and grading is key, but adding noise to emulate film, well, that just seems wrong to me. I will pay careful attention to films and shows that exhibit noise or grain, especially if they’ve been shot in the digital era. I watched Justice League some weeks ago and, like a lot of films, it mixes film and digital formats, and yes, there was grain – very fine, but it was there – not overly distracting, and it came from a film source.

There is an argument for adding noise to digital sources to give a very definite film appearance for historical reasons. This I can sign up to. If you are trying to give a look to an Alexa-sourced shot to make it look like something shot on an Arriflex 35 with period stock, then that’s completely appropriate.

Of course, what also doesn’t help is the amount of bandwidth that grain or noise can take up, and if this bandwidth is capped like in a constant bitrate stream, for example, then something else is going to have to give way. Sometimes this results in a softer image and sometimes it means that motion, especially in the background of a shot, will not be smooth.
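
The bandwidth point can be demonstrated with a toy compression test. This is only a sketch (zlib standing in for a video codec, which it is not): a smooth ramp of pixel values compresses to a fraction of its size, while the same ramp with random noise added resists compression, leaving fewer bits for everything else in a capped stream. The data and helper name are illustrative.

```python
import random
import zlib

def to_bytes(values):
    # Clamp to the 0..255 range of 8-bit pixel values.
    return bytes(min(255, max(0, int(v))) for v in values)

# A smooth repeating ramp stands in for a clean, low-detail image row.
clean = to_bytes(i % 256 for i in range(4096))

# The same ramp with random noise added stands in for a grainy image row.
rng = random.Random(1)
noisy = to_bytes((i % 256) + rng.randint(-20, 20) for i in range(4096))

clean_size = len(zlib.compress(clean, 9))
noisy_size = len(zlib.compress(noisy, 9))
print(clean_size, noisy_size)  # the noisy row needs far more bits
```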

Film is film and video is not. There was a time, during the early days of shooting video to look like film, when this got a very bad name. Usually, though not always, that was down to the fact that video shot at film frame rates, 23.976, 24 or 25, was poorly shot. It was not unusual to find cameras set to a 360-degree shutter by default: shutter crime at its worst. Shooting those progressive rates at a 180-degree shutter made the pictures darker compared to the interlaced rates of 50i or 59.94i, and that was not seen as desirable by the manufacturers, so some of them made the camera default to a 360-degree shutter when set to progressive. The fact that some people can’t see this artefact, or at least are less sensitive to it, made it difficult to spot sometimes. I think we’re now emerging from a long transition period. Most of these issues have settled down, but that’s not to say we’re immune to any new developments.
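
The exposure arithmetic behind that shutter-angle decision is simple: the shutter angle is the fraction of the frame interval during which the shutter is open, so doubling it from 180 to 360 degrees doubles the exposure time, which is exactly one stop of light. A small sketch (the function name is my own):

```python
import math

def exposure_time(fps, shutter_angle_deg):
    """Per-frame exposure: the fraction of the frame interval the shutter is open."""
    return (shutter_angle_deg / 360.0) / fps

t_180 = exposure_time(24, 180)  # 1/48 s
t_360 = exposure_time(24, 360)  # 1/24 s
stops = math.log2(t_360 / t_180)
print(t_180, t_360, stops)  # doubling the shutter angle gains exactly one stop
```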

I think it’s rather ironic that so much work goes into restoring older material for HD release, and the results are nothing short of astounding. I hope we’re not in a position 20 years from now where we have to clean up artificial noise that was added for effect at the time to emulate film. Tastes change and people don’t like to be tied down; they want choices, and maybe in time, instead of a noise reduction setting in TVs, there will be an ‘add film grain’ setting in the menu… the best of both worlds?

Title image - Shutterstock - rdonar

