11 Oct 2015

We may have solved the mystery of why film just looks better


RedShark Replay: It just won’t go away, will it? However much you can prove with specifications that digital video is indisputably better than film, there’s a stubborn feeling that there’s more to it than the simple-to-prove facts. We think we've identified one subtle process that helps film to store more visible information than digital.

Recently we asked for readers’ opinions on this, and we had a good response, although much of it was rather predictable. Some said that we shouldn’t be comparing the two at all. Some said that, whatever anyone wants to believe, film will always be better - even going on to say that something is “lost” when we digitise things.

All of which may be true. But I think we’ve at last stumbled on something that might be tangible. It’s to do with the fundamental difference between film and digital.

It’s fairly easy to explain, but not that easy. And remember - this is just our theory: we're not going to be dogmatic about this and if anyone can prove us wrong, that's fine with us.

Here goes.

Film doesn't have pixels

Both film and digital have a limit to their resolution. With digital, the fundamental unit of resolution is the pixel. You can count pixels easily, because they’re arranged in a grid. There’s a slight (well, actually rather severe) complication here, which is that in order to get colour out of a modern single sensor, you have to have a Bayer-pattern filter. This reduces the effective resolution by the time its output has been run through debayering software, which essentially guesses what colour each pixel should be based on the ones around it. That makes it difficult to state the exact resolution, but as Bayer algorithms get better and resolutions get higher, it doesn’t change the fundamental dynamics of the film vs digital debate.
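To make that "guessing" concrete, here's a minimal sketch, assuming Python with numpy and scipy, an RGGB layout, and crude bilinear averaging - an illustration of the idea, not any real camera's debayering pipeline:

```python
# Illustrative sketch only: an RGGB layout and crude bilinear averaging,
# not any real camera's debayering algorithm.
import numpy as np
from scipy.ndimage import convolve

def bayer_mosaic(rgb):
    """Simulate an RGGB sensor: keep a single colour value per photosite."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red photosites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green photosites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green photosites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue photosites
    return mosaic

def naive_debayer(mosaic):
    """Guess each pixel's two missing channels by averaging nearby samples."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True   # where red was actually sampled
    masks[0::2, 1::2, 1] = True   # where green was actually sampled
    masks[1::2, 0::2, 1] = True
    masks[1::2, 1::2, 2] = True   # where blue was actually sampled
    kernel = np.ones((3, 3))
    out = np.zeros((h, w, 3))
    for c in range(3):
        samples = np.where(masks[..., c], mosaic, 0.0)
        counts = convolve(masks[..., c].astype(float), kernel, mode='mirror')
        out[..., c] = convolve(samples, kernel, mode='mirror') / np.maximum(counts, 1.0)
    return out
```

The point isn't the exact arithmetic: it's that two thirds of every pixel's colour is an estimate, which is why stating an exact resolution for a Bayer sensor is so slippery.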

Film doesn’t have a grid of pixels. Far from it. What it has instead is essentially randomly shaped crystals of light-sensitive chemicals. And of course these vary completely from frame to frame, and between different parts of the same frame.

So, whereas with a digital system the grid never moves, with film there isn’t a grid at all: if you try to look for corresponding areas of light on successive frames, you won’t find them at a microscopic level.

So, you’d be perfectly entitled to ask at this point whether, how, or why this matters, when the film grain is too small to see.

Well, you can see film grain. That’s one of the things about film. It needn’t be obtrusive, and it won’t be foremost in your mind, but it will undoubtedly have an effect on the viewer’s perception of the moving images.

But there’s more to it than grain vs no grain, not least because you can always add “grain” to digital images. No, the effect is much more subtle, and yet more profound, than that.

This is where we’re going to talk about a concept that’s called “Dither”. This rather indecisive-sounding term is a powerful concept in the mechanics of digital media.

The question of "dither"

Strictly, dither is noise which is added to an analogue signal, usually at the point just before it is digitised. You might wonder what’s the point of this when the whole thing about digital media is to have a clean signal without noise.

Let’s explain this with a simple example.

You’ve probably seen those contour lines in a blue sky on television or in videos. They’re a sign that you need an awful lot of colours to describe the very shallow gradients in a blue sky. In reality, the sky is not simply “blue” at all, but every shade of blue you can imagine (and more), filling the continuum between the lightest part of the sky and the darkest.

You would need an almost infinite number of colours to produce a perfect rendition of this type of sky, but, luckily, you don’t need that many, because we only need a finite, albeit quite large, number for our eyes to see a continuous gradient.

But we probably need more colours than the 256 levels per colour channel that 8-bit video can give us. That’s why you often see distinct bands of colour in blue skies, separated by a sudden jump to the next available colour.

This is called “quantization noise” and is one of the most unpleasant side-effects of digitisation. It’s so undesirable that even quite drastic measures are sometimes preferable to the effect itself. One such measure is adding noise. Of course, this sounds completely bizarre in a context where we’re trying to make a picture better, but it’s not that bad, because when you add noise - or “dither” - it’s at a very low level. In fact, it’s at such a low level that it should only make a difference of less than one bit.
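Here's a small sketch of dither applied before quantisation, assuming Python with numpy, a synthetic one-dimensional "sky" gradient and triangular (TPDF) noise of about one quantisation step - all illustrative choices rather than anything from a real camera:

```python
import numpy as np

rng = np.random.default_rng(0)

# A very shallow gradient, like a patch of blue sky: it spans only a few
# 8-bit code values, so plain rounding produces visible bands.
signal = np.linspace(100.0, 104.0, 1920)   # "analogue" values on a 0-255 scale

banded = np.round(signal)                  # quantise with no dither: contour lines

# Dither: add roughly one code value of triangular (TPDF) noise *before* rounding.
tpdf = rng.uniform(-0.5, 0.5, signal.size) + rng.uniform(-0.5, 0.5, signal.size)
dithered = np.round(signal + tpdf)

print("levels used without dither:", np.unique(banded).size)    # a handful of hard bands
print("levels used with dither:   ", np.unique(dithered).size)

# Averaging nearby dithered pixels (roughly what the eye does) recovers
# in-between values that the hard-quantised version simply cannot represent.
print("true local mean:     %.3f" % signal[:64].mean())
print("dithered local mean: %.3f" % dithered[:64].mean())
print("banded local mean:   %.3f" % banded[:64].mean())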

But surely, if it’s that quiet - and less than one bit - it can’t make any difference at all?

This is where it starts to get a bit strange

Well, this is where it starts to get a bit strange.

In the scenario we’ve talked about above, where we’re trying to lessen the effect of having too few bits, the material has already been digitised, so we can’t actually add information back - noise, by definition, isn’t information. We would probably have to add somewhat more than one bit’s worth of noise to make a difference here. Once we’ve done that, the effect is pretty remarkable: the noise randomises the position of the contour lines, and effectively they just disappear.
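Here's a rough sketch of that "after the fact" case, again assuming Python with numpy and an illustrative noise amplitude: the banding is already baked in, so no information comes back, but the band edges are randomised and stop reading as lines:

```python
import numpy as np

rng = np.random.default_rng(1)

signal = np.linspace(100.0, 104.0, 1920)
banded = np.round(signal)            # the contour lines are already baked in

# Add a little more than one code value of noise, then re-quantise for display.
# Nothing is recovered, but the hard band edges are randomised away.
masked = np.round(banded + rng.normal(0.0, 1.5, banded.size))

print("distinct levels, banded:", np.unique(banded).size)
print("distinct levels, masked:", np.unique(masked).size)
```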

If we’re prepared to accept the overall degradation caused by adding noise, then it’s often an acceptable compromise: a little gentle noise is far easier on the eye than unnatural-looking contour lines.

So that’s one way in which noise can improve the look of a moving image. Ironically, I believe that many 8 bit cameras look better through having a slightly noisier image because of exactly this effect!

Now, here’s where it gets almost magical - but, trust me, it isn’t: it’s soundly scientific.

There’s a difference between adding noise to a signal that’s already digitised and adding it to one that’s still analogue. Here’s an example that shows that the phenomenon has been known for at least seventy years!







David Shapton

David is the Editor In Chief of RedShark Publications. He's been a professional columnist and author since 1998, when he started writing for the European Music Technology magazine Sound on Sound. David has worked with professional digital audio and video for the last 25 years.
