Apparently it’s still impossible to publicly compare film and video technologies, as we did back in December, without generating an angry mailbag. This is an attempt to get into the issues that surround this sort of comparison, particularly in the context of ITU-R document 6/149 and its associated reports, which were issued and revised in the early 2000s.
It has always been impossible to generate and compare absolute resolution figures for film and digital formats. At the extreme, people have even claimed that digital formats won't be the equal of film until they're able to individually describe the shape of every grain in the emulsion. To look at that the other way around, we could just as well claim that film won't equal digital until it can describe the square shape of every pixel. Neither of these is really required to create the same subjective impression of sharpness, and claims this extreme don’t inform the debate.
There are some characteristics of film that can help it look sharper than the basic technology would seem to allow. Since the grain pattern changes every frame, it achieves what we might call temporal pixel shift, giving us a better idea of where an edge in the image might be through a sort of positional dithering. That inevitably comes with a noise signature, and we're not going to shy away from calling grain noise: a picture artefact not created by the scene. So are the edges of pixels, and worse, pixel edges are fixed-pattern noise, which is generally more visible and more objectionable than noise that moves around.
Is a release print sharper than good 4K video?
But are those advantages enough to offset the resolution issues of projected release prints? Does any of this imply that film, as experienced by most of the people who have ever watched it projected, actually is sharper than 4K video, after all?
No. In fact, some fairly well-researched numbers make it clear that an average 35mm release print is at best competitive with conventional HD video, give or take compression. Before anyone lunges for the comments section, let's ensure we're all very clear about what we're talking about. A release print is made from an internegative, which is made from an interpositive, which is made from the original camera negative. Release prints are invariably made on high-speed contact printers and shown using rather cost-controlled projection lenses in a multiplex where the projectionist is not watching the result. This situation differs markedly from the circumstances under which directors of photography generally watch film projection: answer prints or dailies are made directly from the original camera negative and projected by someone who knows the director of photography will be watching.
So, even if 35mm negative is a 6K format, which is itself a dubious idea, that detail is not going to make it to the cinema screen. There's a report which puts some numbers on all this, called “Image Resolution of 35mm Film in Theatrical Presentation,” produced in 2001 by Vittorio Baroncini, Hank Mahler, Matthieu Sintas and Thierry Delpit, engineers from various television organisations. Their report mentions the ITU-R documents we talked about above, but the ITU material is behind a paywall. The report itself can usually be found floating around as a PDF file, and it contains more than enough detail to verify what we're about to talk about.
The relevance of MTF
To understand their conclusions, we need to understand modulation transfer function (MTF), which is a measure of how much an image is being affected by, broadly speaking, the resolution of the medium in which it's being captured. Tests often involve placing a sinusoidal pattern of black and white stripes in an image. As the stripes get finer and finer, they will begin to blur out to an even grey as we approach the resolution limit. MTF expresses the reduction in the contrast of fine detail caused by resolution limits, and is often expressed as a percentage at a given fineness of lines. We might find, for instance, that at a resolution of 2100 lines across the height of the image, contrast is reduced to about 10% of its peak value, which is pretty blurry.
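To make the idea concrete, here's a minimal sketch of the measurement in Python. The contrast figures are invented for illustration (not taken from the report): a perfect black-to-white stripe chart goes in, and at the fine end of the scale the stripes come back out swinging only slightly around mid-grey.

```python
# Sketch of how MTF is measured: compare the contrast of a sinusoidal
# test pattern before and after it passes through the imaging chain.
# All intensity values here are illustrative, not from the report.

def michelson_contrast(i_max, i_min):
    """Contrast of a sinusoidal stripe pattern, from its peak and trough."""
    return (i_max - i_min) / (i_max + i_min)

def mtf(contrast_out, contrast_in):
    """MTF at one spatial frequency: output contrast over input contrast."""
    return contrast_out / contrast_in

# Input chart: pure black-to-white stripes, so contrast is 1.0.
chart = michelson_contrast(1.0, 0.0)

# At 2100 lines per picture height the stripes have blurred towards grey:
# say they now swing between 0.55 and 0.45 instead of 1.0 and 0.0.
captured = michelson_contrast(0.55, 0.45)

print(f"MTF at 2100 lines: {mtf(captured, chart):.0%}")  # prints "MTF at 2100 lines: 10%"
```

The same calculation, repeated at coarser and finer stripe spacings, traces out the MTF curve manufacturers publish for film stocks and lenses.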
And yes, those are the numbers arrived at in the report we're discussing, based on a Kodak 5274 (200T) camera negative, which remains a current stock: it's very much starting to run out of resolution at that level. We might reasonably describe that as requiring more than HD resolution to adequately capture, but theatrical releases do not maintain that amount of resolution. The interpositive achieves perhaps 1400 lines. Assessments of a projected image in a cinema, two generations later, were subjective but varied between 685 and 875 lines, which HD comfortably covers. For all the reasons we carefully mentioned above, a direct equivalence of these numbers with pixel counts is difficult, but the reality is clear.
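The reason each generation costs so much is that MTFs cascade: the response of the whole chain, at any given frequency, is the product of the responses of every stage. The per-stage figures below are invented purely to illustrate the multiplication; only the shape of the decline echoes the report's findings.

```python
# Why detail dies through the duplication chain: at any one spatial
# frequency, the MTFs of successive stages multiply together.
# The per-stage figures below are invented for illustration only.

stages = [
    ("camera negative", 0.60),   # MTF at some fixed fine-detail frequency
    ("interpositive",   0.70),
    ("internegative",   0.70),
    ("release print",   0.65),
    ("projection lens", 0.75),
]

system_mtf = 1.0
for name, stage_mtf in stages:
    system_mtf *= stage_mtf
    print(f"after {name:15s}: {system_mtf:.0%}")
```

Even with each individual stage looking respectable on its own, the product collapses quickly, which is why the answer print a cinematographer approves looks nothing like the fourth-generation print the audience sees.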
So it makes complete sense that a modern 4K scan of a well-kept camera negative from decades ago might look sharper than the original theatrical release. It might not look better, in terms of other factors, but we’re talking about sharpness. Even a basic HD television set will look brighter than theatrical exhibition: a correctly-configured TV should not greatly exceed 100 nits (roughly 30 foot-lamberts) according to Rec. 709, while the standard for a cinema screen works out to 54 nits (16 foot-lamberts, measured without film in the projector, according to SMPTE 196M). Many TVs are brighter than that, to stand out in the showroom, and cinemas are generally dimmer than that, given the minimum density of the film and the penny-pinching of exhibitors keen to extend bulb life.
It's bittersweet to realise that such a revered old technology is looking less and less competitive as time goes by. About the only way in which a 4K TV generally won't exceed film is in sheer colour range, and better implementations of Rec. 2020 colour may even erode that advantage. Sorry, folks.