
New resolutions: the technology that wins isn’t always the best


Are higher resolutions really inevitable? Image: Shutterstock

The debate over higher resolutions tends to provoke an emotional response. Roland Denning offers his counter to the argument that ever-higher resolutions are inevitable.

Is the move to higher resolutions inevitable? Recent articles by Neil Oseman, Simon Wyndham, Mark L. Pederson and David Shapton have provoked a lot of discussion about the inevitability of ever-increasing resolutions, sometimes at an almost feverish pitch.

I am not one of the dissenting voices, but I am, perhaps, one of the less certain ones. This is not because I am in any way opposed to the idea of progress or because I dislike high resolutions (that would be absurd), I just want to emphasise two fundamental points:

  • The sheer number of pixels is not an end in itself
  • Nothing in this business is inevitable.

Technology does not get adopted just because it is good. There are plenty of inferior technologies that became standards (NTSC, VHS and DAB, to pick out just three acronyms) and great technologies that never took off. Technology alone never determines changes in technology and standards in this industry; it is always dependent on market forces.

I emphasise the term market forces rather than marketing. I am emphatically not saying that the demand for increasing resolutions is part of a cynical marketing ploy, rather that new technologies will not take off without an economic case.

It’s always dependent on market forces

Although technology continually improves and offers potentially higher quality, at many points in history quality has actually diminished where money could be saved or made. There were shifts, for instance, in film from 4-perf 35mm to 2-perf and 16mm, and in audio from CD to MP3.

Why are the majority of TVs sold now 4K when the majority of cinemas still exhibit at 2K? The answer is simple: market forces. No one wants to buy an obsolete TV if one with a higher spec is available for only a little more, but the cost of re-equipping cinemas for 4K cannot simply be recouped through higher ticket prices (as sort of happened with 3D movies) because audiences don't seem particularly dissatisfied with 2K projection. Cinemas, unlike TV manufacturers, can't make more money by shifting to 4K.

The rapid consumer take-up of 4K might not be repeated

The rapid consumer acceptance of 4K in the home took all of us by surprise. Unless you are going for an unfashionably small screen size, who would not buy a 4K TV today, even if all you were going to watch was upscaled HD?

Crucially, manufacturers' desire to encourage consumers to upgrade their TVs coincided with the sudden rise of streaming services like Netflix and Amazon, which needed 'added value' and, unlike the over-the-air broadcasters, could offer UHD. Whether 4K/UHD dominates in the future depends on how far this shift to streaming services continues. Nevertheless…

4K remains minority viewing

A quick reality check: reading these pages, it's easy to get the impression that everyone is watching UHD/4K at home – in fact, these resolutions remain minority viewing. At least 80% of TV viewing is broadcast over the air rather than streamed, and that maxes out at 1080-line HD. In the remaining 20% or so of the market, only a minority of programmes are streamed in UHD, and some streaming services (like NOW TV in the UK) only output at 720-line HD. Nor do we know what percentage of Netflix viewers pay the premium to watch in UHD rather than HD. Moreover, if your internet connection is slow, or you live in a part of the world where high-speed internet is simply unavailable, you are not going to get UHD anyway. And the majority of cinema distribution remains at 2K.

The reality is that the majority of the world still doesn’t get to see a decent HD picture on their screens.

The law of diminishing returns – there is a point at which there is no point

In terms of distribution and exhibition, there is a point at which there is little to be gained from increases in resolution. Whether the difference between HD and UHD in the home is noticeable depends entirely on the size of your screen and how close you sit to it. There are very few homes where an 8K screen makes any sense, and houses, certainly in the UK, are getting smaller rather than larger. Possibly in the future we will have screens that occupy an entire wall and houses will be designed around them, but that, to me, would suggest a very different approach to framing and composition – do you really want to sit a few feet away from a close-up on an 8-foot high screen? Avoiding aliasing is another argument that is put forward, and it has some credence, although I am not convinced aliasing is a problem significant enough to need this solution.
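The screen-size and viewing-distance point can be put in rough numbers. The sketch below uses the common rule of thumb that 20/20 vision resolves about one arcminute, and estimates the distance beyond which individual pixels blur together on a 55-inch 16:9 screen – an illustrative back-of-envelope calculation, not a claim about any particular display:

```python
import math

def max_benefit_distance(diag_in, h_pixels, aspect=16 / 9):
    """Distance (metres) beyond which 20/20 vision (~1 arcminute per
    pixel) can no longer resolve individual pixels on a 16:9 screen."""
    width_in = diag_in * aspect / math.sqrt(aspect**2 + 1)  # screen width
    pitch_in = width_in / h_pixels                          # pixel pitch
    one_arcmin = math.radians(1 / 60)
    return pitch_in / math.tan(one_arcmin) * 0.0254         # inches -> metres

for label, px in [("HD 1920", 1920), ("UHD 3840", 3840), ("8K 7680", 7680)]:
    d = max_benefit_distance(55, px)
    print(f'55" {label}: pixels merge beyond ~{d:.1f} m')
```

On these assumptions, a 55-inch HD panel stops showing visible pixels a little over two metres away, UHD at roughly half that, and 8K at about half a metre – which is why sofa distance, not pixel count, tends to decide what you can actually see.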

There is also always a cost, in computing power and storage, to going to higher resolutions – anyone who denies that 4K takes more computing power is compromising in other ways. These costs may or may not be significant, depending on your budget, but they exist.
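The storage side of that cost is easy to quantify. This sketch works out uncompressed 10-bit 4:2:2 storage per hour at 25 fps – illustrative figures only, since real codecs compress heavily, but the fourfold jump per resolution step applies to compressed material too:

```python
def uncompressed_gb_per_hour(width, height, bit_depth=10, fps=25):
    """Storage for one hour of uncompressed 4:2:2 video, in gigabytes."""
    bits_per_frame = width * height * 2 * bit_depth  # 2 samples/pixel at 4:2:2
    return bits_per_frame * fps * 3600 / 8 / 1e9     # bits -> GB

formats = {"HD": (1920, 1080), "UHD": (3840, 2160), "8K": (7680, 4320)}
for name, (w, h) in formats.items():
    print(f"{name}: ~{uncompressed_gb_per_hour(w, h):,.0f} GB per hour")
```

Each doubling of resolution quadruples the pixel count, so UHD needs four times the storage (and decode bandwidth) of HD, and 8K sixteen times – the costs scale whether or not the extra pixels are visible.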


It makes sense to originate at a higher resolution than we deliver – or does it?

There are good arguments for shooting at a higher resolution than you want to deliver – certainly the ability to re-frame is a genuine advantage of higher-resolution acquisition, and a strategy that many have adopted.

I would hesitate to endorse the view that it is necessary to originate at a higher resolution to get the best quality. The fact that the Arri Alexa didn't quite manage 4K resolution didn't seem to hold it back from being widely adopted for motion picture shooting.

I believe some of the arguments in favour of originating at a higher resolution are actually a hangover from analogue days, when quality was diminished at every stage of the process. But if your delivered image is 3840 × 2160 or 3996 × 2160, you are not going to get any more pixels than that – in fact, by downsampling you have added more processing to the stream, which could, in theory, create more artefacts.

There is a general notion that your final image looks 'better' if you shoot at a higher resolution than you intend to deliver. OK, I am not an engineer; I'd just like to know why. A scientific explanation please, not just an anecdote about the particular camera you use.

It is not just the number of pixels

In the consumer marketplace, high resolutions are an easy sell. HDR is harder to explain, and colour sub-sampling is probably something consumers would prefer to remain ignorant of. But we should also be considering how much the picture is compressed. Many video cameras offer 4K resolution at the price of heavier compression or lower bit depth.

We are unlikely to be working in 6K or 8K at 4:4:4 uncompressed, so there is a trade-off between resolution, compression, colour sub-sampling, gamut, bit depth and frame rate – increasing the pixel count is not the only game in town.
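That trade-off can be illustrated with raw numbers. The sketch below compares the uncompressed data rate of UHD at 8-bit 4:2:0 with HD at 12-bit 4:4:4 – two very different balances of resolution, bit depth and chroma sampling that land in the same ballpark (illustrative arithmetic, not a statement about any specific camera or codec):

```python
def raw_rate_mbps(width, height, bit_depth, subsampling, fps=25):
    """Uncompressed data rate in Mbit/s for J:a:b chroma subsampling."""
    j, a, b = subsampling
    samples_per_pixel = 1 + (a + b) / j  # luma plane plus two chroma planes
    return width * height * samples_per_pixel * bit_depth * fps / 1e6

# More pixels, but shallower bit depth and heavier chroma subsampling:
uhd = raw_rate_mbps(3840, 2160, 8, (4, 2, 0))
# Fewer pixels, but full chroma and deeper bit depth:
hd = raw_rate_mbps(1920, 1080, 12, (4, 4, 4))
print(f"UHD 8-bit 4:2:0:  {uhd:.0f} Mbit/s")
print(f"HD 12-bit 4:4:4:  {hd:.0f} Mbit/s")
```

The two formats carry a similar amount of raw data, spent in entirely different ways – one on spatial resolution, the other on colour precision – which is exactly the point: pixel count is only one axis of quality.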

It’s not just about picture quality

And this, to me, is the most crucial point. An obsession with resolution can distract from what this business is all about, which is telling stories.

Throughout the latter half of the 20th century, cameras were getting smaller and lighter. This, at the time, was considered a breakthrough in how films could be made: you could shoot fast on location with smaller crews, and you could use documentary techniques to capture performances spontaneously. Now, higher resolutions don't necessarily mean bigger cameras, but the trend towards larger sensors has brought with it bigger lenses, bigger rigs and a reversal of that trend. Perhaps this is a reaction to the fact that you can shoot a movie on your phone – directors distancing themselves from the world of YouTube.

My predictions? 4K will become more or less the norm for home delivery. Broadcast post production will gradually shift from HD to 4K as standard; the speed at which this happens depends on the wider TV landscape and on whether the rise of the streaming services continues. I don't see resolutions higher than UHD becoming a standard format for home viewing in the foreseeable future. Movie distribution will settle down to 4K, but acquisition may shift to 6K or 8K depending on the project – and on which camera Arri launches next.

But, as always, I could be totally wrong.

Image courtesy of Shutterstock.
