
Is 4K pointless, or perfect? Or is it both?


The great 4K debate

4K is being thrust at us like an out-of-control steamroller. But hang on a minute! HD hasn't suddenly got worse, and we all thought it was pretty good a couple of years ago.

Here's a bit of a spoiler alert: we still like 4K. But when you look at the arguments on the web for and against 4K, they almost all suffer from the fact that they are either for or against it. None of them, therefore, gives the complete picture.

What follows is only my take on the subject, but it's written from an objective point of view. If you disagree, please let me know in the comments!

The case against 4K

Well, I've already said it. HD is already pretty good. What's more, many of us haven't even seen what HD is capable of.

The same was true of SD. Most of us (and by "most of us" I don't necessarily mean professionals like RedShark readers, but most people in their homes, watching badly adjusted televisions connected to a terrestrial analogue or digital service) didn't realise quite how bad SD was. Even so, it did the job for many decades. But I remember the first time I saw SD in a television studio, on a studio monitor showing the feed directly from a studio camera. It looked amazing! Quite honestly, that was as big a jump in quality over my previous experience of SD as the first time I saw HD.

For a long time after that, I wished that I could merely have that studio quality SD in my home. The nearest I got to this was when I bought my first DVD player and connected it using component cables. I thought it was pretty breathtaking in comparison with broadcast SD and VHS.

When I started working with HD, in around 2003, I couldn't believe the quality of the images coming from the cameras. It didn't look like television at all. It didn't look much like film either, but it was amazingly crisp and sharp. It was certainly ideal for a mini-documentary I was producing about a historic steam train in the UK called the "Flying Scotsman".

 

Flying Scotsman Footplate Crew

 

It was hard to imagine that there might one day be a video standard that was even more precise and clear than HD, especially when digital still cameras at the time were maxing out at little more than HD frame-grab resolutions anyway.

Well, now most of us (who care about image quality, anyway) have got HD TVs, and quite a few of us actually have them connected to an HD source! (Anecdotally, there are still plenty of viewers who have absolutely no idea that their HD TV is only showing them SD.)


Still not seeing HD

But the trouble is that we're mostly still not seeing "proper" HD.

In the UK, at last, BBC1, the first national channel, is in HD. Even more recently, as the BBC news operation moved into the newly extended Broadcasting House, at the top of Regent Street in London, we started receiving the main news bulletins in HD as well, and very impressive they look too: with ideal studio lighting and top-end studio cameras you would expect them to be good. Except that when you look closely at the picture, you can still see compression artefacts. And what you definitely won't see is progressive video, because at 1080 lines, broadcast video is always interlaced.

But connect your HD TV to a decent Blu-ray player and you'll see a big difference. Despite the fact that a lot of films go for a softer "look" than broadcast studio television, and that Blu-rays are compressed as well, they just look better, because the compression is mostly gentler than it is on broadcast TV, and because we get proper progressive footage.

The bottom line with HD is that when it's good, it's very good indeed, and it is at least arguable that you don't need higher resolutions than HD when it's at its best.

ARRI Alexa

And just to reinforce this, there's the ARRI Alexa.

If you're not familiar with this camera, two things stand out. The first is that it has a sensor that a consensus of top-end filmmakers and DoPs regard as about as good as you can currently get, despite it being, effectively, only HD resolution. Whatever it is you need to make a movie look good - the way it handles highlights, say, or skin tones - this sensor apparently has it. The second is its track record, which speaks for itself: you don't get used for films like Skyfall and Life of Pi without being exceptionally good.

And these are, of course, cinema films, which are shown on extremely big screens in cinemas. The fact that HD (or 2K - very slightly more pixels horizontally) is more than acceptable in a cinema says as much about the fact that films are rarely viewed optimally in cinemas as it does about the quality of the sensors, although digital projection is gradually improving the experience.


Not all images justify 4K

And this final point is just one example of what might be 4K's biggest flaw: it's so precise and so demanding of image quality that, very often, even with a complete 4K workflow, it's very difficult to capture an image that actually justifies 4K resolution.

It's one thing to say that, at eight million pixels or so, 4K is only equivalent to a mid-to-low-range still camera resolution, but that isn't the whole story. It's quite another thing to capture a moving image and keep it completely in focus. You only have to blur two adjacent pixels in a 4K image to be back down to HD again. With motion artefacts as well, the times when you will see a genuine 4K image might only amount to moments in a whole production (remember, I'm deliberately looking on the gloomy side here!).
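For what it's worth, the raw pixel arithmetic behind those figures is simple enough (a quick sketch, using the standard UHD and Full HD raster sizes):

    # Pixel counts for the rasters discussed above.
    uhd = 3840 * 2160   # "4K" UHD: 8,294,400 pixels - the "eight million or so"
    hd = 1920 * 1080    # Full HD: 2,073,600 pixels

    print(f"UHD: {uhd:,} pixels")
    print(f"HD:  {hd:,} pixels")
    print(f"Ratio: {uhd / hd:.0f}x")  # 4x the pixels, but only 2x in each direction

Four times the pixels, in other words, but only twice the resolving power along any one axis - which is why a blur of a couple of pixels is enough to throw the advantage away.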

When you take all of this into account, and you look at the potential for degradation throughout the whole production and distribution chain, it's easy to see how even with 4K end-to-end, what you finally see might be nothing like 4K.

Of course, technologies like HEVC (H.265) are going to help. But might H.265 not be better used to give us HD without artefacts, and at a higher frame rate?

Frame rates

Talking of frame rates, recent work by the BBC has shown what an improvement higher frame rates can make. Don't forget that spatial resolution (measured in pixels) is only part of the story with video. Temporal resolution (measured in frames per second) is important as well. This is why VHS recordings were (just about) acceptable, whereas a VHS freeze frame almost certainly wasn't. Again, some of that extra bandwidth freed up by more efficient compression could give us more frames per second and less motion blur.
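To put a rough number on the motion-blur side of this, here's an illustrative sketch (the pan speed is invented, and a 180-degree shutter is assumed, so each exposure lasts half the frame interval):

    # Illustrative motion-blur estimate: how many pixels does a subject smear
    # across during a single exposure? The pan speed below is an assumption.
    frame_width_px = 3840                # UHD frame width
    pan_speed_px_s = frame_width_px / 4  # subject crosses the frame in 4 seconds

    for fps in (25, 50, 100):
        exposure_s = 0.5 / fps                 # 180-degree shutter
        blur_px = pan_speed_px_s * exposure_s  # smear per exposure, in pixels
        print(f"{fps:>3} fps: ~{blur_px:.0f} px of motion blur")

    # 25 fps: ~19 px, 50 fps: ~10 px, 100 fps: ~5 px

Even a gentle pan like that smears detail across far more than the two pixels it takes to drop a 4K image back to HD; higher frame rates claw some of that back.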

What we've seen above is that what is in theory a fourfold increase in resolution (with twice as many pixels in each dimension) is likely to give us only marginal benefits, and even then only in ideal conditions. Given normal room sizes, normal budgets and normal eyesight, it's a bit like the choice between a car that will merely do 160 mph and one that's capable of 230 mph. The faster one requires immensely more complex and expensive technology, and you're unlikely to do more than 100 mph in either of them most of the time.


The case for 4K

So, from my perspective, that's the case against 4K. Now the case for it.

First of all: if you're given a choice, and all other things being equal, why would you not choose to shoot in as high a resolution as possible? Not only would this offer your audiences the best experience wherever and whenever the right display equipment is available, but it would preserve your work for future pristine display. Choose a lower-resolution format and at some point in the future, someone might say, "We can't show this because it's not good enough".

There are other reasons, too.

Compression

Given that compression is likely to be around for a very long time yet, it will always work best on the source that contains the most information, and if that's a choice between HD and 4K, then 4K wins. Here's an analogy that I roll out frequently to illustrate this.

Imagine you're having your picture painted by an artist known for photographic likenesses. Would you prefer that he or she had you sitting for a portrait, or used a Lego model of you? Which one is going to give the best result? Obviously the higher-resolution version: you, in other words.

Even if your intended target is a mobile phone, compression is going to work better on a 4K source. If you're delivering on Blu-ray, the same thing applies: HD looks fantastic from a 4K source.

There are other reasons why you might want to shoot in 4K. If you're outputting to HD, you can pan and scan around the image, and zoom in, right up until you reach the HD raster size. Amazingly, you can fix your composition in post.
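Here's a minimal sketch of that reframing idea, assuming the decoded UHD frame is sitting in a NumPy array (the crop offsets below are arbitrary):

    import numpy as np

    # Stand-in for a decoded UHD frame: height x width x RGB.
    uhd_frame = np.zeros((2160, 3840, 3), dtype=np.uint8)

    def reframe_to_hd(frame, x, y):
        """Cut a 1920x1080 window out of a larger frame: 'pan and scan' in post.
        (x, y) is the top-left corner of the crop and must leave room for a
        full HD raster."""
        h, w = 1080, 1920
        if y + h > frame.shape[0] or x + w > frame.shape[1]:
            raise ValueError("crop window falls outside the source frame")
        return frame[y:y + h, x:x + w]

    # Recompose the shot in post: push in towards the right-hand side.
    hd_crop = reframe_to_hd(uhd_frame, x=1600, y=540)
    print(hd_crop.shape)  # (1080, 1920, 3)

As long as the crop stays at least 1920x1080, nothing is being upscaled, which is why the result still holds up as native HD.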

Screen size and viewing distance

I want to turn now to the argument that says you can't benefit from 4K with normal screen sizes and normal viewing distances. What I'd say to this is: Why do you sit so close to your car's windscreen (windshield) and why is it so big and so wide? Simple. It's because it needs to fill your field of vision.

A windscreen doesn't have a resolution measured in pixels. What matters is how much of your visual field it fills. If you want immersive video, you either have to have a very big, very wide screen, or you have to sit very close to it, or both.

Now, imagine your current HD screen at four times the size. Whatever else you can say, you will need four times the pixels just to keep the perceived quality the same (this only applies if you're sitting close to it - but you will need to sit close for it to fill your field of vision).

That's the nub of it. If you keep screen sizes the same, 4K doesn't make sense, but if you quadruple the screen area (and don't forget that that only means doubling the diagonal), then it makes complete sense.
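A rough calculation makes the same point in numbers. The sketch below is illustrative only: it assumes 16:9 geometry, a 3 m viewing distance, and approximate 40-inch and 80-inch screen widths.

    import math

    def pixels_per_degree(width_px, screen_width_m, viewing_distance_m):
        """Approximate horizontal pixels per degree of visual field at the
        centre of the screen."""
        fov_deg = 2 * math.degrees(math.atan((screen_width_m / 2) / viewing_distance_m))
        return width_px / fov_deg

    distance_m = 3.0  # an ordinary living-room viewing distance (assumption)

    # A 16:9 40-inch screen is about 0.89 m wide; an 80-inch screen about 1.77 m.
    print(f'40-inch HD:  {pixels_per_degree(1920, 0.89, distance_m):.0f} px/degree')
    print(f'80-inch HD:  {pixels_per_degree(1920, 1.77, distance_m):.0f} px/degree')
    print(f'80-inch UHD: {pixels_per_degree(3840, 1.77, distance_m):.0f} px/degree')

    # Roughly 114, 58 and 117 pixels per degree respectively.

Doubling the diagonal at the same viewing distance roughly halves the pixels per degree; quadrupling the pixel count puts you roughly back where you started, only now the picture fills far more of your field of vision.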


The debate in practice

So, what's the conclusion here? The conclusion is that there are arguments for and against 4K, but they're not contradictory. It makes sense for them to be held by the same person. I know that, because that's what I feel.

What does this mean in practice? It means that if you look at individual aspects of 4K, like whether you should upgrade your 40" TV to a 40" 4K TV, then you might find answers that militate against the higher resolution. But what it also means in practice - especially if the trends we saw in the move to HD apply, if not more so, to the migration to 4K - is that prices will come down to around what they are for HD today. Not only that, but difficult and expensive 4K workflows will ease over time, and this will happen more quickly than it did with HD.

Do you remember how hard it was to work with HD when it first arrived? If your first experience of it was with HDV cameras, you'll probably recall how difficult it was to edit with the stuff. At the very least, you had to invoke a more edit-friendly intermediate codec, or you had to be prepared for slow, jerky performance.

In some ways, the same applies to 4K. But 4K is just a matter of scale, not a complete change of method. SD to HD meant that we had to use Long GOP codecs in post for the first time. We'll have to do the same with 4K but we're used to it; it's not going to come as such a culture shock. (And if you don't want to use Long GOP codecs, you can use production-quality codecs like ProRes at 4K resolution).

What's more, computers are more powerful. In the ten or twelve years since we started editing HD, we've gained multi-core machines, and the cores themselves are more powerful. Meanwhile, storage is much, much cheaper and loads faster too. Here's a clue: we don't have to put tape in our cameras any more!

So, while working with 4K is still going to be a significant jump, it's not going to be a massive one. It's unlikely to stop people moving to it. But what it will bring is the opportunity for even better pictures, less aliasing, more detail, a wider colour gamut, and movies and videos that won't look out of date in a mere five years. I think it's inevitable, and I'm glad that it is.

