RedShark Editor Dave Shapton presents the still controversial view that 4K is already a done deal, and that it's not just a marketing ploy but has real advantages for content creators and viewers
One of the advantages of getting older is that you gain a sense of perspective. You remember what things were like decades ago, and with today’s rate of change, that means they were nothing like what we have today.
When I was born, the state of the art in consumer electronics was a valve (“tube”) radiogram. This was a thing that virtually anyone could repair with a basic soldering iron, and it had components that you could not only see, but could grab with your fingers. Today’s electronics have multiple billions of transistors on a single chip that are orders of magnitude too small to see, and run fast enough to either process several thousand tracks of digital audio or compress high definition video into ProRes in real time - while powered by a battery.
So while I’m not quite old enough to remember people saying that “Colour Television won’t catch on” (although that certainly did happen), I do remember people complaining just a decade ago that HD was just a fad, or at least that it was a gimmick to make us all upgrade our TVs.
I’m going to gloss over 3D, which might just be the exception that proves the rule, and go straight to 4K, which, as the next TV “standard”, is now poised to be well and truly established within an even shorter time than we were predicting - and we’ve always been on the side of “it’s going to happen sooner rather than later”. (Note that although 4K and UHD are, strictly, different, when I say 4K in this article I mean both. My justification is that both are roughly four times the resolution of HD, with 4K having slightly more horizontal pixels than UHD.)
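If you want to check the “roughly four times” arithmetic yourself, a few lines of Python do it. (The figures below are the standard published dimensions for 1080p HD, UHD and DCI 4K; the script itself is just my illustration, not anything from a broadcast spec.)

```python
# Pixel counts for HD, UHD and DCI 4K - a quick sanity check of the
# "roughly four times the resolution of HD" claim.
formats = {
    "HD (1080p)": (1920, 1080),
    "UHD":        (3840, 2160),
    "DCI 4K":     (4096, 2160),
}

hd_pixels = 1920 * 1080
for name, (w, h) in formats.items():
    pixels = w * h
    print(f"{name}: {w} x {h} = {pixels:,} pixels "
          f"({pixels / hd_pixels:.2f}x HD)")
```

UHD comes out at exactly four times the pixel count of HD, and DCI 4K slightly more, thanks to those extra horizontal pixels.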
Let’s come back to the rate of adoption in a minute and assume for now that 4K is going to happen for good reasons. Let’s say that it makes things better for production, post production, and, ultimately, the viewers.
The justification for 4K
There’s a lot to say here, so I’ll keep it brief.
Strictly speaking, 4K resolves more detail than lower resolutions, and it’s close to a truism that more detailed pictures look better to us. Of course there are downsides, like cost and increased bandwidth requirements, but let’s stick to the aesthetic justifications for now.
And it is easy to justify 4K.
4K for acquisition
For a start, even if you’re not going to distribute your production in 4K, any derivatives of your original 4K material will look better. They’ll look better if you’re downscaling from uncompressed material, and they’ll look better if you compress your content. This is because there’s more information to start with. Compression algorithms can make better decisions because they have more information to base those decisions on.
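Here’s a toy sketch of the “more information to start with” point (my own illustration, not anything from a real scaler or codec): averaging each 2x2 block of noisy “4K” samples into one “HD” pixel lets random sensor noise partially cancel, which is one reason downscaled 4K tends to look cleaner than material shot natively at the lower resolution.

```python
import random

# Simulate one flat grey patch: TRUE_VALUE is the "real" brightness,
# and each captured sample carries Gaussian noise of strength NOISE.
random.seed(0)
TRUE_VALUE, NOISE, BLOCKS = 100.0, 10.0, 10_000

native_err = 0.0      # error of a single noisy sample (native HD shoot)
downscaled_err = 0.0  # error after averaging a 2x2 block (4K, downscaled)
for _ in range(BLOCKS):
    samples = [TRUE_VALUE + random.gauss(0, NOISE) for _ in range(4)]
    native_err += abs(samples[0] - TRUE_VALUE)
    downscaled_err += abs(sum(samples) / 4 - TRUE_VALUE)

print(f"mean error, native HD:     {native_err / BLOCKS:.2f}")
print(f"mean error, downscaled 4K: {downscaled_err / BLOCKS:.2f}")
```

Averaging four samples halves the noise amplitude, so the downscaled version sits noticeably closer to the true value - and the same “more samples to work with” logic is what lets a compressor make better decisions.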
Think about it like this: suppose you have to make a painting of me, and you can choose between a high-resolution photo and a detailed Lego model as your source. Quite obviously, you’d choose the photo and play with the Lego. Here’s the crux of the analogy: the Lego model is high definition, and the photo is 4K. You can see why you’d choose 4K as the source every time.
So this illustrates something there isn’t much argument about anyway: if nothing else, 4K is better for acquisition. But it isn’t essential - just look at the continued success of the Arri Alexa, which, even after this year’s NAB, remains resolutely sub-4K, yet produces such great-looking output that no-one I know has come close to criticising it for not having enough resolution.
HD demonstrated to everyone that shooting at a higher resolution than you need for distribution makes the downscaled pictures look better, and the same is true of 4K in relation to HD.
There are other benefits to acquiring your material in 4K, especially if you’re going to watch it at a lower resolution: you can change your camera angle and framing in post.
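The reframing headroom is easy to quantify (again, my own sketch): a UHD original holds four full HD frames’ worth of pixels, so you can crop any 1920x1080 window out of a 3840x2160 frame and still deliver native HD with no upscaling at all.

```python
# How much room does a UHD original give you to reframe an HD delivery?
UHD_W, UHD_H = 3840, 2160   # UHD acquisition frame
HD_W, HD_H = 1920, 1080     # HD delivery frame

max_pan_x = UHD_W - HD_W    # horizontal room to slide the HD crop, in pixels
max_pan_y = UHD_H - HD_H    # vertical room
print(f"Pan range inside the UHD frame: {max_pan_x} x {max_pan_y} pixels")
print(f"Maximum punch-in with no upscaling: {UHD_W / HD_W:.0f}x zoom")
```

In other words, you can pan a full HD frame-width sideways, or punch in to a 2x “zoom”, entirely in post - all from a single locked-off 4K shot.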