With RED’s “announcement” of its 8K S-35 Helium sensor, it’s clear that the technology is not only possible but here now. It’s not experimental any more. From now on, 8K “cameras” will no longer be mules adapted from other devices, trailing four Ki Pro Quad recorders and sixteen HD-SDI cables, but real products with realistic data flows and viable storage regimes.
Before we look at how this has happened, almost completely outside the awareness of most industry practitioners and observers, let’s just remind ourselves about the very recent history of 4K.
4K's path to the mass market
Very soon after RedShark started, four years ago, we noticed that 4K was beginning to pick up momentum. 4K really started with the early discussions about Digital Cinema delivery formats, and then with RED’s first announcements on the subject nearly ten years ago. Back then, 4K seemed at best wild speculation, and at worst, science fiction.
But then, at the IBC show in September 2012, we saw Sony’s 84-inch Bravia 4K TV. It cost $25,000, but it was a beauty. It really did look like it was worth the money.
But would anyone buy these things, especially with an almost complete absence of 4K content to show? And at such a high price?
Well, the answer is that some people did, but the questions above answered themselves very quickly. Prices plummeted. Soon, 4K sets from China were available — with surprisingly good specifications — at around $1500, and some even below $1000.
Meanwhile, Netflix and other OTT providers were gearing up to make their own content - and it was almost universally shot in 4K.
In October 2013, we reported that a major UK department store chain had started to sell 4K TVs, but they were still very much in a minority.
Then, just a little over two years later, the vast majority of sets in the same store were 4K.
Most professional (or consumer, for that matter!) cameras released today are 4K.
4K isn’t universal in the home yet, but given that before very long you’ll be unable to buy an HD set in the stores, it’s not much of a stretch to say that 4K is now the dominant format.
Drawing all these threads together, what’s the overriding conclusion? It’s that almost everyone (modesty prevents us from saying here “except RedShark”) got 4K wrong. They got it wrong about the timescales, and they got it wrong about the need for it.
Let’s just clarify what we mean about the “need” for 4K. We mean it in a rather specific sense, which is this.
Nobody needs 4K as a means to carry on living. Nobody’s basic way of life depends on it. Of course not. As a simple communication tool, black-and-white SD TV can get the message across almost as well as high-resolution HDR video. But that’s not the point. For over a decade we’ve been taking part in a huge, fundamental paradigm change in filmmaking: we don’t use film any more. Film was very good. So good that there are still some people who prefer it. But now - and it seems to have happened incredibly quickly - digital video can match the abilities of film and, to some extent, go beyond them.
Looking to the 8K future
So, now, is the same thing going to happen with 8K? It certainly looks like it.
Just like with 4K, there’s plenty of scepticism about it. If I had to sum up current feelings about 8K I’d probably say that it’s seen as very high-end, very experimental, difficult to use, and ultimately pointless because there aren’t any 8K televisions.
None of this is true any more.
Of course 8K is high-end; it’s the highest resolution we can work with at the moment. But it’s no longer high-end in the sense of being unachievable by filmmakers with a “normal” budget. It’s not experimental any more. Real products will be arriving in the next few months and will be sold just like any other camera. Nor is it especially difficult to use, because — just as with 4K — computers and storage quickly caught up with the quadrupling of data rates and file sizes over the previous “maximum” resolution.
Is it pointless? Absolutely not. And here’s the real reason why we need 8K. It’s not because there are millions of 8K TVs in consumers’ living rooms waiting for 8K content. It’s because just as video shot in HD makes SD look better when it’s downsized, and 4K makes HD video look better, 8K will make everything look better. If you view it in its native resolution, then it will look amazing, but it will also make 4K look better than anything shot natively in 4K. In fact, every resolution derived by downsampling from an original 8K master will look better to some extent.
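The downsampling benefit is easy to sketch numerically. Here’s a minimal, illustrative Python example (ours, not from any camera maker’s pipeline): averaging 2x2 blocks of pixels — a crude stand-in for a real downscaling filter — cuts random sensor noise roughly in half, which is one reason 4K derived from an 8K master can look cleaner than native 4K.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny stand-in frame: flat mid-grey plus Gaussian sensor noise.
flat_grey = np.full((432, 768), 0.5)
noisy = flat_grey + rng.normal(0, 0.05, flat_grey.shape)

# Downsample to half resolution with a 2x2 box average: each output
# pixel is the mean of four source pixels.
h, w = noisy.shape
downsampled = noisy.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))

# Averaging four uncorrelated samples reduces noise by sqrt(4) = 2.
print(f"noise at full res:     {noisy.std():.4f}")
print(f"noise after downscale: {downsampled.std():.4f}")
```

A real scaler would use a better filter than a box average, but the noise-averaging effect is the same in kind.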
And then there are all the technical reasons why 8K will make life easier for post production. Quite simply, the more resolution you shoot with, the easier it is to work with green screens and to composite complicated hybrid CGI and real-world shots. For projects to be output at 4K or HD, it will be possible to reframe certain shots in post. 8K certainly won’t make things easier financially for facilities that have just upgraded to 4K, but none of this has to happen overnight. It is surely good, though, to have the option to work at a higher resolution if you want it.
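As a rough illustration of the reframing point, assuming the standard UHD frame widths (8K = 7680, 4K = 3840 and HD = 1920 pixels):

```python
# Back-of-envelope sketch: how far an editor can punch in on an
# 8K master while still filling a full-resolution delivery frame.
SOURCE_WIDTH = 7680  # 8K UHD master

deliveries = {"4K UHD": 3840, "HD": 1920}

# Maximum lossless zoom factor for each delivery format.
headroom = {name: SOURCE_WIDTH / w for name, w in deliveries.items()}

for name, factor in headroom.items():
    print(f"{name}: up to {factor:.0f}x punch-in from an 8K master")
```

In other words, an 8K master leaves a 2x crop for 4K delivery and a 4x crop for HD before any upscaling is needed.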
Ultimately, though, there’s another reason why 8K is here, and here to stay. It’s because manufacturers have to have somewhere to go after 4K. Whether you think that this is just cynical commercialism or a genuine advance in the art of filmmaking will depend on your current perspective.
Meanwhile, we suspect that the rapid move towards 8K is just another example of the effect of exponential technology. HD is around five times the data rate of SD. 4K is four times HD, and 8K is four times 4K. The fact that we’re sitting here talking about the actual arrival of 8K in commercial products says it all, we think.
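Those multipliers are easy to check from the pixel counts, assuming a PAL-style 720×576 SD frame and the usual HD/UHD sizes (at a fixed frame rate and bit depth, uncompressed data rate scales with pixel count):

```python
# Assumed frame sizes (width, height) in pixels.
formats = {
    "SD (PAL)": (720, 576),
    "HD":       (1920, 1080),
    "4K UHD":   (3840, 2160),
    "8K UHD":   (7680, 4320),
}

pixels = {name: w * h for name, (w, h) in formats.items()}

print(pixels["HD"] / pixels["SD (PAL)"])    # 5.0 - HD is 5x SD
print(pixels["4K UHD"] / pixels["HD"])      # 4.0 - 4K is 4x HD
print(pixels["8K UHD"] / pixels["4K UHD"])  # 4.0 - 8K is 4x 4K
```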