The 4K paradox

Written by David Shapton

RedShark on 4K

On the face of it, 4K is a done deal. But is it?

That was the rapid conclusion I came to when I arrived at the NAB show in Las Vegas this year. It was abundantly clear that 4K had moved from being somewhat speculative and experimental, to being current, almost mainstream technology.

Whereas last year there were a few 4K screens from major manufacturers - all of them very big and very expensive - this year, they were all over the place, and at all prices.

At the top of the pile were the color-calibrated screens intended for grading. These can easily cost $50,000. Near the bottom you have screens from Chinese companies that can cost under $2,000. And you know what? They weren't bad.

There were new 4K cameras everywhere: from Panasonic, Sony, JVC (although nowhere near shipping yet!), Grass Valley, AJA and Blackmagic.

And all the support infrastructure was there as well, from 12G SDI, through 4K switchers, to seemingly dozens of 4K HEVC (H.265) encoders.

There is no doubt whatsoever that the production and broadcast worlds are ready for 4K (although broadcast bandwidth still remains an issue).

So it came as a surprise to hear that there is another point of view - one that says the demand for 4K just isn't there yet, and that there is no need for everyone to upgrade their equipment right now.

I heard this from several exhibitors at the show.

Surprise

It won't come as news that the vendors with a range of 4K products are the ones saying 4K is a done deal, while the ones without a 4K range say there's little demand.

The thing is that they may both be right.

If you're going to make a major investment in new equipment, it's at least arguable that you should invest in products that are upgradable to 4K. Get this wrong and you'll be "stuck" with HD longer than you might want to be, while the rest of the world moves on.

Perhaps the most pressing reason for upgrading to 4K is that it bestows longevity on your material: capture and post-produce it in 4K and it will still look good in ten or twenty years' time. If you don't, your "mere" HD material will look shabby in a few years, when it's displayed on a screen the size of a living-room wall.

Opposite point of view

But there's another, completely valid point of view. It's not the complete opposite; it exists in a different context.

What if, for example, you specialise in video coverage of things that are here today and - most probably - gone tomorrow? Sporting events, conferences and, to some extent, news.

Are your clients and viewers clamouring for 4K? Probably not. Partly because they haven't thought that deeply about it, but also because HD still does an extremely good job, even though a single frame of HD is only around 2.1 megapixels as opposed to 4K's roughly 8.3 megapixels per frame.

To a very large audience, HD is still "good enough".

There's a very good reason why some manufacturers might not want to jump aboard with 4K yet: they would need to radically upgrade their product range.

The extent to which they would have to do this depends on the nature of their products.

Companies that provide media management systems are mainly just dealing with files. 4K files take up more space than HD ones, but all these companies need to do is specify bigger drives and find more bandwidth (a gross oversimplification, but for our purposes it's true).

But if a product relies on CPU, GPU or even dedicated hardware video processing, then with 4K it will only be able to do a quarter as much processing per frame, because there are four times as many pixels.

Just imagine you're about to buy some software to process your video in real time - perhaps as part of an all-in-one system for live video. Unfortunately, the more pixels in each frame, the less processing your system can do in real time.

If you suddenly move from HD to 4K, your system will only be able to do a quarter as much processing in the available time, which means fewer effects and a less powerful environment overall.
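The arithmetic behind that "quarter as much" figure is easy to check. Here's a quick sketch, assuming consumer-standard UHD "4K" at 3840 x 2160 (cinema DCI 4K, at 4096 x 2160, is slightly wider, so the ratio there is a little over four):

```python
# Pixel counts per frame for HD (1920x1080) and UHD 4K (3840x2160)
hd_pixels = 1920 * 1080    # about 2.1 megapixels
uhd_pixels = 3840 * 2160   # about 8.3 megapixels

# 4K doubles both width and height, so the pixel count quadruples
ratio = uhd_pixels / hd_pixels

print(f"HD: {hd_pixels / 1e6:.1f} MP per frame")
print(f"4K: {uhd_pixels / 1e6:.1f} MP per frame")
print(f"A 4K frame has {ratio:.0f}x the pixels of an HD frame")
```

With the same processing power and the same frame rate, four times the pixels per frame leaves a quarter of the processing budget for each pixel.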

So, a rather depressing conclusion to this article?

No. Not really.

The reason is Moore's law and the exponential nature of technological progress. We can see it all around us: processors get faster, and digital electronics gets cheaper.

In two or three years, the improvements predicted by Moore's law will have taken effect and, ultimately, we'll be working in 4K on our computers as comfortably as we do today in HD.

 
