
Higher resolutions do not necessarily mean much more data. Here's why


Replay: Higher resolutions mean much higher storage requirements, don't they? Or do they? Phil Rhodes examines why higher resolutions can make for much more efficient compression.

In 2017, NHK published a paper called “HEVC/H.265 codec system and transmission experiments aimed at 8K broadcasting”, which talked about compression for its upcoming 8K broadcasting efforts. The concern is clear: it's one thing to need a bit more hard disk space to hold 8K footage. It's another thing entirely to put together a complete broadcast infrastructure, including point-to-point wireless microwave links which, according to the paper, need to work at a minimum of 60 frames per second. That's especially true given that almost all modern distribution codecs are, at least in their long-GOP forms, very asymmetric – that is, they take a lot more work to encode than to decode.
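Anyone who wants to see that asymmetry for themselves can do so with a quick sketch along the following lines. It assumes ffmpeg is installed with the libx265 encoder, and the source filename is made up for the example:

```python
# Rough sketch: time an HEVC encode against a decode of the same clip
# using ffmpeg. Assumes ffmpeg is built with libx265 and that a source
# file called "input.mov" exists -- both are assumptions for illustration.
import subprocess
import time

def timed(cmd):
    start = time.perf_counter()
    subprocess.run(cmd, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    return time.perf_counter() - start

# Encode the source to HEVC at a fixed bitrate.
encode_s = timed(["ffmpeg", "-y", "-i", "input.mov",
                  "-c:v", "libx265", "-b:v", "10M", "hevc_test.mp4"])

# Decode the result, discarding the output frames.
decode_s = timed(["ffmpeg", "-i", "hevc_test.mp4", "-f", "null", "-"])

print(f"encode: {encode_s:.1f}s, decode: {decode_s:.1f}s")
print(f"encode takes roughly {encode_s / decode_s:.1f}x as long as decode")
```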

Notice that the right half of the picture, which is very heavily compressed, suffers visible discontinuities at the edges of 8x8 pixel blocks. More modern formats don't do this

Regardless of what’s happened to HEVC since its inception, it doesn't really matter whether the extra data we’re demanding is in pursuit of HDR (admired), 8K (accepted) or high frame rate (disliked outside sports). The same pixel in two adjacent frames is related to its neighbour in time in much the same way as two pixels which sit physically next to one another are related in space, and that holds regardless of bit depth. It’s just data.
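That relatedness is easy to put a number on. The sketch below is purely illustrative – it makes up a drifting, noisy gradient in place of real footage, which would normally be loaded as a stack of grayscale frames – but it compares how closely a pixel resembles its neighbour in space with how closely it resembles itself one frame later:

```python
# Sketch: compare spatial correlation (neighbouring pixels in one frame)
# with temporal correlation (the same pixel in adjacent frames).
# "frames" is assumed to be a (num_frames, height, width) array of
# grayscale video; the synthetic data below merely stands in for it.
import numpy as np

def correlation(a, b):
    a = a.astype(np.float64).ravel()
    b = b.astype(np.float64).ravel()
    return np.corrcoef(a, b)[0, 1]

def spatial_and_temporal(frames):
    f0, f1 = frames[0], frames[1]
    # Spatial: each pixel against its right-hand neighbour.
    spatial = correlation(f0[:, :-1], f0[:, 1:])
    # Temporal: each pixel against the same pixel one frame later.
    temporal = correlation(f0, f1)
    return spatial, temporal

# Synthetic stand-in for real footage: a noisy gradient that drifts
# one pixel per frame.
h, w = 540, 960
base = np.tile(np.linspace(0, 255, w), (h, 1))
frames = np.stack([np.clip(np.roll(base, t, axis=1)
                           + np.random.normal(0, 5, (h, w)), 0, 255)
                   for t in range(2)])

s, t = spatial_and_temporal(frames)
print(f"spatial correlation: {s:.3f}, temporal correlation: {t:.3f}")
```

On anything resembling normal footage, both numbers come out close to one, which is exactly why codecs exploit both kinds of redundancy.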

But there has often been a tendency to apply heavier compression to larger frames. It's easy to leap to the conclusion that this is done purely for convenience, with a hang-the-consequences sort of attitude. Sometimes that's probably true, but subjectively, it can work out OK. For whatever reason, higher resolution images seem to stand more compression per pixel than lower-resolution ones. Yes, that implies that an 8K image needs less than four times the data of a 4K one. A higher resolution image still needs more data – just not, perhaps, as much more as simple multiplication would imply.
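To put some rough numbers on that, here's the back-of-the-envelope arithmetic. The bitrates and frame rate are invented for the sake of the example, not taken from any real delivery specification:

```python
# Back-of-the-envelope bits-per-pixel arithmetic. The bitrates here are
# invented for illustration; the point is only that holding quality does
# not require scaling bitrate linearly with pixel count.
formats = {
    # name: (width, height, assumed bitrate in Mbit/s)
    "HD": (1920, 1080, 10),
    "4K": (3840, 2160, 25),
    "8K": (7680, 4320, 60),
}

fps = 25  # assumed frame rate

for name, (w, h, mbps) in formats.items():
    bits_per_frame = mbps * 1e6 / fps
    bpp = bits_per_frame / (w * h)
    print(f"{name}: {w * h / 1e6:.1f} Mpx at {mbps} Mbit/s -> {bpp:.3f} bits/pixel")
```

The bits-per-pixel budget falls as the resolution rises, which is the claim expressed in numbers.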

This is an 8K display, but it's still only 32 inches, meaning problems can be invisible

Mathematically ridiculous?

Mathematically, that sounds ridiculous, but let’s examine why.

We could describe an image as edges of detail connected by gradients. We could demand that those gradients be recorded with bit-perfect accuracy and recreated precisely as they were shot, but small errors in a smooth gradient aren't the sort of compression artefact people actually notice. What people do notice is the edges of compression blocks, which are most visible in smooth gradients – and many kinds of modern compression avoid that problem.
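A toy experiment shows why gradients are cheap and edges are expensive. The sketch below (which uses scipy, and keeps an arbitrary three coefficients per block) transforms an 8x8 gradient and an 8x8 hard edge with the sort of block DCT older codecs used, throws away all but the largest coefficients, and measures how badly each block suffers:

```python
# Sketch: keep only the largest few DCT coefficients of an 8x8 block and
# see how well it reconstructs. A smooth gradient survives almost intact;
# a hard edge does not. The "keep 3 coefficients" choice is arbitrary.
import numpy as np
from scipy.fft import dctn, idctn

def rms_error_keeping(block, keep=3):
    coeffs = dctn(block.astype(np.float64), norm="ortho")
    # Zero everything except the 'keep' largest-magnitude coefficients.
    threshold = np.sort(np.abs(coeffs).ravel())[-keep]
    coeffs[np.abs(coeffs) < threshold] = 0.0
    reconstructed = idctn(coeffs, norm="ortho")
    return float(np.sqrt(np.mean((block - reconstructed) ** 2)))

# A smooth left-to-right gradient across the block.
gradient = np.tile(np.linspace(0, 255, 8), (8, 1))

# A hard vertical edge: black on the left, white on the right.
edge = np.zeros((8, 8))
edge[:, 4:] = 255.0

print(f"gradient block, RMS error: {rms_error_keeping(gradient):.1f} grey levels")
print(f"edge block,     RMS error: {rms_error_keeping(edge):.1f} grey levels")
```

The gradient comes back within a couple of grey levels; the edge is mangled by more than thirty. Edges are where the bits have to go.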

So it's really about the edges or, to be more specific, areas of high contrast. Those are rendered with more pixels in the larger image, and in order to be resolved at the higher resolution they'll need to be stored as more data – but they don't make up the whole image. What's more, something that looks like a really sharp edge in a lower-resolution image might be revealed as a rather less sharp edge in a higher-resolution image of the same scene, so the amount of contrast we're trying to describe, per pixel, goes down.
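That softening effect can be simulated. The sketch below models a slightly soft optical edge (the amount of blur is an arbitrary choice) and samples it across the same field of view at HD and 4K widths; the biggest pixel-to-pixel jump – the contrast each individual pixel has to describe – roughly halves at the higher resolution:

```python
# Sketch: the same slightly-soft optical edge, sampled across the same
# field of view at HD and 4K widths. The blur width is an arbitrary
# stand-in for lens and sensor softness.
import numpy as np

def sample_edge(num_pixels, blur_fraction=0.001):
    # Positions across the frame, 0..1, one sample per pixel.
    x = np.linspace(0.0, 1.0, num_pixels)
    # A soft edge at the centre of frame, modelled with a tanh ramp.
    return 255.0 * 0.5 * (1.0 + np.tanh((x - 0.5) / blur_fraction))

for name, width in [("HD", 1920), ("4K", 3840)]:
    row = sample_edge(width)
    max_step = np.max(np.abs(np.diff(row)))
    print(f"{name}: largest pixel-to-pixel step = {max_step:.1f} grey levels")
```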

Fantastic as LTO is, it isn't a camera format

This is all speculation. With the best will in the world, it's easy to be duped by the fact that we often see both higher and lower-resolution images on screens of the same size, as opposed to looking at them with pixels of the same size. At some point, it becomes an issue of what we're trying to achieve by increasing resolution: a bigger screen that looks just as good, or a screen the same size that looks better?

But if anyone has any doubts about this, consider the following example. Given a 4K monitor and both HD and 4K video files compressed to the same bitrate with the same compression, which would we expect to look sharper?

Most people would instinctively assume the 4K, despite the fact that both files objectively contain the same amount of data. There has to be a reason why that is.
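For anyone who wants to run that test themselves, something like the following will produce the two files – assuming ffmpeg with libx264, and a 4K source clip whose filename is invented for the example:

```python
# Rough sketch of the experiment described above: encode the same 4K
# source (here imagined as "source_4k.mov") at HD and at 4K, both at the
# same 8 Mbit/s, then view them full-screen on a 4K monitor. Codec,
# bitrate and filenames are all illustrative choices.
import subprocess

BITRATE = "8M"

jobs = {
    # output name: scale filter (None means keep the source resolution)
    "test_hd.mp4": "scale=1920:1080",
    "test_4k.mp4": None,
}

for output, scale in jobs.items():
    cmd = ["ffmpeg", "-y", "-i", "source_4k.mov"]
    if scale:
        cmd += ["-vf", scale]
    cmd += ["-c:v", "libx264", "-b:v", BITRATE, output]
    subprocess.run(cmd, check=True)

print("Now compare test_hd.mp4 and test_4k.mp4 full-screen on a 4K display.")
```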

Tags: Studio & Broadcast
