
How to make 4K look like 8K

3 minute read


Sometimes even 4K doesn't look like 4K. There's a lot you can do to make your video seem sharper.

First of all, I need to say that you can use the words "sharp" and "soft" in different senses. Some of the best 8K footage I've seen doesn't look sharp - in a good way. And some looks soft, but in a bad way.

What I'm going to be talking about here is how you can make your audience think you shot your film at a higher resolution than you actually did. I'm using the term "sharpness" here to mean "looks like a higher resolution than it was actually shot at."

There's a very important place to start this article. It sounds obvious, but I'll say it anyway: don't do anything to reduce the resolution of your pictures. I'm saying this because, although 8K sounds massively more than 4K, it's very easy to throw away that resolution advantage accidentally, or through not taking enough care. It's absolutely true that 8K has four times as many pixels as mere 4K, but linearly (i.e. horizontally and vertically) it only has twice the resolution (around 8,000 pixels across instead of 4,000).
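The arithmetic behind that "four times the pixels, twice the resolution" point is easy to check. A quick sketch, using the standard UHD frame sizes (3840×2160 and 7680×4320):

```python
# Pixel counts for the standard UHD frame sizes.
uhd_4k = 3840 * 2160   # "4K" UHD: about 8.3 million pixels
uhd_8k = 7680 * 4320   # "8K" UHD: about 33.2 million pixels

print(uhd_8k / uhd_4k)            # 4.0 -> four times as many pixels overall
print(7680 / 3840, 4320 / 2160)   # 2.0 2.0 -> but only twice the resolution on each axis
```

Doubling each linear dimension quadruples the pixel count, which is why 8K sounds like a bigger jump than it looks.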

What this means in practice is that you have to shoot with care to make sure that your shots are in focus. Even a slight focus error can reduce your effective resolution from 8K to 4K, or even to HD. The flip side is that if you're shooting in 4K, that same care will make your footage look better than ordinary 4K, if you see what I mean.

What is sharpening?

Sharpening is an effect that boosts the contrast at edges. Used with moderation, it can be useful. It's important to understand that sharpening doesn't really add information, but it might make certain elements seem more visible and, yes, sharper. It works a bit like a tone control on a hi-fi amplifier. In the same way that sound can be broken down into a set of frequencies, so can images. Think of a chess board. Ideally, the edges between the squares are really sharp. If you were to move a tiny camera across a chess board at a regular rate, you'd see a brightness signal that was essentially a square wave, rising almost instantaneously to maximum as it passed over a white square, and falling to a minimum, again instantaneously, as it passed over a black square.

The sudden change between black and white represents a high frequency. If the squares on the chess board were blurred, you'd get a lower frequency. So to make it seem sharper, you apply a boost to the high frequencies in the image. This might not give you a more accurate image, but it will look sharper.
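One common way to apply that high-frequency boost is an unsharp mask: blur the image, subtract the blurred copy to isolate the high frequencies, then add a scaled version of them back. Here's a minimal sketch of the idea on a one-dimensional "chess board" brightness signal, assuming NumPy is available (the kernel size and boost amount are arbitrary illustrative values):

```python
import numpy as np

# A 1-D "chess board" brightness signal: alternating black (0) and white (1) squares.
signal = np.repeat([0.0, 1.0] * 4, 10)

# Soften it with a simple box blur (a low-pass filter) to mimic a slightly soft lens.
kernel = np.ones(5) / 5
soft = np.convolve(signal, kernel, mode="same")

# Unsharp mask: isolate the high frequencies (signal minus its blurred copy)
# and add a scaled version back. This exaggerates edge contrast but adds
# no genuine detail.
amount = 1.5
high_freq = soft - np.convolve(soft, kernel, mode="same")
sharpened = soft + amount * high_freq
```

The sharpened signal overshoots above white and undershoots below black at each edge, which is exactly the "boosted edge contrast" the eye reads as sharpness, and also the source of the halo artifacts that over-sharpened video is known for.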


Higher contrast edges are high frequency, while softer edges are low frequency. Image: Shutterstock

Sharpening used to be turned on by default in video cameras, even with HD, and it led to certain artifacts. Almost the whole point of 4K and 8K is that you don't need sharpening, but for a specific effect, it can be justified, and although it will never turn 4K into 8K, it might make your 4K material look sharp enough to fool viewers into thinking they're watching 8K - especially in comparison with other 4K footage that might not look as sharp.

Another technique is to do your post production graphics at 8K resolution. Titles and motion graphics are computer generated and so can easily be output at 8K. I remember the first time I saw graphics and titles in HD, and I was amazed that such small text was clearly legible. It sets the expectation for the whole project.


Finally, we're almost certain to see software coming soon that will claim to upscale HD and 4K footage to 8K. I have an 8K TV in my living room (by Samsung) that does exactly that. It seems to work.

How does it do this? In the case of the 8K Samsung, I don't know, although I'm assured that it is genuine AI as opposed to an empty marketing trick. One way that AI might work is by "recognising" objects at various levels of abstraction and drawing the edges of them. This is entirely different to the type of sharpening mentioned above, because it is actually adding information, although this will lower the overall authenticity of the image.

Ultimately, there's a clear message here. Apart from a few tricks mentioned above, the way to make your image look like it was shot at a higher resolution is to put all your effort into making the best image at the current resolution. Use good lenses, light well. Focus accurately. Maybe even use a smaller sensor with a deeper depth of field (contrary to the current fashion for full frame sensors).

4K isn't just 4K and 8K isn't just 8K. I've seen 8K material that looked like 4K or worse, and I've seen 4K that's so good it was hard to believe it wasn't 8K. While 8K is a worthwhile improvement, it really isn't that much better than very good 4K. And with so many 6K cameras coming on to the scene, you'll be able to make 4K that's so good that people will just watch the pictures in awe. No one will even mention resolution.

Tags: Production