
You can tell the difference between HD and 4K at normal viewing distances


Phil Holland

A common accusation levelled at 4K and 8K is that they make no visible difference at normal viewing distances. But is that really true?

Back when 720p and 1080i were battling it out to become the dominant HD format, there was a lot of discussion about how there was no visual difference between them. Those arguments had some validity, seeing as we were comparing a 50/60fps progressive scan format (720p) with an interlaced 1080-line format.

That said, those discussions were very similar to what we are hearing now, particularly the arguments some were making against any form of HD at all. It was said that at the sorts of distances people view their televisions in the average living room, the difference between HD and SD is minimal. It goes without saying that if anyone suggested this now, it wouldn't be entirely inappropriate to ask if they were smoking illegal substances.

Why did we move to HD?

High definition was developed for one primary reason: to allow a more cinematic viewing distance from the screen, and hence a more immersive viewing experience. From an ideal seat in a cinema, viewers are around three to four screen heights away from the screen. But to view at that sort of distance and field of view on a home screen, standard definition just wouldn't cut it.

I could talk about the many different analogue implementations of HDTV, but sticking with the digital realm for the sake of simplicity, 1080 lines is generally the minimum accepted resolution for the intended viewing distance of around three screen heights. Notice that I said minimum. Remember that such formats were developed as a compromise between an ideal resolution and a practical, effective way of actually delivering the content into viewers' homes, primarily via over-the-air and satellite broadcast. If bandwidth and technology hadn't been an issue at the time, does anyone doubt that, given the chance, the engineers would have gone even higher?
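
As a rough sanity check on that three-screen-heights figure, here's a back-of-the-envelope calculation. It assumes the common one-arcminute rule of thumb for the eye's resolving power - an assumption on my part, not a number taken from any broadcast specification:

```python
import math

# Assumes the eye resolves detail down to about one arcminute
# (a common rule of thumb, not a figure from any standard).
ARCMINUTE_RAD = math.radians(1 / 60)

def pixel_arcminutes(lines: int, screen_heights: float) -> float:
    """Angular size, in arcminutes, of one scanline when viewed from
    `screen_heights` times the picture height away."""
    return math.atan((1 / lines) / screen_heights) / ARCMINUTE_RAD

# At three screen heights, 1080 lines sits right at the one-arcminute
# limit; 576-line SD is nearly twice as coarse as the eye can resolve.
for lines in (576, 1080, 2160):
    print(f"{lines} lines: {pixel_arcminutes(lines, 3):.2f} arcmin")
```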

Individual pixels in 1080 HD tend to become visible at a normal viewing distance on a screen size of around 100 inches diagonal - but that's not to say that we can't see the limitations before that. As high a resolution as 1080 is, it isn't a patch on the resolution captured by a stills camera and printed at 'large television' size in a framed print, for instance. I am sure that such a high quality print looks far more detailed at the average viewing distance than simple 1080p video.
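
The same arithmetic, applied the other way around, gives the distance inside which individual pixels become resolvable on that 100-inch screen - again assuming the one-arcminute rule of thumb:

```python
import math

ARCMINUTE_RAD = math.radians(1 / 60)

def resolve_distance_m(diagonal_in: float, lines: int,
                       aspect: float = 16 / 9) -> float:
    """Distance (metres) inside which one pixel subtends more than
    one arcminute, i.e. where pixels start to become visible."""
    height_m = diagonal_in * 0.0254 / math.hypot(aspect, 1)
    return (height_m / lines) / math.tan(ARCMINUTE_RAD)

# A 100-inch 1080p screen: pixels resolve within roughly 4 metres,
# comfortably inside a typical living room viewing distance.
print(f"{resolve_distance_m(100, 1080):.1f} m")  # ~4.0 m
```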

Aliasing on fine patterning is still very noticeable at HD resolutions. Such picture artifacts are visible at any normal viewing distance. To pretend that they aren't is nonsense.
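
To see the mechanism in miniature, here's a deliberately tiny sketch: a one-dimensional stripe pattern at the finest pitch the source can hold, downsampled 2:1 both naively and with filtering. The numbers are illustrative, but the behaviour - fine detail turning into a false, phase-dependent value - is exactly what shows up on screen as moiré and shimmer:

```python
import numpy as np

# Stripes at the finest representable pitch: 1 px on, 1 px off.
fine = (np.arange(16) % 2 == 0).astype(float)  # 1,0,1,0,...

naive = fine[::2]                            # drop every other sample
filtered = fine.reshape(-1, 2).mean(axis=1)  # average pairs first

print("naive:   ", naive)     # solid white: the stripes alias away,
                              # and shifting phase by 1 px would flip
                              # the result to solid black
print("filtered:", filtered)  # uniform 0.5 grey, the correct average
```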

Can you see the difference between HD and 4K?

Just before we recorded this week's RedShark Live show, we did a brief, unscientific experiment on Dave Shapton's rather impressive 4K computer monitor.

We played back Phil Holland's highly impressive aerial footage, which was recorded at 12K, then downsampled to 8K for upload to YouTube. Of course the monitor we were watching on couldn't show 8K of resolution, but it could downsample the result to 4K for display.

We marvelled at how clean it was. It looked sharp on a level we were simply not used to. So we decided to switch it to HD mode, and I stepped well back to see if I could see a difference, since a lot of commenters on RSN claim you can't.

The result? Sorry detractors, but it was night and day, and I don't even have 20/20 vision! The HD downsample was noticeably less defined even from my viewing distance. A zebra crossing with tiny people in the distance lacked definition. By HD standards it was still an incredibly good picture; if I hadn't seen the 4K version, I would have said it was some of the best HD I had seen.

Dave switched it back to 8K (which was being displayed as 4K), and immediately it snapped back into amazing detail. It looked much more '3D'. There was an indefinable realism that simply wasn't present in the HD version. It also showed quite clearly that there's a lot to be said for shooting footage at 8K and downsampling it to 4K. And if the differences were noticeable on a £350 computer monitor, what will they be like on a dedicated high-end one?
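
For a rough numerical illustration of why downsampled footage looks so clean, here's a toy sketch of one mechanism at work: noise averaging. The frame is synthetic and scaled down, and this is only a sketch of the principle, not of what the monitor or YouTube actually does:

```python
import numpy as np

# Box-averaging 2x2 blocks (as in an 8K-to-4K downsample) halves the
# standard deviation of uncorrelated sensor noise. The frame here is
# synthetic and scaled down; the noise level is arbitrary.
rng = np.random.default_rng(0)
frame = 0.5 + rng.normal(0, 0.05, size=(432, 768))  # stand-in "8K"

half = frame.reshape(216, 2, 384, 2).mean(axis=(1, 3))  # stand-in "4K"

print(f"before: {frame.std():.4f}")  # ~0.050
print(f"after:  {half.std():.4f}")   # ~0.025
```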

I have a feeling that, just as with previous resolution jumps, by the time 4K and even 8K become normal, we will look back and wonder how we ever thought that HD, or even 4K, was enough. Building 8K equipment brings many other benefits too, such as higher frame rates at lower resolutions and the ability to crop into an 8K frame while still delivering 4K.

Do we need 8K? Yes, we do. If you don't agree, that's fine, but from a personal standpoint I'm more than happy to accept and embrace all the fringe benefits that come with it, even if others do not.

