The problem that 8K has to overcome: Motion Blur

3 minute read

It's an unfortunate situation to say the least, but the fact is that motion blur can completely wipe out the advantages of very high resolution video.

One of the benefits of moving images is that, you know, they move. When you think about it, we do have a rather sledgehammer approach to recording and playing back moving pictures: taking 24, 30 or even 60 pictures per second, sometimes for several hours in the case of a feature film.

With high resolutions like 4K, 6K and 8K, that also means a pretty extravagant use of data. An 8K frame is equivalent to a still image of roughly 33 megapixels.
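
The arithmetic is easy to check. A quick sketch, using the standard 1080p and 8K UHD frame sizes:

```python
# Pixel counts for full HD and 8K UHD frames.
hd_pixels = 1920 * 1080      # ~2.1 megapixels
uhd8k_pixels = 7680 * 4320   # ~33.2 megapixels

print(f"8K: {uhd8k_pixels / 1e6:.1f} MP, {uhd8k_pixels / hd_pixels:.0f}x the pixels of HD")
```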

So you'd think that with all that resolution, the format would be very good at reproducing motion, but, unfortunately, out of the box, it isn't.

Earlier this year I saw an impressive display of 8K television. You can't fail to be knocked out by the detail in an 8K video. It doesn't matter how close you sit to the screen. It's like real life in that the closer you get, the more detail you see.

I saw a recording of a football match. When the players were standing still, or walking, it looked amazing: absolutely pin-sharp. And then they started moving, and suddenly looked like ghosts.

I'm told the recording was made at 60 fps. In HD, that's normally regarded as pretty good for motion. In 8K, this was nowhere near enough.

We were shown the same match but this time recorded at 120 fps. It was better but nowhere near perfect. The figures moving on the pitch would, I suspect, have looked sharper at HD resolution.

So, what was going on here?

It's a very simple effect. The smaller the pixels (or the more of them on the screen) the more susceptible the image will be to motion blur.

You don't have to delve too deeply into theoretical physics to understand why.

For one pixel to blur into another, you have to move the camera through the width of a pixel during a frame. 8K has sixteen times the number of pixels that HD has, but only four times the linear resolution. That means it will resolve details that are four times smaller, vertically or horizontally.

Which means that you only have to move the camera one quarter the distance during a frame to cause motion blur. (Or, objects in a scene with a static camera only have to move a quarter of the distance during a frame).
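
To put rough numbers on it, here's a minimal sketch. The frame widths are the standard ones, but the pan speed is purely illustrative: assume a pan that crosses the full frame width in ten seconds, shot at 60 fps.

```python
# How many pixels does an illustrative pan smear across in a single frame?
PAN_TIME_S = 10.0   # time for the pan to cross the whole frame (assumed)
FPS = 60

for name, width_px in [("HD", 1920), ("4K", 3840), ("8K", 7680)]:
    pixels_per_second = width_px / PAN_TIME_S
    blur_px = pixels_per_second / FPS   # pixels crossed during one frame
    print(f"{name}: {blur_px:4.1f} px of smear per frame")
```

The same physical camera move smears across four times as many pixels at 8K as at HD, so fine detail that survives in HD is wiped out at 8K.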

The only ways round this are to shorten the frames (a higher frame rate) or to use a faster shutter speed. But faster shutter speeds look unnatural: at some point, you lose the continuous movement effect and start to see a rapid sequence of still images. And shorter frames mean you have to have more of them.
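
Exposure time follows directly from shutter angle and frame rate (exposure = angle/360 ÷ fps), which makes the two levers explicit. A small sketch:

```python
def exposure_ms(shutter_angle_deg: float, fps: float) -> float:
    """Exposure time in milliseconds: (angle / 360) / fps."""
    return (shutter_angle_deg / 360.0) / fps * 1000.0

print(exposure_ms(180, 60))    # ~8.33 ms: the classic 180-degree look at 60 fps
print(exposure_ms(90, 60))     # ~4.17 ms: half the blur, but choppier motion
print(exposure_ms(180, 300))   # ~1.67 ms: short exposures AND smooth motion, at a huge data cost
```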

I remember the BBC claiming that you actually need closer to 300 frames per second for very high quality motion. At the time I heard this (about five years ago) I thought it was a bit over the top, but having seen that 8K demonstration, it seems much more reasonable.

But at that rate, an 8K video stream would require 160 times the bandwidth of 30 fps HD (8K is sixteen times the data rate of HD, and 300 fps is ten times 30 fps; 16 x 10 = 160). So if HD at 30 fps needs approximately one gigabit per second, 300 fps 8K would need 160 gigabits per second: 20 gigabytes per second, or around 72 terabytes per hour.
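
Spelled out, the sums look like this (a sketch that simply scales the article's ~1 Gbit/s HD baseline):

```python
# Scale a ~1 Gbit/s HD-at-30fps baseline up to 8K at 300 fps (uncompressed).
HD_BASELINE_GBIT_S = 1.0
scale = 16 * (300 / 30)              # 16x the pixels, 10x the frame rate

gbit_s = HD_BASELINE_GBIT_S * scale  # gigabits per second
gbyte_s = gbit_s / 8                 # gigabytes per second
tbyte_h = gbyte_s * 3600 / 1000      # terabytes per hour

print(f"{gbit_s:.0f} Gbit/s = {gbyte_s:.0f} GB/s = {tbyte_h:.0f} TB/hour")
# -> 160 Gbit/s = 20 GB/s = 72 TB/hour
```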

Even with 50:1 compression, this doesn't sound like a format you could deliver to the home.

4K sits in the middle, between HD and 8K. With a small-ish shutter angle and 60 fps playback, 4K can look pretty good; motion blur is still an issue, but there is a worthwhile overall improvement over HD. The case for moving to 8K in sport is yet to be made: it can look fantastic, but at times it is no better than HD, at "sensible" (but already huge) data rates.

All of which makes me think that well before 8K is widely adopted by broadcasters, and certainly before it becomes mainstream in the home, some other video format will arrive to provide an alternative. If it does, it will probably involve some sort of vector encoding of objects that takes account of their movement (of course, most compression formats do this already, but by the time they come into play, the damage has already been done).

I believe the only time we will actually need 8K is when we have screens that are truly the size of walls, and when we have fully immersive virtual reality.

Until then, we'll have to manage with 4K, which I believe is easily good enough most of the time.

Tags: Technology
