
If your eyes had pixels, how many would they have?


Resolution is a very hot topic at the moment, so let's take a journey as Dimitris Stasinos establishes how this all relates to our vision and how we really perceive things.

If eyes could have pixels... Image: Shutterstock.

These are some thoughts that I would like to share with you after reading many articles & debates around the rapid development of camera technology. It is not my intention to make big statements or to criticise anyone, but only to trigger a constructive discussion on what we know so far about our perception of resolution & motion. I have left colour & dynamic range out of it, as those subjects deserve a different approach.

Perceived Spatial Resolution

This won’t be anything new, but let’s establish a theoretical base using a simple example and some first-grade maths. It is widely accepted that the resolution of the human eye for 20/20 vision is 1 arc minute. This is 1/60th of a degree and of course varies from person to person, but the maximum resolution that a human eye can resolve is 0.4 arc minutes, and this is a hard limit.

Let’s see what this means in the case of a computer monitor. I am sitting in front of a 27 inch monitor right now. In metric units, the width of this specific monitor is 59 cm and my viewing distance is almost 50 cm. My vision, while I am wearing my glasses, is close to 20/20, so what is the highest spatial resolution that my eyes can resolve?

Let’s do this to establish a reference. The circumference “C” of a circle is calculated using the formula C=2*π*r. Our radius is the viewing distance, which is 50 cm. So C=2*3.14*500mm=3140mm. A single arc minute of this circumference would equate to C/(360*60)=3140/21600≈0.15mm. So, if the minimum distance between 2 neighbouring pixels (actually the distance between their centres) must be at least 0.15mm, then my monitor must have at least 590mm/0.15mm pixels. That would be 3933 pixels, right? This is for the horizontal axis; for the vertical axis, which on my screen is 33 cm, that number would be 2200 pixels.
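If you want to sanity-check the arithmetic, here is a minimal Python sketch of the same calculation. The function name and the rounded inputs are mine, purely for illustration; with an unrounded π the counts come out slightly higher (around 4060 by 2270), but the conclusion is the same.

```python
import math

def required_pixels(screen_mm, distance_mm, acuity_arcmin=1.0):
    """Pixels needed along one screen dimension so that neighbouring
    pixel centres sit no closer together than the eye can resolve."""
    # One arc minute is 1/21600 of a full circle whose radius is the
    # viewing distance: pitch = acuity * (2 * pi * r) / (360 * 60)
    pitch_mm = acuity_arcmin * (2 * math.pi * distance_mm) / (360 * 60)
    return screen_mm / pitch_mm, pitch_mm

h_px, pitch = required_pixels(590, 500)   # 59 cm wide monitor, viewed from 50 cm
v_px, _ = required_pixels(330, 500)       # 33 cm tall
print(f"pitch ≈ {pitch:.2f} mm, {h_px:.0f} x {v_px:.0f} pixels")
```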

So this would be a screen resolution quite close to cinema 4K. My monitor is actually 5K, so even from a shorter viewing distance my eyes perceive every line that I am seeing as continuous. In the case of a bigger monitor the viewing distance would also increase, so the maximum perceived resolution would actually stay the same.

Would a 6K or 8K monitor make a difference for video?

Even if I had an 8K camera, I would have a hard time trying to spot differences between 4K & 8K content from my typical 50cm viewing distance. What if I get closer to my screen? I would not do that, as the optimal viewing angle is 50 to 60 degrees for moving images, so if I get closer to my screen I won’t be able to inspect the whole frame and my eyes will be strained after a short period of time.
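To put a number on that, a quick sketch of the geometry using my own figures (the function below is just an illustration, not anything standard): the 59 cm screen at 50 cm already subtends roughly 61 degrees, and leaning in only makes it wider.

```python
import math

def viewing_angle_deg(screen_width_mm, distance_mm):
    """Horizontal angle the screen width subtends at the viewer's eye."""
    return math.degrees(2 * math.atan(screen_width_mm / (2 * distance_mm)))

print(viewing_angle_deg(590, 500))  # ≈ 61 degrees, at the top of the 50-60 degree range
print(viewing_angle_deg(590, 350))  # ≈ 80 degrees if I lean in: too wide to take in at once
```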

Have you ever sat in the front seats of a cinema theatre? Even if you paid me to watch a movie like that, I wouldn’t do it (and this could be a separate subject for discussion). What if my visual acuity is optimal (0.4 arc minutes, or 6/2.4)? Well, identifying static optotypes is not the same thing as watching moving pictures. Of course our eyes can collect every bit of info in 1/24th of a second, but this doesn’t necessarily mean that our brain will process it all and give priority to details. Our brain’s priority is to find meaning in what we see, and only then does it start to collect additional data like colour information and fine detail. We must keep in mind that human perception has evolved through natural selection, and every bit of incoming information from our senses to our brain passes through priority-based filters and defensive mechanisms.

It is impossible for our brain to work like a detailed evaluation instrument while we are experiencing the complex and intense narrative techniques used in modern cinema. In other words, if my vision were optimal, then maybe I could perceive sub-arc-minute details, but only in a static shot of a very boring movie that did not manage to trigger a single emotion in my poor mind. I said that I won’t make big statements here, but I believe that pixel peeping is a bad habit for content creators. Even if you have this bad habit, you can’t pixel peep while watching a fine piece of art, or alternatively let’s say that a fine piece of art won’t let you pixel peep.

Of course anyone could disagree, but believe me, true or false, this point of view is truly liberating for a content creator, plus I am almost sure that this point will remain valid when we manage to understand our mechanisms of perception better in the years to come. But even if I am wrong here, an 8K monitor would cover my theoretical 0.4 arc minute optical needs, right? So could we consider 8K as a fair limit in screening applications (or 11K if you enjoy sitting in the front seats)? Maybe. But wait: the optotypes that optometrists use to measure visual acuity are black & white (crushed shadows & overexposed highlights, a colourist would say), so I bet you get the point…

Do we need multiK cameras?

This is different. I certainly don’t need an 8K camera at this time (I admit that I want one, though), but you may do, and many filmmakers out there might wish they did. An 8K camera would give you much more room in post to reframe & stabilise your footage or to program custom movements. There are also benefits in colour reproduction and signal to noise ratio when downscaling to lower resolutions, so of course I would like to have an 8K camera.
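The signal to noise point is easy to demonstrate with a toy sketch; this is not a real debayer or scaler, just a flat grey frame with made-up noise figures. Averaging four uncorrelated samples into one pixel when downscaling to half resolution cuts the noise standard deviation roughly in half, which is part of why a downscaled delivery from a higher-resolution source can look cleaner.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy example: a flat grey frame with additive, uncorrelated "sensor" noise.
frame = 0.5 + rng.normal(0.0, 0.02, size=(2160, 3840))

# Naive 2x2 box average down to half resolution: averaging four samples
# reduces uncorrelated noise by a factor of sqrt(4) = 2.
half = frame.reshape(1080, 2, 1920, 2).mean(axis=(1, 3))

print(round(frame.std(), 4))  # ≈ 0.02
print(round(half.std(), 4))   # ≈ 0.01
```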

But in my opinion those K’s are growing so fast that our ever-evolving editing workstations will never keep up with our post-production needs. A sensor can easily double its resolution within two years, but processing power won’t, and nor will codec efficiency. At least that is how things stand for now, and when I say “for now” I mean the next 2-3 years; I am really waiting for a technological breakthrough to change this. But Moore’s law doesn’t seem to apply any more, and binary computing is reaching a critical limit.

Of course big production studios can overcome these problems with cloud computing, but I can’t see myself and other “indie” producers like me editing 8K footage anytime soon. What about using proxies for editing 8K? Sooner or later in your workflow you will switch back to the original media for colour correction, so proxies will only make a difference in the editing process. But anyway, 8K is already here, it’s a great tool for specific high-end projects and everybody will undoubtedly use it sooner or later.

In the next part we will take a look at resolutions greater than 8K and how motion plays into all of this.
