
Where are the limits of development?

Will we even need eyes to view 'video' in the future?

(Replay) This article is a few years old now, but the questions it asks are still relevant. It's all about development and whether, in the not-too-distant future, we'll even need eyes to see our creations.

Eye graphic by www.shutterstock.com / RedShark News

We are all familiar with Moore's Law. The law (which many now deem 'irrelevant') came from an observation made in 1965 by Gordon Moore that the number of transistors per square inch on an integrated circuit had doubled every year since the integrated circuit was invented. The prediction was that this trend would continue for the foreseeable future.

Moore's Law has come to shape the way we expect technology to develop. The problem is that, eventually, you run into an inconvenient thing called physics. And with cameras, when it comes to gathering light and displaying it, physics really is a big limitation.

Current limits

The question is, when it comes to capture and display, where are the real limits? 4K is now pretty much normal. Well, I say that, but in reality many shooters I know may own a 4K camera but generally still shoot 1080. The fact remains that you would be hard-pressed to find a mainstream camera in the shops now that doesn't shoot 4K. And 8K is already here, too.

With stills cameras, manufacturers used to bombard us with how many megapixels their systems capture. By and large, that war is over. In some cases, such as the Sony A7S, the manufacturer will even reduce the resolution of the sensor to allow higher performance in other areas, in this case low-light capture.

Stills cameras generally need higher resolution than video because photographs can be put to any number of different uses in physical print, each requiring different dots per inch. At small sizes, say the average 8x6 family photo print, a 40 megapixel camera is simply overkill. That resolution is not required.
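To put rough numbers on that, here's a minimal sketch of the print arithmetic, assuming the common 300 dpi print standard (the figures are illustrative, not tied to any particular camera):

```python
# Print-size arithmetic: pixels needed = inches x dpi in each dimension.

def required_megapixels(width_in: float, height_in: float, dpi: int = 300) -> float:
    """Megapixels needed to print at a given size and dots per inch."""
    return (width_in * dpi) * (height_in * dpi) / 1_000_000

# An 8x6 inch family print at a typical 300 dpi:
print(required_megapixels(8, 6))   # ~4.3 MP -- a 40 MP sensor is ~9x more
```

At 300 dpi, an 8x6 print needs only around 4.3 megapixels, so a 40 megapixel sensor captures roughly nine times more than the print can ever show.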

Both the print example and the case of the Sony A7S show us two things. When it comes to certain display sizes and viewing distances, ever-increasing resolution offers ever-diminishing returns. And the A7S shows us that the laws of physics will always win out when it comes to gathering light: for a given sensor size, you either have higher resolution or better light-gathering ability, not both. Certainly, there are ways to help light sensitivity, for example Sony's practice of placing microlenses over the photosites, but there are still limits.
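The trade-off comes down to photosite area. Here's a back-of-the-envelope sketch, assuming a 35mm full-frame sensor and ignoring the circuitry between photosites (illustrative numbers, not manufacturer data):

```python
# Resolution vs light gathering on a fixed sensor area.
# Assumes a 35mm full-frame sensor (36 x 24 mm); ignores the
# gaps and circuitry between photosites -- illustrative only.

SENSOR_W_UM = 36.0 * 1000   # sensor width in microns
SENSOR_H_UM = 24.0 * 1000   # sensor height in microns

def photosite_pitch_um(megapixels: float) -> float:
    """Approximate photosite pitch (width) in microns for a given pixel count."""
    area_per_site = (SENSOR_W_UM * SENSOR_H_UM) / (megapixels * 1_000_000)
    return area_per_site ** 0.5

print(photosite_pitch_um(12))   # ~8.5 um -- roughly A7S territory
print(photosite_pitch_um(48))   # ~4.2 um -- each site catches ~1/4 the light
```

Quadruple the pixel count on the same sensor and each photosite collects roughly a quarter of the light, which is precisely the trade the A7S declines to make.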

Increasingly, all that extra resolution in cameras isn't valued so much for display on a single screen as for insurance: being able to crop into the sensor for use with different lenses with minimal loss of eventual quality, or to reframe in post production. These are pretty solid benefits, in all honesty. The idea of owning both a large-chip camera and an effective B4 news-gathering system in one body is pretty cool.

Paths forward

But for single-screen display, the extra resolution offers ever-diminishing returns. So what should we focus on instead? The first and foremost development is HDR video. This is an area that will grow and grow; I predict HDR will succeed the resolution march. But here, too, we run up against a pretty big limitation. And when I say big, it really is.

Much of the success, well, all of it really, depends on the maximum brightness that any given pixel can be made to produce. If we want to represent absolute reality on a screen, then it stands to reason that we would need to be able to display the maximum brightness that our eyes are capable of seeing, as well as the full colour gamut. Health and safety issues aside (watching an eclipse of the sun on such a device could be… 'interesting'!), could we ever produce a television that could display the same brightness as the midday sun?
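To give a sense of the gap, here's a rough comparison in nits (candelas per square metre). The sun's luminance is a standard physics figure; the display values are assumed ballpark numbers rather than product specifications:

```python
# Brightness gap between displays and the midday sun, in nits (cd/m^2).
# The sun's luminance is a standard physics figure; the display values
# are assumed ballpark numbers, not product specifications.

SUN_LUMINANCE_NITS = 1.6e9   # approximate luminance of the solar disc
HDR_TV_NITS = 2_000          # a bright consumer HDR TV, roughly
SDR_REFERENCE_NITS = 100     # traditional SDR reference white

print(f"Sun vs HDR TV:  {SUN_LUMINANCE_NITS / HDR_TV_NITS:,.0f}x")
print(f"Sun vs SDR ref: {SUN_LUMINANCE_NITS / SDR_REFERENCE_NITS:,.0f}x")
```

Even a bright HDR television sits some six orders of magnitude below the solar disc, so 'absolute reality' on a screen remains, for now, a thought experiment.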

Maybe we could with laser technology and, let's face it, the handful of laser TVs that have been produced have all rather impressed the people who have watched them, with brightness and colour depth far in excess of traditional LCD or LED displays. You can buy a laser TV now, at a price, but they haven't taken off in the way that OLED has. With my cynical head on, perhaps manufacturers want us to go through an OLED period first, so that laser TVs can be sold as the next big thing in a few years' time. But I digress. For the record, although laser TVs consume more power than OLEDs, they aren't far off and are certainly far better in this regard than LCDs.

The point I am ultimately driving at, through this long and winding road, is this: if we assume the limits of useful resolution have been reached, and if we all owned laser televisions that could display nearly all of the colours and brightness the human eye can see, where would we go from there?

A non-optical future?

Some have suggested a more 'analogue' way of capturing images, so that fixed physical resolution doesn't exist. Maybe your image is just one big equation of light that can be scaled accordingly. You might have gathered that I am not an engineer! However, even if we could store an image as infinitely scaleable metadata, would it give us any benefits if it was simply displayed on a flat screen?
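To make the 'one big equation' idea slightly more concrete, here's a toy sketch in which the picture is a continuous function of coordinates rather than a grid of pixels, so it can be sampled at any size. The function and names here are purely hypothetical illustration, not a real capture format:

```python
import math

# A toy 'image as an equation': brightness is a continuous function of
# normalised (x, y) coordinates, so there is no native pixel grid.
# The function itself is made up purely for illustration.

def image(x: float, y: float) -> float:
    """Brightness in [0, 1] at coordinates x, y in [0, 1]:
    a smooth radial ripple standing in for 'one big equation of light'."""
    return 0.5 + 0.5 * math.cos(10 * math.hypot(x - 0.5, y - 0.5))

def render(width: int, height: int) -> list[list[float]]:
    """Sample the same underlying function at any grid size you like."""
    return [[image(c / (width - 1), r / (height - 1)) for c in range(width)]
            for r in range(height)]

thumbnail = render(32, 18)        # a tiny preview...
full_hd   = render(1920, 1080)    # ...or Full HD, from the same description
```

The same underlying description renders just as happily as a thumbnail or at Full HD; but as noted above, the flat screen it finally lands on still has a fixed resolution of its own.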

VR, I feel, is a cumbersome experience. 3D is also a no-go, as consumers have told manufacturers on more than one occasion. The choice I think we will be left with is whether to stick with flat screens or to go the whole hog and produce real VR, by getting our brains to generate the visuals and the environment directly. It will be a debate fraught with pros and cons, not just from a preference point of view, but from a safety standpoint.

I do not believe we are many years away from having this debate in full, either. Science has already made a lot of progress in helping people with blindness, with tests of artificial 'eyes' that create an image in the test subjects' brains to effectively give them 'sight'.

As a current BBC documentary series is keen to point out, everything we see, everything, is created by our brains, effectively artificially. We only ever see what our individual brains want us to see. Your brain's interpretation of the world could be very different from mine, which is why some people experience conditions such as synaesthesia, where words produce colours, or where they can 'see' maths, with numbers floating in front of their eyes, and all manner of other weird and wonderful effects. It's also why we as humans can argue over what we see. We all assume that what our brains are giving us is reality, when in fact they are only giving us what they think is relevant for us to see.

That's why I could perhaps argue with someone over a colour, with one person saying it is blue, while another is convinced it is purple. Our colour interpretation is not calibrated! Humans would be a visual engineer's nightmare!

It was surprising, too, to hear how prisoners in Alcatraz who were put into solitary confinement for nearly a month at a time, deprived of all light and sound, saw their own world created by their brains. By that I do not mean that they imagined it with their eyes closed; they actually saw it and smelt it. The brain created its own artificial world. After all, everything you see is created by your brain. Your eyes are only a conduit for light for your brain to make sense of, much like the 'dumb' CMOS sensor at the front of your camera hands voltage information to the camera's processors to make sense of.

At such a point of artificial reality, we really will be at the limits. Where will the manufacturers go from there? After all, you cannot get much better than reality. In fact, such a system would allow us to experience a complete hyper-reality. The danger, of course, is that aside from potential psychological side effects, artificial reality created by our brains could be so interesting that we get bored of real reality! Remember that this is not just about visuals, but about stimulating all our senses artificially, from smell through to touch.

Or would we? We experience dreams all the time; we are used to seeing some very surreal things as if they were real. Maybe, just maybe, in such circumstances we will see a resurgence of the good old traditional flat screen. Perhaps we don't actually want to be inside the worlds created by storytellers and, in fact, prefer to see things at arm's length, in safety. Perhaps reaching the limits of what is possible will force us to evaluate what it is we really want from our storytellers.

