
Does computer processing stop images from being real?


Simon Wyndham's recent article about computational optics triggered a few responses from readers saying that computer processing of images stops them from being real.

It's a critically important question, but it has a fallacy at its heart: what philosophers call "begging the question". It means you're asking a question about something that has yet to be defined, and that won't be defined by the answer. Essentially, how can we talk about whether something is real when we don't know what we mean by "reality"?

Any recording process, whether it's a wax cylinder or a Digital Betacam recorder, is, in my view, almost a miracle. I still find it remarkable that we can record something as delicate, nuanced and transient as a soundwave or an intricate image on a physical device. It's easy to get blasé about this achievement, but it's grounding to remind ourselves of where all this started.

And where it all started is with evolution. Where did your ears and eyes come from? They came from an evolutionary process, beginning with an almost random arrangement of cells and biological material that, at some crucial point in history, gave a living organism an advantage over others without the same arrangement. Over time, the ability to hear or see became more acute. Today, while probably optimal for our own purposes, human senses are vastly exceeded by those of animals that have honed them into ultra-specialised instruments of survival.

Nevertheless, our perceptual abilities become almost infinitely more useful because we are, at least in some sense, and at least some of the time, intelligent. The combination of highly functioning senses, a physical body and the ability to think adds up to everything we are today. And our machines for recording sounds and images mimic our sense organs.

Modern recording techniques

Modern recording techniques for audio and video are superlatively detailed. In audio, the pinnacle so far is 32-bit floating-point recording, whose theoretical dynamic range of over 1,500 dB is so vast that you don't even have to set levels before pressing record. 8K video cameras have been available for several years, and there's even a 12K camera made by Blackmagic. It goes without saying that all of these devices record and reproduce incredible images, so it's tempting to say accurate images as well.
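To put a number on that claim: the figure follows directly from the limits of the IEEE 754 single-precision format. Here's a minimal Python sketch of the back-of-the-envelope calculation (the exact figure varies slightly depending on whether you count denormalised numbers):

```python
import numpy as np

info = np.finfo(np.float32)

# Dynamic range = ratio of the largest finite value to the smallest
# positive normal value, expressed in decibels.
float_dr_db = 20 * np.log10(info.max / info.tiny)
print(f"32-bit float: ~{float_dr_db:.0f} dB")   # prints ~1529 dB

# For comparison, conventional fixed-point integer formats:
for bits in (16, 24):
    print(f"{bits}-bit integer: ~{20 * np.log10(2 ** bits):.0f} dB")
```

Given that the loudest sound physically possible in Earth's atmosphere is around 194 dB SPL, a format with roughly 1,500 dB of range has headroom to spare, which is why level-setting becomes optional.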

So what tempted the RedShark readers to suggest computational optics will render images untruthful?

One of the arguments is that computational optics creates something "that wasn't there before". There's a sense in which that has to be true, because the resulting images look better than they did before processing. But if you make a picture better than before, what are you basing that additional information on? To answer that, you have to distinguish between processing that is based on the information in the original image and sheer randomness. Let's say an artificial intelligence program applies itself to a picture of my face, but instead of giving me ears, it gives me teapot handles. (If you've ever looked closely at my ears, you'd understand how that might happen.) That would fall squarely on the randomness side of the line, and nobody would call the result authentic.

But let's assume that the enhancement processing is merely very effective at sharpening and noise reduction. Traditional sharpening algorithms boost the higher frequencies (the sharper transitions) in an image. That's like turning up the tone control on an amplifier. And yet this approach, which is applied almost universally at some stage, doesn't draw the same criticisms of inauthenticity. Why is this?
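To be clear about what "traditional" sharpening actually does, here's a minimal sketch of classic unsharp masking in Python (the function names are mine, and it assumes NumPy and SciPy are available). Note that it manufactures nothing: it simply re-weights detail the image already contains.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def unsharp_mask(image, sigma=2.0, amount=1.0):
    """Classic sharpening: emphasise high frequencies.

    The Gaussian blur keeps only the low frequencies; subtracting it
    from the original isolates the high-frequency detail (the sharp
    transitions), which is then scaled and added back in.
    """
    img = image.astype(np.float64)
    blurred = gaussian_filter(img, sigma=sigma)
    detail = img - blurred                  # existing edges, nothing new
    sharpened = img + amount * detail
    return np.clip(sharpened, 0, 255).astype(np.uint8)

# A soft synthetic gradient, then a "crisper" version of the same data.
soft = gaussian_filter(np.tile(np.linspace(0, 255, 64), (64, 1)), sigma=4)
crisp = unsharp_mask(soft, sigma=2.0, amount=1.5)
```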

AI can do some amazing things, but does an upscale such as this make the image less "authentic"? Image: Shutterstock.

Assumptions about AI

Perhaps it's because of assumptions we make about artificial intelligence and AI-assisted processing, and it's understandable why people are reluctant to accept that AI can have a relatively neutral role in image processing. This tendency is exacerbated by the rash of new text-to-image apps that have appeared in recent months. The idea that you can ask an AI program to draw ridiculous, impossible pictures, like a giraffe on a skateboard made of truffles, is almost designed to instil scepticism in non-experts. I would add that very few people are actually experts at this early stage in the development of AI image processing.

Just because you can ask AI to draw impossible things, you shouldn't assume that AI can't draw possible things. Just because you can drive a car into a lamppost doesn't mean you have to drive a car into a lamppost.

Let's get back to basics. Every recording method is a cumulative process of quite massive abstractions. The wavering groove on a vinyl record is only a vague approximation of a violin's sound. A digital image of a beautiful flower has been through layer upon layer of processing: analogue capture, analogue-to-digital conversion, raw processing to cope with the Bayer filter pattern on the sensor, and all kinds of colour pipeline processing, just to approximate the original scene. Seen in that context, additional computational optics is a relatively minor part of the process, certainly in terms of whether it adds to or subtracts from the reality of the image.
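To make those layers concrete, here's a toy sketch of just two of them, mosaic capture and display gamma, assuming an RGGB Bayer layout and a simple power-law gamma. Real camera pipelines are vastly more elaborate, and the function names here are purely illustrative.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate the sensor: each photosite records only one of R, G or B
    (RGGB layout assumed), so two-thirds of the colour data is never
    captured in the first place."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def naive_demosaic(mosaic):
    """Rebuild full colour by pooling each 2x2 block -- a crude stand-in
    for the sophisticated interpolation real raw converters perform."""
    out = np.zeros((mosaic.shape[0] // 2, mosaic.shape[1] // 2, 3))
    out[..., 0] = mosaic[0::2, 0::2]                             # R
    out[..., 1] = (mosaic[0::2, 1::2] + mosaic[1::2, 0::2]) / 2  # G
    out[..., 2] = mosaic[1::2, 1::2]                             # B
    return out

def gamma_encode(linear, gamma=2.2):
    """Map scene-linear light to display values -- yet another layer
    between the original scene and the pixels you actually see."""
    return np.clip(linear, 0.0, 1.0) ** (1.0 / gamma)

scene = np.random.rand(8, 8, 3)                # a toy scene, in linear light
raw = bayer_mosaic(scene)                      # what the sensor records
displayed = gamma_encode(naive_demosaic(raw))  # an approximation of an approximation
```

Every stage here discards or reshapes information, long before any AI gets involved.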

The nature of reality

With a friendly warning that this conversation is about to get deeply philosophical, I want to talk about the nature of reality.

Most people get through their entire lives without ever considering the nature of reality. We barely possess the words with which to talk about it. When we talk about reality, we are often also talking about authenticity, which boils down to whether a given phenomenon is genuine or fake. Is that a genuine memory card you've just bought from an online retailer, or is it a counterfeit? That's just another way of asking, "Is it real?"

We live in our own everyday model of the world, and through social interaction everyone lives in a fairly similar one. We have to do that just to get by in life. When you go to a country with a radically different culture, that's a different model, which we have to learn and translate into our own terms, whether it's the language, customs or laws of that country. So we rarely have to see or think beyond our everyday model. It works exceptionally well for us. But it shields us from questions about the nature of reality itself. These are hard questions to answer because our linguistic faculties don't extend easily beyond our model of reality. When we do step beyond our empirical comfort zone, it gets tricky because there are no guidelines.

Here is an example: what colour is a red rose? The answer is that we can't take anything for granted here. In the real physical world there are no colours, only wavelengths of light. We don't see wavelengths and we don't measure frequencies; we perceive these phenomena. In a sense, colours are only metaphors. So it's tempting to argue that we only perceive reality "indirectly". This calls for an analogy.

Why do we need operating systems? Because we need a metaphor for the complex processes that take place at a deep level inside a computer. Even computer experts couldn't live day to day working at the level of the ones and zeros inside their machines. Take a hard disk, for example, where arrangements of magnetic patterns represent digital data, and that data can be anything from this year's accounts to a cat video. At a low level, disks are complicated: they move data around in ways that don't correspond to anything in our everyday experience, so we can't draw on that experience to understand how a hard disk works. And yet when we ask a hard disk to do something, storing a file, for example, that's simple enough. With modern operating systems, the process is as simple as dragging and dropping a file from one desktop icon to another. The operating system's user interface reflects our own internal model. Under the hood, there's far more going on than we ever need to know. But we should understand that those internal processes, usually hidden from us, bear no resemblance to our own concept of what is taking place.

I don't claim this as a completely original thought, but what I believe is that our own model of the world – what we see, hear, taste and smell every day – is analogous to the user interface of an operating system. The complex and obscure world of atoms, molecules, quantum physics and raw reality is invisible to us without our reality model acting as a friendly user interface. And if that is the case, then we are in no position to talk about whether or not images reflect reality, beyond holding a photograph up to the original scene for comparison.

I believe that computational optics does nothing but enhance the accuracy of images. This processing layer is just one among an indeterminate number of others.

These processes don't detract from the accuracy of an image. Instead, they make it possible to see it in the first place.

