We're only at the beginning of what computational photography will bring

Written by Simon Wyndham

NeuralCam

Computational photography is nothing new, but it usually gets associated with multiple cameras on one device. These apps show that computational photography can get an existing camera to perform well above its paper specifications.

If there is one advantage that a smartphone has over a dedicated single-purpose device like a camera, it's that it is a computer. A computer with a lens and sensor attached to it. Not only that, but it is a computer that is programmable by anyone who has the skills.

Existing smartphones use computational photography quite extensively. It's how they produce pictures with artificial bokeh and other effects, calculating depth from the information provided by the multiple cameras and sensors that adorn most modern high-end phones.

But you don't always need multiple cameras, lenses, or different sensor types to perform computational photography. All you really need is fast processing and some clever thinking. As processors get faster and more efficient, much more is possible in a shorter amount of time. And that's important when it comes to a couple of apps I have come across that perform some really quite amazing feats.

Hydra

I recently came across Hydra by chance, as I am always on the lookout for photographic apps that offer something new. Hydra grabbed my attention, firstly because what it promised seemed almost impossible, and secondly because the reviews seemed to suggest that there was some substance behind the hype. Yes, I know, online reviews can be fake. But at £4.99 on iOS I decided to give it a try anyway.

All of Hydra's modes offer something new. I'll break them down in just a moment, but in the order they appear on the interface there's HDR, Video-HDR, Lo-Light, Zoom, and Hi-Res.

On first inspection, if you haven't used the app, none of these sounds new. But I can assure you that they all do things the built-in camera app cannot dream of doing.

HDR and HDR video clearly do what they say on the tin. But unlike the built-in camera app, Hydra takes up to 20 images and then merges them together. You are then presented with the option of B&W, a medium HDR effect, or a strong one. Strong generally tends to look like the highly overdone HDR pictures the traditional effect has a reputation for. But this is way more advanced, with much more dynamic range than you'll ever get from the built-in app.
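Hydra's actual merging algorithm isn't public, but the general principle of multi-exposure HDR merging can be sketched in a few lines. This minimal numpy example (my own illustration, not Hydra's code) weights each pixel by how well exposed it is, so mid-tone pixels dominate while blown highlights and crushed shadows contribute less:

```python
import numpy as np

def merge_hdr(frames):
    """Merge a burst of differently exposed frames (values in 0..1)
    using a simple well-exposedness weight: pixels near mid-grey
    count more, blown or crushed pixels count less."""
    frames = np.asarray(frames, dtype=np.float64)
    # Gaussian weight centred on 0.5 (mid-grey)
    weights = np.exp(-((frames - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights += 1e-8  # avoid division by zero
    return (weights * frames).sum(axis=0) / weights.sum(axis=0)

# Simulated burst of the same flat scene at three exposures:
# underexposed, well exposed, and nearly blown out
burst = [np.full((2, 2), v) for v in (0.05, 0.5, 0.98)]
merged = merge_hdr(burst)
```

A real app would also align the frames and tone-map the result, but the core idea is the same: every pixel in the final image draws on whichever exposures captured it best.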

Lo-Light is, as the name suggests, a mode that allows you to take photographs in low light conditions. This mode takes up to 50 images and amplifies the light based upon them, but also performs some clever noise reduction, which I'll come to in just a moment.

Zoom lets you zoom the image up to 8x, while Hi-Res mode produces up to a 32MP picture, even though the built-in camera is only 12MP.


The whole high resolution shot


A 100% crop of the 32MP image


A 100% crop of an image taken with the standard camera app of the same scene.

So how on earth is it doing this? When I first downloaded the app my thoughts were similar to what you might be thinking right now: that it likely used some up-rezzing algorithm and that the details probably wouldn't be 'real'. I was in for a bit of a shock. Hydra really does do what it says, and it achieves it by being very clever indeed.

Each mode requires you to hold the camera as still as you possibly can. That's the big limitation: you can't photograph moving subjects with it. But those subtle movements you make while trying to handhold the device as steadily as you can are actually the secret sauce behind how Hydra works its magic.

It takes the minor differences between each shot and uses them to build up a picture containing the extra detail. If you zoom in on a car number plate, for example, with the 8x zoom, you will find that the letters in the final picture are perfectly sharp. There's no artificial detail here or smart up-converting. The app has simply built the extra detail by examining the differences between the many pictures it takes.
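This is the principle behind multi-frame super-resolution: each hand-shaken frame samples the scene at a slightly different sub-pixel offset, and with the offsets known, the samples can be placed back onto a finer grid. A toy 1-D sketch of my own (Hydra's real pipeline must estimate the shifts and handle rotation and noise, which this deliberately skips):

```python
import numpy as np

def superres_1d(fine_signal, n_frames=4):
    """Toy illustration: each 'frame' samples the same scene with a
    different known sub-pixel offset (the hand shake). Interleaving
    the shifted samples recovers a grid finer than any single frame."""
    frames = []
    for shift in range(n_frames):
        # each low-res frame sees every n_frames-th sample, offset by shift
        frames.append(fine_signal[shift::n_frames])
    # with known shifts, merging is just interleaving onto the fine grid
    merged = np.empty_like(fine_signal)
    for shift, frame in enumerate(frames):
        merged[shift::n_frames] = frame
    return frames, merged

scene = np.sin(np.linspace(0, 2 * np.pi, 64))
frames, merged = superres_1d(scene)
# each frame holds only 16 samples, yet together they recover all 64
```

In practice the shifts are irregular and must be estimated by aligning the frames against each other, which is where most of the processing time goes.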

It uses this same method to eliminate noise in the Lo-Light mode, meaning that any pictures you take in near darkness have next to no noise in them whatsoever. Lo-Light mode even takes advantage of the low light capabilities of the iPhone 11, so the combination should produce some pretty amazing results.

The time it takes for the app to capture, say, 50 pictures can be slow, but once it has taken them, the processing into the final file is blazingly fast. I wonder if there is a way the app could gain access to the iPhone's burst mode, as this would make taking a picture much more practical.


The HDR effect can be overdone, as seen here, but in the right circumstances the results can be very good.

NeuralCam

Hydra is, then, an impressive achievement. But it isn't alone in having such abilities. NeuralCam, for example, is a dedicated low light photography app that produces some absolutely unbelievable results. It is slower at processing the final image than Hydra, but the results are arguably better and much brighter. NeuralCam is certainly one of the first places you should look if you don't have the latest phone and want to take low light pictures.


This is my dog, Dolly, in one of the few moments she stays still, under a very dark table, taken with the standard camera app.


Let there be light! The same situation taken with NeuralCam. Now, I know I won't win any photography awards, but sometimes we just want to take snaps on the go.

NeuralCam has a great interface, and has the added ability to save pictures in HEIC format. It would be great if both apps could take at least TIFF photos for more detail, as the JPEG and HEIC compression can create results that are less than stellar. Would it be possible for such apps to take and merge DNG data? I'm not sure. I guess I can dream. There's also a timer in NeuralCam, so the app could be used on a tripod for the sharpest shots possible. Just like Hydra, NeuralCam really works best with static subjects. The difference is that while NeuralCam will also benefit from tripod use, Hydra actually needs you to be handholding the camera in order for it to perform its voodoo.

But what Hydra offers over NeuralCam is a number of different, and clever, photo modes in one place. If you have these two apps plus either Halide or the newly released Firstlight, you will be set for any eventuality when it comes to using your smartphone for photography. They also show the way forward. As processor technology gets faster, such apps will only get better and more capable. And they demonstrate very distinct ways in which stills cameras can perform feats that are not practical for video cameras to do. Yet.


HDR sunset taken with Hydra


4x zoom taken with Hydra


100% crop of 4x zoom
