
Project Indigo: Adobe's powerful new computational photography app

A long exposure image captured using Project Indigo

Last week Adobe Labs quietly unveiled Project Indigo, its impressive new (and free) computational photography app for iOS with some serious provenance.

Available for the iPhone 12 Pro/Pro Max, 13 Pro/Pro Max, and every iPhone from the 14 onwards, it's an app that Adobe Labs says should offer something interesting for professional and casual photographers alike.

Its headline feature is the introduction of a custom computational photography pipeline. "Instead of capturing a single photo, Indigo captures a burst of photos and combines them together to produce a high-quality photo with lower noise and higher dynamic range," runs the blurb.
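
To illustrate the basic idea behind that burst-and-merge approach (this is a minimal sketch of my own, not Adobe's pipeline, and the function names are hypothetical): averaging N already-aligned frames of the same scene cuts random sensor noise by roughly a factor of the square root of N, which is where the "lower noise" claim comes from. A real pipeline would also align frames, reject motion, and merge in raw space, none of which is shown here.

```python
# Illustrative sketch only: a naive burst merge by temporal averaging.
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average a burst of already-aligned frames.

    Averaging N frames reduces random sensor noise by roughly sqrt(N),
    which is the basic idea behind multi-frame noise reduction.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# Demo with synthetic noisy captures of the same scene
rng = np.random.default_rng(0)
scene = rng.uniform(0.0, 1.0, size=(64, 64))            # "true" image
burst = [scene + rng.normal(0.0, 0.05, scene.shape)     # noisy frames
         for _ in range(8)]

merged = merge_burst(burst)
print("single-frame noise:", np.abs(burst[0] - scene).std())
print("merged noise:      ", np.abs(merged - scene).std())
```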


The benefits of using HDR...

One of the app's developers is also among the godparents of computational photography on mobile devices: Marc Levoy, now an Adobe Fellow, was one of the people behind the impressive computational photography features in the early Google Pixel cameras.

Indigo uses both computational photography and AI to produce what Adobe says is a natural (SLR-like) look for photos, including "special (but gentle)" treatment of subjects and skies. This look is applied when generating JPEG images and embedded as a rendering suggestion in raw DNG files (if enabled).

Superior zoom with multi-frame super-resolution

Multi-frame super-resolution, meanwhile, promises to restore much of the quality lost when using pinch-to-zoom, and there are plenty of overrides allowing you to wrest control of focus, exposure time and ISO, exposure compensation, and white balance away from iOS. The Indigo filmstrip also lets you send an image directly to the Lightroom mobile app for editing.

And there are some technology previews too, such as removing reflections when shooting through plate glass windows that cover most of the camera field of view.

Levoy and Florian Kainz, principal scientist at Adobe, have written a detailed blog post about the ins and outs and whys and wherefores of smartphone photography. It's a decent read in itself, but it also talks about some of the future aspirations for the app.

"What's next for Project Indigo? An Android version for sure. We'd also like to add alternative "looks", maybe even personalized ones. We also plan to add a portrait mode, but with more control and higher image quality than existing camera apps, as well as panorama and video recording, including some cool computational video features we're cooking up in the lab."

There's also talk about supporting exposure bracketing, focus bracketing, and a number of other multi-frame modes where one or more camera parameters are varied between frames.

"Exposure bracketing in particular would help with astrophotography. Unique to Indigo, these bursts would be combined in-camera to produce photos with extreme dynamic range or depth of field (sometimes called all-in-focus), respectively."

There are also likely to be plenty more technology previews, so check it out while it's free (which might not be forever), though there do seem to be some bugs, as you'd probably expect with a Labs release. Oh, and while it will work on a range of iPhones, Adobe recommends an iPhone 15 Pro or newer. I tested it on my plain vanilla iPhone 15, hoping to get some impressive photos of Donegal beaches while on holiday last week, and thermal overload pretty much shut down my phone. Clearly it's doing *a lot* of work under the hood, and you can understand why the likes of Apple shy away from implementing such tech in their own base-spec software.
