
Night Sight on the Pixel 3 is a step change for mobile photography


Shoot photographs of scenes so dark that you can't see them clearly with your own eyes? It sounds almost like a super-power, but one which Google is sharing with users of its Pixel phones.

Night Sight is a new feature of the Pixel Camera app that lets you take sharp, clean photographs in near darkness. It works on the main and selfie cameras of all three generations of Pixel phones, needs just a single shutter press, and requires neither a tripod nor a flash.

How dark is dark? Well, imagine it's so dark you can't find your house keys on the floor.

Google rates this light level as 0.3 lux. Technically, lux is the amount of light arriving at a surface per unit area, measured in lumens per square metre. For scale: 30,000 lux is direct sunlight, 300 lux is typical office lighting, 50 lux is your average restaurant, 10 lux is the minimum for finding socks that match in your drawer, 3 lux is pavement lit by street lamps, and 0.1 lux is so dark you need a torch to find the bathroom.
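
As a back-of-envelope illustration of the unit (the bulb and desk figures below are hypothetical, chosen only to make the arithmetic obvious):

```python
def illuminance_lux(luminous_flux_lumens: float, area_m2: float) -> float:
    """Illuminance (lux) = luminous flux (lumens) / illuminated area (m^2)."""
    return luminous_flux_lumens / area_m2

# A hypothetical 800-lumen bulb spread evenly over a 4 m^2 desk:
print(illuminance_lux(800, 4))   # 200.0 lux -- ordinary indoor lighting,
                                 # nearly three orders of magnitude above 0.3 lux
```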

According to Google, smartphone cameras that take a single picture begin to struggle at 30 lux. Its boffins have deployed a barrage of software tweaks to improve this performance.


An example of the iPhone XS (left) vs the Pixel 3's Night Sight (right)

HDR+ with a machine learning twist

The technology builds on HDR+, a computational photography technique introduced a few years ago that captures a burst of frames, aligns them in software and merges them to improve dynamic range.

As it turns out, merging multiple pictures also reduces the impact of noise, improving the overall signal-to-noise ratio in dim lighting.
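
A toy sketch of why averaging helps, assuming the frames are already aligned (real HDR+ merging is far more sophisticated, working tile-by-tile and rejecting misaligned content):

```python
import numpy as np

def merge_burst(frames: list[np.ndarray]) -> np.ndarray:
    """Average an already-aligned burst of noisy frames.

    Averaging N frames leaves the signal unchanged but shrinks
    zero-mean noise by a factor of sqrt(N), improving SNR.
    """
    stack = np.stack([f.astype(np.float32) for f in frames])
    return stack.mean(axis=0)

# Simulate: a constant grey scene plus independent per-frame noise.
rng = np.random.default_rng(0)
clean = np.full((100, 100), 0.5, dtype=np.float32)
burst = [clean + rng.normal(0, 0.1, clean.shape).astype(np.float32)
         for _ in range(15)]

merged = merge_burst(burst)
print(f"single-frame noise: {np.std(burst[0] - clean):.3f}")   # ~0.100
print(f"merged noise:       {np.std(merged - clean):.3f}")     # ~0.026, i.e. 0.1/sqrt(15)
```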

To combat motion blur that Google's existing optical image stabilisation can't fix, the Pixels use 'motion metering', which measures recent scene motion and chooses an exposure time that further minimises blur.

All three phones use the technique in Night Sight mode, increasing per-frame exposure time up to 333ms if there isn't much motion. On the Pixel 3, the mode also uses Super Res Zoom (whether you zoom or not), which further reduces noise, since it averages multiple images together.
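
Google hasn't published the exact algorithm, but the idea can be sketched as picking the longest exposure whose expected blur stays within a budget; the linear blur model, the threshold and the function names below are illustrative assumptions, with only the 333ms cap taken from Google's description:

```python
MAX_EXPOSURE_MS = 333.0   # Night Sight's per-frame cap for still scenes
MAX_BLUR_PIXELS = 1.5     # assumed acceptable blur budget (illustrative)

def choose_exposure_ms(motion_px_per_ms: float) -> float:
    """Longest per-frame exposure whose expected blur stays within budget.

    Assumes blur grows linearly with exposure time -- a simplification;
    Google's actual motion metering is unpublished.
    """
    if motion_px_per_ms <= 0.0:
        return MAX_EXPOSURE_MS                    # static scene: use the cap
    return min(MAX_EXPOSURE_MS, MAX_BLUR_PIXELS / motion_px_per_ms)

print(choose_exposure_ms(0.0))    # 333.0 -- tripod-still scene
print(choose_exposure_ms(0.05))   # 30.0  -- motion forces shorter frames
```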

A machine-learning-based enhancement to auto white balancing has been trained to discriminate between a well-white-balanced image and a poorly balanced one. The upshot is that the colours of a scene shot in extreme low light should appear more neutral.
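
A minimal sketch of the discriminative idea: score candidate white-balance gains and keep the best-scoring result. The `score_white_balance` function below is a hypothetical stand-in for Google's trained model; this toy version just rewards grey-world neutrality:

```python
import numpy as np

def score_white_balance(img: np.ndarray) -> float:
    """Higher is better: penalise disagreement between channel means.

    Hypothetical stand-in for Google's trained discriminator.
    """
    channel_means = img.reshape(-1, 3).mean(axis=0)
    return -float(np.var(channel_means))

def pick_gains(img: np.ndarray, candidates: list) -> np.ndarray:
    """Return the per-channel gains whose corrected image scores best."""
    return max(candidates, key=lambda g: score_white_balance(img * g))

# A warm low-light colour cast: the blue channel is badly attenuated.
rng = np.random.default_rng(1)
scene = rng.random((64, 64, 3)).astype(np.float32) * np.array([1.0, 0.8, 0.5])

candidates = [np.array(g, dtype=np.float32)
              for g in ([1.0, 1.0, 1.0], [1.0, 1.25, 2.0], [0.9, 1.1, 1.6])]
print(pick_gains(scene, candidates))  # [1. 1.25 2.] -- neutralises the cast
```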

A related problem is that in very dim lighting humans stop seeing in colour, because the cone cells in our retinas stop functioning, leaving only the rod cells, which can't distinguish different wavelengths of light. Scenes are still colourful at night; we just can't see their colours. 

Google hasn't gone so far as to make night scenes look like 'day for night' (which, for most people who want a shot that actually looks like night, would be pointless), although it intimates it probably could. Instead, it employs some tone mapping tricks to lend colour to the night-time shot while ensuring the image still reminds you when the photo was captured.
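
One way to picture such a tone mapping trick (the curve below is a hypothetical illustration, not Google's published pipeline): brighten the mid-tones enough to reveal colour while pinning deep blacks so the frame still reads as night.

```python
import numpy as np

def night_tone_map(img: np.ndarray, gamma: float = 0.6,
                   black_point: float = 0.02) -> np.ndarray:
    """Brighten shadows with a gamma curve while pinning deep blacks."""
    lifted = np.power(np.clip(img, 0.0, 1.0), gamma)
    # Leave near-black pixels alone so the frame still reads as night
    # rather than turning 'day for night'.
    return np.where(img < black_point, img, lifted)

print(night_tone_map(np.array([0.01, 0.1, 0.5])))  # ~[0.01  0.251 0.66]
```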

Now, Night Sight can't operate in complete darkness, so scenes do need some light falling on them.

Also, while Night Sight has landed on the Pixel 2 and the original Pixel, Google says it works best on the Pixel 3, in part because its learning-based white balancer is trained for the Pixel 3 and so will be less accurate on older phones.


Pixel 3 Night Sight shot, taken with a tripod, by Alex Savu

How dark can Night Sight go?

Below 0.3 lux, autofocus begins to fail. If you can't find your keys on the floor, your smartphone can't focus either.

Below 0.3 lux you can still take amazing pictures with a smartphone, and even do astrophotography, but for that you'll need a tripod, manual focus, and a third-party or custom app written using Android's Camera2 API.

“Eventually one reaches a light level where read noise swamps the number of photons gathered by that pixel,” explains Google in a blog by imaging engineers Marc Levoy and Yael Pritch. “Super-noisy images are also hard to align reliably. Even if you could solve all these problems, the wind blows, the trees sway, and the stars and clouds move. Ultra-long exposure photography is hard.”
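
A back-of-envelope model of that 'swamping': per-pixel SNR falls as the photon count shrinks, until read noise dominates. The read-noise figure below is an illustrative assumption, not a Pixel spec.

```python
import math

def pixel_snr(photons: float, read_noise_e: float = 3.0) -> float:
    """SNR = signal / sqrt(shot noise^2 + read noise^2).

    Shot noise on a photon count N is sqrt(N); read_noise_e is the
    sensor's read noise in electrons (3.0 is an assumed figure).
    """
    return photons / math.sqrt(photons + read_noise_e ** 2)

for n in (10_000, 100, 10, 1):
    print(f"{n:>6} photons -> SNR {pixel_snr(n):6.2f}")
# 10000 photons -> SNR  99.96 (shot-noise limited)
#      1 photon -> SNR   0.32 (read noise swamps the signal)
```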
