
New smartphone image sensors from Apple and Sony promise 20 and 17 stops


We could be in for one of those occasional but impressive step changes in smartphone imaging quality thanks to new sensors from (maybe) Apple and (definitely) Sony.

We’ll reverse the order from the headline and start with Sony, as its new imaging sensor, the LYT-828, is actually shipping next month while the Apple sensor is still at the patent stage.

Sony’s 17-stop sensor


The LYT-828 is a new effective 50-megapixel CMOS image sensor that Sony says features a wide dynamic range of more than 100 dB, significantly suppressing blowout in bright areas while dealing effectively with noise and blackout in dark areas. And when you run that number through the usual 6 dB = one stop conversion, you come up with an impressive figure just shy of 17 stops.
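
For anyone who wants to check the arithmetic: one stop is a doubling of signal, or 20·log10(2) ≈ 6.02 dB, so stops are simply the quoted dynamic range in dB divided by that figure. Here's a minimal sketch in Python (ours, not Sony's or Apple's) running both numbers from this piece:

```python
import math

def db_to_stops(dynamic_range_db: float) -> float:
    """Convert a dynamic range quoted in decibels to photographic stops.

    One stop is a doubling of signal, i.e. 20 * log10(2) ~= 6.02 dB.
    """
    return dynamic_range_db / (20 * math.log10(2))

print(f"Sony LYT-828, 100 dB: {db_to_stops(100):.1f} stops")   # ~16.6 stops
print(f"Apple patent, 120 dB: {db_to_stops(120):.1f} stops")   # ~19.9 stops
```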

It achieves this by incorporating one of the latest HDR technologies, Hybrid Frame-HDR (HF-HDR), into its design. Sony describes this as a fusion of multiple HDR functions: single-frame HDR, which uses dual conversion gain (DCG) and is already included on conventional products; and multi-frame HDR, which merges short-exposure frames with the DCG data on the application processor further down the chain.
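
As a rough illustration of the multi-frame part of that recipe, here's a deliberately simplified merge of a short exposure into a near-saturated main frame. It's a generic HDR blend under assumed parameters (the clip threshold and exposure ratio are ours), not Sony's actual HF-HDR pipeline:

```python
import numpy as np

def merge_hdr(long_frame: np.ndarray, short_frame: np.ndarray,
              exposure_ratio: float, clip_level: float = 0.9) -> np.ndarray:
    """Blend a long (or high-gain) frame with a short exposure.

    Where the long frame nears saturation, fall back to the short frame
    scaled up by the exposure ratio, extending highlight headroom.
    Input values are assumed to be linear and normalised to [0, 1].
    """
    # Weight tends to 1 (use the short frame) as the long frame approaches clipping.
    weight = np.clip((long_frame - clip_level) / (1.0 - clip_level), 0.0, 1.0)
    recovered_highlights = short_frame * exposure_ratio
    return (1.0 - weight) * long_frame + weight * recovered_highlights

# Example: a short exposure 16x darker than the main frame adds ~4 stops of headroom.
long_f = np.array([0.2, 0.85, 1.0])
short_f = np.array([0.0125, 0.053, 0.4])
print(merge_hdr(long_f, short_f, exposure_ratio=16.0))
```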

The resulting 100 dB is the highest dynamic range of any of Sony Semiconductor Solutions' CMOS image sensors for mobile applications. Usefully, the HF-HDR technology maintains HDR functionality even while zooming, ensuring high-quality imaging in more use cases. Also well worth a mention is that the chip has been designed for low power consumption, mainly in the logic circuits, making it possible to deliver HDR functionality at all times without overheating the phone.

Apple’s 20-stop patent

The Apple iPhone 16 Pro camera system

Kudos to our industry colleagues at Y.M. Cinema for uncovering news that Apple had filed a new image sensor technology patent last month. The patent is titled Image Sensor With Stacked Pixels Having High Dynamic Range And Low Noise, and that title is pretty much the story. 

It achieves this by stacking a sensor die on a logic die. Each 3-transistor pixel comprises a sensing circuit on the sensor die, including a photodiode and a lateral overflow integration capacitor (LOFIC) circuit, to enable sensing over a wide range of lighting conditions, from indoors to bright sunlight, without automatic exposure control. In addition, each pixel comprises a pixel circuit on the logic die, which includes a current memory circuit. That current memory circuit senses the level of noise in the detector element in real time, so the chip can then nerf it using standard noise reduction techniques.
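
The patent describes analogue circuitry, but as a rough software analogy the per-pixel noise handling resembles correlated double sampling: measure the pixel's noise with a reference read, then subtract it from the exposure read. The toy model below is purely illustrative; its names and numbers are our assumptions, not Apple's design:

```python
import numpy as np

rng = np.random.default_rng(seed=1)

def read_pixel_with_noise_reference(signal: np.ndarray, noise_sigma: float) -> np.ndarray:
    """Toy model of per-pixel noise cancellation.

    A reference (reset) read captures the pixel's noise contribution, which is
    then subtracted from the actual exposure read -- conceptually similar to
    correlated double sampling, standing in here for the patent's per-pixel
    current memory circuit.
    """
    pixel_noise = rng.normal(0.0, noise_sigma, signal.shape)
    reset_read = pixel_noise                      # reference read: noise only
    exposure_read = signal + pixel_noise          # signal read: signal plus the same noise
    return exposure_read - reset_read             # noise cancels, signal remains

scene = np.array([0.01, 0.5, 5.0, 50.0])          # arbitrary linear light levels
print(read_pixel_with_noise_reference(scene, noise_sigma=0.2))
```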

The upshot is that a) it looks like Apple has developed a chip that can control each pixel individually depending on the amount of light hitting it, and b) that chip is capable of detecting light over a dynamic range of about 120 dB. Run the standard conversion over that and we get to the magic number of 20 stops.

Stop already!

2x zoom taken with DCG-HDR (left) and HF-HDR (right). Pic: Sony

It has to be said that Apple is really active in this field at the moment. A veritable blizzard of recent patent applications covers everything from liquid cooling systems (yes, really) to a camera and lighting system designed to intelligently adapt both image capture and scene illumination.

Camera tech was long a point of difference between smartphone generations that encouraged people to upgrade their handsets. Given that AI has so far failed to kickstart the same process, never mind engender the kind of replacement supercycle that 4G and the introduction of larger displays did, investing once more in something that makes an obvious, visible difference to users is no bad thing.

And 17 stops is a lot: up there with pro cinema cameras and significantly ahead of the current smartphone average of 10-13 stops. 20 stops is, of course, even more (shades of Spinal Tap, anyone? None more black!), though there are mutterings over how well the Apple tech will really work in the real world.

Of course, the big difference between the Apple and Sony chips is that the Apple one is vapourware until it gets put into production, while the Sony chip will likely appear in new phones before the end of the year, potentially including new models from Apple. Either way, though, we could be on the verge of a big improvement in smartphone image quality, and without the need for computationally heavy post-processing to make it happen.
