We're fairly used to the idea of a DSP, a Digital Signal Processor, which operates on a stream of samples much as an analogue filter circuit would, but which is capable of far more. Recently, the initials "ISP" have been used to refer to an Image Signal Processor, a class of device that includes Canon's DIGIC, the BIONZ processing system used in many Sony Alpha-series DSLRs, and Nikon's EXPEED. Many of these are combination devices, including both a conventional CPU and task-specific hardware aimed at image processing. The overall approach generally involves vector computing, in which the device applies the same mathematics to a large array of data simultaneously - ideal for audio and images.
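As a rough illustration of that vector approach, here's a minimal sketch using NumPy as a software stand-in for dedicated vector hardware; the black level and gain values are hypothetical, chosen only to show one expression touching every sample at once rather than looping pixel by pixel.

```python
import numpy as np

# A 12-bit sensor frame, modelled as a 2D array of samples.
frame = np.random.randint(0, 4096, size=(1080, 1920), dtype=np.uint16)

# Vector computing: the same arithmetic is applied to the whole
# array in one expression - subtract a black level, apply a gain,
# then clip back into the valid 12-bit range.
black_level, gain = 64, 1.5
corrected = np.clip((frame.astype(np.float32) - black_level) * gain, 0, 4095)
```

A real ISP does the equivalent in fixed-function or SIMD silicon, but the principle - one operation, many samples - is the same.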
This is the same technique used by a graphics processor in a computer, but at a much smaller scale, given the cost, power consumption and size restrictions of a camera or cellphone. The work of building viewable images from raw sensor data is not easy, involving several stages of normalisation and manipulation, including white balance, noise reduction and exposure processing. Taking the Bayer-patterned image from the sensor of a modern camera and turning it into a complete colour image is just part of the process, and it's a job with no ideal solution. Any approach involves matters of opinion and compromises between noise and detail, so there's both a technical and a commercial incentive to chase improvements.
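To make the demosaicing problem concrete, here's a deliberately crude sketch - a half-resolution reconstruction of an RGGB Bayer mosaic, with the function name and layout assumed for illustration. Real ISPs interpolate missing colour values at full resolution and make far subtler noise-versus-detail trade-offs, which is exactly where the matters of opinion come in.

```python
import numpy as np

def demosaic_halfres(raw):
    """Half-resolution demosaic of an RGGB Bayer mosaic.

    Each 2x2 tile (R G / G B) becomes one RGB output pixel,
    averaging the two green samples. This throws away half the
    resolution - the simplest possible trade-off.
    """
    r = raw[0::2, 0::2].astype(np.float32)
    g = (raw[0::2, 1::2].astype(np.float32) + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2].astype(np.float32)
    return np.stack([r, g, b], axis=-1)

raw = np.random.randint(0, 4096, size=(4, 4), dtype=np.uint16)
rgb = demosaic_halfres(raw)  # shape (2, 2, 3)
```

Every production ISP replaces this averaging step with proprietary interpolation, which is part of what names like DIGIC and EXPEED are really selling.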
Many of the systems designed for this sort of work have used CPU designs produced by ARM - particularly DIGIC, and EXPEED since its third iteration in 2011 (BIONZ originally used a design based on the MIPS R3000, but the overall approach is comparable). Bearing in mind that ARM licenses designs rather than selling manufactured microchips of its own, the company has already sold licenses for hundreds of millions of its Mali GPUs, and in May 2016 it bought machine vision specialist Apical, presumably in pursuit of exactly this sort of technology. Apical is now referred to as ARM's imaging and vision group, and the new Mali-C71, announced last month, is described by the company as an Image Signal Processor.
The ‘C’ stands for camera.
Most of the publicly available information about the products arising from the Apical acquisition, chiefly the Mali-C71 camera processor, concentrates quite heavily on automotive applications – self-driving cars, in other words. This is a difficult area for cameras, since the device must deal with real-world lighting which, as we're all aware, can be difficult to photograph even when working with a full crew and a lot of equipment. Making life-or-death decisions about the behaviour of an autonomous car based on images from a completely automated camera system is something we'd want to be careful about.
What relevance this new work has to cameras for film and television remains to be seen, but many of the images we acquire right now are already being handled by image signal processors built around an ARM core (as opposed to an actual ARM ISP). These are generally designed and promoted by manufacturers as proprietary solutions to the image processing problem, hence the very existence of names such as DIGIC, EXPEED and BIONZ. In the case of higher-end dedicated imaging devices such as DSLRs and video cameras, there is a question as to how willing a manufacturer might be to give up what's arguably a level of mystique around its special-sauce image processing.
As such, the principal beneficiaries of all this are perhaps most likely to be lower-end devices – cellphones and cheaper DSLRs – or multi-camera applications such as cars. The ability to buy better image signal processing devices off the shelf (or at least to license a design like ARM's and have a third-party semiconductor foundry manufacture it) is likely to lead to better images in those sorts of devices.
PCB image: shutterstock.com