There's no escaping the laws of physics with sensor design

Replay: We'd all like a 16K camera with a super-35mm sensor that has 18 stops of noise-free dynamic range, shoots 2000 frames a second and can see enough colour to overflow Rec. 2020. But first we have to deal with the laws of physics.

Such a camera might just about be buildable in the year 2050 at the current rate of progress. The eternal quest to get there as soon as possible is what makes progress happen at all, and in some ways it will almost be a shame if we ever reach a post-scarcity world in which we have everything we want. The thrill of the chase is what keeps engineers interested.

Until we achieve that future nirvana, we're going to have to deal with some engineering compromises. Exactly because of that – and here's the controversial part of this – there's a level on which most cameras, at any point in time, will have something like the same performance. How can we possibly say that when some of them have four or 16 times the pixels, some shoot double the frame rate or are two stops faster than others, and some have a measurably wider colour gamut?

Reality

Well, the reality is that if we just went and bought the sort of sensors available off the shelf from electronics suppliers worldwide, they'd all have astonishingly similar performance once we account for the size of their pixels. Most people are happy with the idea that bigger pixels mean more sensitivity, and it's fairly clear that fewer big pixels will fit on a sensor of any given size, so resolution and pixel size are a direct tradeoff. Bigger pixels also lead to less noise, since a larger signal requires less amplification to achieve a given ISO rating.
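
To put some rough numbers on that tradeoff, here's a quick sketch in Python. The sensor dimensions and the simple light-equals-area model are illustrative assumptions; real sensors also differ in fill factor, quantum efficiency and readout design.

```python
import math

def pixel_pitch_um(sensor_width_mm, sensor_height_mm, megapixels):
    """Approximate pixel pitch for a given sensor size and resolution,
    assuming square pixels and no gaps between photosites."""
    area_mm2 = sensor_width_mm * sensor_height_mm
    pitch_mm = math.sqrt(area_mm2 / (megapixels * 1e6))
    return pitch_mm * 1000  # micrometres

# A nominal super-35mm frame, roughly 24.9 x 14.0 mm (illustrative figures).
reference = pixel_pitch_um(24.9, 14.0, 48)
for mp in (8, 12, 24, 48):
    pitch = pixel_pitch_um(24.9, 14.0, mp)
    # Light gathered per pixel scales with its area, so the advantage in
    # stops relative to the 48-megapixel case is log2 of the area ratio.
    stops = math.log2((pitch / reference) ** 2)
    print(f"{mp:>2} MP: pitch ~ {pitch:.2f} um, {stops:+.1f} stops vs 48 MP")
```

Quadrupling the pixel count halves the pitch and quarters the light each photosite collects: a two-stop penalty before any other cleverness is applied.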

This gets complicated because things we habitually think of as separate are really the same thing seen from different angles. That bigger signal might equally be taken to mean that the less-noisy shadows are more viewable, increasing effective dynamic range. Or, to look at it yet another way, the camera is more sensitive for the same amount of noise. In the end, sensitivity, noise performance and dynamic range are all controlled by essentially the same design concerns.
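
A back-of-the-envelope version of that relationship: treat dynamic range as the ratio between the largest signal a photosite can hold (its full-well capacity) and the noise floor. The electron counts below are invented for illustration, but the shape of the calculation is the standard one.

```python
import math

def dynamic_range_stops(full_well_electrons, read_noise_electrons):
    """Dynamic range as the ratio of the largest recordable signal to
    the noise floor, expressed in stops (powers of two)."""
    return math.log2(full_well_electrons / read_noise_electrons)

# Hypothetical figures: the bigger photosite holds more electrons, so the
# same read noise costs it proportionally less.
print(dynamic_range_stops(30000, 3))  # large pixel: about 13.3 stops
print(dynamic_range_stops(7500, 3))   # quarter the area: about 11.3 stops
```

The same full-well figure also determines how much amplification a given ISO rating demands, which is why sensitivity, noise and dynamic range refuse to move independently.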

Assuming all chunks of specially-processed silicon have the same basic reaction to light, a sensor designer can optimise for sensitivity, resolution and other factors more or less at will; sensor design is, at some level, a zero-sum game. Sony has leveraged this very knowingly in the A7 series stills cameras. The A7s has a comparatively modest 12-megapixel resolution but spectacular low-light performance. Its higher-resolution siblings are sharper but not quite so good at seeing in the dark, just as we'd expect.

However, the sensor in an A7s is (probably) not something that can be freely ordered from an electronics supplier. Sensors in leading stills cameras, like those in the best-known cinema cameras, represent the absolute cutting edge of imaging technology. It's worth reinforcing that point: image sensor technology is used in a lot of fields, from astronomy to bioscience, and while some of them demand exacting specifications, arguably the most demanding combination of characteristics comes from cinematography.


The Panasonic S1H puts 6K video recording into a small form factor and shows that sensor design is improving all the time.

Small differences

Companies working in the field therefore put their most advanced technologies into sensors for film and TV work, knowing that the specifications of the resulting cameras will be minutely examined in a way that rarely applies to the sensors in supermarket barcode scanners. Does it make any difference? Well, it's hard to say. If the people who make Red's sensors built a sensor of exactly the same size, resolution and colour performance as the people who make Panasonic's sensors, it might be possible to do some sort of side-by-side test. Since it's rare for two manufacturers to ship cameras with precisely the same specs, that's not usually possible. But if it happened, we could evaluate how much of the difference between manufacturers is down to secret-sauce manufacturing techniques and how much is simply down to choices – that compromise we mentioned earlier.

One day, some comedian will come up with a bit of mathematics to score sensor performance on some sort of scale. We could take the natural logarithm of megapixels, multiply by maximum frame rate, and somehow incorporate ISO sensitivity and colour gamut, but it might not mean all that much, given that things like ISO (and, because they're connected, a lot of other things) are to some extent a matter of opinion. Taking a broad view of the whole market, it might be reasonable to suppose that, at any given time, there's a stop or a stop and a half of real performance difference across the available sensor technology, depending on the release dates of the various cameras.
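
For what it's worth, that tongue-in-cheek score might look something like the sketch below. Every term and weighting in it is made up, which is rather the point.

```python
import math

def sensor_score(megapixels, max_fps, iso, gamut_coverage):
    """A deliberately arbitrary single-number 'performance' score: log of
    megapixels times frame rate, nudged by ISO and gamut coverage. None of
    these weightings mean anything - that's the joke."""
    return math.log(megapixels) * max_fps * math.log2(iso) * gamut_coverage

# Two imaginary cameras making different compromises.
print(sensor_score(megapixels=12, max_fps=120, iso=12800, gamut_coverage=0.8))
print(sensor_score(megapixels=48, max_fps=60, iso=6400, gamut_coverage=0.9))
```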

That's not much, and the landscape constantly changes as new cameras are released, but it is evidence that progress is still occurring. A lot of technologies contribute to making current cameras as good as they are. Putting a microlens array on the front of a sensor lets each photosite capture light from a wider area – light that would otherwise be lost in the gaps between photosites. Dual-gain readout (various manufacturers call it different things) allows the image data to be read twice at different amplification settings and then unified by downstream electronics, as sketched below. Back-side illuminated sensors, where the light hits the most sensitive part of the silicon first, are a reasonably recent innovation.
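
Here's a rough sketch of how that dual-gain merge might work; the gain ratio, clip level and test data are all invented, and real implementations blend the two readouts far more carefully.

```python
import numpy as np

def combine_dual_gain(high_gain, low_gain, gain_ratio=16.0, clip_level=0.95):
    """Merge two readouts of the same exposure (values normalised 0..1):
    keep the clean, amplified high-gain signal in the shadows, and switch
    to the low-gain signal, rescaled onto the same scale, wherever the
    high-gain path has clipped."""
    rescaled_low = low_gain * gain_ratio
    return np.where(high_gain < clip_level, high_gain, rescaled_low)

# Invented example: a brightness ramp that clips the high-gain path.
scene = np.linspace(0.0, 4.0, 8)      # scene luminance, arbitrary units
low = scene / 16.0                    # low-gain readout, nothing clips
high = np.clip(low * 16.0, 0.0, 1.0)  # high-gain readout clips at 1.0
print(combine_dual_gain(high, low))   # the full ramp is recovered
```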

Future sensors might make the light-sensitive layer on an entirely different fabrication process from the processing electronics, optimising each for its own job rather than compromising between the two on a single die, as happens now. Sticking those two layers together is the challenge, and that's being looked into. Changing the coloured dyes on a Bayer filter array might let us trade sensitivity against colour performance, as the sketch below suggests. The list goes on.
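
That dye tradeoff can be made concrete with a little linear algebra: broader, more overlapping filter passbands collect more light, but unscrambling them back into pure colours needs a stronger correction matrix, and that matrix amplifies noise. The mixing figures below are invented purely to show the effect.

```python
import numpy as np

# Each row describes what one of the R, G and B channels actually picks up.
narrow_dyes = np.array([[1.0, 0.1, 0.0],
                        [0.1, 1.0, 0.1],
                        [0.0, 0.1, 1.0]])  # good separation, less light
broad_dyes = np.array([[1.0, 0.5, 0.1],
                       [0.5, 1.0, 0.5],
                       [0.1, 0.5, 1.0]])   # more light, heavy overlap

for name, mix in (("narrow", narrow_dyes), ("broad", broad_dyes)):
    correction = np.linalg.inv(mix)  # matrix that recovers pure R, G and B
    # Per-channel noise gain: how much the correction amplifies
    # uncorrelated sensor noise (root sum of squares of each row).
    noise_gain = np.sqrt((correction ** 2).sum(axis=1))
    print(name, np.round(noise_gain, 2))
```

The broader the overlap, the bigger the off-diagonal terms in the correction matrix, and the more the corrected image pays for its extra sensitivity in noise.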

Some of those engineering changes make a lot of difference and some are subtle. Some companies own patents on particular approaches and prefer them for financial reasons, which is part of why these subtle differences exist. As long as there's a piece of technology in the world, someone will want to make and sell a better one, but in the end, they'll all be subject to the same laws of physics.
