
Pixels Schmixels


Pixellated rose. Image: David Shapton/RedShark

Phil Rhodes injects some reality into the current race for megapixels

It's been some time now since the digital stills fraternity stopped worrying about megapixels, at least to the extent that the Canon EOS 5D Mark III was designed with only trivially greater resolution than its predecessor.

I certainly read one comment that the 18-megapixel sensor in the 7D, given its smaller APS-C dimensions and higher pixel density, had a greater appetite for resolution than most Canon lenses could satisfy.

Given that, and the fact that casual 35mm film photographers had for years been getting no more than a couple of thousand horizontal pixels out of high-street minilab machines, an eighteen megapixel image with (by even the sternest analysis) 3-4K of real horizontal pixel resolution seems positively generous.
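To put rough numbers on that, here's a minimal back-of-envelope sketch in Python. It assumes the standard 3:2 stills aspect ratio, and the 0.7 loss factor for lens and demosaic losses is purely an illustrative assumption, not a measurement.

```python
# Back-of-envelope: horizontal photosite count of an 18-megapixel stills frame,
# and a rough guess at its "real" resolution. The 3:2 aspect ratio is standard
# for Canon stills; the 0.7 loss factor (lens plus demosaic) is an assumption
# for illustration, not a measured figure.

def horizontal_photosites(megapixels: float, aspect_w: int = 3, aspect_h: int = 2) -> float:
    """Photosites across the frame width for a given total count and aspect ratio."""
    total = megapixels * 1e6
    height = (total * aspect_h / aspect_w) ** 0.5   # width * height = total, width / height = 3/2
    return height * aspect_w / aspect_h

photosites = horizontal_photosites(18)   # roughly 5200 across an 18 MP frame
usable = photosites * 0.7                # assumed combined lens/demosaic loss
print(f"~{photosites:.0f} photosites wide, perhaps {usable / 1000:.1f}K of usable horizontal resolution")
```

That squares with the 3-4K figure above, and is still far more than a minilab scan ever delivered.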

So, unless you want to start comparing medium-format stills cameras, the battle for numbers seems won, notwithstanding endless subjective arguments about highlight handling and colorimetry. This happened in the stills world long before it even began to happen to movies for the simple and obvious reason that the technology is easier. Even the highest speed stills cameras, shooting bursts at perhaps eight frames per second, demand lower average data rates than a movie camera, and the difficulties faced by the first video DSLRs are evidence that the stills-to-video transition is not without challenges. After all, if the 5D Mark II had been capable of shooting its raw stills image at 24fps, it would immediately have become by far the finest video camera in the world, and probably still would be.
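To see roughly why, here's a sketch of the arithmetic, assuming the 5D Mark II's roughly 21-megapixel sensor and 14 bits per photosite of uncompressed raw; real raw files are losslessly compressed and smaller, so treat these as illustrative upper bounds. A stills burst also only has to be sustained until the buffer fills, whereas video has to keep it up indefinitely.

```python
# Rough data-rate comparison: a stills burst versus hypothetical 24fps raw video.
# Assumes ~21 megapixels at 14 bits per photosite, uncompressed; real raw files
# are losslessly compressed, so these are illustrative upper bounds, not measurements.

MEGAPIXELS = 21.0
BITS_PER_PHOTOSITE = 14

frame_bytes = MEGAPIXELS * 1e6 * BITS_PER_PHOTOSITE / 8   # ~37 MB per uncompressed frame

for label, fps in [("stills burst (8 fps)", 8), ("raw video (24 fps)", 24)]:
    print(f"{label}: ~{frame_bytes * fps / 1e6:.0f} MB/s")
```

Three times the frame rate, sustained for minutes rather than seconds, is a very different engineering problem.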

Have we got enough pixels?

Recently, though, one might suspect that things are beginning to change. At the low end, every cellphone now has a camera that's capable of both stills and video at a standard that would amaze a video camera engineer from as recently as the 1980s. That's trivial, though; cellphones are not filmmaking tools, and no matter how easy it becomes to create the merely workable, building something truly excellent has so far remained difficult. What's significant is that, as the low end improves, the high end has improved as well, to the point where Sony can now ship a camera with an 8K sensor that's very suitable for high-end filmmaking, at a price so competitive that its announcement provoked a collective intake of breath.

All of this is fairly obvious at this point; it's just historical fact, but it does suggest that the same point of technological convenience is perhaps being reached for video as it was for stills. Almost certainly because CMOS semiconductor manufacturing techniques make it possible to build smart sensors with a lot of onboard electronics, with all the engineering conveniences that implies, almost all recent cameras have sensors with resolutions greater than HD. Of course, it's possible to argue about Bayer-pattern sensors and their effective resolution, but as a measure of manufacturing prowess, it seems to be getting very rapidly easier to make high resolution image sensors.
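For what that argument is worth, here's a small sketch of the usual Bayer caveat, using the commonly quoted rule of thumb that a Bayer sensor delivers something like 0.7 to 0.8 of its photosite count as real resolution; the exact factor is debated and depends on the demosaic algorithm and the optical low-pass filter.

```python
# The usual Bayer-pattern caveat: photosite count overstates delivered resolution.
# The 0.7-0.8 efficiency range below is a commonly quoted rule of thumb, not a
# constant; it varies with the demosaic algorithm and the optical low-pass filter.

SENSOR_WIDTHS = {
    "HD-wide sensor": 1920,
    "4K-wide sensor": 4096,
    "8K-wide sensor": 8192,
}

for name, photosites in SENSOR_WIDTHS.items():
    low, high = 0.7 * photosites, 0.8 * photosites
    print(f"{name}: ~{low:.0f}-{high:.0f} pixels of effective horizontal resolution")
```

Which is part of why an 8K-class Bayer sensor downsampled to 4K can deliver something that genuinely deserves to be called 4K.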


Remember film?

This is great, but the only way to evaluate usefulness as a working tool is to compare what's being built around these sensors with something else, and that's difficult. The gold standard used to be film, but the F65's 8K sensor, downsampled to 4K, simply outresolves 35mm in the general case, once you take projection losses, both digital and film, into account. It probably outresolves most negative above a few hundred ASA, given the losses inherent to lenses and grain. So should more conventional, affordable 4K devices such as the Canon C500, and even the more recent, less noisy Red cameras. Even straight HD cameras are, with digital projection, capable of putting more resolution on screen than the 35mm photochemical finishing process, which has frequently been described and measured at 1.5 to 2K on a good day.

If you want resolution, then, you can have it. In three to five years, it should be as little of an issue as it is for stills cameras now.

It was never about resolution anyway

There are endless arguments about formats and codecs, lenses and sensors, but in many ways it was never about resolution anyway – in the past we didn't have the tools to measure it as trivially as we can now, simply by stating a number. Cold Mountain was shot Super 35 on 500-speed film stock, with a digital intermediate (on Northlight scanners), and as such would be comfortably outresolved by the sort of cameras people buy for a couple of days' wages and take backpacking in 2012. John Seale's cinematography was nominated for the Academy Award. Audiences weren't counting pixels.

Of course, I don't intend to make excuses. Some productions will demand more resolution than Cold Mountain. It's not that we don't care. The astonishing longevity of 35mm film – perhaps showing the first signs of a definite end at the time of writing – is a tribute not only to the greatness of the technology, but to the determination of cinematographers to hold the beancounters back (it is also, pragmatically, down to the fear of production companies that audiences would reject an even slightly feeble product). But the point is that we've had video cameras – such as the Sony F900 – capable of approaching the resolution of 35mm film for more than ten years, and only now, once the replacements are really good, is the industry genuinely beginning to abandon the older technology. Given how this sort of thing has gone in the past, with VHS winning out over the rather superior Betamax, we should be happy about that.

But none of that is about resolution, so please, in a world where any production above a near-zero budget level can afford really, really excellent camera gear, can we stop pixel-peeping?

Tags: Technology
