
Does it really matter what camera you use any more?

A glimpse of the post-scarcity future?

Replay: Phil Rhodes argues that we're almost at the post-scarcity point with cameras where the equipment is so uniformly excellent that, if the end results don't look good, it's down to the user.

The inhabitants of the Star Trek universe, Iain M Banks’ Culture novels and several other seemingly utopian futures enjoy what philosophers and economists have defined as a post-scarcity society, meaning that the limitations under which we live today – energy, food, materials – have become trivial. Everything is available to everyone. Standards of living are uniformly high and people aren't necessarily required to work (although most such utopian futures posit that people are generally expected to make themselves useful somehow).

What’s interesting is that this sort of situation can reasonably be applied to almost anything that's subject to technological progress. Audio acquisition, for instance, more or less hit that particular point some years ago, and it's long since been easy to record sound that's essentially indistinguishable from the real thing. 24-bit, 192kHz recordings have sufficient fidelity to remain completely transparent even through significant post production processing. Post-scarcity requires not merely that this is possible, however; it requires that it's easy, and as regards audio, it is. Skill aside, the required specs can now be met by a recording device that's built into the microphone. It's cheap. It's easy. It's become, to all intents and purposes, trivial. With some qualifications about access and availability, that's post-scarcity technology.
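To put rough numbers on those specs (a back-of-the-envelope sketch; the figures for human hearing are commonly quoted values, not from this article):

```python
import math

# Back-of-the-envelope figures for 24-bit, 192kHz audio.
bit_depth = 24
sample_rate_hz = 192_000

# Theoretical dynamic range of an n-bit quantiser: about 6.02 dB per bit.
dynamic_range_db = 20 * math.log10(2 ** bit_depth)

# Nyquist limit: the highest frequency the sample rate can represent.
nyquist_khz = sample_rate_hz / 2 / 1000

print(f"Dynamic range: {dynamic_range_db:.1f} dB")  # ~144 dB, vs roughly 120 dB for human hearing
print(f"Nyquist limit: {nyquist_khz:.0f} kHz")      # 96 kHz, vs a ~20 kHz upper limit for the ear
```

Both figures comfortably exceed what the ear can perceive, which is what leaves headroom for heavy processing without audible degradation.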

Post-scarcity cameras

It seems dangerously possible that camera equipment will go the same way at some point. We're visibly approaching it even now. At conventional screen sizes, more than 4K of resolution is widely held to be more or less a waste of time, for instance, and trivially inexpensive cameras now offer beyond-HD resolutions. The goals are different, too: mainstream filmmaking doesn't have the same aims as audio. The goal for audio was transparency, invisibility. The goal for conventional movies isn't that; it's a sort of processed hyper-reality that was established by technological limitation and has since become popular. This is arguably why the high-frame-rate presentation of The Hobbit was so widely disliked: it's inarguably more realistic, but it isn't what people actually want, and that post-scarcity nirvana is brought nearer because the demands being made of cameras are less stringent as a result.
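As a rough illustration of why more than 4K buys little at conventional screen sizes (a sketch using my own assumed screen size and the usual one-arcminute estimate of 20/20 visual acuity):

```python
import math

# At what distance do a 4K screen's pixels become indistinguishable to a 20/20 viewer?
screen_width_m = 1.4               # assumed: roughly a 65-inch 16:9 television
horizontal_pixels = 3840           # 4K UHD
acuity_rad = math.radians(1 / 60)  # one arcminute, a common 20/20 acuity figure

pixel_pitch_m = screen_width_m / horizontal_pixels
merge_distance_m = pixel_pitch_m / acuity_rad

print(f"Pixels merge beyond ~{merge_distance_m:.2f} m")  # ~1.25 m; most viewers sit further back
```

Past that distance the individual pixels can't be resolved, so resolution beyond 4K is effectively invisible on a screen of that size.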

There are, of course, applications beyond the mainstream, such as virtual reality, which we can reasonably hope will continue to take advantage of future developments for a long time yet. VR can require very high resolution to adequately fill the entire human visual field, depending on how the displays are built and how they're attached to the user. Higher frame rates are helpful for VR, and arguably for stereoscopy too, reducing lag and the associated discomfort. Things like HDR are likely to shake up the situation at least a little, requiring, in an ideal world, more from cameras in terms of dynamic range, noise performance and recording precision (though many existing cameras can quite happily produce HDR, too).
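To see why VR keeps pushing resolution, the same sort of arithmetic applied to a headset (assumed figures: roughly 110° of horizontal field of view per eye, and 60 pixels per degree to match one-arcminute acuity):

```python
# Horizontal resolution needed for a VR display to match 20/20 acuity.
fov_deg_per_eye = 110     # assumed horizontal field of view per eye
pixels_per_degree = 60    # one pixel per arcminute of acuity

pixels_needed = fov_deg_per_eye * pixels_per_degree
print(f"{pixels_needed} horizontal pixels per eye")  # 6600 - well beyond 4K, and that's per eye
```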

All of those are reasons why continued camera development is necessary, and an indication that we're not in the sunlit uplands of cheap, easy camera equipment quite yet. Even if you're a more mainstream filmmaker, looking to fill a 16:9 rectangle with 24 pictures every second on conventional displays, there are still things to yearn for. It's still reasonably rare to find lower-cost cameras capable of 4K at more than 60 frames per second for slow motion, for instance, and the trade-off between noise and sensitivity could always be better, as could dynamic range and colorimetry. Much as HDR has improved the representation of highlight and shadow beyond all recognition, we might reasonably hope that one day the colour reproduction capabilities of cameras and displays receive the same treatment.
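Part of the reason high-frame-rate 4K remains rare at the low end is simple data rate. A quick sketch of the raw numbers (assuming a 10-bit single-channel readout from a Bayer sensor; real cameras vary):

```python
# Uncompressed data rate of a 4K single-channel (Bayer) sensor at various frame rates.
width, height = 3840, 2160
bits_per_photosite = 10  # assumed raw bit depth

for fps in (24, 60, 120):
    gbits_per_s = width * height * bits_per_photosite * fps / 1e9
    print(f"4K at {fps:3d} fps: {gbits_per_s:4.1f} Gbit/s raw")
```

Doubling the frame rate doubles everything downstream of the sensor, which is why it still costs money.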

The value lies in people

To forestall one particular criticism, we should be clear that the abilities of the people using the equipment will never become obsolete. It has been quite some time since it became possible to get reasonable, if not absolutely world-class, results from relatively low-cost gear, and putting really experienced people behind the camera, and a really talented subject in front of it, has always been the best route to success anyway. As the gear gets better and better, then, we will be forced to distinguish ourselves by ability rather than by the fashionability of what's inside all the flight cases. As a meritocracy, that's hard to argue with, although it might make a few manufacturers nervous. It should also make anyone nervous who's ever used equipment as an excuse for mediocrity, or at least anyone who's done so in the last ten years. The equipment is now superb. If it doesn't look good, the problem is almost certainly not the camera.

As for the attractiveness of a future in which a camera can go on working for just as long as an old, faithful microphone, we may be on the cusp. For most normal jobs we're so nearly there as to make no difference, at least technically. For most of us that will be a very attractive idea, although for manufacturers, again, the prospect may seem less rosy. Still, chin up: the same situation didn't destroy the audio equipment industry.

Futuristic cityscape graphic by shutterstock.com
