They’re often entertaining and they undeniably create headlines, but we need to stop treating camera shootouts as if they give us the definitive picture.
There's something attractively competitive about camera shootouts – a crucible of competition in which a clear winner will be announced, and thereafter we won't need to bother analysing the performance of cameras because we'll know which one is best.
Well, OK, nobody's really trying to simplify it to that level, but there's a certain rather gladiatorial aspect to direct competition between products that makes manufacturers nervous and has the audience baying for blood. Are we not entertained? Well, possibly, but we might not actually be all that informed.
Pointing cameras at test charts in a well-lit prep bay evaluates certain aspects of their picture output, but if that were all that mattered, the Blackmagic Pocket Cinema Camera would have been the best camera in the world – certainly on a price-performance basis – more or less ever since it was launched. Take that camera – excellent though it is – outside and wave it around violently, or, if you're feeling really brave, attempt to shoot an entire feature on it handheld, and some of its other issues will come crashing to the fore. It's a great little camera, but it has noticeable rolling shutter, and it's about as much fun to handhold as – well – a DSLR. On the right production, it shines. On the wrong one, it'd be about as much fun as trying to shoot an extreme sports video on an Alexa.
Only, of course, you would rarely attempt to shoot an extreme sports spot for the internet on an Alexa, because the business model associated with that sort of production doesn't generally support fifty-thousand-dollar cameras, or even five-hundred-dollar-a-day rentals. I make no judgment here; practical video on demand began with YouTube and is currently eating the traditional model alive, so anyone with an elitist attitude about this may need to proceed with caution. The point is, though, that no test, no matter how carefully shot, is ever going to make a skateboard video producer rent an Alexa he can't come close to affording, regardless of the results. It's also not strictly a fair test; a group of engineers directed to produce a camera for the F65 market is going to make different decisions, for different reasons, than the people who designed the Sony A7S. Either way, other factors intervene.
Useful, not definitive
It's as well to be clear here that I don't mean to argue that tests such as the Radiant Images camera evaluation (given below) are not useful. They are useful. What they aren't is definitive, and, to change the subject slightly, there's also the issue of repeatability. When we attempt like-for-like comparisons we're invoking at least part of the scientific method, and in that sense there's a need for a careful (and doubtless time-consuming) writeup detailing exactly how the cameras were configured. Modern cameras have a lot of switches and dials, which multiply out into an enormous number of possible configurations. In these circumstances, the purpose of ensuring repeatability is not so much that anyone's actually going to repeat the tests, but that the circumstances are adequately documented and properly considered before the shoot.
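That combinatorial explosion is easy to underestimate. As a rough sketch – with made-up option counts that aren't drawn from any particular camera's menus – even a handful of settings multiplies out into a configuration space nobody could test exhaustively:

```python
from math import prod

# Hypothetical option counts for a few common camera settings.
# These numbers are illustrative only, not any real camera's menu.
settings = {
    "gamma curve": 6,
    "colour matrix": 5,
    "ISO": 8,
    "white balance preset": 7,
    "noise reduction level": 4,
    "sharpening level": 5,
    "recording codec": 3,
}

# Each setting is independent, so the total is the product of the counts.
total = prod(settings.values())
print(total)  # 6 * 5 * 8 * 7 * 4 * 5 * 3 = 100800 distinct configurations
```

Seven menus, a hundred thousand possible setups – which is exactly why documenting the configuration actually used matters more than any pretence of exhaustiveness.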
Radiant Images' tests have drawn several comments suggesting that the Sony A7S was not properly configured. From a subjective look at the pictures, I strongly suspect those commenters are right and that the camera is being done a terrible disservice – but we have no information to go on.
For all these reasons, camera shootouts are only slightly interesting. Yes, there is some value in keeping manufacturers honest, and it's sometimes fun to see how amazingly well some of the low-cost stuff competes with the high end – at least as seen through a six-megabit H.264 encode on Vimeo. In the real world, though, the fact that a C300 outperforms a low-cost DSLR in at least some respects is neither surprising nor particularly useful information. Comparisons within a price range are much more worthwhile. It's a shame Radiant's test didn't include a Canon 5D Mark III, which, with the Magic Lantern firmware, would be a useful comparator to the A7S – but that's another issue; nobody can include everything.
If, on an ambitious project, a camera department does not shoot tests, that's a huge oversight in any case. If there's a legitimate option to choose between several cameras, comparing them in the context of the production in question and its unique requirements is a perfectly sensible thing to do. For what it's worth, the endless configurability of modern cameras can easily mean that a single camera body needs testing in half a dozen potential configurations. Still, take heart: £50k cameras are now being compared, with a straight face, against £1k cameras. We can wax poetic about democratisation and the removal of barriers to entry, but I don't think anyone's really complaining.