Let’s start with the obvious.
We now shoot movies on stills cameras
Amazingly, it was only just before the start of the decade, in 2008, that the Canon 5D Mk II appeared – the first stills camera fully capable of shooting 1080p video. Since then we have seen the DSLR displaced by the rise of mirrorless video-capable stills cameras from Sony and Panasonic. ‘Video-capable’ is now a misnomer – the latest cameras are hybrids aimed equally at the video and stills markets.
For those on a low budget the appeal was immense. Your camera may cost a tenth of the price of a pro video camera, but it produces cinema-quality images! A vast range of high-quality, mass-produced lenses became available to filmmakers at a fraction of the cost of movie-dedicated lenses. With this revolution, a generation of new filmmakers moving from stills into video production brought with them a different set of expectations – why are standard video chips so small? Why this obsession with zoom lenses? Why can’t we shoot everything in RAW? What’s an XLR?
There was, of course, a downside to this revolution. Ergonomics took a big hit. Functions that had been taken for granted on professional video cameras for the previous three decades – built-in NDs, two high-quality audio inputs, usable monitoring, fast parfocal zoom lenses – were absent, and a cottage industry developed to add on accessories to make these cameras practical for video. Many also discovered just how much of a compromise stills lenses can be in video terms.
The S35-size sensor came to be seen as small in the rush towards bigger sensors (more on that next). One of the consequences of this was a transformation in the way we make films – particularly for documentarians. The classic way of shooting with a single shoulder-mounted camera and a long-range zoom lens was superseded by a very different style – less spontaneous, more studied. Wider lenses, gimbals and multiple cameras became the norm. And sometimes only parts of faces would remain sharp as autofocus hunted desperately for something to lock on to.
What has become apparent is that DSLR/mirrorless cameras, rather than replacing traditional video cameras in the broadcast/hire markets, have created a market of their own – the independent owner/operator, for whom the conventional TV market is irrelevant.
Sensors got bigger, resolutions higher and HDR arrived
Younger readers may be surprised that it was only just before the beginning of this decade that shallow depth of field came to be seen as a positive attribute. There was a sudden awareness that this was a key element differentiating the ‘cinematic’ from the ‘video’ look.
Shallow DoF is seductive: it isolates subjects from the background, and has the consequent effect of making those parts of the frame that are in focus look sharper.
So the S35 sensor, a format that dates back to silent-movie days, is now seen as small in the move to full frame and larger. This, of course, brings issues of its own: sometimes shallow DoF is a liability rather than an asset, and lenses become bigger, heavier and more expensive.
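The effect is easy to quantify with the standard thin-lens depth-of-field approximation. The sketch below is illustrative only – the function name is mine, and the 0.030 mm circle of confusion is a commonly assumed full-frame value, not a universal constant:

```python
def depth_of_field(f_mm, n, s_mm, coc_mm=0.030):
    """Total depth of field (mm) from the standard thin-lens approximation.

    f_mm: focal length, n: f-number, s_mm: subject distance,
    coc_mm: circle of confusion (0.030 mm is a common full-frame figure).
    """
    h = f_mm ** 2 / (n * coc_mm) + f_mm               # hyperfocal distance
    near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)  # near limit of sharpness
    far = s_mm * (h - f_mm) / (h - s_mm)              # far limit (valid while s < h)
    return far - near

# A 50 mm lens focused at 3 m: opening up from f/5.6 to f/2
# cuts the in-focus zone from roughly 1.2 m to roughly 0.4 m.
print(depth_of_field(50, 5.6, 3000), depth_of_field(50, 2.0, 3000))
```

The same arithmetic explains why bigger formats focus shallower: matching the framing of a 50 mm full-frame shot on S35 needs a shorter focal length, and the shorter lens at the same f-number gives more depth of field.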
Resolution is another matter. At the start of the decade, many pundits thought 4K would never catch on. Now 4K TVs are the norm, even if the vast majority of viewers don’t watch a 4K image on those screens. As for origination, I would guess (and it is only a guess) that the majority of filmmakers reading this shoot in 4K for final delivery in HD. Once you have tasted the option of re-framing in post without losing quality, it is hard to give it up, and there is a good argument that shooting at higher-than-delivery resolution gives better results anyway. So it looks as though camera resolutions will continue to increase whereas, for domestic use, we may have hit a sweet spot with 4K.
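The reframing headroom is simple arithmetic – a UHD frame (3840 × 2160) is exactly twice the width and height of an HD frame (1920 × 1080), so a crop of up to 2x still delivers every HD pixel without upscaling. A minimal sketch (the function name is mine):

```python
def max_punch_in(src_w, src_h, out_w, out_h):
    """Largest crop ('punch-in') factor available without upscaling.

    The crop window must still contain out_w x out_h source pixels,
    so the zoom is limited by the tighter of the two axes.
    """
    return min(src_w / out_w, src_h / out_h)

print(max_punch_in(3840, 2160, 1920, 1080))  # 2.0 – a UHD master allows a 2x crop for HD delivery
```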
HDR, which many believed was a more significant development than UHD, arrived, but few consumers seem to understand it: there is a mess of standards, confusion over which programmes are actually in HDR and which are not, and sets labelled HDR that aren’t actually up to the job.
Cameras from familiar brands dominated the high and mid range
The beginning of the decade saw the launch of the Arri Alexa, the camera that dominated movie and high-end TV production for most of the decade. Arri has an unrivalled history of producing fabulously reliable, versatile and well-engineered movie cameras, and when it entered the digital realm in 2005 with the D-20, and refined that technology with the Alexa five years later, it brought that history with it. Arriving slightly late at the digital cinema party, the Alexa avoided some of the teething problems that beset the first Red cameras. For DoPs and crews, the Alexa not only felt like an Arri; its pictures, to put it simply, looked more like film. No one, to my knowledge, complained about the picture quality of the Alexa, but because its output was not fully 4K, the original Alexas were not compliant with Netflix’s requirements for 4K TV (a striking development in this decade – TV demanding higher technical standards than cinema). Arri subsequently overcame this with larger-format cameras, but not before Red had become the go-to camera for Netflix production.
For mid-range TV production, one camera dominated the broadcast hire market – the Sony FS7. Sony had monopolised TV production in the analogue and SD digital eras (only Sony made DigiBeta gear), but it began to lose its grip on the market in the post-tape era. It was only halfway through the decade that Sony came out with what would soon become the TV workhorse – the FS7, and its younger brother, the FS7 II. An evolutionary rather than revolutionary camera, it felt very familiar to those who had grown up using ENG/PSC cameras, and had the most versatile of lens mounts. The Sony range would develop through the decade, exploiting proprietary technologies from both the high end (the Venice) and the low end (the A7 stills range) to great advantage. I think there is little doubt that the FX9 will continue this tradition.
In the next decade, I believe Arri will face stiff competition from the Sony Venice at the high end, and Sony itself has Canon snapping at its heels. Let’s not forget Panasonic, which makes great cameras too but never quite seems to claim its fair share of the marketplace. But maybe the real competition will come from somewhere else entirely.
Camera technology became commodified
In the days of film it took a massive investment in engineering to manufacture a new movie camera – it is no surprise that the Mitchell movement remained the basis of movie cameras for decades. In the second decade of the 21st century anyone, in theory, could buy a chip and a lens mount and, with a bit of electronics and CNC engineering, turn out a movie camera. A few did, and a few found it was a lot more complicated than that. So far, as seen above, the established brands have continued to dominate.
This was the decade in which the traditional companies were really challenged by new upstarts.
The action camera, pioneered by GoPro and copied by many others, showed you could make an essentially disposable camera that produced decent pictures. Perhaps the most interesting, and certainly the most disruptive, manufacturer of this era was Blackmagic Design, breaking all the rules, slashing costs, and leading those who started on the DSLR route into high-spec/low-cost video cameras. Despite its innovation, Blackmagic lacks the proprietary technology of established companies like Sony, Canon and Arri, and perhaps this is part of the reason it has never been embraced by the broadcast/hire market as it has by independents. Will it break through in the next decade? Does it need to?
Smartphones keep getting smarter
At the start of the decade I think few would have predicted just how sophisticated smartphones would become. For the non-professional, you will probably not get better stills than those from a high-end smartphone. That might be true for a few professionals too, although they might find it hard to admit it. And, yes, features have been shot on iPhones but no one will persuade me that is an easy way of working. And you still need a sound crew.
Smartphone image quality has improved exponentially over the last decade, but making professional video with them hasn't necessarily become easier.
The difference between professional and consumer equipment today is not so much about technical quality as control; a consumer device is designed to get the best results with as little intervention as possible whereas a professional device is designed to give the user the maximum options.
What smartphones show us is the amazing potential of the computational camera. These technologies are only just beginning to feed into the professional market, but there is no doubt in my mind that this is where the future lies.
Film is still with us – just
For the vast majority of filmmakers today, there is no point in even considering the significant extra cost of shooting on film. For those with the prestige and the budgets who can, this is not because the film image is better but because it is different. That difference may be very slight (authentic grain, a subtly different colour gamut) or, if you want to shoot in home-processed black and white with a clockwork camera, enormous. But better, it ain’t. For those in Hollywood who choose to shoot on film, it may be more to do with the working practices that have grown up over decades. For younger filmmakers, it may be the limitations that the film medium imposes, rather than the seemingly endless options of the digital image, that appeal. Analogue systems force you to work within their constraints and, for many creatives, that is what is exciting about them. But digital continually improves and film becomes increasingly expensive as the necessary infrastructure breaks down. I believe the next decade will be the final decade movies are shot on film.
But, beyond camera, what else happened this decade? A lot – as I will discuss in the next part of this review.