
Where does camera development go next?

Retro-futurism. Image: Shutterstock

Replay: Rather than simply piling on more resolution and indulging in an AI-feature arms race, maybe the most fruitful conversations that camera manufacturers can have are with each other.

Unless you've been in orbit without an internet connection for six months, you'll have noticed that we're living in a rather special time. AI is very obviously changing everything and doing so rapidly. But there are some other aspects of the creative industries that we need to look at because we are at another kind of inflexion point, and it's one that is less shouty than AI. 

When RedShark started in 2012, we couldn't have picked a better time. We didn't know it then, but it was as if a silent starting gun had just signalled the start of a race for better quality digital cameras, more pixels and more dynamic range. What differentiated progress in the 2010s from the 2000s was that digital cinema had become a thing. 

How we got here

Sony's experiments with modified HDCAM systems to shoot at 24 frames per second were highly influential, and Thomson's Viper FilmStream camera produced fantastic pictures but needed cumbersome storage. The cameras' resolution - essentially Full HD - was only seen as adequate at the time because the experience in a typical out-of-town cinema was of surprisingly low resolution. Generational loss in printing films for distribution was a culprit: it only takes half a pixel's worth of blurring to quarter the apparent resolution. So consumer expectations for digital filmmaking weren't exactly stellar. And then came 4K. Largely thanks to pioneering work by RED Digital Cinema, and despite the tricky workflows of the early systems, it finally became possible and reasonably practical to shoot high-resolution feature films digitally and show them to appreciative audiences. 
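To unpack that half-a-pixel claim, here is a quick back-of-the-envelope sketch (my own illustrative arithmetic, not a formal optical analysis): roughly half a pixel of blur halves the finest detail you can resolve along each axis, and halving both axes quarters the usable pixel count.

```python
# Illustrative arithmetic only: assume ~0.5 px of blur halves the resolvable
# detail along each axis, so the effective pixel count drops by 2 x 2 = 4.
width, height = 1920, 1080             # Full HD frame
nominal = width * height               # 2,073,600 px
per_axis_loss = 2                      # detail halved horizontally and vertically
apparent = nominal // per_axis_loss ** 2
print(f"Nominal: {nominal:,} px  ->  apparent: ~{apparent:,} px")  # ~518,400 px
```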

In the first few years of RedShark, 4K cameras appeared at all levels, from high-end filmmaking to prosumer and consumer gear. Meanwhile, despite predictions by some in the industry that 4K would take twenty years to catch on, it quickly became impossible to buy anything less than a 4K TV in the shops. 

The speed at which 4K became almost universal (apart from ARRI, which persisted with its "old" sensor technology - with quite a bit of justification, given the satisfying results from its Alexa cameras) distracted from the astonishing development of 8K. Even as 4K cameras appeared at trade shows, there was always a corner devoted to future technology. In these less-than-glamorous locations, you'd probably find a demonstration of some 8K prototype system. What was impressive in these early incarnations of extremely high-resolution video was not the picture quality but the hardware needed to get it on the screen in the first place. Look around the back of the screen and you'd see a tree trunk's worth of parallel SDI cables, necessary because - then as now - 8K has around 85 times the pixels of SD. Today's 8K is breathtaking; back then, it lacked dynamic range and looked rather smeary, though that does not detract from the technical achievement. 
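For context on that "around 85 times" figure, the arithmetic works out as follows (standard frame sizes; the exact ratio depends on whether you take PAL or NTSC SD as the baseline):

```python
# Pixel counts for SD versus 8K UHD; "around 85x" is a reasonable round number
# sitting between the PAL and NTSC ratios.
sd_pal  = 720 * 576       # 414,720 px
sd_ntsc = 720 * 480       # 345,600 px
uhd_8k  = 7680 * 4320     # 33,177,600 px
print(f"8K vs PAL SD:  {uhd_8k / sd_pal:.0f}x")    # ~80x
print(f"8K vs NTSC SD: {uhd_8k / sd_ntsc:.0f}x")   # ~96x
```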

At around 32-35 megapixels per frame, 8K is video at the same resolution as a professional stills camera. In July 2020, when everyone was expecting Blackmagic Design to announce an 8K camera, it didn't: it launched a 12K design with around 80 megapixels per frame instead. We had gone from being unsurprised when experts said 4K would take 20 years to become widespread to being able to buy 12K cameras in the shops a mere ten years later. That is, without any hyperbole, astonishing. 
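Those megapixel figures fall straight out of the frame sizes. As a rough sanity check (the 12K dimensions used here are the commonly quoted 12288 x 6480; treat this as illustrative rather than a spec sheet):

```python
# Megapixels per frame for common 8K and 12K frame sizes (rough sanity check).
frames = {
    "8K UHD (7680 x 4320)": 7680 * 4320,
    "8K DCI (8192 x 4320)": 8192 * 4320,
    "12K (12288 x 6480)":   12288 * 6480,
}
for name, pixels in frames.items():
    print(f"{name}: {pixels / 1e6:.1f} MP")
# -> 33.2 MP, 35.4 MP and 79.6 MP respectively
```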

But what happens now? 16K? No, I don't think so. Even if pixels become free of charge, it's hard to see 8K being surpassed except in public venues and outdoor advertising, where an entire football stadium is wrapped in an LED video surface. 

The problem for manufacturers is that we've almost reached a hard stop. For ten years, camera sensors have been an almost textbook illustration of Moore's law, doubling in resolution regularly. Alongside that exponential growth, the supporting technology has kept pace with moving, processing and storing all those pixels. But now that hardly anyone is waiting for even higher resolutions, where do camera manufacturers go next? 

The answer isn't obvious at all. 

The need for standards

I'm a great fan of the idea of consolidation. At the start of the last decade, there was always a mad rush to announce new products at the major video and photography trade shows. For a publication like RedShark, it was exciting - but also problematic. It meant that two or three days per year, there would be twenty times as much news as during a typical non-show day. That left large parts of the yearly cycle when there wasn't much news at all. Now, the pressure of trade-show release cycles has diminished as developers and manufacturers take a more measured approach to innovation. It's better for everyone, and a side benefit is that products feel like they last longer because buyers don't get such a strong sense that their devices need to be updated in the space of only twelve months. 

So, now, I think the inflexion point is this. Camera makers can keep making cameras - there will always be people who need to replace older gear or are expanding their businesses and need more equipment. A period of stability is probably a good thing. But what do camera designers do now, given that the race for resolution is over? Of course, they can continue to improve their products. Maybe spend more time on user interfaces and usability. Perhaps take computational imaging into new territories, almost certainly involving AI. 

But perhaps the most helpful direction might be for competing manufacturers to start talking to each other. Figure out what future filmmaking is likely to be like. What would a brilliant camera for the next decade look like? 

If this conversation were to take place, my suggestion for the agenda would be to talk about new standards: standards covering style, quality, authenticity, security, AI codecs that use conceptual vectors rather than pixels, and other concepts and dimensions that haven't been considered yet.

Ironically, standards - which are not supposed to change - often facilitate change. It was only because of HTML that the internet could drive the biggest changes that enterprise and leisure have ever seen. What standards do we need for the future? That future will depend on the effort we put into today's standards to facilitate it.
