
Digital was supposed to simplify everything - it really hasn't


The modern lived experience of digital media production is a complex chain of processes, conversions and conventions. We can do more, but we have to do more as well. It was all meant to be so simple...

"Perfect sound forever" was the gloriously optimistic slogan from the Phillips marketing department as it launched the Compact Disc format in 1982. In hindsight, it's obvious that CD audio isn't perfect, and the discs don't last forever. But it was a brilliant marketing phrase that probably wasn't meant to mislead.

Digital media is tricky to explain because it's abstract and based on maths. But "Perfect sound forever" captures digital audio's essence for consumers: numbers don't degrade over time, and you can store perfect copies. It also hints at the optimism of a new era, where analogue and its artefacts quickly fade into a distant and unloved past.

There was good reason for hope. Analogue is full of weirdness, weaknesses and annoyances because it relies on the physical properties of materials like vinyl, plastic tape and wires. But digital media boils down to a single logical distinction: whether a voltage represents a one or a zero.

So digital presented us with a tabula rasa: a completely blank slate on which to draw a new and coherent future, where we could design problems away from the start. In short, it was an opportunity to create a nearly perfect system where resolutions, frame rates and formats are all standardised, and everything is compatible with everything else. Done right, it would be like the metric system, where everything is coordinated, and you never have to multiply by awkward numbers like, say, 29.97.

Wait. What? That obviously didn't happen.

Welcome to digital dystopia

Instead, we have a digital dystopia with more variables than we ever had with analogue. And we do have to work with numbers like 29.97. So how did it come to this?
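For anyone who hasn't met it, 29.97 is NTSC's legacy: colour television slowed the nominal 30 frames per second by a factor of 1000/1001, and "drop-frame" timecode papers over the mismatch by skipping two frame numbers each minute, except every tenth minute. The Python sketch below shows the arithmetic involved - an illustration of the standard conversion, not production code.

```python
# Why 29.97 exists, and what drop-frame timecode does about it.
# A sketch of the standard conversion, not production code.

NTSC_FPS = 30 * 1000 / 1001  # ~29.97002997 frames per second

def frames_to_drop_frame_timecode(frame: int) -> str:
    """Convert a frame count to HH:MM:SS;FF drop-frame timecode."""
    frames_per_min = 30 * 60 - 2                 # 1798: two frame numbers dropped per minute
    frames_per_10min = 10 * frames_per_min + 2   # 17982: minute 0 of each ten drops nothing
    tens, rem = divmod(frame, frames_per_10min)
    frame += 18 * tens                           # 18 numbers skipped per full ten minutes
    if rem > 1:
        frame += 2 * ((rem - 2) // frames_per_min)
    ff = frame % 30
    ss = frame // 30 % 60
    mm = frame // 1800 % 60
    hh = frame // 108000
    return f"{hh:02d}:{mm:02d}:{ss:02d};{ff:02d}"

print(NTSC_FPS)                             # 29.97002997002997
print(frames_to_drop_frame_timecode(1800))  # 00:01:00;02 - frame numbers 00 and 01 skipped
```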

It's because the transition was gradual. It had to be. Significant infrastructural changes rarely happen in an instant. But one did, in 1967.

After decades of debate, the Swedish government made arrangements to switch from driving on the left to the right. By definition, this can't happen gradually. According to the Volvo Owners Club:

"On September 3, 1967, at 04.50 in the morning, the traffic everywhere in Sweden was directed over to the right side of the road and stopped. Everything stood absolutely still for 10 minutes, and at 05.00, when it started again, all road users in Sweden, from heavy trucks to cyclists, were already on the right side of the road, and they have stayed there since."

Replacing analogue with digital in a single deft move was never a possibility. Each part of the analogue ecosystem went digital at a different time. And somehow, nothing met up in the middle, and the great "digital reset" never happened.

No one was in charge of the digital revolution. Instead, companies innovated at their own pace. Until the RED One - a digital 4K cinema camera - came out in 2007, broadcast standards dictated video formats. RED broke away from this and said you could have any resolution and aspect ratio within the sensor's geometry. The Irvine, CA, camera company made its own raw video codec, and RED footage wasn't compatible with anything, because it was ahead of its time. But this was just an early glimpse of the modern world, where there are dozens of codecs, complete flexibility over resolutions and aspect ratios, and where digital gives us better cinematic images than even the best film could.

Nor was the technology ready for an instant change-over. Modern consumer devices can easily handle digital media, but no sudden upgrade in digital processing accompanied the arrival of CDs in 1982. The early proofs of concept for CDs featured a sleek-looking player hard-wired to a hidden wardrobe full of electronics that, only a few years later, would fit on a single chip. Early RED footage called for bespoke processing cards or modern GPUs; today, within reason, laptops can do the job.

The digital multiplier

Digital media has so many advantages that it would take a whole book to explain them. But it still isn't simple.

Instead, the digital era has multiplied the factors you have to get "just right" in production and post-production. You can see how this happened.

With today's cameras, it's typical to shoot video at a much higher resolution than viewers will see in their homes. That means you need multiple copies of the production, each at a lower resolution. The files from the camera might be in any of several formats: ProRes, ProRes RAW, a proprietary raw format, H.264, H.265, I-frame-only codecs and so on. Each video file will have associated metadata and, when edited, will also have project and colour data. More versions emerge after colour grading, for SDR and for HDR in all its flavours.
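To make that fan-out concrete, here's a minimal Python sketch of one master becoming several deliverables. The version names and resolutions are hypothetical, chosen only to illustrate the bookkeeping; a real delivery spec would be far longer.

```python
# One master fans out into many deliverables. Names and resolutions
# are hypothetical, purely to illustrate the bookkeeping.

MASTER = (7680, 4320)  # an 8K master, for example

DELIVERABLES = {
    "uhd_hdr":         (3840, 2160),  # HDR grade
    "uhd_sdr":         (3840, 2160),  # separate SDR grade
    "hd_broadcast":    (1920, 1080),
    "social_vertical": (1080, 1920),  # a different aspect ratio entirely
}

master_ar = MASTER[0] / MASTER[1]
for name, (w, h) in DELIVERABLES.items():
    # A changed aspect ratio means reframing the shot, not just scaling it.
    action = "downscale" if abs(w / h - master_ar) < 1e-9 else "reframe + scale"
    print(f"{name:16s} {w}x{h}  {action}")
```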

There's more digital media produced too. Film was expensive, but memory is reusable. So it makes sense to shoot more to give additional options to editors or to make sure the camera team has really got the shot. Shooting ratios went through the roof when digital video arrived. At the very least, this dramatically changed the economics and even the narrative options available in documentary-making.
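The arithmetic behind a shooting ratio is trivial, but it makes the shift concrete. The figures in this Python sketch are invented purely for illustration; they don't describe any real production.

```python
# Shooting ratio: footage captured versus footage in the final cut.
# The figures below are hypothetical, chosen only to show the calculation.

def shooting_ratio(hours_shot: float, runtime_minutes: float) -> float:
    return (hours_shot * 60) / runtime_minutes

# A film-era documentary might ration itself to 30 hours of stock for a
# 90-minute film; a digital one can shoot 300 hours for the same cut.
print(f"film era:    {shooting_ratio(30, 90):.0f}:1")   # 20:1
print(f"digital era: {shooting_ratio(300, 90):.0f}:1")  # 200:1
```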

The burgeoning number of media distribution platforms - including social media - multiplies the quantity of assets. Instead of "one format for everything", we now have "everything needs its own format".

Added complexity

So we're left with more varieties and variants of media than we ever had with analogue. The modern lived experience of digital media production is a complex chain of processes, conversions and conventions. It's not easy at all. The achievable quality of digital media is going up, but so is the number of things you have to get right to produce it. An unreasonably pessimistic picture, perhaps, but "setting up an HDR colour grading workflow is laughably simple", said no one, ever.

Meanwhile, we must deal with the unwelcome fact that "digital" doesn't last forever. Old drives die. Magnetic imprints fade. Obsolete interconnects prevent access to perfectly good media. There are, and always will be, analogue factors in a digital world. Every workflow should include a plan for things not working as they should.
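What might such a plan look like in practice? One common ingredient is fixity checking: record a checksum when media is ingested, and verify it before trusting an old copy. This Python sketch shows the idea; the file name and the choice of SHA-256 are assumptions for the example, not a prescription.

```python
# A fixity check: hash a file at ingest, verify it later. If the hashes
# no longer match, the "perfect copy" has quietly stopped being perfect.
import hashlib
from pathlib import Path

def file_sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Hash a file in 1 MB chunks so large camera files never need to fit in RAM."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def verify(path: Path, checksum_at_ingest: str) -> bool:
    """True if the file still matches the checksum recorded when it was ingested."""
    return file_sha256(path) == checksum_at_ingest

# Hypothetical usage:
# checksum = file_sha256(Path("A001_C002.mov"))   # record this at ingest
# assert verify(Path("A001_C002.mov"), checksum)  # run this before trusting a restore
```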

But there's good news too. Technology is thousands, if not billions, of times more capable than in 1982, when CDs arrived. Even so, it's tempting to think that if we didn't have all these incompatibilities, we wouldn't need a workflow.

Really? Can that be right?

Possibly not. But think back to the days of film. Was there a workflow then? Essentially, you'd take the film out of the camera, send it off to the labs, edit it, and show it in a cinema. Well, it wasn't quite as simple as that, not least because of sound, which required editing and mixing of its own. It's tautologous to say that there were no digital workflows before digital, but I would suggest that today's digital workflows are far more complex than their analogue equivalents ever were.

Has it all been worth it? Definitely, yes, and obviously so. What we can do with a digital workflow is so far beyond what could have been dreamed about a few decades ago that there's no going back. It's just that... it could have been so much easier if we had standardised at the start.
