
Why are there so many camera codecs?


Sony F55

Why can't cameras all use the same codec?

 

It's something that we all probably have an opinion on, and when you meet people and ask them about it, it's very likely that it's the same one: why can't manufacturers agree on standardising their camera codecs?

But what on the face of it can seem just plain maddening does have some logic behind it.

What happens when you get a codec choice wrong? You can end up losing one of the most famous, most valuable brands of all. Sony managed this with the Walkman.

What happened was this.

Sony brought out the MiniDisc format. It was, in a way, revolutionary: it brought affordable digital recording to consumers on an optical disc that was far more robust than the earlier Digital Audio Tape (DAT) format, which was expensive, complicated and - to the horror of the largely paranoid recording industry - capable of making perfect copies of already "master quality" CDs.

So DAT was sidelined except as an esoteric enthusiast's toy or in recording studios, where it became a medium for delivering the day's work.

MiniDisc, on the other hand, used compressed digital audio. To fit a whole album onto the small discs, the audio needed to be shrunk, and Sony developed its own codec, called ATRAC - and it sounded pretty good. But it had a fatal flaw: it wasn't compatible with MP3, the upstart codec that everyone else, including Napster, an early file-sharing service, was using.

So very few people wanted to buy into the Sony system that wouldn't even let them play files that they could buy or (ahem) download from the internet.

It certainly didn't help when Apple's iPod came along.

Within a year or two, file-based portable players were the norm and Sony's "Walkman" brand was out of favour in a way that could conceivably have been avoided.

Differentiating products

Meanwhile, in the world of professional video, it's always been the case that manufacturers have had their own formats. You can understand this up to a point: they're businesses, and as such they have to have ways to keep their products differentiated from their competitors.

It's a shaky argument, though: train companies, at the very least, conform to a standard track gauge. But then, their trains would be pretty useless if they didn't.

Whatever's gone on in the past, it's a good time to ask again why there are so many variations on the same theme, with so many manufacturers creating new variants of what is essentially the same codec: H.264.

Now, this discussion has the potential to get very technical, very quickly; but I'm not going to go there.

I just want to relate the gist of a very good conversation I had on a recent visit to Sony Professional's European headquarters in England.

Talking to an extremely knowledgeable in-house codec guru, I asked the inevitable question: why does Sony (and other manufacturers, for that matter) keep bringing out new codecs?

The answer was one that I hadn't thought of before and which does make sense.

Not a stand-alone codec

It seems that H.264 - or AVC, or MPEG-4 AVC, whatever you want to call it - is not a codec in a stand-alone sense. You can think of it more reasonably as a kit of parts for building a suite of codecs, with different bitrates, resolutions and colour sub-sampling, as well as a choice of inter-frame or intra-frame encoding.

There are even more detailed options, like the choice of entropy coding (CAVLC or CABAC), that very few people outside the industry understand.
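To make the "kit of parts" idea concrete, here's a minimal sketch of the kinds of choices that get bundled into each manufacturer's H.264 "package". The class and field names here are hypothetical, invented purely for illustration - they're not a real camera or codec API:

```python
from dataclasses import dataclass

@dataclass
class H264Package:
    """One manufacturer's particular assembly of H.264 'parts' (hypothetical)."""
    bitrate_mbps: int        # target bitrate
    resolution: tuple        # frame size in pixels
    chroma_subsampling: str  # colour sub-sampling, e.g. "4:2:0" or "4:2:2"
    intra_only: bool         # True = intra-frame only; False = inter-frame (long-GOP)
    entropy_coder: str       # "CAVLC" or "CABAC"

# Two very different "codecs" assembled from the same H.264 toolkit:
intra_style = H264Package(200, (3840, 2160), "4:2:2", True, "CABAC")
long_gop_style = H264Package(24, (1920, 1080), "4:2:0", False, "CABAC")
```

Both of these are "H.264", but files built from such different parameter sets aren't interchangeable unless the decoder on the other end happens to support every choice that was made.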

So, it is armed with this raw kit of parts that the camera designers figure out how to bring out the best in their new devices.

It's hardly surprising, then, that there are enough variations in the final "packages" to say that they are not, natively, mutually intelligible.

It's a bit like the difference between Dutch and English. In terms of linguistic DNA, they're probably 98% the same, but the average English person can't understand the average Dutch person.

If the manufacturers were to liaise with each other, this situation could probably be avoided, but, if it really is the case that the main differences are to do with optimising their camera designs, then to talk to competitors would also be to give away the (competitive) advantages of your new product.

What's more, there may actually be some big or small aspect of a manufacturer's new product that absolutely demands a specialised version of a codec.

All of this kind of makes sense. On hearing this, my feeling was "why can't they all just use ProRes?".

Well, they could, technically. But there are licensing issues with implementing ProRes, and it's not always easy to get permission from Apple to use it - especially for encoding. What's more, ProRes is not the most recent of codecs, so it's not the most efficient either. That matters when you're recording to expensive solid-state memory.

But many new cameras really do have specific needs for their own optimisations of the H.264 codec.

Squeezing the maximum quality out of a new camera is probably a better number one aim than potentially throwing away some (albeit probably very small) aspect of performance in order to ensure better interoperability with other cameras.

And it's very easy to say "these guys could do it better by doing x,y, and z", and we're entitled to say that. But, you also have to concede that the traditional camera makers really do know what they're doing. There's no substitute for having worked with analogue and digital video for multiple decades.

 

Tags: Technology
