
Why are my colours wrong?

Image: Shutterstock.

Never Twice The Same Colour used to be the nickname for NTSC. But now we could say the same of our entire production chain. What is going on?

Sometimes, we ensure our technology is set up right by making the right settings automatic. When we can't do that, we present the user with a neatly-ordered list of choices, labelled clearly with copious documentation. And sometimes, we commit to an unholy combination of the worst of both, with a miasma of interacting options, software which makes dubious and inconsistent assumptions, and checkboxes with names that are nice and friendly but don't really describe what they actually do.

Such is the case with colour and brightness encoding in video.

In the interests of full disclosure, I'm not going to offer any universal solutions here; the problem takes in so much of the lens-to-screen chain that we can barely touch on the main issues. Almost a generation of trying to make things simple has in fact made things so complicated that step-by-step advice is almost impossible to give.

The problem came to prominence in the late 2000s with Canon's EOS 5D Mk. II. Like most cameras, the 5D stored its images in subsampled YCbCr format rather than RGB. The details barely matter here, other than that YCbCr is often used alongside studio-swing brightness ranges, meaning that in an 8-bit file capable of representing values between 0 and 255, black is not zero, it's 16, and white is not 255, it's 235. A wide variety of software assumed that anything stored as YCbCr used studio swing.

A wide variety of software was wrong.
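
To put rough numbers on that assumption, here's a minimal sketch in Python of how the same normalised brightness value lands in an 8-bit file under full swing versus studio swing. The function names are illustrative, not taken from any particular codec or SDK.

    # Minimal sketch of 8-bit "full swing" versus "studio swing" luma encoding.
    # Names are illustrative and not taken from any particular codec or SDK.

    def encode_full_swing(v):
        """Map a normalised luma value (0.0 = black, 1.0 = white) to 0-255."""
        return round(max(0.0, min(1.0, v)) * 255)

    def encode_studio_swing(v):
        """Map the same value to the 16-235 'video' range: 16 is black, 235 is white.
        Codes 0 and 255 are reserved, so anything out of range clamps to 1-254."""
        return max(1, min(254, round(16 + v * (235 - 16))))

    for v in (0.0, 0.5, 1.0):
        print(f"luma {v:.2f}: full swing {encode_full_swing(v):3d}, "
              f"studio swing {encode_studio_swing(v):3d}")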

The chain

The problem is that the 5D used all of the 0-255 range, so anything which assumed 16 was black and 235 was white ended up clipping whites and crushing blacks enough to be objectionable, without being unwatchable. It was routinely overlooked and there's a lot of thoroughly-mangled footage out there from the 5D and other cameras. Happily, since then, things have been standardised and unified so it very rarely occurs in modern practi... oh no, wait, hang on.
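
Here, as a minimal sketch, is that failure mode in code, assuming the simple linear expansion a player applies to studio-swing material: everything stored below 16 is crushed to black, and everything above 235 is clipped to white.

    # Full-range (0-255) footage decoded by software that assumes studio swing
    # (16-235): shadows below 16 are crushed, highlights above 235 are clipped,
    # and everything in between is stretched slightly.

    def decode_assuming_studio_swing(code):
        """Expand an 8-bit code from an assumed 16-235 range to 0-255 for display."""
        expanded = (code - 16) * 255 / (235 - 16)
        return round(max(0.0, min(255.0, expanded)))

    for code in (0, 8, 16, 128, 235, 245, 255):
        print(f"stored {code:3d} -> displayed {decode_assuming_studio_swing(code):3d}")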

The problem is not really in acquisition. Yes, there are concerns; since the 5D Mk. II we've seen the arrival of various HDR formats and an explosion of camera-specific brightness and colour encodings. It can be complicated, but it's comparatively easy to handle because we control both ends of the process. No matter what we shoot, it's likely the post-production software can handle it, and that software is likely being driven by someone with significantly more technical knowledge and willingness to tinker than the average YouTube viewer.

The problem is distribution, when we have no control over how the pictures will be treated. To an extent it was ever thus; trying to get movie theatres to all do the same thing has been tricky since before even frame rate was standardised. Now, having things look right on everyone's favourite video sharing platform - for whatever value of "right" appeared on your monitor during the edit - can involve a really unreasonable amount of trial and error.

Confusing software options

The problem isn't really the websites. Some could be clearer about exactly how material should be set up, but they're expected to handle a wide variety of brightness ranges, subsampling formats, resolutions and colourspaces, so some sort of automation based on the contents of the file is inevitable. The trick is getting the right contents into the file, which means having the data set up right and the right flags set, and that's not as obvious as it should be. For instance, one program has a checkbox called "preserve super-whites". What does that mean? Will the output use studio swing or full swing? Will the file be flagged as studio or full swing? If we uncheck it, will the out-of-range values be clipped or scaled? Sometimes, the desperation of software to seem approachable robs it of the specificity it needs to be useful.
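
To illustrate just how ambiguous that checkbox is, here are two plausible behaviours it might be describing, sketched in Python. Both are hypothetical; which one any given NLE actually implements when the box is unchecked is exactly the question the label fails to answer.

    # Two hypothetical interpretations of an unchecked "preserve super-whites" box.

    def clip_super_whites(code):
        """Discard everything above nominal white (235)."""
        return min(code, 235)

    def scale_super_whites(code):
        """Rescale so the brightest super-white (254) lands on nominal white (235)."""
        return round(16 + (code - 16) * (235 - 16) / (254 - 16))

    for code in (16, 128, 235, 245, 254):
        print(f"stored {code:3d}: clipped -> {clip_super_whites(code):3d}, "
              f"scaled -> {scale_super_whites(code):3d}")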

So, between the light hitting the sensor and the light leaving the monitor, there are a huge number of places where all this can go wrong: in camera, in the NLE itself, or in a subsystem such as QuickTime. The internal monitoring and timeline setup of that NLE can be dazzlingly complicated, and outputting the finished production back to a file presents many of the same problems in reverse.
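
One small defence, assuming ffmpeg's ffprobe is to hand (the article doesn't prescribe a tool; this is simply one that reports the relevant flags), is to ask the delivered file what colour metadata it actually carries. Streams that report "unknown" are exactly the ones that invite players to guess.

    # Sanity check: ask ffprobe what colour flags the first video stream carries.
    import subprocess

    def report_colour_flags(path):
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-select_streams", "v:0",
             "-show_entries",
             "stream=color_range,color_space,color_transfer,color_primaries",
             "-of", "default=noprint_wrappers=1", path],
            capture_output=True, text=True, check=True)
        print(result.stdout)  # e.g. color_range=tv (studio swing) or color_range=pc

    report_colour_flags("graded_master.mov")  # hypothetical file name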

The role of the user

Then, even more intractably, the part we can do nothing about: the user. Happily, most cellphones and laptops are set up to be reasonably consistent sRGB devices, although ask around a Discord server full of wannabe YouTubers and "sRGB" is likely to draw blank stares. Macs tend to understand that they have strange default display gamma behaviour and correct accordingly, although almost any laptop or workstation has more than a few buttons that will break video display.

If this all sounds like a fully-fitted, shag-pile, wall-to-wall festival of misery and disaster that's spent the last twenty years making large numbers of things slightly the wrong colour... well, yes. It is. And there is no solution other than to learn the intricacies of the software and ensure everything is correctly set up, which will work until someone alters something without publishing a change log. There is no one magic configuration that gets everything right for every target.

Fixing this situation involves an amount of standardisation we seem unlikely to achieve. In the meantime, if this has provoked anyone into the inconvenient realisation that thousands of hours of work is besmirched with these problems, well, sorry. Once seen, it cannot be unseen.
