
Is it time for all cameras to offer ten-bit recording?

4 minute read


We revisit the 'eight-bit versus ten-bit' debate, exploring whether eight-bit recording's days are numbered.

First of all, let's be clear about this. What we're talking about here is strictly the bit depth (i.e. how many bits per colour channel are available to describe the value of each pixel) of the signals and files a camera outputs. Internally, cameras use all kinds of bit depths, mostly, if not always, much greater than eight bits.

We looked at the question of eight or ten-bit some time ago and our conclusion was - well, inconclusive, if only because there are so many other factors that can affect picture quality that the number of bits per colour channel is not as big a factor as you might think. But, if it's only a matter of two extra bits, why don't manufacturers simply build it into their cameras by default?

Barriers to ten-bit

Well, they could do this, but if they were to, there might be cost implications. First, there's a simple and rather old-fashioned reason (if indeed it's appropriate to call anything connected with digital video old-fashioned!). The signal paths that carry digital data tend to be multiples of eight bits wide. So, if you were going to have a ten-bit system, you would probably have to use chips and interconnects that were sixteen bits wide. This wouldn't always be the case, but it is a factor.

Secondly, more bits means more data. That means more bandwidth and more storage. You would need to be able to process the additional data, and that would call for additional processing power. While this isn't at all surprising, as we've just seen, the extra expense might be larger than what's proportionate to the additional two bits. It may even need a category of components that is a major step up from the ones that have been price/performance optimised for eight bits.
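To put some rough numbers on this, here's a back-of-the-envelope sketch in Python. It assumes uncompressed 4:2:2 video at an example resolution and frame rate (real recording codecs compress heavily, but the proportions are what matter):

```python
# Rough, illustrative sums for uncompressed 4:2:2 video; real codecs,
# interfaces and overheads will differ.

def data_rate_mbps(width, height, fps, bits_per_sample, samples_per_pixel=2):
    """Uncompressed data rate in megabits per second.

    4:2:2 subsampling averages two samples per pixel
    (one luma sample plus half a Cb and half a Cr per pixel).
    """
    bits_per_frame = width * height * samples_per_pixel * bits_per_sample
    return bits_per_frame * fps / 1e6

for bits in (8, 10, 16):
    rate = data_rate_mbps(1920, 1080, 50, bits)
    print(f"1080p50 4:2:2 at {bits} bits per sample: ~{rate:,.0f} Mbit/s")
```

The jump from eight to ten bits is a quarter more data before compression, and if those ten-bit samples end up padded out to sixteen-bit words somewhere inside the camera, the internal cost can be larger still.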

Nevertheless, since most of the inside of a camera works at much more than eight bits, why not allow that depth of signal to find its way out of the camera?

There may be marketing reasons for this. It's quite possible that it's used for market differentiation. We are not normally given access to manufacturers' costings versus their retail pricing.

Clarifying the issue

It may be that eight-bit is simply good enough to justify building cameras that are significantly cheaper than ten-bit ones. And perhaps we worry too much about it.

Let's just clarify one thing here. With video, strictly, the number of bits used has nothing to do with dynamic range. You can express a fifteen-stop dynamic range with an eight-bit signal, just as you can convey an eight-stop dynamic range with a fifteen-bit signal. Neither is optimal. With a fifteen-stop dynamic range and eight bits, there really aren't enough bits to provide sufficient levels to give an accurate picture. And with an eight-stop image and a fifteen-bit signal, well, it's just wasteful.
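To see why, here's a minimal worked sketch. It assumes an idealised log-style curve that shares the code values equally between stops, which real log curves only roughly approximate:

```python
# Idealised assumption: every stop gets an equal share of the code values.
# Real log curves (S-Log, Log C and so on) are more complicated than this.

def code_values_per_stop(bit_depth, stops):
    """Code values available per stop if all stops share them equally."""
    return (2 ** bit_depth) / stops

for bits, stops in ((8, 15), (10, 15), (15, 8)):
    per_stop = code_values_per_stop(bits, stops)
    print(f"{stops} stops into {bits} bits: ~{per_stop:.0f} code values per stop")
```

Fifteen stops squeezed into eight bits leaves only around seventeen values per stop, which is where gentle gradients start to fall apart; ten bits gives roughly four times as many, while eight stops spread across fifteen bits leaves about 4,096 per stop - far more than anyone needs.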

Log recording makes better use of the available bit depth. But it doesn't help much once you start severely stretching colours or luminosity in post production. One step you can take to make your eight-bit recordings go further is to convert them to ten-bit using a codec like ProRes. This won't add information - the image will look the same - but it will make additional levels available to you if you want to do some subtle grading or gain changes. Otherwise, you'll be compounding the 'damage' that eight-bit processing can do to eight-bit signals.
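Here's a toy simulation of that 'compounding damage', assuming a deliberately crude pipeline that rounds back to eight bits after every operation (the gain values are arbitrary; real grading software normally works at higher precision internally, which is exactly the point):

```python
import numpy as np

src = np.arange(256, dtype=np.uint8)   # every possible 8-bit level
gains = (0.5, 2.0)                     # e.g. pull the levels down, then push them back up

# Eight-bit all the way: round back to 8 bits after every operation.
eight_bit = src.astype(np.float64)
for g in gains:
    eight_bit = np.clip(np.round(eight_bit * g), 0, 255)

# Higher-precision intermediate: do the maths first, round once at the end.
hi_prec = np.clip(np.round(src * gains[0] * gains[1]), 0, 255)

print("distinct levels after the 8-bit pipeline:           ", len(np.unique(eight_bit)))
print("distinct levels after the higher-precision pipeline:", len(np.unique(hi_prec)))
```

Rounding once at the end keeps every original level; rounding back to eight bits after every step throws roughly half of them away in this example.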

There's a big irony with all of this. It's that high quality video signals can look worse in eight-bit than lower quality ones. That sounds surprising, so let's just attempt to explain it.

When ten-bit matters

Most of the time (absolutely the vast majority of the time), you can't tell the difference between eight and ten-bit video. The reason is that if there's any detail at all in the picture, it will mask the effect of eight-bit quantisation. It's only when there are very shallow or smooth colour gradients in a picture that you'll really notice the effects of eight bits. A blue sky is a classic example.

Blue skies aren't really blue at all – not in the sense that blue is a single colour. You actually need thousands of shades of blue to express the real nature of a blue sky, which is that it changes from a deep blue to a very light blue or, worse, from a very light blue to an even lighter blue. At some point, there is simply too much distance between the discrete levels of blue that can be described by an eight-bit signal. So you start seeing contours: distinct jumps between colours.
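A toy example makes the arithmetic behind those contours easy to see. Here a sky-like gradient spanning only a small slice of the signal range (the exact span is an arbitrary assumption) is quantised at eight and ten bits:

```python
import numpy as np

# A shallow gradient - light blue getting slightly lighter - covering
# just 5% of the full signal range across a 1920-pixel row.
ramp = np.linspace(0.70, 0.75, 1920)

for bits in (8, 10):
    levels = 2 ** bits - 1
    steps = len(np.unique(np.round(ramp * levels)))
    print(f"{bits}-bit: {steps} distinct levels, roughly one band "
          f"every {1920 // steps} pixels")
```

At eight bits the whole gradient lands on only a dozen or so code values, so each band is wide enough to see; at ten bits there are several times as many, and the steps become far harder to spot.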

These contours are always there, but you don't normally see them because they tend to follow the shapes of objects. So, they merge into the objects themselves. With a blue sky, there are no discrete objects, so the contours are forced into the open.

When you add noise to the picture (or if it's there in the first place), it changes the boundaries of the quantised colours. It confuses the contours. It can almost make them disappear. The irony is that if you have a camera with a lower quality, noisy sensor, it can actually mask eight-bit quantisation. You might assume this means the overall amount of information is reduced, but that isn't necessarily the case.
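Here's a sketch of that effect, reusing the same shallow gradient and adding a little noise before the eight-bit rounding. It's a toy model, not a simulation of any real sensor, and the noise level (about half a code value) is an arbitrary assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
ramp = np.linspace(0.70, 0.75, 1920) * 255   # the same shallow gradient, in 8-bit units
n_rows = 100                                 # a patch of 'sky' to average over

clean = np.round(ramp)                       # hard 8-bit steps, no noise
noisy = np.round(ramp + rng.normal(0, 0.5, (n_rows, ramp.size)))  # ~0.5 LSB of noise

def rms_error(x):
    """RMS distance from the true, unquantised gradient."""
    return np.sqrt(np.mean((x - ramp) ** 2))

print(f"clean 8-bit steps:                  {rms_error(clean):.3f} code values")
print(f"noisy 8-bit, averaged over a patch: {rms_error(noisy.mean(axis=0)):.3f} code values")
```

Seen over an area, the noisy version tracks the true gradient several times more closely than the clean quantised steps: the hard contours have been traded for fine grain, which the eye finds much less objectionable.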

It's important to note that this is probably the only benefit of having a noisy camera, but it does make a significant difference – enough of one that it might even support our original conclusion, that in real-world, practical terms, it really doesn't matter whether you have an eight-bit or ten-bit camera.

It's also important to say that if you have a high-end camera, one that can maybe shoot with a bit depth of sixteen bits, it's pretty pointless outputting through an eight-bit SDI port, unless it's for uncritical monitoring. Outputting raw will mostly circumvent this anyway.

So, is it time for all cameras to offer ten-bit recording?

To return to the original question of whether camera manufacturers should make all their professional cameras output ten-bit video – the answer is kind of 'yes', unless there are strong commercial or technical reasons for them not to do so. But if they stay with eight-bit, don't worry. It's very likely to be perfectly good enough.

Tags: Technology
