
Here’s what everyone wants to know: is 8-bit good enough for today’s video?


A while ago, we published an article in which we outlined why it's possibly reasonable to shoot in 8-bit, albeit under some fairly constrained circumstances. Does it still stand? Well, sort of...

The original piece was, to some extent, an act of extreme journalism, given the inevitability that many of the comments would be based on skimming the headline and skipping the article. But the point remains: a lot of cameras, perhaps almost all cameras, produce noise that's detectable in 8-bit recordings, so it would be instinctively reasonable to conclude that none of them is worth recording at more than 8-bit.

This being the internet, we should probably recap some of the caveats we examined last time. The visibility of noise in any digital image is altered enormously by the various kinds of luminance encoding that are used: in short, we put high-bit-depth pictures through the video equivalent of Photoshop's curves filter before rounding them down to eight bits. Raw sensor data, looked at on a conventional computer monitor, looks like a big field of greenish, almost-black shadows with spots of highlight detail in it. If we tried to store raw sensor data in eight or even ten-bit images and then make it viewable later, the results would be terrible, with nowhere near enough levels of gradation in dark areas.
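To make that concrete, here's a minimal Python sketch. It uses a simple 0.45 power curve as a stand-in for a real camera's gamma processing (real curves are more complex), quantising the same linear values to 8 bits with and without the curve, then counting how many distinct code values survive in the darkest part of the range:

```python
import numpy as np

# Simulate linear scene luminance from black to white.
linear = np.linspace(0.0, 1.0, 100_000)

def quantise_8bit(values):
    """Round normalised values to 8-bit code values (0-255)."""
    return np.round(values * 255).astype(np.uint8)

direct = quantise_8bit(linear)                 # linear straight to 8 bits
gamma_encoded = quantise_8bit(linear ** 0.45)  # gamma curve first (illustrative)

# Look at the shadows: linear values below 0.25 (roughly the bottom two stops).
shadows = linear < 0.25
direct_codes = len(np.unique(direct[shadows]))
gamma_codes = len(np.unique(gamma_encoded[shadows]))

print("shadow code values, linear:", direct_codes)
print("shadow code values, gamma-encoded:", gamma_codes)
```

The gamma-encoded version keeps roughly twice as many distinct code values in the shadows, which is exactly why the curve is applied before rounding down to eight bits.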

There are as many ways to do that processing as there are camera companies, but it's worth being aware that the famous Rec. 709 standard actually pulls the very darkest shades of the image down towards black, partly to make noise less visible. That's the main difference between Rec. 709 and the sRGB computer standard, which doesn't have that provision because computer images don't have noise. Unless it's YouTube, maybe, but I digress. The question is whether things have changed in the time since we last discussed this. Or, to put it another way, is it still reasonable to shoot for really nice-looking, cinematic results on something like the JVC GY-LS300? It's a great little camera, but it gives us neither recording nor output at more than eight bits per channel.
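The published encoding curves illustrate the difference in the toe. Here's a sketch using the standard Rec. 709 OETF and sRGB encoding formulae; the full shadows-pulled-to-black effect also involves display-side gamma, but even at the encoding stage the same near-black scene value lands on a lower 8-bit code under Rec. 709:

```python
def rec709_oetf(l):
    """Rec. 709 opto-electronic transfer function (linear toe slope 4.5)."""
    return 4.5 * l if l < 0.018 else 1.099 * l ** 0.45 - 0.099

def srgb_encode(l):
    """sRGB encoding function (linear toe slope 12.92)."""
    return 12.92 * l if l <= 0.0031308 else 1.055 * l ** (1 / 2.4) - 0.055

# Compare 8-bit code values for some very dark linear scene values.
for l in (0.001, 0.002, 0.003):
    v709 = round(rec709_oetf(l) * 255)
    vsrgb = round(srgb_encode(l) * 255)
    print(f"linear {l}: Rec. 709 code {v709}, sRGB code {vsrgb}")
```

Rec. 709's shallower toe keeps those near-black values (and the noise living in them) closer to code 0 than sRGB does.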

New sensors and cameras

Cameras have, of course, improved somewhat since we last discussed this, particularly with things like the Sony FS5 and FS7 and Blackmagic's 4.6K cameras, and there's the particular factor of HDR. Because HDR displays a wider range of brightness, each digital code value represents a bigger change in brightness, and the steps between them are therefore more visible (see our previous discussion: https://www.redsharknews.com/technology/item/3891-hdr-fundamentals-and-techniques-what-you-need-to-know,-part-one). We usually like to originate material at a higher bit depth than we distribute in, and it's probably going to be normal to distribute HDR material using at least ten bits. HDR itself is currently a rather fragmentary effort, subject to a lowest-common-denominator approach that limits absolute performance, but few would argue that 8-bit cameras can reasonably be expected to produce good results if we're mastering for the best possible circumstances.

In other situations, though (and let's be clear, there will continue to be circumstances other than HDR for a long time), it's worth looking at some numbers. Let's consider an imaging sensor such as the Cmosis CMV12000, which provides us with 4096x3072 pixels, a global shutter and frame rates beyond 100 per second, a specification which may recall Blackmagic's 4K cameras in a way that's not entirely coincidental. The company's documentation suggests a signal-to-noise ratio of 41.3dB. Since each additional bit doubles the number of available code values, and since a doubling of signal amplitude corresponds almost exactly to six decibels, we might assume that we could encode a signal with a 41.3dB SNR in a little under seven bits, again with significant wiggle room for gamma processing.

Now, 41.3dB is not a particularly stellar signal-to-noise ratio by modern standards, and we only need another 7dB to hit a theoretical eight bits. On the other hand, 7dB represents more than twice as much signal mixed with the same amount of noise. But haven't cameras got at least a stop faster (which might alternatively mean a stop quieter) since the Blackmagic 4K?
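That back-of-envelope arithmetic is easy to check in a few lines of Python (the figures here are just the ones quoted above):

```python
import math

# Each bit of precision is worth 20*log10(2), about 6.02dB of dynamic range.
db_per_bit = 20 * math.log10(2)

bits_needed = 41.3 / db_per_bit          # bits to encode a 41.3dB SNR: ~6.9
shortfall_db = 8 * db_per_bit - 41.3     # dB short of a full 8 bits: ~6.9dB
amplitude_ratio = 10 ** (shortfall_db / 20)  # that shortfall as a signal ratio

print(f"bits needed: {bits_needed:.2f}")
print(f"shortfall to 8 bits: {shortfall_db:.1f}dB")
print(f"signal ratio: {amplitude_ratio:.2f}x")
```

The roughly 7dB shortfall works out to a factor of about 2.2 in signal amplitude over the same noise floor, hence "more than twice as much signal".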

Why yes, they have. It's becoming clear that even quite affordable cameras (the Ursa Mini, perhaps, or the FS5) are starting to boast sensors which will be significantly hamstrung by low-bit-depth recordings. Even sensors such as the one reputedly used in Blackmagic's 2.5K camera achieve something around 88dB. 

The previous advice stands: it's still not necessary to capture all of that for every type of production, just as it isn't necessary to record everything as uncompressed DPX sequences. But now that affordable 4K sensors are starting to boast similar performance, the decision to shoot 8-bit might start to cost us more than ever before.
