20 May 2017

The Great Monitoring Dilemma: how to preserve picture quality for the DIY filmmaker

What price the state of the art? $8k in the case of Flanders Scientific's DM 250. Image: Flanders Scientific


RedShark Replay: What is the single greatest technical dilemma facing today's DIY filmmaker? Which Camera? Which OS? Which NLE? Which Plug-in? Which Colour Grading Software package? According to Craig Marshall, it’s none of the above. Instead, it’s all about the monitor.

I reckon the biggest dilemma we face today is sorting through the mire of technical data that is 'monitoring for video'. Video editors wanting to build up a post production facility today still endure a nightmare of options and technical jargon designed to confound your innermost Geek. I've been down that road over the past few years, so I want to share some of what I've learned. I've oversimplified much of this article, so you'll find the references at the end essential reading if you want to explore further monitoring options.

If you shoot video today on a modern digital camera, you'll expect to see a pretty good picture, irrespective of whether it's a consumer DSLR with a video option, a miniature Point of View camera or a professional cinema camera costing more than your car. No matter what you shoot, it would be nice to actually see what you've got, unadulterated by any system impediments. That's fair enough but unfortunately, it's not as straightforward as it first appears.

Let's suppose, like most of us, you've shot some 1920 x 1080 HD material at 24P and your camera has an HDMI output. Most cameras do, so you plug in a common computer monitor with an HDMI input, hit 'Play' and up comes a pretty picture. So far, so good, but are you actually seeing what you've shot? Does your monitor run at native 1920 x 1080 resolution? Is its calibration accurate? How can you be sure? If your monitor is not a native HD display, it could be adding visual artefacts which detract from the image you shot. The same goes for 4K or UHD material. The result is that you are not really seeing what you shot.

Now, this is an oversimplified scenario, but when we take our pictures into the chain of post production hardware and software that many of us maintain on our desktop, the situation can rapidly degenerate from simple monitor-induced artefacts to a complete colour misrepresentation.

Monitors 101

Let's talk briefly about bit depth and colour space and how to preserve the colour and detail you shot, right through your production process to your final delivery format. If today's video cameras shoot great pictures out of the box, then why does so much of the material posted to YouTube (for example) look so poor?

In the late '80s, I transitioned from being a television video camera operator to a post production facility owner. One of the things I discovered when sourcing equipment was that the broadcast videotape 'acquisition' standard of the time, Betacam SP, was a component format. To simplify, each of the three video components which made up Standard Definition Betacam was effectively recorded onto a separate track of video tape. Strictly speaking, the Luminance or 'Y' channel, the monochrome component, was allocated one large track, while the Red-Y and Blue-Y 'colour difference' signals were compressed slightly to share separate portions of an adjacent track. The point is, if those discrete components were kept separate throughout the entire post production process, then the picture quality at the end of the chain could (almost) match that of the camera original.

Back in the day, this was rarely the case and our particular facility was one of the first to achieve this goal. Fortunately, the additional expense to observe the 'component' standard soon generated financial rewards as work flooded in and many clients saw their work for the first time as it was shot. The same applies today in the digital HD and 4K domain where the highest technical standards can be maintained for but a fraction of the investment required back in the early 1990s.

In an ideal world, all video cameras would record a 16-bit RGB 4:4:4 (Red, Green, Blue) signal, uncompressed, to cheap removable storage, but as of today, we find most consumer video cameras still shooting heavily compressed 8-bit signals internally (to removable cards) with 4:2:0 chroma subsampling. In my Betacam analogy, that's '4' for the Luminance component but only '2' and '0' respectively for the colour difference signals. Many cameras now feature 'clean' HDMI outputs where an 8-bit or, better still, a 10-bit 4:2:2 signal can be output and recorded on a third-party device, which usually preserves the signal with less quality-sapping compression than the camera's internal option.
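To put rough numbers on what subsampling costs, here is a short Python sketch (my own illustration, not taken from any particular camera's specification) estimating uncompressed per-frame storage for an HD frame under each scheme, using the J:a:b notation to scale the two chroma channels:

```python
def frame_megabytes(width, height, bits, scheme):
    """Approximate uncompressed frame size in megabytes.

    scheme is the (J, a, b) chroma subsampling triple: a and b are the
    chroma samples in the first and second rows of a J-pixel-wide block.
    """
    j, a, b = scheme
    luma_samples = width * height                 # one Y sample per pixel
    # Each chroma channel is scaled by (a + b) / (2 * j); there are two.
    chroma_samples = 2 * width * height * (a + b) / (2 * j)
    total_bits = (luma_samples + chroma_samples) * bits
    return total_bits / 8 / 1_000_000

for label, scheme in [("4:4:4", (4, 4, 4)),
                      ("4:2:2", (4, 2, 2)),
                      ("4:2:0", (4, 2, 0))]:
    mb = frame_megabytes(1920, 1080, 10, scheme)
    print(f"{label} 10-bit 1080p frame: {mb:.1f} MB")
```

A 4:2:0 frame carries only half the data of 4:4:4, and the missing half is precisely the chroma detail the camera discarded before you ever saw the picture.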

An 8-bit signal enjoys only 256 individual steps or gradations per colour channel, resulting in the common '16.7 million colours' noted in the spec sheets of most cheap 8-bit computer monitors. On the other hand, a 10-bit video signal displays a much finer gradation of 1024 steps per channel, resulting in the less common '1.07 billion colours' noted in the spec sheets of high-end 10-bit PC monitors designed for pre-press, professional photographers and graphic designers. 10-bit is certainly the way to go if you can afford it. Even if you're shooting 8-bit, monitoring at 10-bit offers an excellent future upgrade path.
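Those spec-sheet figures fall straight out of the arithmetic: the step count is 2 raised to the bit depth, and the colour count is the cube of that (one factor each for red, green and blue). A quick sketch:

```python
# Levels per channel and total colours for common display bit depths.
for bits in (8, 10):
    levels = 2 ** bits        # 256 levels at 8-bit, 1024 at 10-bit
    colours = levels ** 3     # every possible R, G, B combination
    print(f"{bits}-bit: {levels} levels per channel, {colours:,} colours")
```

256 cubed is 16,777,216 (the '16.7 million'), and 1024 cubed is 1,073,741,824 (the '1.07 billion').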

Frames of reference

Backtracking to the 1920 x 1080 24P HD pictures we shot earlier: once we ingest them into our favourite NLE and start work, we'll soon want to look at our pictures on an external monitor, and this is where everything starts to become murky. How can we be sure what we are looking at is what we actually shot? If you visit a professional post production facility, you'll find they all use a 'Reference' monitor. Could you imagine building a recording studio without VU/peak meters or reference audio monitors? I know many filmmakers who regularly edit without any form of reference waveform display or reference monitoring, and this is courting disaster if you plan to submit your work to any outlet other than YouTube (and perhaps that's why a lot of YT material looks so poor...)

So what makes a good Reference Monitor? I'm not about to insist that you all go out and buy Sony's latest OLED 4K reference display or a state-of-the-art SDI display from Flanders Scientific, though these are good options if you have the budget. If you have your heart set on becoming your neighbourhood's go-to colourist, then a lot of money spent on a quality 'reference' display will be money very well spent. However, these high-end monitors are expensive, so a return on your investment could be a long time coming.

So, can you use a computer display to accurately monitor video? Well, yes and no. You can, so long as you know what you're doing and you choose wisely. As a generalisation, computer displays are designed for displaying still images, not high quality video at high frame rates. OK, televisions are designed to display video, so why not use a modern LED-backlit TV to monitor my video? Well again, you can and you can't. Let's look at this more carefully:

Computer monitors fall into three main display categories: TN, VA and IPS. IPS displays have the widest viewing angles (about 178 degrees), so they could be considered ideal for viewing video because, as you move your head from side to side in the edit suite, the colours displayed on the screen remain largely the same. Try this with your common, garden-variety TN display and see how even the slightest offset can affect the accuracy of what you see.

Another thing we need to consider is the accuracy of the computer display. 8-bit or 10-bit: how many colours will it display? 16.7 million or 1.07 billion? How uniform is it? Does it cover 100% of Adobe RGB, Rec.709 or just sRGB? Can the monitor be accurately calibrated? Fortunately, during the last few years, a number of 'affordable' (sub-$1K) 10-bit IPS computer displays have become available with 14-bit or better LUTs (LookUp Tables) built in for accurate calibration, a repeatable process using specialised software and a probe attached to the screen. (See the Reference section below for more detail on this topic.)
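The hardware LUTs in these monitors are vendor-specific, but the underlying idea is simple: each incoming channel value is looked up in a correction table measured with the probe, interpolating between table entries. A loose, purely illustrative Python sketch of a 1D LUT lookup (the function and the toy gamma table are my own, not any monitor's actual firmware):

```python
def apply_1d_lut(value, lut):
    """Map a normalised 0..1 channel value through a 1D LUT,
    interpolating linearly between adjacent table entries."""
    pos = value * (len(lut) - 1)      # position within the table
    lo = int(pos)
    hi = min(lo + 1, len(lut) - 1)
    frac = pos - lo
    return lut[lo] * (1 - frac) + lut[hi] * frac

# Toy 5-entry table: an idealised gamma 2.2 response, the sort of
# curve a calibration pass might measure and correct towards.
gamma_lut = [(i / 4) ** 2.2 for i in range(5)]
print(apply_1d_lut(0.5, gamma_lut))   # mid-grey mapped through the curve
```

A real monitor's 14-bit LUT does this per channel in hardware at far finer granularity; the point is only that calibration is a measured, repeatable mapping rather than a setting you eyeball.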





Craig Marshall

Retired freelance TV commercial and documentary producer/director/camera/editor

Website: www.HDvideo4K.com
