Building the 10-bit desktop display

A bit of history for you: 2002's Matrox Parhelia. 10-bit displays have been a thing for 15 years now.

RedShark Replay: When we discussed building workstations, there was much talk about the practicality of achieving a 10-bit display of desktop video.

To be clear, what we're talking about here is telling a piece of software such as Premiere to display its video on a second monitor that ordinarily shows the Windows desktop, as opposed to using something like one of Blackmagic's or AJA's I/O cards to drive an SDI monitor.

To forestall criticism, there are a few quite legitimate reasons one might want to do this. First, it's cheaper, which is always a legitimate if unfortunate reason to do something. It's cheaper both in terms of the I/O board and the SDI monitor connected to it. Second, perhaps strangely, some SDI I/O boards can seem slow and laggy compared to an on-desktop display. Third, if we're aiming for a YouTube finish, we're grading for sRGB in any case, and spending a lot of money on an I/O card only to end up on an sRGB computer monitor anyway is something of an exercise in perversity.

However, putting a video edit workstation's output on a desktop display creates some problems too, perhaps chiefly the issue of frame rate conversion. Computer desktop displays (and things like tablets and cellphones) commonly default to running at 60 frames per second, whereas it's still reasonably rare for the material we're working on to be at that rate. This means there has to be some sort of conversion going on between the timeline video and the desktop display.

Arbitrary conversion

That conversion is typically done fairly arbitrarily because it works much the same way for any piece of computer software that plays back video, from a Blu-ray player to YouTube to Premiere. If the application is playing back video, it'll be sending video frames to the computer's graphics hardware every (say) one twenty-fourth of a second. The graphics hardware stores that frame until the monitor needs to be updated, say every one-sixtieth of a second. The interaction between those two events means that each video frame will be shown either two or three times, but there's no specific intent to ensure a tidy repeated pattern.
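
To make that concrete, here's a minimal sketch, not taken from any particular player, that counts how many display refreshes each video frame ends up covering under the simple "hold until the next frame arrives" model described above. The rates are examples: 24fps on a 60Hz desktop gives the familiar alternating 3:2 cadence, while other pairings are messier still.

```python
# Count how many display refreshes each video frame is shown for, assuming
# a frame stays on screen until the first refresh at or after the next
# frame's timestamp. Integer arithmetic avoids floating-point edge cases.

def repeats_per_frame(fps, hz, n_frames=12):
    def first_refresh(i):
        # index of the first refresh at or after frame i's timestamp
        return (i * hz + fps - 1) // fps
    return [first_refresh(i + 1) - first_refresh(i) for i in range(n_frames)]

print(repeats_per_frame(24, 60))  # [3, 2, 3, 2, ...] - the familiar 3:2 cadence
print(repeats_per_frame(25, 60))  # [3, 2, 3, 2, 2, 3, ...] - an untidier mix
```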

Playback of video on computers is generally timed to the sound hardware's sample rate, with the video frames updated whenever necessary to keep in sync. Ordinarily the video hardware and the sound hardware will drift against one another, which will cause the pattern of repeated frames to be slightly irregular even if the monitor and the video are at the same rate. This is not great, but not usually a critical issue. Lots of video is edited like this, although it's a nice idea to set the monitor update rate to a clean multiple of the video rate you're editing.
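
As a rough illustration of that drift (the 0.02% figure below is invented purely for the example), even 25fps material on a 50Hz display will very occasionally have a frame shown for one refresh rather than two if the audio clock pacing playback runs slightly fast against the display; a slow clock would produce the odd triple instead.

```python
import math

# With playback paced by an audio clock that runs very slightly fast against
# the display clock (the 0.02% drift here is purely illustrative), 25fps
# material on a 50Hz display is mostly shown two refreshes per frame, with
# the occasional frame getting only one.

FPS, HZ, DRIFT = 25, 50, 1.0002   # audio clock assumed 0.02% fast

def repeat_histogram(n_frames=20000):
    counts = {}
    for i in range(n_frames):
        start = i / (FPS * DRIFT)        # frame timestamps on the audio clock
        end = (i + 1) / (FPS * DRIFT)
        shown = math.ceil(end * HZ) - math.ceil(start * HZ)
        counts[shown] = counts.get(shown, 0) + 1
    return counts

print(repeat_histogram())  # mostly {2: ...}, plus a handful of 1s;
                           # a slow clock would give occasional 3s instead
```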

That's easier on some systems than others. Modern Macs don't generally offer any way to do it, although third-party software can. On Windows, software such as Nvidia's control panel offers the options. Usually, rates such as 50Hz (for 25fps material), 60Hz (for 30fps material) and 72Hz (for 24fps material) are supported, which ensures that each video frame is shown for the same amount of time. Editing 24fps video on a 60Hz display can look visibly lumpy, especially on things like slow pans or material that's already been frame-duplicated to create slow motion.
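
A quick way to see which pairings are "clean" is simply to check whether the refresh rate divides evenly by the frame rate; the rates below are common examples rather than an exhaustive list.

```python
# Which common desktop refresh rates give every frame an identical number
# of repeats? A clean pairing means hz is an integer multiple of fps.
for fps in (24, 25, 30):
    for hz in (50, 60, 72, 75, 120):
        if hz % fps == 0:
            print(f"{fps}fps on a {hz}Hz display: every frame shown {hz // fps}x")
```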

Comparatively, the bit depth problem is far less visible. If we're going to be applying LUTs to the material after it's left the computer, whether in a dedicated LUT box or in a monitor, we'll naturally want to start with more precision. It's worth bearing in mind that "applying LUTs" is, in effect, something the majority of monitors do when they preprocess images to ensure that the output of that particular monitor acceptably matches a standard. So, it's difficult to say that high bit depths are ever completely useless, but these issues, which attend effectively all monitors everywhere, are not generally a source of really objectionable quantisation noise. In the part of the market where the cost of monitoring and I/O boards is a big factor, most LUTs will be applied in the computer anyway, in something like Resolve, where things are calculated at a far higher bit depth.
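
To put a rough number on why precision into a downstream LUT matters, the sketch below quantises a ramp to eight and to ten bits, pushes it through a steep shadow-lifting curve standing in for a real LUT (the 0.45 exponent is just an illustrative choice), and measures the biggest resulting jump, scaled to 8-bit display steps. The 8-bit source produces a step near black roughly twice the size of the 10-bit one.

```python
import numpy as np

# Quantise a ramp to a given bit depth, apply a steep shadow-lifting curve
# (a stand-in for a real LUT downstream of the computer), and report the
# largest jump between adjacent output values, scaled to 8-bit display steps.

def worst_step(bits, exponent=0.45):
    levels = 2 ** bits
    ramp = np.arange(levels) / (levels - 1)   # the codes the cable can carry
    curve = ramp ** exponent                  # the downstream "LUT"
    out = np.round(curve * 255)               # scaled to familiar 8-bit steps
    return int(np.diff(out).max())

print("8-bit source: worst step of", worst_step(8), "display codes")    # ~21
print("10-bit source: worst step of", worst_step(10), "display codes")  # ~11
```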

Remaining issues

Still, let's assume we've decided that a 10-bit desktop display is necessary, and we're prepared to pay for it. This has been possible at least since the days of the Matrox Parhelia graphics card, which advertised its "GigaColor" technology. Even so, it's a bit of a stretch to justify an Nvidia Quadro board just for display, since new ones are generally more expensive than a DeckLink board, but there are lower-end and used options. There are, however, two remaining problems.

The first is that software has to know how to draw in a way that supports 10-bit. Desktop displays have been an eight-bit world for decades, and that assumption is baked into the very core of how they work. In some circumstances, 10-bit output requires an OpenGL surface which has been specifically set up to support it, or similar special efforts. Photoshop, for instance, knows how to do this, but if it hasn't been done, the output won't be 10-bit regardless of the graphics card and monitor setup.
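
As a very rough sketch of what "specifically set up" can involve (and not a claim about how Photoshop or anything else actually does it), something like GLFW's Python binding lets you request ten bits per channel when creating a window. The request is only a hint: many drivers and consumer cards will silently hand back an ordinary 8-bit surface, so you still need to verify what you got, for instance by drawing a smooth gradient and looking for banding.

```python
import glfw

# Request a 10-bit-per-channel ("30-bit") window via GLFW. This is only a
# hint to the driver; a fallback to 8 bits per channel is common and silent.

if not glfw.init():
    raise RuntimeError("GLFW failed to initialise")

glfw.window_hint(glfw.RED_BITS, 10)
glfw.window_hint(glfw.GREEN_BITS, 10)
glfw.window_hint(glfw.BLUE_BITS, 10)
glfw.window_hint(glfw.ALPHA_BITS, 2)

window = glfw.create_window(1280, 720, "10-bit test", None, None)
if not window:
    glfw.terminate()
    raise RuntimeError("Couldn't create a window with the requested format")

glfw.make_context_current(window)
# ...draw a smooth black-to-white ramp here; visible banding usually means
# the driver fell back to 8 bits per channel...
glfw.terminate()
```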

The second issue is getting the picture to the monitor. The old Parhelia could take advantage of the fact that monitors of its day were often connected using analogue VGA cables, so it only needed to implement 10-bit digital-to-analogue conversion. Modern displays will be connected via HDMI or DisplayPort, both of which theoretically support 10-bit. In the case of HDMI, for example, this means that 10-bit modes (and beyond) are mentioned in the paperwork that describes what HDMI is. That does not mean all HDMI devices actually implement it; comparatively few do.

Assuming HDR gains traction, it will require higher bit depth for reasonable results, so all this may change. In the meantime, anyone desperate for 10-bit output will need to be careful.
