
When it comes to HDR, is bitrate more important than format?

4 minute read

The good ship EBU has given HLG its approval. Image: Shutterstock

The EBU's catchily-named TECH REPORT 038 describes a subjective evaluation of HLG for HDR and SDR distribution and concludes that it's just as good as PQ. The real story, though, looks to be the bitrates broadcasters will typically allow for the signal.

Format wars are much like real wars, at least in that we'd all prefer them, if they have to happen at all, to be fought fairly and concluded quickly. In this context, the notional battle between the two most popular open standards for HDR distribution could hardly have been conducted more fairly.

There are essentially two: Hybrid Log-Gamma, the BBC-NHK collaboration, and the Perceptual Quantiser, standardised by SMPTE as ST 2084. Both have been commercialised: HLG on its own merits, and PQ as the fabric underlying both Dolby Vision and the Consumer Electronics Association's HDR10 media profile. What's really interesting about all this is that despite the existence of many commercial entities with a stake in the outcome (including production equipment manufacturers, broadcasters, and the consumer electronics trade), the European Broadcasting Union has just published a technology report containing a competitive assessment of the two systems based entirely on subjective testing.
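For the mathematically curious, the two curves look like this. This is a minimal Python sketch using the constants published in ITU-R BT.2100 for HLG and SMPTE ST 2084 for PQ; the side-by-side framing is ours, not the EBU's.

```python
import math

def hlg_oetf(e):
    """HLG (BT.2100): scene-linear light e (0..1) -> non-linear signal (0..1)."""
    a, b, c = 0.17883277, 0.28466892, 0.55991073
    return math.sqrt(3 * e) if e <= 1 / 12 else a * math.log(12 * e - b) + c

def pq_inverse_eotf(nits):
    """PQ (ST 2084): absolute display luminance in cd/m^2 -> signal (0..1)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    y = (nits / 10000) ** m1
    return ((c1 + c2 * y) / (1 + c3 * y)) ** m2

print(hlg_oetf(1.0))          # ~1.0: nominal peak scene light
print(pq_inverse_eotf(1000))  # ~0.75: 1000 nits uses about 3/4 of PQ's code range
```

The structural difference is visible in the signatures: HLG encodes relative scene light, while PQ encodes absolute display luminance, which is part of why PQ benefits from metadata, or a scheme like Dolby Vision, to adapt to a particular screen.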

With so many commercial interests at play, it takes a reasonably detached organisation such as the EBU to do something like this. The EBU worked in concert with IRT (the Institut für Rundfunktechnik, operating in Germany, Austria and Switzerland), Orange (who will not enjoy being referred to as such, but which is fundamentally the mobile phone company) and the Italian broadcaster Rai (Radio Audizioni Italiane until 1954; now Radiotelevisione italiana and not to be confused with the Dutch convention centre). The purpose of the exercise was twofold: to compare HLG and PQ, and to establish whether HLG's attempt at backward-compatibility works properly. What's interesting are the results, which quite strongly suggest that HLG is just as good (for some measure of “good”) as PQ and does indeed produce a workable image on both HDR and SDR displays. Equally interesting is the way those results were arrived at, because how such tests are performed naturally has a huge bearing on what they find.

The first point is that HLG, which we might expect to be less capable than PQ, made the lay audience at least as happy. The graphs aren't quite coincident (HLG pulls ahead by a small amount) but we wouldn't necessarily have expected parity, and this raises some interesting questions. The display used for the tests was Sony's BVM-X300, one of the most capable video displays currently available, although its peak brightness was limited to 1000 nits for the sake of the test. That's 66% of the X300's 1500-nit capability, but it matches the peak level assumed by the common standards designed for home TVs. The X300's picture is, however, greatly enhanced by the low black levels of its OLED technology, which will not be available in the sort of TVs that achieve 1000-nit peak whites.

We might also voice concern that the 1000-nit target for TVs adopted for the test is rather low. Many domestic displays fail to achieve it, and even those that do are less than a stop brighter than some conventional TVs and computer monitors, which sometimes exceed 500 nits anyway. Such a display lacks the sheer punch of a 4000-nit Dolby Pulsar, and while we wouldn't expect to sell that sort of monitoring technology directly into the home, the UHD Premium 1000-nit target can sometimes look barely better than the standard-dynamic-range status quo.
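As a quick sanity check on those claims, here's the gap between the various peak luminances expressed in photographic stops, a back-of-the-envelope Python sketch using only the figures quoted above:

```python
import math

# One stop = one doubling of luminance.
def stops_between(nits_a, nits_b):
    return math.log2(nits_b / nits_a)

print(stops_between(500, 1000))   # 1.0  : a bright conventional TV vs the test target
print(stops_between(1000, 1500))  # ~0.58: the test limit vs the X300's full capability
print(stops_between(1000, 4000))  # 2.0  : the test target vs a Dolby Pulsar
```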

Subjective results and test limitations

Whether these concerns undermine the results, especially the risk of reaffirming a set of under-ambitious targets that may limit the commercial success of HDR, is a matter of interpretation; it depends on predicting what home TVs will become. Together, though, these issues might go some way to explaining the apparent lack of differentiation between HLG and PQ. If home TVs are going to stay at the 1000-nit level but eventually achieve better blacks, as seems likely given the move toward OLEDs, that's fine. The commercial pressure against this from manufacturers who simply want a new badge to slap on a TV set is obvious. The EBU, for its part, is well aware of the limitations of its test setup and states that “for displays with higher peak luminance capabilities the performance of the HDR technologies should be tested considering these new conditions.” The influence of ultra-low OLED black levels combined with high peak brightness, a pairing available via the X300 but not domestically, is another matter entirely.


Backward compatibility

The other significant conclusion is that HLG's backward-compatibility apparently works nicely. The tests involved comparison against a manually graded SDR picture, and the report states that there were certainly visible differences between HLG displayed as SDR and the “native”, manually graded SDR signal. This conclusion is open to the criticism that manual grading is even more subjective than the test process itself, but it seems that the (enormous) convenience of being able to broadcast HLG as either HDR or SDR is well realised. For some kinds of broadcast, such as live events, that convenience alone is more than enough to drive adoption. As short-film makers who grade on cheap computer monitors have known for years, a picture doesn't have to look identical to a reference in order to be acceptable and watchable.

Dolby Vision was not assessed, presumably on the basis that it is an entirely commercial project. The pity of it is that Dolby's effort is the only system actually capable of accounting for the capability of the display and adjusting the image accordingly, which makes it the best placed to deal with potential future improvements. In the long term, we might end up wishing everyone had paid the Dolby fee per television, although the reluctance of manufacturers, broadcasters and ultimately consumers is understandable.

Perhaps the greatest irony, though, is that these tests, framed as an assessment of high dynamic range, show something else much more clearly. They also included an assessment of various levels of compression, from a tiny 2.5Mbps to a much more generous 20Mbps of HEVC. HEVC is a modern, efficient codec, but for UHD images 2.5 megabits per second is a microscopic data rate. Perhaps unsurprisingly, picture quality as reported by non-engineers was affected vastly more by bitrate than by the type of HDR signal in use.
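To put that in perspective, here's some rough arithmetic on how many bits each pixel actually gets. The 50 frames per second is our assumption for a European UHD broadcast, not a figure from the report:

```python
# Bits available per pixel per frame at the tested bitrates,
# assuming 3840x2160 at 50 frames per second.
width, height, fps = 3840, 2160, 50
pixels_per_second = width * height * fps

for mbps in (2.5, 20):
    bpp = mbps * 1_000_000 / pixels_per_second
    print(f"{mbps} Mbps -> {bpp:.4f} bits per pixel")

# 2.5 Mbps -> 0.0060 bits per pixel
# 20 Mbps -> 0.0482 bits per pixel
```

Even at the generous end, HEVC has well under a twentieth of a bit to describe each pixel of each frame; at 2.5Mbps it has roughly a hundred-and-sixtieth.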

How a person reacts to this news should, of course, be read in light of how responsible that person is for the cost of satellite transponder bandwidth.

Graphic: shutterstock.com

Tags: Production
