HDR promises to be one of the biggest advancements in picture quality in a very long time. But is it living up to the hype, and is it really delivering the increases in quality that were expected?
Let’s face it, we like great images, and HDR, when it was announced, promised great things; combined with a wider colour gamut (WCG), it could surely only mean a more lifelike image. It was great to see adoption both from streaming services, including Netflix and Amazon, and on UHD Blu-ray discs. At the inception of HDR there were two standards: HDR10 and Dolby Vision.
Now, this shouldn’t come as a surprise. After all, we’d seen it all before in the VHS vs Betamax and Blu-ray vs HD DVD wars, to name just two. However, hindsight is a wonderful thing, and if experience is anything to go by, we’ve all learnt that technical superiority doesn’t necessarily guarantee victory. Would it be different this time around? Could the two standards peacefully co-exist? Well, as it turns out, that was just the start.
As someone who is always looking for the next big thing, I was happy to buy into this idea, but, wanting to hedge my bets, I waited until I thought I could have the best of both worlds. I’d seen all the demos, been to the trade shows and had even seen the presentation at Dolby HQ in Soho. I was ready to embrace this next leap in quality delivered to the home. I bought both an LG OLED TV with support for HDR10 and Dolby Vision and a Sony UHD Blu-ray player, which didn’t have Dolby Vision support; the reason was that, at the time, no UHD Blu-ray player properly supported Dolby Vision (I have learnt never to trust manufacturer promises). As a subscriber to both Netflix and Amazon Video, I also had access to their HDR content in both HDR10 and Dolby Vision. I was fortunate in being able to view both HDR standards from multiple sources and sit in judgement.
From a streaming point of view, I started off with series 2 of The Grand Tour on Amazon Video. I was rather disappointed. There were issues with both colour balance and light levels. Unfortunately, it was recorded on a very sunny day and you could see the effect that had on the picture: very bright grass in the background through the tent window, and skin tones that made everyone look slightly flushed with a magenta hue. It was a very difficult situation to light. Yes, I had calibrated the TV beforehand, and here lurks another rat hole it would be so easy to venture down – maybe another time. Luckily, things settled down, and the second programme in the series was much better. However, I still think <old man filter on> that we’re not seeing the delivery of blow-your-socks-off images. We’re still in the realm of uncompressed SD levels of picture quality, although on much larger screens <filter off>.
Then, when the new Star Trek: Discovery series started on Netflix with Dolby Vision, I decided to give that a try – again, another disappointment. This time the issue was noise, in the form of digital grain added to the image in post-production, which really upset the HDR presentation. HDR exacerbates noise in any guise, and that in turn plays havoc with the compression. We all know that a bit of black crush can hide some noise, but this time the HDR was bringing it to the fore. The HDR presentation on UHD Blu-ray was better – not many compression artefacts to see, but noise was sometimes still an issue. I think I can hear some of you screaming ‘artistic intent’ – don’t get me started on that discussion. I’m saving that one for another day.
I’ve also been watching some of the BBC iPlayer UHD streams of the World Cup in HLG, and here I can see a bit of improvement. I’m seeing better highlights when part of the pitch is in shadow and part in bright sunlight; switch to the live broadcast coverage, though, and the highlights are badly burnt out. I even suspect that the normal HD image may be suffering from having its exposure set to comply with the HLG stream.
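For the technically curious, the reason HLG copes better with a pitch that is half shadow, half sunlight comes down to its transfer curve: the BT.2100 HLG OETF uses a square-root law for the darker part of the scene and a logarithmic law for highlights, so bright areas are compressed rather than clipped. A minimal sketch in Python, using the constants from the BT.2100 specification (the function name is my own):

```python
import math

# HLG OETF constants as defined in ITU-R BT.2100
A = 0.17883277
B = 1 - 4 * A                   # 0.28466892
C = 0.5 - A * math.log(4 * A)   # 0.55991073

def hlg_oetf(e):
    """Map normalised scene light e in [0, 1] to an HLG signal value in [0, 1]."""
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # square-root region: shadows and mid-tones
    return A * math.log(12 * e - B) + C  # logarithmic region: highlights

# Half of the signal range is spent on the darkest twelfth of scene light,
# leaving the upper half to squeeze in the remaining highlights logarithmically.
print(hlg_oetf(1 / 12))  # ~0.5, the boundary between the two regions
print(hlg_oetf(1.0))     # ~1.0, peak scene light
```

That logarithmic top end is why burnt-out stands and sunlit grass retain detail in the HLG stream where the standard broadcast clips them.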
Maybe HDR is not quite the answer. It’s certainly not the visual nirvana it was made out to be. On top of that, what seemed like a simple two-horse format war has got rather more complex once you include Hybrid Log-Gamma (HLG), Technicolor’s Advanced HDR and HDR10+. With five HDR variants, and even a different flavour of Dolby Vision supported in some Sony TVs, we really don’t have a winner.
Whilst from a professional standpoint – in production and post-production – HDR and WCG feel like really important standards, from a consumer point of view they don’t. I’m not talking about the enthusiasts here because, broadly, they care more than anyone. I’m talking about the kind of people buying 4K TVs in their millions: more than 25 million were sold in China alone in 2016. There’s a real danger of confusing the public looking to buy their next TV, and we have little hope of the majority of staff in the large stores being able to explain the options when things are changing so quickly.
There seems to be a rush on to deliver what, on paper, look like great images: HDR and, importantly, WCG. However, the whole HDR part of the equation is a bit of a mess. At least with WCG we know what we’re aiming for, and that is Rec. 2020, even if we can’t get there yet. I know what we’re ultimately aiming for in HDR is 10,000 nits, but we’re way off that, and the roadmap to get there seems to be strewn with diversions. Some TVs are capable of nearly 2,000 nits now and can cause squinting, so when we get near 10,000 – well, I think there’s going to be a new gap in the market for adaptive HDR sunglasses. I know these two standards go hand-in-glove, but I have a feeling that if it takes too long to get there, we’re going to face a new standard or a new technological innovation, which means we’re never going to arrive. The new destination might ultimately be more attractive, but hey... don’t you just love technology?
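To put that 10,000-nit figure in context: HDR10 and Dolby Vision both use the SMPTE ST 2084 ‘PQ’ transfer function, which is defined so that a full-scale signal decodes to exactly 10,000 cd/m² (nits). A short Python sketch of the PQ EOTF, using the spec’s exact constants (the function name is my own):

```python
# SMPTE ST 2084 (PQ) EOTF constants, written as the spec's exact fractions
M1 = 2610 / 16384        # 0.1593017578125
M2 = 2523 / 4096 * 128   # 78.84375
C1 = 3424 / 4096         # 0.8359375
C2 = 2413 / 4096 * 32    # 18.8515625
C3 = 2392 / 4096 * 32    # 18.6875

def pq_eotf(signal):
    """Decode a normalised PQ signal in [0, 1] to display luminance in cd/m2 (nits)."""
    p = signal ** (1 / M2)
    y = (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)
    return 10000.0 * y  # the PQ curve tops out at 10,000 nits by definition

print(pq_eotf(1.0))  # 10000.0 nits at full-scale signal
print(pq_eotf(0.5))  # ~92 nits: half the signal range covers only the low end
```

The steep shape of that curve is the point: most of the signal range is spent below a few hundred nits, where our eyes are most sensitive, with the long tail reserved for the highlights today’s 2,000-nit TVs can only partially reproduce.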