The present situation with HDR standards isn't actually as confusing as it might seem. Although roughly four HDR standards have emerged so far, HDR10 has established itself as the baseline, with Dolby Vision leading the way as the enhanced, higher-quality version of HDR; Dolby Vision hardware can also play back HDR10 content. Every HDR computer game released so far has been HDR10 based. HDR10 support is also a mandatory part of the Ultra HD Blu-ray standard, whereas Dolby Vision is an optional extra, much like DTS audio was an optional extra for DVD players.
So, if you set aside Hybrid Log-Gamma (which will probably become a significant broadcast HDR standard, as it is backed by both the BBC and NHK), HDR10 has been the main standard, with Dolby Vision as the premium option for those who want something extra special. But that situation seems about to change slightly with the arrival of HDR10+.
One of the major advantages Dolby Vision has over HDR10 is dynamic metadata: the standard can vary brightness information on a scene-by-scene or even frame-by-frame basis. With the static metadata of HDR10, a single set of values describes the whole film, so if a movie is quite bright overall but has one or two much darker scenes, the display has to tone-map those dark scenes against the same values as the bright ones. With dynamic metadata that doesn't have to be the case, as the data can vary for each scene.
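To make the difference concrete, here is a minimal, purely illustrative Python sketch. The scene values and field layout are hypothetical, not the actual SMPTE metadata structures; the point is simply that a single film-wide peak value forces one tone-mapping compromise, while per-scene values let a dark scene be treated on its own terms:

```python
# Illustrative only: real HDR10 static metadata and HDR10+/Dolby Vision
# dynamic metadata are far richer than a single peak-luminance number.

# Peak luminance (in nits) measured per scene of a hypothetical film.
scene_peaks = [900, 850, 120, 950]  # one much darker scene among bright ones

# Static metadata: one value covers the whole film, so the display
# tone-maps every scene against the brightest one.
static_max = max(scene_peaks)
static_targets = [static_max] * len(scene_peaks)

# Dynamic metadata: each scene carries its own peak, so the dark scene
# is tone-mapped against 120 nits instead of 950, preserving shadow detail.
dynamic_targets = list(scene_peaks)

print(static_targets)   # [950, 950, 950, 950]
print(dynamic_targets)  # [900, 850, 120, 950]
```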
HDR10+ seeks to close the gap on Dolby Vision somewhat by adding dynamic metadata to the more basic HDR10 standard, although Dolby Vision has advantages beyond dynamic metadata, such as 12-bit colour and a maximum brightness of 10,000 nits. Dolby Vision is also compatible with the older HDMI 1.4a standard, whereas HDR10+ requires HDMI 2.0.
HDR10 and HDR10+, on the other hand, have one very big advantage over Dolby Vision: they are open standards that TV manufacturers can use for free, whereas Dolby Vision requires royalties to be paid. So far Samsung has avoided paying to use Dolby Vision in its sets, whereas companies like Sony support both HDR10 and Dolby Vision. This is probably why Samsung is keen to have an enhanced version of HDR10 that closes the visual quality gap without paying those royalties to Dolby.
Samsung has also collaborated with MulticoreWare to add HDR10+ support to the x265 HEVC encoder, and has teamed up with Colorfront to bring HDR10+ mastering to Transkoder. This means mastering houses will be able to create HDR10+ content right away simply by selecting the correct settings in Transkoder. Hopefully this will help content providers make their material available in HDR10+ without much friction, and in the case of online content such as streaming video, it should be possible to update the software on the server side so that the new content works with compatible TVs.
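As a sketch of what that encoder-side support looks like, the command below passes HDR10 static metadata plus an HDR10+ dynamic-metadata file to x265. The option names come from x265's documented HDR parameters, but the file names, resolution, and metadata values here are placeholders, not settings taken from any real production:

```shell
# Hypothetical inputs; --dhdr10-info takes a JSON file of per-scene HDR10+
# metadata, as produced by a mastering tool such as Transkoder.
x265 --input movie.yuv --input-res 3840x2160 --fps 24 \
     --output-depth 10 \
     --colorprim bt2020 --transfer smpte2084 --colormatrix bt2020nc \
     --master-display "G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1)" \
     --max-cll "1000,400" \
     --dhdr10-info metadata.json \
     -o movie_hdr10plus.hevc
```

The static HDR10 fields (`--master-display`, `--max-cll`) describe the whole programme, while `--dhdr10-info` carries the scene-by-scene HDR10+ data alongside them.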
Amazon has already announced that it will make HDR10+ content available globally later in the year. Samsung, for its part, will be busy making all of its new 4K TVs compatible with the standard, and older sets from 2016 will get firmware updates to support it. It should be a fruitful collaboration for both parties.
In theory, the new standard should be great news for everyone: it's an open standard that anyone can support for free, and it allows those with the right equipment to enjoy higher-quality HDR video, at least online. However, there is also the possibility that it will simply add to the confusion over the HDR standards already out there. Given that HDR10 sets a baseline, though, the additional standards should be no worse than having the option of Dolby Digital 5.1 or DTS surround sound on a DVD player. That never stopped anyone from watching movies in stereo, after all.