You may remember that resolutions like High Definition (HD) and later 4K were huge selling points for TVs and other displays. Nowadays, manufacturers are pushing High Dynamic Range (HDR) as the next big leap in the TV viewing experience. However, unlike HD and 4K, which are simple to grasp, HDR standards can be quite confusing.
That is because the different HDR standards require different kinds of hardware and content. So what do they all mean? First, it helps to know what HDR actually does. As the name implies, HDR improves your TV's dynamic range: the difference between the darkest black and the brightest white the screen can produce.
A high dynamic range means your display will show deep, dark blacks instead of grayish-looking hues, similar to the distinction between AMOLED and LCD displays. Bright colors will also not appear washed out. That makes the image not only more pleasing to look at but also more realistic, especially because HDR is typically combined with Wide Color Gamut (WCG).
WCG means HDR displays can usually produce a wider variety of colors. But unlike the leap from HD to 4K, which was really just a matter of cramming more pixels onto a panel, HDR relies on several different encoding standards. It is these standards that make HDR confusing for anyone who isn't a complete audio-visual nerd.
HDR10

HDR10 is the most popular HDR standard. If you shop for the cheapest TV with HDR branding on the box, this is probably what you will get. The standard is widespread and cheap because it is open and royalty-free, meaning it doesn't cost TV and display manufacturers anything to implement. HDR10 is also the most common format for content.
HDR10 works by providing static metadata to the display. That basically means it tells your screen the lightest and darkest points across an entire movie. From that data, the screen sets one brightness mapping for the whole production, chosen to best show off the HDR enhancements.
Dolby Vision

The more dramatically titled Dolby Vision has a reputation for better picture quality in general. Unlike HDR10, Dolby Vision uses dynamic metadata, which allows brightness to be adjusted per frame or per scene, rather than using one level regardless of whether the screen is showing a clear blue sky or a darkened dungeon. The result is that the HDR effect pops the way it's supposed to on a scene-by-scene basis. The downside is that Dolby Vision isn't royalty-free: manufacturers have to pay around $3 per TV set to include it in their products.
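The difference between static and dynamic metadata can be sketched in a few lines of code. This is a toy illustration only: the function name, the display's peak brightness, and the scene numbers are all made up for the example, not taken from any HDR specification.

```python
# Toy illustration of static vs. dynamic HDR metadata.
# The display tone-maps scene luminance into its own peak brightness.
# All names and numbers here are illustrative, not from the HDR10 or
# Dolby Vision specs.

DISPLAY_PEAK_NITS = 600  # hypothetical TV peak brightness

def tone_map(pixel_nits, max_nits):
    """Scale a pixel's luminance so max_nits lands at the display's peak."""
    return min(pixel_nits, max_nits) / max_nits * DISPLAY_PEAK_NITS

# Static metadata (HDR10-style): one max level for the whole movie.
movie_max = 4000  # the brightest highlight anywhere in the film

# Dynamic metadata (Dolby Vision-style): a max level per scene.
scenes = [
    {"name": "dark dungeon", "max_nits": 120,  "pixel": 100},
    {"name": "clear sky",    "max_nits": 4000, "pixel": 100},
]

for scene in scenes:
    static = tone_map(scene["pixel"], movie_max)           # same curve everywhere
    dynamic = tone_map(scene["pixel"], scene["max_nits"])  # curve per scene
    print(f'{scene["name"]}: static={static:.0f} nits, dynamic={dynamic:.0f} nits')
```

In this sketch, a 100-nit detail in the dark dungeon scene is crushed down toward black under the movie-wide static curve, while the per-scene dynamic curve keeps it bright and visible, which is exactly the advantage dynamic metadata is meant to deliver.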
You also won't find as much Dolby Vision content as HDR10 content. That said, popular streaming services like Netflix and Amazon Prime do have Dolby Vision titles in their libraries. Samsung also backs what it calls HDR10+, a dynamic-metadata standard designed to rival the more expensive Dolby Vision.
Note that Dolby Vision signals can be interpreted as HDR10 on lower-end displays, so Dolby Vision content should still be watchable on HDR10 sets.
Hybrid Log-Gamma (HLG)
Then there's HLG, another open standard. You might have encountered it on YouTube, cable, or satellite TV. Some broadcasters, such as the BBC, already broadcast certain sporting events in HLG. HLG is convenient for broadcasters because, unlike other HDR formats, it also works on non-HDR TVs.
Of course, you won't get an HDR picture on a non-HDR set. This backward compatibility also comes with a trade-off: although HLG should certainly look better than a non-HDR picture, it's generally considered inferior in quality to both Dolby Vision and HDR10.
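HLG's backward compatibility comes from the shape of its transfer curve: the lower half behaves like the gamma curve ordinary TVs already expect, while only the upper half switches to a logarithmic curve for bright highlights. The sketch below implements the HLG opto-electrical transfer function using the constants published in ITU-R BT.2100; the function name and structure are my own framing, not code from the standard.

```python
import math

# HLG OETF constants as published in ITU-R Recommendation BT.2100.
A = 0.17883277
B = 0.28466892  # 1 - 4*A
C = 0.55991073  # 0.5 - A*ln(4*A)

def hlg_oetf(e):
    """Map normalized scene light e (0..1) to an HLG signal value (0..1)."""
    if e <= 1.0 / 12.0:
        # Lower half: a square-root (gamma-like) curve, which is why
        # HLG looks acceptable on non-HDR displays.
        return math.sqrt(3.0 * e)
    # Upper half: logarithmic curve reserved for HDR highlights.
    return A * math.log(12.0 * e - B) + C
```

Below the crossover at e = 1/12 the signal reaches 0.5, covering roughly the range a standard-dynamic-range display uses; everything above that point carries the extra highlight information that only HDR sets render.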
The good news is that HDR sets supporting multiple standards are becoming more and more common, so you might not have to lose sleep over this if you're about to buy a new TV or display. Feel free to share any additions, questions, or opinions on this subject in the comments section below.