SO WHAT EXACTLY IS HDR AND WHY ARE EXPERTS SAYING IT’S FAR MORE IMPRESSIVE & IMPORTANT THAN EVEN 4K?
HDR stands for High Dynamic Range and is a cutting-edge technique for displaying far superior images on TVs and monitors. At its most basic level, it's the ability to display a much wider and richer range of colours (closer to what we see in real life), much brighter whites and much deeper blacks, resulting in a far more vivid, 'dynamic', lifelike image. HDR technology also preserves details in the brightest and darkest areas of a picture that are lost using current standards. Contrast and colour, therefore, are the two main factors to keep in mind in relation to what HDR does.
There are currently two types of HDR to choose from: one from Dolby, called Dolby Vision, and the other from the UHD Alliance, called Ultra HD Premium, also known as HDR10. We will use the terms Ultra HD Premium and HDR10 interchangeably throughout this article.
We'll start with the Ultra HD Premium open standard, which was created by a consortium of TV manufacturers, broadcasters and film producers, who together form the UHD Alliance (CLICK HERE to see all members). As with most new tech, HDR is open to abuse, and that's what the 'Ultra HD Premium' logo aims to prevent. By defining a set of standards a TV must meet before it can be labelled HDR capable, buyers can have peace of mind that they are getting the real deal. So long as a TV has the Ultra HD Premium logo, it will support the full defined specs for genuine HDR content. The logo will also appear on Ultra HD Blu-ray discs, which, for the record, can deliver a colossal 100 Mb/sec.
As mentioned, contrast and colour are the two main factors to keep in mind in relation to what HDR does. First up is contrast, which refers to the difference between the lightest level a display can produce (its peak brightness) and the darkest (its black level); the greater the difference between the two, the greater the contrast. Both peak brightness and black level are measured in 'nits', and the ratio between these two measurements is known as the contrast ratio. The nit is a unit the TV industry has adopted to indicate the brightness of a display; one nit is roughly equivalent to the light of a single candle spread over one square metre.
Ultra HD Premium TVs have to meet specific standards for peak brightness and black level, as laid out by the UHD Premium standard: at least 1,000 nits peak brightness and a black level below 0.05 nits. For comparison, most standard LED TV screens in use today offer between 300 and 500 nits, which gives some indication of the much greater luminosity HDR10 requires.
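As a quick sketch of the arithmetic, contrast ratio is simply peak brightness divided by black level; plugging in the LED minimums above:

```python
# Contrast ratio = peak brightness / black level, both measured in nits.
def contrast_ratio(peak_nits, black_nits):
    return peak_nits / black_nits

# UHD Premium minimums for an LED panel: 1,000 nits peak, 0.05 nits black.
print(contrast_ratio(1000, 0.05))  # 20000.0, i.e. a 20,000:1 contrast ratio
```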
The second most important factor for HDR is colour. To be truly UHD Premium certified, a TV must be able to process what's known as a 10-bit colour depth, a signal capable of producing over a billion different colours. To put this in perspective, standard Blu-ray carries an 8-bit colour depth of only around 16 million colours; that vastly wider palette is what makes HDR images look far more realistic and much more pleasing to the eye. The TV must also accept signals in the Rec.2020 colour space.
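Those colour counts follow directly from the bit depths: with n bits per channel across three channels (red, green and blue), a display can address (2^n)^3 distinct colours:

```python
# With n bits per red/green/blue channel, a panel can address (2**n)**3 colours.
def addressable_colours(bits_per_channel):
    return (2 ** bits_per_channel) ** 3

print(addressable_colours(8))   # 16777216   -- the ~16 million colours of 8-bit
print(addressable_colours(10))  # 1073741824 -- the "over a billion" of 10-bit HDR
```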
You may notice some TVs are simply labelled 'HDR' but possess neither the UHD Premium certification nor Dolby Vision branding. These TVs won't offer the best possible HDR experience and are best avoided, since they will most likely have an inferior panel that doesn't meet the minimum specification requirements. As an example, a TV like this, which is simply labelled 'HDR', should be avoided if you're after a true HDR experience: Samsung UE50KU6000 50 Inch UHD 4K HDR Smart LED TV.
It's also worth noting that use of the UHD Premium logo isn't compulsory, with some manufacturers confusingly using their own branding to represent their UHD Premium range of products. For example, LG labels its Ultra HD Premium TVs 'HDR Pro'. So it pays to do a bit of research on your preferred brand's line of HDR displays.
THE SECOND OF THE TWO TYPES OF HDR AVAILABLE IS FROM DOLBY, CONVENIENTLY CALLED DOLBY VISION, BUT IS IT A BETTER CHOICE THAN THE UHD PREMIUM VERSION?
On paper, Dolby Vision is significantly better: it has a superior 12-bit colour depth, as opposed to 10-bit for UHD Premium, and a peak brightness target of 4,000 nits (the format can actually support up to 10,000 nits), as opposed to just 1,000 nits for UHD Premium. Just like HDR10, the TV must also accept signals in the Rec.2020 colour space.
Dolby Vision content also includes dynamic frame-by-frame metadata that tells the screen exactly how to display each frame of the video, adjusting contrast on the fly to maximise every scene's visual impact; Dolby Vision can individually process over 170,000 frames within a two-hour film. UHD Premium, or HDR10, is not capable of this, carrying only static metadata that cannot be adjusted on the fly. That all makes a pretty significant difference, but there aren't many consumer TVs that can achieve much over 1,000 nits at this time. Dolby's higher numbers are certainly impressive technically, but they don't yet translate to any real-world difference, due to the lack of currently available hardware and content.
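That "over 170,000 frames" figure follows straight from cinema's standard 24fps frame rate:

```python
# Film runs at 24 frames per second, so a two-hour feature contains:
# 2 hours * 60 minutes * 60 seconds * 24 frames per second.
frames_in_two_hour_film = 2 * 60 * 60 * 24
print(frames_in_two_hour_film)  # 172800
```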
Even though Dolby Vision is a clear winner in terms of specification, it's a closed, proprietary system that requires licensing fees from manufacturers, resulting in more expensive hardware. This has already put it on the back foot, since HDR10 is a completely free, open standard. Microsoft's Xbox One S, Sony's PS4 Pro and most set-top boxes and Ultra HD Blu-ray players all exclusively support HDR10 and don't support Dolby Vision, which is another major drawback for Dolby.
The one device Dolby does have in its corner, however, is the new Chromecast. In October, Google announced that Chromecast Ultra would support Dolby Vision, making it compatible with any Dolby Vision TV. According to Dolby, it’s the only streaming device that delivers a seamless playback experience, no matter what HDR content you select, and the first to support Dolby Vision.
Bear in mind it's very early days for Dolby Vision, while HDR10 has already had a big head start. The good news is that most content mastered in Dolby Vision also carries HDR10 metadata, so it will play fine on any HDR10 hardware; likewise, if you were to buy a Dolby Vision display, it will still play all HDR10 content seamlessly.
The bottom line is you can easily future-proof yourself by buying into Dolby Vision, since Dolby Vision hardware covers all bases. As Dolby Vision becomes more widely available, we will see what sort of price hike it commands over HDR10-only hardware. Since Dolby Vision offers a superior experience and could grow in popularity over time, it would be wise to make sure your TV supports both standards from the get-go, making a Multi-HDR Ready TV the best option, like the LG OLED55B6V Smart 4k Ultra HD HDR 55″ OLED TV.
SPEAKING OF ORGANIC LED TVS, IS IT BEST TO CHOOSE AN OLED HDR TV OVER A REGULAR HDR LED TV?
Even though LED TVs have seen many major improvements in recent years, OLED is still lighter and thinner, uses less energy, displays a richer image, offers far better viewing angles, and has dropped in price considerably, although it's still a little more expensive than LED. Unless your primary concern is value, an OLED-based HDR TV is the way to go.
However, it's not quite that simple, since OLED TVs are unable to produce a 1,000-nit peak brightness, the minimum requirement of the UHD Premium standard. This meant the UHD Alliance had to come up with a second set of technical requirements specifically for OLED TVs, which almost halves the required peak brightness to 540 nits, but also drastically tightens the black level requirement to 0.0005 nits, far darker than the LED requirement of 0.05 nits.
So the main advantage of LED TVs over OLED is the much higher peak brightness, but in most cases a picture with a much deeper black level is superior, and many would argue 540 nits is bright enough.
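A quick back-of-the-envelope comparison of the two sets of minimums shows why the deeper black level more than compensates for the lower peak brightness:

```python
# Minimum UHD Premium figures for each panel type (peak / black level, in nits).
led_ratio = 1000 / 0.05      # LED:  1,000 nits peak, 0.05 nits black
oled_ratio = 540 / 0.0005    # OLED: 540 nits peak, 0.0005 nits black

print(f"LED contrast ratio:  {led_ratio:,.0f}:1")   # 20,000:1
print(f"OLED contrast ratio: {oled_ratio:,.0f}:1")  # 1,080,000:1
```

Despite being roughly half as bright at its peak, an OLED panel meeting its minimums delivers a contrast ratio over fifty times higher than the LED minimums.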
This means there are now two standards TV manufacturers must meet to qualify for the UHD Premium stamp of approval: one for LED TVs and another for OLED TVs.
SO WILL EVERYTHING I WATCH BE HDR IF I BUY AN HDR TV?
If only it were that simple! Content needs to be specifically mastered with HDR in mind to take advantage of your shiny new HDR TV, otherwise it won't make the blindest bit of difference. Luckily, HDR-mastered content is on the rise: we now have Ultra HD Blu-ray movies and players, and the BBC as well as online streaming services like Netflix, Amazon and YouTube all fully support HDR, allowing content creators to deliver their material with ease. Incidentally, you will need a broadband connection of at least 25Mbit/s to watch streamed HDR, although Netflix uses adaptive streaming that gives HDR priority over resolution when bandwidth is insufficient, a sign that HDR adds more to the viewing experience than resolution does.
It's still in its infancy in 2016, but HDR will no doubt explode in 2017 and beyond. It's worth noting that HDR is actually a mandatory part of the Ultra HD Blu-ray spec, so all Ultra HD Blu-ray players and discs will be HDR compliant as standard.
To display HDR your TV must be at least HDMI 2.0a compliant, and any TV with the Ultra HD Premium label will be compatible by default, making it easy for buyers to simply check for the logo. A quick primer on HDMI is worth noting here. The earlier 1.4 standard, released back in May 2009, allowed 4K resolutions but only at 30 frames per second, which is fine for movies but not for current-day gaming and many TV broadcasts; many gamers now demand 60fps, which significantly improves the experience when playing a game. To fix this, HDMI 2.0 was launched in September 2013, allowing the full 60fps at 4K resolution while also supporting 12-bit colour. The HDMI 2.0a standard then followed in April 2015, specifically adding support for HDR.
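As a rough sketch of why the version bump was needed (real HDMI links also carry blanking intervals and encoding overhead, so actual link rates are higher than this, but the proportions hold):

```python
# Approximate uncompressed video data rate: width * height * fps * bits per pixel.
def video_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(video_gbps(3840, 2160, 30, 24))  # ~5.97 Gb/s  -- within HDMI 1.4's capacity
print(video_gbps(3840, 2160, 60, 24))  # ~11.94 Gb/s -- needs HDMI 2.0's 18 Gb/s link
```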
DO I NEED TO BUY NEW UHD/HDR HDMI 2.0a COMPATIBLE CABLES?
Probably not! It's worth noting that there's no such thing as specific UHD or HDR HDMI cables, or 1.4 or 2.0 HDMI cables for that matter. They all simply come marked as one of two standards: Standard Speed HDMI cables or High Speed HDMI cables. As long as you have a cable that was purchased in 2010 or later and is genuinely certified High Speed, you're probably good to go. A £5 - £15 High Speed or Premium Certified cable will do the job perfectly well (price varies with length), and you don't need to spend much more than this; super-expensive cables won't make any difference. One drawback, however, is cable length: the longer the cable, the more likely you are to experience problems. We recommend sticking to approximately 12 feet or below for the most reliable HDR signals at consistent peak bandwidth.
SO WHAT'S THE STATE OF PLAY FOR HDR AND PC GAMERS?
While PC monitors lag behind their TV counterparts in HDR implementation and we are yet to see any HDR-specific monitors on sale, one area of the PC that has been HDR ready for over a year now is the mid- to high-end graphics card market, thanks to the healthy rivalry between Nvidia and AMD.
Nvidia has been implementing HDR into its GPUs since the 900 series and currently certifies all its Pascal-based models as HDR ready. AMD is slightly later to the game, with the 390X and the current Polaris lineup its first HDR-capable cards. So if you've purchased a decent graphics card relatively recently, there's an excellent chance your GPU is good to go.
So what about games, I hear you cry? While some new games will release with HDR in mind, older games won't support HDR at all and will require patching. These older, non-patched games will still play fine on HDR-equipped systems; you just won't see any benefits without some fresh HDR code patched into the mix.
Fortunately, leveraging HDR's superior technology only requires a fairly straightforward mapping process that expands SDR colour maps to HDR ranges via algorithmic translation, so providing HDR patches for SDR titles should not require massive effort from developers. Popular games receiving studio remasters in future will also no doubt gain HDR support, and the mod community is likely to step in where publishers won't with older classic titles.
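As a toy illustration of that kind of mapping (the function and the simple power curve here are our own invention; real HDR grading uses far more sophisticated transfer functions, such as PQ):

```python
# Toy inverse tone mapping: stretch an 8-bit SDR code value (0-255) into a
# 10-bit HDR range (0-1023) with a simple power curve. Purely illustrative.
def sdr_to_hdr(value_8bit, gamma=1.2):
    normalised = value_8bit / 255          # map to 0.0 - 1.0
    expanded = normalised ** gamma         # non-linear reshaping of the range
    return round(expanded * 1023)          # rescale to the 10-bit range

print(sdr_to_hdr(0))    # 0    -- black stays black
print(sdr_to_hdr(255))  # 1023 -- white maps to the new peak
print(sdr_to_hdr(128))  # mid-grey lands below a straight linear stretch
```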
Until HDR patches and games start arriving for the PC, it's going to be all Blu-ray and online streaming content for now, with console games to follow shortly after. It'll be a while before HDR arrives on the PC, so for now just keep HDR in mind for future upgrades and enjoy your current monitor before it gets replaced.
To conclude: it's unclear whether the ultimate victor will be Dolby Vision or HDR10, but there's no doubt there's never been a better time to invest in HDR, with many available screens supporting both standards for peace of mind. Although 4K, or UHD, has been the buzzword in display technology in recent years, in general it has failed to get many buyers excited enough to replace their existing Full HD displays, because at a normal viewing distance the human eye doesn't really notice extra pixels beyond a 2K resolution.
This is where HDR changes things: thanks to its far superior colour depth, those extra pixels end up looking much better even though the eye can't discern them individually at normal viewing distance. For this reason many argue 4K should never have been released without its much more impressive HDR counterpart. Content creators now have more reason to produce 4K HDR content than plain 4K, so buying a Multi-HDR Ready TV that supports both standards is a smart move, as HDR paves the way for how we will consume our visual content for many years to come.