There are three major HDR formats: HDR10, HDR10+, and Dolby Vision. Each handles HDR in its own way, with its own benefits. When shopping for a new TV, you shouldn’t worry too much about which format is best, as they all deliver a great experience on capable equipment.
To help you get the most out of your favorite content, here is how these formats handle the key aspects of HDR.
What is HDR?
HDR (High Dynamic Range) is a feature of 4K Ultra HD sets, and the name refers to their ability to deliver more colors, higher contrast, and greater brightness. HDR is essentially an upgrade to the 4K, or Ultra HD, format. For this new feature, TV manufacturers are introducing new labels to distinguish these sets from standard 4K Ultra HD TVs.
The baseline standard for high-dynamic-range content is called HDR10, as set by the UHD Alliance, an industry trade group. Dozens of companies support this minimum HDR-compatibility requirement, so this year you will see “HDR10” and “Ultra HD Premium” on an increasing number of sets.
Dolby Vision is a more sophisticated form of HDR, developed and licensed by Dolby, the company behind Dolby surround sound. In theory, a Dolby Vision set must meet a more stringent set of criteria for displaying HDR content, and this seems to be borne out by our testing. As far as proprietary HDR technologies are concerned, Dolby Vision leads the market.
| | HDR10 | HDR10+ | Dolby Vision |
| --- | --- | --- | --- |
| Tone Mapping | Varies per manufacturer | Better | Better |
| Content Availability | Good | Limited | Limited, but growing |
Bit depth describes the fineness of the color gradations in an image. SDR content is commonly mastered at 8-bit, which allows for 16.7 million colors. HDR content is normally mastered at 10-bit, which allows for up to 1.07 billion colors. The more color gradations there are, the more realistic the picture looks, with less banding and subtler transitions between similar-colored regions.
Dolby Vision supports up to 12-bit color; HDR10 and HDR10+ are limited to 10-bit. This may not sound like a big deal, but while 10-bit color works out to 1.07 billion color gradations, 12-bit raises that to an impressive 68.7 billion. This allows much finer control over gradients, resulting in a more life-like picture with no banding.
1. HDR10
- 10 bit
- 1.07 billion colors
2. HDR10+
- 10 bit
- 1.07 billion colors
3. DOLBY VISION
- 12 bit
- 68.7 billion colors
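The color counts above follow directly from the bit depth: each of the three color channels (red, green, and blue) gets 2^bits levels, and the total is that number cubed. A quick Python check:

```python
# Total displayable colors = (levels per channel) ** 3, one factor
# each for the red, green, and blue channels.
for bits in (8, 10, 12):
    levels = 2 ** bits          # shades per channel
    colors = levels ** 3        # all R/G/B combinations
    print(f"{bits}-bit: {colors:,} colors")
# 8-bit:  16,777,216        (~16.7 million)
# 10-bit: 1,073,741,824     (~1.07 billion)
# 12-bit: 68,719,476,736    (~68.7 billion)
```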
Most Dolby Vision content is currently mastered at 4000 cd/m²; HDR10 and HDR10+ content is mastered at levels from 1000 to 4000 cd/m², depending on the title.
All three formats support images of up to 10,000 cd/m², although no display can reach that level at the moment. Consequently, there is no real difference between the formats here, as content for all of them currently peaks at 4000 cd/m².
1. HDR10
- Mastered from 1000 to 4000 cd/m²
2. HDR10+
- Mastered from 1000 to 4000 cd/m²
3. DOLBY VISION
- Currently mastered at 4000 cd/m², but supports up to 10000 cd/m²
If you have a TV with a peak brightness of 1400 cd/m², how does it handle the highlights of a film mastered at 4000 cd/m²? How TVs with relatively low peak brightness deal with content mastered at a much higher peak brightness is crucial.
Clipping is the simplest approach. On our hypothetical 1400 cd/m² screen, everything from 1400 to 4000 cd/m² would simply be clipped. What does that mean? In that brightness range there would be no visible detail and no distinct colors, because the TV cannot reproduce information in areas brighter than its maximum output.
Tone mapping is an alternative. On a 1400 cd/m² TV, the highlights from 1400 to 4000 cd/m² are remapped to fall below 1400 cd/m². In practice, this means the highlights get a gentle roll-off starting at around 1000 cd/m².
This means that some highlights appear slightly dimmer on a TV that uses tone mapping than they would on the same TV using clipping. While that is true, a tone-mapped picture shows far more detail in the highlights than a clipped one.
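The difference between the two approaches can be sketched in a few lines of Python. The actual roll-off curve varies by manufacturer; the soft-knee curve below is purely illustrative, using the example of a 1400 cd/m² TV and content mastered at 4000 cd/m²:

```python
def clip(nits, display_peak=1400):
    """Clipping: any brightness above the display's peak is simply lost."""
    return min(nits, display_peak)

def tone_map(nits, display_peak=1400, knee=1000, content_peak=4000):
    """Illustrative soft roll-off: pass values through unchanged below the
    knee, then compress everything from the knee up to the content peak
    into the display's remaining headroom with a gentle curve."""
    if nits <= knee:
        return nits
    # Fraction of the way from the knee to the content's peak brightness
    t = (nits - knee) / (content_peak - knee)
    # Ease that fraction into the [knee, display_peak] range
    return knee + (display_peak - knee) * (1 - (1 - t) ** 2)

for n in (500, 1500, 3000, 4000):
    print(f"{n} nits -> clipped: {clip(n)}, tone-mapped: {round(tone_map(n))}")
```

Clipping flattens everything above 1400 cd/m² to the same value, while the tone-mapped curve keeps brighter input values distinct, preserving highlight detail at the cost of dimming it slightly.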
The first devices that supported Dolby Vision included a dedicated Dolby chip that handled tone mapping based on the TV's known limitations. Dolby has since relaxed this requirement, and some TVs, including Sony models, now do this in software instead. With HDR10 and HDR10+, tone mapping is left entirely up to the manufacturer, which can lead to inconsistency.
Metadata can be thought of as a kind of instruction manual that describes different aspects of the content. It is carried alongside the movie or show and helps the display present the content in the most accurate way.
Metadata is another way the three formats differ. HDR10 uses static metadata: the brightness limits are set once for the whole movie or show, calculated from the brightness range of the brightest frame. Dolby Vision and HDR10+ improve on this with dynamic metadata, which lets them tell the TV how to apply tone mapping on a scene-by-scene, or even frame-by-frame, basis.
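As a rough sketch of why dynamic metadata helps, consider a film whose scenes peak at very different brightness levels (the numbers below are hypothetical):

```python
display_peak = 1400                   # our example TV's peak brightness (nits)
scene_peaks = [350, 900, 4000, 600]   # hypothetical per-scene peak brightness

# Static metadata (HDR10): one value for the whole film, taken from
# its single brightest scene.
static_peak = max(scene_peaks)        # 4000 — every scene is graded against this

# Dynamic metadata (HDR10+/Dolby Vision): one value per scene, so the TV
# only remaps the scenes that actually exceed its capabilities.
for peak in scene_peaks:
    needs_tone_mapping = peak > display_peak
    print(f"scene peak {peak} nits -> tone mapping needed: {needs_tone_mapping}")
```

With static metadata, the dim scenes would be tone-mapped against the 4000-nit ceiling anyway, unnecessarily compressing them; dynamic metadata lets the three scenes that fit within the display's range pass through untouched.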
| Device | HDR10 | HDR10+ | Dolby Vision |
| --- | --- | --- | --- |
| Fire TV Stick 4k | Yes | Yes | Yes |
| Apple TV 4k | Yes | No | Yes |
| NVIDIA GTX 900 Series and up | Yes | No | Yes |
| AMD Radeon RX and up | Yes | No | No |
The availability of the newer HDR formats has improved recently. Both Dolby Vision and HDR10 can now be found on UHD Blu-rays and are supported by most playback devices. That said, only a handful of movies available on disc support Dolby Vision. The first HDR10+ movies have been released on UHD Blu-ray, but most external playback devices don’t support it yet. It’s important to note, however, that if the TV itself doesn’t support Dolby Vision, using an external source that does won’t make any difference.
While the vast majority of TVs support HDR10 and many models support at least one of the newer formats, none of the models available in the U.S. offers both. Sony, Panasonic, Vizio, Hisense, and TCL support HDR10 and Dolby Vision on most of their devices, while Samsung has instead invested in HDR10+ and doesn’t offer Dolby Vision on any of its products.
You shouldn’t expect cheaper HDR TVs to take full advantage of these formats’ extra capabilities. On most of them you won’t even see a difference; only high-end TVs can make full use of them.
Although still limited, HDR is now supported by most high-end phones. Many Samsung flagships, including the latest S10, S10e, and S10+, support HDR10+. Apple smartphones have supported Dolby Vision since the iPhone 8, but only the iPhone X, XS, and XS Max have true HDR displays.
While HDR was originally designed for films, its advantages for gaming are clear. Most modern consoles support HDR10, including the original PlayStation 4 as well as the newer Xbox One models and the PlayStation 4 Pro. The Xbox One also supports Dolby Vision, although not all TVs work with it, as it uses a newer ‘low-latency’ Dolby Vision mode.
As with films, game developers have to add HDR support to their games. Unfortunately, HDR10 is not always properly implemented, so actual results can vary. A handful of Dolby Vision games are now available, including Assassin's Creed Origins, Battlefield 1, and Overwatch, to name a few.
See our recommendations for the Best 4k HDR Gaming TVs.
Monitors were very slow to adopt HDR. That is slowly changing, however, with an increasing number of displays supporting HDR10. The VESA standards group has been pushing new HDR standards for monitors, which typically cannot reach the brightness levels required for a great HDR experience.
Despite this, monitors are still a few years behind televisions. The few monitors that provide a decent HDR experience are very expensive and still fall short of the HDR experience on TVs. No monitors support HDR10+ or Dolby Vision, although a few laptops with Dolby Vision support have been released.
See also: Best 4k TVs to use as Monitor
From a technical standpoint, Dolby Vision is currently the most advanced HDR format, but although the situation has improved significantly, it is somewhat held back by a lack of content. HDR10’s distinct advantage is that more content is available for it and it is supported on most TVs. HDR10+ nearly matches Dolby Vision’s capabilities but is severely lacking in content, and in the U.S. it is only supported on Samsung TVs.
In the end, the difference between the formats isn’t that important. The TV’s own performance has a much greater impact on HDR picture quality. Although it has improved significantly in recent years, HDR is still in its early days. All formats are capable of producing images far more realistic than what we see on today’s best TVs.