Many of us already know the word HDR (High Dynamic Range) as a tone mapping technique that tries to mimic how we see light, often by extending the range between dark and bright details.
It all started with photography and is now invading video games through UHD TVs with advanced tone mapping features.
Many modern games can produce very rich colors and various luminosity ranges, but they still rely on your TV to display them…
But what if you want more colors than your TV can provide? What if you want brighter whites or darker blacks?
Or even better colors at extreme brightness levels? Some TVs do offer options for this, but they require manual adjustment.
And since our eyes are naturally better at judging the bright and dark areas of a scene than at judging color accuracy, most people never touch these options.
Differences Between HDR & UHD
What is HDR?
HDR tone mapping tries to reproduce colors and luminosities closer to what our eyes actually perceive, in order to produce a more faithful image.
HDR isn’t magic: your monitor or TV won’t suddenly become able to display an HDR picture (unless it already can). And even if you buy the best UHD TV on the market, it’s still limited by its own specifications.
The process of getting an HDR picture usually goes like this: you encode an LDR (Low Dynamic Range) picture using one of several methods, then feed it into an algorithm that transforms it into an HDR image which can be correctly displayed on UHD TVs or monitors with advanced tone mapping features.
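Tone mapping operators vary widely, but the core idea can be sketched with the classic Reinhard curve, which compresses an unbounded luminance range into the displayable range. This is a minimal illustration, not any particular standard’s pipeline; the `exposure` parameter and the toy luminance values are assumptions for the example:

```python
import numpy as np

def reinhard_tonemap(luminance, exposure=1.0):
    """Compress linear HDR luminance (0..inf) into display range [0, 1)."""
    scaled = luminance * exposure
    return scaled / (1.0 + scaled)  # classic Reinhard operator

# A toy "scene": luminance values spanning four orders of magnitude.
hdr = np.array([0.01, 0.5, 1.0, 10.0, 100.0])
ldr = reinhard_tonemap(hdr)  # every value now fits in [0, 1)
```

Note how the curve preserves ordering (brighter stays brighter) while squeezing extreme highlights toward 1 instead of clipping them, which is exactly the “extend the range between dark and bright details” effect described above.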
There are many possible ways of encoding/decoding LDR/HDR pictures, and some standardization groups are even working on it (like the hdr-kite.org group).
Sidenote: Tone mapping is also used for special effects in video games, to simulate various atmospheric or reflective effects. One example is the new DOOM game, which uses it for its spectacular reflections.
However, tone mapping brings nothing in that case, since your monitor already displays the image exactly as the developers encoded it.
An HDR image displayed without tone mapping will show banding due to quantization errors, but you won’t see this on UHD TVs, since they only display images encoded using specific tone mapping algorithms (they can’t display LDR images without tone mapping).
HDR is also used to display colors that are unreachable by current TVs. When you see a game or movie with vivid colors, it’s because the video was encoded using HDR (with standard tone mapping algorithms), and then displayed on a UHD TV using one of its advanced tone mapping features.
This is achieved thanks to the wide color gamut offered by UHD TVs, since there’s no way for your monitor to produce such colors natively (most monitors only cover sRGB).
If you already own a UHD TV, chances are that most of what was said may not be very interesting since almost everything looks more vivid than usual :).
What is UHD?
UHD (Ultra High Definition) is the new standard for TV resolution, with four times as many pixels as 1080p (3840×2160, also called 2160p).
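The “four times” figure is just pixel arithmetic:

```python
# Pixel counts: UHD (2160p) vs. Full HD (1080p)
uhd_pixels = 3840 * 2160      # 8,294,400 pixels
full_hd_pixels = 1920 * 1080  # 2,073,600 pixels
ratio = uhd_pixels / full_hd_pixels  # exactly 4.0
```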
Note that HDR tone mapping isn’t mandatory on UHD TVs, since they already display images encoded using advanced tone mapping algorithms. But if you want to benefit from superior colors and luminosities, you should use an HDR source.
The color space of UHD is BT.2020 (Rec. 2020), which covers a far larger slice of visible colors than the sRGB / Rec. 709 space used by most current games (roughly 76% vs. 36% of the CIE 1931 diagram). This also means that your current TV/monitor may have trouble displaying this kind of content correctly, at least without tone mapping…
And since this is a wide color gamut standard, it’s likely that your monitor won’t be able to display those colors even after tone mapping (note: current UHD TVs can do it).
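The gamut gap can be made concrete by converting a fully saturated BT.2020 primary into sRGB coordinates. The matrix below is the approximate linear-light BT.2020-to-sRGB conversion, composed from the published BT.2020→XYZ and XYZ→sRGB matrices (rounded here for readability); the point is simply that the result falls outside [0, 1], i.e. an sRGB display cannot represent the color:

```python
import numpy as np

# Approximate linear-light BT.2020 -> sRGB matrix (D65 white point),
# composed from the standard BT.2020->XYZ and XYZ->sRGB matrices.
BT2020_TO_SRGB = np.array([
    [ 1.6605, -0.5876, -0.0728],
    [-0.1246,  1.1329, -0.0083],
    [-0.0182, -0.1006,  1.1187],
])

pure_red = np.array([1.0, 0.0, 0.0])   # fully saturated BT.2020 red
srgb = BT2020_TO_SRGB @ pure_red       # ~[1.66, -0.12, -0.02]
clips = bool(np.any(srgb < 0.0) or np.any(srgb > 1.0))  # out of sRGB gamut
```

A red channel above 1 and negative green/blue channels mean the color lies outside the sRGB triangle, so it must be clipped or gamut-mapped before an sRGB monitor can show anything at all.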
Note that we didn’t talk much about resolution in this article because there isn’t really any point in comparing 1080p and UHD (the difference will only be visible if you sit very close to your TV/monitor).
Which is better: HDR or UHD?
Both HDR and UHD improve picture quality; which one matters more is mostly up to you.
HDR requires tone mapping (and your current monitor/TV doesn’t support it), but if you already own a UHD TV then this isn’t of much use to you since most content is encoded with advanced tone mapping algorithms anyway.
On the other hand, UHD requires using more bandwidth than 1080p, but if you own a decent internet connection this shouldn’t be a problem. And even if it is, streaming services are taking care of that for us by encoding their videos at lower bitrates when possible :).
Sidenote: HDR+SDR on YouTube
YouTube provides dynamic metadata for some videos that lets them switch between SDR and HDR based on the capabilities of your display. However, this doesn’t work well with UHD TVs, since they always apply advanced tone mapping (and can’t display LDR/non-HDR videos without it).
So if you’re watching videos directly on YouTube or Vimeo, make sure you select 1440p or 2160p at 50fps for the best experience (the player should automatically detect whether your display supports it).
And if you want to watch Netflix in HDR, make sure your TV is running at 4K @ 60fps (some older UHD TVs only support 4K @ 30fps, which limits dynamic metadata to brightness changes only).
HDR is mandatory for games with vivid colors or high dynamic range, and thanks to HDR TVs almost every game looks more vivid than before. But if your current monitor/TV doesn’t support it, you won’t benefit from any of this.
UHD means sending four times as many pixels as 1080p over the same connection, which makes it less efficient in terms of bitrate (compared to spending those bits on advanced tone mapping instead).
However, this may be offset by the extra data HDR itself requires. And since most current content is encoded using advanced tone mapping algorithms anyway, UHD can make a lot of sense even without HDR.
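To put rough numbers on the bandwidth argument, here is the uncompressed data rate for both formats. Real streams are compressed by two orders of magnitude or more, and the bit depths chosen (8-bit RGB for SDR, 10-bit for HDR) are illustrative assumptions:

```python
def raw_rate_gbps(width, height, fps, bits_per_pixel):
    """Uncompressed video data rate in gigabits per second."""
    return width * height * fps * bits_per_pixel / 1e9

hd_sdr = raw_rate_gbps(1920, 1080, 60, 24)    # 8-bit RGB:  ~2.99 Gbit/s
uhd_hdr = raw_rate_gbps(3840, 2160, 60, 30)   # 10-bit RGB: ~14.93 Gbit/s
# 4x the pixels times 10/8 the bits per channel = 5x the raw data.
```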