
High Dynamic Range, explained: There’s a reason to finally get a new TV

HDR noticeably improves content that looks washed out or flat on standard screens

Sam Machkovech
We're dealing with better tech these days—embrace it.
Ever since the HDTV standard emerged in the mid-'00s, screen producers have struggled to come up with new standards that feel anywhere near as impressive. That's been a tough sell, as no baseline image standard has yet surpassed the quality jump from CRT sets to clearer panels with 1080p resolution support.
3D content came and went, with its unpopularity owing to a few factors (aversion to glasses, hard-to-find content). The higher-res 4K standard is holding up a little better, but its jump in quality just doesn't move the needle for average viewers—and certainly not those sticking to modestly sized screens.
But there's another standard that you may have heard about—high dynamic range, or HDR. It's a weird one. HDTV, 3D, and 4K have all been easy to quickly and accurately describe for newcomers ("more pixels," "one image per eye," etc.), but HDR's different. Ask an average TV salesperson what HDR is, and you'll usually get a vague response with adjectives like "brighter" and "more colorful." Brighter and more colorful than what, exactly?
Yet HDR may very well be the most impactful addition to modern TV sets since they made the 1080p jump. Images are brighter and more colorful, yes, and in ways that are unmistakable even to the untrained eye. Content and hardware providers all know it, and they've all begun cranking out a wow-worthy HDR ecosystem. The HDR difference is here, and with this much stuff on the market, it's officially affordable (though certainly not bargain-bin priced yet).
If HDR still has you (or your local retailer) stumped, fear not. Today, we're breaking down the basics of high dynamic range screens: what exactly differentiates them, how good they are, and whether now is the time to make the HDR leap. And as a bonus, we'll answer a bunch of questions about various screens and compatible content along the way.

It’s not just sheer brightness

High dynamic range boils down to a few important factors, and they're all intertwined: luminance, color gamut, and color depth.
When it comes to luminance, there's something worth clarifying right away: HDR screens don't necessarily win out just by being really, really bright. What's important is the range of luminance, from the darkest dark to the whitest white on a screen.
Modern LED screens suffer because their pixels are backlit, which means they have struggled to display the kind of deep, dark blacks that would make Nigel Tufnel drool. That's one reason plasma TV set owners have held tightly onto their old sets, especially now that no big manufacturer produces the black-friendly plasma standard anymore. Where an HDR set helps is by compensating with the ability to render many more steps of luminance. That could mean an incredibly bright LCD TV or a not-as-bright OLED TV that happens to display such deep blacks that its luminance range is still off the charts. (We'll have more on the specifics of modern OLED technology in an upcoming article.)
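Some back-of-the-envelope numbers show why the range matters more than the peak. The figures below are purely illustrative, not measurements of any particular set:

```python
# Illustrative, made-up figures: a bright LCD versus a dimmer OLED.
# Contrast ratio = peak luminance / black level (both in nits).

def contrast_ratio(peak_nits, black_nits):
    """Ratio between the brightest and darkest level a panel can show."""
    return peak_nits / black_nits

lcd_ratio = contrast_ratio(peak_nits=1000, black_nits=0.05)    # backlight leaks
oled_ratio = contrast_ratio(peak_nits=600, black_nits=0.0005)  # near-true black

print(f"LCD:  {lcd_ratio:,.0f}:1")   # 20,000:1
print(f"OLED: {oled_ratio:,.0f}:1")  # 1,200,000:1
```

The dimmer panel still covers the far wider luminance range, because its floor is so much lower.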
If you transmit video (on disc, game, or streaming service) via the current, industry-wide HDTV standard, you're capped at a luminance maximum of around 100 nits. Your screen may be brighter than that, but this is where the current standard really stinks: the signal sends its luminance information as a percentage, not a pure luminance value. It's up to your set to translate that percentage, and the results can look, quite frankly, pretty awful. This is how viewers get blown-out colors and other glaring inaccuracies.
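Here's a simplified sketch of the problem. The 100-nit reference is real; real SDR encoding runs through a gamma curve rather than a straight percentage, but the relative-versus-absolute problem is the same:

```python
# Simplified: a relative (percentage-style) signal means something different
# on every display, because each set supplies its own maximum.

SDR_REFERENCE_NITS = 100  # nominal peak of the legacy HDTV signal

def sdr_luminance(signal_fraction, display_peak_nits):
    # The signal only says "X percent of maximum"; the set decides what
    # "maximum" means, so the same frame renders differently per TV.
    return signal_fraction * display_peak_nits

print(sdr_luminance(0.5, SDR_REFERENCE_NITS))  # 50 nits on a reference display
print(sdr_luminance(0.5, 600))                 # 300 nits on a bright LCD, same input
```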
New HDR standards not only jack a pixel's luminance maximum up but also change the encoded value to a specific number, not a percentage. That's the first step to higher color quality on your fancy TV screen. Updating the luminance differential also updates a screen's color gamut. Dolby's engineers explain how:
The problem with restricting maximum brightness to 100 nits (as in TV and Blu-ray) is that the brighter the color, the closer it becomes to white, so bright colors become less saturated. For example, the brightest saturated blue on an ordinary display is a mere 7 nits, so a blue sky will never be as bright and saturated as it should be.
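Both HDR-10 and Dolby Vision solve this by building on the SMPTE ST 2084 "PQ" transfer function, which maps each code value to one absolute luminance level, topping out at a theoretical 10,000 nits. A minimal Python sketch of the published EOTF math:

```python
# SMPTE ST 2084 ("PQ") EOTF: maps a normalized code value (0.0-1.0)
# to an absolute luminance in nits, with a ceiling of 10,000.
M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(code):
    e = code ** (1 / M2)
    return 10000 * (max(e - C1, 0) / (C2 - C3 * e)) ** (1 / M1)

print(round(pq_to_nits(0.5)))  # ~92 nits; half the signal range is still dim
print(round(pq_to_nits(1.0)))  # 10000 nits, the format's absolute ceiling
```

Note how nonlinear the curve is: half the signal range only gets you to roughly 92 nits, which leaves the bulk of the code values to describe highlights.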
With more quantifiable steps in luminance comes more wiggle room for displaying ranges of saturated colors. This is actually different from a screen's color depth, which is typically described as a bit count for color reproduction: 8-bit, 10-bit, 24-bit, and so on.
The smaller numbers describe the bits per color channel, and since pixels light up with a combination of red, green, and blue data, the bit count of the three channels combined is usually used to describe the overall color quality. HDR jumps past the HDTV standard of 8 bits of data transmitted per color. In practical terms, that's 8 for red, 8 for green, and 8 for blue, or the shorthand phrase "24-bit color." At this level, each channel carries a value from 0 to 255; multiply those 256 values across three channels and you get a total range of about 16.78 million colors.
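The arithmetic is easy to verify, and it covers the 10-bit and 12-bit figures that come up later in this piece, too:

```python
# Distinct colors = (levels per channel) ** 3, one factor each for R, G, B.
for bits in (8, 10, 12):
    levels = 2 ** bits  # 256, 1024, or 4096 values per channel
    print(f"{bits}-bit: {levels ** 3:,} colors")

# 8-bit:  16,777,216 colors    (~16.78 million)
# 10-bit: 1,073,741,824 colors (~1.07 billion)
# 12-bit: 68,719,476,736 colors (~68.7 billion)
```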
Mmm, banding. (Image: Aurich Lawson)
For decades, screen makers felt that was a large enough range, but higher resolutions and less CRT-related blurring have made the biggest drawback of limited color depth quite evident: banding. Look at the image above. You've probably seen stretches of a single color on a screen just like this in a movie or TV show, where the screen isn't receiving enough granular color data to fade a color naturally. 10-bit color gets us to a total range of 1.07 billion colors (1,024 values per primary color).
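For some intuition about why extra bits kill banding, here's a toy sketch: quantize a slow brightness ramp across a 4K-wide frame and count the visible steps. (This is a simplification that ignores dithering and gamma.)

```python
WIDTH = 3840  # pixels across a 4K frame

def distinct_steps(bits):
    levels = 2 ** bits
    # Snap a smooth 0-to-1 ramp to the nearest level the bit depth allows.
    ramp = [round(x / (WIDTH - 1) * (levels - 1)) for x in range(WIDTH)]
    return len(set(ramp))

print(distinct_steps(8))   # 256 bands, each roughly 15 pixels wide
print(distinct_steps(10))  # 1024 bands, each under 4 pixels wide
```

At 8 bits, each band of a gentle gradient stretches about 15 pixels wide, which is exactly the striping you see above; at 10 bits, the bands shrink below the threshold where most eyes pick them out.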
There's a difference between these two color properties we're talking about. Higher color depth means less banding. Wider color gamut means more natural color representation. The latter is particularly noticeable when looking at explosive jumps in color, like a shiny red fire hydrant or a burst of orange flame. After all, that jump in luminance doesn't mean much if we're only watching content in grayscale (though pure whites and blacks certainly benefit, as well).

Here comes another format war

With those three properties in mind, we can explore the two major HDR-related standards that have begun making their way into consumer-level electronics: HDR-10 and Dolby Vision.
It is impossible to convey the HDR difference on an SDR screen, because HDR's boosts require compatible panels. This mock-up simulates some of the effect by reducing color gamut on one side, but one of the big differences is that HDR screens don't have to add an unnatural glow around a bright point to make it look "bright." Instead, in HDR, something like the sun here has its brightness contained solely within its radius; the natural brightness of the display, and contrast with other pixels right next to it, creates a natural glow effect.
The standards have a few things in common, including support for at least 10-bit color depth, a jump to the Rec.2020 color gamut standard, and uncapped luminosity levels. (Current HDR-capable displays support roughly 65-75 percent of the Rec.2020 spectrum; they're more closely tuned to the DCI-P3 color gamut standard, which is still far wider than the Rec.709 gamut found in standard HDTV content.)
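Those coverage numbers are easy to ballpark from the published chromaticity coordinates of each gamut's primaries. Triangle area in CIE 1931 xy space is only a crude proxy for perceived coverage, but it lands in the same neighborhood:

```python
# CIE 1931 xy chromaticities of each gamut's red, green, and blue primaries.
REC_709  = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]  # standard HDTV
DCI_P3   = [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)]
REC_2020 = [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)]

def area(tri):
    # Shoelace formula for the triangle spanned by the three primaries.
    (x1, y1), (x2, y2), (x3, y3) = tri
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2

print(f"P3 covers ~{area(DCI_P3) / area(REC_2020):.0%} of Rec.2020")    # ~72%
print(f"709 covers ~{area(REC_709) / area(REC_2020):.0%} of Rec.2020")  # ~53%
```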
Dolby Vision is technically the more ambitious format because it additionally supports 12-bit color depth and dynamic metadata. The former will, among other things, obliterate any trace of color banding—which you still might notice on images with 10-bit color depth. The latter allows a video source to refresh its baseline color and luminosity levels at any time.
These specific upgrades will pay off on consumer-grade displays to come, but their perceptible bonuses are scant in the current market. As displays creep up into luminance differentials of 2,000 nits and beyond, that dynamic metadata will allow video sources to swap out baseline metadata in order to better favor a pitch-black look into a starry sky; an outdoor desert scene; or whatever high-octane sequence comes next. As luminance ranges grow, so will filmmakers' desire to control them more granularly, and Dolby Vision has set itself up for such a payoff.
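To make the static-versus-dynamic distinction concrete, here's a purely conceptual sketch. These are not the real bitstream structures, and the nit values are invented for illustration:

```python
# Conceptual sketch only: HDR-10 ships one static description for the whole
# program; dynamic metadata (as in Dolby Vision) can re-describe each scene.
from dataclasses import dataclass

@dataclass
class SceneMetadata:
    min_nits: float  # darkest level the content actually uses
    max_nits: float  # brightest level the content actually uses

static = SceneMetadata(0.005, 4000)  # one envelope for the entire film

dynamic = [
    SceneMetadata(0.0005, 10),  # starry night: spend precision on shadows
    SceneMetadata(5, 4000),     # desert exterior: spend range on highlights
]
# With per-scene metadata, the display's tone mapping can be re-tuned scene
# by scene instead of compromising across the whole program.
```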
But current high-end consumer displays aren't there yet in terms of luminance differentials, and that makes the Dolby Vision-specific payoff much harder to perceive compared to what HDR-10 delivers on current screens. Plus, Dolby's standard requires a certification process and proprietary chips for both screens and media devices, which isn't going to help it win this emerging HDR format war. Right now, some streaming apps, like Vudu and Netflix, support Dolby Vision, but many apps, all high-end game consoles, and most HDR Blu-rays opt for the HDR-10 standard.
For now, just remember: if you buy a set that includes Dolby Vision support, it also supports HDR-10, but not necessarily the other way around.
Annoyingly, you won't find a clearly marked "HDR-10" logo anywhere on modern HDR sets. Instead, different set manufacturers are adopting different logos. The most common one is "Ultra HD Premium," which combines 4K resolution (3840x2160, or four times as many pixels as a 1080p display) with the HDR-10 spec of luminance range, color gamut, and color depth. Sets bearing that logo have all been "UHD Alliance certified," though some manufacturers, including Sony, would rather not pay for the certification.
HDR content, and how well a TV set or monitor reads and renders it, is a little harder to appreciate at a fluorescence-soaked big-box retailer. That's why those certifications are important in HDR's early goings.
