
4K UHD (2160p resolution) is no longer the big news in TV land. HDR (High Dynamic Range) has supplanted it in the conversation. Why? It’s not lack of detail, insufficient pixels, or 1080p resolution in general that’s been the major issue with TVs for the last few years. It’s been the lack of contrast and washed-out colors caused by the adoption of thin-enabling, energy-saving, but color-challenged LED backlighting.

HDR and the technologies that enable it—less leaky LCDs, better backlighting, improved LED/phosphor combinations, quantum dots, to name a few—take contrast to historically high levels, provide an unprecedented level of detail in both dark and bright areas, and as a corollary effect, restore the color we once had with CRTs and LCD panels backlit by CCFLs (cold cathode fluorescent lamps). Note that these are largely improvements affecting LED LCD TVs. OLED TVs have been colorful and largely capable of rendering HDR-like images for a while.

[Image: Dolby HDR comparison (credit: Dolby)]

Dolby needs to up its game with its HDR comparison images. It looks a lot better than this.

Just about everyone who’s seen HDR, which can be implemented at any resolution, finds it significantly more compelling than simply adding pixels; but combine 2160p resolution and HDR in properly mastered content and—yowsers! Sadly, something of a format war has arisen, albeit an odd and not particularly scary one. More on that later.

TABLE OF CONTENTS

  • What is HDR?
  • More colors help
  • Dolby Vision vs. HDR-10
  • Ultra HD Blu-ray
  • Our advice

What is HDR?

HDR is essentially a greater differential between the brightest and darkest points in an image, aka the luminance dynamic range or comparative luminance. It’s not about peak brightness, as some believe. OLEDs, with their deep blacks, don’t need to generate as many nits (a measure of luminance) to accomplish the effect. That’s why OLED TVs seem so lush and saturated compared to LED-backlit LCD TVs and their grayish blacks. Both produce similar brightness, but OLEDs have that nearly pure darkness. LED/LCD TVs need to increase peak brightness significantly to achieve the HDR effect.
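To put rough numbers on that, here’s a minimal sketch in Python; the black levels and peak-brightness figures are illustrative assumptions, not measurements of any particular panel:

```python
# Illustrative figures only -- real panels vary widely.
def contrast_ratio(peak_nits: float, black_nits: float) -> float:
    """Dynamic range is the ratio of the brightest to the darkest level."""
    return peak_nits / black_nits

# Hypothetical OLED: modest peak brightness, but near-zero black level.
oled = contrast_ratio(peak_nits=600, black_nits=0.0005)

# Hypothetical LED/LCD: brighter peak, but the backlight leaks through.
lcd = contrast_ratio(peak_nits=1000, black_nits=0.05)

print(f"OLED:    {oled:,.0f}:1")   # ~1,200,000:1
print(f"LED/LCD: {lcd:,.0f}:1")    # 20,000:1
```

Even though the hypothetical LCD is brighter, the OLED’s near-black floor gives it far greater dynamic range, which is why the LCD has to push peak brightness so hard to compete.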


You might already be familiar with HDR from photography. Indeed, there’s an HDR option on most phone cameras these days. HDR photos are created by capturing the same exact image multiple times with a range of f-stops, and then combining them. HDR video is pretty much the same deal, only harder to accomplish, because capturing a wide range of luminance continuously in real time requires a fair amount of processing power—or multiple cameras. HDR animation or digital effects are easy—just move a slider.
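If you want to experiment with the still-photo version yourself, here’s a minimal sketch using OpenCV’s exposure-fusion API; the filenames are placeholders for your own bracketed shots:

```python
import cv2  # pip install opencv-python

# Three frames of the same scene at different exposures
# (placeholder filenames -- substitute your own bracketed shots).
exposures = [cv2.imread(name) for name in ("under.jpg", "normal.jpg", "over.jpg")]

# Mertens exposure fusion blends the best-exposed parts of each frame;
# unlike full HDR recovery, it doesn't need exposure-time metadata.
merger = cv2.createMergeMertens()
fused = merger.process(exposures)  # floating-point image in [0, 1]

cv2.imwrite("fused.jpg", (fused * 255).clip(0, 255).astype("uint8"))
```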

To give you an idea of the difference, several HDR/non-HDR comparison shots are scattered about this article (these were provided by Dolby). If you’re viewing it on a WLED (white LED, as most less-expensive TVs use) display, you’re not getting the full impact, but it should still get the point across.

[Image: Dolby HDR comparison]

Note that HDR images quite often exaggerate reality.

More colors help

Greater comparative luminance isn’t the only thing required if you want to do HDR right. To avoid ugly artifacts such as banding (as seen below), there must also be a large range of colors available to smooth transitions, as well as provide detail in very bright or dark areas. This goes hand-in-hand with what’s now known as wide color gamut and, in practice, means 10 or 12 bits per color (approximately one billion total colors at 10 bits, and roughly 69 billion at 12 bits, versus the 16.7 million of standard 8-bit color). Below is an image in its original 8-bit color; the one below it was downscaled to 4-bit color.

[Image: 8-bit color]

This is the original image shot in 8-bit (24-bit in photo-speak) color.

[Image: 4-bit color]

Notice the loss of detail and the introduction of more artifacts when you remove color information. The pixel resolution is the same.

The difference between 8-bit, 10-bit, and 12-bit color isn’t quite as salient as what’s shown above, but it is noticeable.
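The 4-bit downscale above is easy to reproduce. Here’s a minimal sketch that re-quantizes a smooth gradient (the worst case for banding) down to fewer levels per channel:

```python
import numpy as np

def quantize(channel: np.ndarray, bits: int) -> np.ndarray:
    """Collapse an 8-bit channel down to 2**bits levels, then scale back up."""
    step = 256 / (2 ** bits)
    return (np.floor(channel / step) * step + step / 2).astype(np.uint8)

# A smooth horizontal ramp from black to white.
gradient = np.tile(np.arange(256, dtype=np.uint8), (64, 1))

banded = quantize(gradient, bits=4)

print(np.unique(gradient).size)  # 256 distinct shades: smooth ramp
print(np.unique(banded).size)    # 16 distinct shades: visible stair-steps
```

Viewed as an image, the quantized ramp shows exactly the kind of contour banding the 4-bit photo above does.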

Accurate color is also important, though not as important as wide color gamut. While neither is strictly required for HDR, they are both part of the never-ending quest to generate a picture that can more accurately portray real life, as well as produce dramatic effects beyond those that occur in the real world. Not that the Aurora Borealis, lava, or lasers need much improving on.

Some of the better LED/LCD TVs, as well as OLED TVs from the last couple years, possess the display technology to render HDR to decent effect. This year’s crop of TVs has a number of models advertising HDR capabilities, including Samsung’s SUHD series, Sony’s X800/X900 D-Series, LG’s latest LED and OLED TVs, and Vizio’s P-Series.


You might also see some TVs advertised as “HDR enabled,” “HDR compatible,” or some such. This means only that they can recognize and process HDR content; at least the baseline for HDR that the industry currently has in place. It does not mean that they have the color saturation, color accuracy, brightness, or contrast to realize the full intent of HDR content or come anywhere close to it; hence the wishy-washy marketing terms.

Though it’s not quite ready yet, a standard and certification for HDR TVs is being developed by the UHD (Ultra High Definition) Alliance, a group that includes just about every major player in the TV industry. There’s already a standard for Ultra HD Blu-ray players, though it mandates only one of the two formats in the title of this article. As of now, you and your eyes are on your own. That’s not a big issue—once you’ve seen real HDR, you won’t forget what it looks like.

That baseline for HDR I just mentioned is SMPTE (Society of Motion Picture and Television Engineers) ST-2084 and ST-2086, which cover the color gamut and basic protocols for transmitting HDR info to a TV. There’s also CTA-861.3 (from the Consumer Technology Association, née the Consumer Electronics Association), a standard covering the transmission via HDMI 2.0a. Combine that whole deal and you get what’s often referred to as HDR-10.
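The heart of ST-2084 is the PQ (perceptual quantizer) transfer function, which maps a 0-to-1 signal value onto absolute luminance up to 10,000 nits. Here’s a sketch of the published EOTF, using the constants from the standard:

```python
def pq_eotf(signal: float) -> float:
    """SMPTE ST-2084 PQ EOTF: nonlinear signal in [0, 1] -> luminance in nits."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    p = signal ** (1 / m2)
    return 10000 * (max(p - c1, 0) / (c2 - c3 * p)) ** (1 / m1)

print(round(pq_eotf(0.0)))  # 0 nits
print(round(pq_eotf(0.5)))  # ~92 nits -- half the signal is nowhere near half the light
print(round(pq_eotf(1.0)))  # 10000 nits, PQ's absolute ceiling
```

The curve spends most of its code values on the dark and mid tones, where our eyes are most sensitive, which is what lets a 10-bit signal cover a 10,000-nit range at all.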

[Image: Samsung SUHD JS9500]

Even last year’s Samsung SUHD TVs support HDR-10.

Note that the “a” in HDMI 2.0a is required. Many existing TVs with HDMI 2.0 can be firmware-upgraded to HDMI 2.0a; but as I said, this only renders them capable of playing HDR-10; it doesn’t necessarily make HDR appear as it should.

Don’t confuse HDR-10, which is about a standard, with what Sony and Samsung are pushing as HDR 1000. That’s something the companies concocted to advertise the fact that their HDR TVs deliver a 1000-nit (or thereabouts) range of comparative luminance. How thereabouts depends on the TV. Remember, this is an industry that often plays fast and loose with the numbers.

The problem with the HDR-10 “standard” is that few are satisfied with it, especially creative types, many of whom are OCD about their efforts. Dolby Vision, the other HDR implementation you’ll see advertised, uses the baseline, but exceeds it in a couple of important ways.

Dolby Vision vs. HDR-10

The first of the two major differences between Dolby Vision and HDR-10 is that Dolby Vision uses 12 bits per color (red, green, and blue), where HDR-10 uses 10 bits per color. That’s where the name HDR-10 comes from, as well as my supposition of HDR-12 as a possible successor. While those 2 bits per color won’t mean much with a lot of material, as shown in the pictures of the kids above, they allow more subtle detailing and completely eliminate artifacts such as contour banding that can still crop up under some circumstances with 10-bit color. The effect is subtle, but discernible—at least in Dolby’s presentations.
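For the curious, the arithmetic behind those two extra bits:

```python
for bits in (8, 10, 12):
    levels = 2 ** bits    # shades per color channel
    colors = levels ** 3  # total red * green * blue combinations
    print(f"{bits}-bit: {levels:4d} levels per channel, {colors:.2e} colors")

# 8-bit:   256 levels per channel, 1.68e+07 colors
# 10-bit: 1024 levels per channel, 1.07e+09 colors
# 12-bit: 4096 levels per channel, 6.87e+10 colors
```

Each jump of two bits quarters the size of a quantization step, which is what smooths out the last of the contour banding.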

The second and more compelling difference (at least to content creators) is in the metadata—the extra data that tells a TV how to display the HDR content. Dolby Vision uses dynamic, or continuous, metadata, so that color and brightness levels can be adjusted on a per-scene or even frame-by-frame basis. HDR-10 (Sharp referred to it as Open HDR in my conversations with them) uses static metadata that is sent only once, at the beginning of the video.

In real life, that means that Dolby Vision can optimize night scenes, daylight scenes, and everything in between individually, while HDR-10 forces a compromise based on the overall characteristics of the content.
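To make the distinction concrete, here’s a rough sketch of the two approaches. The field names are illustrative (loosely modeled on SMPTE ST-2086 and CTA 861.3 values), not actual bitstream syntax:

```python
from dataclasses import dataclass

@dataclass
class StaticMetadata:
    """HDR-10 style: one record, sent once for the entire title."""
    max_mastering_nits: float  # peak luminance of the mastering display
    min_mastering_nits: float
    max_cll: float             # brightest single pixel anywhere in the content
    max_fall: float            # brightest frame-average light level

@dataclass
class SceneMetadata:
    """Dolby Vision style: a fresh record per scene, or even per frame."""
    start_frame: int
    target_peak_nits: float    # how bright this scene should be tone-mapped

title = StaticMetadata(max_mastering_nits=1000, min_mastering_nits=0.005,
                       max_cll=950, max_fall=400)
scenes = [
    SceneMetadata(start_frame=0, target_peak_nits=120),     # dim night scene
    SceneMetadata(start_frame=1440, target_peak_nits=900),  # bright daylight
]
```

With only the static record, the TV must pick one tone-mapping compromise that serves both the night scene and the daylight scene; with per-scene records, it can re-map each one.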

[Image: Dolby Vision comparison (credit: Dolby)]

A Dolby-supplied rendering of the effect of Dolby Vision.

One other difference between Dolby Vision and HDR-10: Dolby Vision doesn’t require HDMI 2.0a. Metadata is sent in-band, as an auxiliary stream within the existing HDMI 2.0 data flow. It requires its own decoding, however, which, from what I’ve been able to gather, can’t be added to the mix by a firmware upgrade. That largely renders the advantage moot.

The good news for consumers is that Dolby Vision can happily coexist with HDR-10. Dolby Vision TVs can utilize the dynamic metadata, while HDR-10 TVs can simply do their thing, blissfully unaware of its presence. The major impediment to the adoption of Dolby Vision is that, while it’s free for artists, content providers, encoding houses, filmmakers, and others, TV manufacturers must pay to implement it. It remains to be seen if demand from creative types will be enough to force the issue.

Note that SMPTE (of which Dolby is a member) is contemplating something like Dolby Vision via ST-2094, and the CTA is working on CEA/CTA-861-G to add dynamic metadata support to HDMI.

Though I’ve only talked about Dolby Vision and HDR-10, I should also mention that there are at least two other HDR formats being worked up by BBC/NHK and Philips/Technicolor. Sigh.

Ultra HD Blu-ray

You might have thought you were done with optical discs as a means of multimedia delivery; however, there is at least one reason you might not be: bandwidth, or the lack thereof. Streaming 2160p HDR across the Internet at full quality requires a very large data pipe, larger than many users have access to. There are lower resolutions, of course—720p or 1080p, as well as bit-rate reduction—but that’s not taking full advantage of your new TV.
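Some back-of-the-envelope math shows why. The figures below are typical assumptions, not measurements of any particular service or disc:

```python
# Rough, assumed figures -- real encodes vary with codec and content.
width, height, fps = 3840, 2160, 60
bits_per_pixel = 10 * 1.5  # 10-bit color with 4:2:0 chroma subsampling

raw_mbps = width * height * fps * bits_per_pixel / 1e6
print(f"Uncompressed 2160p60 HDR: {raw_mbps:,.0f} Mbps")  # ~7,465 Mbps

disc_mbps = 128    # Ultra HD Blu-ray's top data rate is around this
stream_mbps = 25   # a commonly recommended 4K HDR streaming tier
print(f"Disc: {disc_mbps} Mbps vs. streaming: {stream_mbps} Mbps")
```

HEVC compression closes most of that gap, but a disc can afford several times the bit rate of a typical streaming tier, which is the whole argument for Ultra HD Blu-ray.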

[Image: Samsung UBD-K8500 (credit: Samsung)]

Samsung’s UBD-K8500 Ultra HD Blu-ray player.

If you’re bandwidth-challenged, or simply old-school, there’s the Ultra HD Blu-ray player—expensive as that option may currently be. The only player shipping at the time of this writing was the $400 Samsung UBD-K8500, though Panasonic just announced its DMP-UB900 at $700. The UBD-K8500 is a nice player with lots of streaming capabilities, but it supports only HDR-10 and not Dolby Vision. That’s a bit odd as streaming services have taken to Dolby Vision like fish to water; and at $400, it’s hard to believe Samsung couldn’t afford the royalties. The Panasonic? The press release says nothing about Dolby Vision; and the price? Gulp.

Note that Dolby Vision is part of the BDA’s spec (the Blu-ray Disc Association’s specification, that is), but it’s not mandatory. The UHD Premium certification by the UHD Alliance doesn’t specify Dolby Vision support either, and none of the currently shipping Ultra HD Blu-ray titles feature Dolby Vision.

Our advice

Many people will buy HDR TVs for no other reason than that they are simply the best TVs available. But when playing properly mastered HDR content—be it HDR-10 or Dolby Vision—the viewing experience takes a quantum leap. Standard dynamic range (SDR, aka just about everything produced until very recently) simply pales in comparison.

But there is that format conundrum. Fortunately, anything that supports Dolby Vision also supports HDR-10—or soon will. And HDR-10, as the baseline, should be supported for a very, very long time. Therefore, there’s no chance of being left out in the cold, content-wise.

[Image: Dolby HDR comparison (credit: Dolby)]

Details such as reflections that are lost with standard dynamic range content pop out with HDR. Subtle textures become visible as well.

Despite extensive consultation with industry experts—as well as asking my Ouija board, crystal ball, and Magic 8-Ball—I can’t tell you today if any HDR TV you buy now will support the final HDR standard. I am betting that dynamic metadata will be part of said standard, but perhaps not Dolby Vision as it exists today.

Summing it all up: you can’t lose by buying a Dolby Vision TV, and if you buy an HDR-10-only TV, the only thing you give up is Dolby Vision. It’s too early to say anything about the final standard.

Honestly? I’m going to take flak from vendors for saying this, but if you don’t need to upgrade or replace right now, the whole HDR shebang will be a lot clearer in a year or so. But if you do need a new TV today, know that HDR is the best thing to happen to TV in a long, long time. As it’s only on the best TVs, the LED variety of which are vastly improved from only a couple of years ago, go ahead and treat yourself.
