In the world of high definition, the brightness of each primary color (red, green, and blue) in each pixel is represented with 8 bits, providing a maximum of 256 discrete brightness levels from 0 to 255. (In practice, consumer-video signals typically use the range from 16 to 235, though the high end sometimes extends above 235.) Combined with specs for peak brightness and color gamut, this is known as standard dynamic range (SDR), and it has worked relatively well for HD and its color gamut of BT.709. All HD displays can accept and render video signals with 8-bit precision.
But Ultra HD includes the possibility of high dynamic range (HDR) and wide color gamut (WCG), and 8 bits per color are no longer sufficient. HDR means there is a greater range from darkest to brightest, and WCG means there is a greater range of colors that can be more saturated than BT.709. If this information is represented and displayed with 8-bit precision, the "distance" between consecutive brightness levels is larger than it is with SDR. The result is clearly visible as banding in areas of the image with gradual gradations in brightness, such as sunsets, blue sky, and underwater shots. Consequently, HDR content is represented with at least 10 bits per color.
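To see why the step size matters, here's a small sketch using a hypothetical display with a linear 1,000-nit range. (Real HDR signals use a perceptual curve such as PQ rather than a linear mapping, but the arithmetic illustrates the same point: spreading a wider brightness range over the same 256 levels makes each step bigger.)

```python
# Sketch: why 8 bits invite banding over a wide brightness range.
# Hypothetical linear example with a 1,000-nit peak; real HDR uses
# a perceptual transfer curve, but the step-size logic is the same.
PEAK_NITS = 1000.0

def step_size(bits, peak=PEAK_NITS):
    """Brightness difference between adjacent code values."""
    levels = 2 ** bits            # 256 for 8-bit, 1024 for 10-bit
    return peak / (levels - 1)

print(f"8-bit step:  {step_size(8):.2f} nits")   # ~3.92 nits per step
print(f"10-bit step: {step_size(10):.2f} nits")  # ~0.98 nits per step
```

Four times as many levels means each step is roughly a quarter the size, which is what keeps smooth gradients looking smooth.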
The 8-bit version of this image has clear banding, while the 10-bit version does not.
Because of this, you'd think that all HDR-capable displays must have at least 10-bit precision from the input through the electronics to the raw display panel itself, but that is not necessarily true. Of course, the input must be able to accept a signal with at least 10-bit precision, though there are several places along the signal path—such as some HEVC decoders—where the bit depth might be reduced to 8 bits.
Then there's the actual display panel (e.g., LCD, OLED, or projector imager). It turns out that some panels in HDR-capable TVs have 8-bit native precision, because 10-bit panels are more expensive to manufacture and therefore increase the price of the final product.
In a Dither
How can an 8-bit panel—or any other 8-bit step in the signal path—reproduce a 10-bit HDR image without banding? There are two main techniques to do this. One is spatial dithering, in which neighboring pixels are assigned color values in such a way that the banding is obscured. But this sometimes results in visible artifacts such as a checkerboard effect or what looks like noise, so it isn't used much in consumer TVs.
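The idea behind spatial dithering can be sketched in a few lines of Python using an ordered (Bayer) threshold matrix. This is purely illustrative; the algorithms in actual TVs are proprietary and more sophisticated. Here, one 10-bit value becomes a 2x2 block of 8-bit pixels whose average matches the original:

```python
# Illustrative ordered (Bayer) spatial dithering: one 10-bit code value
# is approximated by a 2x2 block of 8-bit pixels whose *average* equals
# the original value. Real TV processing is proprietary.
BAYER_2X2 = [[0, 2],
             [3, 1]]  # threshold matrix

def dither_block(v10):
    """Map one 10-bit value (0-1023) to a 2x2 block of 8-bit values."""
    base = v10 >> 2        # truncate to 8 bits (0-255)
    frac = v10 & 0b11      # the 2 bits truncation would discard
    # bump pixels above their threshold up one level, clamped at 255
    return [[base + (1 if frac > t and base < 255 else 0) for t in row]
            for row in BAYER_2X2]

block = dither_block(513)              # 513 / 4 = 128.25
flat = [p for row in block for p in row]
print(block)                           # one 129 among three 128s
print(sum(flat) / 4)                   # 128.25 -- average matches
```

The regular pattern of "bumped" pixels is exactly what can become visible as a checkerboard or noise-like texture.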
The other, more common technique is temporal dithering, often called frame-rate control (FRC). In this process, a pixel rapidly alternates between two colors to give the impression of a third color. Depending on the specific algorithm used, this can work much better than spatial dithering, but it can also result in visible artifacts such as twinkling, especially in dark areas. Still, this process works so well, it's even used in some professional monitors that are widely used in color grading.
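Temporal dithering trades space for time: instead of spreading the error across neighboring pixels, it spreads it across successive frames. A minimal sketch (again, real FRC algorithms are proprietary):

```python
# Sketch of frame-rate control (FRC): one pixel alternates between two
# adjacent 8-bit levels over successive frames so that its time-average
# approximates a 10-bit target. Real FRC algorithms are proprietary.
def frc_sequence(v10, frames=4):
    """8-bit value to show on each of `frames` frames for a 10-bit target."""
    base = v10 >> 2            # nearest-below 8-bit level
    frac = v10 & 0b11          # 0-3: how many of 4 frames go one level up
    seq = []
    for i in range(frames):
        bump = 1 if (i * 4 // frames) < frac and base < 255 else 0
        seq.append(base + bump)
    return seq

seq = frc_sequence(514)         # 514 / 4 = 128.5
print(seq)                      # [129, 129, 128, 128]
print(sum(seq) / len(seq))      # 128.5 -- time-average hits the target
```

The rapid level changes are also the source of the "twinkling" artifact: in dark areas, a one-step flicker is a proportionally larger brightness change, so the alternation is easier to see.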
In my recent article listing HDR-capable displays, one of the most heated discussions in the comments is about the native bit depth of the panel used in this or that model of TV. Unfortunately, some manufacturers, such as Samsung and Sony, do not officially reveal the bit depth of the panels in their HDR displays, saying that an 8-bit panel with good processing can perform better than a 10-bit panel with poor processing.
That may well be true, but I maintain that an 8-bit panel is an inherent bottleneck in the HDR signal chain, and compensating for it with dithering—even high-quality dithering—is not as desirable as using a 10-bit panel with good processing. Such a TV is generally more expensive to manufacture and purchase, but in my view, it's worth it to get the best possible HDR image.
Is it possible to determine the native bit depth of a display's panel? If the manufacturer clearly specifies it, great. But in some cases, a manufacturer's marketing department might not know, or it might be misinformed. And as mentioned earlier, some manufacturers, such as Samsung and Sony, do not officially reveal this information.
One of the models most often cited in that heated discussion about HDR-capable displays is the Samsung HU9000, the 2014 flagship UHDTV with no HDR capabilities when it was introduced. Those capabilities were added when owners upgraded the outboard One Connect Evolution Kit to the 2015 SEK-3500. But what is the native bit depth of the HU9000's panel itself?
To answer this question, some AVS members pointed to a video interview from SPSN (Samsung Product Support Network)—a Samsung-sponsored YouTube channel—in which National Product Testing Manager Scott Cohen clearly states that the HU9000 has a 10-bit panel. Also, some members have cited the spec sheet for a replacement panel for the HU9000 from third-party supplier TV Service Parts, which clearly indicates that it's a 10-bit panel.
However, it turns out that the panel within the HU9000 is, in fact, 8-bit, and that Scott Cohen and TV Service Parts were simply misinformed. (There was no intent to deceive; it was a clerical error.) You can verify this by going to Samsung's own replacement-part site and searching for part number BN95-01688A, which is the replacement panel for the UN78HU9000. As you can see on that page, the panel is specified as 8-bit. This info has also been corrected on the TV Service Parts site.
So is there a way to determine the native bit depth of a TV's panel on your own? Some AVS members use a program like Monitor Asset Manager, which queries the TV about its capabilities via EDID (Extended Display Identification Data). This causes the TV to send information about its capabilities back along the HDMI cable to the computer running the software.
The screen shot above is the result of an AVS member using Monitor Asset Manager with an HU9000 and the 2015 SEK-3500 One Connect Evolution Kit. The EDID reports that the TV supports 30 and 36 bpp (bits per pixel), which translates to 10 and 12 bits per color, respectively. However, this does not reveal anything about the LCD panel's native bit depth, only that the SEK-3500 can accept 10- and 12-bit signals. As we now know, the panel's native bit depth is actually 8 bits, and the SEK-3500 dithers 10- and 12-bit signals to 8 bits.
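For the curious, those 30 and 36 bpp figures come from the "deep color" flags in the EDID. In an HDMI 1.x EDID, they live in the HDMI Vendor-Specific Data Block (identified by IEEE OUI 00-0C-03) of the CEA extension; the sketch below shows roughly how a tool would read them. Note that these flags describe only what signal the input will accept, never the panel's native bit depth:

```python
# Sketch: reading the HDMI "deep color" flags that tools like Monitor
# Asset Manager report. In an HDMI 1.x EDID, the CEA extension's HDMI
# Vendor-Specific Data Block (IEEE OUI 00-0C-03) carries them; byte 6
# of that block (optional) holds the DC_30bit/DC_36bit/DC_48bit bits.
# These flags describe the *input*, not the panel's native bit depth.
def deep_color_support(vsdb):
    """vsdb: bytes of one HDMI VSDB, after the tag/length header byte."""
    assert vsdb[0:3] == bytes([0x03, 0x0C, 0x00]), "not an HDMI VSDB"
    if len(vsdb) < 6:                   # the flags byte is optional
        return []
    flags = vsdb[5]                     # byte 6 of the block = index 5 here
    depths = []
    if flags & 0x10: depths.append(30)  # DC_30bit: 10 bits per color
    if flags & 0x20: depths.append(36)  # DC_36bit: 12 bits per color
    if flags & 0x40: depths.append(48)  # DC_48bit: 16 bits per color
    return depths

# Hypothetical VSDB payload: OUI, CEC physical address 1.0.0.0, and a
# flags byte advertising 30- and 36-bpp deep color.
sample = bytes([0x03, 0x0C, 0x00, 0x10, 0x00, 0x30])
print(deep_color_support(sample))   # [30, 36]
```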
One member posted the screen shot above from the software control panel of his GeForce 750 graphics card, which was connected to an HU9000 with SEK-3500. The software allowed him to select the card's maximum bit depth for that display, 32 bits per pixel, which he claimed indicates that the TV can accept a 10-bit signal (30 bpp). However, this more likely means that the card can send no more than 24 bpp plus an 8-bit "alpha" channel. Also, if the HDMI bandwidth of this card is 10.2 Gbps (which I haven't yet verified), it can't send any more than 8 bits per color at 4K/60p, as the control panel indicates.
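The bandwidth arithmetic behind that limit is easy to check. HDMI transmits the full blanking interval, so 4K/60 uses the CTA-861 total timing of 4400x2250 pixels (a 594 MHz pixel clock), and TMDS 8b/10b coding adds 25% overhead. This is an illustration of the general math, not a verified spec for that particular card:

```python
# Rough TMDS bandwidth arithmetic for HDMI. 4K/60 uses the CTA-861
# total timing of 4400 x 2250 pixels (594 MHz pixel clock), and TMDS
# 8b/10b coding adds 25% overhead on top of the pixel payload.
def tmds_gbps(h_total, v_total, fps, bits_per_color, samples_per_pixel=3):
    pixel_clock = h_total * v_total * fps                  # pixels/sec
    payload = pixel_clock * bits_per_color * samples_per_pixel
    return payload * 10 / 8 / 1e9                          # 8b/10b overhead

print(f"{tmds_gbps(4400, 2250, 60, 8):.2f} Gbps")   # 4K/60 RGB 8-bit:  17.82
print(f"{tmds_gbps(4400, 2250, 60, 10):.2f} Gbps")  # 4K/60 RGB 10-bit: 22.28
```

Both figures blow past 10.2 Gbps, which is why a 10.2 Gbps link must fall back to 4:2:0 chroma subsampling (halving the chroma data) to carry 4K/60 at all, and why 8 bits per color is the practical ceiling there.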
Some members post photographic screen shots of their TVs showing HDR content to demonstrate that they are HDR-capable—and in some cases, to ask other members if the TV is HDR-capable based on the photos. Unfortunately, this is a completely useless exercise because there are so many unknowns. What's the dynamic range captured by the camera? What's the dynamic range of the monitor or display used to view the photos on AVS? (It's probably not HDR.) There is no well-defined way to do this, so it's not valid proof of anything—except perhaps to illustrate that one TV looks different than another if the photos were taken with the same camera using the same settings under the same conditions.
Tyler Pruitt (WiFi-Spy), technical liaison at SpectraCal, started a thread in the Display Calibration forum that offers a short test clip with two 2160p 10-bit gradient ramps—that is, 3840x2160 images with smooth gradients from black to white and black to 75% gray created and encoded with 10-bit precision. (The pattern also includes a simulated 8-bit gradient for comparison.) The purpose of the test is to determine if a TV's panel is 10-bit or 8-bit; if it's 8-bit, you should see banding in the image.
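The pixel math behind such a pattern is straightforward. Here's a minimal sketch of one row of a 3840-wide ramp, plus a "simulated 8-bit" version made by discarding the two low bits. (Tyler's actual clip is encoded video; this only illustrates the underlying values.)

```python
# Minimal sketch of a 10-bit gradient ramp test pattern: one row of a
# 3840-pixel-wide ramp from black to white, plus a "simulated 8-bit"
# row made by discarding the two low bits of each 10-bit value.
WIDTH = 3840

def ramp_10bit(width=WIDTH):
    """10-bit code values 0-1023 spread evenly across the width."""
    return [round(x * 1023 / (width - 1)) for x in range(width)]

def simulate_8bit(row):
    """Quantize 10-bit values to 8-bit steps (still on the 10-bit scale)."""
    return [(v >> 2) << 2 for v in row]

ten = ramp_10bit()
eight = simulate_8bit(ten)
print(len(set(ten)), "distinct levels in the 10-bit ramp")      # 1024
print(len(set(eight)), "distinct levels in the 8-bit version")  # 256
```

With only 256 levels spread across 3,840 pixels, each level spans a 15-pixel-wide band, which is exactly what shows up as visible banding on an 8-bit-limited display.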
Here is one frame from Tyler's 10-bit gradient pattern.
In the same thread, Tyler also included a similar test pattern created by Stacey Spears (sspears), chief color scientist at SpectraCal and co-creator of the Spears & Munsil HD Benchmark set-up Blu-ray. It includes two gradient ramps—one encoded with 10-bit precision and the other with the 10-bit values rounded to 8 bits. Both gradients slowly rotate, which makes it much easier to see any banding.
In this pattern from Stacey Spears, the two gradients rotate to make any banding easier to see. The one on the right is encoded with 10-bit precision, and the one on the left uses 10-bit values that have been rounded to 8 bits.
These test patterns are all well and good—in fact, they're excellent—but they do not directly address the question of the raw panel's native bit depth. They only assess the performance of the display system as a whole, including the processing and panel. For example, if the panel has a native bit depth of 8 bits, it might perform these tests well if the dithering algorithm is good.
To illustrate this point, a new test pattern from Stacey includes three rotating gradients—one with 10-bit precision, one with rounded 8-bit values, and one with 8-bit spatially dithered values. The 8-bit rounded pattern should exhibit obvious banding in most cases, while the 10-bit and 8-bit dithered patterns should look close to the same, though the 10-bit should look a bit cleaner because compression filters out some of the dither via quantization.
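The difference between rounding and dithering is easy to see in code. This sketch converts a near-flat strip of 10-bit values to 8 bits both ways; rounding produces runs of identical values (bands), while a simple 1-D error-diffusion dither breaks them up and preserves the average. (This is illustrative only; Stacey's pattern uses its own dither method.)

```python
# Rounding vs. dithering a 10-bit strip down to 8 bits. Rounding yields
# runs of identical values (visible bands); simple 1-D error diffusion
# breaks the runs up while preserving the average level.
def round_to_8bit(row):
    return [min(255, (v + 2) >> 2) for v in row]

def dither_to_8bit(row):
    """1-D error diffusion: push each pixel's rounding error onto the next."""
    out, err = [], 0.0
    for v in row:
        target = v / 4 + err          # ideal 8-bit value incl. carried error
        q = min(255, max(0, round(target)))
        err = target - q              # remember what was lost
        out.append(q)
    return out

row = list(range(512, 528))            # a nearly flat 10-bit strip
print(round_to_8bit(row))              # long runs of identical values: bands
print(dither_to_8bit(row))             # values alternate: average preserved
```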
In this pattern, Stacey illustrates the difference between rounding and dithering 10-bit values to 8 bits. In some cases, such as the Panasonic DX800 LCD TV for the European market, he says the 8-bit dithered pattern actually looks better than the straight 10-bit pattern.
As Stacey Spears has pointed out to me, there is no way to definitively measure the native bit depth of a display panel without physically dismantling the TV and testing the panel directly, which is clearly impractical. So I'm afraid those who believe that the tests mentioned in this article can provide this information are mistaken. Instead, these tests measure the performance of the system as a whole.
An HDR display with an 8-bit panel and good dithering algorithms can certainly display HDR content in a way that looks better than the best SDR presentation. But I would prefer a panel with a native bit depth of at least 10 bits, along with a signal path that maintains 10-bit precision from one end to the other. I wish there were a foolproof way for consumers to determine this for all displays, but in some cases, there isn't.
Still, the bottom line is how it looks displaying real program material. If you like what you see, there's no need to stress about the panel's bit depth—just sit back and enjoy!
Many thanks to Stacey Spears and Tyler Pruitt of SpectraCal for their generous help with this article.
Note: Please do not quote this entire article when posting a comment. Feel free to quote the relevant portion that pertains to your comment, but wading through the entire thing in the comments is quite annoying. Thanks!