post #1 of 11, 11-07-2018, 08:21 AM (JLaud25, Thread Starter)
Dolby Vision and HDR10+: relevant or not?

Now that newer OLEDs have dynamic tone mapping for HDR10 content, what is the advantage of DV or HDR10+? DV and HDR10+ do have 12-bit color, which gives slightly better saturation, but the difference between 10-bit and 12-bit is minimal. If you compare a 4K Blu-ray played in DV against the same movie forced to HDR10 on a TV with dynamic tone mapping, the differences between DV and HDR10 look very minor. In DV's case, Dolby decides how each scene or frame should be tone mapped; with dynamic tone mapping, the TV decides that itself. So I'm not convinced that DV and HDR10+ offer anything that makes a genuine difference.
Do you think DV and HDR10+ will remain relevant as HDR formats in the long run?
post #2 of 11, 11-07-2018, 08:35 AM (ScottAvery)
I'm not entirely sure 12 bit is about saturation, but your question makes me want to watch This is Spinal Tap in HDR WCG.

"It's like, how much more black could this be? and the answer is none. None more black."

If you have any audio questions I still have an "eleven is one louder" quote in my pocket.
post #3 of 11, 11-07-2018, 09:01 AM (lsorensen)
Quote: Originally Posted by JLaud25

DV supports 12 bit; HDR10+ is 10 bit. HDR10+ is simply HDR10 with metadata updates throughout the content, which should make essentially no difference if your TV can already generate dynamic metadata on the fly (so for most OLEDs HDR10+ ought to behave the same as HDR10). Other than supporting 12 bit (which gives four times as many code values per channel as 10 bit, although I am not sure how you would tell them apart), DV also seems to exercise stricter control over the display and how everything is handled, so in theory it ought to give a more consistent result. Time will tell whether that works out in practice.
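
For scale, the arithmetic behind that "four times" figure, as a quick Python sketch (it just counts code values and says nothing about what a given panel can actually show):

Code:
# Plain arithmetic: code values per channel at each bit depth.
codes_10bit = 2 ** 10                        # 1024 steps (HDR10 / HDR10+)
codes_12bit = 2 ** 12                        # 4096 steps (Dolby Vision)

print(codes_12bit // codes_10bit)            # 4  -> 4x the code values per channel
print(codes_12bit ** 3 // codes_10bit ** 3)  # 64 -> 64x the possible RGB triplets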

post #4 of 11, 11-07-2018, 09:34 AM (JLaud25, Thread Starter)
Quote: Originally Posted by lsorensen
I already spoke of 10 vs. 12 bit, and all I find is that 12 bit has slightly better saturation; greens look a little deeper, for example, but only if you compare closely, otherwise the difference is very minimal. Going from 8 bit to 10 or 12 bit is a bigger jump, while 10 and 12 bit look very close (you can experiment with this by hooking a PC to the OLED with an Nvidia graphics card or similar and changing the output color depth, starting at 8 bit). At the end of the day, all these panels are native 10 bit, so feeding them 12 bit doesn't change much.

And DV having strict control over the display means very little if your TV's dynamic tone mapping algorithm is already very good (such as the one on the Sony A9F OLED). All you get with DV is Dolby's customized mapping working within the confines of your TV's peak brightness, whereas with dynamic tone mapping the TV handles the same job directly.
When DV was released there was a lot of hype around it, but in its current form it's hardly better. As newer HDR TVs reach higher peak brightness and their internal tone mapping keeps improving, I suspect these formats will become inconsequential (unless something changes in DV's implementation over the next few years, or HDR10+ content starts looking drastically better).
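
A rough NumPy sketch of that PC gradient experiment (a 0-to-1 gray ramp stands in for an on-screen gradient; this says nothing about any particular panel):

Code:
import numpy as np

ramp = np.linspace(0.0, 1.0, 1 << 16)             # smooth gray ramp, black to peak

for bits in (8, 10, 12):
    levels = (1 << bits) - 1
    quantized = np.round(ramp * levels) / levels  # the ramp as an N-bit output would step it
    step = np.max(np.diff(np.unique(quantized)))  # size of the largest step between levels
    print(f"{bits}-bit: {levels + 1} levels, largest step {step:.6f}")

# Each extra 2 bits makes the steps 4x finer; the 8-to-10-bit jump removes the most
# obvious banding, while 10 vs. 12 bits is a much subtler difference on a gradient.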
post #5 of 11, 11-07-2018, 09:43 AM (bobknavs)
Quote: Originally Posted by lsorensen
I wonder whether 12-bit data gives improved images on 10-bit panels (which I believe all OLED panels still are).
post #6 of 11, 11-07-2018, 12:26 PM (lsorensen)
Quote: Originally Posted by bobknavs

It might. After all, the panel is 10 bit but only covers part of the range the signal can describe. The 10 bits in HDR10 span a larger luminance range than the OLED panel can reproduce, so it is possible that 12-bit DV over a larger range gives a better match for the part the panel can actually produce than HDR10 does. After all, HDR10 allocates part of its 10-bit code range to values the OLED panel can't handle.

The values are not encoded linearly (the curve is roughly logarithmic), but for simplicity imagine they were linear:

If you mapped codes 0-1023 to 0-1000 nits on the panel (pretend, for the sake of round numbers, that the panel can do 1000 nits rather than 700 or 800), and DV mapped codes 0-4095 to 0-4000 nits, then the panel could reproduce the 0-1023 portion of the DV signal very accurately with a full 10 bits (the top two bits always being 0). Anything above 1023 would get cut off or mapped down somehow. This isn't how it actually works, since the encoding isn't linear, but the idea still applies, just with different values.
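
To put rough numbers on that simplified linear picture, a toy Python sketch (the 1000-nit and 4000-nit figures are the pretend values from above, not real measurements, and real PQ is not linear):

Code:
# Toy version of the linear example above; illustration numbers only,
# not how an actual DV or HDR10 pipeline behaves.
dv_codes, dv_peak_nits = 2 ** 12, 4000   # 12-bit signal graded out to 4000 nits
panel_peak_nits = 1000                   # pretend 10-bit panel reaching 1000 nits

nits_per_code = dv_peak_nits / (dv_codes - 1)
in_range = sum(1 for code in range(dv_codes)
               if code * nits_per_code <= panel_peak_nits)

print(in_range)              # 1024 -> the panel's full 10 bits get used one-to-one
print(dv_codes - in_range)   # 3072 codes above 1000 nits left to clip or tone-map down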
post #7 of 11, 11-07-2018, 01:12 PM (losservatore)
Quote: Originally Posted by bobknavs
Vincent did this comparison; the 12-bit data reduced banding on the 10-bit OLED.


post #8 of 11, 11-07-2018, 02:27 PM (bobknavs)
Quote: Originally Posted by lsorensen
Speaking in total ignorance, I'd expect them to drop the least significant bits, not the highest ones.
post #9 of 11, 11-07-2018, 02:40 PM (lsorensen)
Quote: Originally Posted by bobknavs

Sure, if you wanted to map the entire possible range of HDR10 to what your panel can display. That would result in a very dark image, given that HDR10 can go up to 10,000 nits at the top end and most content doesn't even hit 1,000 nits most of the time. I think DV currently targets 4,000 nits as its maximum, but I believe that can be adjusted in the metadata if the need arises in the future.

So if the content has two, three, or four zeroes at the top most of the time, why not drop those and keep the relevant bits for the part of the range you can actually display? Do that and you get much closer to a full 10 bits' worth of values within the usable range, whereas if you drop the least significant bits you end up with effectively 8 bits of actual values within what the display can reproduce.

If you had a 10-bit display that could produce 10,000 nits, then sure, dropping the least significant bits from a 12-bit signal would make sense. We don't have those yet, and a panel capable of that output would probably be a 12-bit panel anyway, if it ever exists.

Of course, to avoid rounding errors during processing, some displays (Sony certainly does this) work with 14-bit values internally and only reduce to 10 bits at the end, after all the processing is done.
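
The same toy linear model can compare the two ways of throwing bits away, again as a rough Python sketch with made-up round numbers rather than real display behaviour:

Code:
import math

# Toy linear model: 12-bit signal out to 4000 nits, panel shows 0-1000 nits.
signal_codes, signal_peak, panel_peak = 2 ** 12, 4000, 1000

# (a) Drop the two least significant bits: 10 bits still spread across 0-4000 nits.
usable_if_lsbs_dropped = (signal_codes // 4) * panel_peak // signal_peak

# (b) Clip or map down the top instead: the bottom of the range keeps full precision.
usable_if_top_clipped = signal_codes * panel_peak // signal_peak

print(usable_if_lsbs_dropped, math.log2(usable_if_lsbs_dropped))  # 256  -> ~8 effective bits
print(usable_if_top_clipped, math.log2(usable_if_top_clipped))    # 1024 -> 10 effective bits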

post #10 of 11, 11-07-2018, 03:26 PM (bobknavs)
Quote: Originally Posted by lsorensen
I hadn't considered that the high-order bits would basically not be used. (Defined only for mythical high brightness displays.)
post #11 of 11, 11-10-2018, 06:01 PM (EvLee)
Quote: Originally Posted by lsorensen
Displays aren't natively PQ. They are all gamma based and will remain so for the foreseeable future, based on conversations I've had with manufacturers. So when they say they are 10 bit, what that means is that the display (ideally) has 10 bits of precision distributed along a gamma curve spanning display black to display peak white. The internal video processing pipeline incorporates stages that convert PQ into the native display gamma encoding. All of that intermediate processing also needs sufficient precision to prevent bit loss, in some places even more than 14 bits, so in reality you don't always get 10 bits of precision from input all the way out through the panel. This is actually one of the areas manufacturers are working on improving (better detail near black, for example). 12-bit input, even with Dolby Vision processing, isn't going to overcome other bottlenecks in the display. What it will do, as a result of tone mapping, is reshape the tone curve so that you may get the impression of darker blacks, but it is mostly shifting where the detail loss takes place, not eliminating it.

Also, a small correction about dropping zeros: the range above 4000 nits up to 10000 nits in PQ uses less than 1 extra bit, due to the logarithmic shape of the encoding curve, so you can't simply drop zeros. As it is, televisions already take the usable range of PQ that they can actually reproduce and map it into their native gamma encoding, which accomplishes essentially the same thing. From there the question becomes how well a gamma sampling covers the display's luminance range, and whether 10 bits of precision is actually enough.
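
That "less than 1 extra bit" figure is easy to check with the published SMPTE ST 2084 (PQ) inverse EOTF; a minimal Python sketch of the calculation:

Code:
import math

# SMPTE ST 2084 (PQ) inverse EOTF: luminance in cd/m^2 -> signal value in 0..1.
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_encode(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

e_4000 = pq_encode(4000)      # ~0.903 of the full PQ signal range
print(1 - e_4000)             # ~0.097 -> roughly 10% of code values sit above 4000 nits
print(math.log2(1 / e_4000))  # ~0.15 "extra bits" spent on the 4000-10000 nit range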