Originally Posted by hk2000
I can't believe how many "videophiles" are fooled by this!!! HDR looks FAKE and very unrealistic, I for one would never turn it on, let alone pay extra for it.
I believe you're under the impression that we're talking about the HDR feature on phones. We're not; this thread has nothing to do with that form of HDR. Go read up on HDR for televisions so you can understand what we're talking about. It's a completely different thing.
Originally Posted by Gillietalls
IMO 1080p was good enough, and I believe most people are with me on that. The bump in resolution hasn't made that much of a difference, and HDR10 in its open standard leaves too many variables on whether it will look good or not. IMO the manufacturers should have found a way to improve 1080p material; adding Rec.2020 color and high frame rates to regular Blu-rays would have been great. Again, just my opinion.
It was good enough for most people, since screen size and viewing distance (most of the time) cancel out the benefits in many situations.
But you're missing the point. Is there really any reason to stop increasing resolution as long as it's free? We know we'll get some kind of benefit until we reach close to phone-like PPI.
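To put a rough number on "close to phone-like PPI", here's a quick back-of-the-envelope sketch. The 60 pixels-per-degree limit for 20/20 vision, the 65" screen, and the 8-foot seating distance are just my assumed numbers for illustration, not anything official:

```python
import math

# Rough rule of thumb: ~60 pixels per degree is about the limit of 20/20 vision.
ACUITY_LIMIT_PPD = 60

def pixels_per_degree(h_resolution, screen_width_in, distance_in):
    """How many horizontal pixels fall within one degree of visual angle."""
    pixels_per_inch = h_resolution / screen_width_in
    # Width, in inches, covered by one degree of visual angle at this distance.
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

# Example: a 65" 16:9 screen (~56.7" wide) viewed from 8 feet (96").
for h_res, label in [(1920, "1080p"), (3840, "4K")]:
    ppd = pixels_per_degree(h_res, 56.7, 96)
    verdict = "past what most eyes can resolve" if ppd >= ACUITY_LIMIT_PPD else "still resolvable"
    print(f"{label}: {ppd:.0f} px/deg ({verdict})")
```

Shrink the distance or grow the screen and 4K starts to matter; at phone-like PPI and arm's length the numbers shoot way past that limit, which is the point where more resolution stops buying anything.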
A lot of people in this sub seem to have a hard time grasping that video has been on the information-technology bandwagon for years now. There comes a point where more pixels are no challenge for the manufacturer or content provider anymore, so why keep producing the old televisions when the new ones can be better for free?
Regarding SDR with Rec.2020 support and calling it a day... if you're going to change standards, why stop there? We've had most of the components that make up an "HDR TV" for years now. You have to remember that SDR was designed with CRTs in mind, and those haven't been around for about 15 years now.
I'm glad that, if they were going to push a new format out, they took all of the advancements we've made over the last two decades and included them in the new standard.
Originally Posted by goksucats
It's like you're speaking a different language than the one I've been trying to learn as I attempt to grasp everything that is 10-bit and HDR! I thought the very point of the move from an 8-bit to a 10-bit panel was the wide color gamut? So you're saying Samsung 6-series TVs, which claim to have HDR, also have 10-bit panels but not a wider color gamut? Why have a 10-bit panel if you're only displaying Rec.709 color? I thought WCG was necessary to eliminate banding and account for the brighter screens?
If what you're saying is true, then apparently everything I've tried to pick up over the course of the last year is completely incorrect. I have seen people say that TVs like Samsung's 6 series claim to be HDR compatible but lack a wider color gamut, so you're missing much of the benefit of HDR. But what you're saying flies in the face of everything I feel like I've learned. I'm not saying you're incorrect, but I am saying I'm totally confused now.
Yes, most low-end 2016 HDR televisions (not just Samsung) have 10-bit panels but no WCG, because they don't have quantum dots.
10-bit helps a lot with banding, regardless of the color gamut you're using. I've seen the difference between 10-bit HDR and 8-bit HDR on my TV playing Shadow Warrior 2 (a PC game with HDR support): 10-bit looks perfectly smooth, while 8-bit looks like crap because of all the banding. So the banding comes more from the extra dynamic range than from the color gamut.
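If you want to see why the extra dynamic range is what exposes the banding, here's a quick sketch using the standard PQ (ST.2084) curve that HDR10 is built on. Comparing the step size near 100 nits is just my choice of an illustrative level:

```python
# PQ (SMPTE ST.2084) EOTF: converts a normalized code value into luminance (nits).
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_to_nits(code, bits):
    e = code / (2 ** bits - 1)            # normalized code value, 0..1
    p = e ** (1 / M2)
    return 10000 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

# Size of one quantization step around a mid-bright ~100-nit level.
for bits in (8, 10):
    code = min(range(2 ** bits - 1), key=lambda c: abs(pq_to_nits(c, bits) - 100))
    step = pq_to_nits(code + 1, bits) - pq_to_nits(code, bits)
    print(f"{bits}-bit: one step near 100 nits is about {step:.2f} nits")
```

With 8 bits the whole 0-10000 nit PQ range has to fit into 256 steps, so neighboring codes end up several nits apart in the mid-tones and you see the bands; with 10 bits you get 1024 steps and the jumps shrink to roughly a quarter of that.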
The thing to realize here is that only a TV with WCG and 1000 nits is showing (or should be showing) the image as it was graded.
Everything else is using software to compensate for its shortcomings, and some brands have better algorithms than others.
For instance, I believe only Samsung sets do proper tone mapping for anything above 1000 nits in HDR10. They also do pretty good gamut conversion, so colors don't look off when the content uses a different gamut. (Of course, you're bound to miss some colors regardless, especially "man-made" colors like neon; most "natural" colors are fine within Rec.709.)
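For a rough picture of what that kind of tone mapping does, here's a generic highlight roll-off sketch. I'm not claiming this is Samsung's actual algorithm; the 75% knee point and the linear roll-off are just assumptions for illustration:

```python
def tone_map(nits, display_peak=1000.0, content_peak=4000.0, knee=0.75):
    """Generic roll-off: dark and mid tones pass through 1:1, and highlights
    above the knee get compressed so content graded brighter than the panel
    still fits without hard clipping."""
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    # Squeeze [knee_nits .. content_peak] linearly into [knee_nits .. display_peak].
    t = (nits - knee_nits) / (content_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

for n in (100, 500, 1000, 2000, 4000):
    print(f"{n:>5}-nit source -> {tone_map(n):4.0f} nits on a 1000-nit panel")
```

A set that skips this step just clips everything above its peak to the same white, which is why the brands with better algorithms keep more highlight detail.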
But even with a Rec.709 gamut, you still get a lot more color volume in HDR than in SDR, so those sets still show many more colors, and SDR looks very washed out in comparison. (I've tested this extensively in Resident Evil 7; the colors are much more vivid in HDR, even with just the 709 gamut.)
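A crude way to picture "more color volume with the same gamut": volume is roughly gamut area times luminance range, and HDR raises the luminance ceiling by about 10x. The numbers here are purely illustrative:

```python
# Treat the Rec.709 chromaticity triangle as one unit of gamut area.
REC709_GAMUT_AREA = 1.0

sdr_volume = REC709_GAMUT_AREA * 100    # SDR is graded to ~100 nits peak
hdr_volume = REC709_GAMUT_AREA * 1000   # HDR10 on a 1000-nit panel

print(f"HDR vs SDR color volume, same 709 gamut: ~{hdr_volume / sdr_volume:.0f}x")
```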
When it comes to Dolby Vision, there are zero sets out there that can play the content the way it was graded. All Dolby Vision content is graded at 4000 nits, so every single Dolby Vision television out there is doing tone mapping to compensate.
But that's a good thing, because unlike HDR10, it means you know you're going to get the tone mapping regardless of what brand of TV you bought, as long as it supports DV.
With HDR10, you're kind of at the mercy of how good your manufacturer's software is.
In the case of Samsung, I know it's pretty good, since it does gamut remapping and tone mapping. And as soon as HDMI 2.0a gets upgraded via firmware on all 2016 models, we'll also get dynamic metadata, just like Dolby Vision.
Dynamic metadata is meant to help with low-APL scenes by making the whole color gamut available even at low APL. (In HDR10 or SDR, what happens is you only have a fraction of the gamut available in a low-APL scene.)
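Here's a simplified, luminance-only sketch of that idea, reusing the same generic roll-off from the earlier example. The 900-nit scene highlight and the 4000-nit movie-wide grade are made-up numbers for illustration:

```python
def tone_map(nits, display_peak=1000.0, content_peak=4000.0, knee=0.75):
    # Same generic roll-off as above: 1:1 below the knee, then linear
    # compression of [knee .. content_peak] into [knee .. display_peak].
    knee_nits = knee * display_peak
    if nits <= knee_nits:
        return nits
    t = (nits - knee_nits) / (content_peak - knee_nits)
    return knee_nits + t * (display_peak - knee_nits)

scene_highlight = 900.0   # hypothetical brightest pixel in a darker, low-APL scene

# Static HDR10 metadata: the curve is built once for the movie's 4000-nit grade,
# so even this 900-nit highlight gets squeezed below what the panel can show.
static = tone_map(scene_highlight, content_peak=4000.0)

# Dynamic metadata: the curve is rebuilt per scene from that scene's own peak;
# 900 nits fits under the 1000-nit panel, so nothing needs compressing.
dynamic = scene_highlight

print(f"static metadata:  900-nit highlight displayed at {static:.0f} nits")
print(f"dynamic metadata: 900-nit highlight displayed at {dynamic:.0f} nits")
```

The same logic applies to saturated bright colors, which is where the "fraction of the gamut" part comes in: if the curve is built for the whole movie, a dark scene never gets to use the top of the panel's color volume.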
I hope I cleared up some of your doubts.