post #1 of 3 - 12-24-2018, 06:22 PM - Thread Starter
MANiaC3173 (Member)

Hi guys! I was just about to spoil myself with a new monitor, the LG 32GK850F, to replace my 5-year-old 32" Sammy UE32E6100. The LG looks stunning and going by the specs everything seemed great, but moments before purchasing I realised some people point out that this is an 8-bit display and hence not true HDR. LG's site boasts that the display is HDR capable (HDR10, VESA DisplayHDR 400), but what will I be missing out on if I don't get a 10-bit display? I have some HDR content encoded in 10-bit; will it show as intended on an 8-bit display like this? I'm guessing not, hence my hesitation to buy it.
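To put the VESA DisplayHDR 400 figure in context, here is a rough sketch (plain Python, nothing from LG; the 400/600/1000-nit peaks are just illustrative tiers) of the SMPTE ST 2084 "PQ" transfer function that HDR10 uses, estimating how much of the 0-10,000 nit signal range a panel of a given peak brightness can show before it has to clip or tone-map:

```python
# Rough sketch: how much of the HDR10 (SMPTE ST 2084 / "PQ") signal range a
# display of a given peak brightness can reproduce before clipping or
# tone mapping. The 400 / 600 / 1000 nit peaks below are illustrative tiers.

# PQ constants from SMPTE ST 2084
M1, M2 = 2610 / 16384, 2523 / 4096 * 128
C1, C2, C3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32

def pq_signal(nits: float) -> float:
    """Map absolute luminance in nits (0..10000) to a PQ signal value in 0..1."""
    y = min(max(nits / 10000.0, 0.0), 1.0)
    return ((C1 + C2 * y ** M1) / (1 + C3 * y ** M1)) ** M2

for peak in (400, 600, 1000, 10000):
    print(f"{peak:>5} nit peak ~ {pq_signal(peak):.0%} of the PQ signal range")
```

By that math a 400-nit panel tops out around two thirds of the PQ code range, so the brightest highlights in HDR10 content get clipped or tone-mapped no matter what the bit depth is.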

If I should hold off on this display, is there an alternative? It needs to be 32", 1440p and HDR.

Many thanks!

Last edited by MANiaC3173; 12-24-2018 at 06:41 PM. Reason: Grammar

post #2 of 3 - 12-26-2018, 03:40 AM - Thread Starter
MANiaC3173 (Member)

Ah, I've decided to abandon this altogether. Too much configuration to get my head around. I have an i5-6500 and a GTX 1060 6GB, and with this monitor I presume I wouldn't even be able to benefit from its HDR feature, because FreeSync 2 is an AMD technology, which means I'd have to change my GPU or swap my CPU for an APU. What a bummer!

post #3 of 3 - 01-01-2019, 08:20 PM
obveron (Advanced Member)

The biggest issue is that the monitor doesn't have a FALD (full-array local dimming) backlight, which is pretty much essential to display HDR properly. That aside, it is a very nice monitor.
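To illustrate why local dimming matters, here is a toy sketch (NumPy; the 3000:1 native contrast and the 12 x 32 zone grid are assumptions, not this panel's specs). With one global backlight, a single small highlight forces the backlight up for the whole frame and lifts the black level everywhere; with per-zone dimming only the zones containing the highlight are affected:

```python
# Toy model of global dimming vs. FALD for an HDR frame: a small 1000-nit
# highlight on a near-black background, shown on a panel whose liquid crystal
# layer manages 3000:1 contrast at any fixed backlight level (an assumption).
import numpy as np

NATIVE_CONTRAST = 3000
frame = np.full((1080, 1920), 0.05)      # target luminance in nits (near black)
frame[500:520, 940:980] = 1000.0         # small bright highlight

def black_level(backlight_nits):
    """Darkest luminance the panel can show while its backlight is at this level."""
    return backlight_nits / NATIVE_CONTRAST

# Edge-lit / global dimming: one backlight level for the whole frame,
# set high enough to reproduce the brightest pixel.
global_black = black_level(frame.max())

# FALD: each zone's backlight only needs to match the brightest pixel in that
# zone (12 x 32 = 384 zones here, purely for illustration).
zone_h, zone_w = 1080 // 12, 1920 // 32
zones = [frame[r:r + zone_h, c:c + zone_w]
         for r in range(0, 1080, zone_h)
         for c in range(0, 1920, zone_w)]
fald_blacks = [black_level(z.max()) for z in zones]

print(f"global backlight black level: {global_black:.3f} nits across the whole frame")
print(f"FALD black level            : {min(fald_blacks):.5f} nits in dark zones, "
      f"{max(fald_blacks):.3f} nits only in the highlight zone(s)")
```

That raised black floor in dark scenes is essentially the gap being described here, and it has nothing to do with bit depth.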

Your concerns over 8-bit vs. 10-bit are misplaced. Very few displays are true 10-bit. The key is that this one has a wide gamut that nearly covers the DCI-P3 color space. HDR10 will be accurately converted to 8-bit by the panel. Yes, there will be some banding because it's not a true 10-bit panel, but that's not as bad as it sounds.
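As a rough illustration of that conversion (a NumPy sketch, not the panel's actual processing): quantizing a smooth 10-bit ramp straight to 8 bits turns it into a staircase, which is the banding, while adding about one least-significant bit of dither before rounding trades the wide bands for fine noise whose local average still tracks the original signal:

```python
# Sketch: converting a smooth 10-bit gradient to 8 bits, with and without
# dithering. The band-width metric below is just a crude visibility proxy.
import numpy as np

rng = np.random.default_rng(0)

# A smooth horizontal ramp as it might arrive in a 10-bit HDR10 signal
# (1024 code values spread across a 1920-pixel-wide gradient).
ramp10 = np.linspace(0, 1023, 1920)

# Naive conversion to 8 bits: neighbouring 10-bit codes collapse onto the same
# 8-bit code, so the smooth ramp becomes a staircase -- that is the banding.
hard8 = np.clip(np.round(ramp10 / 4), 0, 255).astype(int)

# Dithered conversion (roughly what GPUs and panels do): add about 1 LSB of
# noise before rounding, so each pixel rounds up or down at random and the
# local average still tracks the original 10-bit value.
dither8 = np.clip(np.round(ramp10 / 4 + rng.uniform(-0.5, 0.5, ramp10.size)),
                  0, 255).astype(int)

def widest_band(codes):
    """Length in pixels of the longest run of identical output codes."""
    runs, current = [1], codes[0]
    for c in codes[1:]:
        if c == current:
            runs[-1] += 1
        else:
            runs.append(1)
            current = c
    return max(runs)

print("widest band, hard 8-bit quantization:", widest_band(hard8), "pixels")
print("widest band, dithered 8-bit         :", widest_band(dither8), "pixels")
```

That noise-for-banding trade is why an 8-bit panel with dithering is generally considered close enough for HDR10, with the residue mostly showing up in slow gradients like skies.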


Also, your GeForce 1060 can work fine with HDR10 on this panel. You only lose the variable refresh rate feature by not using an AMD GPU. If you're not too concerned with VRR, then there's no issue with your GeForce card.
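For a feel of what VRR actually buys you, here is a toy frame-pacing simulation (Python/NumPy; the 144 Hz refresh and steady 100 fps render rate are assumed numbers, and it ignores buffering details). Without VRR a finished frame has to wait for the next scheduled refresh, so frames alternate between being shown for one refresh and for two; with VRR the display refreshes when the frame is ready:

```python
# Toy frame-pacing model: fixed 144 Hz refresh vs. variable refresh rate (VRR)
# for a game rendering at a steady 100 fps. All numbers are assumptions.
import numpy as np

REFRESH_HZ = 144
VSYNC_PERIOD = 1000.0 / REFRESH_HZ         # ms between fixed refreshes (~6.94 ms)
render_times = np.full(200, 10.0)          # 10 ms per frame = 100 fps
finish = np.cumsum(render_times)           # when each frame is ready, in ms

# Fixed refresh (no VRR): a finished frame is held until the next scheduled
# refresh, so the image only changes on multiples of the vsync period.
shown_fixed = np.ceil(finish / VSYNC_PERIOD) * VSYNC_PERIOD
durations_fixed = np.diff(shown_fixed)

# VRR: the display refreshes the moment the frame is ready, so each frame is
# on screen for exactly its render time.
durations_vrr = np.diff(finish)

print("on-screen frame times without VRR:", np.unique(np.round(durations_fixed, 2)), "ms")
print("on-screen frame times with VRR   :", np.unique(np.round(durations_vrr, 2)), "ms")
```

The alternating ~7 ms / ~14 ms cadence in the first line is the judder you give up smoothing out by staying on the GeForce card; the HDR10 signal itself does not depend on VRR.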


If you want absolutely perfect HDR on a PC display, you are right to pass, but not for the reasons you posted. The only real problem with HDR on this display is the lack of FALD.