2015-11-02, 17:43
(2015-11-02, 17:03)Peekstra Wrote:(2015-11-01, 22:19)XTrojan Wrote: I know 10bit is already restricted to pro cards
Nvidia has also supported 10bit output on regular consumer cards since a driver update earlier this year. However, in the end you also need to know whether your display can actually render the image at 10 bit. This can be difficult to know for sure because some displays accept 10bit video but downsample it to 8 bit, giving a false impression of its capabilities.
You can find more info in this thread: http://forum.doom9.org/showthread.php?t=172128
How do I get the "10bit" output option? I only have 8bit and 12bit available.
My TV has a 10bit panel (JU7100). I doubt it would downsample since it's the latest generation, although I don't trust Samsung.