2015-10-27, 16:41
(2015-10-27, 12:21)oldpoem Wrote: Apart from the "higher is better" mindset, I really doubt anyone could tell much (if any) difference between 8-bit, 10-bit and 12-bit output, because 99% of media files come from 8-bit sources; even Hi10P files are usually encoded with a 10-bit encoder from an 8-bit source anyway.
The major difference is RGB or 4:4:4 output; since 2160p at 24Hz isn't 60Hz, HDMI 2.0 should be capable of 2160p 24Hz 10-bit RGB.
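Rough numbers behind that claim (just a sketch, assuming CTA-861 total timings of 5500x2250 for 2160p24 and 4400x2250 for 2160p60, RGB/4:4:4 output where deep color scales the TMDS clock by bits-per-component / 8, and HDMI 2.0's 600 MHz TMDS character-rate ceiling; 4:2:0 subsampling would change the picture):

# HDMI 2.0 tops out at a 600 MHz TMDS character rate
HDMI20_MAX_TMDS_MHZ = 600

def tmds_clock_mhz(h_total, v_total, refresh_hz, bits_per_component):
    """Required TMDS character rate for an RGB / 4:4:4 signal."""
    pixel_clock_mhz = h_total * v_total * refresh_hz / 1e6
    return pixel_clock_mhz * bits_per_component / 8  # deep color scales the clock

modes = [
    ("2160p24 10-bit RGB", 5500, 2250, 24, 10),
    ("2160p60  8-bit RGB", 4400, 2250, 60, 8),
    ("2160p60 10-bit RGB", 4400, 2250, 60, 10),
]

for name, h, v, hz, bpc in modes:
    clk = tmds_clock_mhz(h, v, hz, bpc)
    verdict = "fits" if clk <= HDMI20_MAX_TMDS_MHZ else "exceeds"
    print(f"{name}: {clk:.2f} MHz ({verdict} HDMI 2.0)")

2160p24 10-bit RGB comes out at 371.25 MHz, comfortably under the limit, while 2160p60 10-bit RGB needs 742.5 MHz, which is why 60Hz is where HDMI 2.0 runs out of room.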
Does upscaling 8-bit to 10-bit really make barely any difference? There are about 4x as many levels per channel. I haven't tested it, but I do know that newer 4K Blu-rays will be 10-bit 4:4:4 or RGB (Rec. 2020), along with Dolby Vision's backlight control.
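For what it's worth, the arithmetic behind that "x4" figure (it's 4x per channel, 64x in total RGB combinations; an upscaled 8-bit source still only carries 8 bits of real picture information, the extra precision mainly helps avoid banding from processing):

# Count code values per channel and total RGB combinations for each bit depth
for bits in (8, 10, 12):
    levels = 2 ** bits          # shades per channel
    colors = levels ** 3        # possible RGB combinations
    print(f"{bits}-bit: {levels} levels/channel, {colors:,} colors")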