2015-10-27, 22:25
(2015-10-27, 20:07)huhn Wrote:(2015-10-27, 16:41)XTrojan Wrote: The major difference is RGB or 4:4:4, but 2160p at 24 Hz isn't 60 Hz; HDMI 2.0 should be capable of 2160p 24 Hz 10-bit RGB.
Does an 8-bit -> 10-bit upscale really make barely any difference, even though there are about 4x as many shades? I haven't tested it, but I do know that newer 4K Blu-rays will be 10-bit 4:4:4 or RGB (Rec. 2020), along with the Dolby Vision backlight program.
UHD Blu-ray doesn't use 4:4:4 at all, and RGB isn't supported either.
The bit depth doesn't increase the number of colors; it increases the steps between each color, nothing else.
Simply converting 8-bit to 10-bit is done with padding.
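A minimal sketch of what that padding looks like (not from any specific player's source, just the standard bit-shift trick): the 8-bit value is moved into the top of the 10-bit range, optionally replicating the high bits into the new low bits so full white still maps to full white. No new color information is created either way.

```python
def pad_8_to_10(v8):
    """Zero-pad: shift the 8-bit value into the top 8 of 10 bits.
    0..255 maps to 0..1020, so full scale (1023) is never reached."""
    return v8 << 2

def replicate_8_to_10(v8):
    """Bit replication: copy the two high bits into the new low bits,
    so 0 -> 0 and 255 -> 1023 (full scale is preserved)."""
    return (v8 << 2) | (v8 >> 6)
```

Either way the result still only has 256 distinct values, which is why padded 10-bit output carries no more detail than the 8-bit source.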
10-bit output is pretty much worthless with a lossy 8-bit source, and most TVs dither it back down to 8-bit anyway.
The bit depth does increase the number of colors, but only between the existing black and white; the range itself stays the same. I believe I can see the decrease in noise on my TV vs. ordered dithering, so I don't believe it's worthless. It does appear to improve image quality, at least to my eyes.
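For reference, the "ordered dithering to 8-bit" mentioned above can be sketched like this (a generic Bayer-matrix example, not the actual algorithm any particular TV or renderer uses): the two bits being discarded are traded for a position-dependent threshold, so the rounding error is spread spatially instead of showing up as banding.

```python
# 2x2 Bayer threshold matrix with values 0..3, matching the
# 2 bits (a 0..3 remainder) dropped when going 10-bit -> 8-bit.
BAYER_2X2 = [[0, 2],
             [3, 1]]

def ordered_dither_10_to_8(v10, x, y):
    """Reduce a 10-bit value (0..1023) to 8 bits (0..255) with
    ordered dithering based on the pixel's (x, y) position."""
    threshold = BAYER_2X2[y % 2][x % 2]      # 0..3
    return min(255, (v10 + threshold) >> 2)  # round via threshold, clamp
```

A value like 514 (remainder 2 of 4) rounds up at half the pixel positions and down at the other half, so the average over the pattern approximates the original 10-bit level.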
This is what Google says:
"That sounds like plenty, and to the naked eye, it is. But subtle differences between those 256 shades, impossible with 8-bit color, can help create depth and a sense of reality. That's where deep color comes in. With 10-bit color, you get 1,024 shades of each primary color, and over a billion possible colors."
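The arithmetic behind that quote checks out and is easy to verify:

```python
shades_8  = 2 ** 8     # 256 shades per primary at 8-bit
shades_10 = 2 ** 10    # 1024 shades per primary at 10-bit

colors_8  = shades_8 ** 3   # three primaries: ~16.8 million colors
colors_10 = shades_10 ** 3  # ~1.07 billion colors
```

So 10-bit gives 4x the shades per channel and 64x the total combinations, all packed into the same black-to-white range.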