2018-03-26, 12:35
Ah yes, the old 8/10/12-bit 4:4:4 full RGB vs. YCbCr question - this is another minefield.
If you have a lot of 4K HDR material and you have an AMD card, then you want to run your PC in 10-bit mode. However, your PC won't support 4:4:4 full RGB at 60Hz in 10-bit. So you either set your desktop to 4:4:4 8-bit and hope it switches to 10-bit when you play an HDR or 4K movie, or, like me, sack it off and just run the entire show in 4:2:0 or 4:2:2 10-bit.
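For what it's worth, the reason 4:4:4/RGB 10-bit at 4K60 isn't on offer is simply link bandwidth: HDMI 2.0 carries roughly 14.4 Gbit/s of actual payload (18 Gbit/s minus encoding overhead), and 10-bit RGB at the standard 594 MHz 4K60 pixel clock needs more than that. Here's a rough back-of-envelope sketch - the bits-per-pixel factors are simplified and ignore exactly how HDMI packs 4:2:2, so treat the numbers as approximate rather than gospel:

# Rough back-of-envelope: why 4K60 10-bit RGB/4:4:4 doesn't fit over HDMI 2.0.
# Assumptions: 594 MHz pixel clock for 4K60 (CTA-861 timing incl. blanking)
# and a ~14.4 Gbit/s HDMI 2.0 payload cap (18 Gbit/s TMDS minus 8b/10b overhead).

PIXEL_CLOCK_HZ = 594_000_000         # 3840x2160@60 with blanking (4400 x 2250)
HDMI20_PAYLOAD_BPS = 14_400_000_000  # usable payload after 8b/10b encoding

SAMPLE_FACTOR = {                    # bits per pixel = bit depth x factor
    "RGB / 4:4:4": 3.0,
    "4:2:2":       2.0,
    "4:2:0":       1.5,
}

for fmt, factor in SAMPLE_FACTOR.items():
    for depth in (8, 10, 12):
        rate = PIXEL_CLOCK_HZ * depth * factor
        verdict = "fits" if rate <= HDMI20_PAYLOAD_BPS else "too much"
        print(f"{fmt:12s} {depth:2d}-bit: {rate/1e9:5.2f} Gbit/s -> {verdict}")

Run that and 8-bit RGB squeaks in at about 14.26 Gbit/s, 10-bit RGB wants roughly 17.8 Gbit/s (hence no option for it), while 10-bit 4:2:2 and 4:2:0 fit with room to spare - which matches what the driver actually lets you pick.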
When I run my PC in 4:2:2 10-bit I get dropouts now and again, and I haven't been able to track down where they come from (I'm on my third set of HDMI cables), so I run my PC in 4:2:0. I'm aware this can reduce quality due to the extra conversion from 4:2:0 to 4:4:4 RGB for output, but I see no difference in quality at all - nada, nothing.
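On that extra conversion: 4:2:0 stores the colour (Cb/Cr) planes at quarter resolution, so the player has to scale them back up to full resolution before converting to RGB for output. A toy sketch of that step - nearest-neighbour repeat shown purely for illustration; real renderers use better filters (bilinear/bicubic), which is where any visible quality difference would come from:

import numpy as np

def chroma_420_to_444(y, cb, cr):
    """y: (H, W) luma plane; cb/cr: (H/2, W/2) subsampled chroma planes."""
    cb_full = np.repeat(np.repeat(cb, 2, axis=0), 2, axis=1)  # upsample x2
    cr_full = np.repeat(np.repeat(cr, 2, axis=0), 2, axis=1)
    return np.stack([y, cb_full, cr_full], axis=-1)           # 4:4:4 YCbCr

# Toy 4x4 frame: full-res luma, quarter-res chroma
y  = np.arange(16, dtype=np.uint8).reshape(4, 4)
cb = np.full((2, 2), 128, dtype=np.uint8)
cr = np.full((2, 2), 128, dtype=np.uint8)
print(chroma_420_to_444(y, cb, cr).shape)  # (4, 4, 3)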
For now, anyway, I've opted for stability and simplicity over the best possible picture quality, as the difference must be so tiny in real-world viewing that I can't see it from 8ft away on my 65-inch TV, so this argument is moot for me.