2018-04-06, 06:42
I just went through this a week ago, so I'll share. When I use 12-bit output with Nvidia, it introduces banding. You should check what happens with AMD. If your display genuinely supports 12-bit, this may not be a problem no matter which GPU you use, but it's still worth a check. Also consider matching the bit depth of the source, whether that's 8, 10, or 12-bit, and report your findings. All SDR is 8-bit, most HDR is 10-bit, and I believe Dolby Vision is 12-bit. A good test is the movie Allied (2016), the scene from about 2:15 through 3:00. If you see bands in the sky, reduce the output from 12-bit to 10 or even 8; if you don't, leave it at 12-bit.
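If you want something more controlled than a movie scene, a simple gradient test pattern makes banding easy to spot. Here's a minimal sketch (assuming Python with NumPy and Pillow installed; the filenames and level counts are just examples) that renders a smooth grayscale ramp plus a deliberately coarser one, so you can see what banding looks like and compare it against what your display shows:

```python
# Sketch: generate grayscale ramp test patterns to check for banding.
# Assumes NumPy and Pillow are available; resolution and level counts are arbitrary.
import numpy as np
from PIL import Image

WIDTH, HEIGHT = 1920, 1080

# Smooth ramp from black to white across the full screen width.
ramp = np.linspace(0.0, 1.0, WIDTH, dtype=np.float64)

def quantized_row(levels: int) -> np.ndarray:
    """Round the ramp to `levels` distinct steps, then scale to 8-bit for saving."""
    stepped = np.round(ramp * (levels - 1)) / (levels - 1)
    return np.uint8(stepped * 255)

# 256 levels approximates an 8-bit output; 64 levels exaggerates banding for reference.
for levels in (256, 64):
    row = quantized_row(levels)
    img = Image.fromarray(np.tile(row, (HEIGHT, 1)), mode="L")
    img.save(f"gradient_{levels}_levels.png")
```

Display the 256-level image full screen at each output setting (12, 10, 8-bit); if visible steps appear in the smooth ramp where there were none before, that output mode is introducing banding.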
"Is that what you select? 23hz RGB 444 12 bit? and then RGB 444 8bit for 60hz?"
Yes, that is what you want to achieve unless banding occurs. Honestly, the difference between 8-bit, 10-bit, and 12-bit is slight to unnoticeable.
"Is that what you select? 23hz RGB 444 12 bit? and then RGB 444 8bit for 60hz?"
Yes, that is what you want to achieve unless banding occurs. Honestly the difference between 8bit, 10bit, and 12bit is slight to unnoticeable.