2018-10-07, 16:58
No, there is no loss of quality. If you wanted to be really anal, you could set up two profiles in madVR and have madVR output 8 bits when the content is above 30 fps and 10 bits when it is at 30 fps or below, eliminating any extra dithering by the GPU.
if (deintFps <= 30) "10-bit"
else "8-bit"
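The rule above can be sketched as plain code. This is just a minimal illustration of the decision, assuming `deintFps` is madVR's deinterlaced frame rate; the function name and thresholds-as-strings are my own, not madVR's:

```python
# Hypothetical sketch of the profile rule: pick madVR's output bit depth
# from the deinterlaced frame rate (madVR's deintFps variable).
def output_bit_depth(deint_fps: float) -> str:
    # At 30 fps or below, dither straight to 10 bits in madVR;
    # above 30 fps, output 8 bits instead.
    return "10-bit" if deint_fps <= 30 else "8-bit"

print(output_bit_depth(23.976))  # typical film content -> "10-bit"
print(output_bit_depth(59.94))   # high-frame-rate video -> "8-bit"
```

The point of the split is that madVR's dithering does the bit-depth reduction once, rather than madVR outputting one depth and the GPU dithering it again on the way out.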
I've never heard of an SDR display that can't handle 12 bits without creating banding. Only recent UHD displays seem to have this issue. My best guess is that they are all 10-bit panels, and 10-bit panels are new and haven't been perfected yet. I would guess they would also show some banding with a 10-bit input; the extra 2 bits shouldn't matter. The majority should handle 12-bit inputs as well as, or better than, 8-bit inputs.