2020-01-26, 17:47
(2020-01-26, 12:13)Warner306 Wrote: That is what is output from the GPU, which can be different from the output from madVR.
madVR can dither HDR content from 10-bit to 8-bit without any consequences or banding, so there is nothing wrong with that. But you should set madVR to 8-bit to match the GPU; otherwise the GPU does the dithering from 10-bit to 8-bit rather than madVR.
I want the HDR content to reach the TV in 10-bit, so shouldn't it read "NV HDR 10bit" then?
The GPU is an RTX 2060.
madVR is set to pass the HDR metadata through to the TV.
When I enable HDR in Windows 10, I get "OS HDR 12bit" in the madVR overlay instead of "NV HDR 8bit".