2018-08-27, 18:04
(2018-08-27, 06:44)noob00224 Wrote: @Warner306 If that image is accurate, that is brutal banding. Is that HDR -> SDR or SDR?
Both "reduce banding artifacts" options in madVR were set to high; I also tried with them off and still get banding.
Also updated the driver to 398.82, no difference.
I ran the required tests to check whether the display can actually show 10-bit, and it passed. I loaded these in MPC-HC and did not see the 256 gradations (with the 10-bit software emulation/FRC disabled):
http://www.bealecorner.org/red/test-patt...-16bit.png
http://i.imgur.com/qv2qxVq.png
At 8-bit, the 256 gradations are visible.
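If anyone wants to roll their own ramp instead of relying on those linked PNGs, here is a rough pure-Python sketch. It writes a 1024-step grayscale ramp as a 16-bit PGM (the filename and dimensions are just examples, not from the test patterns above): a true 10-bit chain should show the ramp as smooth, while an 8-bit chain collapses it into 256 visible bands.

```python
import struct

# Horizontal grayscale ramp saved as a 16-bit PGM test pattern.
# 1024 distinct steps need 10 bits to display without banding.
WIDTH, HEIGHT = 1024, 256

with open("ramp_10bit.pgm", "wb") as f:
    # PGM header: magic, dimensions, maxval (65535 = 16-bit samples).
    f.write(b"P5\n%d %d\n65535\n" % (WIDTH, HEIGHT))
    # Spread the 1024 levels across the full 16-bit range.
    # PGM requires big-endian samples when maxval > 255.
    row = b"".join(struct.pack(">H", (x * 65535) // (WIDTH - 1))
                   for x in range(WIDTH))
    f.write(row * HEIGHT)
```

Most viewers and madVR itself will open a PGM directly, so you can check the chain end to end with a pattern you know is clean.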
The source where I noticed it is The Greatest Showman, in both HDR and SDR.
https://ibb.co/d4aU7p
Do you get the same outcome with both 12-bit and 8-bit output? If your display uses FRC, you would probably get the best results at 12-bit, provided the FRC algorithm is good. I can't tell you whether 12-bit or 8-bit will look best on your projector, but if you aren't seeing banding on gradient tests with dithering enabled, the projector could be creating the banding. There is a difference between rendering a simple gradient test and a scene like that; I can't understand why it would be that bad. If the GPU is simply passing the signal through, it should be madVR's fault. You could try another media player to see if it is any better. Setting chroma upscaling to Bilinear might mask some of that banding, but not much.
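To illustrate why dithering matters here: quantizing a smooth ramp to too few bits produces a handful of big, visible steps (banding), while adding dither noise before quantizing trades those steps for many tiny transitions the eye averages out. A rough pure-Python sketch, exaggerated to 4-bit for clarity (the `quantize` helper is my own illustration, not madVR's actual algorithm):

```python
import random

random.seed(0)  # deterministic for this demo

# A smooth ramp of 4096 samples in [0, 1], like the gradient tests above.
ramp = [i / 4095 for i in range(4096)]

def quantize(signal, bits, dither=False):
    """Reduce `signal` to 2**bits levels, optionally with TPDF dither
    (triangular noise about +/-1 LSB wide, added before rounding --
    the general idea behind dithered bit-depth reduction)."""
    levels = (1 << bits) - 1
    out = []
    for v in signal:
        if dither:
            v += (random.random() - random.random()) / levels
        q = round(v * levels)
        out.append(min(max(q, 0), levels))
    return out

plain = quantize(ramp, 4)           # hard 4-bit truncation: banding
dithered = quantize(ramp, 4, True)  # same bit depth, dithered: noise

steps = sum(1 for a, b in zip(plain, plain[1:]) if a != b)
noisy = sum(1 for a, b in zip(dithered, dithered[1:]) if a != b)
print(steps, noisy)  # few large bands vs. many tiny transitions
```

The undithered ramp has only 15 step edges, each a full LSB jump, which is exactly what a banded gradient looks like on screen; the dithered one breaks those edges into fine noise at the same output bit depth.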