Win HOW TO - Kodi 2D - 3D - UHD (4k) HDR Guide Internal & External Players ISO Menus
(2018-08-27, 06:48)Mount81 Wrote:
I have read on AVS that some TVs, even ones with a 10-bit panel, show less banding with 8-bit HDR input than with 10-bit HDR input. The explanation was that these TVs somehow dither 8-bit input better than 10-bit input, so even when both the source video and the panel are 10-bit, setting the output to 8-bit simply looks better. How common could this be? Or is it mostly true of the "fake" (8-bit + FRC) 10-bit panels? To make it more confusing, I have also read a couple of times that even when both the video and the display are only 8-bit, you can still get slightly better, less banded picture quality with the output set to 10-bit (I'm just guessing, but maybe the playback device does some "upscaling dithering" in that case). So what is really behind these contradicting observations and opinions?

I have seen plenty of 10-bit vs 8-bit comparison images in articles before, and they convinced me of the clearly perceptible advantages of a (truly) 10-bit picture over the 8-bit version, above all the much smoother, banding-free color transitions. If I ever buy a new 10-bit / HDR TV, one of the main reasons will be those expected advantages in color gradation, since banding is sometimes very noticeable on my current FHD 8-bit set (Samsung H6400). So I would definitely like to avoid being disappointed on that front.
 An 8-bit display with FRC can sometimes render smoother gradients than a 10-bit panel, but it does this by adding nearly invisible noise. So it is cheating.
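
To see what that added noise actually buys, here is a rough Python sketch (purely illustrative, nothing like the real processing inside a TV; the gradient range, image size and ~1 LSB dither amplitude are all made up for the demo) of quantizing a shallow ramp to 8 bits with and without dither:

```python
import numpy as np

rng = np.random.default_rng(0)

# A shallow, banding-prone luminance ramp (256 rows x 1920 columns), values in 0..1
ramp = np.tile(np.linspace(0.02, 0.12, 1920), (256, 1))

def quantize(x, bits):
    levels = 2 ** bits - 1
    return np.clip(np.round(x * levels), 0, levels) / levels

# Plain 8-bit quantization: the ramp collapses into a staircase of flat bands
plain = quantize(ramp, 8)

# The same ramp with roughly 1 LSB of random noise added before quantization:
# the bands break up into fine grain instead
lsb = 1.0 / 255
dithered = quantize(ramp + (rng.random(ramp.shape) - 0.5) * lsb, 8)

# Averaging each column stands in for the eye blurring the grain together:
# the dithered version tracks the true ramp, the plain one keeps the staircase error
true_row = ramp[0]
print("max deviation from true ramp, plain 8-bit:   ",
      np.abs(plain.mean(axis=0) - true_row).max())
print("max deviation from true ramp, dithered 8-bit:",
      np.abs(dithered.mean(axis=0) - true_row).max())
```

Per pixel the dithered image is actually noisier, but the error is turned into fine grain instead of visible steps, which is the same trade-off FRC makes.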

Every display uses its own proprietary processing and may process the input at a different bit depth, producing different results. You would have to read a review of a given display, check the results of the 10-bit gradient test and look for any mention of banding or posterization with actual content. It could be more than just bit depth that causes display banding. Sony and Samsung are known not to release the bit depth of their panels, and it is quite likely they are still mostly 8-bit + FRC if they are 120 Hz.

In terms of what we can see, 8 bits with dithering and 10 bits with dithering on a good display would look almost identical. There are 16.7 million color shades in an 8-bit source without dithering, and human beings can only distinguish around 10 million colors across the visible spectrum. If you can smooth the difficult gradients, the two should be nearly identical. Shots showing a huge advantage for 10-bit video would be doctored for marketing purposes. If you master and compress a UHD video, though, you need at least 10 bits or the video would come apart and show banding by the time it arrives on the disc.
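
For anyone wondering where the 16.7 million figure comes from, it is just the per-channel step count cubed; a quick sketch of the arithmetic:

```python
# Steps per channel at each bit depth, cubed for the R*G*B combinations
for bits in (8, 10):
    steps = 2 ** bits
    print(f"{bits}-bit: {steps} steps per channel, "
          f"{steps ** 3:,} total shades, "
          f"smallest step = 1/{steps - 1} of full range")
```

That prints 16,777,216 shades for 8-bit and 1,073,741,824 for 10-bit; the extra shades matter more for mastering headroom than for what the eye can pick apart on a dithered display.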

With that said, 10 bits does offer a very small theoretical advantage because it has a slightly lower noise floor than an 8-bit source with dithering. I would prefer a 10-bit panel, but it likely wouldn't matter if it were an 8-bit FRC panel with good processing.
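
If you want to put rough numbers on that noise floor, here is a quick, purely illustrative comparison (the synthetic test signal and ~1 LSB dither amplitude are my own assumptions) of the RMS quantization error for plain 10-bit versus dithered 8-bit:

```python
import numpy as np

rng = np.random.default_rng(0)
signal = rng.random(1_000_000)          # arbitrary full-range test values in 0..1

def quantize(x, bits, dither=False):
    levels = 2 ** bits - 1
    if dither:
        x = x + (rng.random(x.shape) - 0.5) / levels   # roughly 1 LSB of random dither
    return np.clip(np.round(x * levels), 0, levels) / levels

rms_10bit = np.sqrt(np.mean((quantize(signal, 10) - signal) ** 2))
rms_8bit_dithered = np.sqrt(np.mean((quantize(signal, 8, dither=True) - signal) ** 2))
print(f"RMS error, plain 10-bit:   {rms_10bit:.6f}")
print(f"RMS error, dithered 8-bit: {rms_8bit_dithered:.6f}")
```

The 10-bit error comes out lower, but both are small and the 8-bit error is spread as random grain, which is why the two look so close in practice on a good display.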