2018-05-27, 11:36
So is it confirmed that the S905X and S905D impose an 8-bit footprint on a 10-bit HDR (or SDR) signal? That's a pity if so. I know lots of HDR displays use 8-bit panels with processing to hit a 10-bit dynamic range, but if you have the choice of a device that delivers a clean 10-bit path over one that goes 10->8->10 and has to use dither to hide the 8-bit quantisation errors, that's a significant issue to be aware of. Because HDR content spans a greater dynamic range, 10 vs 8 bit processing matters more there than it would for an SDR 10-bit source (which is rare in the consumer space anyway).
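To illustrate what I mean by quantisation error and dither, here's a rough numerical sketch in Python. It's not specific to any Amlogic chip; it just assumes a simple truncating requantiser and TPDF dither, which is one common approach:

```python
import numpy as np

rng = np.random.default_rng(0)
ramp10 = np.arange(1024)  # a smooth 10-bit gradient, codes 0..1023

# Plain truncation: 10 -> 8 -> 10 by dropping and restoring the two low bits.
trunc = (ramp10 >> 2) << 2
print("worst-case truncation error:", np.abs(ramp10 - trunc).max())  # 3 codes

# TPDF dither: add triangular noise about one 8-bit step wide before
# requantising, turning correlated banding into uncorrelated grain.
step = 4  # one 8-bit step spans 4 ten-bit codes
tpdf = rng.uniform(-step / 2, step / 2, 1024) + rng.uniform(-step / 2, step / 2, 1024)
dithered = np.clip(np.round((ramp10 + tpdf) / step) * step, 0, 1020).astype(int)

# Averaged over area/time the dithered signal tracks the original level,
# which is how an 8-bit pipeline can approximate 10-bit gradients.
print("mean error after dither:", np.abs(ramp10 - dithered).mean())
```

The point being: dither doesn't recover the lost two bits, it just spreads the error out so it's less visible as banding, which is why a genuinely clean 10-bit path is still preferable.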
AIUI that's a significant benefit of the S912 SoC, isn't it - it DOES offer a clean path that doesn't require an 8-bit stage somewhere in the chain. I'm assuming the Apple TV 4K supporting Dolby Vision means it has a clean path too, and I've heard no suggestion that the Nvidia Shield TV imposes an 8-bit stage either?