(2016-06-20 11:00)Zokkel Wrote: Fritsch,
I hope you can help me out with some "confusion" about HDR.
Does a 10-bit-capable system automatically mean HDR support? Or is that a whole different hardware implementation?
I understand that you need 10-bit and at least HDMI 2.0 for that wider colour gamut (BT.2020), so can I conclude that Kaby Lake will support that natively?
Can I draw the same conclusion about HDR?
In other words: does a 10-bit, HDMI 2.0 system meet the requirements to play "Ultra HD Premium"-certified material? (Image resolution: 3840×2160; colour bit depth: minimum 10-bit signal; colour: BT.2020 representation; high dynamic range: SMPTE ST 2084 EOTF)
If you have a link to a good article, just point me to it. I will read it, because I don't understand that part of the whole 4K story very well...
TL;DR: IMHO it's too early to tell how HDR will be handled by Kodi, hardware manufacturers, SoC producers, etc.
This is quite a good primer on the issues surrounding HDR (clue: 10 bits isn't really enough, so you still need some processing to squeeze a 12-14 bit signal into 10 bits): http://downloads.bbc.co.uk/rd/pubs/whp/w...WHP283.pdf
Part of the paper is general, then it moves into BBC/NHK Hybrid Log-Gamma territory, I think. (Just skimmed it.)
This also looks worth a read: http://www.lightillusion.com/uhdtv.html
The ST 2084 EOTF is only part of the equation. Dolby Vision and HDR-10 both use it (BBC/NHK HLG doesn't, AIUI, as HLG is its own transfer function?)
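For concreteness, the ST 2084 "PQ" curve that HDR-10 and Dolby Vision share is just a fixed transfer function mapping code values to absolute luminance. A minimal Python sketch of the EOTF direction (normalised signal in, nits out), using the constants from the spec:

```python
def pq_eotf(e):
    """SMPTE ST 2084 (PQ) EOTF: normalised signal e in [0, 1] -> nits.

    Constants are the ones defined in the ST 2084 specification; this
    is a sketch for illustration, not a calibrated implementation.
    """
    m1 = 2610 / 16384          # 0.1593017578125
    m2 = 2523 / 4096 * 128     # 78.84375
    c1 = 3424 / 4096           # 0.8359375
    c2 = 2413 / 4096 * 32     # 18.8515625
    c3 = 2392 / 4096 * 32     # 18.6875
    ep = e ** (1 / m2)
    return 10000 * (max(ep - c1, 0) / (c2 - c3 * ep)) ** (1 / m1)

# Full-scale signal maps to the PQ peak of 10,000 nits; zero maps to black.
print(round(pq_eotf(1.0)))   # 10000
print(pq_eotf(0.0))          # 0.0
```

The point of the curve is that its code values are spaced perceptually, which is why 10 bits of PQ can cover a 0-10,000 nit range that would band badly under a conventional gamma at the same bit depth.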
The Dolby Vision approach is outlined here: http://www.dolby.com/us/en/technologies/...-paper.pdf
Looks like explicit Dolby Vision support is required (not just 10-bit decode). I assume the 12 vs 10 bit arguments are a (not so) subtle dig at HDR-10?
When it comes to HDMI output: http://www.flatpanelshd.com/news.php?sub...1457513362
AIUI HDMI 2.0a (or internal connectivity) is required for HDR-10, but Dolby Vision 'tunnels' through a standard HDMI 2.0 connection? Both are stuck with static grade metadata (one aspect of HDR, AIUI, is that metadata is sent to the display telling it what to do with the video it receives), whereas HDMI 2.1 should add scene-by-scene metadata support (though if Dolby Vision tunnels, does it even need HDMI 2.1?). Or am I misunderstanding this?
However, it appears that HDR-10 and Dolby Vision both require specific metadata handling in addition to just sending 10-bit video over HDMI, so 10-bit decode and output may not be enough.
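To make "static grade metadata" concrete for the HDR-10 case: per title, it is essentially the SMPTE ST 2086 mastering-display block plus the MaxCLL/MaxFALL content light levels. A hedged sketch of those values as a Python dataclass; the field names are my own illustration, not any real library's API:

```python
from dataclasses import dataclass

@dataclass
class Hdr10StaticMetadata:
    """Illustrative model of HDR-10's per-title (static) metadata.

    Combines the SMPTE ST 2086 mastering-display block with the
    MaxCLL/MaxFALL content light levels. Field names are hypothetical.
    """
    # ST 2086: mastering display primaries and white point as CIE (x, y).
    red_primary: tuple
    green_primary: tuple
    blue_primary: tuple
    white_point: tuple
    # ST 2086: mastering display luminance range, in nits.
    max_display_luminance: float
    min_display_luminance: float
    # Content light levels, in nits, for the whole programme.
    max_cll: int    # brightest single pixel anywhere in the content
    max_fall: int   # highest frame-average light level

# Example: a title mastered on a 1000-nit BT.2020 display. Because these
# values are static, the display gets one set for the entire programme,
# which is exactly the limitation scene-by-scene (dynamic) metadata lifts.
meta = Hdr10StaticMetadata(
    red_primary=(0.708, 0.292), green_primary=(0.170, 0.797),
    blue_primary=(0.131, 0.046), white_point=(0.3127, 0.3290),
    max_display_luminance=1000.0, min_display_luminance=0.0001,
    max_cll=1000, max_fall=400,
)
print(meta.max_cll)  # 1000
```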