Android "Google Chromecast with Google TV" dongle with a new "Google TV" ecosystem and UI
(2021-01-21, 22:13)Tamas.Toth.ebola Wrote:
Quote:HDR content can exist in both wide Rec 2020 and narrow Rec 709 gamut...
This is exactly what I wrote about still images. Basically, any image that stores more than 8 bits of luminance resolution in the file itself is defined as an HDR image, independently of its target colour gamut and independently of its target real-world luminance. Sadly, as I wrote, in this case there is no standard defining that real-world luminance, so in the end there can be 10-bit images that have no real high dynamic range in their content and no wide colour gamut information either _(I mean referenced to the source of the image)_.
In video - bit depth has no relationship to display dynamic range - 10-bit video can be used to carry both SDR and HDR content. In consumer video, DVD and HD Blu-ray are both 8-bit formats, but I have quite a lot of 10-bit SDR HD content that I have mastered myself from 10-bit broadcast video masters (with massively reduced banding artefacts as a result). In consumer devices, h.264/AVC decode is usually limited to 8-bit, but h.265/HEVC can be 10-bit (in both SDR and HDR dynamic ranges). My SDR 10-bit HD content is in Rec 709 gamut, in h.265/HEVC.

In video - there should always be a real-world dynamic range that is either implicit or defined. For Rec 601 and Rec 709, if no EOTF is defined in the video, then it is expected to be a Power Law Gamma (of around 2.2-2.4, I believe), as that is the implicit definition in the respective Rec 601 and Rec 709 standards.

If BT.1886 is defined as the EOTF, that is the newer SDR standard (which is often used for SDR Rec 2020 WCG content).
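
As a rough illustration (my own sketch, not quoted from either spec - function and variable names are mine): a plain power-law EOTF, and the BT.1886 formula, which adds black-level handling and collapses to a pure 2.4 gamma when the display black is zero.

# Hedged sketch of the two SDR EOTFs discussed above (Python)
def power_law_eotf(v, gamma=2.4):
    # v = normalised signal (0..1); returns relative display light (0..1)
    return v ** gamma

def bt1886_eotf(v, lw=100.0, lb=0.0):
    # lw/lb = display white/black luminance in nits; exponent fixed at 2.4
    g = 2.4
    delta = lw ** (1 / g) - lb ** (1 / g)
    a = delta ** g
    b = (lb ** (1 / g)) / delta
    return a * max(v + b, 0.0) ** g  # nits; with lb=0 this is lw * v**2.4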

If ST.2084 is defined as the EOTF, then this is the PQ standard, which defines a precise relationship between a display pixel's absolute light output (in nits) and the source video value. This allows you to set a pixel to 100 nits, 50 nits, 500 nits etc. (with the display handling pixel values that it cannot display correctly).
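
The ST.2084 curve itself is fully specified, so here is a minimal Python sketch of that absolute signal-to-nits mapping (constants are straight from the standard; function names are my own):

# SMPTE ST.2084 (PQ): absolute mapping between signal (0..1) and nits
M1 = 2610 / 16384            # 0.1593017578125
M2 = 2523 / 4096 * 128       # 78.84375
C1 = 3424 / 4096             # 0.8359375
C2 = 2413 / 4096 * 32       # 18.8515625
C3 = 2392 / 4096 * 32       # 18.6875

def pq_to_nits(signal):
    p = signal ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def nits_to_pq(nits):
    y = (nits / 10000.0) ** M1
    return ((C1 + C2 * y) / (1 + C3 * y)) ** M2

# e.g. nits_to_pq(100) is roughly 0.508 - diffuse white sits just above
# mid-signal, leaving the top half of the signal range for highlights.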

If the HLG (ARIB STD-B67) EOTF is defined, then this flags that the video is in Hybrid Log Gamma format, which doesn't mandate a fixed light-output to pixel-value mapping, but does define how the content should be displayed on a high dynamic range display.

HLG is backwards compatible with SDR (the EOTF tracks the SDR Power Law Gamma EOTF for around 60-70% of the dynamic range) within the same colour gamut, whereas ST.2084 is totally different to SDR Power Law Gamma EOTF.
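
A sketch of the HLG OETF (constants per BT.2100, names mine) shows why the lower range is SDR-friendly: below half signal level it is a pure square-root segment - i.e. a gamma-2.0-style power law - with the log segment only taking over for highlights. (The "60-70%" figure above is my rough estimate.)

import math

# HLG OETF: scene-linear light (0..1) -> signal (0..1), per BT.2100
A = 0.17883277
B = 1 - 4 * A                    # 0.28466892
C = 0.5 - A * math.log(4 * A)    # 0.55991073

def hlg_oetf(e):
    if e <= 1 / 12:
        return math.sqrt(3 * e)          # power-law segment (signal <= 0.5)
    return A * math.log(12 * e - B) + C  # log segment for highlights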

In other words - HLG treated as SDR looks 'OK' (within the same colour gamut), HDR10 doesn't.
Quote:
Quote:- though in consumer video pretty much all HDR content is Rec 2020.
And here we are. It seems that in consumer practice HDR content is content that uses at least 10-bit resolution to store luminance values and has a wide colour gamut (perhaps P3 in HDR10 and BT.2020 in Dolby Vision, but you know it better). So technically a 10-bit file originally targeted at BT.2020 is in theory an HDR file. On SDR display devices these files need DCR (its type is irrelevant). And at this point we haven't even talked about the tone curve needed to store those colour*luminance values properly aligned to real-world luminances - just the facts that the file is 10-bit and its colour values are targeted at the BT.2020 RGB primaries. And here I deliberately go no further, because from this point the viewing environment is just as important as the file content itself (and the display device). For proper dynamic range we should define one exact viewing environment and use a display device that can really reach the required luminance levels, etc. I think (and you know) Dolby Vision and HDR10 as standards contain these types of regulations. But from the aspect of Kodi we sadly have not arrived here yet.

Afraid I don't understand what you are saying here. HDR10 and Dolby Vision (if you ignore their IQ variant) don't include any tailoring for the viewing environment - they are PQ-based (a 1:1 defined mapping between display light output and video signal levels), with additional metadata in the stream to let the display optimise its output for out-of-range values (i.e. a display that can only go to 700 nits has to be able to cope with 1000-nit sources and display them in a way that looks OK). These standards are 'display referred'. HLG has ambient viewing conditions compensation incorporated into the standard, doesn't mandate a 1:1 light-output to pixel-value EOTF, and is 'scene referred' (as is SDR Power Law Gamma), which gives the standard a lot more flexibility in end-user options for display within the standard.
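
To illustrate that out-of-range handling: a toy highlight roll-off of my own (real displays each use their own proprietary tone mapping, often steered by the HDR10 metadata - the names and the 75% knee here are arbitrary):

import math

def toy_tone_map(nits, display_peak=700.0, knee_frac=0.75):
    # Pass everything below the knee through 1:1, then compress the rest
    # so arbitrarily bright input asymptotically approaches the peak.
    knee = knee_frac * display_peak
    if nits <= knee:
        return nits
    span = display_peak - knee
    return knee + span * (1.0 - math.exp(-(nits - knee) / span))

# toy_tone_map(1000.0) -> ~688 nits on a 700-nit display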

Dolby Vision includes a new-ish "IQ" variant I believe - which is kind of an admission that PQ isn't ideal for home display viewing... (i.e. you can't mandate fixed pixel light values for a display that is viewed in darkness and in bright sunshine...)

My guess is that most files that 'look wrong' in Kodi are probably amateur re-encodes where the HDR metadata has been lost in conversion. (ffmpeg can be tricky to use to retain HDR metadata)
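
For example, to keep HDR10 signalling through an x265 re-encode you have to restate it explicitly - an untested sketch; the mastering-display and MaxCLL numbers below are placeholder examples (P3-D65 primaries, 1000-nit peak) and should be copied from the source via ffprobe or mediainfo:

ffmpeg -i in.mkv -c:a copy -c:v libx265 -pix_fmt yuv420p10le \
  -color_primaries bt2020 -color_trc smpte2084 -colorspace bt2020nc \
  -x265-params "colorprim=bt2020:transfer=smpte2084:colormatrix=bt2020nc:\
master-display=G(13250,34500)B(7500,3000)R(34000,16000)WP(15635,16450)L(10000000,1):max-cll=1000,400" \
  out.mkv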

And to get back on topic - Kodi on a number of platforms handles correctly mastered HDR10 and HLG HDR content with few problems - either by handing the content off to Android and letting it deal with it, or by Kodi and/or the OS having the correct device-specific support implemented (AMLogic, Rockchip etc.). Support on x86 hardware is patchier in Linux, though HDR10 on Windows is entirely possible (again because there is OS-level support for HDR10, though this may not pass through some of the content-specific display optimisation metadata that is carried in addition to the basic EOTF and gamut metadata).