4K HDR10 - State of Play - important media player limitations - LAST UPDATE Sept 2020
#61
1. You don't need SSH. You can either enable it under Settings -> Display, or type in the hotfix code (https://discourse.osmc.tv/t/10-bit-and-b...am_nazarko).
2. The changes are persistent after this, but can of course be reverted if necessary (e.g. if you move to a non-HDR display).

Hope this helps. Let me know if you have any more questions.

Sam
#62
Sam - is this the fix for 10-bit output rather than HDR per se? ISTR that I got HDR EOTFs correctly flagged over HDMI without the command-line change - but they were flagging video that was 8-bit, not 10-bit. The command-line tweak wasn't to enable HDR, but to enable 10-bit output? Or has this changed?
#63
It is indeed for 10-bit output. While colour space, bit depth and HDR are not always correlated, I was simplifying it a little bit here :)

I have also added support for manually toggling BT.2020 and the HDR10 and HLG EOTFs from sysfs for debugging purposes.
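For anyone who wants to script those debug toggles, here is a minimal sketch. Note that the sysfs attribute names and paths below are assumptions for illustration only - the real ones depend on the kernel build, so check what actually exists under `/sys/class/amhdmitx/` on your own device:

```python
from pathlib import Path

# HYPOTHETICAL sysfs attributes -- the real names vary by kernel build;
# run `ls /sys/class/amhdmitx/amhdmitx0/` on your box to find the actual ones.
FORCE_GAMUT = Path("/sys/class/amhdmitx/amhdmitx0/force_bt2020")  # assumption
FORCE_EOTF = Path("/sys/class/amhdmitx/amhdmitx0/force_eotf")     # assumption


def write_attr(attr: Path, value: str) -> None:
    """Write a debug value to a sysfs attribute (needs root on a real device)."""
    attr.write_text(value + "\n")


def force_hdr10(gamut_attr: Path = FORCE_GAMUT, eotf_attr: Path = FORCE_EOTF) -> None:
    """Force BT.2020 gamut signalling plus the ST 2084 (HDR10) EOTF."""
    write_attr(gamut_attr, "1")
    write_attr(eotf_attr, "st2084")
```

Reverting is just writing the default values back the same way.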
#64
Does anyone know if HDR is supported on the Fire TV 3 (4K) for HDR10 MKVs with Kodi? Just got one after hearing the refresh rate issue has been fixed (and this is decent hardware for $35). Thanks!
#65
Fire TV as decent hardware? Hmm, interesting - decent for what? And as far as I know HDR is still not supported on any device with KODI.
#66
(2018-08-08, 05:53)Mount81 Wrote: And as far as I know HDR is still not supported on any device with KODI.
 I watch HDR all the time on Kodi and my Shield TV.
#67
Yeah, and Amlogic with LE (still not sure if it's proper 10-bit HDR, BTW), after long development specializing KODI's capabilities for these particular SoCs. Any other device you know of that can do HDR with the official KODI fork?
#68
(2018-08-08, 05:53)Mount81 Wrote: Fire TV as decent hardware? Hmm, interesting - decent for what? And as far as I know HDR is still not supported on any device with KODI.
 nVidia Shield TV supports HDR10 output on Kodi. (There isn't automatic colour gamut switching over HDMI, so if you run with Rec 2020 gamut output, any Rec 709 content is remapped into the Rec 2020 colour space rather than the HDMI format switching.)

There is support for HDR10 on AMLogic platforms too. The S905X and S912 SoCs support 10-bit decode and HDR10 output over HDMI. Some kernels on the S905X may include a dithered 8-bit step in the pipeline (after testing, @wesk05 has stated that the Vero 4K - which is S905X based - doesn't have this limitation). They support dynamic Rec 2020 / Rec 709 colour gamut switching.

AIUI the Apple TV 4K running MrMC (a fork of Kodi Krypton) also has HDR10 output (though it may still be doing so with fixed metadata, due to limitations in tvOS?)

For a player to output HDR10 content to an HDR display effectively it needs:

1. To decode HEVC 10-bit and output it as 10-bit, with a clean 10-bit path all the way through.
2. To correctly flag the colour gamut of the source material (i.e. Rec 2020 for most) in the HDMI stream.
3. To correctly flag the EOTF of the source material (ST 2084 for HDR10 - this is what says 'I'm HDR') in the HDMI stream.
4. To correctly flag the static HDR10 metadata of the source material (things like maximum light level, frame-average light level, mastering display white point etc.).
5. To output in specific HDMI video formats for some displays to avoid issues (there are multiple options for HDMI video output - YCbCr 4:4:4, YCbCr 4:2:2, YCbCr 4:2:0 etc. - some more widely supported than others, and some only available at certain bit depths, resolutions and frame rates).
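To make the gamut/EOTF/metadata flagging concrete, here is a rough sketch of how the CTA-861.3 "Dynamic Range and Mastering" InfoFrame payload is laid out - this is the HDMI packet that carries the EOTF and static HDR10 metadata. Treat it as an illustrative encoding under my reading of the spec, not a reference implementation; the sample luminance/primaries numbers are just typical values for a 1000-nit Rec 2020 master:

```python
import struct

# CTA-861.3 EOTF code for SMPTE ST 2084 -- this is the flag that says "I'm HDR10".
EOTF_ST2084 = 2


def chromaticity(x: float, y: float) -> tuple[int, int]:
    """Encode a CIE xy coordinate in the InfoFrame's 0.00002 units."""
    return round(x / 0.00002), round(y / 0.00002)


def drm_infoframe_payload(primaries, white, max_lum, min_lum, max_cll, max_fall) -> bytes:
    """Pack a 26-byte Dynamic Range and Mastering InfoFrame payload (sketch)."""
    vals = []
    for x, y in primaries:              # R, G, B display primaries
        vals += chromaticity(x, y)
    vals += chromaticity(*white)        # mastering display white point
    vals += [round(max_lum),            # max mastering luminance, 1 cd/m2 units
             round(min_lum / 0.0001),   # min mastering luminance, 0.0001 cd/m2 units
             max_cll, max_fall]         # content light levels
    # Byte 1: EOTF; byte 2: static metadata descriptor type 1; then 12 x 16-bit LE values.
    return bytes([EOTF_ST2084, 0]) + struct.pack("<12H", *vals)


payload = drm_infoframe_payload(
    primaries=[(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],  # Rec 2020
    white=(0.3127, 0.3290),                                      # D65
    max_lum=1000, min_lum=0.005, max_cll=1000, max_fall=400)
assert len(payload) == 26  # the DRM InfoFrame payload is 26 bytes
```

If any value in that packet is wrong or missing, the display either stays in SDR mode or tonemaps incorrectly - which is exactly the class of failure the list above describes.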
#69
Amlogic boxes with Android OS are all lame with KODI (or with any other player app), so they definitely need LE or CE and their "version" of KODI. (Boxes like Minix and Zidoo also don't count, as they have their own specialized KODI fork developed for Android - and the same goes for Apple with MrMC.)

It would be informative and clarifying to read something more specific and concrete regarding the Amlogic vs proper 10-bit output issue. The info fragments I've found so far were very contradictory, and evasive or mystifying rather than offering any assurance.

And what is the source of your presumption that the ATV 4K (or just MrMC) has some limitations with HDR output?
#70
(2018-08-08, 14:00)Mount81 Wrote: And what is the source of your presumption that the ATV 4K (or just MrMC) has some limitations with HDR output?

@wesk05 reported it initially, I think - and I think @davilla confirmed it was a tvOS issue rather than an MrMC one, as he confirmed he was passing the correct metadata. It may have been fixed in tvOS since then.

It's relatively easy to confirm if you know the source metadata and have an HD Fury Vertex or similar, which will display the metadata values in the HDMI stream.
#71
(2018-08-08, 05:53)Mount81 Wrote: Fire TV as decent hardware? Hmm, interesting - decent for what? And as far as I know HDR is still not supported on any device with KODI.
 Any device that can do HEVC 4K@60Hz is decent in my book, especially at the price point. Now throw in Netflix, Amazon VOD, DirecTV Now and SlingTV capabilities - it's more than decent for almost anybody at $35. Yes, that's a sale price, but $70 is still way cheaper if one doesn't need all the power (and awkward-looking box) of the Shield.

What device do you consider decent? My far more expensive (and power-hungry) HTPC (i5 6500/16GB) used to struggle with 4K until I added a GTX 1050. Others have already responded on the devices supporting HDR, and I'd add that I have been enjoying 4K HDR for almost a year with the Kodi DSPlayer build. That was the whole purpose of getting the 1050. The main point of the FTV would be to make it more wife friendly, to avoid support calls while I am traveling.
#72
(2018-08-08, 10:16)noggin Wrote: For a player to output HDR10 content to an HDR display effectively it needs:

1. To decode HEVC 10-bit and output it as 10-bit, with a clean 10-bit path all the way through.
2. To correctly flag the colour gamut of the source material (i.e. Rec 2020 for most) in the HDMI stream.
3. To correctly flag the EOTF of the source material (ST 2084 for HDR10 - this is what says 'I'm HDR') in the HDMI stream.
4. To correctly flag the static HDR10 metadata of the source material (things like maximum light level, frame-average light level, mastering display white point etc.).
5. To output in specific HDMI video formats for some displays to avoid issues (there are multiple options for HDMI video output - YCbCr 4:4:4, YCbCr 4:2:2, YCbCr 4:2:0 etc. - some more widely supported than others, and some only available at certain bit depths, resolutions and frame rates).
For a device that natively supports HEVC and HDR, can Kodi not use the default player once it demuxes the MKV into a video stream? Something like the DSPlayer build that takes advantage of an already-installed MadVR.

Wonder if MrMC maintains feature parity across device types - 3.6 supports HDR on ATV.
#73
(2018-08-08, 14:00)Mount81 Wrote: It would be informative and clarifying to read something more specific and concrete regarding the Amlogic vs proper 10-bit output issue. The info fragments I've found so far were very contradictory, and evasive or mystifying rather than offering any assurance.
And what is the source of your presumption that the ATV 4K (or just MrMC) has some limitations with HDR output?

I know this whole thing about the S905X not having true 10-bit processing started with a comment from a BayLibre developer. I have checked the output on the Vero 4K and it does seem to have true 10-bit output (or is employing a magical dithering algorithm).

If you are sure that your TV can do proper 10-bit rendering, then you can quite easily check whether the output of a device is true 10-bit or not with the Quants2D test patterns - http://test.full.band/index.html. You really don't need to have any expensive equipment to do this.

If it's true 10-bit, you will see 450 distinct shades of gray squares (this would suggest 1-bit LSB accuracy; you will need an analyzer to confirm it), something like shown below. In this image, I have changed the "exposure" to visualize the distinct squares. You can see a few sticking out - that is because of rounding errors in the decode process.
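The arithmetic behind such patterns is easy to sanity-check: each 8-bit code value covers four adjacent 10-bit codes, so a lossy 8-bit path merges neighbouring shades. A quick sketch (the 450-level, step-of-2 ramp here is my assumption about how such a pattern could be built, not the actual Quants2D layout):

```python
# A 10-bit gray ramp of 450 levels, stepping by 2 code values (assumed layout).
# On a true 10-bit path all 450 squares are distinct; after truncation to
# 8 bits (>> 2), each 8-bit code swallows four 10-bit codes, so adjacent
# squares in the step-of-2 ramp merge in pairs.
ramp_10bit = list(range(64, 964, 2))           # 450 distinct 10-bit codes
truncated_8bit = {v >> 2 for v in ramp_10bit}  # what a lossy 8-bit path shows

print(len(ramp_10bit))      # -> 450
print(len(truncated_8bit))  # -> 225, i.e. only half the shades survive
```

That halving of visibly distinct shades is exactly why the truncated/dithered output looks so different in the second screenshot.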

Image

If the output is truncated/dithered 8-bit, then you would see something like shown below:
Image

As for the question regarding the ATV 4K, it has been known for quite some time now that tvOS replaces the original HDR10 metadata with its own generic metadata. This could potentially alter the tonemapping on at least some displays that actually consider HDR10 metadata.
#74
(2018-08-09, 00:41)wesk05 Wrote: As for the question regarding the ATV 4K, it has been known for quite some time now that tvOS replaces the original HDR10 metadata with its own generic metadata. This could potentially alter the tonemapping on at least some displays that actually consider HDR10 metadata.

This makes me ask two questions:

1. Does it do this for Netflix HDR10 or Amazon Prime HDR10 (assuming they have varying HDR metadata, show by show, movie by movie)?
2. Presumably it doesn't do this for Dolby Vision - as Dolby wouldn't approve a platform that did (and it would nullify one of Dolby's main USPs - dynamic metadata)?
#75
(2018-08-08, 04:11)rexian Wrote: Does anyone know if HDR is supported on the Fire TV 3 (4K) for HDR10 MKVs with Kodi? Just got one after hearing the refresh rate issue has been fixed (and this is decent hardware for $35). Thanks!
It very likely does HDR10 with Kodi - untested personally though. No idea if auto refresh switching is working or not.

The Fire TV gen 3 uses an AMLogic S905Z SoC, and that S9xx chipset family usually switches between SDR and HDR automatically, independent of the OS used - the kernel handles such matters.
