Posts: 3,816
Joined: Mar 2006
Reputation:
151
I don't understand why the talk has moved from the original, reasonable request to the use of extra hardware. DV tone mapping, to either SDR or HDR, appears to be doable on all platforms, regardless of DV support. You won't get Dolby Vision, but an SDR or HDR conversion of it. I don't know what quality that would entail (or why somebody with an HDR-capable screen wouldn't just use an HDR source), but there's no need for extra hardware for that.
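For context on what such a conversion works with: HDR10 (and DV's base signal) encode absolute light levels with the SMPTE ST 2084 "PQ" transfer function, so any tone mapper first decodes code values to nits. A minimal sketch of the PQ EOTF and its inverse (constants are the published ST 2084 values; this is an illustration, not Kodi code):

```python
# Sketch of the SMPTE ST 2084 (PQ) transfer function used by HDR10/DV.
# Decoding a normalized code value in [0, 1] to absolute luminance (nits)
# is the first step before tone mapping to SDR or to a dimmer HDR display.

M1 = 2610 / 16384
M2 = 2523 / 4096 * 128
C1 = 3424 / 4096
C2 = 2413 / 4096 * 32
C3 = 2392 / 4096 * 32

def pq_to_nits(e: float) -> float:
    """PQ EOTF: normalized signal value -> luminance in cd/m^2 (nits)."""
    p = e ** (1 / M2)
    return 10000.0 * (max(p - C1, 0.0) / (C2 - C3 * p)) ** (1 / M1)

def nits_to_pq(y: float) -> float:
    """Inverse EOTF: luminance in nits -> normalized signal value."""
    p = (y / 10000.0) ** M1
    return ((C1 + C2 * p) / (1.0 + C3 * p)) ** M2
```

A signal value of 1.0 corresponds to the 10,000-nit ceiling of the PQ curve; real displays clip or tone-map well below that.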
Posts: 1,100
Joined: Oct 2011
Reputation:
375
jogal
Team-Kodi Member
2023-11-03, 20:27
(This post was last modified: 2023-11-03, 20:28 by jogal. Edited 1 time in total.)
When a platform doesn't support DV, e.g. Windows (because Windows itself lacks the API, and the NVIDIA/Intel graphics drivers and hardware don't support it), but does support HDR10, the HDR10 track is used and NO conversion is needed, because ALL UHD Blu-rays that include Dolby Vision also include an HDR10 video track.
In the UHD Blu-ray spec, HDR10 is mandatory and DV is optional (no commercial Blu-ray exists with only a Dolby Vision video track).
So the BEST conversion is NO conversion: just use the HDR10 video track.
If the display doesn't support HDR10 either, then HDR10 is converted to SDR using the current tone mapping algorithms (Reinhard, ACES, Hable). There is no difference whether the video is HDR10 only or HDR10 + DV.
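For reference, the three operators named above are standard curves that compress HDR luminance into the SDR range. A sketch of each (the Hable constants are his published "Uncharted 2" values, and the ACES version is Narkowicz's widely used fitted approximation; this illustrates the math, not Kodi's implementation):

```python
# Illustrative tone-mapping operators, applied to normalized scene luminance.

def reinhard(x: float) -> float:
    """Simple Reinhard: compresses [0, inf) into [0, 1)."""
    return x / (1.0 + x)

def hable(x: float) -> float:
    """Hable filmic curve, normalized so linear white (11.2) maps to 1.0."""
    A, B, C, D, E, F = 0.15, 0.50, 0.10, 0.20, 0.02, 0.30
    def curve(v: float) -> float:
        return ((v * (A * v + C * B) + D * E) / (v * (A * v + B) + D * F)) - E / F
    return curve(x) / curve(11.2)

def aces_fitted(x: float) -> float:
    """Narkowicz's fitted approximation of the ACES filmic curve."""
    a, b, c, d, e = 2.51, 0.03, 2.43, 0.59, 0.14
    y = (x * (a * x + b)) / (x * (c * x + d) + e)
    return min(max(y, 0.0), 1.0)
```

The operators differ mainly in how much highlight and shadow contrast they preserve, which is why players expose them as alternatives.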
Posts: 37
Joined: Jan 2016
Reputation:
1
@jogal:
That's not entirely correct. Dolby Vision Profile 5 does not contain an HDR10 base layer, and the colors are completely wrong when you watch it on a non-DV screen (watching HDR10 content on a non-HDR10 screen, on the other hand, only leads to muted colors; it is still generally watchable).
Technically, software-based tone mapping is possible, though hard: the whole thing needs to be reverse engineered. But apparently MPV has managed to do it, including DV Profile 5. Why what MPV has done can't be applied to Kodi, I don't know.
In any case, I feel like this is going to take a long time, so I went and bought an HDFury Arcana. It works and enables Dolby-approved tone mapping.
Posts: 37
Joined: Jan 2016
Reputation:
1
It works with my Samsung (S95C), the source being an NVIDIA Shield TV Tube 2019. Does it work as well as native HDR10+ would (or native DV, if the TV supported it from the factory)? No clue. I suppose not quite, because native support would know exactly what the TV is capable of. There's also the chance of tweaking the settings on the Arcana, say, to 1300 nits peak brightness instead of 1000. Plenty of experimentation possible. In the end, I would get it to play back content that is only available in DV (so Profile 5). If you can fall back to HDR10 content AND you have an S95C/S90C, then I might not bother. Those TVs are bright enough not to require tone mapping with typical 1000-nit-mastered content (this will be different once 4000 nits is the goal). With lesser Samsungs that don't hit 1000 nits, you might want to consider the Arcana. However, maybe that money is better spent on a better TV instead?
@izprtxqkft: Not quite. The player does matter very much: it MUST support LLDV. Quite a few do, but not all. Also, the implementation of LLDV may vary. The player does the tone mapping, not the Arcana. All the Arcana does is pretend that the TV supports DV but requires LLDV (this part can be done with some cheaper HDMI splitters). If you configure your Arcana, it will also tell the player, to an extent, what the capabilities of the TV are. The player will then, if capable, decode and tone-map the DV signal to HDR10. The Arcana then tells the TV that the incoming signal is HDR10 (this part only HDFury can do). Otherwise, the TV would think the HDR signal is SDR, and it would all be very washed out and not pretty to look at.
Internal DV, if implemented properly, is always better, because the manufacturer knows exactly what the TV does at each brightness level and can fine-tune how content is tone-mapped. It would, for example, know that while peak brightness on the S95C is something like 1300 nits, full-field brightness is perhaps closer to 300 nits or so, and could adjust accordingly. The LLDV implementation of DV at best knows the minimum and maximum brightness of the TV. And that's what you tell it.
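The chain described above can be sketched as a small simulation. Every class and field name here is invented purely for illustration (this is not a real API): the Arcana advertises a DV-capable, LLDV-only display to the player, the player tone-maps to the advertised peak brightness, and the Arcana relabels the output as plain HDR10 for the TV.

```python
# Hypothetical model of the player -> Arcana -> TV chain; names are invented.
from dataclasses import dataclass

@dataclass
class Edid:
    supports_dolby_vision: bool
    requires_lldv: bool
    min_nits: float
    max_nits: float

@dataclass
class Signal:
    format: str       # "DV", "HDR10", "SDR", or the tunneled LLDV output
    peak_nits: float

class Player:
    """A source device (e.g. a Shield) that can tone-map DV via LLDV."""
    def output(self, content_format: str, sink: Edid) -> Signal:
        if content_format == "DV" and sink.supports_dolby_vision and sink.requires_lldv:
            # The player itself tone-maps, targeting the advertised max nits.
            return Signal(format="LLDV-tunnel", peak_nits=sink.max_nits)
        return Signal(format=content_format, peak_nits=1000.0)

class Arcana:
    """Advertises LLDV upstream; relabels the result as HDR10 downstream."""
    def __init__(self, tv_min_nits: float, tv_max_nits: float):
        # What gets advertised to the player (user-configurable).
        self.fake_edid = Edid(True, True, tv_min_nits, tv_max_nits)

    def passthrough(self, signal: Signal) -> Signal:
        if signal.format == "LLDV-tunnel":
            return Signal(format="HDR10", peak_nits=signal.peak_nits)
        return signal

# A DV-only source into a non-DV TV, Arcana configured for 1000 nits:
arcana = Arcana(tv_min_nits=0.005, tv_max_nits=1000.0)
out = arcana.passthrough(Player().output("DV", arcana.fake_edid))
```

This also makes the earlier point concrete: the only display knowledge reaching the tone mapper is the min/max nits you configure, nothing like the per-window brightness behavior a native implementation could use.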
Posts: 3,518
Joined: Jan 2023
Reputation:
329
i think we're getting crossed on "player"
player to me is a "player" as in VLC, MPV, Kodi, Exoplayer - none of those have anything to do with DV/LLDV
whereas the "device" would do LLDV, as in FireTV, ShieldTV, Homatics - in this sense it does depend on the "device"
in the case of Kodi, it just detects a capable display and sends the video to the appropriate "device" decoder instead of the plain hevc decoder
still, i am a samsung display fan and samsung refuses to do DV so Arcana is great for my needs
Posts: 37
Joined: Jan 2016
Reputation:
1
Ah yes, in that case, it is player independent. The device matters, but the conversion is done by the device, so the software running on it does not matter.