Req Dolby Vision Tone Mapping for playing HDR files on SDR screens
#16
I don't understand why the talk has moved from the original, reasonable request to the use of extra hardware. DV tone mapping, to either SDR or HDR, appears to be doable on all platforms, regardless of DV support. You won't get Dolby Vision, but an SDR or HDR conversion of it. I don't know what quality that would entail (or why somebody with an HDR-capable screen wouldn't just use an HDR source), but there's no need for extra hardware for that.
#17
When a platform does not support DV, e.g. Windows (because Windows itself lacks the API, the NVIDIA/Intel graphics drivers lack support, and the graphics hardware does not support it) but does support HDR10, then HDR10 is used and NO conversion is needed... because ALL UHD Blu-rays that include Dolby Vision also include an HDR10 video track.

Because in the UHD Blu-ray spec HDR10 is mandatory and DV is optional (there is no commercial Blu-ray with only a Dolby Vision video track).

So the BEST conversion is NO CONVERSION: just use the HDR10 video track.

If the display does not support HDR10 either, then HDR10 is converted to SDR using the current tone mapping algorithms (Reinhard, ACES, Hable). There is no difference whether the video is HDR10 only or HDR10 + DV.
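To illustrate the idea (a minimal sketch, not Kodi code; the track names, the Display type and the 1000-nit default are my own assumptions): pick the HDR10 track whenever native DV is unavailable, and only tone map down to SDR when the display cannot do HDR10 either.

```python
from dataclasses import dataclass

@dataclass
class Display:
    supports_dv: bool
    supports_hdr10: bool

def reinhard(luma_nits: float, peak_nits: float = 1000.0) -> float:
    """Simple Reinhard curve: compress luminance (in nits) into [0, 1] for SDR."""
    x = luma_nits / peak_nits
    return x / (1.0 + x)

def pick_track_and_mapping(tracks: set, display: Display):
    """Prefer the HDR10 track when the platform cannot do DV; tone map to SDR
    only when the display cannot do HDR10 either."""
    if display.supports_dv and "DV" in tracks:
        return "DV", None          # native playback, no conversion
    if display.supports_hdr10 and "HDR10" in tracks:
        return "HDR10", None       # the best conversion is no conversion
    return "HDR10", reinhard       # fall back: tone map the HDR10 track to SDR

print(pick_track_and_mapping({"DV", "HDR10"}, Display(supports_dv=False, supports_hdr10=True)))
```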
#18
@jogal:
That's not entirely correct. Dolby Vision Profile 5 does not contain an HDR10 base layer, and the colors are completely wrong when you watch it on a non-DV screen (on the other hand, watching HDR10 content on a non-HDR10 screen will only lead to muted colors, but it is still generally watchable).

Technically, software-based tone mapping is possible, though hard... the whole thing needs to be reverse engineered. But apparently mpv has managed to do it, including DV profile 5. Why what mpv has done can't be applied to Kodi, I don't know.

In any case I feel like this is going to take a long time, so I went and bought an HDFury Arcana. It works and enables Dolby-approved tone mapping.
#19
I think the same. FFmpeg with libplacebo can decode DV; mpv uses that library as of 0.37.0.
AFAIK the HDFury just adds a static HDR10 metadata layer on top of LLDV so that its tone-mapped output can be displayed on HDR10 displays, customised to the user's display nits, primaries, etc. as sent in the HDMI EDID information.

libplacebo is upstream and cross-platform. Why not update Kodi's FFmpeg to decode DV, tone map it to the user's display using the same library mpv uses, and add an HDR10 layer on the HDMI output?
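As a rough sketch of what that looks like outside Kodi (assumptions: an FFmpeg build with libplacebo and Vulkan enabled, and filter option names as in current FFmpeg docs; verify against `ffmpeg -h filter=libplacebo` for your version, and note that applying the DV metadata depends on the decoder exporting it as side data):

```python
import subprocess

def dv_to_sdr(src: str, dst: str) -> None:
    """Tone map an HDR/DV source down to BT.709 SDR via FFmpeg's libplacebo filter (sketch)."""
    vf = ("libplacebo="
          "tonemapping=bt.2390:"    # perceptual HDR -> SDR tone curve
          "colorspace=bt709:"       # target SDR colorimetry
          "color_primaries=bt709:"
          "color_trc=bt709:"
          "format=yuv420p")
    subprocess.run([
        "ffmpeg", "-init_hw_device", "vulkan",  # libplacebo renders on Vulkan
        "-i", src,
        "-vf", vf,
        "-c:v", "libx264", "-crf", "18",
        "-c:a", "copy",
        dst,
    ], check=True)

# dv_to_sdr("movie_dv_profile5.mkv", "movie_sdr.mkv")
```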

@kadajawi I have a Samsung OLED TV that can't decode DV. How does the HDFury Arcana perform on your end, and which LLDV source player do you use?
#20
(2023-12-17, 17:35)wyup Wrote: I have a Samsung OLED TV that can't decode DV. How does the HDFury Arcana perform on your end, and which LLDV source player do you use?

I use the Arcana with a Samsung and every player works; it's not player dependent as it sits inline with the HDMI.
I love mine, can't believe it took me so long to get one.
#21
It works with my Samsung (S95C), the source being an NVIDIA Shield TV Tube 2019. Does it work as well as native HDR10+ would (or as if the TV supported DV from the factory)? No clue. I suppose not quite, because native support would know exactly what the TV is capable of. I suppose there's also the option of tweaking the settings on the Arcana, say, to 1300 nits peak brightness instead of 1000. Plenty of experimentation possible. In the end, I would get it to play back content that is only available in DV (so profile 5). If you can fall back to HDR10 content AND you have an S95C/S90C, then I might not bother. Those TVs are bright enough not to require tone mapping with typical 1000-nit mastered content (this will be different once 4000 nits is the goal). With lesser Samsungs that don't hit 1000 nits, you might want to consider the Arcana. However, maybe that money is better spent on a better TV instead?

@izprtxqkft: Not quite. The player does matter very much; it MUST support LLDV. Quite a few do, but not all. Also, the implementation of LLDV may vary. The player is doing the tone mapping (not the Arcana). All the Arcana does is pretend that the TV supports DV but requires LLDV (this part can be done with some cheaper HDMI splitters). If you configure your Arcana, it will also tell the player, to an extent, what the capabilities of the TV are. The player will then, if capable, decode and tone map the DV signal to HDR10. The Arcana will then tell the TV that the incoming signal is HDR10 (this part only HDFury is able to do). Otherwise, the TV would think the HDR signal is SDR and it would all be very washed out and not pretty to look at.

Internal DV, if implemented properly, is always better, because the manufacturer knows exactly what the TV does at each brightness level and can fine-tune how content is tone mapped. It would, for example, know that while peak brightness on the S95C is something like 1300 nits, full-field brightness is perhaps closer to 300 nits or so, and it could adjust accordingly. The LLDV implementation of DV at best knows the minimum and maximum brightness of the TV, and that is only what you tell it.
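A purely illustrative model of the chain described above (none of these classes correspond to real HDFury or HDMI APIs, and the nit values are made up): the Arcana advertises LLDV in the EDID, the player tone maps to those limits, and the Arcana then re-flags the signal as HDR10 for the TV.

```python
from dataclasses import dataclass

@dataclass
class Edid:
    supports_lldv: bool   # what the Arcana advertises to the player
    max_nits: int         # brightness limits you configure on the Arcana
    min_nits: float

@dataclass
class HdmiSignal:
    pixels: str           # tone mapped by the player, untouched downstream
    flagged_as: str       # what the HDMI metadata claims the signal is

def player(source: str, edid: Edid) -> HdmiSignal:
    # The source device sees the spoofed EDID, decodes DV itself and tone maps
    # to the advertised min/max brightness (this is the LLDV part).
    if edid.supports_lldv and source.startswith("DV"):
        return HdmiSignal(f"tone mapped to {edid.min_nits}-{edid.max_nits} nits", "SDR")
    return HdmiSignal("HDR10 passthrough", "HDR10")

def arcana(signal: HdmiSignal) -> HdmiSignal:
    # The Arcana does not touch the pixels; it only rewrites the flag so the
    # TV treats the incoming LLDV stream as HDR10 instead of washed-out SDR.
    return HdmiSignal(signal.pixels, "HDR10")

print(arcana(player("DV profile 5", Edid(True, 1000, 0.005))))
```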
#22
I think we're getting crossed on "player".

A player to me is a "player", as in VLC, MPV, Kodi, ExoPlayer - none of those have anything to do with DV/LLDV.

Whereas the "device" would do the LLDV, as in FireTV, ShieldTV, Homatics - in this sense it does depend on the "device".

In the case of Kodi, it just detects a capable display and sends the video to the appropriate "device" decoder instead of the HEVC decoder.
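A minimal sketch of that selection idea (illustrative pseudo-logic only, not Kodi's actual code; the decoder names are placeholders):

```python
def choose_decoder(codec: str, has_dv_metadata: bool, display_supports_dv: bool) -> str:
    """Route a DV stream to the platform's Dolby Vision decoder when the display
    can handle it; otherwise fall back to the plain HEVC decoder (base layer)."""
    if codec == "hevc" and has_dv_metadata and display_supports_dv:
        return "dolby-vision decoder"   # the device then handles the DV conversion itself
    return "hevc decoder"               # HDR10/SDR base layer only

print(choose_decoder("hevc", has_dv_metadata=True, display_supports_dv=False))
```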

Still, I am a Samsung display fan and Samsung refuses to do DV, so the Arcana is great for my needs.
#23
Ah yes, in that case, it is player independent. The device matters, but the conversion is done by the device, so the software running on it does not matter.
#24
(2023-12-19, 03:42)kadajawi Wrote: If you can fall back to HDR10 content AND you have an S95C/S90C, then I might not bother. Those TVs are bright enough not to require tone mapping with typical 1000-nit mastered content (this will be different once 4000 nits is the goal). With lesser Samsungs that don't hit 1000 nits, you might want to consider the Arcana. However, maybe that money is better spent on a better TV instead?
I agree. DV capability doesn't seem to be necessary on my Samsung S95B. I can play HDR10 content mastered at 10,000 nits without clipping. My TV does the down-mapping already, and it looks even better than DV converted to HDR10 by Resolve.

I have checked a comparison video clip from a DV movie converted by Resolve into a 10k-nit HDR10 video, split into HDR10 and DV-L2 versions, and they look very similar; in fact, sometimes I don't like the dynamic tone mapping of DV. Check this YouTube channel, it has links to download these samples. The difference is not big enough IMO to justify additional hardware. Furthermore, we can already convert any DV feature to HDR10+ seamlessly with Resolve.