4k 10 bit on Ubuntu?
#1
I am using Ubuntu 20.04 and Kodi 18.6 on my HTPC, which is based on a Ryzen 2400G.  Both the HTPC and the projector support 10 bit color.  If I were to enable 10-bit, would it fix the faded colors on 4K HDR videos?  It looks like it would have to be done manually by editing a configuration file, so I wasn't sure if I should bother with it or not.  HDR support in Linux seems spotty, even though it's been in the works for quite a while now.
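
For reference, the manual edit I've seen suggested (untested on my end) is just forcing a 30-bit framebuffer by dropping a snippet like this into /etc/X11/xorg.conf.d/ - the file name and Identifier below are only examples, and from what I've read it only changes the desktop depth, it does nothing about HDR:

Code:
Section "Screen"
    Identifier   "Screen0"
    DefaultDepth 30
EndSection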
#2
No. All you get is tonemapping; there is no native 10-bit output with HDR metadata.
#3
(2020-04-24, 07:36)fritsch Wrote: No. All you get is tonemapping; there is no native 10-bit output with HDR metadata.

Getting the metadata through is not important if 10-bit can get out.
#4
Sadly, most TVs don't accept 10-bit sRGB, which is why DRM and others, including Wayland, try to transport the buffers in YUV planes and signal to the TV how to interpret the decoded values.
#5
Perhaps I should be a bit verbose, since why HDR metadata is still needed might not be totally clear from the start.

In the old days content was produced in BT.709 with a fixed maximum brightness. Each television set, if properly calibrated once, could then basically display the decoded sRGB output as it was meant to be seen. With the new dynamic HDR formats there are brightness and color values beyond the capability of the TV. That means even if one decodes the stream, does all the mapping with the embedded metadata correctly, and produces the output to some RGB visual, it still won't look as intended on the TV in use.

The provided metadata is used _in the TV_, so that the TV can adjust to this dynamic data and "work around" its own shortcomings, e.g. maximum brightness and so on. That's the reason why a YUV scanout plus static and especially dynamic metadata sent to the TV itself is important. For static metadata a certain "calibration" concerning brightness might be doable, but not in the dynamic case once temporal transitions are taken into account.

Edit: So a quite good-looking POC with OpenGL output on platforms supporting it, with the same basic settings, is obviously doable - no doubt - but getting it right 1:1 in a way the TV doesn't then break is hard to do: there is no back channel from the TV to us that we could dynamically feed into our shader.
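
For the static part, the kernel already exposes an "HDR_OUTPUT_METADATA" connector property. Very rough userspace sketch of handing the infoframe over (assumes recent kernel/libdrm headers that carry struct hdr_output_metadata; the property id lookup, error handling and the atomic commit a real compositor would use are left out, and all the luminance values are placeholders):

Code:
#include <stdint.h>
#include <string.h>
#include <xf86drm.h>
#include <xf86drmMode.h>
#include <drm_mode.h>   /* struct hdr_output_metadata (kernel UAPI, via libdrm include path) */

/* Push static HDR metadata (ST 2084 / PQ) to the TV through the connector's
 * "HDR_OUTPUT_METADATA" property. prop_id is the id of that property, found
 * beforehand with drmModeObjectGetProperties()/drmModeGetProperty(). */
static int set_hdr_metadata(int fd, uint32_t connector_id, uint32_t prop_id)
{
    struct hdr_output_metadata meta;
    memset(&meta, 0, sizeof(meta));
    meta.metadata_type = 0;                            /* static metadata type 1 */
    meta.hdmi_metadata_type1.eotf = 2;                 /* HDMI_EOTF_SMPTE_ST2084 */
    meta.hdmi_metadata_type1.max_display_mastering_luminance = 1000; /* cd/m2 */
    meta.hdmi_metadata_type1.min_display_mastering_luminance = 50;   /* 0.0001 cd/m2 */
    meta.hdmi_metadata_type1.max_cll  = 1000;          /* max content light level */
    meta.hdmi_metadata_type1.max_fall = 400;           /* max frame-average level */

    uint32_t blob_id;
    if (drmModeCreatePropertyBlob(fd, &meta, sizeof(meta), &blob_id))
        return -1;

    /* legacy setter, just to show the idea */
    return drmModeObjectSetProperty(fd, connector_id,
                                    DRM_MODE_OBJECT_CONNECTOR,
                                    prop_id, blob_id);
}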
#6
Processing of the metadata can be done on Kodi's side against the YUV buffers (sure, you have to provide some data about the TV). After all, HDR looks different on different brands of TV. I agree that it would be nice to have metadata passthrough, but it is not necessary for getting 10-bit.
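
Roughly what that boils down to per pixel when all you know about the display is its peak brightness - a toy sketch of the ST 2084 (PQ) EOTF plus a simple Reinhard-style rolloff, not Kodi's actual shader code:

Code:
#include <math.h>

/* PQ EOTF (SMPTE ST 2084): normalized code value [0..1] -> luminance in nits. */
static double pq_to_nits(double n)
{
    const double m1 = 2610.0 / 16384.0;
    const double m2 = 2523.0 / 4096.0 * 128.0;
    const double c1 = 3424.0 / 4096.0;
    const double c2 = 2413.0 / 4096.0 * 32.0;
    const double c3 = 2392.0 / 4096.0 * 32.0;
    double p = pow(n, 1.0 / m2);
    return 10000.0 * pow(fmax(p - c1, 0.0) / (c2 - c3 * p), 1.0 / m1);
}

/* Crude Reinhard-style rolloff towards the display's peak (e.g. 300 nits);
 * a real tonemapper would also handle the color channels, not just luminance. */
static double tonemap(double nits, double display_peak_nits)
{
    return nits / (1.0 + nits / display_peak_nits);
}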
#7
I noticed that in an earlier version of Ubuntu (back around 18.04 or 18.10), my projector would always say it was receiving a 12-bit signal. Now it always says 8-bit. But I think it may have been YUV before, and now it's RGB? I'll have to look at that again. Either way, something seems to have changed in the video driver, and there is no control panel option to adjust anything beyond resolution or rotation. I just want to use the hardware to its maximum potential.
#8
Nope, it was a kernel change. Intel used 12-bit YUV for a very long time, no matter if you rendered to 8-bit sRGB. It was changed by Chris Wilson some time ago.

But in fact the input was always 8-bit sRGB, which they converted and dithered up. By the way, sRGB and YUV have different color spaces, so you cannot simply say 12 > 8 and therefore better.
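
Very rough illustration of that conversion step (BT.709 matrix and limited range assumed) - the extra wire bits come from a matrix multiply plus dithering of 8-bit input, not from any extra precision in the source:

Code:
/* 8-bit full-range R'G'B' -> limited-range Y'CbCr (BT.709 coefficients). */
static void rgb8_to_ycbcr(double r, double g, double b,      /* 0..255 */
                          double *y, double *cb, double *cr)
{
    r /= 255.0; g /= 255.0; b /= 255.0;
    double luma = 0.2126 * r + 0.7152 * g + 0.0722 * b;      /* BT.709 */
    *y  = 16.0  + 219.0 * luma;                               /* 16..235 */
    *cb = 128.0 + 224.0 * (b - luma) / 1.8556;                /* 16..240 */
    *cr = 128.0 + 224.0 * (r - luma) / 1.5748;
    /* for a 12-bit link the driver scales these up (x16) and dithers */
}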
#9
(2020-04-24, 18:32)fritsch Wrote: Nope, it was a kernel change. Intel used 12-bit YUV for a very long time, no matter if you rendered to 8-bit sRGB. It was changed by Chris Wilson some time ago.

But in fact the input was always 8-bit sRGB, which they converted and dithered up. By the way, sRGB and YUV have different color spaces, so you cannot simply say 12 > 8 and therefore better.

Ah, thanks for the explanation.  That's been bugging me for a while, and nobody ever talked about it.

Linux has come a long way, but it seems to have hit a wall with 3D MVC and 4k HDR.  The other annoyance is not having accelerated video playback in web browsers.
#10
Not really. It's just that the culture changed :-). Android has a very good 80% out-of-the-box experience, and since 4K HDR content is "theeee shit" to have these days, most people just go with it. DRM is one of the things that makes it hard for Linux to really compete with these developments. On top of that, most streaming clients work well enough from the browser, etc.

So in short: we're waiting for you! Get your hands dirty and implement what you need that is currently missing :-)