UHD HDR choppy with large files on i7 7700K machine
#16
Kodi's visuals on X11 are 8 bit only. So whatever you think you get out to the TV is something you are imagining. Kodi renders in OpenGL, which means it needs to do a transformation to sRGB while handling all the BT.601/709/2020 conversions and additionally undoing certain non-linear transformations.

How did you actually output 10 bit on your linux machine? Can you tell us?

Edit: With even the transformation being wrong (until a few days ago), I think the LUT you used just gave you some "contrast effect", but it should be far away from the real values that you would get.
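
To illustrate the kind of conversion involved - this is only a rough sketch, not Kodi's actual shader code - assuming 8-bit limited-range BT.709 input:

```python
# Minimal sketch (not Kodi's actual shader code): the kind of matrixing the
# render path has to get right, assuming 8-bit limited-range BT.709 input.
def bt709_ycbcr_to_rgb(y, cb, cr):
    """8-bit limited-range BT.709 YCbCr -> normalized R'G'B' in 0..1."""
    yn = (y - 16.0) / 219.0           # luma: 16..235 maps to 0..1
    cbn = (cb - 128.0) / 224.0        # chroma: 16..240 maps to -0.5..0.5
    crn = (cr - 128.0) / 224.0
    clip = lambda v: min(max(v, 0.0), 1.0)
    r = clip(yn + 1.5748 * crn)
    g = clip(yn - 0.1873 * cbn - 0.4681 * crn)
    b = clip(yn + 1.8556 * cbn)
    return r, g, b

print(bt709_ycbcr_to_rgb(126, 128, 128))   # mid grey, roughly (0.50, 0.50, 0.50)
print(bt709_ycbcr_to_rgb(63, 102, 240))    # roughly pure red (1.0, 0.0, 0.0)
```

Get any one of these steps wrong (range expansion, matrix, or the gamma handling around it) and everything downstream inherits the error.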
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#17
(2018-02-04, 16:47)fritsch Wrote: Default, but still WIP.
Looks much better now, colors are not washed out anymore, thanks to all devs.
#18
Did you check with: https://github.com/xbmc/xbmc/pull/13477 - which just made it in some hours ago?

Edit: and this: https://github.com/xbmc/xbmc/pull/13481 (just 2 minutes old) ;-)
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#19
I pulled sources from GitHub about 3 hours ago, so it should have been included.
#20
(2018-02-04, 18:28)fritsch Wrote: Kodi's visuals on X11 are 8 bit only. So whatever you think you get out to the TV is something you are imagining. Kodi renders in OpenGL, which means it needs to do a transformation to sRGB while handling all the BT.601/709/2020 conversions and additionally undoing certain non-linear transformations.

How did you actually output 10 bit on your linux machine? Can you tell us?

Edit: With even the transformation being wrong (until a few days ago), I think the LUT you used just gave you some "contrast effect", but it should be far away from the real values that you would get.
Thanks for pointing out the 8 bit issue. I wasn't aware of it and will look into that. My X11 log indeed tells me 24 bits, so that would translate to 8 bits per channel, correct? Funny thing is, my projector tells me 12 bits, so I trusted that. I will ask Florian Höch, the author of displaycal, about that. He seems to know a lot about the inner workings of all things color on Linux. He was the one who told me the LUT creation from Rec.2020 to DCI-P3 was done the right way and that the result is what I would have expected: a much better picture.

Anyway, what I am seeing are the correct colors and, due to fewer bits, probably little to no HDR effect. The colors are correct because the LUT is based upon a measurement: displaycal expects a certain color and manipulates the LUT to get the output as close as possible to that. Displaycal tells me I get ninety-something percent of the DCI-P3 color range.
To my understanding a colorspace does not depend on bit depth but on the correct color coordinates, so that should work fine with 8 bits as well. The resolution of these coordinates may suffer at 8 bits, but the corners of the triangle should be up to spec.
Gamma should also be correct, because again it is measured and corrected to match the expected curve.
Well, this Rec.2020+HDR stuff is all rather new to me, so there is a ton to look into and learn. Isn't that the fun part?
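
As a rough cross-check of that coverage figure, one can compare the gamut triangles spanned by the standard primaries in CIE 1931 xy chromaticity. DisplayCAL computes its coverage number differently (gamut intersection, usually in a perceptual space), so this is only a ballpark comparison:

```python
# Rough cross-check of gamut sizes using the standard primaries in CIE 1931
# xy chromaticity. DisplayCAL's coverage figure is computed differently
# (gamut intersection, usually in a perceptual space), so treat the ratios
# below as ballpark numbers only.
def tri_area(pts):
    (x1, y1), (x2, y2), (x3, y3) = pts
    return abs(x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2)) / 2.0

primaries = {
    "Rec.709/sRGB": [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)],
    "DCI-P3":       [(0.680, 0.320), (0.265, 0.690), (0.150, 0.060)],
    "Rec.2020":     [(0.708, 0.292), (0.170, 0.797), (0.131, 0.046)],
}

p3_area = tri_area(primaries["DCI-P3"])
for name, pts in primaries.items():
    area = tri_area(pts)
    print(f"{name:13s} xy area {area:.4f}  ({area / p3_area * 100:5.1f}% of DCI-P3)")
```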
#21
(2018-02-04, 19:51)P.Kosunen Wrote: I pulled sources from GitHub about 3 hours ago, so it should have been included.
Yeah - the new one is just 5 minutes old. If you run with VAAPI, a little compile fix is needed - it looks really great.
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#22
Just did some more reading. Displaying DCI-P3 using 24 bits should show colors correctly but could lead to color banding. Visually, the colors now really jump out like in real life; faces especially look like they do in real life. Faces are what usually gives away color problems in video immediately, so I am confident I am all set up, color-wise. For the bit depth itself there are still open questions and, as it seems, limitations.
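
A quick back-of-the-envelope look at why a wide gamut at 8 bits invites banding: the same signal range is split into far fewer, and therefore coarser, steps.

```python
# Back-of-the-envelope numbers behind the banding concern: the same 0..1
# signal range is split into 256 levels at 8 bits but 1024 at 10 bits, so
# each 8-bit step has to cover a much bigger slice of an already wider gamut.
for bits in (8, 10, 12):
    levels = 2 ** bits
    print(f"{bits:2d} bit: {levels:5d} levels, step = {1.0 / (levels - 1):.5f} of full range")
```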
#23
(2018-02-04, 20:35)((( atom ))) Wrote: Just did some more reading. Displaying DCI-P3 using 24 bits should show colors correctly but could lead to color banding. Visually, the colors now really jump out like in real life; faces especially look like they do in real life. Faces are what usually gives away color problems in video immediately, so I am confident I am all set up, color-wise. For the bit depth itself there are still open questions and, as it seems, limitations.
If you are happy, please stay happy. I can only tell you that a lot of things in the processing chain were suboptimal / not right. Also, Kodi renders sRGB - which is completely different from the YUV colorspace.
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#24
When you write "was", does that mean that things are improved in 18 now?

So you are saying Kodi plays out colors expecting an sRGB display, right? When I fire up displaycal under X11 instead of Kodi, it also assumes sRGB, and so my 3D LUT will again work just right in Kodi. Displaycal "knew" where Kodi would locate the colors when playing out, since they both work in the sRGB colorspace.
#25
(2018-02-04, 21:06)((( atom ))) Wrote: When you write "was", does that mean that things are improved in 18 now?

So you are saying Kodi plays out colors expecting an sRGB display, right? When I fire up displaycal under X11 instead of Kodi, it also assumes sRGB, and so my 3D LUT will again work just right in Kodi. Displaycal "knew" where Kodi would locate the colors when playing out, since they both work in the sRGB colorspace.
It's not so easy. YUV covers a different colorspace than RGB does, especially in brightness. If the first transformation is wrong, whatever you do with a simple LUT won't correct these things. V18 will have "perfect" mapping to sRGB, which is the colorspace Kodi renders in.
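
One way to picture the "especially in brightness" point - a minimal sketch, not a description of Kodi's code: if limited-range video levels are passed through as if they were full range, black and white both land in the wrong place, and a downstream 3D LUT built on different assumptions cannot cleanly undo that.

```python
# Sketch of the classic levels mistake: limited-range 8-bit video (16..235)
# shown as if it were full range (0..255). Black comes out grey and white
# comes out dull - and a downstream 3D LUT built on different assumptions
# cannot cleanly undo a wrong first transformation.
def shown_as_full_range(code):
    """Wrong: treat a limited-range luma code as if it were full range."""
    return code / 255.0

def expanded_correctly(code):
    """Right: expand limited range 16..235 to 0..1 (clipped)."""
    return min(max((code - 16) / 219.0, 0.0), 1.0)

for code in (16, 126, 235):            # video black, mid grey, video white
    print(f"code {code:3d}: wrong {shown_as_full_range(code):.3f}  "
          f"right {expanded_correctly(code):.3f}")
```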
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#26
None of it is easy, since I never took the deep dive into color management; I just found my way around enough to achieve good results. So I am trying not to stumble, haha.

Sure, if there are errors in the process before the video reaches the LUT/TV/whatever, it will be impossible to make up for them. If it is done right though, I can expect, say, the white point at a certain coordinate, and red, green and blue as well. That I can transform. Surely my way of bending the output may not be the perfect way, but I get to enjoy the much more beautiful DCI colorspace without spending an extra penny, and that's well worth going for!

So you are saying that v18 now generally puts out better (more correct) video? That's very good then!
What's not so good is that I cannot use it, since color correction is still broken. So for now I will have to live with what I have, but believe me, it looks way better than Blu-rays ever did.
#27
Can you tell me the general outlook for bit depths higher than 8 bits? I dug around a little and it looks like there is some support coming up on the hardware/driver side, e.g. this post: https://lists.freedesktop.org/archives/i...90944.html

I guess there will be quite some fuss about it soon, as 10 bit capable displays will become common practically overnight, and so will 10 bit content.
#28
(2018-02-04, 22:28)((( atom ))) Wrote: Can you tell me the general outlook for bit depths higher than 8 bits? I dug around a little and it looks like there is some support coming up on the hardware/driver side, e.g. this post: https://lists.freedesktop.org/archives/i...90944.html

I guess there will be quite some fuss about it soon, as 10 bit capable displays will become common practically overnight, and so will 10 bit content.
Yep, and we will be ready for it :-). In the end it will be like "multi channel PCM" vs. "passthrough" on the audio side, just for video.
First decide what functions / features you expect from a system. Then decide on the hardware. Don't waste your money on crap.
#29
OK, that sounds promising. Just got confirmation from Florian Höch that my assumption about the DCI-P3 colorspace with 8 bits is correct. No problem there: colors are accurate, contrast is not. Since my projector/screen combination just kills you with brightness, it still looks very good, which is amazing given that the format reserves quite a bit for the highlights. I had to turn up the lamp though.
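
For anyone wondering how much of the signal range HDR reserves for the highlights, here is a minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10: half of the signal range only covers up to roughly 92 nits, while the top quarter runs from about 1000 to 10000 nits.

```python
# Minimal sketch of the SMPTE ST 2084 (PQ) EOTF used by HDR10, to show how
# much of the signal range is reserved for highlights: half the range only
# reaches roughly 92 nits, while the top quarter runs from ~1000 to 10000.
def pq_eotf(signal):
    """PQ signal value in 0..1 -> absolute luminance in cd/m^2 (nits)."""
    m1, m2 = 2610 / 16384, 2523 / 4096 * 128
    c1, c2, c3 = 3424 / 4096, 2413 / 4096 * 32, 2392 / 4096 * 32
    e = signal ** (1 / m2)
    return 10000 * (max(e - c1, 0.0) / (c2 - c3 * e)) ** (1 / m1)

for s in (0.25, 0.50, 0.75, 1.00):
    print(f"PQ signal {s:.2f} -> {pq_eotf(s):8.1f} nits")
```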

How far along are the plans for the 10 bit playback chain? Kodi 19? I'm eager to test that now, of course. :-)
#30
(2018-02-04, 22:28)((( atom ))) Wrote: I guess there will be quite some fuss about it soon, as 10 bit capable displays will become common practically overnight, and so will 10 bit content.
The whole HDR thing is overrated; I can't see much difference, color-wise, in Amazon Prime/Netflix HDR/Dolby Vision content compared to some old-school 1080p 8 bit per color Blu-ray movies with an excellent picture. I don't need any more brightness - you can already crank it up until your eyes hurt; the black side is the important one. 4K resolution is welcome though, the benefit is clear.

Updated to 18 today and took a better look at the test clips; UHD/HDR colors look really good now.