Posts: 23,266
Joined: Aug 2011
Reputation: 1,074
fritsch
Team-Kodi Developer
2017-11-11, 12:51
(This post was last modified: 2017-11-11, 12:57 by fritsch.)
Why don't you just read the code? Kodi renders in OpenGL <- at this very point the above discussion is finished, nothing to add. From Kodi's POV the output device is RGB888, while the available visuals help to keep precision when doing the yuv2rgb conversion.
For Limited vs. Full it depends. On X11 we have dithering to do Full RGB Range, and we also have the possibility to transfer it as Limited RGB Range (the "Use Limited Range" setting) - this does not work on AML or Mediacodec Surface, because the output is a black box to us. Kodi _does not_ output YUV by itself. The only difference is the ByPass renderers, where you have no idea what they do or how it is composited, e.g. AMLogic or Android's Mediacodec Surface. As AMLogic on Linux has an RGB framebuffer, the same conversion might happen before it goes to screen.
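To make the range distinction concrete, here is a minimal sketch (not Kodi's actual code) of the full-range <-> limited-range 8-bit RGB scaling behind the "Use Limited Range" setting. Function names are illustrative; real renderers also add dithering when quantising, which this sketch omits.

```python
# Sketch: full-range vs. limited-range 8-bit RGB levels.
# Full range uses codes 0..255; limited ("video") range uses 16..235.

def full_to_limited(v: int) -> int:
    """Map full-range 8-bit [0, 255] to limited-range [16, 235]."""
    return round(16 + v * 219 / 255)

def limited_to_full(v: int) -> int:
    """Map limited-range 8-bit [16, 235] back to full-range [0, 255]."""
    return round((v - 16) * 255 / 219)
```

Because 256 input codes are squeezed into 220 output codes (and vice versa), rounding introduces banding, which is why dithering matters here.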
So, once more:
We have several matrices available with the single goal of producing RGB, which might be limited or full. Along the way we use the matrices I linked some pages before. From that point on - call it an OpenGL surface / texture, whatever - Kodi does nothing. We don't output YUV, we don't send metadata, nothing. We have no idea what Mediacodec Surface does (see e.g. FireTV3 vs. Nvidia Shield).
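For readers following along, the matrices in question can be derived from each standard's luma coefficients. This is a hedged sketch of the idea, not Kodi's shader code; the names are illustrative, but the Kr/Kb values are the ones published in BT.601 and BT.709.

```python
# Sketch: limited-range 8-bit YCbCr -> full-range [0.0, 1.0] RGB,
# parameterised by the per-standard luma coefficients Kr and Kb.
KR_KB = {
    "bt601": (0.299, 0.114),
    "bt709": (0.2126, 0.0722),
}

def ycbcr_to_rgb(y, cb, cr, standard="bt709"):
    kr, kb = KR_KB[standard]
    kg = 1.0 - kr - kb
    yn = (y - 16) / 219.0    # normalise luma (16..235 -> 0..1)
    pb = (cb - 128) / 224.0  # normalise chroma (16..240 -> -0.5..0.5)
    pr = (cr - 128) / 224.0
    r = yn + 2.0 * (1.0 - kr) * pr
    b = yn + 2.0 * (1.0 - kb) * pb
    g = (yn - kr * r - kb * b) / kg
    return (r, g, b)
```

The same pixel decoded with the wrong standard lands on visibly different RGB values, which is why tagging the content correctly matters even before any gamut question arises.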
Is it now clear what the "endpoint" of Kodi's renderer is?
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
Posts: 116
Joined: Jan 2016
(2017-11-11, 12:51)FernetMenta Wrote: You still did not get the point. There is no conversion from SD to HD or vice versa. Computer graphics are in RGB. That is a well defined color space and has nothing to do with ITU 601/709 etc. The only conversion we do is to convert from the original color space of the video to a standard computer graphics colorspace. That's it.
With respect, I do understand what you're saying, but what you're saying isn't addressing my question. You're talking about colour spaces; I'm talking about colour gamuts.
Posts: 116
Joined: Jan 2016
(2017-11-11, 12:51)FernetMenta Wrote: The Android port is not as mature as Windows and Linux (OpenGL). If you want high quality, don't use Kodi on Android. Windows and Linux even offer CMS.
Does "Linux" include the Raspberry Pi 2/3 version in this context?
Posts: 1,234
Joined: Mar 2011
Reputation: 80
I don't get the point of this conversation... is it purely academic to ask these questions or will they eventually lead to a proposal for improvement by the asker?
Posts: 116
Joined: Jan 2016
2017-11-12, 11:47
(This post was last modified: 2017-11-12, 11:48 by User 309201.)
(2017-11-12, 11:24)HeresJohnny Wrote: I don't get the point of this conversation... is it purely academic to ask these questions or will they eventually lead to a proposal for improvement by the asker?
From my perspective, it's mostly a question of "If Kodi doesn't do this correctly then I guess I'll need to look for another media player that does (if such a thing exists)." And if, hypothetically, Kodi behaved differently on different platforms in this respect, that would be useful information too.
If, in fact, Kodi does not handle upscaling of Standard Definition optimally, then clearly it would be nice if that could be rectified; but that goes without saying, surely? You don't need someone to formally propose doing it?
Posts: 5,952
Joined: Sep 2008
Reputation: 201
Koying
Retired Team-Kodi Member
Just poking into this interesting conversation to make it clear that for Android Mediacodec in Surface mode, which is probably what 90%+ of users are using daily, Kodi doesn't even see the actual frames; everything stays inside the VPU end-to-end, up to the point where the hardware compositor blits it to the screen.
So, if we're speaking about that use-case, there is no conversion on our side, nor any that we could do, really.
As I see DRM (as in Direct Rendering Manager, nothing to do with encryption) picking up speed, I assume it will soon be the case in the Linux (at least embedded) world as well.
So yeah, interesting convo, but now (for Android) or in the future (for every platform), this should really be directed to the h/w manufacturers.
Posts: 1,065
Joined: Oct 2011
Reputation: 27
Soli
Posting Freak
I've seen these questions come up in different subforums lately. I don't think it's worth pursuing legacy SMPTE-C/EBU. Let's just focus on BT.601 and BT.709.
The only platform that might get the conversion right is OSX, since it uses system-wide color management. (Also iOS since iOS 9, although not really used until iOS 10.) Some video players on Windows supposedly support color management too, like MPC-HC and MPV, but I've yet to be 100% convinced that either does it correctly. They both differ. (Digression: if you compare Chrome, Firefox and Edge/IE you'll notice some differences too. What's right and what's wrong is hard to say without proper analysis and comparison against an original picture inside Photoshop or similar.)
It seems everybody - not only in this thread, but in all the threads concerning these topics - ends up touching on everything except the actual issue.
The fact of the matter is: from my understanding, Kodi applies the correct matrices when dealing with SD and HD. The problem is that the display expects BT.709, so BT.601-decoded content will still look wrong, because the decoded RGB values conform to a BT.601 gamut whereas the display expects BT.709.
Fortunately Kodi already includes 3DLUT processing, so it's just a matter of applying a BT.601->BT.709 3DLUT. This is the only correct way of dealing with this. It would also be an advantage if the Kodi framebuffer could be extended from RGB888 to, say, RGB 12.12.12 or higher to keep the precision. (And if you apply a custom 3DLUT to calibrate your monitor, then Kodi should ideally be able to combine your custom 3DLUT with the BT.601->BT.709 3DLUT, so we don't have to do double processing.)
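The core of what such a 3DLUT would encode can be reduced, in linear light, to a 3x3 matrix built from the published primaries. This is an illustrative sketch (function names are mine, not Kodi's); the chromaticity values are the standard CIE xy coordinates for SMPTE-C and BT.709, both with a D65 white point.

```python
# Sketch: linear-light SMPTE-C RGB -> BT.709 RGB via CIE XYZ.
import numpy as np

def npm(primaries, white):
    """Normalised primary matrix: linear RGB -> CIE XYZ."""
    def xyz(xy):
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    p = np.column_stack([xyz(c) for c in primaries])
    scale = np.linalg.solve(p, xyz(white))  # make RGB=(1,1,1) hit the white point
    return p * scale

D65 = (0.3127, 0.3290)
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # R, G, B
BT709 = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Combined matrix: into XYZ with SMPTE-C primaries, out with BT.709 primaries.
M = np.linalg.inv(npm(BT709, D65)) @ npm(SMPTE_C, D65)
```

Note this only covers the primaries conversion; a real 3DLUT would also handle the transfer function and clip or compress out-of-gamut values.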
This is also the correct way of downscaling HDR to SDR (although it needs a few different presets to cater to different luminance preferences). I've seen this discussed elsewhere in this forum.
This might not be possible with Android SoCs, but who cares about those anyway?
Posts: 116
Joined: Jan 2016
2017-11-14, 11:39
(This post was last modified: 2017-11-14, 11:44 by User 309201.)
(2017-11-11, 12:54)FernetMenta Wrote: Well, I know how that works and I understand the math and the software that does all the work. I maintain the reference platform for Kodi's video player, which is Linux (OpenGL). I know that this platform does it correctly. I have searched the code and doubt that the Android port does it correctly.
I'm not in a position to test the Linux version, I'm afraid. I've just downloaded Kodi 17.5 for Windows onto my laptop, and I'm fairly sure it gets the conversion wrong - again, it handles the difference in the decoding standards correctly, but it ignores the difference in gamuts.
Posts: 116
Joined: Jan 2016
2017-11-14, 12:26
(This post was last modified: 2017-11-14, 12:27 by User 309201.)
(2017-11-13, 09:25)wesk05 Wrote: 1. BT.601 has no chromaticity primaries defined. It is based on SMPTE RP145/SMPTE 170M/SMPTE-C (NTSC) or EBU 3213 (PAL).
Yes!!!
(2017-11-13, 09:25)wesk05 Wrote: 2. I actually checked this today on Android, LibreELEC (Intel & Amlogic) & OSMC. (Vero 4K). RPi2 for whatever reason couldn't play the test patterns.
When HDMI output is YCbCr, the colorimetry bits in the AVI InfoFrame are set correctly for the appropriate color space (i.e., SMPTE 170M/BT.470M, or BT.470BG for PAL). So there is no need for any color space transformation or color management. The display will manage it.
I'm still trying to get my head around how Kodi handles colour spaces, but I thought it renders everything internally in RGB, and if the device is outputting YCbCr then that's the result of a last-minute conversion from RGB by the OS or video driver...? If so, does all of that information survive all of the internal colour space transformations?
(2017-11-13, 09:25)wesk05 Wrote: The problem is when the HDMI output is RGB. CTA 861x specs don't define colorimetry in the AVI InfoFrame for a BT.601 or BT.709 RGB signal. It is sRGB by default; AdobeRGB can also be specified. The display gets no information as to which CIE color matching function should be used. Now, this isn't much of a problem with BT.709 because it has the same primary coordinates as sRGB. It is also possible that the display applies one of the BT.601 primaries based on the resolution of the content, i.e., if it is in SD resolution, it may automatically treat it as BT.601 (this is defined in the CTA specs). It is when you upscale NTSC and output in RGB that things can go wrong.
Again, yes!!!
(2017-11-13, 09:25)wesk05 Wrote: PAL folks shouldn't really worry about this because only BT.709 green primary is slightly different.
We-ell, I'll grant you the difference between the EBU and rec.709 gamuts is quite subtle, but it's not invisible.
In the NTSC case, the difference is big enough that if it's a TV show with a consistent colour palette that I'm familiar with, I can tell after a minute or so of viewing whether the gamut is being handled correctly or not, even without doing a side by side comparison.
For PAL it's definitely much less obvious, but if you were to switch to and fro between two versions of the same frame, one handled correctly and the other not, you'd certainly be able to see pixels changing. (Well, I can, anyway - maybe my colour vision is unusually good).