Does Kodi convert from SD to HD correctly?
#16
Why don't you just read the code? Kodi renders in OpenGL - at this very point the above discussion is finished, nothing to add. From Kodi's POV the output device is RGB888, while the available visuals help to keep precision when doing the yuv2rgb.

For the Limited / Full question, it depends. On X11 we have dithering to do Full RGB Range; we also have the possibility to transfer it as Limited RGB Range (setting "Use Limited Range") - this does not work on AML or MediaCodec Surface, because the output is a black box to us. Kodi _does not_ output YUV by itself. The only exceptions are bypass renderers, where you have no idea what they do or how the result is composited, e.g. AMLogic or Android's MediaCodec Surface. As AMLogic on Linux has an RGB framebuffer, the same conversion might happen before it goes to screen.

So, once more:
We have several matrices available with the single goal of producing RGB, which might be limited or full. Along the way we use the matrices I linked some pages before. From that point on - call it an OpenGL surface / texture, whatever - Kodi does nothing. We don't output YUV, we don't send metadata, nothing. We have no idea what the MediaCodec Surface does (see e.g. FireTV 3 vs. Nvidia Shield).
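For illustration, that single Y'CbCr -> R'G'B' step can be sketched as below. This is a hedged Python sketch, not Kodi's actual code (Kodi does this in a GLSL shader on the GPU, and the function names here are made up); the luma coefficients are the standard BT.601/BT.709 values.

```python
import numpy as np

# BT.601 / BT.709 luma coefficients (Kr, Kb) from the respective specs
KR_KB = {"bt601": (0.299, 0.114), "bt709": (0.2126, 0.0722)}

def ycbcr_to_rgb_matrix(standard):
    """Derive the Y'CbCr -> R'G'B' matrix from the luma coefficients."""
    kr, kb = KR_KB[standard]
    kg = 1.0 - kr - kb
    return np.array([
        [1.0, 0.0,                          2.0 * (1.0 - kr)],
        [1.0, -2.0 * (1.0 - kb) * kb / kg, -2.0 * (1.0 - kr) * kr / kg],
        [1.0, 2.0 * (1.0 - kb),             0.0],
    ])

def decode(y, cb, cr, standard="bt709"):
    """Limited-range 8-bit Y'CbCr -> full-range R'G'B' in [0, 1]."""
    yuv = np.array([(y - 16) / 219.0,
                    (cb - 128) / 224.0,
                    (cr - 128) / 224.0])
    return ycbcr_to_rgb_matrix(standard) @ yuv
```

Note the result is still gamma-encoded R'G'B'; per the post above, nothing further happens to it on Kodi's side.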

Is it now clear what the "endpoint" of Kodi's renderer is?
First decide what functions / features you expect from a system, then decide on the hardware. Don't waste your money on crap.
#17
(2017-11-11, 11:30)Shasarak Wrote:
(2017-11-11, 09:19)FernetMenta Wrote: OpenGL rendering is in RGB and finally your display is RGB. Hence there is no need for conversion of colour spaces YUV -> YUV.

I'm afraid I don't know enough about this to tell you what maths you need to use to do this correctly. I suspect it's quite complicated, because hardly anything seems to do it correctly! But if you don't, you end up with significantly oversaturated colours on upscaled NTSC DVDs.

Well, I know how that works, and I understand the math and the software that does all the work. I maintain the reference platform for Kodi's video player, which is Linux (OpenGL). I know that this platform does it correctly. I have searched the code and doubt that the Android port does it correctly.
#18
(2017-11-11, 12:51)FernetMenta Wrote: You still did not get the point. There is no conversion from SD to HD or vice versa. Computer graphics are in RGB. That is a well-defined colour space and has nothing to do with ITU 601/709 etc. The only conversion we do is from the original colour space of the video to a standard computer graphics colour space. That's it.
With respect, I do understand what you're saying, but what you're saying isn't addressing my question. You're talking about colour spaces; I'm talking about colour gamuts.
#19
(2017-11-11, 12:51)FernetMenta Wrote: The Android port is not as mature as Windows and Linux (OpenGL). If you want high quality, don't use Kodi on Android. Windows and Linux even offer CMS.
Does "Linux" include the Raspberry Pi 2/3 version in this context?
#20
(2017-11-11, 13:20)Shasarak Wrote:
(2017-11-11, 12:51)FernetMenta Wrote: The Android port is not as mature as Windows and Linux (OpenGL). If you want high quality, don't use Kodi on Android. Windows and Linux even offer CMS.
Does "Linux" include the Raspberry Pi 2/3 version in this context? 

Nope. I was referring to Linux X11/Wayland (OpenGL). Wayland also supports GLES, but Kodi's GLES implementation is not as mature as OpenGL.
#21
(2017-11-11, 08:34)fritsch Wrote: I see. I looked up the conversion, which is mainly a piecewise-defined function - yeah, two sections. One < 0.04045, which is handled by a scaling factor, and one larger, which is computed by a pow function to produce the linear mapping. This is for sRGB to linear RGB. Now the problem is, we don't have sRGB as input but a different representation: YUV with certain other values, depending on BT.601, SMPTE, whatever.

So basically you say: we need piecewise-defined methods to do this conversion first? And from there we can go to sRGB again? Because of the non-linear character of the functions, G(x) != H(I(x)) - that would only simplify if H and I were linear, and they are both not.

So much for the math. What I don't get is this: the goal of linear RGB is that, if we concentrate on one channel, let's say R, double the value means double the brightness. My question here is: why does linear RGB come into play at all? Where in the original colour / video editing did we have a linear relationship between colour value and brightness?

CRT response to voltage is non-linear, and it so happens that this non-linearity resembles the inverse of human visual response, so early on the standards defined a power-law function (gamma) as the electro-optical transfer function (EOTF), designed for CRTs. A gamma correction (opto-electronic transfer function, OETF) is applied to the scene tristimulus values at the source. The coded video signal (R'G'B'/Y'CbCr) is a non-linear, gamma-corrected signal. BT.1886 defines a gamma of 2.4; sRGB, 2.2; NTSC, 2.22; EBU, 2.8. The older specs aren't clear on the gammas for NTSC and EBU; these days the BT.1886 gamma is used universally. For BT.709, an exponent of 0.45 is used for gamma correction. Since the slope of a pure power function with an exponent less than 1 is infinite at 0, all tristimulus values less than or equal to +0.018 (an arbitrary number, at least to my knowledge) use a linear segment with a slope of 4.5. The similar value for sRGB is +0.04045. The CIE XYZ colour matching defined in the standards (BT.709, SMPTE RP145 (used for NTSC), EBU 3213 (used for PAL), BT.2020, etc.) is based on RGB with no gamma correction applied. Because of this, colour space transformations are to be done on linear RGB values. I hope this explains why we are talking about linear RGB.
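The piecewise functions just described can be written down directly from the constants in the respective specs. A small illustrative sketch (function names are mine, constants are from IEC 61966-2-1 / BT.709):

```python
def srgb_eotf(v):
    """Encoded sRGB value in [0, 1] -> linear light (IEC 61966-2-1)."""
    if v <= 0.04045:                        # linear segment near black
        return v / 12.92
    return ((v + 0.055) / 1.055) ** 2.4     # power segment

def bt709_oetf(l):
    """Linear scene light in [0, 1] -> encoded BT.709 value."""
    if l < 0.018:                           # linear segment, slope 4.5
        return 4.5 * l
    return 1.099 * l ** 0.45 - 0.099        # power segment, exponent 0.45
```

Because both are non-linear curves, they don't compose into a single linear map - which is exactly why conversions have to pass through linear RGB.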

I couldn't find the reference for OpenGL, but here is one for OpenVX describing the equations and steps used in color space transformations: https://www.khronos.org/registry/OpenVX/...nvert.html
(2017-11-11, 11:19)fritsch Wrote: We don't do that. The only place in Kodi where we use 601/709 conversion is when converting from YUV to RGB. After that, the colors are not touched by us at all. Kodi outputs RGB via OpenGL. The rest is done by drivers under the hood, without our control.
If you don't know what OpenGL does later on, how is it different from Mediacodec? (Edit: I see that you are making the distinction of whether Kodi can write to the native RGB framebuffer or not).
(2017-11-11, 09:19)FernetMenta Wrote: OpenGL rendering is in RGB and finally your display is RGB. Hence there is no need for conversion of colour spaces YUV -> YUV.
I don't think I understand what you are trying to say. I was asking about color space transformations (gamut mapping to be specific).
 
(2017-11-11, 12:51)FernetMenta Wrote: You still did not get the point. There is no conversion from SD to HD or vice versa. Computer graphics are in RGB. That is a well-defined colour space and has nothing to do with ITU 601/709 etc. The only conversion we do is from the original colour space of the video to a standard computer graphics colour space. That's it.
The Android port is not as mature as Windows and Linux (OpenGL). If you want high quality, don't use Kodi on Android. Windows and Linux even offer CMS.
Are you suggesting that Kodi actually converts all coded video (BT.601/709/2020) to sRGB and then leaves everything to OpenGL?
#22
I don't get the point of this conversation... is it purely academic to ask these questions or will they eventually lead to a proposal for improvement by the asker?
#23
(2017-11-12, 11:24)HeresJohnny Wrote: I don't get the point of this conversation... is it purely academic to ask these questions or will they eventually lead to a proposal for improvement by the asker?
From my perspective, it's mostly a question of "If Kodi doesn't do this correctly then I guess I'll need to look for another media player that does (if such a thing exists)." And if, hypothetically, Kodi behaved differently on different platforms in this respect, that would be useful information too.

If, in fact, Kodi does not handle upscaling of Standard Definition optimally, then clearly it would be nice if that could be rectified; but that goes without saying, surely? You don't need someone to formally propose doing it?
#24
(2017-11-12, 11:24)HeresJohnny Wrote: I don't get the point of this conversation... is it purely academic to ask these questions or will they eventually lead to a proposal for improvement by the asker?

The current state is that the reference platform, Linux X11, is correct. Prove me wrong and I will fix it. I don't think the Android port is correct in this regard. If anybody cares about this, bring it to the attention of the maintainers of the Android port. (I don't care much about this platform.)
#25
Just poking into this interesting conversation to make it clear that for Android MediaCodec in Surface mode, which is probably what 90%+ of users are using daily, Kodi doesn't even see the actual frames; everything stays in the VPU end-to-end, up to the point where the hardware compositor blits it to screen.
So, if we're speaking about that use case, there is no conversion on our side, nor any that we could do, really.

As I see DRM (as in Direct Rendering, nothing to do with encryption) picking up speed, I assume it will soon be the case in the Linux (at least embedded) world as well.

So yeah, interesting convo, but now (for Android) or in the future (for every platform), this should be directed to the h/w manufacturers, really.
#26
(2017-11-12, 12:07)Koying Wrote: So yeah, interesting convo, but now (for android) or in the future (for every platforms), this should be directed to the h/w manufacturers, really.
Not entirely correct. Cheap boxes offer fewer options and lower quality, but there will always be platforms that offer more. On Linux X11 and Windows, users have e.g. CMS, and this won't go away. Kodi is not the same on every platform/hw. Users should first make up their minds about what they expect from an HTPC, then decide what hw/platform best fits their needs.
#27
I've seen these questions come up in different subforums lately. I don't think it's worth pursuing legacy SMPTE-C/EBU. Let's just focus on BT.601 and BT.709.

The only platform that might get the conversion right is OSX, since it uses system-wide color management. (Also iOS since iOS 9, although not really used until iOS 10.) Some video players on Windows supposedly support color management too, like MPC-HC and mpv, but I've yet to be 100% convinced that either does it correctly - they both differ. (Digression: if you compare Chrome, Firefox and Edge/IE you'll notice some differences too. What's right and what's wrong is hard to say without proper analysis and comparison to an original picture inside Photoshop or similar.)

It seems everybody, not only in this thread but in all threads concerning these topics, touches on everything else that doesn't really have anything to do with them.

Fact of the matter is: from my understanding, Kodi applies the correct matrices when dealing with SD and HD. The problem is that the display expects BT.709, so BT.601 decoding will still look wrong, because the decoded RGB values will conform to a BT.601 gamut whereas the display expects BT.709.

Fortunately Kodi already includes 3DLUT processing, so it's just a matter of applying a BT.601->BT.709 3DLUT. This is the only correct way of dealing with this. It would also be an advantage if the Kodi framebuffer could be extended from RGB888 to, say, RGB 12:12:12 or higher to keep the precision. (And if you apply a custom 3DLUT to calibrate your monitor, then Kodi should ideally be able to combine your custom 3DLUT with the BT.601->BT.709 3DLUT, so we don't have to do double processing.)
This is also the correct way of downscaling HDR to SDR (although it needs a few different presets to cater to different luminance preferences). I've seen this discussed elsewhere in this forum.
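For illustration, the core of what such a BT.601->BT.709 3DLUT would bake in is a gamut-mapping matrix derived from the primaries. A hedged sketch (chromaticity values are from the specs; the function names are made up, and this omits the linearize/re-encode steps that would surround it):

```python
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build a linear-RGB -> XYZ matrix from xy chromaticities."""
    def xyz(xy):                        # xy chromaticity -> XYZ at Y = 1
        x, y = xy
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.column_stack([xyz(p) for p in primaries])
    s = np.linalg.solve(m, xyz(white))  # scale so RGB=(1,1,1) hits the white point
    return m * s

D65     = (0.3127, 0.3290)
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]  # "BT.601" (NTSC)
BT709   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

# Combined matrix: linear SMPTE-C RGB -> linear BT.709 RGB
M = np.linalg.inv(rgb_to_xyz_matrix(BT709, D65)) @ rgb_to_xyz_matrix(SMPTE_C, D65)
```

Applied per pixel on linear RGB, this is the gamut mapping the LUT would implement: white stays white, but saturated colours shift.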

This might not be possible with Android SoCs, but who cares about that anyway?
#28
(2017-11-13, 08:14)Soli Wrote: I don't think it's worth pursuing legacy SMPTE-C/EBU. Let's just focus on BT.601 and BT.709.
Fact of the matter is: from my understanding, Kodi applies the correct matrices when dealing with SD and HD. The problem is that the display expects BT.709, so BT.601 decoding will still look wrong, because the decoded RGB values will conform to a BT.601 gamut whereas the display expects BT.709.

This might not be possible with Android SoCs, but who cares about that anyway?
1. BT.601 has no chromaticity primaries defined. It is based on SMPTE RP145/SMPTE 170M/SMPTE-C (NTSC) or EBU 3213 (PAL).

2. I actually checked this today on Android, LibreELEC (Intel & Amlogic) & OSMC (Vero 4K). The RPi2 for whatever reason couldn't play the test patterns.

When the HDMI output is YCbCr, the colorimetry bits in the AVI InfoFrame are set correctly for the appropriate color space (i.e., SMPTE 170M/BT.470M, or BT.470BG for PAL). So there is no need for any color space transformation or color management; the display will manage it. The problem is when the HDMI output is RGB. The CTA-861 specs don't define colorimetry in the AVI InfoFrame for a BT.601 or BT.709 RGB signal. It is sRGB by default; AdobeRGB can also be specified. The display gets no information as to which CIE color matching function should be used. Now, this isn't much of a problem with BT.709, because it has the same primary coordinates as sRGB. It is also possible that the display applies one of the BT.601 primaries based on the resolution of the content, i.e., if it is in SD resolution, it may automatically treat it as BT.601 (this is defined in the CTA specs). It is when you upscale NTSC and output in RGB that things can go wrong. PAL folks shouldn't really worry about this, because only the BT.709 green primary is slightly different.

3. Amlogic SoCs by default have HDMI YCbCr output and the colorimetry bits in the InfoFrame match the ones in VUI.
#29
(2017-11-11, 12:54)FernetMenta Wrote: Well, I know how that works, and I understand the math and the software that does all the work. I maintain the reference platform for Kodi's video player, which is Linux (OpenGL). I know that this platform does it correctly. I have searched the code and doubt that the Android port does it correctly.
I'm not in a position to test the Linux version, I'm afraid. I've just downloaded Kodi 17.5 for Windows onto my laptop, and I'm fairly sure it gets the conversion wrong - again, it handles the difference in the decoding standards correctly, but it ignores the difference in gamuts.
#30
(2017-11-13, 09:25)wesk05 Wrote: 1. BT.601 has no chromaticity primaries defined. It is based on SMPTE RP145/SMPTE 170M/SMPTE-C (NTSC) or EBU 3213 (PAL).
Yes!!! Smile
(2017-11-13, 09:25)wesk05 Wrote: 2. I actually checked this today on Android, LibreELEC (Intel & Amlogic) & OSMC. (Vero 4K). RPi2 for whatever reason couldn't play the test patterns.

When the HDMI output is YCbCr, the colorimetry bits in the AVI InfoFrame are set correctly for the appropriate color space (i.e., SMPTE 170M/BT.470M, or BT.470BG for PAL). So there is no need for any color space transformation or color management. The display will manage it.
I'm still trying to get my head around how Kodi handles colour spaces, but I thought it renders everything internally in RGB, and if the device is outputting YCbCr then that's the result of a last-minute conversion from RGB by the OS or video driver...? If so, does all of that information survive all of the internal colour space transformations?
(2017-11-13, 09:25)wesk05 Wrote: The problem is when the HDMI output is RGB. The CTA-861 specs don't define colorimetry in the AVI InfoFrame for a BT.601 or BT.709 RGB signal. It is sRGB by default; AdobeRGB can also be specified. The display gets no information as to which CIE color matching function should be used. Now, this isn't much of a problem with BT.709, because it has the same primary coordinates as sRGB. It is also possible that the display applies one of the BT.601 primaries based on the resolution of the content, i.e., if it is in SD resolution, it may automatically treat it as BT.601 (this is defined in the CTA specs). It is when you upscale NTSC and output in RGB that things can go wrong.
Again, yes!!! Smile
(2017-11-13, 09:25)wesk05 Wrote: PAL folks shouldn't really worry about this because only BT.709 green primary is slightly different.
We-ell, I'll grant you the difference between the EBU and rec.709 gamuts is quite subtle, but it's not invisible.

In the NTSC case, the difference is big enough that if it's a TV show with a consistent colour palette that I'm familiar with, I can tell after a minute or so of viewing whether the gamut is being handled correctly or not, even without doing a side by side comparison.

For PAL it's definitely much less obvious, but if you were to switch to and fro between two versions of the same frame, one handled correctly and the other not, you'd certainly be able to see pixels changing. (Well, I can, anyway - maybe my colour vision is unusually good).
