Does Kodi convert from SD to HD correctly?
#31
(2017-11-13, 08:14)Soli Wrote: Fortunately Kodi already includes 3DLUT processing, so it's just a matter of applying a BT.601->BT.709 3DLUT. This is the only and correct way of dealing with this.
I was wondering about that myself. That is essentially the approach that my Lumagen RadiancePro processor takes. There is only one LUT stage in its pipeline, and its primary function is to get accurate calibration - you calibrate the TV as precisely as you can, and the Lumagen then does a 17x17x17 LUT to get things as accurate as possible across the whole colour volume. But it can maintain multiple LUTs which have been separately calibrated for different colour gamuts.
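For anyone unfamiliar with what that LUT stage actually does at playback time, here is a minimal sketch of the idea (illustrative Python, not Lumagen or Kodi code; the apply_3dlut helper is made up for the example): each RGB value is trilinearly interpolated through a 17x17x17 table.

Code:
import numpy as np

def apply_3dlut(rgb, lut):
    """Trilinearly interpolate one normalised RGB triplet through a 3D LUT.

    rgb : three floats in [0, 1]
    lut : ndarray of shape (N, N, N, 3), e.g. N = 17, indexed [r, g, b]
    """
    n = lut.shape[0]
    pos = np.clip(np.asarray(rgb, dtype=float), 0.0, 1.0) * (n - 1)
    lo = np.floor(pos).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    f = pos - lo                      # fractional position inside the lattice cell

    out = np.zeros(3)
    # Blend the 8 surrounding lattice points (trilinear interpolation).
    for dr in (0, 1):
        for dg in (0, 1):
            for db in (0, 1):
                w = ((f[0] if dr else 1 - f[0]) *
                     (f[1] if dg else 1 - f[1]) *
                     (f[2] if db else 1 - f[2]))
                idx = (hi[0] if dr else lo[0],
                       hi[1] if dg else lo[1],
                       hi[2] if db else lo[2])
                out += w * lut[idx]
    return out

# Identity 17x17x17 LUT: output equals input.
n = 17
grid = np.linspace(0.0, 1.0, n)
identity = np.stack(np.meshgrid(grid, grid, grid, indexing="ij"), axis=-1)
print(apply_3dlut([0.25, 0.5, 0.75], identity))   # ~[0.25, 0.5, 0.75]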
 
(2017-11-13, 08:14)Soli Wrote: if you apply a custom 3DLUT to calibrate your monitor, then Kodi should ideally be able to combine your custom 3DLUT with the BT.601->BT.709 3DLUT, so we don't have to do double processing
Ideally, yes. But for a user like myself who is doing calibration via an external device, it would be helpful if someone could come up with a standardised Kodi LUT which exactly maps from the rec.709 gamut to SMPTE-C; as those gamuts are statically defined, once it has been done it can be distributed to everyone without any further modifications ever being needed.

But the best solution might be an additional LUT stage, at an earlier point in the pipeline - at the point when the initial conversion from YCbCr to RGB happens. And that can be applied automatically (or not) based on flags in the original video file.
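As a sketch of what such a static mapping would encode: the 3x3 matrix below is derived purely from the published SMPTE-C and BT.709 primaries, assuming D65 white for both and linear-light RGB (i.e. gamma removed first). As written it takes linear SMPTE-C RGB to linear BT.709 RGB; invert it for the other direction. A distributable LUT would simply bake this matrix in, together with the transfer function handling.

Code:
import numpy as np

def rgb_to_xyz_matrix(primaries, white):
    """Build the 3x3 RGB->XYZ matrix from xy chromaticities of R, G, B and white."""
    def xy_to_xyz(x, y):
        return np.array([x / y, 1.0, (1.0 - x - y) / y])
    m = np.column_stack([xy_to_xyz(*p) for p in primaries])
    # Scale each primary so that R = G = B = 1 lands on the white point.
    scale = np.linalg.solve(m, xy_to_xyz(*white))
    return m * scale

D65     = (0.3127, 0.3290)
SMPTE_C = [(0.630, 0.340), (0.310, 0.595), (0.155, 0.070)]
BT709   = [(0.640, 0.330), (0.300, 0.600), (0.150, 0.060)]

m_smptec = rgb_to_xyz_matrix(SMPTE_C, D65)
m_709    = rgb_to_xyz_matrix(BT709, D65)

# Linear-light SMPTE-C RGB -> linear-light BT.709 RGB (apply after removing gamma).
smptec_to_709 = np.linalg.inv(m_709) @ m_smptec
print(np.round(smptec_to_709, 4))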
#32
(2017-11-13, 09:25)wesk05 Wrote:
(2017-11-13, 08:14)Soli Wrote: I don't think it's worth pursuing legacy SMPTE-C/EBU. Let's just focus on BT.601 and BT.709.
The fact of the matter is: from my understanding, Kodi applies the correct matrices when dealing with SD and HD. The problem is that the display expects BT.709, so BT.601 content will still look wrong because the decoded RGB values conform to a BT.601 gamut, whereas the display expects BT.709.

This might not be possible with Android SoCs, but who cares about that anyway?
1. BT.601 has no chromaticity primaries defined. It is based on SMPTE RP145/SMPTE 170M/SMPTE-C (NTSC) or EBU 3213 (PAL).

2. I actually checked this today on Android, LibreELEC (Intel & Amlogic) and OSMC (Vero 4K). The RPi2, for whatever reason, couldn't play the test patterns.

When the HDMI output is YCbCr, the colorimetry bits in the AVI InfoFrame are set correctly for the appropriate color space (i.e., SMPTE 170M/BT.470M, or BT.470BG for PAL). So there is no need for any color space transformation or color management; the display will manage it. The problem is when the HDMI output is RGB. The CTA-861 specs don't define colorimetry in the AVI InfoFrame for a BT.601 or BT.709 RGB signal. It is sRGB by default; AdobeRGB can also be specified. The display gets no information as to which CIE color matching function should be used. Now, this isn't much of a problem with BT.709 because it has the same primary coordinates as sRGB. It is also possible that the display applies one of the BT.601 primary sets based on the resolution of the content, i.e., if it is in SD resolution, it may automatically treat it as BT.601 (this is defined in the CTA specs). It is when you upscale NTSC and output in RGB that things can go wrong. PAL folks shouldn't really worry about this because only the BT.709 green primary is slightly different.

3. Amlogic SoCs by default have HDMI YCbCr output and the colorimetry bits in the InfoFrame match the ones in VUI. 

1. Well, you are wrong about that. Although not explicitly ratified until version 6, there is always a gamut involved. See for yourself: https://en.wikipedia.org/wiki/Rec._601 Even though it wasn't defined until v6, these are the primaries that were typically used long before.
2. It doesn't matter whether the InfoFrames are set correctly or not. (Also, I suspect they are essentially wrong because of the incorrect scaling described in (3), and the SoCs are probably using the inverse rec.709 matrix for the internal RGB->YCbCr conversion; wouldn't be the first time somebody f**** up, although in this case it doesn't really matter in practice.) If you feed 1080p into an HDTV monitor it'll be treated as Rec.709. Our displays won't do some secret-sauce 3DLUT conversion just because certain InfoFrames are sent. Plus, those InfoFrames aren't sent with Kodi on Linux/Windows etc. And what about software decoding with FFmpeg?
3. Doesn't matter. All those Android SoCs (I'll just call them that) involve an internal conversion to RGB, hard clipping of WTW/BTB, and then scaling back to YCbCr. This also goes for the specific things you mentioned in (2). It doesn't matter whether the output is RGB or YCbCr.
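To make the matrix side of this argument concrete, here is a small illustration (toy Python, not Kodi or SoC code) of limited-range Y'CbCr decoding with the BT.601 and BT.709 luma coefficients; the same coded sample comes out as noticeably different RGB depending on which matrix is assumed.

Code:
import numpy as np

def ycbcr_to_rgb(y, cb, cr, kr, kb):
    """Decode one limited-range 8-bit Y'CbCr sample to 8-bit R'G'B'."""
    kg = 1.0 - kr - kb
    yn = (y  -  16) / 219.0           # normalise luma to [0, 1]
    pb = (cb - 128) / 224.0           # normalise chroma to [-0.5, 0.5]
    pr = (cr - 128) / 224.0
    r = yn + 2 * (1 - kr) * pr
    b = yn + 2 * (1 - kb) * pb
    g = yn - (2 * (1 - kb) * kb / kg) * pb - (2 * (1 - kr) * kr / kg) * pr
    return np.clip(np.array([r, g, b]) * 255, 0, 255)

sample = (81, 90, 240)                              # roughly 100% red as coded for BT.601
print(ycbcr_to_rgb(*sample, kr=0.299,  kb=0.114))   # decoded with the BT.601 matrix
print(ycbcr_to_rgb(*sample, kr=0.2126, kb=0.0722))  # same sample decoded as BT.709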
#33
(2017-11-16, 02:17)Soli Wrote: 1. Well, you are wrong about that. Although not explicitly ratified until version 6, there is always a gamut involved. See for yourself: https://en.wikipedia.org/wiki/Rec._601 Even though it wasn't defined until v6, these are the primaries that were typically used long before.
2. It doesn't matter whether the InfoFrames are set correctly or not. (Also, I suspect they are essentially wrong because of the incorrect scaling described in (3), and the SoCs are probably using the inverse rec.709 matrix for the internal RGB->YCbCr conversion; wouldn't be the first time somebody f**** up, although in this case it doesn't really matter in practice.) If you feed 1080p into an HDTV monitor it'll be treated as Rec.709. Our displays won't do some secret-sauce 3DLUT conversion just because certain InfoFrames are sent. Plus, those InfoFrames aren't sent with Kodi on Linux/Windows etc. And what about software decoding with FFmpeg?
3. Doesn't matter. All those Android SoCs (I'll just call them that) involve an internal conversion to RGB, hard clipping of WTW/BTB, and then scaling back to YCbCr. This also goes for the specific things you mentioned in (2). It doesn't matter whether the output is RGB or YCbCr.

I had begun to wonder why you hadn't replied yet.

1. There are two terms used in standards: define and specify. When a parameter is described for the first time it is defined; after that it is specified. I never said that BT.601 doesn't have a color gamut. What I said was that BT.601 doesn't define the colorimetry; it only specifies it, because the colorimetry was defined previously in other relevant standards. It is clear from your post that you don't see the difference in the meaning of these two terms.

2. Kodi might not set the AVI InfoFrame, but something else is doing it, because an AVI InfoFrame can be detected in the HDMI output, and you do need to send an AVI InfoFrame if the sink supports it; it is required for HDMI spec compliance. What makes you think that displays cannot adjust their color space based on the input? I know that you have a pattern generator and a colorimeter/spectrophotometer. Why don't you test it out yourself on your display? Have you not seen options like this in a display's picture menu?

[Image: screenshot of a display's picture menu showing colour space options, with the HD resolutions listed as REC.709]

3. I know you have an aversion to Android devices. I have no idea whether you have actually tested any of the newer Android devices; if you had, you wouldn't make your 3rd statement. All recent Amlogic SoCs, the nVIDIA Shield and the Realtek 1295 SoC allow passthrough of super white/black without any clipping or scaling in YCbCr output mode. The Shield was doing it even with RGB output, but nVIDIA has messed that up with the latest update. I will agree with you that it is highly possible that there is an internal YCbCr to RGB conversion, but there is no unnecessary scaling or clipping in the output. I have verified this on all the previously mentioned Android SoCs. This also applies to BT.601 (SMPTE-C) content. It isn't just the AVI InfoFrame; the coded YCbCr values in the output also match the content values. In other words, there is no incorrect BT.709 matrix being applied as you suggest. The statements that I make in my posts are almost always made after verification; if I make a statement based on an assumption, I am explicit about it.
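For anyone who wants to reproduce that BTB/WTW check, the essence of it can be sketched like this (a toy Python example assuming an 8-bit limited-range signal and a neutral gray, so only luma matters; the helper names are illustrative): if an intermediate RGB stage hard-clips to nominal range, codes below 16 and above 235 collapse to 16/235, whereas an unclipped intermediate lets them survive the round trip.

Code:
import numpy as np

def y_to_rgb_gray(y):
    """Map a limited-range luma code to a normalised gray RGB level (R' = G' = B')."""
    return (y - 16) / 219.0

def rgb_to_y(level):
    """Re-encode the gray level back to a limited-range luma code."""
    return level * 219.0 + 16

for y in (4, 16, 235, 250):                            # BTB, black, white, WTW
    level = y_to_rgb_gray(y)
    clipped   = rgb_to_y(np.clip(level, 0.0, 1.0))     # intermediate hard-clipped to [0, 1]
    preserved = rgb_to_y(level)                        # intermediate left unclipped
    print(f"in {y:3d} -> clipped {clipped:5.1f}, unclipped {preserved:5.1f}")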
#34
I had work to do, I can't always reply. Smile
I really don't see colorimetry or whatnot being relevant here. I can't go into nitpicking mode, because I'll probably lose Smile

There is clearly a need for a 3DLUT. As you can see from the screenshot yourself, the HD resolutions are REC.709. We need to apply a 3DLUT to correct for the gamut differences. Maybe some TVs do change their primaries when you send an SD signal with the correct InfoFrames, but seeing as most TVs are inaccurate to begin with, I don't really see the point. Unless 1) you have equipment to calibrate your TV for both SD and HD, and 2) your TV actually has separate calibration memory presets and applies them according to the resolution and InfoFrame. If you have 1), your display probably doesn't do 2) (most displays haven't even been able to fully separate SDR and HDR presets, but that's probably about to change), and even if it did 2), would it adjust for BT.601 when you are using HD resolutions, or would it be limited to ONLY SD resolutions?
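For what it's worth, the "no double processing" idea raised earlier in the thread, combining the BT.601->BT.709 correction with a user's calibration LUT, could be handled offline by baking both transforms into a single table, so playback still pays for only one lookup. A rough sketch (the two lambda transforms are placeholders standing in for real correction data, not actual measurements):

Code:
import numpy as np

def bake_combined_lut(first, second, n=17):
    """Bake two RGB->RGB transforms into a single n x n x n x 3 table.

    first  : e.g. the BT.601 -> BT.709 gamut correction
    second : e.g. the user's display-calibration transform
    """
    grid = np.linspace(0.0, 1.0, n)
    lut = np.empty((n, n, n, 3))
    for i, r in enumerate(grid):
        for j, g in enumerate(grid):
            for k, b in enumerate(grid):
                lut[i, j, k] = second(first(np.array([r, g, b])))
    return lut

# Stand-in transforms (placeholders only):
gamut_fix   = lambda rgb: np.clip(rgb * [0.98, 1.0, 1.02], 0, 1)
calibration = lambda rgb: np.clip(rgb ** 1.02, 0, 1)

combined = bake_combined_lut(gamut_fix, calibration)
print(combined.shape)          # (17, 17, 17, 3)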
