Kodi Community Forum

Full Version: Question about correctly setting Limited Range with a Beamer
Let's see what happens, maybe the devs will come to their senses :-) I imagine there will be a gradual shift towards the Rec.2020 colorspace with at least 10-bit precision for the OS, displays etc., now that UHD BD is finally here. Perfect time to throw out old stuff, but it will of course be a bumpy ride. It would be nice to have one colorspace that rules them all instead of all these - Adobe RGB for high-quality prints, DCI-P3 for cinema, etc. OK, time to end the off-topic :-)
Sorry, I have to ask: would it be possible to do this the other way around? Right now it seems we're in a "fake" RGB full mode. What if we could tell the GPU that we are in limited mode, so the GPU takes care of compressing the data levels? This way the OS doesn't have to support it. At the same time, we would have a way of "annotating the surfaces" (as you call it) when playing video.
(2016-01-11, 13:30)Soli Wrote: [ -> ]Sorry, I have to ask: would it be possible to do this the other way around? Right now it seems we're in a "fake" RGB full mode. What if we could tell the GPU that we are in limited mode, so the GPU takes care of compressing the data levels? This way the OS doesn't have to support it. At the same time, we would have a way of "annotating the surfaces" (as you call it) when playing video.

Did you read my sticky post in this forum? You have to distinguish between: the video data, decoding of the video data, rendering of the data, and what the display driver does.

We do all of that - perfect color from input to output - though the kernel needed to be convinced not to "touch" the data ... which was normally done by specifying "Full Range" and sending Limited Range data. Some TVs see that Full Range flag and switch to full range -> baaaaaang, and that was the only point of the kernel patch we discuss here.
Intel's Limited Mode was:

Everything you get is clamped to 16:235 - no matter if it was already clamped (!) ... that's the whole issue we discuss here. All the other stuff - color processing, shaders for BT.709/BT.601 conversion and so on - is fully implemented in Kodi; the kernel does nothing for us in that regard. We also get the initial decoded video surfaces without any color banding, just the original untouched decoded data.
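A minimal sketch (in Python, just for illustration - not driver or Kodi code) of what that "clamp it again, even though it is already limited" behaviour does to the levels:

Code:
# Map an 8-bit full-range value (0-255) into limited range (16-235).
def full_to_limited(v):
    return round(16 + v * (235 - 16) / 255)

for name, full in (("black", 0), ("white", 255)):
    once = full_to_limited(full)     # correct limited-range level
    twice = full_to_limited(once)    # scaled a second time by the driver
    print(name, "limited:", once, "double-scaled:", twice)

# Black ends up around 30 instead of 16 and white around 218 instead of 235 -
# raised blacks, dull whites, the typical washed-out picture.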

Is it now clear what this kernel fix is about?
I think you misunderstand: I can live with double scaling/clamping with YouTube, as long as my video playback in Kodi is fine. It beats manually setting my TV to full RGB each time I launch Chrome from Kodi to browse around.

So my point was to let Intel clamp whatever they want, but leave a loophole for video playback where we tell the GPU not to touch anything. I know it's not possible today, but my real question was whether this loophole could be implemented in the GPU driver (by Intel devs) or whether it is totally out of the question. In other words, would such a solution be possible to implement, or is it ruled out by hardware limitations?
That's what the Wayland guys want to do ... but yeah, as said, it's not there yet ... no API and no interface to tell them. Not a hw limitation, of course.

If you also use YouTube, I would set the screen to full and use Kodi's 8-bit dithering to upscale the content to full range -> best of both worlds.
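Roughly what that limited-to-full expansion with dithering means, as a sketch (Kodi's real dithering happens in its shaders; the simple random dither here is only to show the idea):

Code:
import random

# Stretch a limited-range value (16-235) to full range (0-255) and add a
# small dither so the 8-bit rounding steps do not show up as banding.
def limited_to_full_dithered(v):
    stretched = (v - 16) * 255 / (235 - 16)   # exact, fractional value
    stretched += random.uniform(-0.5, 0.5)    # cheap random dither
    return min(255, max(0, round(stretched)))

print([limited_to_full_dithered(v) for v in (16, 17, 110, 234, 235)])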
Question: Is it proposed that Wayland allows different areas of a screen, or different screens (rectangular? irregular?), to have different colour spaces, with the final rendering stage deciding on the appropriate output space and converting non-native content? Presumably this gets complex when transparency gets involved? Or do modern screen renderers not work on a direct memory->raster mapping for output?

(I used to work in broadcast, where picture processing engines typically used one of two approaches - read-side and write-side processing. Both had their merits and limitations, and back then a lot of the processing was done in hardware, or in software-controlled 'hardware' implemented in custom silicon.)
@noggin: Yes, that's what the Intel dev told me. For more details ask daniels on the #intel-gfx channel.
(2016-01-11, 14:12)fritsch Wrote: [ -> ]That's what the Wayland guys want to do ... but yeah, as said, it's not there yet ... no API and no interface to tell them. Not a hw limitation, of course.

If you also use YouTube, I would set the screen to full and use Kodi's 8-bit dithering to upscale the content to full range -> best of both worlds.

So are you saying that the Intel guys want to do it one way and the Wayland guys want to do it kind of the opposite way?
Yeah, I know I can use full with dithering. That was not my question :-) True, that is the best overall compromise, but ideally I want Kodi to be accurate and would rather have a less accurate desktop. Of course, if the drivers and desktop supported 10/12-bit, I could have my cake and eat it :-)
The Intel guy is the Wayland guy.
(2016-01-11, 12:44)fritsch Wrote: [ -> ]Intel wants to add such possibilities to Wayland, e.g. annotation of surfaces. It won't land for the next few years and work has not yet started - I am quite sure. I also sent the above code to Intel's mailing list, but chances are low to get that stuff in; as said, their Wayland devs don't like it. But in their view, OpenGL players like Kodi don't have a future anyway - one should use GStreamer, they said :-)

OK, so what you mean is that the Wayland guy who is also the Intel guy wants to implement it, but the rest of the Wayland team opposes it (at least for the time being).
No, nobody is against it. But every single application on the planet would need to be changed. It won't be a solution that you can use "this year" or "next year". If you can live without color-perfect files for the time being - all fine :-)

I searched for a solution that existing applications can use now.
Thanks for taking the time to answer, Fritsch :-) It's not that big of a problem, really. I already launch Chrome with a script. I just realised I could add an xrandr line to the script and another one to change back when I close Chrome :D (doh)
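For anyone wanting to do the same, a hypothetical version of such a launcher (written in Python for illustration; the connector name "HDMI1", the Chrome command and the exact "Broadcast RGB" values to use are assumptions - check xrandr --prop and adjust to your own setup):

Code:
import subprocess

OUTPUT = "HDMI1"   # assumed connector name - verify with `xrandr --prop`

def set_broadcast_rgb(value):
    # Intel's "Broadcast RGB" output property ("Full", "Limited 16:235", "Automatic")
    subprocess.run(["xrandr", "--output", OUTPUT,
                    "--set", "Broadcast RGB", value], check=True)

# Let the driver compress the full-range desktop/Chrome to match a TV left in
# limited mode, then switch back to pass-through for Kodi afterwards.
set_broadcast_rgb("Limited 16:235")
subprocess.run(["google-chrome"])   # blocks until the browser is closed
set_broadcast_rgb("Full")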
One more question Fritsch,

in this post http://forum.kodi.tv/showthread.php?tid=231955 you also write about changes that have to be made:

Quote:While watching an SD(!) video that is accelerated by VAAPI, e.g. MPEG-2 or H.264, click the film reel and choose:
Deinterlace: Auto (never set this to On - it will harm everything that is not interlaced)
Deinterlacing-Method: VAAPI-MCDI or VAAPI-MADI (Sandy Bridge) and VAAPI-BOB (BYT)
Scaling Method: Lanczos3 Optimized
and choose "save for all files". Remember to do this only in combination with the "scaling above" setting of 20% mentioned above. The Lanczos3 Optimized filter is too heavy for BYTs; here you might - depending on the file - choose Bilinear.

It is obviously clear that you won't see the VAAPI-MCDI settings when you play a video that is only software decoded.

and this:

Code:
<advancedsettings>
  <loglevel hide="false">0</loglevel>
  <cputempcommand>sensors|sed -ne "s/Core 0: \+[-+]\([0-9]\+\).*/\1 C/p"</cputempcommand>
  <gui>
    <algorithmdirtyregions>3</algorithmdirtyregions>
    <nofliptimeout>0</nofliptimeout>
  </gui>
  <video>
    <latency>
      <delay>50</delay>
      <refresh>
        <min>23</min>
        <max>24</max>
        <delay>175</delay> <!-- set to zero or adjust if audio seems out of sync with 24p movies -->
      </refresh>
    </latency>
  </video>
</advancedsettings>


Do I have to make those changes as well?

Thanks
Only the first part. Not the XML.