WIP 3dLUT support
Hey there,

I think it's great and important that you're working on this. Good 3dlut support is what would finally get me to switch from W7 to Linux.

Unfortunately, it's not there yet and so I'd like to share my experiences comparing W7 (MPC-HC & madvr) to the latest Kodi nightly on Linux Mint.

This was the file I tested with: https://www.techpowerup.com/downloads/53...ddplus-5-1
My LUT is about 96MB in size.
System information:
Motherboard: ASRock H110M-ITX
CPU: Intel G3900T
RAM: 16GB
GPU: Intel Integrated (1GB shared VRAM in Windows, 256MB in Linux - as far as I've read, one can't change that, and people think it's fine that way)

In Windows, it plays just fine and CPU usage averages ~20%.
Linux - lots of stuttering, CPU usage at ~50%. (And even without a 3dlut, CPU usage stays at ~50%; there's still slight stuttering and a lot of frame tearing.)
Screenshot of the system monitor: http://i.imgur.com/I7xNJUT.png

I don't know whether any of this is news to anyone here, but I hope it'll either somehow help development or that you have a special trick you can tell me to magically speed things up, because I'd really like to finally ditch Windows. But being able to play movies in good quality is more important to me than security...
Reply
(2016-08-08, 14:31)sh4dow Wrote: Unfortunately, it's not there yet and so I'd like to share my experiences comparing W7 (MPC-HC & madvr) to the latest Kodi nightly on Linux Mint.
I would first test it with one of the LibreELEC test builds.

If that gives you good results, you can start looking into your Mint setup.

The performance may not be very good yet; I have been using an i7 for development and haven't had a chance to look more into optimizations. I will, though, when I have time.
Reply
Huh... that's interesting... #0808 shows lower CPU usage than even MPC-HC & madVR. Just 10% on average! (Although I suppose that might also have to do with the various image enhancements madVR applies with its default settings.)
And there isn't really a difference between using a 3dlut file and not using one. Hell, one might have to actually log it and analyze the values, but unless my perception tricked me, the cores were loaded more evenly with the LUT than without it.
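For what it's worth, logging it is easy enough; a throwaway sketch (assumes the third-party psutil package is installed):

```python
import time
import psutil  # third-party: pip install psutil

# Sample per-core CPU usage once a second during playback, so the
# "cores loaded more evenly with the LUT" impression can be checked
# against real numbers instead of eyeballing the system monitor.
with open("cpu_log.csv", "w") as f:
    for _ in range(60):  # one minute of samples
        loads = psutil.cpu_percent(interval=1.0, percpu=True)
        f.write(f"{time.time():.0f}," + ",".join(f"{v:.1f}" for v in loads) + "\n")
```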

However... there were issues with artifacts. Very noticeable e.g. at ~1:02 of said sample, when the image is darkened (by which I mean that the darkening happens in the video, not manually decreasing the brightness or something). Lots of blocking artifacts that are not present in MPC-HC.
And it's not related to the LUT.

Also... even if I were to ignore the artifacts, try other builds, etc. - how does that help me with Mint? After all, Mint 18 is pretty new, so I'd assume the drivers aren't outdated?
Reply
sh4dow, the new test builds use the latest code and drivers to pass decoded video directly from the decoder into textures. I don't think that's possible on Windows, which may be why you're seeing lower CPU usage.

If you turn on debug logging, you should see whether the 3dlut is applied or not.

I suggest using something other than VC-1 for testing, as the Intel drivers still have some bugs in that area. Alternatively, turn off VC-1 hardware decoding in the expert settings.

If you get everything working the way you'd like in LibreELEC, you can head over to the support forum and check out http://forum.kodi.tv/showthread.php?tid=231955
Reply
I've updated the help strings in https://github.com/xbmc/xbmc/pull/10296
Reply
I think you misunderstood my point about there being no difference when using the 3dlut: I meant no difference in terms of performance.
I could see that it was applied because... well... it simply changes the image quite a bit. Smile

Anyway - I really appreciate that you pointed out that thread, but since I've realized that all of the open ports and vulnerabilities on Windows aren't that big a deal if you're halfway security-conscious, I've decided to just stick with W7 and madVR until driver support stops and my hardware gets too old. So... see you in 5-10 years. Wink
Reply
Right, I misunderstood. My displays are pretty well calibrated, so there's not much difference when using the 3dlut. Smile You shouldn't see a difference in CPU use, as the processing is done on the GPU.
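Conceptually, the shader just does a trilinear lookup of each pixel's RGB through the LUT's 3D texture. A rough CPU-side illustration of the same operation (not Kodi's actual code, just the idea; assumes NumPy and SciPy):

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def apply_3dlut(image, lut):
    """Apply a 3DLUT via trilinear interpolation.

    image: float array of shape (H, W, 3), values in [0, 1]
    lut:   float array of shape (N, N, N, 3) mapping input RGB
           (indexed r, g, b) to corrected output RGB
    """
    n = lut.shape[0]
    axis = np.linspace(0.0, 1.0, n)
    # The CPU equivalent of a GPU 3D-texture fetch with linear filtering.
    interp = RegularGridInterpolator((axis, axis, axis), lut)
    return interp(image.reshape(-1, 3)).reshape(image.shape)

# Smoke test: an identity LUT must leave the image unchanged.
g = np.linspace(0.0, 1.0, 17)
identity = np.stack(np.meshgrid(g, g, g, indexing="ij"), axis=-1)
img = np.random.rand(4, 4, 3)
assert np.allclose(apply_3dlut(img, identity), img, atol=1e-6)
```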

Maybe someone will port this to Windows Kodi as things stabilize.
Reply
I finally got round to trying this out with beta1 of Krypton running on Kodibuntu 14.04, on a Chromebox with a Celeron 2955U.

Unfortunately the files Calman generates when interfacing with madVR don't seem to be compatible. Not wanting to do another 5000-point reading straight away, I used the 3dlut file someone else posted earlier in the thread. I'm happy to say it works well, with negligible extra CPU/GPU usage.

What was the reason it struggled on Celeron earlier?

I might be forced to use DisplayCAL if Calman doesn't support the file format. But DisplayCAL doesn't support my meter yet, so I'll be forced to use an X-Rite i1Display Pro, which is going to take ages with a 5000pt reading. I also don't particularly trust that meter, especially when it comes to low-light handling and reading plasmas in general.

It's a great beginning. In the future it might be cool to auto-switch 3DLUTs depending on the film (i.e. depending on whether it's coded in Rec.2020 or Rec.709), but of course that will require some other changes in Kodi and/or FFmpeg. Maybe even include some sort of tone mapping from HDR to SDR like madVR does.
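The detection side at least would be simple, since the colour metadata is already in the stream. A hypothetical sketch using ffprobe (the LUT paths and file name are made-up examples):

```python
import json
import subprocess

# Map the stream's colour primaries to a LUT file.
# These paths are hypothetical examples.
LUTS = {
    "bt2020": "/storage/luts/display-rec2020.3dlut",
    "bt709": "/storage/luts/display-rec709.3dlut",
}

def pick_lut(video_path):
    """Read the colour primaries of the first video stream with
    ffprobe and return the matching LUT path (default: Rec.709)."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-select_streams", "v:0",
         "-show_entries", "stream=color_primaries",
         "-of", "json", video_path],
        capture_output=True, text=True, check=True).stdout
    primaries = json.loads(out)["streams"][0].get("color_primaries", "bt709")
    return LUTS.get(primaries, LUTS["bt709"])

print(pick_lut("sample.mkv"))  # hypothetical file name
```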

Of course, all of these things would require using the display's native color primaries/gamut and maximum brightness (not brightness as in black level, but as in contrast), a 10bit display, and updated Intel GfX drivers for Linux that support 10/12bit. (I think Kodi is ready for this since the framebuffer is now 16bit?)

Anyways, I'm blabbing.
Reply
Soli, thanks for testing and glad to hear about your Celeron success! If you share the 3dlut file with me, I'll try to fix the parser - it implements the bare minimum at the moment. No promises on the schedule though, work is keeping me busy...
Reply
Thanks. Here, have a look, should be pretty trivial.
https://drive.google.com/open?id=0B9Lr85...HJlbUtJT1E
Reply
Oh, btw: Calman has had an internal cube generator since v5.4. I've used it before, but I was so set on trying madVR that I forgot about it... Seems that with the internal cube generator, I can choose the format of the generated 3DLUT. (Which is something I remember doing earlier, so I was a little surprised at first when Calman only supported one type of output when interfacing with madVR.)

But I do think Kodi should support the eeColor txt format too, as it's really easy to implement and really is a 3DLUT file in its most basic form (a tabbed/spaced text format).

Edit: Seems the *.3dl file Calman generates isn't compatible. In case you want to take a look at it, it's here: https://drive.google.com/open?id=0B9Lr85...y1XMW5Da0k
Reply
Does Calman not support the madVR 3dlut format?

It might be better to write a script to convert the text files to the binary format.
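Something along these lines could be a starting point, assuming an eeColor-style text file with whitespace-separated values and the output RGB triplet in the last three columns of each line (the actual madVR .3dlut container needs its own header, which isn't documented in this thread, so this only dumps the raw table):

```python
import sys
import numpy as np

def read_eecolor_txt(path):
    """Parse an eeColor-style text 3DLUT: one lattice point per line,
    whitespace-separated, output RGB assumed in the last three columns
    and normalized to [0, 1] (scale first if yours uses code values)."""
    rows = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if len(parts) < 3:
                continue  # skip blank lines
            rows.append([float(v) for v in parts[-3:]])
    return np.asarray(rows, dtype=np.float32)

def write_raw_lut(table, path):
    """Dump the table as little-endian 16-bit values. NOTE: this is
    NOT the madVR .3dlut container, just the raw lattice data."""
    scaled = (np.clip(table, 0.0, 1.0) * 65535.0).round()
    scaled.astype("<u2").tofile(path)

if __name__ == "__main__":
    table = read_eecolor_txt(sys.argv[1])
    write_raw_lut(table, sys.argv[2])
    print(f"{len(table)} lattice points written")
```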
Reply
It doesn't seem that way. Calman generates (eeColor) .txt files that madVR also accepts.
It seems strange to support only a proprietary 3DLUT format from a closed-source renderer.
Even though madVR files are binary, they are also very big. The one I downloaded was 90MB, the eeColor txt file was less than 18MB, and the 3dl file was less than 4MB.
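(The size gap is mostly lattice resolution and encoding: if the madVR binary stores a full 256³ lattice at 16 bits per channel, that works out to 256³ × 3 × 2 bytes ≈ 96MB, which would line up with the ~90MB file here and the ~96MB LUT mentioned at the start of the thread, while text and .3dl files typically carry a much coarser lattice, e.g. 65³ or 33³ points, and leave the rest to interpolation.)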

Anyways, I tried using the file through my hardware eeColor 3DLUT, and apart from correcting gamma just a tad around the 0-5% level (my hardware controls are too coarse and I figured 2% over was better than 4% under), everything else was exactly the same. That doesn't warrant having the hardware in the chain, since I literally have no physical space left. I'm unsure if the i1d3 is sensitive enough to correct the near-black shades (black to around 5%), but I'll have a go anyways.
Reply
How would you guys recommend I create a 3DLUT file on my system?

I'm using the latest LibreELEC test build (i5 CPU, Intel graphics) and I have an i1Display Pro.

Thanks Smile
Reply
This is a big subject. Doing a 3DLUT properly is for advanced users who already know how to calibrate their TVs manually; walking you through all of that is outside the scope of this thread.

So... I'm gonna assume you have a laptop with HDMI output, and I'm going to assume you know how to manually calibrate your TV with HCFR (the less the 3DLUT has to correct, the better it will be). Use madLevelsTweaker to force RGB Full, but choose video levels inside HCFR. Disable overscan and all other enhancements on your TV. Calibrate your 2pt white balance, set the CMS towards Rec.709 (a slightly bigger gamut is OK), gamma 2.4. If your TV has Black Frame Insertion, enable it (it's probably disguised as Smart LED or something else); it will give you better motion resolution, so it's a good thing. Everything else you want off, although you might want to experiment later with different levels of motion interpolation.

3DLUT: DisplayCAL and madTPG. The easiest would once again be to use the HDMI output from your laptop, use madLevelsTweaker to force RGB Full (which you should already have done earlier when calibrating manually), and choose limited range in madTPG. In DisplayCAL, choose limited range as well. You may use the pattern generator in DisplayCAL or madTPG; it doesn't really matter. DisplayCAL needs to connect to madTPG, so enable network access in madTPG.

Set the target to Rec.709. Make one LUT with gamma 2.2 for daytime and one with gamma 2.4 or BT.1886 for nighttime. It will take a long time with the i1Display Pro, so try it out with a limited number of readings first.
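Before committing to a full measurement run, it can also help to sanity-check the playback chain with an identity LUT, which should leave the picture untouched. A small sketch that writes one in the plain-text triplet form discussed above (the 65-point lattice and the blue-fastest channel order are assumptions; adjust to whatever your chain expects):

```python
import itertools
import numpy as np

def write_identity_lut(path, size=65):
    """Write an identity 3DLUT as plain text: one 'R G B' output
    triplet per line, blue varying fastest. Played back through a
    correct LUT pipeline, the image should look completely unchanged."""
    steps = np.linspace(0.0, 1.0, size)
    with open(path, "w") as f:
        for r, g, b in itertools.product(steps, repeat=3):
            f.write(f"{r:.6f} {g:.6f} {b:.6f}\n")

write_identity_lut("identity_65.txt")
```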
Reply