WIP 3dLUT support
#1
I'm working on 3dLUT support for calibration. I have a simple proof-of-concept implementation with the following features:
  • static dithering
  • ICC device link profile support
  • ICC device profile support with BT.1886 gamma
ICC profile loading, profile linking and transformation sampling are implemented using lcms2.
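Roughly, the sampling step looks like this (a simplified sketch of the approach rather than the actual patch - the function name is made up and error handling is omitted):

Code:
#include <lcms2.h>
#include <vector>

// Simplified sketch: sample an ICC device link into a size^3 float LUT.
// For a plain device profile, a synthetic BT.709/BT.1886 source profile
// would be linked in instead.
std::vector<float> SampleDeviceLink(const char *path, int size)
{
    cmsHPROFILE link = cmsOpenProfileFromFile(path, "r");
    // With a device link the output profile is NULL; the link already
    // contains the whole transform (float RGB in, float RGB out).
    cmsHTRANSFORM xform = cmsCreateTransform(link, TYPE_RGB_FLT,
                                             NULL, TYPE_RGB_FLT,
                                             INTENT_PERCEPTUAL, 0);

    std::vector<float> in(size * size * size * 3);
    std::vector<float> lut(in.size());
    size_t i = 0;
    for (int b = 0; b < size; b++)        // blue slowest, red fastest
        for (int g = 0; g < size; g++)
            for (int r = 0; r < size; r++) {
                in[i++] = (float)r / (size - 1);
                in[i++] = (float)g / (size - 1);
                in[i++] = (float)b / (size - 1);
            }
    cmsDoTransform(xform, in.data(), lut.data(), size * size * size);

    cmsDeleteTransform(xform);
    cmsCloseProfile(link);
    return lut;  // 65x65x65 grid when size == 65
}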

The shader and LUT loader support output curves, but I've disabled them for now. I'm not sure whether it's better to sample the whole transform or to parse the input and output curves and the LUT separately.

Current limitations:
  • Linux and GLSL only
  • probably doesn't work well with two-pass rendering (dithering applied too early)
  • only one profile file supported
  • using device profile, source is assumed to be BT.709
  • transformation sampled into a 65x65x65 LUT (see the sketch after this list)
  • scales input to full range RGB
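
On the GLSL side, the sampled grid is a natural fit for a 3D texture - with GL_LINEAR filtering the GPU does the trilinear interpolation between grid points for free. A rough upload sketch (not the actual loader code; assumes the data has been converted to 16 bits per channel):

Code:
#include <GL/gl.h>
// (glTexImage3D needs OpenGL 1.2+; extension/loader setup omitted.)

// Rough sketch: upload the sampled grid as a 3D texture. GL_LINEAR gives
// hardware trilinear interpolation between grid points; CLAMP_TO_EDGE
// keeps the endpoint values from wrapping around.
GLuint UploadLut3D(const unsigned short *rgb, int size)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_3D, tex);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_S, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_T, GL_CLAMP_TO_EDGE);
    glTexParameteri(GL_TEXTURE_3D, GL_TEXTURE_WRAP_R, GL_CLAMP_TO_EDGE);
    glTexImage3D(GL_TEXTURE_3D, 0, GL_RGB16, size, size, size, 0,
                 GL_RGB, GL_UNSIGNED_SHORT, rgb);
    return tex;
}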

Wishlist:
  • support for multiple device link profiles
  • configuration options for device profile linking (gamma, primaries etc)
  • other people might be interested in other platforms too...
  • photo viewer: embedded ICC profile support, 3dLUT for display

Goals:
  • eliminate banding from full range RGB output and provide more precision for 3dLUT output
  • display calibration
  • simulate different displays (for example, a video monitor calibrated to 2.2 gamma and BT.601)

The code is at https://github.com/laurimyllari/xbmc/tree/Gotham-3DLUT. To try it, place a device profile or device link profile as rec709.icc where XBMC finds it (I'm building from source and running without installing - the source tree root works for me).

I've followed the excellent ArgyllCMS tutorial to create the device profile. I can provide more detailed instructions later.

I've also been working on a simple pattern generator for more flexibility than dispcal provides. Currently it supports multiple displays, configurable pattern window size and constant or APL background. It has ordered dithering, but no support for gamma curves yet (would also need integration with dispcal).
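The ordered dithering is the usual Bayer-matrix thresholding - roughly like this (a minimal sketch, not the generator's exact code):

Code:
// Minimal ordered-dithering sketch: quantize a value in [0,1] to 8 bits
// using a 4x4 Bayer threshold matrix instead of plain rounding.
static const int kBayer4[4][4] = {
    {  0,  8,  2, 10 },
    { 12,  4, 14,  6 },
    {  3, 11,  1,  9 },
    { 15,  7, 13,  5 },
};

unsigned char DitherPixel(float v, int x, int y)
{
    // Per-pixel threshold in (0,1); replaces the constant 0.5 of rounding.
    float t = (kBayer4[y & 3][x & 3] + 0.5f) / 16.0f;
    int q = (int)(v * 255.0f + t);
    return (unsigned char)(q < 0 ? 0 : q > 255 ? 255 : q);
}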
#2
Really nice! I appreciate your work on this.
#3
Nice work.

The approach I had in mind is a bit different. I would use a full 256x256x256 LUT to do the YUV/RGB color conversion (+ any other color correction). Is there a particular reason to implement it like you did?

(2014-08-23, 01:30)lmyllari Wrote: probably doesn't work well with two-pass rendering (dithering applied too early)
Because you apply the dither pattern before scaling? This should be fine for 1080p content where no scaling is needed.
#4
Do note that it would probably be better to work against our code from the master branch. That's our main development branch, and there could be several changes compared to Gotham. Otherwise it could cause some headaches rebasing your work onto it later.
#5
(2014-08-24, 08:21)FernetMenta Wrote: Really nice! I appreciate your work on this.
Thank you! :)

(2014-08-24, 22:00)membrane Wrote: Nice work.

The approach I had in mind is a bit different. I would use a full 256x256x256 LUT to do the YUV/RGB color conversion (+ any other color correction). Is there a particular reason to implement it like you did?
Thanks!

I chose 65x65x65 because that is the usual LUT resolution in the ICC profile. Initially I copied the data directly, and applied output curves separately. Then I noticed that I'd need the input curves too in some cases, and switched to sampling the whole transform.

The LUT resolution could easily be made configurable.

I initially fed YCbCr to the LUT, but ran into issues creating a suitable device link with ArgyllCMS collink (it requires "-n" when doing YCbCr->RGB). I didn't want to look into it further before I knew that the rest was ok.

I did leave in variables for lift and gain ("brightness" and "contrast") in the BT.1886 gamma calculator. I think that might be the best place for them - assuming it's not too slow to recreate the LUT when the user adjusts them.
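
For reference, the BT.1886 EOTF itself is simple - the a and b coefficients are derived from the white and black levels, which is where gain and lift naturally map. A sketch of the published formula (not the exact code in the branch):

Code:
#include <cmath>

// BT.1886 EOTF (ITU-R BT.1886): maps a normalized signal V in [0,1] to
// luminance, given white level Lw and black level Lb in cd/m^2.
// "Contrast"/"brightness" style controls can be folded into Lw/Lb.
double Bt1886Eotf(double V, double Lw, double Lb)
{
    const double gamma = 2.4;
    double lw = std::pow(Lw, 1.0 / gamma);
    double lb = std::pow(Lb, 1.0 / gamma);
    double a = std::pow(lw - lb, gamma);   // effectively the gain
    double b = lb / (lw - lb);             // effectively the lift
    double v = V + b;
    return a * std::pow(v > 0.0 ? v : 0.0, gamma);
}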

(2014-08-24, 22:00)membrane Wrote:
(2014-08-23, 01:30)lmyllari Wrote: probably doesn't work well with two-pass rendering (dithering applied too early)
Because you apply the dither pattern before scaling? This should be fine for 1080p content where no scaling is needed.
Agreed.

I'd like to do it properly to support resolutions other than 1080p, but haven't figured out how yet. The first pass output would need to be kept in higher precision.
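
Something like a half-float render target for the first pass should do it - an untested sketch (GL_RGBA16F needs GL 3.0-class hardware):

Code:
#include <GL/gl.h>
// (GL_RGBA16F / FBOs need OpenGL 3.0; extension/loader setup omitted.)

// Untested sketch: render the first pass into a half-float target so the
// extra precision survives until dithering in the final pass.
GLuint CreateFp16Target(int w, int h)
{
    GLuint tex, fbo;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA16F, w, h, 0,
                 GL_RGBA, GL_HALF_FLOAT, NULL);
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);
    glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                           GL_TEXTURE_2D, tex, 0);
    return fbo;
}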

(2014-08-24, 22:04)Martijn Wrote: Do note that it would probably be better to work against our code from the master branch. That's our main development branch, and there could be several changes compared to Gotham. Otherwise it could cause some headaches rebasing your work onto it later.
I'll do that.


I'm not sure who "notspiff" on github is, but thank you for the review and very helpful feedback!
#6
Notspiff is cptspiff - a long-time xbmc developer.
#7
aka ironic_monkey on the forums
#8
(2014-08-23, 01:30)lmyllari Wrote: I've also been working on a simple pattern generator for more flexibility than dispcal provides.

This part seems very interesting to me :)

Would be really nice if there was a way to perform a calibration directly in xbmc. Is this what you are doing, or am I misunderstanding?

It's very cool to see someone step up and tackle such a high-end feature.

Definitely subscribing to this thread.
#9
(2014-08-27, 03:00)sebj Wrote:
(2014-08-23, 01:30)lmyllari Wrote: I've also been working on a simple pattern generator for more flexibility than dispcal provides.
This part seems very interesting to me :)

Would be really nice if there was a way to perform a calibration directly in xbmc. Is this what you are doing, or am I misunderstanding?
It's a standalone qt program. There are two ways forward that I've considered - either implement the pattern generator in xbmc (as a plugin, maybe?), or integrate with x264 and stream the output to any player. The latter would have the advantage of using the actual signal path, whatever it happens to be.

For now it's just a tool to provide APL patterns for dispcal to calibrate plasmas. ;)
#10
(2014-08-23, 01:30)lmyllari Wrote: ..., or integrate with x264 and stream the output to any player. The latter would have the advantage of using the actual signal path, whatever it happens to be.


Is this even possible?

I hope people realize how deep the rabbit hole goes and how innovative this idea is (to my knowledge).

Makes absolute sense.

lmyllari, you are really taking this seriously!
#11
(2014-08-27, 18:04)sebj Wrote:
(2014-08-23, 01:30)lmyllari Wrote: ..., or integrate with x264 and stream the output to any player. The latter would have the advantage of using the actual signal path, whatever it happens to be.
Is this even possible?

I hope people realize how deep the rabbit hole goes and how innovative this idea is.
I don't think there are any fundamental issues with it, and I think one of the commercial packages offers something like this. I was kind of bummed out to see that, after thinking I'd come up with a neat, unique idea. :P

My idea for the implementation is something like this:
  • optional rendering to a bitmap instead of a window (should be easy with qt)
  • encode the bitmap with x264 (timing needs some thought to be suitable for streaming: frames must come fast enough without building up too much of a buffer, which would delay pattern switches) - see the sketch after this list
  • make the stream available with a simple http server, or maybe even plain udp if that works
  • use xupnpd to announce stream url to dlna devices
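
The encoding step might look roughly like this - an untested sketch on my part, but zerolatency tuning should keep the encoder from building up frames:

Code:
#include <stdint.h>
#include <x264.h>

// Untested sketch of the encoding step: zerolatency tuning keeps the
// encoder from buffering frames, so pattern switches show up quickly.
x264_t *OpenPatternEncoder(int w, int h, int fps)
{
    x264_param_t p;
    x264_param_default_preset(&p, "veryfast", "zerolatency");
    p.i_width = w;
    p.i_height = h;
    p.i_fps_num = fps;
    p.i_fps_den = 1;
    p.i_csp = X264_CSP_I420;
    p.b_repeat_headers = 1;   // self-contained stream for simple clients
    x264_param_apply_profile(&p, "high");
    return x264_encoder_open(&p);
}

// Encode one pattern frame (YUV planes already filled in 'pic');
// the returned Annex-B payload goes to the HTTP/UDP sender.
int EncodeFrame(x264_t *enc, x264_picture_t *pic, int64_t pts,
                x264_nal_t **nal)
{
    int nnal;
    x264_picture_t out;
    pic->i_pts = pts;
    return x264_encoder_encode(enc, nal, &nnal, pic, &out);
}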
#12
@lmyllari

If I understand correctly, you are thinking of:

  • running an app on a laptop, for example
  • remotely using xbmc to play the patterns in a scripted way, probably driven by the app running on the laptop?

A la "calman", right?
#13
I'm very interested in trying this, but I'm also very "green" with XBMC in general. I run OpenELEC on a Chromebox.

Is there any way I can add this code to my OpenELEC installation? And could someone possibly summarize the main points of how I would go about it?

Thank You!

-Brian
#14
Ok... no reply coming, I guess.

I'll just say I'm very interested in this topic (provided use of the 3DLUT doesn't bring the playback system to a crawl - I don't know, because I have never used one before).

I enjoy monitor and projector calibration, and used properly, a 3D LUT takes it to a much more detailed level.

I don't know if the author lost interest in this, but if I had a choice, it would be packaged as an add-on that simply makes it easy to browse for and install the 3DLUT.

A separate system could be used for making the 3DLUT, which is no problem... any PC-type device can do that easily with freeware tools.

Ultimately I would like to see this as a simple option in the "video" settings. That would be so nice.

Of course fancier things like the ability to generate the LUT and other calibration functions would always be welcome.

I think the best way for me to proceed with the work that was already done is to try to add it to a system I use that's running on a Mac mini... that system is not part of my projection setup, and running on a full computer-type system may make it easier for me to figure out how to add the new code.

-Brian
#15
Bringing this back into focus, hopefully...

Looks like this was done using older code. Can it be implemented in current revisions? I know the majority of users won't use it, but it would give me the ability to tune my projector approximately perfectly, so I really want it. :)