WIP 3dLUT support
Kodi has to be compiled with liblcms2 support; merely installing liblcms2 afterwards doesn't bring back the missing settings.
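One quick way to check whether a given Kodi build was actually linked against liblcms2 is a dependency check. This is just a sketch; the binary name and path (e.g. `/usr/lib/kodi/kodi.bin`) vary per distro:

```shell
# Helper: report whether a binary is dynamically linked against a library.
links_lib() {
    ldd "$1" 2>/dev/null | grep -q "$2"
}

# The actual Kodi binary path differs between distros;
# /usr/lib/kodi/kodi.bin is only an example fallback.
kodi_bin="$(command -v kodi.bin || command -v kodi || echo /usr/lib/kodi/kodi.bin)"

if links_lib "$kodi_bin" lcms2; then
    echo "Kodi is linked against liblcms2"
else
    echo "liblcms2 not linked - the colour management settings stay hidden"
fi
```

If the second message appears, the package was built without lcms2 and no amount of installing the library afterwards will surface the settings.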
Reply
If anybody here can help, please reply.
My Kodi will not open.
Please suggest what I should do.
Reply
(2017-04-10, 18:50)((( atom ))) Wrote:
(2017-03-30, 14:16)Soli Wrote: You also have to be sure to match limited and/or full range, which is a whole topic in itself. ICC files are mainly full range, whereas 3DLUTs for video are normally limited range (although you can surely override this if needed). Maybe this explains some of the differences.

As said, the LUTs I initially made looked worse than the corresponding ICC profiles. I then went over to the displaycal forum and the author asked me to send all the stuff in. He said I should not have measured using full RGB output and that I should set the entire chain to limited output.

Well, I did use full RGB, because I understood from the "Video levels and color space" Kodi wiki page that it is the best way: see section 2.2
http://kodi.wiki/view/Video_levels_and_color_space

The author of displaycal now told me I could use additional arguments in displaycal to make argyll produce limited-range output. This would then match the setup described as best in the wiki.

How do you guys set up measuring and playback?

That wiki might be a little hard to understand. I mean, I understand all of this stuff but when I look at those graphs with the smiley faces it makes my head explode.

There are basically three choices:
1. Kodi at default (full), GPU at full, TV at full. Video levels are scaled once. (Most common with dedicated PC monitors.)
2. Kodi at default (full), GPU at limited, TV at limited. Video levels are scaled twice; probably the most common setup for "normal" people who are using Kodi with an HDTV (all platforms: Intel, Android, LibreELEC and so on).
3. Kodi at limited, GPU at full, TV at limited. Video levels are not scaled, preserving BTB and WTW; the only right mode to maximize image quality.
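To make the scaling concrete, here is a small sketch (my own illustration, not Kodi or displaycal code) of the limited-to-full expansion a GPU or TV performs, and why it must happen exactly once in the chain:

```python
def expand_limited_to_full(v):
    """Expand a limited-range 8-bit code (16 = black, 235 = white)
    to full range (0-255), as a GPU or TV does when set to expand.

    Codes below 16 (blacker-than-black, BTB) and above 235
    (whiter-than-white, WTW) are clipped by the expansion, which is
    why it must happen exactly once."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(expand_limited_to_full(16))   # 0   - video black lands on full black
print(expand_limited_to_full(235))  # 255 - video white lands on full white
print(expand_limited_to_full(4))    # 0   - BTB information is clipped away
print(expand_limited_to_full(240))  # 255 - WTW information is clipped away
```

In the "Kodi at limited, GPU at full" setup the video codes pass through the PC untouched, so the TV still receives the BTB/WTW headroom and performs the single expansion itself.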

Sorry to repeat this if you already know it, but the same kind of thing goes for displaycal when you want to make the most accurate LUTs: you tell displaycal to use limited range, but the GPU has to be full and the TV limited.
So the author of displaycal is wrong; never, ever set the entire chain to limited. The GPU always has to be full, since that leaves the video input/output untouched.
Reply
OK, I understand it better now. Setting displaycal to output limited range using -E in the advanced options for displaycal and dispread didn't seem to work; I got the same results as without it.

Anyway, I seem to have a problem with the Spyder5. I get a reddish look, which is very bad even in the dark regions. I tried various tips, but it's still all reddish.
Reply
Just checked: uninstalling kodi-git (self-compiled) from my laptop and replacing it with the regular kodi package from the Arch repository still gives me the ICC-profile option. This means it was compiled with liblcms2. The option does not show up on my video-player machine, so something must be missing there. This is very bad, because I could at least make use of those profiles..
Reply
Sometimes it's best to fast forward and do the debugging afterwards.

Try cloning your laptop install to the new machine with GParted Live.
ICC should then be available, right? Try with an ICC profile.
My guess is you'll still have a reddish tone.

I don't think Spyders are anywhere near sensitive enough to be used with projectors (which is probably also why you get black crush). I know JVC autocal supports them, but I guess the software includes a special profile for the meter and you are then supposed to measure the lens directly rather than the screen.

If you did measure the screen, did you try loading other profiles? (Not that it's 100% accurate but should/could be more accurate than the default profile)

I read that the differences between the Spyder 5 Elite and the Pro include projector calibration. Since the sensors are the same, the limitation is in the included Spyder software. Of course you don't have that limitation in Displaycal, but if you think about it, the Spyder software can't know whether you are measuring an HDTV or a projector.

This means the limitation has to be either (1) that the maximum Y value (luminance) the sensor will register is artificially "stopped down", with the result that you cannot measure directly from the projector lens, since the luminance of the lens would be higher than this artificial limit, and/or (2) that the Spyder software won't apply a projector profile (an offset correction) for projector calibration, meaning that you will get inaccurate results. (And that the difference somehow is noticeable enough to justify a premium price increase too.) This means using Displaycal without a correction profile for projection will yield inaccurate results too.

Sorry I don't have time to google the exact difference on my ipad. Only thing that's sure is that you're asking for trouble if you measure a projector screen with a Spyder.

The time I've saved by not having to second guess things all the time, is why I've spent a fortune on some pro meters.. Smile
Reply
Cloning the machine is not really an option. It is a fully blown production laptop that would not even fit on the 40 GB SSD of the dedicated player. I only need to figure out which package is missing, and then it's a ten-second thing..

I just ordered an xrite i1 Display pro to make sure to have exact measurements.

Displaycal has a correction profile for projectors, and people have successfully profiled projectors with Displaycal. That should work.

Hopefully someone here knows what package is missing, and hopefully the i1 will give better results. I will wait for that before I try anything else, especially since measurement runs always take all night, with the projector running, which is annoying and costs lamp life. The i1 is at least twice as fast, so I will perhaps even get to see the results the same evening.

Gesendet von meinem SM-G920F mit Tapatalk
Reply
Kodi v18 for Windows will also support 3DLUTs -> https://github.com/xbmc/xbmc/pull/11982
Reply
So the xrite i1 Display Pro arrived. Measuring speed is way higher and the overall results are very satisfying as far as I can tell. I have only checked a few scenes so far, but what I saw was stunning. Smile

Most amazing what kodi can do with that now... Thx again for that! Smile

Gesendet von meinem SM-G920F mit Tapatalk
Reply
Not a big surprise! The i1Display Pro, while having an almost identical price to the Spyder 5 Elite, represents much better value. To be fair, it does cost quite a bit more than a Spyder5 Pro or even Express (which is really the same hardware Huh). Anyway, the i1Display Pro is much faster, more sensitive and more accurate, and not least: much more consistent. Sometimes you can get it for a bargain price too.

I guess Datacolor (the parent company of the Spyders) was the first to offer calibration solutions for the consumer/prosumer market, and I guess that's why a "Spyder" is still ubiquitous as the "calibration meter/puck" for many.
Reply
Not the first, really; they just had way more, and better, marketing. But X-Rite have had the distinctly better option available for most of the last 15 years, except maybe for a brief period before the i1Display 2 was replaced by the Pro.

But X-Rite are a badly run company that have consistently failed to understand how to approach the consumer market...
Addons I wrote &/or maintain:
OzWeather (Australian BOM weather) | Check Previous Episode | Playback Resumer | Unpause Jumpback | XSqueezeDisplay | (Legacy - XSqueeze & XZen)
Sorry, no help w/out a *full debug log*.
Reply
This is OT but whatever.. I guess we're the 0.000001% of the userbase anyway.. Smile
Yes, no, yes... I think we mostly agree. It's not that long ago that calibration meters cost thousands.. the market was a bit fragmented all over, and Datacolor obviously saw fit to consolidate it, so to speak.

I guess in the old days X-rite always made money in the pro/nearly-pro market: paint, prints, you name it, and with yearly recalibrations that amounts to quite a bit. At least they made enough money to buy Pantone, Gretag Macbeth, Monaco, and the company behind the Chroma5 colorimeter.
They got the "i1" line from Gretag and dropped their own products. Gretag Macbeth was behind the legacy i1display products, including the infamous i1Display2.
The DTP94 colorimeter they got from Monaco (it was called Optix before the merger). This is a great device even to this day, although it's not accurate with modern display technologies and needs a dark reading every 10 minutes. (You could probably override this and it'd be mostly fine, but don't quote me on that.) But it used glass filters that don't deteriorate (as much), and it was pretty sensitive, so it was great for measuring monitors that had a low black level.
The Chroma5 was from a company I can't remember the name of. This was seen as a pretty good colorimeter back in the day. The DTP94 was really superior but the Chroma5 didn't need constant dark readings.

They kept the DTP94, i1Display 2, and Chroma5 around for a while, plus their own crappy legacy colorimeters (DTP92?), and at one time they even had a big and expensive colorimeter called the X-rite Hubble that had laser pointing and needed to be mounted on a tripod. They dropped all of these one by one and introduced the i1Display Pro. It had the glass filters from the DTP94 and took them a step further by sealing them with rubber O-rings behind a light-mixing chamber, the same chamber giving it superior luminance sensitivity. It also took a page from the Chroma5/Display2 by not needing dark readings. And they individually calibrate each i1Display Pro to a known reference. It was also better in all possible ways than even the X-Rite Hubble, save for the laser pointing mechanism, at about 1/15 of the cost. Today's version of the i1Display Pro is a revision B; the main difference is better refresh-rate detection and synchronization (AIO), and it may be ever so slightly faster when using the new AIO mode.

The consensus is that they really did hit a home run with the i1Display Pro. Still, it does cost quite a bit so I guess that's why there's still a market for Spyders.
The alternative is to save about 40% of the cost by buying the Colormunki Display. It's the same hardware as the i1Display Pro but comes with crappier software (which doesn't matter if you use HCFR or Displaycal). The firmware is crippled, so the speed is limited to about one reading per second. Mind you, that's still faster than a Spyder5 Pro at a lower cost Big Grin
Reply
Yeah, mine was intended to be the short version of all that ^ .... I've been along for that whole ride.
Addons I wrote &/or maintain:
OzWeather (Australian BOM weather) | Check Previous Episode | Playback Resumer | Unpause Jumpback | XSqueezeDisplay | (Legacy - XSqueeze & XZen)
Sorry, no help w/out a *full debug log*.
Reply
Good meter, that. The better my measurements became, the smaller the difference to the calibrated but unprofiled projector. When I finally figured out that the displaycal option "apply vcgt correction to 3DLUT", which defaults to "on", leads to the correction being applied twice, I was able to produce perfect correction LUTs by leaving it off. Now using the 3DLUT correction leads to nearly no visible difference, haha! The projector is simply very, very good. The amazing colors I saw along the way were the correction being applied twice; in the end they were just wrong and became annoying after a while.
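For intuition on why a double-applied vcgt can look "amazing" at first but is wrong, here is a toy sketch (my own illustration using pure power curves; real vcgt tables are per-channel 1D LUTs, not simple gammas):

```python
def vcgt(x, native=2.4, target=2.2):
    """Toy 1D calibration curve: pre-distorts an input x in [0, 1] so that
    a display with `native` gamma ends up showing the `target` gamma."""
    return x ** (target / native)

x = 0.5
once = vcgt(x)         # correction applied once (GPU ramp OR baked into the 3DLUT)
twice = vcgt(vcgt(x))  # correction baked into the 3DLUT *and* still in the GPU ramp

# Applied once, the display's native gamma cancels to the target exactly:
assert abs(once ** 2.4 - x ** 2.2) < 1e-9

# Applied twice, the effective gamma becomes 2.2 * (2.2 / 2.4), about 2.02,
# so the image is lifted and overcorrected - punchy at first, but wrong:
assert twice ** 2.4 > x ** 2.2
print(twice ** 2.4, x ** 2.2)
```

With the vcgt loaded once (in the GPU ramp) and also baked into the 3DLUT, you get exactly this double application, which is why turning the displaycal option off fixed it in this particular setup.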

What I am now desperately looking for is a way to switch between "corrected" and "not corrected" without going through the menu. Going through the menu doesn't allow me to switch fast enough to really see what is different, since the two results are so close. I could not find a way to do that using the regular keymap. It is simply not implemented, at least not officially. http://kodi.wiki/view/keymap

Any hints?
Reply
Atom, it's supposed to be on; that's the gamma/white-balance correction. It's the most important thing to get right, which means it's also the most visible thing, which means that if you get it wrong it will stick out like a sore thumb Wink

Of course your whites aren't reddish anymore.. you didn't do anything to them! But at the same time a non-corrected projector (it's almost the rule with few exceptions) has a very non-linear gamma, so this is something you want to correct.

Just use HCFR to get the white balance as neutral as possible (so that the respective R, G and B lines are together, but not necessarily a totally straight line overall), adjust the gamma as best you can, but prioritize getting it neutral and develop a "feel" for when enough is enough. Don't overdo anything (among other things, that can introduce new errors on *real* content), as the 3DLUT (with vcgt) will take care of the rest. Use the relative colorimetric rendering intent, as I understand that will leave the whites alone except to correct for gamma. You may have to experiment..

Especially when it comes to projector calibrations, the i1Display Pro alone, even though it's a great meter, still has limitations. I'm not saying it's at fault, but it is something to keep in mind, so don't be afraid to experiment... even thinking outside the box sometimes.

Afedchin: dunno if you already thought of this, but for Windows it would be logical to follow madvr's lead and autodetect if vcgt is included in the 3DLUT or not, and if included then reset the GPU 1DLUT accordingly when loading the 3DLUT.. it's a good idea for Linux as well, but not that important obviously..
Reply