Win HOW TO - Set up madVR for Kodi DSPlayer & External Players
@Warner306 I keep having random noise problems, and I can't quite find the right balance when splitting my processing power between image upscaling and reducing random noise. Should I use image enhancements to increase sharpness instead? I'm not really sure what the best mix of the three is. I seem to be able to get rid of 720p noise at about level 3 of reduce random noise without a ton of sharpness loss with NGU Sharp Low. I will keep tinkering, but if you have any guidance it would be great. This is on my larger 75" setup, but my upstairs setup needs some noise reduction as well; I just have a bit more power/options up there.
Reply
The current noise reduction algorithm in madVR isn't that great. It is supposed to be improved at some point, but it is what it is at the moment.

The noise is likely part of the source video. Upscaling with NGU Anti-Alias instead of NGU Sharp might help or NGU Sharp with reduce compression artifacts checked as part of NGU Sharp upscaling is an option. However, both options likely won't remove enough of the noise to be completely satisfying. Adding sharpening shaders to reduce random noise might also be worth trying, but that is attempting to fix one issue by adding another. Still likely not that satisfying. 

The general recommendation is to use high-quality sources and live with the grainy look that comes with anything shot on film rather than attempt to remove source noise with video processing. It rarely works without significant compromises.
Reply
(2019-12-07, 11:20)Warner306 Wrote: The current noise reduction algorithm in madVR isn't that great. It is supposed to be improved at some point, but it is what it is at the moment.

The noise is likely part of the source video. Upscaling with NGU Anti-Alias instead of NGU Sharp might help or NGU Sharp with reduce compression artifacts checked as part of NGU Sharp upscaling is an option. However, both options likely won't remove enough of the noise to be completely satisfying. Adding sharpening shaders to reduce random noise might also be worth trying, but that is attempting to fix one issue by adding another. Still likely not that satisfying. 

The general recommendation is to use high-quality sources and live with the grainy look that comes with anything shot on film rather than attempt to remove source noise with video processing. It rarely works without significant compromises.
Is it possible to set different settings for different content types? For example, I don't want the noise filter on anything above 1080p.
Reply
Yes, the last section in the guide provides profile rules based on the source resolution. Here you go:

if (srcHeight > 1080) "2160p"
else if (srcWidth > 1920) "2160p"

else if (srcHeight > 720) and (srcHeight <= 1080) "1080p"
else if (srcWidth > 1280) and (srcWidth <= 1920) "1080p"

else if (srcHeight > 576) and (srcHeight <= 720) "720p"
else if (srcWidth > 960) and (srcWidth <= 1280) "720p"

else if (srcHeight <= 576) and (srcWidth <= 960) "SD"
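For anyone reading along, the else if chain above can be sketched in ordinary code. This is just an illustration of how the rules fall through top to bottom (the function name is made up, not part of madVR):

```python
def profile_for(src_width, src_height):
    """Mirror of the madVR profile rules above: first matching branch wins,
    so the upper bounds (<= 1080 etc.) are implied by the else-if order."""
    if src_height > 1080 or src_width > 1920:
        return "2160p"
    elif src_height > 720 or src_width > 1280:
        return "1080p"
    elif src_height > 576 or src_width > 960:
        return "720p"
    else:
        return "SD"
```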
Reply
(2019-12-06, 11:33)petoulachi Wrote: Hello there (first post on this forum!)!

First of all, thanks for this complete and detailed post. I've just bought a 65" OLED, coming from an old 1080p TV. I've always used my HTPC (GTX 750 / i5-6600 / 16 GB) as my video source, with Kodi/MPC/LAV/madVR.
Now that I can enjoy 4K HDR content, I'm starting to learn again how to get the best from my HTPC. But reading those few posts, it seems it probably won't be able to decode 4K HDR properly?
I still haven't received my TV yet so I cannot test, but as I'd love to enjoy it the day I receive it, I could get a 1050 Ti (the most powerful fanless graphics card at the moment).

I don't plan to do crazy things with madVR for 4K content; as it's a 65" display, I'm not sure (but I could be wrong!) it will really be noticeable.

Thanks a lot for your advice!
So I received my 65" OLED this weekend and started some tests with my current HTPC.
A 4K H.265 source is NOT an issue with my current GTX 750. Of course, I'm not doing anything crazy with madVR for now, but it's OK, at least for 24 fps movies (render time is about 14 ms).
I'll give the 1050 Ti a try, using more powerful features, and see what I get.

Anyway, I have one simple question about HDR movies (and madVR sending an HDR signal). It's working as it should: my TV detects HDR content and switches to HDR mode.
Now I just want to calibrate the HDR content, as the colors seem a bit too crazy.
I've always calibrated my TVs using DisplayCAL and an i1Display Pro to generate 3D LUTs for madVR.
I've already done it for "normal" mode with Rec. 709 movies, with great results (average ΔE 0.27 and maximum ΔE 0.81).
What should I do for HDR mode?
Reply
It generally isn't advised to use a 3D LUT for the HDR mode of LG OLEDs. They aren't very accurate compared to SDR 3D LUTs due to the display's tone mapping. You will end up with a lot of luminance and saturation errors.

You can create an HDR 3D LUT with DisplayCAL, but I'm not familiar with the workflow, and it will only create a single static tone curve for the display for all sources. You could possibly create several static tone curves to handle sources with different mastered peak nits. You would have to do some searching on the DisplayCAL forums to find the correct settings if you wanted to experiment with these types of LUTs.

The built-in CalMAN AutoCal feature of the LG OLEDs would likely produce better results, but you must first purchase a copy of CalMAN.

If you want to calibrate the display's HDR mode, you are usually better off adjusting the grayscale manually with HCFR. Don't touch the brightness or contrast controls in HDR mode, because they are set by default to optimize the display's tone mapping. Some calibrators will only use the RGB high and low controls to calibrate HDR mode, while others will use the full 10-20 point white balance controls. It would be safest to start with the RGB high and low controls to see how much of an improvement can be made. The rest of the calibration process is the same as SDR mode.
Reply
I have a Philips OLED, but I guess it's the same.

I'm not sure I fully understand this: "this will only create a single static tone curve for the display for all sources. So you could possibly create several static tone curves to handle sources with different mastered peak nits."
My TV is about 720 peak nits, and it seems I can set this value in DisplayCAL, so the peak nits should be OK?
What is a "static tone curve"?

Anyway, I could give HCFR a try to adjust the grayscale. Is there any tutorial on this forum for it, as I'm not familiar with this software?
Reply
A static tone curve uses a fixed highlight roll-off for all mastering peaks. In general, it is best to use a more aggressive roll-off for high mastered frame peaks (e.g., 10,000 nits) and a less aggressive roll-off for lower mastered frame peaks (e.g., 1,000 nits) to only sacrifice brightness for less highlight clipping when it is actually needed.

3D LUTs are always static tables, so you can only create multiple LUTs and have madVR select the appropriate 3D LUT before playback with profile rules that reference hdrVideoPeak.
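Following the same style as the resolution rules earlier in the thread, the hdrVideoPeak profile rules could look something like this (the profile names and nit thresholds here are just examples; use whatever names match the LUT profiles you actually created):

```
if (hdrVideoPeak <= 1000) "1000 nits LUT"
else if (hdrVideoPeak <= 4000) "4000 nits LUT"
else if (hdrVideoPeak > 4000) "10000 nits LUT"
```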

I don't have a link to a tutorial for HCFR. You would have to do a Google search for one.
Reply
This is something I don't get: the 720 nit peak from my TV is NOT related to the "mastered frame peaks (10,000 nits)"?

A video is "encoded" with a defined peak? Sorry if the terms are wrong; I'm no expert at all, but I'm trying to figure out how it all works together.
I saw lots of 1,000 nits / 4,000 nits / 10,000 nits figures that should be related to the video's defined peak nits?

Meaning I could do three 3D LUTs, one for each video peak nits, and then in madVR use profiles with hdrVideoPeak to apply the correct one?

For HCFR I'll Google it, thanks for the help!
Reply
(2019-12-09, 14:07)petoulachi Wrote: This is something I don't get: the 720 nit peak from my TV is NOT related to the "mastered frame peaks (10,000 nits)"?

That is the brightness level your display is capable of.
(2019-12-09, 14:07)petoulachi Wrote: A video is "encoded" with a defined peak? Sorry if the terms are wrong; I'm no expert at all, but I'm trying to figure out how it all works together.
I saw lots of 1,000 nits / 4,000 nits / 10,000 nits figures that should be related to the video's defined peak nits?

That is the maximum frame peak nits of the HDR source video. Each source's peak nits must be tone mapped to the 720 nits your display can actually produce. So any highlights above 720 nits are compressed using a smooth roll-off curve that ends at the display's peak nits.
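To make the roll-off idea concrete, here is a toy sketch in Python. This is not madVR's actual tone mapping curve, just an illustration of the principle: values below a knee point pass through unchanged, and everything above the knee is compressed so the output approaches, but never exceeds, the display peak.

```python
def tone_map(nits, display_peak=720.0, knee=0.75):
    """Toy highlight roll-off (NOT madVR's real curve).
    Linear below the knee; above it, the infinite input range is
    squeezed into the remaining headroom up to display_peak."""
    knee_nits = knee * display_peak        # start rolling off at 540 nits
    if nits <= knee_nits:
        return nits                        # dark/midtone content untouched
    headroom = display_peak - knee_nits    # brightness left above the knee
    excess = nits - knee_nits
    # simple rational compression: monotonic, asymptotes at display_peak
    return knee_nits + headroom * (excess / (excess + headroom))
```

Moving the knee earlier or later is exactly the "more aggressive" vs. "less aggressive" roll-off trade-off mentioned above: an earlier knee preserves more highlight detail from 10,000 nit masters, a later knee keeps more brightness for 1,000 nit masters.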
(2019-12-09, 14:07)petoulachi Wrote: Meaning I could do three 3D LUTs, one for each video peak nits, and then in madVR use profiles with hdrVideoPeak to apply the correct one?

Yes, that is one approach. Higher mastering peak nits can use a roll-off curve that starts sooner in the display curve to preserve more detail at the expense of peak brightness. And lower mastering peaks can use a roll-off curve that starts very late in the display curve to preserve more brightness.

Another more elaborate approach would be to pre-measure all of your HDR movies with madMeasureHDR and use another variable such as the Average Frame Maximum Light Level (AvgFMLL in madVR) to have a more representative value for the source peak brightness than just the maximum frame peak nits provided by HDR10 metadata.
Reply
This is perfectly clear, thanks a lot !

Now I need some more information about HDR 3D LUTs with DisplayCAL. There are some points that are not completely clear (particularly the correct white point: what cd/m² should I use? 150, as for SDR content? When in HDR mode, this goes crazy; it's very bright).
Reply
The white point is still D6500. Peak display nits in HDR are based on a 10% window. Try measuring Y with HCFR using test patterns such as these:

https://drive.google.com/drive/folders/1...Qlr0wvpWhG
Reply
You're talking a bit of Chinese to me for the moment ;)

D6500 means I should have 150 cd/m² (as for non-HDR content, in my case)? I mean, by default, the TV was probably somewhere around 300 cd/m² in HDR mode, so I was guessing that was on purpose?
Reply
An OLED panel can only do 720 nits or so in a very small window because of its Automatic Brightness Limiter (ABL), which prevents the screen from becoming too hot.

But, for HDR purposes, you are only concerned with making very small parts of the screen as bright as possible to represent the small HDR specular highlights.

So peak luminance or peak nits in the display's HDR mode is different than in its SDR mode. Standard peak luminance measurements in HDR mode use a 10% white window and not a 100%, fullscreen white pattern.

You either need to measure a 10% white window pattern mastered in PQ HDR10 with your colorimeter, or you could estimate the peak nits of your display in HDR mode by finding an online review where someone else measured it in the most accurate picture mode.
Reply
I know all about ABL, because I'm coming from a plasma TV and already had to deal with this (even if it seems you are suggesting that ABL is not active on SDR content?).

I have the estimated peak nits, yes; there are lots of reviews, and it's ~720 nits.
So are you suggesting that when doing the interactive white calibration I should aim for 720 cd/m²? Is that the same as 720 peak nits? Nits are quite new to me and I don't really know if it's the same!

Besides that, I want to thank you for your kindness and help; this is really gold for me :)
Reply