Posts: 658
Joined: Jul 2011
Reputation: 1
2011-11-14, 00:47
(This post was last modified: 2011-11-14, 00:51 by StinDaWg.)
You pretty much need to turn off all the advanced video settings in CCC. All they do is mess up the picture. Dynamic contrast, edge enhancement, de-noise, etc. are all bad; turn them off. The only things I turn on are vector adaptive deinterlacing and pulldown detection, and I set the video level to 0-255. That last setting will depend on your setup. With HDMI and a Panasonic plasma I need 0-255 to get correct and consistent white and black levels. Your TV and setup may be different.
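For anyone wondering what the 0-255 setting is actually about: most video is encoded with studio/limited-range levels, where black sits at 16 and white at 235, and something in the chain (the driver or the TV) has to expand that to full-range 0-255, or you end up with grey blacks and dull whites. Here's a minimal Python sketch of the standard limited-to-full expansion math, just to show the idea; it's an illustration, not what CCC actually runs internally:

```python
def limited_to_full(y):
    """Expand a limited-range (studio) luma value, 16-235, to full
    range, 0-255, using the standard 255/219 scale factor. Values
    outside 16-235 get clipped."""
    expanded = (y - 16) * 255 / 219
    return max(0, min(255, round(expanded)))

# Limited-range black/white land exactly on full-range black/white:
print(limited_to_full(16))   # 0
print(limited_to_full(235))  # 255
print(limited_to_full(126))  # 128, roughly mid grey
```

If that expansion happens twice (driver and TV both do it) or never, you get the crushed or washed-out picture, which is why the right setting depends on the rest of your chain.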
Posts: 1,741
Joined: Jul 2010
Reputation: 10
I noticed bad picture quality until I turned off all the picture settings in the ATI CCC, and then the picture was excellent.
Those settings destroy the picture quality.
Posts: 1,741
Joined: Jul 2010
Reputation: 10
Cheers. Does that go for 576i as well, and does it work with the DXVA2 option enabled?
Posts: 278
Joined: Feb 2008
Reputation: 2
touser
Senior Member
WhiningKhan,
Thank you for the info. Would you recommend leaving the feature enabled or disabled for use with XBMC on an E-350?
Posts: 65
Joined: Jan 2011
Reputation: 0
Why doesn't XBMC incorporate MadVR? It's the best renderer out right now.
I use an external player, MPC-HC with MadVR. XBMC should at least look into this.
Posts: 1,741
Joined: Jul 2010
Reputation: 10
Maybe because a lot of people use DXVA2, and if you use that, XBMC can't use its built-in deinterlacer and has to rely on the GPU's deinterlacer. Maybe ATI or NVIDIA should add the option.
The only way to use the XBMC deinterlacer is to disable DXVA2, which a lot of people won't do, as they like their graphics card to decode the video.
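For anyone curious what the deinterlacer actually has to do: each interlaced frame carries two fields, one on the even scan lines and one on the odd lines, and the deinterlacer has to turn those into full progressive frames. Here's a toy Python sketch of the crudest method, "bob" (line-doubling each field into its own frame); it's only to show the idea, since vector adaptive deinterlacing in XBMC or the GPU driver interpolates along motion instead of just doubling lines:

```python
def bob_deinterlace(frame):
    """Split one interlaced frame (a list of scan lines) into two
    progressive frames. Even lines form the top field, odd lines
    the bottom field; each field is line-doubled to full height."""
    top_field = frame[0::2]
    bottom_field = frame[1::2]
    top_frame = [line for f in top_field for line in (f, f)]
    bottom_frame = [line for f in bottom_field for line in (f, f)]
    return top_frame, bottom_frame

# A 4-line interlaced frame; T = top-field line, B = bottom-field line.
frame = ["T0", "B0", "T1", "B1"]
first, second = bob_deinterlace(frame)
print(first)   # ['T0', 'T0', 'T1', 'T1']
print(second)  # ['B0', 'B0', 'B1', 'B1']
```

With DXVA2 the decoded frames go straight to the driver, so whatever deinterlacing happens there is up to the GPU; XBMC's own deinterlacer only gets a shot when XBMC does the decoding itself.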