2011-09-18, 10:44
LAV Filters play 10-bit flawlessly, have HW support, and are created by XBMC committers, AFAIK...
jwcalla Wrote:I'm pretty sure it doesn't do HW acceleration of 10-bit decoding.
AFAIK it uses shaders for most of the computation work. Not sure how much more efficient (in heat) it is compared to CPU-only computation.
jwcalla Wrote:And that reminds me that bringing Hi10p into the XBMC fold probably isn't going to be as trivial as just patching in the new ffmpeg like I thought. It's probably going to need some kind of logic to check the file first to see if it's 10-bit, and if it is, select a software decoder, and if it's not, select hardware decoding.
Up to now, Hi10P has been an almost anime-only feature, so I guess no one will care about such a switch. You can run in CPU-only mode with Hi10P (once the new ffmpeg is added), or use DXVA with artifacts, like now.
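For what it's worth, the switch jwcalla describes could be as simple as a check on the stream's bit depth and profile before picking a decoder. A minimal sketch (the function name and parameters here are hypothetical, not actual XBMC code; the underlying fact is that DXVA/VDPAU hardware paths only handle 8-bit H.264, so 10-bit has to fall back to software):

```python
def pick_decoder(bit_depth: int, profile: str) -> str:
    """Choose a decoding path for an H.264 stream.

    Hi10P (the "High 10" profile) streams are 10-bit; the DXVA/VDPAU
    hardware decoders only accept 8-bit H.264, so anything deeper must
    fall back to software (ffmpeg) decoding to avoid artifacts.
    """
    if bit_depth > 8 or profile == "High 10":
        return "software"   # CPU/ffmpeg path
    return "hardware"       # DXVA / VDPAU / VA-API path


# Example: an 8-bit Blu-ray rip stays on the hardware path,
# a Hi10P anime encode gets routed to software decoding.
print(pick_decoder(8, "High"))       # hardware
print(pick_decoder(10, "High 10"))  # software
```

The bit depth and profile are already in the stream headers ffmpeg parses at open time, so the check costs nothing at playback.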
I have to do this now manually anytime I try to play something in mplayer, and it's absolutely obnoxious. This stuff got pushed out without even thinking of (or caring about) the consequences.
atari800 Wrote:Question:
It seems the content it's being tested on is anime (cartoons), which is going to get the best compression of any media, right?
Has anyone tested this by ripping The Matrix or Fast & Furious (something with a lot of action on the screen)? Do the compression and CPU usage still justify this?
CharredChar Wrote:But such groups as these obviously don't care about compatibility and never have. This has always been an issue with them, and this is just another example of it: as long as it runs fine on their machines, to hell with everyone else; it's not their problem. I have run into this time and time again with "their" content, as I've been watching fansubs since the '90s, even with that new-fangled codec called DivX.
jwcalla Wrote:I'm pretty sure it doesn't do HW acceleration of 10-bit decoding.
psorcerer Wrote:It does. It uses Nvidia CUDA (yep, Nvidia only).
jwcalla Wrote:Do you have a link?