Kodi Community Forum

Full Version: VDPAU API for Linux released by NVIDIA today - GPU hardware accelerated video decoder
I agree with hotzenpl0tz that the killa sample is a bad example to use as a reference. The bird scene is good, but the clip that we use was poorly ripped and has definite problems. Perhaps a new rip of the same scene is in order.

That said... it plays fine using vdpau (with the previous set of patches). Using an 8800GT and a Q6600, one CPU core still ramps up to 100%, so there's still something weird going on. Also, note that the GPU is underclocked (a current bug with the nvidia drivers). When it runs at the correct speed, most 1080p movies top out at around 10% CPU.

Using the current set of patches, it fails to play with:
Code:
[h264_vdpau @ 0xb9df40]warning: first frame is no keyframe
vo_vdpau: get_image failed

Again, I believe that this is the fault of the rip, not of mplayer or nvidia.
TheUni
hotzenpl0tz Wrote:God I hate that stupid killa sample. Why on earth has THAT become the standard for h264 benchmarking? Why on earth has nobody made a high-bitrate sample with real-world settings that uses all the h264 features and tries to achieve the best possible picture? I don't get it.

That's why I started this thread:
http://forum.xbmc.org/showthread.php?tid=41475
But nobody seems to care about what I say :p
Threads are the place for discussion. If you want to make something happen, put up a wiki page, add it to your sig, and reference it whenever anyone talks about hardware or "will this play?" material.
I don't see why it's a bad example. You test things by putting them in completely unrealistic situations, which is exactly what killa is. However, I do agree that striving to play killa without frame drops is just stroking your e-chubby. ;)
althekiller Wrote:I don't see why it's a bad example. You test things by putting them in completely unrealistic situations, which is exactly what killa is. However, I do agree that striving to play killa without frame drops is just stroking your e-chubby. ;)

Yes, exactly. If you can play that clip you can play anything. That scene, hell that whole disc set, is BRUTAL. Personally I think it's a fine benchmark for those reasons. When *my* machine dropped frames on that clip it also dropped frames on other rips, although not as noticeably. And I like my e-chubby, thank you very much!

I have that scene encoded with my settings, which apparently aren't as extreme as I had feared and supposedly will work with vdpau. If I can figure out how best to cut out a section of it for others to test with, I will. I just haven't had the time, and no one has suggested a way to do it. <shrug>

Sadly it looks like every other project is integrating those patches while ffmpeg isn't doing much with them. Chatter on their dev list concerning vdpau is pretty much dead right now, and NVIDIA is helping others who are interested. As a result, I fear it could be a while before we see that acceleration in XBMC.
When you're playing video games you want the fastest video card to get the most frames per second at the highest quality settings and resolution.

Luckily video playback isn't like that. Sure, there are some post processing things that can be done to make things look nicer that take up some CPU, but for the most part you're playing a video and you just want it to hit every frame at the right time.

This is why killa is unreasonable. Sure, if I can play killa I can play anything. But if I can play a 720p scene release, I can play anything I would ever play, so why does it matter if killa drops?

So instead of measuring the FPS, we should be measuring the highest class of video a hardware setup can play. Your e-chubby can play killa, an Apple Mini can play 720p x264, and the xbox can play 480p divx.

With this NVIDIA stuff we can hopefully upgrade a piece of hardware from 480p divx to killa, but if not, getting 720p x264 is still awesome.
BLKMGK Wrote:Sadly it looks like every other project is integrating those patches while ffmpeg isn't doing much with them. Chatter on their dev list concerning vdpau is pretty much dead right now, and NVIDIA is helping others who are interested. As a result, I fear it could be a while before we see that acceleration in XBMC.

I think the FFmpeg devs are waiting for an nvidia implementation that:

-falls back to software if the hardware doesn't support it
-can disable vdpau at compile time
-conforms to the FFmpeg C coding standards (they are rather picky)

It seems that they are not so keen on depending on closed source projects.
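For what it's worth, the first two points amount to something like the sketch below. All of the names (CONFIG_VDPAU, try_vdpau_decoder, open_software_decoder) are made up for illustration; they are not the symbols from the actual nvidia patches.
Code:
/* Rough sketch of a compile-time switch plus runtime software fallback,
 * roughly what the FFmpeg devs are asking for. Everything here is a stub. */
#include <stdio.h>

static int open_software_decoder(void) { return 0; }   /* stub: always available */

#ifdef CONFIG_VDPAU
static int try_vdpau_decoder(void)      { return -1; }  /* stub: pretend no hw */
#endif

static int open_decoder(void)
{
#ifdef CONFIG_VDPAU
    if (try_vdpau_decoder() == 0)        /* hardware path available */
        return 0;
    fprintf(stderr, "vdpau unavailable, falling back to software decoding\n");
#endif
    return open_software_decoder();      /* software path is always built in */
}

int main(void) { return open_decoder(); }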

And once the ffmpeg svn is updated, xbmc still has to be updated (OSD, ...), which will also take a lot of time (though in the case of MythTV, that went fast).

One day we can build low power, low noise xbmc boxes that can handle 1080p clips. I hope that day is not so far away.
Malloc, you are happy with 720p, but others are not, and not everyone plays "scene releases", which in my country would be considered illegal if they are movies from the box office. I happen to rip my own Blu-ray movies, and I do not rip them at 720p since off the disc they are 1080p. For folks looking to play back THAT kind of content, Killa is a decent example to strive to play back. If folks are happy with 720p playback, then by all means suggest a 720p clip as an example and hardware that can play it smoothly - I'd suggest an overclocked 2GHz dual-core Celeron, BTW ;)

Beefke, I have followed the ffmpeg dev list on this. I am aware of the objections the ffmpeg guys have to the suggested patches so far, and you've pretty well pegged it. Reading some of the responses on the list, though, some of the comments almost seem petty, even if they aren't being that obvious about it. There hasn't been much interest in improving the code so much as picking it apart and criticizing it, as near as I can tell. The NVIDIA guys post code, they pick at it, and no one seems to be modifying it unless I've missed something. I can understand that folks don't like closed code, but we've not exactly seen the likes of Intel or AMD producing something better despite promises and assurances. Is it so much to ask to work with what we've got? Maybe Intel and AMD will get it together eventually, but it's not today. Maybe this will kick them in the ass?
I'd be very happy playing HD DVB-T content recorded by MythTV. The most intensive HD content being digitally broadcast in New Zealand is 1080i H264 with LATM AAC audio. Most channels are 720p H264.

The AAC audio decoding is quite CPU intensive, and I've only managed to play back 1080i without massive audio lag by turning on the skiploopfilter option and setting it to 32, which drops all sorts of stuff. I can definitely notice the square blocks being redrawn as the content plays.
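For anyone wondering what that value means: assuming the setting is passed straight through to libavcodec, 32 corresponds to FFmpeg's AVDISCARD_NONKEY, i.e. the H.264 in-loop deblocking filter is skipped on everything except keyframes, which is roughly why block edges show up between keyframes. A minimal sketch (the helper name is made up, not an XBMC function):
Code:
/* Sketch only: what "skiploopfilter = 32" means at the libavcodec level. */
#include <libavcodec/avcodec.h>

void relax_loop_filter(AVCodecContext *ctx)
{
    /* AVDISCARD_NONKEY has the numeric value 32: skip the in-loop
     * deblocking filter for all frames except keyframes, so block
     * edges stay visible until the next keyframe cleans them up. */
    ctx->skip_loop_filter = AVDISCARD_NONKEY;
}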

And with skiploopfilter turned on, XBMC is also dropping content for 720p files that would otherwise have played fine without it.

I'm running an ASUS M3A78-EM board (ATI HD 3200 with HDMI) with an AMD X2 6000+ CPU. At 3.0GHz, 2 cores and 125W, anything with more grunt gets damn pricey (it was the fastest chip I could buy regardless of price at the time).

I'm hoping that ATI are also going to support VDPAU, but who knows what they are going to do...
dteirney - the AMD CPU is your issue. I would be willing to bet that an Intel C2D at that speed wouldn't have the issues you are seeing. I would further be willing to bet that if you tried to play the Killa sample that's been posted about, you would get a great deal of dropped frames.

Honestly, a dual-core 2GHz Celeron pushed to 3GHz might play that 720p stuff fine, but probably not the 1080p, judging from the testing I did. An E7300 is what I last used - that's a 65-watt chip, and that system is running fine on a 270-watt PSU, overclocked to a hair over 3GHz. This might help...

As for ATI supporting VDPAU, I doubt it. That's an NVIDIA closed-source piece of code. They will likely go their own way, or maybe OpenCL will help us someday since it's cross-platform and cross-manufacturer.
dteirney Wrote:I'm running an ASUS M3A78-EM board (ATI HD 3200 with HDMI) with an AMD X2 6000+ CPU. At 3.0Ghz, 2 cores and 125W, needing anything more grunty is getting damn pricey (fastest chip I could buy regardless of price at the time).

I'm hoping that ATI are also going to support VDPAU, but who knows what they are going to do...

The problem isn't your CPU, it's the ATI drivers. They're working on mpeg2 and h264 hardware accel too -- it's called xvba.

@BLKMGK:
nvidia has written VDPAU so that it can be supported by any hardware vendor.
Thanks for the correction Mythmaster - wasn't aware of that!
So what's happening with this? The Xine people didn't rely on / wait for the ffmpeg guys to get their finger out; they did their own thing, and it went pretty quickly. I am now watching HD channels which I couldn't before. :-)

I wish I could watch 1080p movies smoothly using this technology. What's the holdup?
You submitting the diffs. We are all waiting for you.
And I am waiting for you. :-)

I don't have sufficient knowledge obviously, and I was kind of relying on the current XBMC dev team to look into it (how would there be any XBMC if the dev team waited for .diff files all the time?)
Is there not any remote interest in doing this?