Kodi Community Forum
Initial native support for DXVA2 in SVN - Time to say goodbye to your firstborns - Printable Version




- phillyfan1138 - 2010-06-27

For those of us with crappy CPUs, it would be nice to have hardware acceleration for DVDs as well. I don't have menus, just the movie ripped as .vobs.

I can confirm that DXVA is not working for DVDs with the latest build.


- elupus - 2010-06-27

If you can't play DVDs on the CPU, it must be really crappy :)


- phillyfan1138 - 2010-06-27

Pentium 4 2.8 GHz; it already takes 30 percent CPU with just XBMC running. Using DXVA for DVDs is smoother and better quality, especially since I can take advantage of NVIDIA's video enhancements.


- elupus - 2010-06-27

Start a DVD... I bet CPU usage is way lower than XBMC idling. The NVIDIA enhancements might have made sense if we were using any of the advanced processing features of DXVA, which we aren't.


- phillyfan1138 - 2010-06-27

My understanding was that once DXVA is used, NVIDIA takes over the processing, at which point the NVIDIA enhancements would have an effect. I could be wrong though. In any case, it's not a huge deal; I am happy with HD MPEG-2 DXVA.


- jagilbertvt - 2010-06-27

Not to derail the hardware-accelerated MPEG-2 discussion, but I was doing some reading last night and today and found some information that may relate to the Intel DXVA issues, so I thought I'd provide it here (though most of it is probably more relevant to the ffmpeg developers).

The first thing I noticed was that the GUIDs for the Intel DXVA H264 Mode C and Mode E are wrong in DXVA.cpp:
Code:
DEFINE_GUID(DXVADDI_Intel_ModeH264_C, 0x664F8E66,0x4951,0x4c54,0x88,0xFE,0xAB,0xD2,0x5C,0x15,0xB3,0xD6);
DEFINE_GUID(DXVADDI_Intel_ModeH264_E, 0x664F8E68,0x4951,0x4c54,0x88,0xFE,0xAB,0xD2,0x5C,0x15,0xB3,0xD6);

should be

Code:
DEFINE_GUID(DXVADDI_Intel_ModeH264_C, 0x604F8E66,0x4951,0x4c54,0x88,0xFE,0xAB,0xD2,0x5C,0x15,0xB3,0xD6);
DEFINE_GUID(DXVADDI_Intel_ModeH264_E, 0x604F8E68,0x4951,0x4c54,0x88,0xFE,0xAB,0xD2,0x5C,0x15,0xB3,0xD6);

(The only change is the second hex digit of the first argument: 0x66... becomes 0x60....) This will allow DXVA to be properly detected. Unfortunately, as you probably know, the DXVA implementation on the Intel stuff is a bit broken (I got DXVA initializing, but would only get a horrible green screen).

Reading the attachment on this page (see "article attachment" towards the bottom) from Intel, describing the implementation work they did with Casimir666 on implementing DXVA in MPC-HC, a couple of things stand out:

1) The application/codec needs to inform the hardware that it is sending a full list of reference pictures by setting DXVA_PicEntry_H264.Reserved16Bits to 0x34c.

I believe this is an error in the document and that it should read DXVA_PicParams_H264.Reserved16Bits, as there is no such variable in the DXVA_PicEntry_H264 structure (which is itself a member of the DXVA_PicParams_H264 structure). Also, reviewing Casimir666's current MPC-HC code reveals this:
Code:
    if(m_pFilter->GetPCIVendor() == PCIV_Intel)
        m_DXVAPicParams.Reserved16Bits = 0x534c;
    else
        m_DXVAPicParams.Reserved16Bits = 0;

Interestingly, this differs from the article (setting it to 0x534c instead of 0x34c). Both of these are quite different from what Microsoft's DXVA documentation specifies (this parameter should be set to 0, 1, 2, or 3), and in the current ffmpeg source it is set to 3. (See the Microsoft DXVA specification doc: http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=3d1c290b-310b-4ea2-bf76-714063a6d7a6)
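
To sketch what I mean, something like the following could pick the value based on the detected GPU vendor, mirroring the MPC-HC workaround above. This is just a rough illustration under my own assumptions (the helper names, the vendor check, and the choice of 0x534c over the article's 0x34c are mine, not XBMC or ffmpeg code):
Code:
// Rough sketch only -- choose Reserved16Bits per GPU vendor, mirroring the
// MPC-HC workaround quoted above. Names are illustrative, not real XBMC code.
#include <d3d9.h>
#include <dxva.h>    // DXVA_PicParams_H264 (Windows SDK)

static const UINT PCIV_Intel = 0x8086;   // PCI vendor ID for Intel

// Query the PCI vendor ID of the Direct3D adapter in use.
UINT GetAdapterVendorId(IDirect3D9* pD3D, UINT adapter)
{
    D3DADAPTER_IDENTIFIER9 id = {0};
    pD3D->GetAdapterIdentifier(adapter, 0, &id);
    return id.VendorId;
}

// Fill the disputed field: MPC-HC uses 0x534c on Intel (the Intel article
// says 0x34c), while ffmpeg currently sets 3 per the Microsoft spec.
void FillReserved16Bits(DXVA_PicParams_H264& picParams, UINT vendorId)
{
    picParams.Reserved16Bits = (vendorId == PCIV_Intel) ? 0x534c : 3;
}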


The next interesting piece in the Intel article was this:

One of the fundamental differences about the implementation on the Intel® Graphics Media Accelerator X4500HD should be called out at this point. In other implementations, the RefPicList[][].Index7Bits contains an index into the DXVA_PicParams_H264.RefFrameList[]. The index is its position in the “master” reference list. On the Intel® GMA X4500HD, the RefPicList[][].Index7Bits needs to contain the surface index – so essentially the reference picture structure for slices has the same information that the master list has in its Index7Bits field.

They also include some of the code demonstrating how it was done (though from what I can tell, it has since changed), but I think this is likely the heart of the issue. Unfortunately, I know next to nothing about the ffmpeg code (which is where I think the main change would need to be implemented), and even less about H.264 decoding.
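
To make that difference concrete, here is a rough sketch of how the per-slice reference entries would be filled in each case (the function name and the Intel flag are just mine for illustration; this is not code from the article, XBMC, or ffmpeg):
Code:
// Rough illustration of the Intel GMA X4500HD difference described in the
// article; not actual XBMC or ffmpeg code.
#include <dxva.h>   // DXVA_PicParams_H264, DXVA_Slice_H264_Long (Windows SDK)

static void SetSliceRefEntry(DXVA_Slice_H264_Long& slice, int list, int refIdx,
                             const DXVA_PicParams_H264& picParams,
                             int masterListIndex, bool isIntelX4500)
{
    if (isIntelX4500)
        // Intel GMA X4500HD: the per-slice entry must carry the surface index
        // itself, i.e. the same value the "master" list already stores in its
        // own Index7Bits field.
        slice.RefPicList[list][refIdx].Index7Bits =
            picParams.RefFrameList[masterListIndex].Index7Bits;
    else
        // Other implementations: the per-slice entry is an index into
        // DXVA_PicParams_H264.RefFrameList[].
        slice.RefPicList[list][refIdx].Index7Bits = (UCHAR)masterListIndex;
}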

In any case, I thought the information might be helpful (or perhaps not) to someone more knowledgeable.

Also, I came across a post regarding a similar issue VLC 1.1.0 is having with DXVA on Intel cards (since they also use ffmpeg), including a screenshot of what happens: http://forum.videolan.org/viewtopic.php?f=34&t=76994&start=0

Sorry if this is a bit long-winded, but I hope someone finds it useful. I will attempt to pass it along to the FFmpeg developers as well. I think one of the obvious issues is that most people developing this code don't have an Intel card (or, in most cases, an Intel IGP chipset) to develop against (as remarked in the VLC thread).


- SlaveUnit - 2010-06-28

I pulled a single .vob file out of an ISO and the DXVA MPEG-2 offloading seems to work fine. Personally, and I'd guess for lots of others, since this doesn't work with the DVD formats (IMG, ISO and VIDEO_TS), it won't be used too often. That's just a guess though. I would love to have this work with those DVD formats, but I do understand how it could be complicated. You could always skip the offloading on the VTS_01_0.vob file, since that is usually the menu, but I realize you might run into issues where there are multiple menus or the naming scheme isn't conventional. Just thinking out loud, I guess. Also, CrystalP asked for testers for the MPEG-2 DXVA.


- phillyfan1138 - 2010-06-28

Quote:I pulled a single .vob file out of an ISO and the DXVA MPEG-2 offloading seems to work fine.

Maybe MPEG-2 DXVA does not work on my NVIDIA 8400? I simply pulled a .vob to play as well.


- SlaveUnit - 2010-06-28

Yeah I thought that was said a few posts back. I think you need a newer video card.


- phillyfan1138 - 2010-06-28

Quote:Yeah I thought that was said a few posts back. I think you need a newer video card.

Fair enough. It's weird, because I can get 100 percent acceleration on MPEG-2, VC-1, and AVC in Media Player Classic with this card.


- Mallet21 - 2010-06-28

phillyfan1138 Wrote:I don't have menus, just the movie ripped as .vobs.

If you're interested, google Vob2Mpg. It will take your VOB folder and convert it to one .mpg file; it takes about 45 seconds to make the conversion.


- SlaveUnit - 2010-06-28

Mallet21 Wrote:If you're interested, google Vob2Mpg. It will take your VOB folder and convert it to one .mpg file; it takes about 45 seconds to make the conversion.

If the files are vobs, they already work fine.


- Kill-9 - 2010-06-28

So I got a 760G motherboard with an HD3200 GPU built in and HDMI out. I put in a 5600+ X2 and wanted to see if I would get the same pixelation.

I installed Windows 7, the ATI 10.6 Catalyst drivers, and XBMC r31453. I enabled DXVA and found no issues apart from a bit of frame dropping, and the CPU is usually at about 40% or so. Is this normal? I mean, that would seem low for non-DXVA 1080p, but I can't tell for sure. Any ideas?

EDIT: Looks like the frame dropping tapered off after a few minutes and hasn't returned. CPU is about 30% now... average.


- WhiningKhan - 2010-06-28

elupus Wrote:Start a DVD... I bet CPU usage is way lower than XBMC idling. The NVIDIA enhancements might have made sense if we were using any of the advanced processing features of DXVA, which we aren't.

Speaking of this, one huge quality improvement for MPEG-2 from TV (both SD and HD) that would be really simple to implement is hardware deinterlacing. I don't think it is even possible with current CPUs to create a software real-time deinterlacer that provides quality as good as hardware-based vector-adaptive deinterlacers do.

Just opening an appropriate processing device with deinterlace support (instead of the progressive device) and passing the stream to the processor with the interlacing flags set was enough to make it work when I tried this some time ago.
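
Roughly, the idea looks something like this (just a sketch with illustrative names, not the actual code I used; error handling, surface management and colour metadata are omitted):
Code:
// Sketch of opening a DXVA2 video processor that can deinterlace (rather than
// the progressive device) and blitting frames with the interlacing flag set.
#include <d3d9.h>
#include <dxva2api.h>
#pragma comment(lib, "dxva2.lib")

IDirectXVideoProcessor* CreateDeinterlacingProcessor(IDirect3DDevice9* device,
                                                     const DXVA2_VideoDesc& desc)
{
    IDirectXVideoProcessorService* service = NULL;
    DXVA2CreateVideoService(device, IID_IDirectXVideoProcessorService,
                            reinterpret_cast<void**>(&service));

    UINT count = 0;
    GUID* guids = NULL;
    service->GetVideoProcessorDeviceGuids(&desc, &count, &guids);

    // Pick the first device that is not the progressive one; real code would
    // inspect DXVA2_VideoProcessorCaps to prefer a vector-adaptive device.
    GUID chosen = DXVA2_VideoProcProgressiveDevice;
    for (UINT i = 0; i < count; ++i)
        if (guids[i] != DXVA2_VideoProcProgressiveDevice) { chosen = guids[i]; break; }
    CoTaskMemFree(guids);

    IDirectXVideoProcessor* processor = NULL;
    service->CreateVideoProcessor(chosen, &desc, D3DFMT_X8R8G8B8, 0, &processor);
    service->Release();
    return processor;
}

void DeinterlaceBlt(IDirectXVideoProcessor* processor, IDirect3DSurface9* src,
                    IDirect3DSurface9* target, const RECT& rect, bool topFieldFirst)
{
    DXVA2_VideoSample sample = {0};
    sample.SrcSurface = src;
    sample.SrcRect = rect;
    sample.DstRect = rect;
    sample.PlanarAlpha = DXVA2_Fixed32OpaqueAlpha();
    // Tell the processor the sample is interlaced so it gets deinterlaced.
    sample.SampleFormat.SampleFormat = topFieldFirst
        ? DXVA2_SampleFieldInterleavedEvenFirst
        : DXVA2_SampleFieldInterleavedOddFirst;

    DXVA2_VideoProcessBltParams blt = {0};
    blt.TargetRect = rect;
    blt.DestFormat.SampleFormat = DXVA2_SampleProgressiveFrame;
    blt.Alpha = DXVA2_Fixed32OpaqueAlpha();
    processor->VideoProcessBlt(target, &blt, &sample, 1, NULL);
}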

It might even be worthwhile to decouple the DXVA post-processing from decoding completely, so that the output could be done with the DXVA device even when decoding is done in pure software. Admittedly, MPEG-2/VC-1/H.264 already cover all practical formats where interlacing is used, but in cases where the hardware does not support decoding a particular source (as seems to be the case on all ATI chips at the moment for MPEG-2), the benefits of DXVA processing could still be had.


- CrystalP - 2010-06-28

WhiningKhan, agreed. DXVA deinterlacing shouldn't be too hard since we can get field flags from the renderer. If you have some code lying around, please create a trac ticket and share.

I have DXVA post-processing for non-DXVA decode in the back of my mind, and it seems technically doable but requires some reorganization of the renderer (which I happen to be working on for other reasons). Running the HQ upscalers on DXVA decodes should be doable too, with similar renderer changes.

What's more difficult (and not as efficient) is DXVA decode followed by software processing. That requires a copy from GPU to system memory (similar to VLC's approach) and doesn't work on ATI at the moment.
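
For illustration, that copy would be something along these lines, assuming the decoder surfaces are lockable as in VLC's approach (the helper name and the NV12 layout handling are only a sketch, not existing XBMC code):
Code:
// Sketch of copying a decoded NV12 DXVA2 surface back to system memory for
// software processing. Assumes the decoder surface is lockable.
#include <d3d9.h>
#include <cstring>

bool CopySurfaceToSystemMemory(IDirect3DSurface9* surface,
                               unsigned char* dstY, unsigned char* dstUV,
                               int width, int height, int dstPitch)
{
    D3DLOCKED_RECT lr;
    if (FAILED(surface->LockRect(&lr, NULL, D3DLOCK_READONLY)))
        return false;

    const unsigned char* src = static_cast<const unsigned char*>(lr.pBits);

    // NV12: full-resolution Y plane followed by an interleaved UV plane at
    // half vertical resolution. Copy row by row because the GPU pitch is
    // usually wider than the visible width.
    for (int y = 0; y < height; ++y)
        memcpy(dstY + y * dstPitch, src + y * lr.Pitch, width);

    const unsigned char* srcUV = src + height * lr.Pitch;
    for (int y = 0; y < height / 2; ++y)
        memcpy(dstUV + y * dstPitch, srcUV + y * lr.Pitch, width);

    surface->UnlockRect();
    return true;
}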