How to Install XBMC PVR Xvba for AMD/Nvidia/Intel GPUs
(2013-07-04, 20:03)fritsch Wrote: XBMC does not need this "TearFree" option. Intel uses vertical blank sync without any further tuning. Just set "Vertical Blank Sync" to "Let driver choose" in the XBMC settings. Intel has a bug that causes high load when "Always Enabled" is chosen there.

For AMD/Nvidia, choose "Always Enabled" instead.
This is not correct. When the TearFree option is disabled, all videos exhibit massive tearing artifacts; when it is re-enabled, the problems disappear. I don't know the configuration of the system you verified this on, or how many outputs it uses, but I re-tested just now with the vertical blank sync option set to "Let driver choose" and restarted XBMC. It made no difference to the tearing. I had also tested this repeatedly before learning about the TearFree option, as it certainly wasn't my first choice for solving this issue.
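For reference, the TearFree option being discussed lives in the Device section of xorg.conf. A minimal sketch, assuming the Intel DDX driver (the identifier and driver name vary per setup):

Code:
Section "Device"
    Identifier "Card0"
    Driver     "intel"
    Option     "TearFree" "true"
EndSection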
I am not sure what your system is doing, but I have feedback from several thousand OpenELEC users, because we ship this setting as the default. If you have Compiz or some other compositor running underneath, it won't play nice, yes. TearFree actively breaks how XBMC syncs.

Quote:00:27:47 T:140216985810816 DEBUG: Window Manager Name: Mutter (Muffin)

No further questions ...
I also tested XBMC running under Fluxbox, and as a standalone session; it made no difference with respect to the judder.

Some more logs:

Fluxbox: http://pastebin.com/BxN4ecCs

Standalone session: http://pastebin.com/VZr8v2bk

When running standalone, XBMC outputs to both ports regardless of the HDMI1/HDMI2 setting. The only difference is the resolution (i.e., HDMI1 displays 1920x1080 instead of 1920x1200). So for this issue at least, any extra overhead caused by the desktop environment in use does not affect the judder.
XBMC uses SwapBuffers to synchronize; it uses a front buffer and a back buffer. If you introduce a triple-buffering approach on top of that, XBMC no longer knows for sure what is already on the screen and what will come later. This has a big influence on A/V sync and output.
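To make the timing point concrete, here is a minimal C++ simulation (not XBMC code) of the difference: with double buffering the swap blocks until the vblank, so the application knows exactly when a frame hits the screen; with a third buffer the swap returns immediately and the presentation delay becomes unknown, which is what defeats XBMC's A/V sync estimate.

Code:
#include <chrono>
#include <deque>
#include <iostream>
#include <thread>

int main() {
    using namespace std::chrono;
    const auto vblank = milliseconds(20);  // one refresh at 50 Hz
    std::deque<int> queued;                // frames handed to the driver

    for (int frame = 0; frame < 5; ++frame) {
        // Double buffering: the swap blocks until the vblank,
        // so presentation time is known the moment the call returns.
        std::this_thread::sleep_for(vblank);
        std::cout << "double-buffered: frame " << frame
                  << " is on screen now\n";

        // Triple buffering: the swap returns at once and the frame
        // sits in a queue; the app cannot tell which vblank will
        // actually present it.
        queued.push_back(frame);
        std::cout << "triple-buffered: frame " << frame << " queued ("
                  << queued.size() << " frame(s) of unknown delay)\n";
    }
}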

Concerning the HDMI1/HDMI2 ports: this could be an additional issue if XBMC is syncing to the wrong monitor and the clocks differ. Try disabling one screen (via the settings add-on); the newest testing build should have it.
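If the settings add-on is not available, the same thing can be done from a shell with a standard xrandr call (output names as reported by xrandr --query):

Code:
xrandr --output HDMI2 --off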

Can you post your Xorg.0.log and the xorg.conf you used to produce these logfiles?
I don't think that the majority of OpenELEC users are running dual-head, but I could be wrong.

Xorg.0.log: http://pastebin.com/LDY6ayNi (hopefully the correct one; X has been restarted numerous times)

xorg.conf: http://pastebin.com/65znU3vB
Yes, you are right; they mostly use exactly one monitor. I see that you hardcoded all the modelines in there. Does the same issue occur if you let the TV be auto-detected? And is the error gone if you turn off one monitor, e.g. detach it from the setup?
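To check which modes actually get auto-detected once the hardcoded modelines are removed, a plain xrandr query is enough:

Code:
xrandr --query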
I tried removing the customized xorg.conf and starting up with an autogenerated one, but the problems remain the same (with the exception of the missing TearFree option, which causes tearing when running full-screen video on the second display).

I also tried XBMC's "turn off other monitor" setting, which didn't do anything. Physically powering off the LCD (or unplugging the cable) didn't make much difference either. If anything, the judder appeared more prevalent when running on just HDMI2, as if losing the primary display confused it. When the cable was plugged back in, XBMC switched to HDMI1 even though HDMI2 was still active.
Get an HDMI splitter and use just one HDMI port.
(2013-06-19, 18:25)Robotica Wrote:
(2013-06-19, 15:39)fritsch Wrote: Still some things AMD has to work on:
a) MPEG-2 and MPEG-4 hardware acceleration
b) Bitstream audio
c) Better deinterlacing quality.

And regarding the deinterlacing: so this is still not supported by the drivers, and thus not by the GPU. But as I understood it, the CPU could still be used. In the past this was single-core only, due to XBMC's old FFmpeg, but FFmpeg has since been updated to a newer version. Does this mean deinterlacing is now done on multiple CPU cores?
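As a rough way to measure what software deinterlacing costs on a given CPU, independent of XBMC, one can benchmark FFmpeg's yadif filter from the command line (standard ffmpeg options; the input file name is a placeholder, -threads parallelizes the decode, and yadif=1 outputs one frame per field):

Code:
ffmpeg -benchmark -threads 4 -i sample-1080i50.ts -vf yadif=1 -f null -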

(2013-06-19, 18:30)fritsch Wrote: We had this some time ago, when the internal xvba was more buggy. It can be done, but it costs a lot more performance. I don't think the AMD GPUs can handle decoding plus separate high-quality deinterlacing. We had BOB implemented that way; it also runs via xvba now. Bob takes one field and doubles the lines it does not have; you'll find it in the xvba-sdk documentation (XVBATransfer, iirc), read the last sentence about the scaling.

Offtopic: the BOB deinterlacer on Intel needs more GPU load than decode, and even more than encode (!), which was quite surprising :-)
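For illustration, a minimal C++ sketch of what BOB does on a single 8-bit luma plane: keep one field and line-double it, halving vertical detail. This is not the xvba or XBMC implementation, just the idea.

Code:
#include <cstdint>
#include <vector>

// 'topField' selects which field to keep; the other field's lines
// are synthesized by duplicating the nearest kept line.
std::vector<uint8_t> bob(const std::vector<uint8_t>& frame,
                         int width, int height, bool topField) {
    std::vector<uint8_t> out(frame.size());
    for (int y = 0; y < height; ++y) {
        // Nearest line belonging to the chosen field.
        int src = (y & ~1) | (topField ? 0 : 1);
        if (src >= height) src = height - 1;  // clamp at the border
        for (int x = 0; x < width; ++x)
            out[y * width + x] = frame[src * width + x];
    }
    return out;
}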

As I understand it, the h264@High L5.1 issue (and driver support for MPEG-2 and MPEG-4) has been resolved by AMD (in driver and SDK). I still wonder whether the Kabini GPU will be strong enough to also do the deinterlacing?
You understood wrong.

Level 5.1 was solved.

MPEG-2: nonexistent
MPEG-4: nonexistent
Still only BOB deinterlacing.
Just wondering: what's the slowest Nvidia GPU that supports smooth VDPAU playback without dropped frames, with full temporal/spatial deinterlacing plus VDPAU sharpness, on 1080i50 HDTV streams with the XVBA codebase?

What's the recommended minimum core config, 48 or 96 PS cores? Is a 64-bit bus enough?
Basically I'm looking for the lowest-TDP Nvidia card that could run fanless, if that's possible.

Thanks for any suggestions.
The GT610 has the lowest TDP, I think. I would not go with a GT210 or the like, because that one is really too slow.
Basically, look at the specs of the Zotac ID42, or Zotac's D2500 mini-ITX board with the integrated GT520; those are what I would suggest.

Is VDPAU sharpness really needed for 1080i50 on a 1080p monitor?
Thanks! I was actually hoping for the GT610 (to avoid overly high temperatures, since it is fanless).
Hopefully it will manage full 1080i50 deinterlacing with XVBA (I found FernetMenta's post mentioning his GT520 does a good job); not sure about VDPAU sharpness, though.
I'm really used to it; 0.28 adds a nice touch to perceived visual quality for me, even with 1080p material. So ideally I would prefer not to lose that setting.
(2013-05-29, 08:23)fritsch Wrote: Upgrade howto (to be done via ssh as always) [snipped]

I upgraded to the 13.6 driver and find the "Testing Use Only" text in the corner highly distracting. I assume this is hardwired into the driver? If so, how can I gracefully revert?

Thanks...
No. If you followed the howto I gave, you will have seen that the first command replaces the "signature file"; that alone is enough to make the testing logo go away.

If you missed this step, replace /etc/ati/signature with: https://dl.dropboxusercontent.com/u/55728161/signature
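Spelled out as shell commands (run via ssh, as with the upgrade howto; the URL is the one above):

Code:
wget -O /tmp/signature https://dl.dropboxusercontent.com/u/55728161/signature
sudo cp /tmp/signature /etc/ati/signature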