fritsch (Team-Kodi Developer)
Posts: 23,527 | Joined: Aug 2011 | Reputation: 1,107
XBMC uses SwapBuffers to "sync". It uses a front and a back buffer. When you now start to use a triple-buffering approach, XBMC does not know for sure what was already on the screen and what will come later. This has a big influence on A/V sync and output.
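For illustration, the present path boils down to something like this (a minimal GLX sketch, not XBMC's actual code; window/context setup is omitted and the availability of glXSwapIntervalEXT is an assumption):

    #include <GL/glx.h>
    #include <X11/Xlib.h>

    // Double-buffered present loop. With a swap interval of 1, glXSwapBuffers
    // blocks until the flip at vblank, so the player knows exactly which frame
    // just reached the screen. A third buffer breaks that: swaps get queued
    // and the actual on-screen time becomes uncertain.
    void presentLoop(Display* dpy, GLXDrawable win)
    {
        PFNGLXSWAPINTERVALEXTPROC swapInterval =
            (PFNGLXSWAPINTERVALEXTPROC)glXGetProcAddress(
                (const GLubyte*)"glXSwapIntervalEXT");
        if (swapInterval)
            swapInterval(dpy, win, 1);  // one vblank per swap

        for (;;)
        {
            // ... render the next video frame into the back buffer ...
            glXSwapBuffers(dpy, win);   // front/back exchange at vblank
        }
    }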
Concerning the HDMI1/HDMI2 ports: this could be an additional issue if it is syncing to the wrong "monitor" and the clocks differ. Try to disable one screen (via the Settings addon); the newest testing build should have it.
Can you post your Xorg.0.log and the xorg.conf that you used to produce these logfiles?
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
fritsch (Team-Kodi Developer)
Posts: 23,527 | Joined: Aug 2011 | Reputation: 1,107
Yes, you are right. They mostly use exactly one monitor. I see that you hardcoded all the modelines in there. But does the same issue occur if you let the TV be auto-detected? And is the error gone if you turn off one monitor, e.g. detach it from the setup?
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
Posts: 228 | Joined: May 2013 | Reputation: 3
I tried removing the customized xorg.conf to start up with an autogenerated xorg.conf, but the problems remain the same (with the exception of the missing TearFree option, which caused tearing when running full-screen video on the second display).
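(For completeness, the TearFree bit in the custom config was just the usual radeon Device option; the identifier here is arbitrary:)

    Section "Device"
        Identifier "Radeon"
        Driver     "radeon"
        Option     "TearFree" "on"
    EndSection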
I also tried XBMC's 'turn off other monitor' setting, which didn't do anything. Physically powering off the LCD (or unplugging the cable) also didn't make much difference. Actually, the judder appeared to be more prevalent when running on just HDMI2, as if it got confused when it lost the primary display. When the cable was plugged back in, XBMC switched to HDMI1 even though HDMI2 was still active.
Posts: 662 | Joined: Jan 2007 | Reputation: 7
Get an HDMI splitter and use just one HDMI port.
ASRock Beebox J3160 4GB RAM 120GB SATA SSD - Harmony Elite BT
Intel NUC Kit DN2820FYKH 4GB RAM 120GB SATA SSD - Harmony Smart Control BT
all @ Libreelec
Testbuild
HP N54L @ Ubuntu 14.04.4 Minimal Server / MySQL DB
HP N40L @ Ubuntu 14.04.4 Minimal Server
Posts: 1,483 | Joined: Aug 2010
2013-07-08, 15:59
(This post was last modified: 2013-07-08, 16:00 by Robotica.)
(2013-06-19, 18:25)Robotica Wrote:
(2013-06-19, 15:39)fritsch Wrote: Still some things AMD has to work on:
a) mpeg2 and mpeg4 hw accel
b) Bitstream Audio
c) Better deinterlacing quality.
And regarding the deinterlacing: so this is still not supported by the drivers, and thus not by the GPU. But as I understood, the CPU could still be used. In the past this was single-core only, due to XBMC's old FFmpeg. But nowadays FFmpeg has been updated to a newer version. Does this also mean deinterlacing is done on multiple CPU cores?
(2013-06-19, 18:30)fritsch Wrote: We had this some time before, when the internal xvba was more buggy. It can be done - but it costs a lot more performance. I don't think the AMD GPUs can handle decoding + separate deinterlacing in high quality. We had BOB implemented that way, which also runs via xvba now. Bob needs one field and doubles the lines it does not have; you find it in the xvba-sdk documentation, XVBATransfer iirc - read the last sentence about the scaling.
Offtopic: the BOB deinterlacer on Intel causes more GPU load than decode, and even more than encode (!) ... which was quite surprising :-)
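For anyone wondering what BOB actually does with that single field, a minimal sketch (one 8-bit luma plane; all names made up for illustration):

    #include <cstdint>
    #include <cstring>

    // Bob deinterlacing: copy the lines the field has, and fill the lines it
    // does not have by doubling them. No filtering at all, hence the modest
    // quality fritsch describes.
    void bobField(const uint8_t* field, int fieldPitch,
                  uint8_t* frame, int framePitch,
                  int width, int fieldHeight, bool topField)
    {
        for (int y = 0; y < fieldHeight; ++y)
        {
            const uint8_t* src = field + y * fieldPitch;
            // the line the field actually has
            std::memcpy(frame + (2 * y + (topField ? 0 : 1)) * framePitch, src, width);
            // the missing line, filled by doubling
            std::memcpy(frame + (2 * y + (topField ? 1 : 0)) * framePitch, src, width);
        }
    }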
As I understand, the h264@high L5.1 issue (and driver support for MPEG-2 and MPEG-4) has been resolved (in driver and SDK) by AMD. I still wonder whether the Kabini GPU will be strong enough to also do the deinterlacing?
fritsch (Team-Kodi Developer)
Posts: 23,527 | Joined: Aug 2011 | Reputation: 1,107
You understood wrong.
Only Level 5.1 was solved.
mpeg-2 hw accel: still not existent.
mpeg-4 hw accel: still not existent.
Still only BOB deinterlacing.
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
Posts: 824 | Joined: Jun 2005 | Reputation: 6
2013-07-10, 09:06
(This post was last modified: 2013-07-10, 09:14 by ezechiel1917.)
Just wondering: what's the slowest nvidia GPU that supports smooth VDPAU playback without dropped frames, with full temporal/spatial deinterlacing + vdpau sharpness, on 1080i50 HDTV streams with the XVBA codebase?
What's the recommended minimum core config: 48 or 96 ps cores? Is a 64-bit memory bus enough?
Basically I'm looking for the lowest-TDP nvidia that could run fanless, if that's possible.
Thanks for any suggestions.
fritsch (Team-Kodi Developer)
Posts: 23,527 | Joined: Aug 2011 | Reputation: 1,107
The GT610 has the lowest TDP, I think. I would not go with a GT210 or something like that, because that one is really too slow.
Basically, look at the specs of the Zotac ID42, or the D2500 mini-ITX board from Zotac with the integrated GT520 - those I would suggest.
Is vdpau sharpness really needed for 1080i50 on a 1080p monitor?
First decide what functions / features you expect from a system. Then decide for the hardware. Don't waste your money on crap.
Posts: 824 | Joined: Jun 2005 | Reputation: 6
2013-07-10, 13:09
(This post was last modified: 2013-07-10, 13:13 by ezechiel1917.)
Thanks! I was actually hoping for the GT610 (for the sake of not too high temperatures with it being fanless).
Hopefully it should work with XVBA 1080i50 full deinterlacing (found FernetMenta's post mentioning his GT520 does a good job); not sure about vdpau sharpness though.
I'm really used to it; a sharpness of 0.28 adds a nice touch to perceived visual quality for me, even with 1080p material. So ideally I would prefer not to lose that setting.
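For reference, enabling those two things on a VDPAU video mixer looks roughly like this (a sketch only: the function pointers are normally fetched via VdpGetProcAddress, the features must also have been requested at VdpVideoMixerCreate time, and error handling is omitted):

    #include <vdpau/vdpau.h>

    // Turn on temporal/spatial deinterlacing and set sharpness to 0.28
    // (the attribute is a float in [-1.0, 1.0]).
    void configureMixer(VdpVideoMixer mixer,
                        VdpVideoMixerSetFeatureEnables* setFeatureEnables,
                        VdpVideoMixerSetAttributeValues* setAttributeValues)
    {
        VdpVideoMixerFeature features[] = {
            VDP_VIDEO_MIXER_FEATURE_DEINTERLACE_TEMPORAL_SPATIAL,
            VDP_VIDEO_MIXER_FEATURE_SHARPNESS
        };
        VdpBool enables[] = { VDP_TRUE, VDP_TRUE };
        setFeatureEnables(mixer, 2, features, enables);

        VdpVideoMixerAttribute attrs[] = {
            VDP_VIDEO_MIXER_ATTRIBUTE_SHARPNESS_LEVEL
        };
        float level = 0.28f;
        const void* values[] = { &level };
        setAttributeValues(mixer, 1, attrs, values);
    }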