Jittery video, tearing, A/V sync - solved
#1
I was suffering from the same problem as many users here: weird video artifacts in XBMC but not in Media Player or VLC. I fiddled around with a bunch of settings and eventually found the solution:

Under the system settings for appearance there is a vsync option which can be set to a number of selections, one being 'let xxx decide' (I cannot decipher the exact text as the skin does not show it properly). The other options are 'Always', 'When playing video' and 'None'. None of these work except 'xxxx decide'. Video that was previously out of sync (when set to 'Always') or tearing (when set to 'None') now plays in perfect sync with no artifacts.
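
For context, in an OpenGL program vsync ultimately comes down to the swap interval the application requests from the driver, and the driver can apply its own policy when the application doesn't force one - presumably that is roughly what the 'let xxx decide' option amounts to, though that is a guess about XBMC's internals. A minimal Python/GLFW sketch, purely for illustration and not XBMC code:

```python
# Illustration only - not XBMC code. Requires the 'glfw' package (pip install glfw).
import glfw

if not glfw.init():
    raise SystemExit("could not init GLFW")

window = glfw.create_window(1280, 720, "vsync demo", None, None)
if not window:
    glfw.terminate()
    raise SystemExit("could not create window")
glfw.make_context_current(window)

# 1 = swap on vblank ('Always'-style), 0 = swap immediately (tearing possible).
# Not calling swap_interval at all leaves the driver's own policy in charge,
# which is roughly what a 'let the driver decide' option would mean.
glfw.swap_interval(1)

while not glfw.window_should_close(window):
    # ... draw the frame here ...
    glfw.swap_buffers(window)   # with interval 1 the swap is synchronized to vblank
    glfw.poll_events()

glfw.terminate()
```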

I continue to believe that this is completely unrelated to any bogus refresh setting. Refresh is an analog setting; HDMI is purely digital and does not include this information at all.

I don't know exactly what XBMC is mucking with when using these settings or whether they serve any meaningful purpose (VGA out?), but when left alone it works perfectly for me, and I had tons of problems before.

So set it to ... 'xxxx decide'. I'd love to know what the exact text of that message is, and maybe the skin could be fixed...
#2
I believe it is "Let driver decide" or "Let graphics driver decide"; someone correct me if I'm wrong.
#3
rernst Wrote: I continue to believe that this is completely unrelated to any bogus refresh setting. Refresh is an analog setting; HDMI is purely digital and does not include this information at all.

Here we go again... :) There is always a refresh rate. ALWAYS. No matter if it's a CRT, LCD or whatever...
#4
rernst Wrote: I continue to believe that this is completely unrelated to any bogus refresh setting. Refresh is an analog setting; HDMI is purely digital and does not include this information at all.
Holy... read up on it, google it. You are wrong; stop spreading misinformation, please.
#5
And 'Let driver decide' was already discussed in the other thread. It only works for some cards such as Nvidia and isn't the solution to the problem.
#6
HexusOdy Wrote: And 'Let driver decide' was already discussed in the other thread. It only works for some cards such as Nvidia and isn't the solution to the problem.

'Let driver decide' worked for me on ATI, Nvidia, and Intel graphics, and it made things far smoother than any other vsync setting.
#7
Hi,

I tested this setting on two ATI-based systems and one Nvidia-based system.

On both ATI systems this setting doesn't eliminate the jittery video, but on my Nvidia-based system I get perfectly smooth video.

My Nvidia system is an Abit I-N73HD mainboard with a built-in GeForce 7100.

Bye,

Sidekick
#8
Nothing too helpful to add. I can only say that I tried 'let the driver decide' (I have an Nvidia GeForce Go 7400 card), and tearing is still present.

:(
#9
PantsOnFire Wrote: Nothing too helpful to add. I can only say that I tried 'let the driver decide' (I have an Nvidia GeForce Go 7400 card), and tearing is still present.

:(

I use an 8600 GT card, so maybe it is an Nvidia issue. But heck, don't blame me, guys. Just trying to be helpful.
#10
ashlar Wrote: Holy... read up on it, google it. You are wrong; stop spreading misinformation, please.
Dude, read up on it; there is no such thing as a refresh rate on HDMI:

http://www.tomshardware.com/forum/50988-...rates-crts

If people like you keep posting this stuff there will always be confusion.

CRTs have refresh rates because the phosphor only lights up for a certain amount of time. Try running a CRT at 60 Hz. Can you see the flicker? Yes, I can. Run an LCD display at 60 Hz? It's rock solid.

Again: THERE IS NO SUCH THING AS A REFRESH RATE ON HDMI!
#11
The main reason why the refresh rate on an LCD may matter in gaming is because of VSync - which is discussed in greater detail in the Vertical Synchronization setting. The simple fact of the matter is that LCD monitors have to work on the basis of receiving new frames of information from a graphics card's frame buffer like a CRT would: i.e., during the VBI. So when VSync is disabled, the graphics card will sometimes race ahead, and when the LCD monitor indicates it is ready for a new frame during the blanking interval, the graphics card may provide a partially new frame overlapping an older one, just like it would for a CRT. An LCD monitor will then display this just the same way a CRT monitor would, resulting in visible tearing. The alternative of enabling VSync can resolve this, but in turn can reduce FPS to a fraction of the refresh rate. The lower your refresh rate, the greater the performance drop, which is why a 60 Hz refresh rate on an LCD may be a problem.

Therefore LCD monitors, despite not actually physically working on the same basis as a CRT, wind up being bound by the same limitations and problems - minus the flicker - because they operate in a software environment originally designed with CRTs in mind.
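
A toy model of the trade-off described above (a sketch of the general behaviour, not XBMC's actual renderer): with VSync off the swap happens the moment a frame is ready and can land mid-scanout, producing a tear; with VSync on, a frame that takes longer than one refresh period waits for the next vblank, so throughput snaps from 60 to 30 to 20 fps.

```python
# Toy model only - illustrates the vsync trade-off, not XBMC's renderer.
REFRESH_HZ = 60.0
PERIOD_MS = 1000.0 / REFRESH_HZ          # ~16.7 ms per refresh

def simulate(render_ms, vsync, frames=300):
    t, tears = 0.0, 0
    for _ in range(frames):
        t += render_ms                                 # frame finishes drawing
        if vsync:
            t = (t // PERIOD_MS + 1) * PERIOD_MS       # wait for the next vblank
        elif t % PERIOD_MS != 0:
            tears += 1                                 # swap lands mid-scanout in this model
    fps = 1000.0 * frames / t
    return fps, tears

for render_ms in (10.0, 20.0, 40.0):
    off_fps, off_tears = simulate(render_ms, vsync=False)
    on_fps, _ = simulate(render_ms, vsync=True)
    print(f"{render_ms:4.1f} ms/frame  vsync off: {off_fps:5.1f} fps, "
          f"{off_tears} torn frames   vsync on: {on_fps:5.1f} fps, no tearing")
```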
#12
rernst Wrote: Dude, read up on it; there is no such thing as a refresh rate on HDMI:

http://www.tomshardware.com/forum/50988-...rates-crts

If people like you keep posting this stuff there will always be confusion.

CRTs have refresh rates because the phosphor only lights up for a certain amount of time. Try running a CRT at 60 Hz. Can you see the flicker? Yes, I can. Run an LCD display at 60 Hz? It's rock solid.

Again: THERE IS NO SUCH THING AS A REFRESH RATE ON HDMI!
Shouting won't make what you say more relevant. And quoting a forum post as the be-all and end-all answer on this subject borders on the ridiculous.

If plasma and LCD displays don't have a refresh rate, why is it that my plasma panel reports both horizontal and vertical refresh rates in the signal menu? Do you want pictures?

How is it that at 24Hz movie material plays smoothly and at 60Hz it doesn't?

How is it that plasma and LCDs have both 60Hz and 50Hz modes to correctly display PAL material?

Don't confuse the fact that LCD and plasma are progressive technologies with the lack of refresh rates. Do the pictures move? They do, and they do so at fixed intervals. There is a refresh rate. It might internally work differently, but it's there.
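
To put numbers on the 24 Hz vs 60 Hz and PAL questions above (just an illustration of the arithmetic, not anything XBMC-specific): on a 60 Hz panel a 24 fps frame cannot always be held for the same number of refreshes, so on-screen hold times alternate, while at 24 Hz (or 50 Hz for 25 fps PAL) every frame gets exactly the same time on screen.

```python
import math

def hold_times_ms(fps, refresh_hz, frames=6):
    """How long each source frame stays on screen when every frame is shown
    at the first vblank at or after its ideal presentation time."""
    shown_at = []
    for i in range(frames):
        due = i / fps
        vblank = math.ceil(due * refresh_hz - 1e-9) / refresh_hz
        shown_at.append(vblank)
    return [round((b - a) * 1000, 1) for a, b in zip(shown_at, shown_at[1:])]

print("24 fps on 60 Hz:", hold_times_ms(24, 60))   # [50.0, 33.3, 50.0, 33.3, 50.0] - 3:2 judder
print("24 fps on 24 Hz:", hold_times_ms(24, 24))   # [41.7, 41.7, 41.7, 41.7, 41.7] - even
print("25 fps on 60 Hz:", hold_times_ms(25, 60))   # uneven again
print("25 fps on 50 Hz:", hold_times_ms(25, 50))   # [40.0, 40.0, 40.0, 40.0, 40.0] - even
```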
#13
idioteque Wrote: The main reason why the refresh rate on an LCD may matter in gaming is because of VSync - which is discussed in greater detail in the Vertical Synchronization setting. The simple fact of the matter is that LCD monitors have to work on the basis of receiving new frames of information from a graphics card's frame buffer like a CRT would: i.e., during the VBI. So when VSync is disabled, the graphics card will sometimes race ahead, and when the LCD monitor indicates it is ready for a new frame during the blanking interval, the graphics card may provide a partially new frame overlapping an older one, just like it would for a CRT. An LCD monitor will then display this just the same way a CRT monitor would, resulting in visible tearing. The alternative of enabling VSync can resolve this, but in turn can reduce FPS to a fraction of the refresh rate. The lower your refresh rate, the greater the performance drop, which is why a 60 Hz refresh rate on an LCD may be a problem.

Therefore LCD monitors, despite not actually physically working on the same basis as a CRT, wind up being bound by the same limitations and problems - minus the flicker - because they operate in a software environment originally designed with CRTs in mind.
Indeed, I believe that is (somewhat) correct. However, if Wikipedia is any proper explanation, here is what it has to say:

---------
Liquid crystal displays

Much of the discussion of refresh rate does not apply to the liquid crystal portion of an LCD monitor. This is because while a CRT monitor uses the same mechanism for both illumination and imaging, LCDs employ a separate backlight to illuminate the image being portrayed by the LCD's liquid crystal shutters. The shutters themselves do not have a "refresh rate" as such due to the fact that they always stay at whatever opacity they were last instructed to continuously, and do not become more or less transparent until instructed to produce a different opacity.

The closest thing liquid crystal shutters have to a refresh rate is their response time, while nearly all LCD backlights (most notably fluorescent cathodes, which commonly operate at ~200Hz) have a separate figure known as flicker, which describes how many times a second the backlight pulses on and off.
---------------

Given that there is some sort of 'refresh rate' of 200 Hz, I find it unlikely (albeit not impossible) that tearing or jitter is an issue due to this factor.

Instead, I believe the logic in XBMC using this value is somehow defective due to incorrect assumptions (anybody watching HD on a CRT TV set?). This would explain why 'letting the display driver decide' seems to work for many. Nvidia probably has a better grip on these matters.

One should look into the logic behind the vertical sync in the XBMC code and not speculate on multiples of frame rates.

It should also be obvious that trying to 'match as closely as possible' the frame rate is a futile attempt, as when exactly the frame is rendered is unrelated to when the signal is passed on. The only surefire way to minimize this problem would be to run at the highest possible refresh rate.
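
To put a rough number on the 'highest possible refresh rate' argument (back-of-the-envelope only, not a claim about what XBMC or the driver actually does): if a finished frame simply waits for the next vblank, the worst-case extra delay before it appears is one refresh period, which shrinks as the refresh rate rises.

```python
# Worst-case wait for the next vblank at various refresh rates (illustration only).
# The average assumes a frame can finish at any point within the refresh cycle.
for hz in (50, 60, 75, 100, 120):
    period_ms = 1000.0 / hz
    print(f"{hz:3d} Hz: up to {period_ms:5.2f} ms extra delay "
          f"(~{period_ms / 2:4.2f} ms on average)")
```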

However, Nvidia's drivers certainly pick one and only one refresh rate for the device they are attached to, in my case 60 Hz. That being said, I would really like to hear an explanation for why I am now getting no tearing, no jitter and perfect A/V sync.
#14
rernst Wrote: Instead, I believe the logic in XBMC using this value is somehow defective due to incorrect assumptions (anybody watching HD on a CRT TV set?). This would explain why 'letting the display driver decide' seems to work for many. Nvidia probably has a better grip on these matters.

One should look into the logic behind the vertical sync in the XBMC code and not speculate on multiples of frame rates.
No... as pointed out, leave the underlying technology aside. LCDs and plasmas do display moving images with updates at fixed intervals. That's what refresh rate means for them. If you have your LCD set at 60 Hz it displays 60 frames per second; if it's set at 50 Hz it displays 50 frames per second. And so on and so forth.
#15
ashlar Wrote: No... as pointed out, leave the underlying technology aside. LCDs and plasmas do display moving images with updates at fixed intervals. That's what refresh rate means for them. If you have your LCD set at 60 Hz it displays 60 frames per second; if it's set at 50 Hz it displays 50 frames per second. And so on and so forth.
See my other post: since there is no such thing as interlacing, where pulldown minimizes jitter, how come my '60 Hz display rate' produces tear-free and jitter-free video?

Something does not add up.

Also note that I am using VFR, just in case you presumed that there is some fancy logic in the Nvidia card that adjusts to signal changes (which would be fancy indeed).