Issues with VDPAU, Auto Adjust Screen Refresh and Deinterlacing
#1
Hi,

I am looking for some help please, apologies for the long post, I wanted to include any data that may be relevant.

I am having a problem getting my HTPC to work consistently at 1080p. It works perfectly at 1080p 24 fps (recorded from TV by MythTV using an HDHomeRun). For regular recordings of 60i TV with Auto deinterlacing, it only achieves 40 or so fps, with jerky video and lots of dropped frames. If I disable deinterlacing or use one of the half frame rate deinterlacers, then it works correctly at 29.97 fps. If I use Bob, I can achieve 59.94 fps, but with temporal or temporal-spatial I see the low frame rate.

There is one twist. Some of the commercials are 24 fps; when one of these plays, xbmc detects the frame rate and switches to 24 fps, and when the regular show comes back on, the frame rate stays at 24 fps - which makes the jerkiness and frame drops worse. If I then jump forward or backwards and land on the regular show at 60i, xbmc detects this and switches to 59.94 fps, but this time it maintains the frame rate and everything is good. The only way to make this work is to have xbmc detect a 24 fps section and switch to it, then switch back. See http://pastebin.com/6uEnDj8K for a debug log which shows the initial start at 60 fps (but limited to 40 or so fps), the detection and switch to 24 fps, then the jump and switch back to 60 fps, this time correctly at 60 fps. I noticed nothing in the log that seemed to indicate the difference.

Possibly related: with 720x480 DVD rips of TV shows still in their original MPG format (16:9), using Auto for deinterlacing and Auto for scaling gives exactly the same low frame rate and dropped frames as above. However, changing the scaling to Bilinear, or setting VDPAU scaling in advancedsettings.xml, seems to clear this up completely. Interestingly, autoscalemaxfps didn't appear to help.

My goal would be to leave deinterlacing and scaling on one setting (preferably Auto/Auto) for family ease of use. Using Bob would be perfect, except that 720p recordings drop frames with Bob but work perfectly with Auto. Setting scaling to any value doesn't appear to affect 720p recordings.

Any ideas please?

My HTPC front end is an AMD Athlon 64 X2 3800+ CPU with 1 GB of RAM and an NVIDIA GeForce 210. This is connected via HDMI to a Vizio VL320M, which is 1080p capable.

I am running:
VDPAU from the PPA, version 260.19.29
xbmc Dharma release, SVN 35648, built on Dec 17th, again from the PPA
Kernel 2.6.32-26, x86_64, from Mythbuntu 10.04

I used this link http://forum.xbmc.org/showthread.php?tid=70068 to configure my system to automatically adjust the refresh rate. My modes.txt looks like this:

--- Modes in ModePool for VIZ VL320M (DFP-1) ---
"nvidia-auto-select" : 1920 x 1080 @ 60.0 Hz (from: EDID)
"1920x1080" : 1920 x 1080 @ 60.0 Hz (from: EDID)
"1920x1080_60" : 1920 x 1080 @ 60.0 Hz (from: EDID)
"1920x1080_60_0" : 1920 x 1080 @ 59.94/60 Hz (CEA-861B Format 16) (from: EDID)
"1920x1080_50" : 1920 x 1080 @ 50 Hz (CEA-861B Format 31) (from: EDID)
"1920x1080_24" : 1920 x 1080 @ 23.97/24 Hz (CEA-861B Format 32) (from: EDID)
"1920x1080_60i" : 1920 x 1080 @ 59.94/60 Hz (CEA-861B Format 5) (from: EDID)
"1920x1080_50i" : 1920 x 1080 @ 50.0 Hz Interlace (from: EDID)
"1280x720" : 1280 x 720 @ 59.94/60 Hz (CEA-861B Format 4) (from: EDID)
"1280x720_60" : 1280 x 720 @ 59.94/60 Hz (CEA-861B Format 4) (from: EDID)
"1280x720_50" : 1280 x 720 @ 50.0 Hz (from: EDID)
"1024x768" : 1024 x 768 @ 75.0 Hz (from: EDID)
"1024x768_75" : 1024 x 768 @ 75.0 Hz (from: EDID)
"1024x768_70" : 1024 x 768 @ 70.1 Hz (from: EDID)
"1024x768_60" : 1024 x 768 @ 60.0 Hz (from: EDID)
"800x600" : 800 x 600 @ 75.0 Hz (from: EDID)
"800x600_75" : 800 x 600 @ 75.0 Hz (from: EDID)
"800x600_72" : 800 x 600 @ 72.2 Hz (from: EDID)
"800x600_60" : 800 x 600 @ 60.3 Hz (from: EDID)
"800x600_56" : 800 x 600 @ 56.2 Hz (from: EDID)
"720x576" : 720 x 576 @ 50.0 Hz (from: EDID)
"720x576_50" : 720 x 576 @ 50.0 Hz (from: EDID)
"720x576_50i" : (1440)x 576 @ 50 Hz Interlace (CEA-861B Format 21) (from: EDID)
"720x480" : 720 x 480 @ 59.9 Hz (from: EDID)
"720x480_60" : 720 x 480 @ 59.9 Hz (from: EDID)
"640x480" : 640 x 480 @ 75.0 Hz (from: EDID)
"640x480_75" : 640 x 480 @ 75.0 Hz (from: EDID)
"640x480_73" : 640 x 480 @ 72.8 Hz (from: EDID)
"640x480_60" : 640 x 480 @ 59.94/60 Hz Interlace (CEA-861B Format 1) (from: EDID)
--- End of ModePool for VIZ VL320M (DFP-1): ---


My xorg.conf:

Section "Device"
Identifier "nvidia"
Driver "nvidia"
Option "NoLogo" "true"
Option "DynamicTwinView" "false"
Option "NoFlip" "false"
Option "FlatPanelProperties" "Scaling = Native"
Option "ModeValidation" "NoVesaModes, NoXServerModes"
Option "UseDisplayDevice" "DFP-1"
Option "ModeDebug" "true"
Option "HWCursor" "false"
EndSection

Section "Screen"
Identifier "screen"
Device "nvidia"
SubSection "Display"
Modes "1920x1080_60_0" "1920x1080_50" "1920x1080_24"
EndSubSection
EndSection

Section "Extensions"
Option "Composite" "false"
EndSection


My advancedsettings.xml:

<advancedsettings>
<video>
<vdpauscaling>true</vdpauscaling>
<adjustrefreshrate>
<override>
<fpsmin>27.0</fpsmin>
<fpsmax>31.0</fpsmax>
<refreshmin>59.9</refreshmin>
<refreshmax>60.0</refreshmax>
</override>
<override>
<fpsmin>23.9</fpsmin>
<fpsmax>24.1</fpsmax>
<refreshmin>23.96</refreshmin>
<refreshmax>24.0</refreshmax>
</override>
<override>
<fpsmin>24.65</fpsmin>
<fpsmax>25.4</fpsmax>
<refresh>50.0</refresh>
</override>
<fallback>
<refreshmin>59.9</refreshmin>
<refreshmax>60.0</refreshmax>
</fallback>
</adjustrefreshrate>
</video>
</advancedsettings>


Noting that there is conflicting info about the G210's ability to do post-processing at 1080p rates, here is the output of qvdpautest. I have no idea how reliable this information is, but to my layman's understanding it implies everything should work at 60 fps, albeit only just:


AMD Athlon™ 64 X2 Dual Core Processor 3800+
NVIDIA GPU GeForce 210 (GT218) at PCI:1:0:0 (GPU-0)

VDPAU API version : 1
VDPAU implementation : NVIDIA VDPAU Driver Shared Library 260.19.29 Wed Dec 8 12:27:03 PST 2010

SURFACE GET BITS: 684.746 M/s
SURFACE PUT BITS: 718.099 M/s

MPEG DECODING (1920x1080): 68 frames/s
MPEG DECODING (1280x720): 162 frames/s
H264 DECODING (1920x1080): 62 frames/s
H264 DECODING (1280x720): 135 frames/s
VC1 DECODING (1440x1080): 77 frames/s

MIXER WEAVE (1920x1080): 362 frames/s
MIXER BOB (1920x1080): 598 fields/s
MIXER TEMPORAL (1920x1080): 164 fields/s
MIXER TEMPORAL + IVTC (1920x1080): 108 fields/s
MIXER TEMPORAL + SKIP_CHROMA (1920x1080): 220 fields/s
MIXER TEMPORAL_SPATIAL (1920x1080): 65 fields/s
MIXER TEMPORAL_SPATIAL + IVTC (1920x1080): 52 fields/s
MIXER TEMPORAL_SPATIAL + SKIP_CHROMA (1920x1080): 73 fields/s
MIXER TEMPORAL_SPATIAL (720x576 video to 1920x1080 display): 235 fields/s
MIXER TEMPORAL_SPATIAL + HQSCALING (720x576 video to 1920x1080 display): 130 fields/s

MULTITHREADED MPEG DECODING (1920x1080): 68 frames/s
MULTITHREADED MIXER TEMPORAL (1920x1080): 114 fields/s

Thanks

Alan
#2
That's the framerate calculation getting in the way here. What you can do for now is switch to 60 Hz for 23.976 fps video and use the 2:3 pulldown correction on the TV until I get this fixed.
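
Something like this in the <adjustrefreshrate> block you already posted should do it - just a sketch adapted from your existing overrides (it replaces the current 24 fps override, and the exact fps window is up to you): it maps the ~23.976 fps range onto the 59.94/60 Hz modes instead of 24 Hz.

<override>
<fpsmin>23.9</fpsmin>
<fpsmax>24.1</fpsmax>
<refreshmin>59.9</refreshmin>
<refreshmax>60.0</refreshmax>
</override>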
#3
Thanks - I modified advancedsettings.xml to reflect that change, but it made no obvious difference. I still saw the TV blank momentarily, as if changing refresh rate, when I played 24 fps material. Also, the 60i material could still only manage 40 or so fps. Even disabling the sync display to video setting made no difference. I'll try this again later, as I am probably messing this up in my haste.

My best results are when I disable deinterlacing completely. Pretty much everything plays adequately like this - I am assuming my TV is either line-doubling or handling the interlaced video well.
#4
Your video card is probably too slow for temporal and temporal/spatial deinterlacing at 1080i.
I have the same thing with my 8600 GT.
#5
Thanks, that was one of my suspicions - though it seems erratic, as it does appear to work when I have a 24 fps commercial followed by a manual skip to a 60i section. I didn't see in the log which deinterlacer, if any, was chosen. Also, with the latest nvidia drivers, MythTV appears to work OK at Advanced 2X on 1080i, which I am equating to temporal-spatial - previously I used straight Advanced.

In terms of the qvdpautest results, for temporal/temporal-spatial, would 120 fields per second be needed for deinterlaced 1080i at 60 fps (I must confess to simply not knowing whether 60 fields/30 fps interlaced equates to 60 fields/60 fps deinterlaced, or whether I need to double both numbers), or is it just a matter of more headroom in general?

Is there any way to force Bob or half rate temporal-spatial for 1080 material and full rate for everything else, while leaving xbmc to auto-select the best fit?

Again thanks
#6
A deinterlacer makes frames from fields, so 60 frames per second is ok.
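
As a rough sanity check against the qvdpautest figures you posted - assuming those mixer numbers are fields processed per second, and that a full-rate deinterlacer turns each field into a frame (so 1080i60 needs the mixer to keep up with roughly 60 fields/s, not 120) - a quick back-of-the-envelope sketch:

# Rough check: 1080i60 delivers ~59.94 fields per second, and a full-rate
# deinterlacer outputs one frame per field, so the mixer only needs to keep
# up with ~60 fields/s (not 120). Assumption: qvdpautest's mixer figures
# are fields processed per second.
FIELDS_NEEDED = 59.94

mixer_fields_per_sec = {  # taken from the qvdpautest output above
    "BOB": 598,
    "TEMPORAL": 164,
    "TEMPORAL_SPATIAL": 65,
}

for name, rate in mixer_fields_per_sec.items():
    print(f"{name:17s} {rate:4d} fields/s -> {rate / FIELDS_NEEDED:.1f}x headroom")

Temporal-spatial comes out at barely 1.1x, which leaves almost no margin once decoding, scaling and rendering also share the GPU - which would fit the ~40 fps you are seeing with the advanced deinterlacers.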
