Linux - Best option for deinterlacing
#1
Question 
Hi,

I'm looking at the idea of building an XBMC machine to replace my Sky+ box. The experiment so far has consisted of buying a twin sat tuner card (DVBSKY-S952) and installing it in my NAS box (an HP ProLiant N36L) along with the Tvheadend backend software. Initially I'm using my existing desktop machine for the XBMC frontend (a DH57JG motherboard with a 1st gen i3 and 2GB of memory). If I can get everything working to my satisfaction then I will build a new dedicated frontend machine just for this.

My desktop is running Kubuntu 13.04; I've installed the latest Frodo release (12.2) and hooked it up to my TV. I have managed to get it working so that I can view TV channels, and the first thing I noticed was the combing from the interlaced signal. The ticker-tape headline display at the bottom of the screen on the BBC News channel is particularly bad. The TV says the input signal is 1080p@50Hz. For comparison, when hooked up to the Sky+ box the TV reports either 1080i@50Hz for HD or 576p@50Hz for SD channels.

Looking at the video settings, it appears that deinterlacing is off by default and there are several different options to choose from if I do enable it (blend, deinterlace, deinterlace half, weave, weave invert, software blend, etc.). After some googling it seems the deinterlace methods available depend on the graphics adaptor being used (and possibly also the OS).

I have several questions:

1) Which graphics hardware (Intel, NVIDIA or AMD) do people think gives the best picture quality for viewing interlaced material on an LCD TV?
2) Which of the deinterlacing methods gives the best picture quality? (A description of what the various methods do and how they work would be useful here.)
3) Is it possible to output 1080i (or 576i) and let the TV do the deinterlacing (does this depend on the choice of graphics hardware)? Is it likely to give better results than getting the graphics adaptor to do the deinterlacing and scaling? (A quick way to check this is sketched after this list.)
4) Are the deinterlacing methods available under Windows better than those under Linux?
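
On question 3, a rough way to see whether the driver even exposes the TV's interlaced modes is to list them with xrandr. The output name and mode name below are placeholders and will vary by driver and connector, so treat this as a sketch rather than a recipe:

  # list connected outputs and the modes read from the TV's EDID;
  # interlaced modes usually show up with a trailing "i", e.g. 1920x1080i
  xrandr
  # switch to an interlaced mode so the TV does the deinterlacing itself
  xrandr --output HDMI1 --mode 1920x1080i --rate 50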

I was originally thinking that my custom-built front end would be an Intel-only solution using an i3 (either Haswell or Ivy Bridge) with built-in graphics (HD3000 or HD4000). But if that doesn't result in the best picture quality I'm happy to consider other options. I'd prefer a Linux-based machine but will consider a Windows build if the picture quality really is noticeably better.

--
Glenn
#2
1) On Linux, NVIDIA comes first, then Intel, and AMD is a distant last.

2) Spatial/temporal.
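
Assuming that means the temporal and temporal-spatial deinterlacers NVIDIA exposes through VDPAU, a quick way to check what the installed driver actually reports (it needs the vdpauinfo tool) is:

  # lists the video mixer features the driver claims to support;
  # look for DEINTERLACE_TEMPORAL and DEINTERLACE_TEMPORAL_SPATIAL
  vdpauinfo | grep -i deinterlace

If the temporal-spatial entry is reported as supported, XBMC should be able to offer it as the deinterlacing method for VDPAU-decoded video.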
#3
I've done a bit more reading on the subject and found this page:

PR 2460

which suggests that with Haswell there will be support within VAAPI for more powerful deinterlacing methods that might be as good as what is currently available with NVIDIA cards via VDPAU. I'm guessing this is some months away from being stable, though, so I think I'm going to opt for an NVIDIA graphics card. I found this link:

http://www.gossamer-threads.com/lists/my...296#536296

which is a great help in narrowing down the choice of card. After reading that I think the Zotac GeForce GT 640 Zone card looks like the best bet:

http://www.amazon.co.uk/ZOTAC-Geforce-ZO...B008J79GHY

It seems to be the most powerful passively cooled card and should have enough grunt to run the best deinterlacing method.
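
Once the card is in, one way to sanity-check that XBMC is actually going through the VDPAU path (rather than falling back to software decoding) is to enable debug logging, play an interlaced channel, and then grep the log. The path below assumes a standard Linux install of XBMC 12:

  # look for the VDPAU decoder and mixer being set up during playback
  grep -i vdpau ~/.xbmc/temp/xbmc.log

If nothing VDPAU-related shows up, the card isn't being used for decoding or deinterlacing at all, whatever method is selected in the OSD.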