2013-08-12, 13:12
Hi,
I'm looking at the idea of building an XBMC machine to replace my Sky+ box. The experiment so far has consisted of buying a twin sat tuner card (DVBSKY-S952) and installing it in my NAS box (an HP ProLiant N36L) along with the Tvheadend backend software. Initially I'm using my existing desktop machine for the XBMC frontend (a DH57JG motherboard with a 1st gen i3 and 2 GB of memory). If I can get everything working to my satisfaction, I'll build a new dedicated front-end machine just for this.
My desktop is running Kubuntu 13.04; I've installed the latest Frodo release (12.2) and hooked it up to my TV. I have it working well enough to view TV channels, and the first thing I noticed was the combing from the interlaced signal. The ticker-tape headline display at the bottom of the screen on the BBC News channel is particularly bad. The TV reports the input signal as 1080p@50Hz. For comparison, when hooked up to the Sky+ box the TV reports either 1080i@50Hz for HD channels or 576p@50Hz for SD channels.
Looking at the video settings, deinterlacing appears to be off by default, and there are several options to choose from if I do enable it (blend, deinterlace, deinterlace half, weave, weave invert, software blend, etc.). After some googling, it seems the deinterlace methods available depend on the graphics adaptor in use (and possibly also on the OS).
I have several questions:
1) Which graphics hardware (Intel, NVIDIA or AMD) do people think gives the best picture quality for viewing interlaced material on an LCD TV?
2) Which of the deinterlacing methods gives the best picture quality? (A description of what the various methods do and how they work would be useful here.)
3) Is it possible to output 1080i (or 576i) and let the TV do the deinterlacing, and does that depend on the choice of graphics hardware? Is it likely to give better results than having the graphics adaptor do the deinterlacing and scaling?
4) Are the deinterlacing methods available under Windows better than those under Linux?
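To make question 2 a bit more concrete, here's my rough mental model of two of the simpler methods as a toy sketch (illustrative Python only, not anything from the XBMC source, and the method names here are just my understanding of the settings labels):

```python
# An interlaced frame carries two fields captured at different moments:
# the top field holds the odd scanlines, the bottom field the even ones.
# "Weave" interleaves the fields as-is (combing appears on motion);
# "blend" averages the two fields (no combing, but motion goes soft).

def weave(top_field, bottom_field):
    """Interleave the two fields line by line into one frame."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        frame.append(top_line)
        frame.append(bottom_line)
    return frame

def blend(top_field, bottom_field):
    """Average corresponding lines from the two fields, then line-double."""
    frame = []
    for top_line, bottom_line in zip(top_field, bottom_field):
        mixed = [(a + b) // 2 for a, b in zip(top_line, bottom_line)]
        frame.append(mixed)
        frame.append(mixed)
    return frame

# Two 2-line fields of a bright bar that moved between field captures:
top = [[200, 200, 0, 0], [200, 200, 0, 0]]      # bar on the left
bottom = [[0, 0, 200, 200], [0, 0, 200, 200]]   # bar on the right

print(weave(top, bottom))  # alternating left/right lines -> "combing"
print(blend(top, bottom))  # averaged grey -> comb-free but blurred
```

On a static scene weave is lossless (the two fields really do form one frame), which is presumably why it's offered at all; it's fast motion like that news ticker where it falls apart.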
I was originally thinking my custom-built front end would be an Intel-only solution using an i3 with built-in graphics (either Ivy Bridge with HD 4000 or Haswell with HD 4400/4600). But if that doesn't give the best picture quality, I'm happy to consider other options. I'd prefer a Linux-based machine but will consider a Windows build if the picture quality really is noticeably better.
--
Glenn