Interlaced video output - proof of principle
#1
I've submitted a feature request (http://trac.xbmc.org/ticket/12960) that includes patches demonstrating how interlaced video output could be achieved.

The method is described in more detail in this old post:
http://forum.xbmc.org/showthread.php?tid=81834

The method overcomes the problem of random bad field sync when watching interlaced video on an interlaced display.

My hope is that this will remove the need for power-hungry deinterlacers in the media center, relying instead on the deinterlacer in your TV.

Thanks
#2
Can you even buy an interlaced display nowadays?
#3
If I understand correctly, the goal isn't support for CRTs (interlaced displays) but letting the TV's deinterlacer do the deinterlacing instead of XBMC.
#4
Yes, when I refer to an interlaced display I include flat panels that accept 1080i, and there are a lot of those.

I am pushing this because it potentially avoids a lot of problems associated with watching TV material (XBMC-PVR) and camcorder home movies, which are generally interlaced. The range of hardware that supports this properly is woefully limited. For example, all nVidia ION based systems are unable to perform combined IVTC and temporal-spatial deinterlacing for 1080i video at the required frame rate, and AMD E350 boards running Linux cannot do ANY VA deinterlacing because it is not supported by VAAPI (in fact this applies to all AMD graphics cards).

Currently the only adequate hardware setup for Linux and XBMC is one containing an nVidia graphics system with enough grunt to perform IVTC and temporal-spatial deinterlacing at 1080i 60fps. Since the only graphics systems capable of this are PCIe cards, the motherboard must have a PCI Express x16 slot too. That means a large form factor PC case. These cards are also power hungry.

#5
This sounds interesting.

Did you solve the issue of scaling interlaced material, or are you relying on output at native resolution? Native resolution would be pretty annoying because I haven't seen a single TV yet where you could disable overscan for SD content (only for HD content), so you'd lose the edges of the picture when outputting SD material at native resolution.
#6
No, I did not solve the issue of scaling interlaced material. It has to be IVTC'd and deinterlaced before scaling.

You are right to be concerned about SD content. In the most basic hardware setups, displaying SD at native resolution would be desirable, but I suspect this is very tricky. For HDMI-connected displays, the 576i and 480i modes require pixel doubling to stay above the minimum pixel clock specified for HDMI. This means both the video and all of the GUI would need to be rendered to a 720x576 frame for 576i (720x480 for 480i) and then stretched to the full 1440x576i. In OpenGL you would texture map with glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST) set, so that the pixels are doubled horizontally rather than blurred.
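
To illustrate the pixel doubling, here is a minimal OpenGL sketch (the texture name lowResFrameTex is a placeholder for the 720x576 render target; this shows the idea only, it is not code from the patch):

Code:
// Draw the 720x576 render target into the 1440x576 output with
// nearest-neighbour magnification so each pixel is doubled horizontally
// rather than smeared by bilinear filtering.
glBindTexture(GL_TEXTURE_2D, lowResFrameTex);  // hypothetical 720x576 texture
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
glTexParameterf(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
glViewport(0, 0, 1440, 576);                   // pixel-doubled 576i mode
// ... then draw a full-screen textured quad as usual ...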

Also, mode switching the TV display and the low-res GUI might be annoying. I suspect a halfway-house approach might be more suitable: if, say, you had an ION system, it is quite capable of IVTC and deinterlacing of SD material, so you could display that on a 1080i display. For 1080i material the deinterlacer would be turned off and the interlacer turned on.
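
Something like this hypothetical policy (the enum, struct and function below are invented for illustration and are not XBMC code):

Code:
enum class DeintMode { Off, TemporalSpatial };

struct RenderConfig {
    DeintMode deinterlace;    // deinterlacer in the media center
    bool interlacedOutput;    // weave fields into an interlaced signal
};

// SD sources get deinterlaced/IVTC'd and scaled up; 1080i sources are
// passed straight through with the deinterlacer off. Both reach the TV
// as 1080i.
RenderConfig ChoosePolicy(int sourceHeight, bool sourceInterlaced)
{
    RenderConfig cfg{DeintMode::Off, false};
    if (!sourceInterlaced)
        return cfg;                                   // progressive pass-through
    if (sourceHeight <= 576)
        cfg.deinterlace = DeintMode::TemporalSpatial; // ION can manage SD
    cfg.interlacedOutput = true;                      // output 1080i either way
    return cfg;
}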
#7
A big +1 from here!

Since a great deal of content is still being published in interlaced form (especially SD and HD TV in the Netherlands), deinterlacing is among the last problems I am still trying to tackle.

My nVidia ION system certainly does not have the power to properly deinterlace 1080i content, and even a rather powerful PCI-E card in my upstairs PC (a 430 GT) is not always up to the job.

Being able to send the output to my TV in its original interlaced form (just as the set-top box of my TV provider does) would be awesome!
#8
This would be of HUGE interest to me, and I know a lot of others. Many people with HT setups have Pre/Pros or even receivers (in addition to those that feed directly to their TVs) with onboard chips that handle scaling and deinterlacing duties far more effectively than XBMC can in software (or even with hardware assist). To me this goes hand-in-hand with bitstreaming audio: let the components that are designed for the job do the work.
#9
Another +1 here...

I record a lot of free-to-air TV with TVSchedulerPro (http://sourceforge.net/projects/tvschedulerpro/) - around 3 hours per day.

The Australian broadcasts that I record all suffer from the interlacing issue described; unfortunately my ION and ION2 systems are not really up to the deinterlacing.

This would greatly improve the playback of recorded content for me (and my wife).
#10
Some good news: the patch to the FFmpeg tinterlace filter has been adopted (slightly modified) by the FFmpeg maintainers.

I'm now trying to see whether something similar is possible for VDPAU output. I ruled out the get-bits/put-bits method because for 1080i it would involve transfers on the order of 350 MB/s, which I suspect ION could not achieve. Instead I'm trying the NV_vdpau_interop nVidia OpenGL extension to manipulate the video with OpenGL. It's a steep learning curve for me.
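
For anyone curious, the interop path looks roughly like this. It is only a sketch: vdpDevice, vdpGetProcAddress and videoSurface are assumed to come from the existing VDPAU decode setup, and the exact field/plane ordering of the four textures is defined by the extension spec.

Code:
// Tie the current GL context to the VDPAU device.
glVDPAUInitNV(reinterpret_cast<void*>(vdpDevice),
              reinterpret_cast<void*>(vdpGetProcAddress));

// A VdpVideoSurface registers as four GL_TEXTURE_2D textures
// (luma and chroma for each of the two fields).
GLuint tex[4];
glGenTextures(4, tex);
GLvdpauSurfaceNV glSurface = glVDPAURegisterVideoSurfaceNV(
    reinterpret_cast<void*>(videoSurface), GL_TEXTURE_2D, 4, tex);

glVDPAUSurfaceAccessNV(glSurface, GL_READ_ONLY);
glVDPAUMapSurfacesNV(1, &glSurface);
// ... sample the field textures and weave them into the output frame ...
glVDPAUUnmapSurfacesNV(1, &glSurface);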

The problem with this approach is that it requires OpenGL header files specific to the nVidia drivers. Do the XBMC developers have any kind of policy on the use of OpenGL extensions specific to a particular vendor?
#11
More good news. After some advice from FernetMenta I dropped the FFmpeg filter approach and successfully implemented a new interlaced output mode within the XBMC render manager, which I have called RENDER_WEAVEX2. The mode can work with the software and shader rendering paths. It can also work with VDPAU rendering; however, to achieve this, this fork of XBMC must be modified with the new mode:
https://github.com/FernetMenta/xbmc

I will request that the new mode be implemented in both the official and FernetMenta XBMC repos.

The current limitation is that WEAVEX2 works well in 1080i output modes with perfect field sync, but 576i and 480i modes are not possible yet because of the pixel doubling HDMI requires at these low resolutions. I have tested 576i video at its original size within 1080i mode (imagine big black borders around a small picture); the TV quite happily accepts that, but it is not suitable for comfortable viewing.
#12
Is there a branch that I could use to test with VDPAU rendering?
#13
Yes, FernetMenta has just pulled the WEAVEX2 modification into his master branch.
#14
An alternative approach for 576i and 480i might be to do what broadcast digital video effects (DVE) devices did when they had to scale interlaced video.

If you scale an interlaced frame as a frame, you end up mangling your interlaced fields together and you get all sorts of nastiness (motion judder, odd banding on motion, etc.).

However, if you scale in the field-based domain, you avoid this. Of course you don't get the benefit of the improved vertical resolution that might be possible with a high-quality deinterlace, scale (in the 2x frame domain) and re-interlace, but it might be an option? (As most DVEs were shrinking rather than zooming pictures, the resolution loss was less of an issue.)

Effectively you'd scale each 240-line (480i) or 288-line (576i) field to a 540-line field (1080i). Because you are scaling within the field domain, you don't end up with mangled fields.
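
As a rough illustration of per-field scaling (hypothetical code, not from any DVE; only vertical nearest-neighbour scaling of one plane is shown, and horizontal scaling is omitted):

Code:
#include <algorithm>
#include <cstdint>
#include <vector>

// Scale one field of an interlaced plane from srcH/2 to dstH/2 lines,
// writing into the matching field of the destination frame. Scaling each
// field separately avoids mangling the two fields together.
void ScaleFieldNearest(const std::vector<uint8_t>& src, int width, int srcH,
                       std::vector<uint8_t>& dst, int dstH, bool topField)
{
    const int srcLines = srcH / 2;        // 288 for 576i, 240 for 480i
    const int dstLines = dstH / 2;        // 540 for 1080i
    const int offset = topField ? 0 : 1;  // field lines are interleaved

    for (int y = 0; y < dstLines; ++y)
    {
        const int sy = y * srcLines / dstLines;  // nearest source field line
        const uint8_t* s = &src[(sy * 2 + offset) * width];
        std::copy(s, s + width, &dst[(y * 2 + offset) * width]);
    }
}

// Scaling both fields weaves back into one interlaced frame:
//   ScaleFieldNearest(sd, 720, 576, hd, 1080, true);   // top field
//   ScaleFieldNearest(sd, 720, 576, hd, 1080, false);  // bottom field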
#15
Sorry for not keeping this thread up to date. I think the Weave deinterlacer in Frodo has been modified to perform field-rate weave (double-rate weave), but only for software rendering.

Quote: However if you scale in the field-based domain

This is called Bob. Select the Bob deinterlacer and try it.

Quote: Of course you don't get the benefit of the improved vertical resolution that might be possible with a high quality de-interlace

What is worse: if you are watching a progressive movie delivered over interlaced video, your TV will not get the chance to perform inverse telecine and you will lose half the resolution.

Stu-e
