GSOC Proposal
#1
Hi
I am Shashwat Pandey, a B.Tech CSE student at the National Institute of Technology, Hamirpur, H.P., India. I have used XBMC on multiple platforms like Ubuntu, Fedora and Mac OS X over the years, and I really like the presentation and media management it provides. It is the best media center that I know of, and the best thing about it is that it is free and open source.

Gesture-based control of the UI
Name: Shashwat Pandey
forum/e-mail: shasha / [email protected]

Summary: Most people prefer to listen to music or watch videos from their couch, i.e. at a distance from their PC or laptop. To change songs, one has to reluctantly get up. The solution is motion-based control of the UI, so that XBMC can be controlled from a distance without the need for a remote control. The plain and simple menu system can easily be traversed with such a system.

How will I achieve this: All laptops have webcams, and most desktop PCs have one as well. The webcam video feed can be monitored for motion (using a library like OpenCV), and simple gestures (up, down, left, right, etc.) can then drive the whole XBMC UI.
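As a very rough illustration of what I mean (just a sketch, assuming OpenCV's Python bindings; the function name, thresholds, and gesture mapping below are placeholders of my own, not the final design):

```python
# Sketch only: difference consecutive webcam frames with OpenCV and turn
# large shifts of the moving region's centroid into up/down/left/right
# "gestures". All thresholds are placeholders.
import cv2

def gestures(camera_index=0, min_pixels=4000, min_shift=60):
    cap = cv2.VideoCapture(camera_index)
    prev_gray, prev_center = None, None
    while cap.isOpened():
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.GaussianBlur(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY), (21, 21), 0)
        if prev_gray is not None:
            diff = cv2.absdiff(prev_gray, gray)                   # what changed since last frame
            _, mask = cv2.threshold(diff, 25, 255, cv2.THRESH_BINARY)
            if cv2.countNonZero(mask) > min_pixels:               # enough motion to care about
                m = cv2.moments(mask, binaryImage=True)
                cx, cy = m["m10"] / m["m00"], m["m01"] / m["m00"] # centroid of the moving area
                if prev_center is not None:
                    dx, dy = cx - prev_center[0], cy - prev_center[1]
                    if max(abs(dx), abs(dy)) > min_shift:         # big enough jump = a gesture
                        if abs(dx) > abs(dy):
                            yield "left" if dx < 0 else "right"
                        else:
                            yield "up" if dy < 0 else "down"
                prev_center = (cx, cy)
            else:
                prev_center = None
        prev_gray = gray
    cap.release()

# Example: print every detected gesture to the console.
# for g in gestures():
#     print(g)
```

The detected directions would then be mapped onto UI actions.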

What will the project focus on: Implementing this feature in XBMC as an optional user interface control system, making it as intuitive as possible, and fine-tuning the functionality to reduce lag. All in all, I will try to make this a feature one would like to use regularly rather than something that is just cool to show off to family and friends.

Benefits: XBMC will have a unique feature that will appeal to anyone who likes to enjoy their media at leisure, lying on their couch.

Goals: To implement this functionality so that it is able to utilise any integrated or third-party webcam. To provide basic functions like moving up and down menus, in and out of folders, playing and stopping music/video, moving through photographs, etc. More sophisticated actions can be attempted if time allows.

What does it touch in XBMC: An Enable/Disable option needs to be added to the settings panel. The processing of the webcam feed, etc., will have to be written anew.

Requirements: Knowledge of APIs that allow the use of webcams (like V4L). A suitable motion detection algorithm that works on streaming video. Knowledge of XBMC internals.
Possible mentors: Place to add possible mentors (Team-XBMC will add this).

Perhaps I got a bit too creative...
Please provide feedback to this idea.
#2
Hi there, and welcome!

The idea is fine; however, note that this would be almost completely external to XBMC: you'd do it either with an event client or by some other technique (even JSON-RPC if you really wanted to).

This is a good thing - you don't want to be trying to get a whole heap of code integrated into XBMC. It's much better if you can do something from the outside, which would let you get something up and running quickly and makes testing much easier (the event client you write could quite happily run in a test mode that just outputs what it receives to the console). Also, you don't have to worry quite so much about platform-specific stuff.
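For instance, a minimal sketch of that kind of external client (purely illustrative; it assumes XBMC's web server / remote control option is enabled on the default port 8080 and uses the standard JSON-RPC Input.* methods - an event client would be analogous):

```python
# Purely illustrative: translate a detected gesture into an XBMC action,
# either by printing it (test mode) or via XBMC's JSON-RPC HTTP interface.
# Assumes HTTP remote control is enabled and the web server listens on
# port 8080; Input.Up/Down/Left/Right are standard JSON-RPC methods.
import json
import urllib.request

GESTURE_TO_METHOD = {
    "up": "Input.Up",
    "down": "Input.Down",
    "left": "Input.Left",
    "right": "Input.Right",
}

def send_gesture(gesture, host="localhost", port=8080, test_mode=True):
    method = GESTURE_TO_METHOD.get(gesture)
    if method is None:
        return
    if test_mode:
        print("would send:", method)   # test mode: just log to the console
        return
    payload = json.dumps({"jsonrpc": "2.0", "method": method, "id": 1}).encode()
    req = urllib.request.Request(
        "http://%s:%d/jsonrpc" % (host, port),
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req).read()

# send_gesture("left")                  -> prints "would send: Input.Left"
# send_gesture("left", test_mode=False) -> actually moves the selection
```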

Cheers,
Jonathan
#3
This sounds really promising, but difficult to achieve.
I think this must be optional, since it would add multiple dependencies and (as far as I know) require a lot of processing power.
It also needs to be possible to enable and disable it on the fly, to avoid processing unexpected motion. Maybe a gesture with one hand to enable it, and then the command with the other hand?

If you can do it, this would be really awesome. I might even buy a webcam. Seriously.

EDIT: oops, jmarshall, you were way quicker than me. I agree with you on the client/server design.
I need to improve my English skills, feel free to correct me ;-)
#4
Thank you for your replies.

jmarshall - I would write an event client for the processing part. I think using the OpenCV libraries to process the webcam stream and track the motion of the user's hand would be a good idea, as it is also platform independent. Suggestions?
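For example, something along these lines (again only a rough sketch of my own, using OpenCV's Farneback dense optical flow; parameters are placeholders):

```python
# Rough sketch: estimate the dominant motion direction between two grayscale
# frames with Farneback dense optical flow. Thresholds are placeholders.
import cv2
import numpy as np

def flow_direction(prev_gray, gray, min_mean_flow=3.0):
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)
    dx = float(np.mean(flow[..., 0]))   # average horizontal motion (pixels/frame)
    dy = float(np.mean(flow[..., 1]))   # average vertical motion
    if max(abs(dx), abs(dy)) < min_mean_flow:
        return None                     # not enough motion to count as a gesture
    if abs(dx) > abs(dy):
        return "left" if dx < 0 else "right"
    return "up" if dy < 0 else "down"
```

Each detected direction would then be handed to the event client, which forwards it to XBMC.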

M@yeulC - I am still working on the finer details of the idea. I hope this will make you buy a webcam just to try this awesome feature out.

I originally wanted to implement this as a general control interface, but that did not seem feasible with the complex UIs of our operating systems.
The simple (yet beautiful) UI of XBMC has motivated me to give it a shot.
#5
Did you read about the ARM Mali GPU demos at CES 2014? They actually demoed basic gesture recognition with XBMC just to show off the OpenCL capability of their new GPU.

Check out http://www.cnx-software.com/2014/01/23/a...-decoding/

Quote:Phill Smith, Demo Manager at ARM, has filmed and uploaded four very interesting demos of what new features will be possible thanks to new generation ARM Mali-450 and Mali-T6xx GPUs including 4K 3D user interfaces and games, ASTC texture compression, and OpenCL accelerated gesture recognition and HEVC / H.265 video decoding.

The next demo shows OpenCL-accelerated Eyesight gesture recognition on an Arndale board powered by an Exynos 5250 SoC with ARM Mali-T604. The board runs Linux and is connected to a standard (and crappy) Logitech webcam. It can follow hand gestures with OpenCL acceleration, something that is not possible with the dual-core Cortex-A15 CPU only. They've also integrated the demo with XBMC, and showed how to navigate the XBMC user interface with your hand only, no remote needed.
Gesture Recognition with XBMC

As I understand it, though, for this demo they simply used Eyesight's (http://eyesight-tech.com) closed-source gesture recognition software as mouse input and fed it to XBMC, rather than doing a proper implementation via a JSON-RPC or EventServer API based client/server design. I still think the concept is cool, even if using gesture control only to emulate mouse-cursor-style control like they did in their demo, similar to what Microsoft did with Kinect for its initial Xbox menu control, must be the worst idea for XBMC usage.

The point of their demo was really only to show off OpenCL-accelerated things like OpenCV while also doing OpenGL ES for XBMC at the same time.

Whatever you do, please make the client cross-platform, or at least as portable as possible, so it can be used on all OSes that XBMC supports. I suggest basing it on something like OpenNI2 (https://github.com/OpenNI/OpenNI2) or a similar framework that has a strong upstream community; the OpenNI community is at http://www.openni.org, and something like OpenNI2 can support both Kinect and standard webcams via OpenCV (http://www.openni.org/files/webcam4openni2/) and can also be extended for Python input.
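Just to give an idea of how little boilerplate the framework needs, here is a rough sketch (my assumption: it uses the primesense Python bindings for OpenNI2 and expects a suitable OpenNI2 driver for your camera to be installed):

```python
# Sketch only: grab frames through OpenNI2 using the (assumed) "primesense"
# Python bindings. A depth sensor such as a Kinect is shown; a color stream
# from a webcam driver like webcam4openni2 would look much the same.
from primesense import openni2

openni2.initialize()                  # loads the OpenNI2 runtime
dev = openni2.Device.open_any()       # first available sensor
depth = dev.create_depth_stream()
depth.start()

frame = depth.read_frame()
pixels = frame.get_buffer_as_uint16() # raw depth values to feed a gesture recognizer

depth.stop()
openni2.unload()
```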
#6
Thank you, Hedda, for the info.

I have exams coming up at college, so I will look into it after two weeks. OpenNI looks promising, though. Thanks for that.

P.S. The demo is not the way I intend to implement this.
