Kinect, hacked.

Posted in Technology,Video by Scott Meisner on December 29th, 2010

Since its North American release on November 4, Microsoft’s Kinect, the controller-free motion accessory, has proved to be a boon for the open source community.  Over the last month or so I’ve collated the most intriguing Kinect hacks, and today is the day I’ve decided to let them live free inside this post.  Tinkerers are constantly throwing up their latest creations online, so expect “Kinect, hacked” to become an ongoing series.

First up we have Yankeyan‘s Super Mario Bros. Kinect hack.  Using OpenKinect drivers and NES emulation, he’s figured out how to make the plumber’s on-screen movements mimic his physical jumps and arm flails.  It doesn’t match up perfectly, but that doesn’t make the hack any less impressive.

Now go on, hop after the break to browse oodles of Kinect hack videos; I promise they are all super inventive!

Oliver Kreylos uses two Kinect sensors to produce a better 3D representation of an object (in this case, a Kinect box).
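The core idea behind the two-sensor setup is simple: back-project each camera’s depth image into a 3D point cloud, then use a calibrated rigid transform to bring the second camera’s points into the first camera’s frame before merging.  Here is a minimal sketch of that math; the intrinsic values are rough published estimates for the Kinect depth camera, not Kreylos’s calibrated numbers, and the transform would come from an external calibration step.

```python
import numpy as np

def depth_to_points(depth_m, fx=594.2, fy=591.0, cx=320.0, cy=240.0):
    """Back-project a depth image (in metres) into 3D camera-space points.
    The intrinsics are rough community estimates for the Kinect depth
    camera, not calibrated values."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    pts = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return pts[pts[:, 2] > 0]            # drop pixels with no depth reading

def merge_clouds(pts_a, pts_b, R, t):
    """Bring camera B's points into camera A's frame (p' = R p + t)
    and concatenate the two clouds into one."""
    return np.vstack([pts_a, pts_b @ R.T + t])
```

With two sensors viewing an object from different sides, the merged cloud fills in the shadows each single camera leaves behind.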

MIT Media Lab’s Fluid Interfaces Group created DepthJS, a web browser extension that allows any web page to interact with Kinect via JavaScript.

Multitouch screen technologies company Evoluce bridged Kinect and Windows 7 using its Multitouch Input Management (MIM) driver.  Flash- and Java-based applications can be controlled with hand gestures, too.

Chris O’Shea‘s gone ahead and created a Kinect air guitar prototype.  Air strumming never looked so good.  The software is written in C++ using openFrameworks, with OpenCV for image processing, the ofxKinect addon, and the libFreenect driver.

Martin Kaltenbrunner also brings musicality to the Kinect with his “Therenect” hack.  Using the openFrameworks and OpenKinect libraries he has invented a virtual theremin.

Look! It’s a Kinect MIDI controller from Ben X!

This Keyboard Anywhere hack, created by Peter Moz, was inspired by the Tom Hanks film Big (no surprise there).  Using the libFreenect library and Python, a playable keyboard can be called up on any flat surface, no matter its size.
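The trick that makes a flat surface playable is depth comparison: capture a reference depth image of the empty surface once, then flag any pixels that sit a few millimetres closer to the camera than that reference as a fingertip touching down.  Here is a minimal sketch of that idea; the key layout and thresholds are illustrative assumptions, not Keyboard Anywhere’s actual calibration.

```python
import numpy as np

def detect_presses(depth, surface_depth, key_regions, touch_mm=(5, 30)):
    """Return the indices of 'keys' currently being touched.

    depth / surface_depth: depth images in millimetres; surface_depth is a
    one-time capture of the empty surface.  key_regions maps a key index to
    a (row_slice, col_slice) rectangle on that surface (an assumed layout).
    A pixel counts as a touch when it sits a few millimetres *above* the
    surface, i.e. closer to the camera, within the touch_mm band.
    """
    lo, hi = touch_mm
    height = surface_depth - depth            # mm above the surface
    touching = (height > lo) & (height < hi)
    pressed = []
    for key, (rows, cols) in key_regions.items():
        if touching[rows, cols].sum() > 20:   # enough pixels to be a finger
            pressed.append(key)
    return pressed
```

Because everything is relative to the reference capture, the same code works whether the “keyboard” spans a desk or a whole wall.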

Japanese coder Takayuki Fukatsu played around with openFrameworks to make the Kinect turn you nearly invisible!  The Kinect creates this camouflage effect by effectively “skinning an image of the background onto the contours of your body in real time”, as Engadget puts it.
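The compositing step behind that effect is straightforward once you have a depth camera: grab a frame of the empty background once, use depth to mask out the person, and paint the stored background over their pixels.  A minimal sketch, assuming a naive depth-threshold mask where the real hack segments the user’s contour:

```python
import numpy as np

def cloak(frame, background, depth, person_range=(500, 1500)):
    """Replace the pixels belonging to a person with a pre-captured
    background image -- skinning the background onto the body.

    The person mask here is a simple depth threshold (everything between
    0.5 m and 1.5 m of the sensor counts as the person); depth is in
    millimetres.  The compositing itself matches the described effect.
    """
    lo, hi = person_range
    mask = (depth > lo) & (depth < hi)        # True where the person stands
    out = frame.copy()
    out[mask] = background[mask]              # paint the background over them
    return out
```

Run per frame against a live feed and the subject fades into the scenery in real time.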

Looks like Joy Ride isn’t all that grand.  Apparently all you have to do is sit perfectly still to be awarded a bronze medal.

Razorfish, a marketing agency that specializes in gesture-based experiences and 3D imaging, has brought its Surface Physics Illustrator, DaVinci, to the Kinect.  “Gestures are used to create objects and control the physics of the environment”, they explain.  “Your hands appear in the interface which allows you to literally grab objects out of thin air and move them in the environment.  Additional gestures allow you to affect the gravity, magnetism and attraction.”  It’s quite neat; check it out.

From VirtopsyProject comes a way to manipulate medical images with simple hand gestures picked up by the Kinect.  The sensor controls a PACS (in this case OsiriX), and the software relies on ofxKinect, libFreenect, and openFrameworks to make the magic happen.

And a little Kinect-World of Warcraft collaboration wouldn’t hurt anyone, hm?  Tinkerers at the USC Institute for Creative Technologies are developing the Flexible Action and Articulated Skeleton Toolkit (FAAST), or “middleware to facilitate integration of full-body control with games and VR applications.”  This Kinect software layer, built on top of the OpenNI framework, essentially “emulates keyboard input triggered by body posture and specific gestures.”  The developers say these mapped gestures can be ported to other games and applications, so hopefully we’ll be seeing FAAST enable more Kinect-converted games in the future.
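Posture-to-keystroke middleware of this kind boils down to checking skeleton joints against thresholds and emitting bound keys.  A minimal sketch of the idea; the specific gestures and key bindings below are illustrative assumptions (FAAST lets users define their own mappings), and actually injecting the keystrokes would be done with a platform input API.

```python
def posture_to_keys(joints):
    """Map a skeleton snapshot to emulated key presses, FAAST-style.

    joints maps joint names to (x, y, z) positions in metres, with y
    pointing up.  Thresholds and bindings are hypothetical examples.
    """
    keys = []
    if joints["left_hand"][1] > joints["head"][1]:
        keys.append("w")                      # raise left hand -> move forward
    if joints["right_hand"][1] > joints["head"][1]:
        keys.append("space")                  # raise right hand -> jump
    # lean: compare shoulder-centre x against hip-centre x
    lean = joints["shoulder_center"][0] - joints["hip_center"][0]
    if lean > 0.15:
        keys.append("d")                      # lean right -> strafe right
    elif lean < -0.15:
        keys.append("a")                      # lean left -> strafe left
    return keys
```

Because the game only ever sees keyboard input, the same mapping layer works with any title that takes WASD controls, which is exactly what makes the approach portable.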

I want to end with this Kinect-enabled interactive shadow puppet prototype because it’s so imaginative.  Design I/O’s Theo Watson and Emily Gobeille churned out the prototype in one day using openFrameworks and libFreenect.  The software performs skeleton tracking to determine arm, shoulder, elbow, and wrist positioning, and this in turn controls the bird’s movement and posture.
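Driving a puppet from a tracked skeleton mostly comes down to turning joint positions into angles: the angle at the elbow, say, can become how wide the bird opens its beak or flaps a wing.  A minimal sketch of that geometry, independent of the openFrameworks code the prototype actually uses:

```python
import math

def joint_angle(a, b, c):
    """Angle in degrees at joint b, formed by points a-b-c -- e.g. the
    elbow angle given shoulder, elbow, and wrist positions in 2D."""
    v1 = (a[0] - b[0], a[1] - b[1])
    v2 = (c[0] - b[0], c[1] - b[1])
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    return math.degrees(math.acos(dot / (math.hypot(*v1) * math.hypot(*v2))))
```

Feed it the tracked shoulder, elbow, and wrist each frame and you get a continuous control signal for the puppet’s pose.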

[Via Alltop; Engadget]
