It was an honor to be part of the Microsoft Kinect SDK for Windows Beta launch. Code Camp was an intense process: we were asked to build a Kinect application in 24 hours. We got access to the SDK in the morning, and Microsoft gave us an overview of the SDK plus plenty of food and caffeine to get the creative process going. By lunchtime we were rolling.
The teams consisted of a select group of developers from academia and industry. Many of us had never worked together before. My teammates Alex Wiggins and Ruma Paul were awesome. We went heads down immediately and came up with the idea in 5 minutes: use Kinect to control the Parrot AR Drone quadricopter.
With the time crunch, writing the app from scratch was not an option, so we relied on the C# AR.Drone library from http://www.stephenhobley.com/blog/2010/11/28/c-sdk-for-ar-drone-now-available/ to access the drone.
We spent 5% of the time working with the SDK, 10% troubleshooting drone communication errors, 85% fine-tuning gestures/voice commands and 0% sleeping.
There were two modes for the gestures: flying mode and altitude mode. In flying mode you could control pitch, roll and yaw as if you were a “person joystick.” In altitude mode, we created a “lift” gesture for moving the drone up and down—think of lifting or lowering a box with both hands.
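The post doesn't show the actual gesture math, but one plausible way to implement the "person joystick" is to derive pitch, roll and yaw angles from pairs of skeleton joints. The struct and method names below are hypothetical, and the joint-to-axis mapping is my assumption, not the team's actual code:

```csharp
using System;

// Hypothetical sketch of "person joystick" math. Coordinates are assumed to be
// in Kinect skeleton space: meters, x to the right, y up, z away from the sensor.
struct V3 { public float X, Y, Z; }

static class PersonJoystick
{
    // Roll: tilt of the line between the two hands (lean your arms like wings).
    public static float Roll(V3 leftHand, V3 rightHand)
    {
        return (float)Math.Atan2(rightHand.Y - leftHand.Y, rightHand.X - leftHand.X);
    }

    // Pitch: lean of the torso toward or away from the sensor.
    public static float Pitch(V3 head, V3 hipCenter)
    {
        return (float)Math.Atan2(hipCenter.Z - head.Z, head.Y - hipCenter.Y);
    }

    // Yaw: rotation of the shoulder line around the vertical axis.
    public static float Yaw(V3 leftShoulder, V3 rightShoulder)
    {
        return (float)Math.Atan2(leftShoulder.Z - rightShoulder.Z,
                                 rightShoulder.X - leftShoulder.X);
    }
}
```

The altitude-mode "lift" gesture could be handled the same way, e.g. by tracking the average height of both hands relative to the shoulders.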
Getting the SDK going was not an issue at all. It was easy to acquire skeleton data by merely copying and pasting sample code. The challenge was adding voice commands.
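For readers who want to see what that sample code looks like, skeleton acquisition in the Beta SDK boils down to initializing the runtime with skeletal tracking and handling an event. This is a minimal sketch in the spirit of the Beta samples; it requires a Kinect sensor and the Beta SDK installed:

```csharp
using System;
using Microsoft.Research.Kinect.Nui; // Beta SDK skeletal tracking namespace

class SkeletonDemo
{
    static void Main()
    {
        var nui = new Runtime();
        nui.Initialize(RuntimeOptions.UseSkeletalTracking);

        nui.SkeletonFrameReady += (s, e) =>
        {
            foreach (SkeletonData skeleton in e.SkeletonFrame.Skeletons)
            {
                // Only fully tracked skeletons have usable joint data.
                if (skeleton.TrackingState != SkeletonTrackingState.Tracked)
                    continue;

                var rightHand = skeleton.Joints[JointID.HandRight].Position;
                Console.WriteLine("Right hand: {0:0.00}, {1:0.00}, {2:0.00}",
                    rightHand.X, rightHand.Y, rightHand.Z);
            }
        };

        Console.ReadLine(); // process frames until Enter is pressed
        nui.Uninitialize();
    }
}
```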
We had voice commands for takeoff, land, rotate right and rotate left. Although the SDK relies on the Microsoft Speech SDK, you need to use a specific Kinect audio source (Microsoft.Research.Kinect.Audio.KinectAudioSource). In addition, the speech recognition code needs to run on an MTA thread, which caused additional work in WPF, whose UI thread is STA. The biggest issue was the noise generated by the drone motors; I was surprised recognition worked at all, and I think the Kinect microphone array helped. The SDK also provided us with the direction from which the speech originated. During demos we tried to fly the drone behind the Kinect; otherwise speech recognition did not work well.
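Putting those pieces together looks roughly like the sketch below: an explicit MTA worker thread, a KinectAudioSource feeding a Microsoft.Speech recognizer, and a grammar with the four drone commands. The recognizer lookup by an Id containing "Kinect" is an assumption on my part, and this requires a Kinect plus the Beta SDK to run:

```csharp
using System;
using System.IO;
using System.Linq;
using System.Threading;
using Microsoft.Research.Kinect.Audio;   // KinectAudioSource (Beta SDK)
using Microsoft.Speech.AudioFormat;
using Microsoft.Speech.Recognition;

class VoiceCommands
{
    static void Main()
    {
        // Recognition against KinectAudioSource must run on an MTA thread;
        // WPF's UI thread is STA, hence the explicit worker thread.
        var worker = new Thread(Recognize);
        worker.SetApartmentState(ApartmentState.MTA);
        worker.Start();
        worker.Join();
    }

    static void Recognize()
    {
        using (var source = new KinectAudioSource())
        {
            source.FeatureMode = true;
            source.AutomaticGainControl = false;          // must be off for speech recognition
            source.SystemMode = SystemMode.OptibeamArrayOnly;

            // Assumption: pick the Kinect-specific recognizer by its Id.
            RecognizerInfo ri = SpeechRecognitionEngine.InstalledRecognizers()
                .FirstOrDefault(r => r.Id.Contains("Kinect"));
            var sre = new SpeechRecognitionEngine(ri.Id);

            // The four drone commands from the demo.
            var commands = new Choices("takeoff", "land", "rotate right", "rotate left");
            sre.LoadGrammar(new Grammar(new GrammarBuilder(commands)));

            sre.SpeechRecognized += (s, e) =>
                Console.WriteLine("Command: {0} ({1:0.00})",
                    e.Result.Text, e.Result.Confidence);

            using (Stream audio = source.Start())
            {
                // 16 kHz, 16-bit mono PCM, the format the Kinect array delivers.
                sre.SetInputToAudioStream(audio,
                    new SpeechAudioFormatInfo(EncodingFormat.Pcm, 16000, 16, 1, 32000, 2, null));
                sre.RecognizeAsync(RecognizeMode.Multiple);
                Console.ReadLine();
                sre.RecognizeAsyncStop();
            }
        }
    }
}
```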
Our application was chosen for Microsoft’s live demo event. Unfortunately, the event took place in a small studio and we needed a fairly large space to fly the drone, so I don’t think we were able to do justice to the application. Keep an eye out for additional video footage. We were on camera a lot and I’ll post the clips as they come in.
You can get the Beta SDK at http://research.microsoft.com/en-us/um/redmond/projects/kinectsdk/default.aspx.
Without a doubt, Kinect Code Camp was an unforgettable experience. Can’t wait to see what other people will do with Kinect.