RealSense Gestures: Reality and Wish Lists

As part of the continuing series covering our experience with the RealSense technology from Intel, I’ve been thinking about gestures…

I’ve been saying for a long time that one of the keys to Apple’s success in getting developer buy-in for iOS was the very approachable and well-designed toolkit they provided in Xcode. It was as if they polled 100 random potential coders and asked, “If you made an iPhone app, what’s the first thing you would want to tinker with?” and then made all of those APIs easy to find and easy to use. The result was a toolkit that rewarded modest effort early and thereby encouraged developers to try more, get better, and keep exploring the toolkit for the next cool thing. It made adoption of something totally new feel manageable and rewarding. That encouraged not only the curiosity crowd, but also the business-minded crowd, who have to ask, “How long will it take to adopt this tech? And is it likely to be worth it?” As long as the first answer is “Not too long,” the second question is less acute.
The point being: it enabled early adopters to show off quickly. That drew in the early followers, and the dominoes fell from there.

RealSense would benefit greatly from this lesson. Hardware appears to be in the pipeline, and we’re adequately impressed by the capability – check. A Unity3D SDK (among several others) is looking really sharp – check. So now I’m thinking about the question, “…What’s the first thing I want to tinker with?” and probably 75% of my ideas revolve around gestures. In fact, gestures are probably the essential component of this input scheme, and as such it will be make-or-break for Intel to make gestures easy to get started with and also deep enough to explore, experiment with, and mod. But easy needs to come first…
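To make that wish concrete, here’s the kind of “hello, gestures” program I’d hope to be able to write in my first five minutes. To be clear, this is purely a wish-list sketch: every name in it (GestureSensor, onGesture, the Gesture enum) is invented for illustration and is not the actual RealSense SDK, and the poll() body just simulates a detection so the example runs end to end.

```cpp
// Wish-list sketch, NOT the real RealSense SDK: what an "easy first five
// minutes" gesture API could look like. All names here are hypothetical.
#include <functional>
#include <iostream>
#include <map>

enum class Gesture { SwipeLeft, SwipeRight, ThumbsUp, Wave };

// Hypothetical sensor wrapper: register one callback per gesture, then poll.
class GestureSensor {
public:
    void onGesture(Gesture g, std::function<void()> handler) {
        handlers_[g] = std::move(handler);
    }

    // A real SDK would pull camera frames and run recognition here; this
    // stub pretends one gesture was recognized so the sketch is runnable.
    void poll() {
        Gesture detected = Gesture::SwipeLeft;  // simulated detection
        if (auto it = handlers_.find(detected); it != handlers_.end())
            it->second();
    }

private:
    std::map<Gesture, std::function<void()>> handlers_;
};

int main() {
    GestureSensor sensor;

    // The whole "hello, gestures" app: one registration, one poll.
    sensor.onGesture(Gesture::SwipeLeft,
                     [] { std::cout << "Swipe left -> previous page\n"; });
    sensor.poll();
    return 0;
}
```

The design point is that hooking a callback to a high-level gesture should be the one-liner; the depth – raw joint data, confidence scores, custom gesture training – can live underneath it for when the easy path runs out.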