
RealSense Gestures: Reality and Wish Lists

As part of the continuing series covering our experience with the RealSense technology from Intel, I’ve been thinking about gestures…

I’ve been saying for a long time that one of the keys to Apple’s success in getting developer buy-in for iOS was the very approachable and well-designed toolkit they provided in Xcode. It was as if they polled 100 random potential coders and asked, “If you made an iPhone app, what’s the first thing you would want to tinker with?” and then made all of those APIs easy to find and easy to use. The result was a toolkit that rewarded modest effort early and thereby encouraged developers to try more, get better, learn more, and keep exploring the toolkit for the next cool thing. It made adoption of something totally new feel manageable and rewarding. That not only encouraged the curiosity crowd, but also the business-minded crowd, who has to ask, “How long will it take to adopt this tech? And is it likely to be worth it?” So long as the answer to the first question is “not too much,” the second question is less acute.
The point being: it enabled early adopters to show off quickly. That drew in the early followers and the dominoes fell from there.

RealSense would benefit greatly from this lesson. Hardware appears to be in the pipe and we’re adequately impressed by its capability – check. A Unity3D SDK (among several others) is looking really sharp – check. So now I’m thinking about the question, “…What’s the first thing I want to tinker with?” and probably 75% of my ideas revolve around gestures. In fact, gestures are probably the essential component of this input schema, and as such it will be make-or-break for Intel to make gestures easy to get started with and also deep enough to explore, experiment, and mod. But Easy needs to come first…
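To make that concrete, here’s a purely hypothetical sketch of what “easy first” could look like in Unity3D. None of these class or method names exist in any RealSense SDK – the stub is only there so the example compiles – but the shape is the point: one line to turn it on, one line per stock gesture.

```csharp
using System;
using UnityEngine;

// Purely hypothetical wish-list API – nothing here is part of the actual
// RealSense SDK. The stub exists only so the sketch compiles; what matters
// is how little ceremony the "first five minutes" should require.
public static class WishListGestures
{
    public static void Enable() { /* imagine: camera + pipeline setup with sane defaults */ }
    public static void On(string gestureName, Action callback) { /* imagine: stock gesture library */ }
}

public class GestureWishListDemo : MonoBehaviour
{
    void Start()
    {
        WishListGestures.Enable();                                         // step 1: turn it on
        WishListGestures.On("swipe_left", () => Debug.Log("Swipe left!")); // step 2: react to a named gesture
    }
}
```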

Continue Reading…

Posted 1 week, 4 days ago at 6:49 pm.

Add a comment

RealSense Rubber Meets the Road

Continuing our series on Intel’s new/upcoming RealSense technology, we recently got the alpha build of their Unity3D-enabled SDK and a much improved version of the camera. While the package is cool and opens up a lot of interesting theoretical possibilities, it got us thinking about the practical questions surrounding this tech.

RealSense is, at its bottom line, an input device. In that sense it will be measured against things like joysticks, mice, and game controllers, and as developers trying to make a living with this software we’ll be looking at several things beyond the “cool” factor. Things like:

  • Addressable audience
  • Typical hardware profile
  • Time/cost to implement
  • Processor overhead

When we’re being compensated to experiment and do basic R&D (and – full disclosure again – we are), we can ignore basically all of these considerations. But when we move past that and start to explore actually deploying such tech…suddenly the calculus changes dramatically.

Continue Reading…

Posted 3 weeks, 6 days ago at 3:00 pm.

Add a comment

RealSense & Unity3D: A First Look

(by Jon Collins, on behalf of Soma Games and Code-Monkeys)

This article is part of a series that documents our ‘Everyman’ experience with the new RealSense hardware and software being developed by Intel.

Full disclosure: Intel does pay us for some of this stuff, but one of my favorite aspects of working with them is that they aren’t asking us to write puff pieces. Our honest, sometimes critical, opinions are accepted…and even seem to be appreciated…so we got that going for us.

A First Look

There’s no denying that the pre-alpha SDK is exactly what it says on the box – a pre-alpha. That said, there’s a surprising amount of useful functionality to be gleaned by looking deeper into the C# samples that are present and taking lessons learned from previous SDKs.

First off, the kit includes a single Unity3D sample (there is just the one in the current package): the Nine Cubes sample, found within the frameworks folder of the samples directory structure.

This gives us a good starting point for looking into how to take advantage of the camera & SDK. A few red herrings are present, which may be hangovers from development versions, but it gave us enough of an idea to explore further and adapt some of the separate C# samples, bringing that functionality into our initial Unity3D project. (CS: We use Unity3D almost exclusively here at Soma Games, so having this bridge to RealSense was a practical prerequisite for us to consider adopting it.)

RealSense Hand Joints and Bones

For this exercise we were primarily concerned with being able to track & record finger joint positioning within Unity3D. The available methods and documentation suggest there is a planned ability to load, save, and recognize gestures from a pre-defined library, but after a little digging and running questions up to the dev team it appears that feature has been ‘delayed’ :( So, with our hopes dashed at not finding the C# gesture viewer sample, we wanted to see how, or even if, we would be able to access the joints and explore developing our own approach to logging finger & hand poses.
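As a rough illustration of what we’re after, here’s a minimal sketch of per-frame joint logging inside Unity3D. The IHandJointSource interface is our own placeholder – it is not part of the RealSense SDK – and simply stands in for wherever the joint positions eventually come from once the C# sample functionality is bridged across.

```csharp
using System.Collections.Generic;
using UnityEngine;

// Placeholder for whatever ends up feeding us joint positions from the SDK –
// this interface is ours, not Intel's.
public interface IHandJointSource
{
    // World-space positions keyed by a joint name like "index_tip".
    IDictionary<string, Vector3> GetCurrentJoints();
}

// Minimal per-frame pose logger: snapshot every joint each frame so the
// recorded poses can be replayed or analyzed later.
public class HandPoseLogger : MonoBehaviour
{
    public MonoBehaviour jointSourceBehaviour;  // assign any component implementing IHandJointSource
    private IHandJointSource jointSource;
    private readonly List<Dictionary<string, Vector3>> recordedFrames =
        new List<Dictionary<string, Vector3>>();

    void Start()
    {
        jointSource = jointSourceBehaviour as IHandJointSource;
    }

    void Update()
    {
        if (jointSource == null) return;
        // Copy this frame's joints into the recording buffer.
        recordedFrames.Add(new Dictionary<string, Vector3>(jointSource.GetCurrentJoints()));
    }
}
```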

Continue Reading…

Posted 2 months ago at 8:30 am.

Add a comment

Getting Rolling with the RealSense SDK

(by Jon Hogins, on behalf of Soma Games and Code-Monkeys)

This article is part of a series that documents our ‘Everyman’ experience with the new RealSense hardware and software being developed by Intel.

Full disclosure: Intel does pay us for some of this stuff, but one of my favorite aspects of working with them is that they aren’t asking us to write puff pieces. Our honest, sometimes critical, opinions are accepted…and even seem to be appreciated…so we got that going for us.

I recently got the fantastic opportunity to use a pre-alpha version of Intel’s new RealSense camera to build a full-fledged app. It’s still a work in progress, but let me share my experiences and a few tips on getting the most out of RealSense’s video APIs.

The App

My mission has been to create a video conferencing app with a few interesting finger-tracking interactions using the RealSense camera. After a bit of research, I decided on the Intel Media SDK for real-time H.264 encoding and decoding, and OpenCV for the initial display, moving to Unity/DirectX later.

Getting Started

Getting the RealSense SDK installed and creating projects based on the samples is straightforward, even in its pre-alpha state. The installer adds the RSSDK_DIR environment variable, and each VC++ project using RealSense only needs to add a property sheet via Visual Studio’s Property Manager. The documentation and samples are fairly comprehensive, and the APIs are the most accessible of any of the Intel C++ APIs I’ve worked with.

Continue Reading…

Posted 2 months, 1 week ago at 11:33 am.

Add a comment

Gaming and Intel’s RealSense

The Future?

If you were watching at CES you may have seen Intel unveil their RealSense initiative. This is really an evolution of the Perceptual Computing initiative they pushed a year earlier, but now with (vastly) improved hardware and software. We’ve been involved with this program for a while now – wearing our Code-Monkeys hats – and we’ve even won a couple of awards. While we’ve written about the tech in the past, I wanted to share a few thoughts about what we see in the future.

Hardware-free interfaces like RealSense and Kinect are undeniably going to be more and more common in the coming years, and for many reasons – but maybe not the reasons that seem most obvious. That said, the experience of building this kind of UI also exposed its weaknesses, which were a little surprising. Take Tom Cruise, here on the right, in the iconic scene from Minority Report. Strike a pose like his and hold it. How long before your arms wear out and fall to your sides from flaming deltoids? The limit of physical endurance took us totally by surprise when we started this, though of course it should have been obvious. And while we found it to be a very limiting factor for existing control schemes, it forced us to think differently about how we controlled these games – specifically, aiming toward more autonomous systems that coast along, needing occasional input instead of constant input.
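To give a feel for what “coasting” means in practice, here’s a tiny hypothetical Unity3D sketch (the names are ours, not from any SDK): the craft keeps flying on its current heading by itself, and the player’s occasional nudge – which a gesture would map onto – only changes that heading.

```csharp
using UnityEngine;

// Hypothetical "coasting" control scheme: the craft keeps moving along its
// current heading on its own; an occasional input only nudges the heading,
// so nobody has to hold their arms up Minority Report style.
public class CoastingController : MonoBehaviour
{
    public float speed = 5f;          // constant forward drift
    public float turnDegrees = 15f;   // how much one nudge turns the craft

    void Update()
    {
        // Stand-in for an occasional gesture event – a swipe left/right would map here.
        if (Input.GetKeyDown(KeyCode.LeftArrow))
            transform.Rotate(0f, -turnDegrees, 0f);
        if (Input.GetKeyDown(KeyCode.RightArrow))
            transform.Rotate(0f, turnDegrees, 0f);

        // No input required: the craft coasts along its current heading.
        transform.position += transform.forward * speed * Time.deltaTime;
    }
}
```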

Related to this was the matter of latency. No matter how ninja I get, moving my arm takes an astonishing amount of time compared to twitching my thumb. Ergo, any control scheme or game mechanic that required twitch controls was a non-starter with meat-space input.

These are just a couple of the limitations we saw, but what I’m really excited about is how those challenges led to exciting epiphanies!

RealSense and technologies like it invite us to consider a very different way of approaching our games, our data, and all of our virtual interactions – and the magic of it all is in the appealing ability to treat these virtual worlds the way we treat the real world: using our hands, our voices, and our well-honed ability to recognize spatial relations. Input schemes can move increasingly away from buttons, joysticks, and drill-down menus (after all, these were always mechanical metaphors for physical actions anyway) into modalities more like dancing or conducting a symphony. Our virtual spaces can operate and be organized just like our real spaces, and screens become more like windows into other worlds than flat representations of flatland spaces – or even a compression interface into three-dimensional, but largely inaccessible, worlds.

F: The Storm Riders

So if it’s not clear – we’re very, very excited about where this tech is going, and working with it in its infancy has been kinda mind-blowing.

For practical purposes, expect to see us deploying RealSense technology in Stargate SG1 Gunship (under the Code-Monkeys label), F: The Storm Riders, and Redwall: The Warrior Reborn. It’s too soon, of course, to rely on this input being available, but we will definitely build the games to use this tech where it makes sense. (We considered a RealSense version of G, but it feels like a poor fit.)

We’ll be at GDC in a couple of weeks, and if this is something you’re interested in, stop by – we’d love to talk to you!

Posted 6 months ago at 6:22 pm.

Add a comment

MeeGo5 is what you meant to say, Intel

Announcing MeeGo5

I have about eight blog posts I want to make coming out of the Intel Elements 2011 Conference, most of them positive. But one of them seems pretty time-sensitive and I want to be part of the conversation out here, so I’m going to do this now even if it’s only half-baked.

Whoever is in charge – you cannot use the name Tizen – it’s about the worst marketing move possible at this moment.
Instead – call it MeeGo5 – and you’ll be celebrated instead of mocked.

Continue Reading…

Posted 2 years, 11 months ago at 1:42 pm.

5 comments

On The Future of Game Publishing

by Gavin Nichols

The other day, Soren Johnson posted a tweet that really caught my interest. He said:
“The next console generation will be won by whoever understands why the Xbox Indie Games Channel did not become the iOS App Store.”
This is true in so many ways.

The iOS App Store has enjoyed an unparalleled level of success since it launched a few years back, largely because it hit a golden combination: approachable for both developers and consumers, while simultaneously lifting the best to the top through a natural-feeling review system. For the first time, Joe Schmoe could take his idea, build it himself, and publish it to millions of potential customers, all from his living room. Customers had access to hundreds of thousands of apps at their fingertips, instantly, anytime and anywhere, for an affordable price.

Continue Reading…

Posted 2 years, 11 months ago at 12:01 pm.

Add a comment

AppUp – A Big Idea That Takes Time

We’re here at Intel Elements 2011, a “one year later” follow-up to the event where we first heard Peter Biddle lay out a rather large vision for the Intel AppUp Center. Without going back into the history and our previous thoughts on AppUp, I find myself feeling increasingly invested in this thing. Far more than getting tied up in what AppUp is or is not, I’m fascinated by what AppUp wants to become.

Continue Reading…

Posted 2 years, 11 months ago at 11:04 am.

Add a comment

Developing AppUp Games Using Unity 3D

One of the most exciting and powerful tools available to the indie developer today is Unity 3D (http://www.unity3D.com), a game engine whose popularity exploded when the iPhone App Store roared into public prominence. The Unity 3D engine has become so popular in part because of its ease of use, powerful tools, and too-good-to-be-true pricing. We’ve raved about Unity as a tool in the past, though, so I won’t get into all that again. Instead I’d like to look at one specific aspect of how Unity and AppUp work together in beautiful unison.

Big Lesson #1: Multi-platform is not an either-or concept. It’s emphatically an also-and concept.

One of the most valuable aspects of Unity 3D is its ability to deploy a single project to multiple platforms. With various plug-ins or a little ninja coding, you can build one game that runs on everything from Mac and PC desktops to all manner of mobile devices, and even in a browser. Today’s case in point is our recent release of Bok Choy Boy and how we brought it to Intel AppUp (here) at the same time we launched on several other platforms, including iPhone, iPad, and a browser-based mini-game.
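As a small illustration of the “one project, many targets” idea, here’s a hedged sketch using Unity’s standard compile-time platform defines. The script itself is just an example of ours – only the #if symbols are Unity’s.

```csharp
using UnityEngine;

// One codebase, many storefronts: platform-specific branches live behind
// Unity's standard compile-time defines while everything else stays shared.
public class PlatformSetup : MonoBehaviour
{
    void Start()
    {
#if UNITY_IPHONE
        Debug.Log("iOS build: wire up iPhone/iPad-specific bits here.");
#elif UNITY_ANDROID
        Debug.Log("Android build: wire up Android Market-specific bits here.");
#elif UNITY_WEBPLAYER
        Debug.Log("Browser mini-game build.");
#else
        Debug.Log("Desktop build: this is the one we package for AppUp.");
#endif
    }
}
```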

IMHO, the profound magic of the app store model – specifically places like AppUp, iTunes, and the Android Marketplace – is a massive, instantaneous, global distribution network. So long as you plan for it up front, there is no reason not to target ALL of these platforms at one time in order to create the widest possible exposure for your game…and in so doing try to take over the world…again. The reality is that you never know where a game will catch on. For example, when Bok Choy Boy launched we never expected the HUGE audience we garnered in China. Over half of the total downloads have come from a country we weren’t even thinking of. In hindsight we can guess why the Chinese market liked it, but we would have lost a ton of customers if we hadn’t taken advantage of the globe-spanning power of app stores like AppUp, planned for multi-platform distribution, and used a tool like Unity 3D. I could keep beating that horse but seriously…do this.

Continue Reading…

Posted 2 years, 11 months ago at 12:08 pm.

1 comment

Intel AppUp 1 Year Later

It was exactly one year ago that we got our first taste of Intel’s AppUp Center when they launched the beta store at CES 2010. It was met with mixed reviews, and nobody really knew what to expect from it.

What a difference a year makes.

AppUp Has Angry Birds

Note to Intel: Why does this page still say "Moblin?"

Today you look at the AppUp Center and right up front is what must be the biggest runaway hit game of 2010 – Angry Birds. What a huge coup! What a great “I told you so” moment. Congratulations to Intel – Peter, you called it, man.

Continue Reading…

Posted 3 years, 7 months ago at 10:18 pm.

3 comments