Soma Games Blog

RealSense And The Uncanny Valley

IDF 2014 was not the first time we’ve been honored to have a tech demo on the floor, and having been to this rodeo before, we’ve learned a thing or two about floor demos. First among those things: Keep It Simple, Sherlock. So with that lesson in mind we created Cloak&Badger – a very simple game mechanic that splurged on the eye candy (a wonderfully animated badger guard) and did exactly one thing: it used the recently updated RealSense (Beta) SDK and its emotion sensing API. The entire game worked by making faces at the camera…that’s it…and it was a blast!

Cueing the player to which emotion drove the RPG-style dialog tree in a particular direction was straightforward, and folks had tons of fun trying on the various emotive states that the API supports. (The set includes Joy, Sadness, Disgust, Contempt, and Surprise, plus the more general “sentiments”: Positive, Neutral, and Negative.) By the rules of our KISS constraint it was an unqualified success, and we had tons of smiles, laughs, and genuine fun all week.
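To make the mechanic concrete, here is a minimal sketch of an emotion-driven dialog tree of the kind described above. The emotion labels come from the RealSense list in the previous paragraph; everything else (the node layout, the branch names, the `next_node` helper) is purely illustrative and not the SDK’s API or our actual game code:

```python
# Minimal sketch: route an RPG-style dialog tree on a detected emotion.
# The emotion labels match the RealSense (Beta) SDK's list quoted above;
# the node structure itself is hypothetical.

DIALOG = {
    "guard_intro": {
        "line": "The badger guard eyes you suspiciously.",
        "branches": {"JOY": "guard_friendly", "CONTEMPT": "guard_hostile"},
        "fallback": "guard_neutral",
    },
    "guard_friendly": {"line": "The guard waves you through.", "branches": {}, "fallback": None},
    "guard_hostile": {"line": "The guard bars the door.", "branches": {}, "fallback": None},
    "guard_neutral": {"line": "The guard shrugs.", "branches": {}, "fallback": None},
}

def next_node(current, detected_emotion):
    """Pick the next dialog node from the player's detected emotion,
    falling back to a default branch when no edge matches."""
    node = DIALOG[current]
    return node["branches"].get(detected_emotion, node["fallback"])
```

The appeal of the design is visible even in this toy version: the whole game loop reduces to "read the camera's emotion label, follow the matching edge."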


Almost…
And then it went sideways.

Continue Reading…

Posted 4 weeks, 1 day ago at 9:53 am.

2 comments

Mobile is Dead. Long Live the PC!

Soma Games wrote our first line of game code at the tail end of 2008, just as the iPhone was really blowing up, and as it happened, we were in the right place at the right time. It wasn’t on purpose, and it was partly opportunistic, but it worked out. We rode that mobile wave for years and were part of the Indie Game Renaissance it helped generate. (See: Polygon, GDC, Wired)

One Rig to Rule Them All…

What made mobile so attractive was, of course, the low barrier to entry, but that was only what got us interested. What kept us interested was the demonstrated market for indie games. Hardware constraints initially leveled the playing field: big studios had a much less pronounced quality and scope advantage, so nimble little shops like Soma Games could compete and still land a feature from Apple or get covered by Kotaku.

Now it’s 2014 and as far as I can tell, the mobile space is no longer interesting for indies. I’m not the only one saying so either. (See: Gamasutra, and this…for a start.) In fact, just about every other indie mobile developer I’ve gotten to know over the last several years is coming to the same conclusion. Mobile is over; let’s do PC games.

Continue Reading…

Posted 4 weeks, 1 day ago at 9:13 am.


IDF and My Wireless Dream

IDF is always a great place to get a glimpse of upcoming technology, and while some portion of what you see there never quite makes it to the real market, a trained eye can start to sense which ideas really have legs and are likely to keep going. This year, the stars that caught my attention were the consumer-scale robots built on Edison tech, and wireless everything.
Continue Reading…

Posted 1 month ago at 1:30 pm.


That Long-Overdue Redwall Update…

It was sixteen months ago that we posted our first blog regarding Redwall, or Project Mouseworks. Shortly thereafter we launched our AbbeyCraft kickstarter, it funded, and then roughly a year ago this month AbbeyCraft was released. All going well so far. The plan at that point, as far as we could see it, though shrouded in some pretty dense fog, was to wrap up a modest private funding effort, build a modest adventure game and then see what happened. It was a pretty straightforward plan and while Redwall was obviously a big thing, our goals were fairly short term and limited. But something happened on the way to that pivot and while it’s cost us some time I hope you’ll see it as something overall quite positive – I know we do.


Setback #1: If I’m honest, I was just horribly naive about how the private funding world works. I’d never done it before, but all things considered it felt like the right play as opposed to either a traditional publishing deal or taking a second draught at the crowd funding trough. I’ll certainly write more about this experience in the future, but suffice it to say that I underestimated the time this was going to take. On its surface that sounds like a bad thing, and it was certainly wretchedly frustrating at times, but as I’ll describe below, I think it was actually a blessing in disguise.

Continue Reading…

Posted 1 month, 3 weeks ago at 10:00 am.

13 comments

RealSense Gestures: Reality and Wish Lists

As part of the continuing series covering our experience with the RealSense technology from Intel, I’ve been thinking about gestures…

I’ve been saying for a long time that one of the keys to Apple’s success in getting developer buy-in for iOS was the very approachable and well designed tool kit they provided in Xcode. It was as if they polled 100 random potential coders and asked, “If you made an iPhone app, what’s the first thing you would want to tinker with?” and then made all of those APIs easy to find and easy to use. The result was a tool kit that rewarded you early for modest effort and thereby encouraged developers to try more, get better, learn more, and keep exploring the tool kit for the next cool thing. It made adoption of something totally new feel manageable and rewarding. That not only encouraged the curiosity crowd, but also the business-minded crowd that has to ask, “How long will it take to adopt this tech? And is it likely to be worth it?” So long as the first answer is “Not too long,” the second question is less acute.
The point being: it enabled early adopters to show off quickly. That drew in the early followers and the dominoes fell from there.

RealSense would benefit greatly from this lesson. Hardware appears to be in the pipe and we’re adequately impressed by the capability – check. A Unity3D SDK (among several others) is looking really sharp – check. So now I’m thinking about the question, “…What’s the first thing I want to tinker with?” and probably 75% of my ideas revolve around gestures. In fact, gestures are probably the essential component of this input schema, and as such, it will be make-or-break for Intel to make gestures easy to get started with and also deep enough to explore, experiment, and mod. But Easy needs to come first…
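As a wish-list illustration of what “Easy comes first” could mean, here is a sketch of the kind of approachable gesture surface we’d love a first-hour developer to find. To be clear, none of these names exist in the RealSense SDK; this is a hypothetical API shape, not documentation:

```python
# Wish-list sketch: a hypothetical "register a callback per gesture"
# surface, the sort of thing that lets an early adopter show off in an
# afternoon. Nothing here is real RealSense SDK API.

class GestureRouter:
    def __init__(self):
        self._handlers = {}

    def on(self, gesture_name, handler):
        """Register a handler, e.g. router.on("swipe_left", turn_page)."""
        self._handlers.setdefault(gesture_name, []).append(handler)

    def dispatch(self, gesture_name):
        """Would be called by the camera's event loop when a gesture is
        recognized; returns how many handlers fired."""
        fired = 0
        for handler in self._handlers.get(gesture_name, []):
            handler()
            fired += 1
        return fired
```

An API like this rewards modest effort immediately, and the deeper layers (tuning recognition, defining custom gestures) can wait until the developer is hooked.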

Continue Reading…

Posted 2 months, 1 week ago at 6:49 pm.


Expression or Outreach

Coming back recently from CGDC has me thinking again about something I always think about at CGDC – whether or not we’re the “black sheep” of that group…and if we are, whether that’s a good thing or a bad thing.

Last year, at the end-of-conference Town Hall portion where everybody can basically bring up anything they want, Mikee Bridges from GameChurch said something that brought this idea back to the front of my mind. I don’t remember exactly what he said, but it was something along the lines of “Are all of our [game projects] actually serving the function of outreach[1]?”

It’s a perfect question for Mikee. After all, GameChurch’s mission statement is one of outreach – specifically an outreach to gamers. But I was surprised at how quickly my mouth popped open and I said “that’s not what we’re doing…” And I’ve been pondering that brief exchange ever since.

Continue Reading…

Posted 2 months, 3 weeks ago at 3:09 pm.

7 comments

RealSense Rubber Meets the Road

Continuing our series on Intel’s new/upcoming RealSense technology: we recently got the alpha build of their Unity3D-enabled SDK and a much improved version of the camera. While the package is cool and opens up a lot of interesting theoretical possibilities, it got us thinking about the practical questions surrounding this tech.

RealSense is, at bottom, an input device. In that sense it will be measured against things like joysticks, mice, and game controllers, and as developers trying to make a living with this tech we’ll be looking at several things beyond the “cool” factor. Things like:

  • Addressable audience
  • Typical hardware profile
  • Time/cost to implement
  • Processor overhead

When we’re being compensated to experiment and do basic R&D (and, full disclosure again, we are), we can ignore basically all of these considerations. But when we move past that and start to explore actually deploying such tech…suddenly the calculus changes dramatically.

Continue Reading…

Posted 2 months, 3 weeks ago at 3:00 pm.


Redwall at GDC

I should have written this months ago, while all the memories were fresh, but sometimes you need a little time for an idea to find its place in your mind and sort itself out – perhaps this is one of those times.

A few months back we were at GDC in San Francisco. For the first time we took a risk and bought some booth space on the Indie floor, sharing a slot with our friends at OmegaTech. Not being exactly organized, we brought three things to show: a working build of Stargate SG1 Gunship, an alpha build of G Prime, and a banner for Redwall. (Memo to self: next time, try ‘focus.’) G, for all the pretty screenshots, really wasn’t a good choice for a booth show – it’s more of a thinker really, and only alpha. SG1 showed pretty well; people seemed to like what we’d done with the UI. But far and away we had the most response to Redwall…even though we had nothing to show but a banner.

Seriously, I was shocked…again. At times we had folks four and five deep around our tiny little table, and at other times people were literally throwing resumes at us. Tweets and posts and selfies, all because of the way this series of books has touched people. There was a no-man’s-land of open seating adjacent to our booth, and I could sit there inconspicuously watching as people came up to the banner and took long pauses, as if they were reliving fond memories. Sometimes they’d want to ask us questions, but more often they just looked wistfully on at the sandstone walls and the setting rose-colored sun and seemed to be moved, almost to reverie.

Continue Reading…

Posted 3 months, 4 weeks ago at 1:52 pm.

6 comments

RealSense & Unity3D: A First Look

(by Jon Collins, on behalf of Soma Games and Code-Monkeys)

This article is part of a series that documents our ‘Everyman’ experience with the new RealSense hardware and software being developed by Intel.

Full disclosure: Intel does pay us for some of this stuff, but one of my favorite aspects of working with them is that they aren’t asking us to write puff pieces. Our honest, sometimes critical, opinions are accepted…and even seem to be appreciated…so we got that going for us.

A First Look

There’s no denying that the pre-alpha SDK is exactly what it says on the box: a pre-alpha. That said, there’s a surprising amount of useful functionality to be gleaned by looking deeper into the C# samples that are present and taking lessons learned from previous SDKs.

First off, the kit includes a single Unity3D sample (there is just the one in the current package): the Nine Cubes sample, found within the frameworks folder of the samples directory structure.

This gives us a good starting point for looking into how to take advantage of the camera & SDK. A few red herrings are present, which may be a hangover from development versions, but it gave us enough of an idea to explore further and adapt some of the separate C# samples, bringing that functionality into our initial Unity3D project. (CS: We use Unity3D almost exclusively here at Soma Games, so having this bridge to RealSense was a practical prerequisite for us to consider adopting RealSense.)

For this exercise we were primarily concerned with being able to track & record finger joint positioning within Unity3D. The available methods and documentation suggest there is a planned ability to load, save, and recognize gestures from a pre-defined library, but after a little digging and running questions up to the dev team it appears that feature has been ‘delayed’ :( So with our hopes dashed at not finding the C# gesture viewer sample, we wanted to see how, or even if, we would be able to access the joints to explore developing our own approach to logging finger & hand poses.
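To illustrate the “roll our own” pose logging idea mentioned above, here is a language-agnostic sketch: record per-joint positions each frame, then compare a live hand pose against a recorded template by summed joint distance. The joint names, coordinate scale, and tolerance value are all assumptions for illustration, not SDK output or our shipped code:

```python
# Illustrative pose-matching sketch: a pose is a dict of named joints
# to (x, y, z) positions. We compare poses by summing per-joint
# Euclidean distances; a pose "matches" a template when that sum is
# under a tolerance. Joint names and tolerance are made up.

import math

def pose_distance(pose_a, pose_b):
    """Sum of Euclidean distances between matching named joints."""
    total = 0.0
    for joint, a in pose_a.items():
        total += math.dist(a, pose_b[joint])
    return total

def matches(pose, template, tolerance=0.05):
    """True when the live pose is within tolerance of the template."""
    return pose_distance(pose, template) <= tolerance

# Hypothetical recorded template and a slightly noisy live frame:
fist_template = {"index_tip": (0.0, 0.0, 0.0), "thumb_tip": (0.01, 0.0, 0.0)}
live_frame = {"index_tip": (0.0, 0.005, 0.0), "thumb_tip": (0.01, 0.0, 0.0)}
```

In practice a real implementation would also normalize for hand position and scale before comparing, but even this naive distance check shows why access to raw joint data matters: with it, a missing gesture library is an inconvenience rather than a blocker.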

Continue Reading…

Posted 4 months ago at 8:30 am.


Getting Rolling with the RealSense SDK

(by Jon Hogins, on behalf of Soma Games and Code-Monkeys)

This article is part of a series that documents our ‘Everyman’ experience with the new RealSense hardware and software being developed by Intel.

Full disclosure: Intel does pay us for some of this stuff, but one of my favorite aspects of working with them is that they aren’t asking us to write puff pieces. Our honest, sometimes critical, opinions are accepted…and even seem to be appreciated…so we got that going for us.

I recently got the fantastic opportunity to use a pre-alpha version of Intel’s new RealSense camera to build a full-fledged app. It’s still a work in progress, but let me share my experiences and a few tips on getting the most out of RealSense’s video APIs.

The App

My mission has been to create a video conferencing app with a few interesting finger tracking interactions using the RealSense camera. After a bit of research, I decided on the Intel Media SDK for real-time H264 encoding and decoding and OpenCV for the initial display, moving to Unity/DirectX later.
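The architecture described above reduces to a capture → encode → transport → decode → display pipeline. Here is a structural sketch of that dataflow; the stage functions are placeholders standing in for the Intel Media SDK and OpenCV calls, so only the shape of the pipeline, not any real API, is being shown:

```python
# Structural sketch of the conferencing pipeline: each captured frame
# is H264-encoded (Media SDK in the real app), sent over the wire,
# decoded on the far side, and displayed (OpenCV initially). The stage
# callables here are stand-ins, not real library calls.

def run_pipeline(frames, encode, send, receive, decode, display):
    """Push each frame through encode/transport/decode and return
    whatever reached the display stage, in order."""
    shown = []
    for frame in frames:
        packet = encode(frame)    # e.g. H264 encode
        send(packet)              # network transport
        packet = receive()        # far end receives
        shown.append(display(decode(packet)))
    return shown
```

Structuring the app this way keeps the encoder/decoder swappable, which matters given the stated plan to move the display layer from OpenCV to Unity/DirectX later.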

Getting Started

Getting the RealSense SDK installed and creating projects based on the samples is straightforward, even in its pre-alpha state. The installer adds the RSSDK_DIR environment variable, and each VC++ project using RealSense only needs to add a property sheet via Visual Studio’s Property Manager. The documentation and samples are fairly comprehensive, and the APIs are the most accessible of any of the Intel C++ APIs I’ve worked with.
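Since the property sheets resolve paths through that environment variable, a quick sanity check that RSSDK_DIR is set and points at a real directory can save a confusing build failure. The variable name comes from the SDK installer as described above; the check itself is just a convenience script of our own, sketched here in Python for brevity:

```python
# Sanity check: confirm the RSSDK_DIR environment variable set by the
# RealSense SDK installer exists and points at a real directory before
# trying to build against the property sheets that reference it.

import os

def sdk_root():
    """Return the RealSense SDK root from RSSDK_DIR, or None if the
    variable is unset or points at a missing path."""
    root = os.environ.get("RSSDK_DIR")
    if root and os.path.isdir(root):
        return root
    return None
```

If `sdk_root()` returns None, the fix is usually to re-run the SDK installer or set the variable by hand before opening Visual Studio.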

Continue Reading…

Posted 4 months ago at 11:33 am.
