Harvest Moon and Redwall

As the recent news about our progress on funding makes the circuit and…some other news (ahem) looms nearer and nearer, we are understandably being asked questions about the game’s scope, mechanics and genre. We’ve been deliberately coy on specifics and the biggest reason has been to minimize the misery for the fans if things never gelled. Now that we’re feeling more confident in the way things have shaped up it seems fair to start sharing our thoughts on the game itself.

One of the strongest themes I’ve heard from fans of Redwall is easiest to describe this way:

We want to live there.

Redwall News – It’s Big and We Need Your Help…

So we’ve got some good news and some bad news regarding Redwall. And when that’s all done I’ll be asking for your help.

First: the good news…which is really very good. If you were watching closely a few months ago you would have seen the pitch page we put up that revealed we were seeking $1.2m in funding for The Warrior Reborn. The timing of that page going live was deliberate for two reasons. First, we were just starting IDF in San Francisco and for the first time we were going to show one of our “final-ish” character designs in public – Neebrock the Badger. Up to that point it had all been concept art and sketches, but this was the real deal. We were a little anxious to see how people responded, but we anticipated a positive response and we got it. So riding that wave with a link to our pitch page was an easy call. But the second reason was much more tangible. When I left IDF and the City by the Bay I flew to beautiful, downtown Chattanooga, TN to meet a big potential investor.

We pitched, we ate phenomenal fried chicken, and then we waited…until last week.

I am extremely proud and excited to announce that one of the most well known names in the world of philanthropy has decided to honor us with their friendship and material support…and this is a huge break for us. (It’s also considered tacky to mention them by name…or so I’m told.)

RealSense And The Uncanny Valley

IDF 2014 was not the first time we’ve been honored to have a tech demo on the floor. And as we’ve been to the rodeo before, we’ve learned a thing or two about floor demos. First among those things: Keep It Simple Sherlock. So with that lesson in mind we created Cloak&Badger – a very simple game mechanic that splurged on the eye candy (a wonderfully animated badger guard) and did exactly one thing: it used the recently updated RealSense (Beta) SDK and its emotion sensing API. The entire game worked as you made faces at the camera…that’s it…and it was a blast!

Cueing the player to which emotion drove the RPG-style dialog tree in a particular direction was straightforward, and folks had tons of fun trying on the various emotive states that the API supports. (These include Joy, Sadness, Disgust, Contempt, and Surprise, plus the more general “sentiments”: Positive, Neutral, and Negative.) By the rules of our KISS constraint it was an unqualified success, and we had tons of smiles, laughs and genuine fun all week.
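The shape of that emotion-to-dialog mapping can be sketched roughly like this. This is a minimal illustration in Python, not the actual Cloak&Badger code or the RealSense API; the function name and the guard’s lines are invented for the example, and only the emotion/sentiment labels come from the list above:

```python
# Hypothetical sketch of an emotion-driven dialog branch.
# The emotion and sentiment labels match the RealSense API's list;
# everything else (names, dialog lines) is invented for illustration.

def pick_branch(emotion: str, sentiment: str) -> str:
    """Choose the guard's next line from the strongest detected signal."""
    if emotion == "joy":
        return "The badger guard softens: 'Welcome, friend!'"
    if emotion in ("disgust", "contempt"):
        return "The badger guard bristles: 'Watch yer tone!'"
    if emotion == "surprise":
        return "The badger guard chuckles: 'Didn't expect that, eh?'"
    # Fall back on the coarser sentiment when no strong emotion fires.
    if sentiment == "negative":
        return "The badger guard blocks the gate."
    return "The badger guard eyes you warily."
```

The point of the design is that each frame’s strongest emotion estimate simply indexes a branch, so the whole “game” is the player’s face.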


Almost…
And then it went sideways.

That Long-Overdue Redwall Update…

It was sixteen months ago that we posted our first blog regarding Redwall, or Project Mouseworks. Shortly thereafter we launched our AbbeyCraft Kickstarter, it funded, and then roughly a year ago this month AbbeyCraft was released. All going well so far. The plan at that point, as far as we could see it through some pretty dense fog, was to wrap up a modest private funding effort, build a modest adventure game and then see what happened. It was a pretty straightforward plan, and while Redwall was obviously a big thing, our goals were fairly short term and limited. But something happened on the way to that pivot, and while it’s cost us some time I hope you’ll see it as something overall quite positive – I know we do.

Setback #1: If I’m honest, I was just horribly naive about how the private funding world works. I’d never done it before, but all things considered it felt like the right play as opposed to either a traditional publishing deal or taking a second draught at the crowdfunding trough. I’ll certainly write more about this experience in the future, but suffice it to say that I underestimated the time this was going to take. On its surface that sounds like a bad thing, and it was certainly wretchedly frustrating at times, but as I’ll describe below I think it was actually a blessing in disguise.

Redwall at GDC

I should have written this months ago, while all the memories were fresh, but sometimes you need a little time for an idea to find its place in your mind and sort itself out – perhaps this is one of those times.

A few months back we were at GDC in San Francisco. For the first time we took a risk and bought some booth space on the Indie floor, sharing a slot with our friends at OmegaTech. Not being exactly organized, we brought three things to show: a working build of Stargate SG1 Gunship, an alpha build of G Prime and a banner for Redwall. (Memo to self: next time try ‘focus’.) G Prime, for all the pretty screenshots, really wasn’t a good choice for a booth show – it’s more of a thinker really, and only alpha. SG1 showed pretty well. People seemed to like what we’d done with the UI, but far and away we had the most response to Redwall…even though we had nothing to show but a banner.

Seriously, I was shocked…again. At times we had folks four and five deep around our tiny little table, and at other times people were literally throwing resumes at us. Tweets and posts and selfies, all because of the way this series of books has touched people. There was a no-man’s-land of open seating adjacent to our booth and I could sit there inconspicuously watching as people would come up to the banner and take long pauses as if they were reliving fond memories. Sometimes they’d want to ask us questions, but more often they just looked wistfully on at the sandstone walls and the setting rose-colored sun and seemed to be moved, almost to reverie.

RealSense & Unity3D: A First Look

(by Jon Collins, on behalf of Soma Games and Code-Monkeys)

This article is part of a series that documents our ‘Everyman’ experience with the new RealSense hardware and software being developed by Intel.

Full disclosure, Intel does pay us for some of this stuff but one of my favorite aspects of working with them is that they aren’t asking us to write puff-pieces. Our honest, sometimes critical, opinions are accepted…and even seem to be appreciated…so we got that going for us.

A First Look

There’s no denying that the pre-alpha SDK is exactly what it says on the box: a pre-alpha. That said, there’s a surprising amount of useful functionality to be gleaned by looking deeper into the C# samples that are present and taking lessons learned from previous SDKs.

First off, the kit includes a single Unity3D sample (there is just the one in the current package): the Nine Cubes sample, found in the frameworks folder of the samples directory structure.

This gives us a good starting point for looking into how to take advantage of the camera and SDK. Although a few red herrings are present, which may be hangovers from development versions, it gave us enough of an idea to explore further and adapt some of the separate C# samples, bringing that functionality into our initial Unity3D project. (We use Unity3D almost exclusively here at Soma Games, so having this bridge to RealSense was a practical prerequisite for us to consider adopting RealSense.)

RealSense Hand Joints and Bones

For this exercise we were primarily concerned with being able to track and record finger joint positioning within Unity3D. The available methods and documentation suggest there is a planned ability to load, save, and recognize gestures from a pre-defined library, but after a little digging and running questions up to the dev team it appears that feature has been ‘delayed’ 🙁 So with our hopes dashed, and no C# gesture viewer sample to be found, we wanted to see how, or even if, we would be able to access the joints to explore developing our own approach to logging finger and hand poses.
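The roll-your-own approach we had in mind is simple enough to sketch: log the joint positions the hand module reports each frame, and compare a live pose against a saved reference pose with a distance threshold. This is a hypothetical Python sketch of that logic only – the real work would be done in C# against the SDK’s hand data, and the threshold value here is an arbitrary placeholder:

```python
import math

# Hypothetical pose logger/matcher sketch. Each pose is a list of
# (x, y, z) joint positions in camera space; the matching rule is a
# simple mean-distance threshold, not anything from the RealSense SDK.

def pose_distance(a: list, b: list) -> float:
    """Mean Euclidean distance between corresponding joints of two poses."""
    assert len(a) == len(b)
    total = sum(math.dist(p, q) for p, q in zip(a, b))
    return total / len(a)

def matches(live: list, reference: list, threshold: float = 0.02) -> bool:
    """True when the live pose is within `threshold` (placeholder units) of the reference."""
    return pose_distance(live, reference) < threshold
```

Recording a gesture is then just saving the joint list for a frame; recognizing it later is calling `matches` against each saved pose.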

Getting Rolling with the RealSense SDK

(by Jon Hogins, on behalf of Soma Games and Code-Monkeys)


I recently got the fantastic opportunity to use a pre-alpha version of Intel’s new RealSense camera to build a full-fledged app. It’s still a work in progress, but let me share my experiences and a few tips on getting the most out of RealSense’s video APIs.

The App

My mission has been to create a video conferencing app with a few interesting finger tracking interactions using the RealSense camera. After a bit of research, I decided on the Intel Media SDK for real-time H264 encoding and decoding and OpenCV for the initial display, moving to Unity/DirectX later.
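Structurally, the app is a capture → encode → transmit → decode → display loop. Here is a toy Python sketch of that pipeline shape only: zlib stands in for the Media SDK’s H264 encoder/decoder and an in-memory buffer stands in for the network, so nothing here is the real Media SDK or OpenCV code:

```python
import zlib

# Toy pipeline sketch: capture -> encode -> transmit -> decode -> display.
# zlib is a stand-in for Media SDK H264 encode/decode. Note that zlib is
# lossless while H264 is lossy, so a real pipeline would not compare
# frames byte-for-byte the way this sketch does.

def encode(frame: bytes) -> bytes:
    return zlib.compress(frame)      # real app: Media SDK H264 encode

def decode(payload: bytes) -> bytes:
    return zlib.decompress(payload)  # real app: Media SDK H264 decode

def round_trip(frame: bytes) -> bytes:
    payload = encode(frame)          # sender side
    return decode(payload)           # receiver side; then display (OpenCV)

frame = bytes(range(256)) * 8        # fake 2 KB "frame"
assert round_trip(frame) == frame
```

The finger-tracking interactions slot in on the sender side, reading hand data from the same camera that supplies the video frames.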

Getting Started

Getting the RealSense SDK installed and creating projects based on the samples is straightforward, even in its pre-alpha state. The installer adds the RSSDK_DIR environment variable, and each VC++ project using RealSense only needs to add a property sheet via Visual Studio’s Property Manager. The documentation and samples are fairly comprehensive, and the APIs are the most accessible of any of the Intel C++ APIs I’ve worked with.