IDF 2014 was not the first time we’ve been honored to have a tech demo on the floor. And having been to this rodeo before, we’ve learned a thing or two about floor demos. First among those things: Keep It Simple, Sherlock. So with that lesson in mind we created Cloak&Badger – a very simple game mechanic that splurged on the eye candy (a wonderfully animated badger guard) and did exactly one thing: it used the recently updated RealSense (Beta) SDK and its emotion sensing API. The entire game was played by making faces at the camera…that’s it…and it was a blast!
Cueing the player to which emotion drove the RPG-style dialog tree in a particular direction was straightforward, and folks had tons of fun trying on the various emotive states the API supports. (These include Joy, Sadness, Disgust, Contempt, and Surprise, plus the more general “sentiments”: Positive, Neutral, and Negative.) By the rules of our KISS constraint it was an unqualified success, and we had tons of smiles, laughs, and genuine fun all week.
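To give a flavor of how simple the mechanic really was, here’s a hypothetical sketch of an emotion-driven dialog tree. This is not the actual RealSense SDK API (the real emotion module is its own C++/C# interface, not shown here); the node names, and the idea that you feed the strongest detected emotion into the tree, are illustrative stand-ins.

```python
# Hypothetical sketch: an emotion label (as detected by something like the
# RealSense emotion-sensing API) drives an RPG-style dialog tree.
# Node names and structure are invented for illustration.

# Emotions and general sentiments named by the API, per the post.
EMOTIONS = {"JOY", "SADNESS", "DISGUST", "CONTEMPT", "SURPRISE"}
SENTIMENTS = {"POSITIVE", "NEUTRAL", "NEGATIVE"}

# Each node maps the emotions it reacts to onto a next node.
DIALOG_TREE = {
    "gate": {
        "JOY": "friendly_chat",
        "CONTEMPT": "challenge",
        "SURPRISE": "suspicion",
    },
    "friendly_chat": {},
    "challenge": {},
    "suspicion": {},
}

def advance(node, emotion):
    """Return the next dialog node for the player's strongest detected
    emotion; stay on the current node if it doesn't react to it."""
    return DIALOG_TREE[node].get(emotion, node)
```

So making a contemptuous face at the badger guard while on the `gate` node would move the scene to `challenge`, while an emotion the node ignores leaves the player where they are — one input, one branch, nothing else.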
And then it went sideways.
Here’s the thing: when players knew what they were trying to do and were actively faking an emotion, it was great fun. “Look at that stupid computer! It thinks I’m contemptuous!” But at some point each player turned away to ask a question or make a comment, and things got weird fast. Out of the corner of their eye they saw that the computer was still tracking them, still evaluating their emotional state. But this time it was their real, non-acted emotional state, and the game was progressing without their intent or consent.
Now it was, “Hey! That stupid computer is still watching me!…You’re damn right that’s contempt!”
Somewhere in there we entered a new kind of Uncanny Valley, and folks did NOT like it, not at all. I think I can say, in fact, that while we got all kinds of good reactions to the game itself, this secondary experience was universally negative. Nobody liked being watched, not one of them. Now perhaps our specific situation accentuated this gap between intentional and unintentional interaction with the computer. Perhaps they’d have been more comfortable if they had been expecting this kind of passive observation, but I kinda doubt it. I think what was really going on here came down to control, and the awareness that the computer was now acting independently of their intent.
When the emotion-sensing feature was first shown to us, the pitch was how a game could invisibly watch a player’s mood during gameplay and react accordingly. So if the player looks bored, you can throw more zombies at them. If they smile or frown or hate your ever-bleeding guts…the game can do something with that sentiment and presumably improve and customize the experience. But there’s an important difference between an input mechanism and voyeurism. There’s a creepiness that sets in when you realize the thing is watching you…it doesn’t feel good. There’s a line between a customized experience and a manipulated player.
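That “invisibly react to mood” pitch boils down to something like the sketch below. Everything here is hypothetical — the sentiment label and engagement score are stand-ins for whatever the SDK actually reports, and the thresholds are invented — but it shows how small the gap is between “input” and “surveillance”: the exact same loop runs whether or not the player knows it’s there.

```python
# Hypothetical sketch of the adaptive-difficulty pitch: a passively sensed
# mood nudges gameplay. `sentiment` and `engagement` stand in for values a
# camera-based emotion API might report; the 0.3 threshold and the 1.5x
# multiplier are invented for illustration.

def adjust_zombie_count(base_count, sentiment, engagement):
    """Spawn more zombies when the player looks bored (neutral sentiment
    with low engagement); otherwise leave the count alone."""
    if sentiment == "NEUTRAL" and engagement < 0.3:
        return int(base_count * 1.5)  # player looks bored: ramp it up
    return base_count
```

Nothing in that loop asks the player whether they consented to being read — which, as our badger demo showed, is precisely where the creepiness sets in.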
So for me – I’m torn. Honestly, it is pretty cool tech, and we saw firsthand how it could be used as a fun and natural-feeling game mechanic. But we also accidentally stumbled into something creepy and uncomfortable and just a little violating…and I don’t want to go there again.