IDF 2014 was not the first time we’ve been honored to have a tech demo on the floor, and having been to this rodeo before we’ve learned a thing or two about floor demos. First among those lessons: Keep It Simple, Sherlock. So with that in mind we created Cloak&Badger – a very simple game mechanic that splurged on the eye candy (a wonderfully animated badger guard) and did exactly one thing: it used the recently updated RealSense (Beta) SDK and its emotion sensing API. The entire game worked by making faces at the camera…that’s it…and it was a blast!

Cueing the player to which emotion drove the RPG-style dialog tree in a particular direction was straightforward, and folks had tons of fun trying on the various emotive states that the API supports. (These include Joy, Sadness, Disgust, Contempt, and Surprise, plus the more general “sentiments”: Positive, Neutral, and Negative.) By the rules of our KISS constraint it was an unqualified success, and we had tons of smiles, laughs, and genuine fun all week.
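To give a feel for how little machinery the mechanic actually needed, here is a rough sketch of that dialog-tree branching. It is an illustration only, not our actual code: the node names are made up, and the emotion label and intensity are assumed to come from whatever the RealSense emotion API reports for the current frame.

```python
# Rough sketch of the Cloak&Badger branching mechanic: illustrative only.
# The emotion label and intensity are assumed to come from the RealSense
# emotion API each frame; the node names below are made up for this example.

DIALOG_TREE = {
    "guard_greeting": {
        "JOY":      "guard_waves_you_through",
        "CONTEMPT": "guard_takes_offense",
        "SURPRISE": "guard_gets_suspicious",
        "DEFAULT":  "guard_repeats_question",
    },
}

def next_dialog_node(current_node, emotion_label, intensity, threshold=0.6):
    """Pick the next dialog node based on the player's strongest expression."""
    branches = DIALOG_TREE.get(current_node, {})
    if intensity < threshold:
        # Weak or ambiguous read: don't branch, just repeat the prompt.
        return branches.get("DEFAULT", current_node)
    return branches.get(emotion_label, branches.get("DEFAULT", current_node))

# Example: a confident "contempt" reading offends the guard.
print(next_dialog_node("guard_greeting", "CONTEMPT", 0.9))
```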


Almost…
And then it went sideways.

Here’s the thing: when players knew what they were trying to do and were actively faking an emotion it was great fun. “Look at that stupid computer! It thinks I’m contemptuous!” But at some point each player turned away to ask a question or make a comment, and things got weird fast. Out of the corner of their eye they saw that the computer was still tracking them, still evaluating their emotional state. But this time it was their real, non-acted emotional state, and the game was progressing without their intent or consent.

Now it was, “Hey! That stupid computer is still watching me!…You’re damn right that’s contempt.”

Somewhere in there we entered a new kind of Uncanny Valley, and folks did NOT like it, not at all. I think I can say, in fact, that while we got all kinds of good reactions on the game itself, this secondary experience was universally negative. Nobody liked being watched, not one. Now perhaps our specific situation accentuated this gap between intentional and unintentional interaction with the computer. Perhaps they’d have been more comfortable if they had been expecting this kind of passive observation, but I kinda doubt it. I think what was really going on here came down to control and the awareness that the computer was now acting independently of their intent.

When the emotion sensing feature was first shown to us, the pitch was how a game could invisibly watch a player’s mood during gameplay and react accordingly. So if the player is looking bored, you can throw more zombies at them. If they smile or frown or hate your ever-bleeding guts…the game can do something with that sentiment and presumably improve and customize the experience. But there’s an important difference between input mechanisms and voyeurism. There’s a creepiness that sets in when you realize the thing is watching you…it doesn’t feel good. There’s a line between a customized experience and a manipulated player.
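That pitch boils down to something like the sketch below. It is purely hypothetical: the function and its inputs are made up for illustration, with the sentiment labels assumed to match the general Positive/Neutral/Negative sentiments the API exposes.

```python
# Hypothetical sketch of the "react to the player's mood" pitch, not real SDK code.
# sentiment is assumed to be one of the API's general labels ("POSITIVE",
# "NEUTRAL", "NEGATIVE"); boredom_score is a made-up 0..1 engagement estimate.

def adjust_spawn_rate(base_rate, sentiment, boredom_score):
    """Scale enemy spawns to the player's apparent mood."""
    rate = base_rate
    if sentiment == "NEUTRAL" and boredom_score > 0.7:
        rate *= 1.5   # player looks checked out: throw more zombies at them
    elif sentiment == "NEGATIVE":
        rate *= 0.8   # player looks frustrated: ease off a little
    return rate

# Example: a flat expression for a while bumps the spawn rate by half.
print(adjust_spawn_rate(10, "NEUTRAL", 0.8))  # -> 15.0
```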

So for me – I’m torn. Honestly it is pretty cool tech, and we saw firsthand how it could be used as a fun and natural-feeling game mechanic. But we also accidentally stumbled into something creepy and uncomfortable and just a little violating…and I don’t want to go there again.

2 Comments
  1. Bob

    I think the issue is a monitoring feature that is so transparently monitoring with consequence. For example, hand tracking, where the PC is monitoring where your hand is – that seems OK. The Amazon phone tracks you, but the result is something intuitive for visual effect. The problem is tracking an expression and then putting that back on someone with consequence.
    A good compromise may be FaceShift. That feature allows you to use expressions for your avatar, to represent you virtually and add personality. There is no consequence to your personality. When my daughter and I play Little Big Planet, she loves to use the expression tweaks to give her character personality. Crying or laughing at the right moment can add a human dimension to a game. But if it causes the game to react differently, that might be the issue.

    • Hi Bob, I think you’re right on the matter of transparency, but this experience seemed to be more about permission. The weirdness seemed to arise from that moment you feel “watched” and you didn’t consent to it.
      I suspect an easy fix for Cloak&Badger is something that turns the emotion tracking feature off – either manually or automatically. For instance, we might shut it down if we lose eye contact or if the facing angle goes too far; something like the sketch at the end of this reply.
      This observation also makes me think that using the feature in some behind-the-scenes secret way that was ‘spying’ on the user might eventually lead to a problem.
      To be clear, I see this as a question of best practices as opposed to anything technical. But I’m not aware of any such guidance on a question like this.
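      Purely as an illustration (the eye-contact flag and yaw angle are hypothetical inputs, not actual SDK calls), that gating could look something like this:

      ```python
      # Rough sketch of an automatic shut-off for emotion tracking.
      # has_eye_contact and face_yaw_degrees are hypothetical inputs,
      # not calls from the actual RealSense SDK.

      MAX_YAW_DEGREES = 25  # past this, the player has clearly turned away

      def should_track_emotion(has_eye_contact, face_yaw_degrees, player_opted_in=True):
          """Only evaluate emotion while the player is knowingly facing the camera."""
          if not player_opted_in or not has_eye_contact:
              return False
          return abs(face_yaw_degrees) <= MAX_YAW_DEGREES
      ```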
