I’ll never forget the moment I first understood that the iPhone was something magic, though at the time I wasn’t sure what it was I was observing. My pastor, who is one of the most dedicated MacHeads I know, had an iPhone within 38 seconds of their being released. A few days later he was showing a photo of his grandson, on the iPhone, to Beth. Beth is one of those people who maintains a kind of love-hate relationship with all technology. She’s not a gear-head by any stretch, but nor is she a Luddite like Rebekah. (I do SO love you sweetie, even if you resent my livelihood.)
Beth took the iPhone, cooed appropriately at the charming picture and began to hand the phone back to Bill. As she did, the photo rotated and scaled and Beth gasped. She pulled the phone back to herself and the photo spun around again. Eyes like saucers and mouth agape, she spun the phone back and forth, back and forth, in awe until Bill snatched it away from her with a protective ‘give me THAT’ kind of look.
With no expectations and no penchant for TechWow, Beth had seen something that connected with her emotionally and intuitively. In that instant I think I glimpsed the future.
The same kind of thing happened over and over when I got my own iPhone a couple of months later, and I’m living it again with my iPad. People standing next to you in line for coffee simply want to touch it. They don’t really care what it does, or whether it’s “productive.” Nobody asks if it’s OS4, Windows or Swahili – they want to touch it and swipe at it and slice the flying fruit. In time the other questions do come up, and they are, no doubt, important. But none of those things are the magic.
Touch is the magic. It’s the secret sauce, the almost overlooked, always undervalued, hard-to-quantify thing that has made the iThings explode onto the stage in a way nobody imagined. When TouchUI is done right it de-abstracts everything we interact with in a computer or smartphone. The mouse+GUI was so revolutionary in its day for the same reason: it turned a profoundly abstract and inscrutable command-line prompt into icons that many of us could intuitively grasp as symbols for what we really wanted. Touch does that even more by bringing us one step closer to…whatever: our photos, our books, our games. The less abstract the experience, the less “cyber” it all feels, and those electronic things start to feel more and more like the “real” things of our “real” world. And the more real it all seems, the more we’re able to connect.
TouchUI isn’t just a new interface element, it’s a fundamental shift in thinking – at least it can be. I was looking back to 2001, when tablet PCs were first introduced, and also thinking about what I saw at the most recent Computex. Why did tablets bomb in 2001 – what did Bill do wrong back then? I think the answer is self-evident in Gates’ recent complaints about the iPad. Gates thinks the iPad is missing a stylus and a keyboard because TouchUI lacks “input.” In other words, Gates thinks the iPad isn’t “productive,” and he’s fixated on simply trading one abstraction for another. Instead of bringing the user closer to their stuff, his stylus fetish simply remakes the mouse in the image of an even more ancient UI tool – the pencil.
Two things are way wrong here. First is the focus on productivity. When computers were primarily business tools, with price points requiring business funds, productivity was an important goal and justification. But with the focus now on personal use (netbooks thrive on tweens and soccer moms, not suits and salarymen), productivity is no longer the driving argument for technology adoption. Fun and connectivity are.
Second, I can’t help but feel that Gates betrays a fundamental inability to recognize that things like windows and mice are metaphors, not things in themselves. The mouse is a device used to control what on the screen? – a metaphor of a pointing finger. Using my physical finger simplifies the whole interaction. The more we remove the layers of abstraction, the more people will WANT to use these things rather than merely tolerate them as tools.
Despite how it might sound to this point, this is not some rah-rah thing for Apple; it’s a rah-rah thing for a way of thinking. With few exceptions, the tablets and slates I saw at Computex were Windows PCs with a touch screen. That’s a step in the right direction, to be sure, but it fails to integrate the fundamentally different way of thinking that TouchUI requires. The only other system that really seems to embrace the touch thing is what I saw in the early demonstrations of MeeGo, but again I want to be clear that I don’t think this is an OS question at all. It’s an implementation question. If MS and others want to compete here, they need to go into the developers’ offices and throw out all the mice. Every single aspect of the UI needs to be built around touching the screen. This is also something Flash developers need to jump on if they want to save their favorite SDK, regardless of what Apple does. Touch isn’t locked into Apple; they were simply the first folks to do it right.
What I want to see next is a place where touch meets 3D – that will explode the “folder” metaphor…