Continuing our series on Intel’s new/upcoming RealSense technology, we recently got the alpha build of their Unity3D-enabled SDK and a much improved version of the camera. While the package is cool and opens up a lot of interesting theoretical possibilities, it got us thinking about the practical questions surrounding this tech.

RealSense is, at its bottom line, an input device. In that sense it will be measured against things like joysticks, mice and game controllers, and as developers trying to make a living with this software we’ll be looking at several things beyond the “cool” factor. Things like:

  • Addressable audience
  • Typical hardware profile
  • Time/cost to implement
  • Processor overhead

When we’re being compensated to experiment and do basic R&D (and – full disclosure again – we are), we can ignore basically all of these considerations. But when we move past that and start to explore actually deploying such tech…suddenly the calculus changes dramatically.

Who will buy it?

The questions of market are the ones that will ultimately override most of the others. They can all be summed up and rephrased as “is it worth it?” and each shop will need to answer that question on its own. But while this will ultimately be the most influential question it is also the one we know the least about at this moment.

From what we’ve seen of the hardware in the pipeline, RealSense looks to be a fairly inexpensive, differentiating feature that will appear on a wide variety of mid- to high-end tablets, Ultrabooks and 2-in-1 rigs. Without a robust body of software to use it, the hardware is positioned as an “oooh neat” bonus waiting for developers to make use of in incremental ways.

Admittedly, it’s a chicken-and-egg problem for Intel, and I imagine the lessons of AppUp are influencing the decisions here, but this seems to be a relatively frictionless path. RealSense cameras can be sold as nothing more glamorous than a better webcam, and plenty of history supports the known market for better cameras. All of this makes me comfortable that the hardware will start to penetrate the market soon enough, and in such a way that a small and specialized niche is avoided. It also suggests that users will get exposure to the hardware in bite-sized pieces without the need to alter existing behavior; that too makes adoption easier.

“Here’s your webcam that you already know and love…oh look, what’s this RealSense button do? A 3D webcam…cool!”

Feature by feature people can learn to see RealSense as an improvement on what they already know and nobody is left making an either/or purchase decision but rather the both/and variety.

Suffice it to say that while it’s all speculation, I’m reasonably comfortable with the path I see laid out for getting the hardware into the market and the kind of consumer most likely to have this tech available to them – namely folks with a demonstrated willingness to indulge a bit in upgraded technology.

How difficult is it?

On this question we have a lot more data to work with. Between our experience with RealSense’s predecessor, the Perceptual Computing SDK, and the evolving state of the RealSense SDK itself, I like what I see. So far the tools have been well-designed, straightforward and not too difficult to implement.

Releasing something that integrates seamlessly with the environment most used by indie developers seems like the smart play to encourage experimentation and rapid deployment. Needless to say, we’re biased toward both gaming as a category and Unity3D as a platform, but I believe focusing on both is the right call.

Compared to other sectors, the gaming space is the most likely to get apps into the market quickly, because indie game developers are nimble, creative, and willing to do something just for the heck of it. Compare them to business users, who often need a much more compelling reason to adopt and deploy a new technology. And this willingness to experiment and embrace thrives on the player side too. The frenzy among gamers over the mere potential of the Oculus Rift should be seen as a reflection of an entire population’s state of mind. Show us something cool we can do and you can bet that we’ll do it. Make it accessible and reasonably inexpensive and we’ll fuel a Facebook acquisition.

I suspect one of the biggest challenges for the Intel engineers right now is the temptation to do too many things at once and spread their efforts too thin. I would also imagine there are a lot of voices inside the company who each want to pull the SDK in a certain direction. That said, internal politics are outside my sphere, so I just hope they find the optimum path for releasing and supporting features and tools…on Unity.

Processing Overhead

This one…could be a problem. The current RealSense toolkit is expensive in terms of processor cycles, and this may be where the most improvement is needed. As mentioned before, RealSense is at bottom an input device, and a game can’t afford to spend too many cycles just collecting input or the feature becomes a marketing gimmick.

I would say that we’d be unlikely to accept any input method that required more than 5% of our target processor. We need as much headroom as possible for the things that make a game an immersive experience: sound, eye candy, polygons and frame rate. Compromising on those for an experimental input would be a difficult (and unlikely) trade-off.
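To make that 5% rule concrete: at 60 fps a frame lasts roughly 16.7 ms, so input collection would need to stay under about 0.8 ms per frame. Here’s a minimal, hypothetical sketch of that budget check – nothing RealSense-specific, and every name in it is ours, not the SDK’s:

```python
import time

FRAME_BUDGET_S = 1.0 / 60.0     # ~16.7 ms per frame at 60 fps
INPUT_BUDGET_FRACTION = 0.05    # the 5% ceiling discussed above

def within_input_budget(poll_input, *args, **kwargs):
    """Time one input poll and report whether it fits the 5% budget.

    Returns (poll_result, elapsed_seconds, fits_budget).
    """
    start = time.perf_counter()
    result = poll_input(*args, **kwargs)
    elapsed = time.perf_counter() - start
    return result, elapsed, elapsed <= FRAME_BUDGET_S * INPUT_BUDGET_FRACTION

# Hypothetical stand-in for a gesture-camera poll:
def fake_poll():
    return {"gesture": "none"}

result, elapsed, ok = within_input_budget(fake_poll)
```

In a real title you’d run a check like this during profiling, not every frame, but the arithmetic is the point: the budget is tiny, and an expensive input pipeline blows through it fast.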

Not knowing the ins and outs of the hardware, I can’t speak to what’s possible in the current package, but I do know this is something that will need to be addressed. Optimization, however, is a normal part of the process, so I doubt the problem is a permanent one. And based on our experience with PerC, I expect the biggest gains will come from smart filtering of the immense data stream coming from the camera.
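For a sense of what “smart filtering” might look like, here’s a toy sketch (entirely hypothetical – this is not the RealSense SDK’s API) of the two cheapest tricks: decimating in time by processing only every Nth frame, and decimating in space by downsampling each kept frame before any gesture logic sees it:

```python
def downsample(frame, step=4):
    """Keep every `step`-th row and column of a 2D depth frame (list of lists)."""
    return [row[::step] for row in frame[::step]]

def filtered_frames(frames, keep_every=3, step=4):
    """Decimate in time (every `keep_every`-th frame) and in space (downsample)."""
    for i, frame in enumerate(frames):
        if i % keep_every == 0:
            yield downsample(frame, step)

# Toy 8x8 "depth frames" standing in for camera output:
frames = [[[f * 100 + r * 8 + c for c in range(8)] for r in range(8)]
          for f in range(6)]
kept = list(filtered_frames(frames, keep_every=3, step=4))
# 6 frames decimated by 3 leaves 2; each 8x8 frame downsampled by 4 becomes 2x2
```

Those two knobs alone cut the work by a factor of keep_every × step², which is why filtering the stream before it reaches application code is usually where the easy wins are.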

Intelligent Adoption

I expect the final hurdle, and the one that falls to people like us, is for developers to take up the challenge of rethinking our UI expectations and pushing this new tech to a place where it genuinely elevates the user experience.

It took a while for touch UI to mature. Things like accelerometers and GPS needed some time for innovation to sink in and facilitate genuinely improved experiences (play Ingress if you haven’t – a great use of GPS for an ARG). RealSense will be no different, and it’s not Intel’s job to innovate for us…plus, that’s the funnest part.

Carry on RealSense – Carry on…
