The first reaction a VR virgin usually exhibits, within 10 seconds, is:

“OMG, this is the coolest thing ever!”

Within 10 seconds of that, after knocking a few things off the desk and smacking the wall, they exclaim, with a mix of wonder and fear:

“um, where are my hands?!?”

 

[Image: vr-hands]

 

This is the challenge of modern consumer VR: the optical stimulus is so intense, so realistic, that once the player gets used to the fact that they have been teleported to a new universe, they expect to be able to interact with that universe much like they do in the Real World, on planet Earth.


In videogame design, we call this concept Agency. Agency is simply defined as a player's ability to interact with their environment. For instance, in Pac-Man, your agency is to move up, down, left, and right… and once you gobble the magic yellow dot, to devour the glowing ghosts. In a more modern first-person shooter, your agency is the ability to rapidly traverse 3D landscapes, to “pick up” objects by running over them, and to shoot at enemies with a variety of weapons.
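
If you like to think in code, agency boils down to the verb set a game is willing to map your inputs onto. Here is a minimal sketch of that idea; the types and names are invented for illustration and don't come from any particular engine:

```cpp
// agency_sketch.cpp -- "agency" illustrated as the set of verbs a game grants
// the player. All names here are invented for this example.
#include <cstdio>

enum class Input { Up, Down, Left, Right, PowerPelletEaten };

struct PacManAgency {
    bool powered = false;  // true after gobbling the magic yellow dot
    int  x = 0, y = 0;

    // The entire verb set: move in four directions, and (once powered)
    // devour ghosts instead of fleeing from them.
    void apply(Input in) {
        switch (in) {
            case Input::Up:    ++y; break;
            case Input::Down:  --y; break;
            case Input::Left:  --x; break;
            case Input::Right: ++x; break;
            case Input::PowerPelletEaten: powered = true; break;
        }
    }

    bool canDevourGhosts() const { return powered; }
};

int main() {
    PacManAgency pac;
    pac.apply(Input::Right);
    pac.apply(Input::PowerPelletEaten);
    std::printf("pos=(%d,%d) devour=%s\n", pac.x, pac.y,
                pac.canDevourGhosts() ? "yes" : "no");
    return 0;
}
```

A shooter's agency is the same shape, just with a much richer verb set: traverse, pick up, aim, fire.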

 

In real life, our agency with other humans, and with the Earth, is expressed primarily through hands. Our hands are our primary tools: for manipulating objects, for creating, for crafting, and for both powerful and subtle gesturing during verbal communication with others. Imagine being in the Real World without arms and hands. How would you feel? How would you behave differently? What would you miss?

 

This is why Oculus bought a great little Kickstarter company called NimbleVR, and why they are investing millions into the esoteric discipline of Computer Vision: to 'see' a player's real hands, and to insert virtual copies into the virtual world with near-zero latency.
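
The computer-vision side of that is the hard part; the application side is refreshingly simple: every frame, grab the freshest tracked pose and copy it straight onto the virtual hand model. A rough sketch, where HandTracker and VirtualHand are invented stand-ins rather than any real SDK:

```cpp
// virtual_hands_sketch.cpp -- hypothetical per-frame loop that copies tracked
// hand poses onto virtual hands. HandTracker and VirtualHand are invented
// stand-ins, not a real hand-tracking API.
#include <array>
#include <cstdio>

struct Pose {
    float px, py, pz;      // position in meters
    float qx, qy, qz, qw;  // orientation quaternion
};

struct HandTracker {
    // Pretend this returns the freshest camera-derived pose; a real tracker
    // would also report per-finger joints and a confidence value.
    Pose latestPose(int hand) const {
        return Pose{0.1f * hand, 1.2f, -0.3f, 0.f, 0.f, 0.f, 1.f};
    }
};

struct VirtualHand {
    Pose pose{};
    void setPose(const Pose& p) { pose = p; }  // drives the rendered hand model
};

int main() {
    HandTracker tracker;
    std::array<VirtualHand, 2> hands;  // 0 = left, 1 = right

    // One simulated frame: apply the newest pose immediately instead of
    // queueing it, so perceived hand latency stays as close to zero as possible.
    for (int h = 0; h < 2; ++h) {
        hands[h].setPose(tracker.latestPose(h));
    }
    std::printf("right hand at (%.2f, %.2f, %.2f)\n",
                hands[1].pose.px, hands[1].pose.py, hands[1].pose.pz);
    return 0;
}
```

The detail that matters is always taking the newest sample rather than draining a queue; any buffered frames show up to the player as hand lag.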

 

Of note, hands are what we use to pick up and inspect objects… what we use to augment our verbal communication… and, importantly, what we use to wield tools. Tools, in fact, give *additional* agency in the world: screwdrivers to attach, swords to cut, cups to drink from, paintbrushes to create with, a violin bow to make music with, a steering wheel and shifter to drive a car. An interesting aspect of tools is that almost every tool has a handle… the physical extension of the tool designed specifically to be held in the hand. Humans are intimately familiar with the handle metaphor and, supplied with a given tool, will naturally adapt hand position to the tool and purpose at hand.

 

So while free-hand solutions like Oculus's and LeapMotion's are wonderful for gestures, they leave something to be desired where grasping, manipulation, and tool usage are involved. For those actions and agencies, we might wish to have something like a generic handle that could actively simulate the surface of any given tool. Further, there should be one handle per hand, and the hands should have total freedom of movement about the body.
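
In code, the "generic handle" idea is a small contract: each hand owns one tracked handle, and whatever tool is currently attached inherits the handle's pose and fires when the trigger is pulled. Another invented sketch, not the API of any real controller SDK:

```cpp
// generic_handle_sketch.cpp -- hypothetical one-handle-per-hand tool model.
// Handle, Tool, and Paintbrush are invented types for illustration only.
#include <cstdio>
#include <memory>

struct Pose { float x = 0.f, y = 0.f, z = 0.f; };

// A tool is anything with a handle: sword, paintbrush, steering wheel...
struct Tool {
    virtual ~Tool() = default;
    virtual void use(const Pose& gripPose) = 0;
};

struct Paintbrush : Tool {
    void use(const Pose& p) override {
        std::printf("dab paint at (%.2f, %.2f, %.2f)\n", p.x, p.y, p.z);
    }
};

// One tracked handle per hand; it stands in for the handle of whatever tool
// is currently attached.
struct Handle {
    Pose pose;                   // updated from the tracker every frame
    std::unique_ptr<Tool> tool;  // currently attached tool, if any

    void triggerPulled() {
        if (tool) tool->use(pose);  // the tool acts wherever the hand is
    }
};

int main() {
    Handle left, right;
    right.tool = std::make_unique<Paintbrush>();
    right.pose = Pose{0.4f, 1.1f, -0.2f};
    right.triggerPulled();  // paints at the right hand's position
    left.triggerPulled();   // empty hand: nothing happens
    return 0;
}
```

Swapping the Paintbrush for a Sword or a SteeringWheel is just another Tool subclass; the handle, and the hand holding it, never change.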

 

Such a solution does, in fact, exist:

 

Our friends at Sixense have been working on the challenge of pro-grade consumer 3D hand input for perhaps longer than anyone else. They created the product launched as the Razer Hydra, a novel and empowering way to explore the videogame worlds of Portal. Now, as VR mania takes hold, developers are scouring eBay and paying 400% of retail for the hardware, in anticipation of the consumer release of the Sixense STEM, a much improved, modular, totally wireless version. They're already intermittently shipping developer units and are set to put out the full release in July 2015.

 

Read more about STEM here
