Watching throngs of VR virgins enter the Oculus Calibration scene, flailing about fruitlessly trying to touch the desk, the plant, the stack of cards… made me ponder the lack of good haptics in present-day VR, and what a truly difficult challenge it is to tackle technically. Couldn’t there be a more elegant solution?
Of course, there is. Simply model the virtual environment physically around the player, track the physical objects’ position and orientation in real time, and then render those physical objects accurately within the VR world… sort of like AR, in reverse: a genuine Virtual Office.
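The core of that "AR in reverse" idea is a single transform: take the tracked pose of the physical object (a position plus an orientation quaternion, as most tracking systems report) and use it to place the virtual replica so the two coincide. A minimal sketch of that step, with a hypothetical `Pose` type standing in for whatever the tracker actually reports:

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """Tracked pose: position in metres, orientation as a unit quaternion (w, x, y, z)."""
    position: tuple
    rotation: tuple

def rotate(q, v):
    """Rotate vector v by unit quaternion q = (w, x, y, z)."""
    w, x, y, z = q
    vx, vy, vz = v
    # t = 2 * cross(q.xyz, v); v' = v + w*t + cross(q.xyz, t)
    tx = 2 * (y * vz - z * vy)
    ty = 2 * (z * vx - x * vz)
    tz = 2 * (x * vy - y * vx)
    return (
        vx + w * tx + (y * tz - z * ty),
        vy + w * ty + (z * tx - x * tz),
        vz + w * tz + (x * ty - y * tx),
    )

def to_world(pose, local_point):
    """Map a point on the physical object's model into VR world space,
    so the rendered replica lines up with the real object."""
    rx, ry, rz = rotate(pose.rotation, local_point)
    px, py, pz = pose.position
    return (px + rx, py + ry, pz + rz)
```

Apply `to_world` to every vertex of the replica's model (or, in practice, hand the pose to the engine's transform) and the virtual object sits exactly where the real one does, which is what makes the haptics "free."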
I realised that one of the first challenges I had with developing VR was finding my keyboard when I had my HMD on. A keyboard is something trivial to model, and fairly trivial to track and position. Why not just have a voice or touch or gesture toggle that causes the keyboard to appear in the virtual world, right where it is in the real world?
I wasn’t the only one thinking this. Abrash and Carmack had a hilarious exchange during a group session at Oculus Connect: Carmack was talking about using cameras to track the fingers on a virtual keyboard, and how typing speed and finger occlusion would be real barriers to making it work, while Abrash repeated, three times: “No, John, I’m talking about a REAL keyboard. The player has a REAL keyboard, and they see a replica of it in VR. We track the keystrokes because the player is hitting the real keyboard’s keys.”
With that in mind, it made total sense for my test scenario to be an extension of the Oculus Calibration scene. While many enthusiasts visit this scene once and never again, I in fact find it to be the absolute BEST scene for beginner VR orientation, and use it on almost every virgin experience.
Now, I am building a precisely measured, mm-accurate replica of my home studio, so that when we go into VR, players can reach out and touch ANY object, with complete and total haptic accuracy.
Stay tuned for progress!