We’ve created some very rapid prototypes in the past 10 days, just to test the waters (no pun intended) of cinematic 360 capture and playback within VR HMDs.

For the most part, the tests have been very rewarding.

Our findings are as follows:

  1. Initial 2D 360 still / video capture is easy. We started with the Google Photosphere app, free on Android, which takes about 5 minutes and 40 photos per sphere. We’ve since upgraded to the Ricoh Theta, which captures 360 video with a single button-press.
  2. Consider EVERYTHING in the 360 field of view. It’s all in the shot; there’s no backstage. This concept takes a lot of getting used to if you’re accustomed to working with lights, sound techs, and crews.
  3. Editing is time-consuming. It’s easier to clean up physical reality prior to the actual shot than to paint it out in post.
  4. A base plug at the foot of the shot is a nice touch, both visually and to cover the merge seam.
  5. Similarly, we use a lens flare to simulate the light dynamics of the sun.
  6. Audio engineering is key, time-consuming, fun, AND makes the difference between “just another photosphere” and the feeling of presence. You collect video at a single point; audio should be collected at all the local sound origination points, then placed into proper 3D positions in post, with filters.
  7. Since we’re authoring all of this within the game engine, we’re having a lot of fun with 3D positional audio: placing sounds, even animating them as, say, a bird flies across the forest canopy. A rough sketch of the idea follows below.
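
To make the positional-audio idea concrete outside of any one engine (the post doesn’t name the one we’re using), here is a minimal sketch built on the Web Audio API’s HRTF panner. The buffer names (`streamBuffer`, `birdBuffer`) are hypothetical placeholders for decoded audio assets; the engine-side workflow is analogous.

```ts
// Minimal sketch: placing and animating 3D point sources with the Web Audio API.
const ctx = new AudioContext();

// Place a looping point source (e.g., a stream) at a fixed 3D position.
function placeSource(buffer: AudioBuffer, x: number, y: number, z: number): PannerNode {
  const src = ctx.createBufferSource();
  src.buffer = buffer;
  src.loop = true;

  const panner = ctx.createPanner();
  panner.panningModel = "HRTF";     // head-related transfer function for convincing 3D
  panner.distanceModel = "inverse"; // volume falls off naturally with distance
  panner.positionX.value = x;
  panner.positionY.value = y;
  panner.positionZ.value = z;

  src.connect(panner).connect(ctx.destination);
  src.start();
  return panner;
}

// Animate a source (e.g., a bird) from one 3D point to another over `seconds`.
function flyAcross(
  panner: PannerNode,
  from: [number, number, number],
  to: [number, number, number],
  seconds: number,
): void {
  const t = ctx.currentTime;
  panner.positionX.setValueAtTime(from[0], t);
  panner.positionY.setValueAtTime(from[1], t);
  panner.positionZ.setValueAtTime(from[2], t);
  panner.positionX.linearRampToValueAtTime(to[0], t + seconds);
  panner.positionY.linearRampToValueAtTime(to[1], t + seconds);
  panner.positionZ.linearRampToValueAtTime(to[2], t + seconds);
}

// Usage (streamBuffer / birdBuffer are hypothetical, decoded elsewhere):
// placeSource(streamBuffer, 2, 0, -3);           // stream: right and ahead of the viewer
// const bird = placeSource(birdBuffer, -10, 8, 0);
// flyAcross(bird, [-10, 8, 0], [10, 8, -5], 6);  // six-second pass over the canopy
```

HRTF panning is what sells a sound coming from behind or above you on headphones, and the inverse distance model gives the natural volume falloff as a source moves away from the listener.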

into the wild... virtually.

And finally, there are some things, some of the best parts of nature, which simply aren’t going to be in VR anytime soon: the elements. Wind in your hair and clean running stream water on your bare feet… those will have to wait.

fresh water from the springs... yes please!... but not in VR.

Sign up here to be the first to try dSky experiences.