Designing a Virtual Reality Application: Scene Level Navigation

Max Interactive VR Level View

Designer: Clint Lewis

Scene Level Navigation

Most users of VR applications need to navigate the entire scene easily and safely. I say safely because a poorly designed navigation system can cause nausea and dizziness in VR.

Teleportation is a widely used method in VR applications for moving the user around the scene, given the limits of the physical “play” area. However, some places in the scene are inaccessible to this type of locomotion, as teleportation usually depends on snapping to horizontal surfaces.

See below for an example of teleportation in Max Interactive’s VR Level editor.

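The surface-snapping constraint can be made concrete with a small sketch. This is a minimal illustration, not Max Interactive code; the up axis, slope limit, and function names are all assumptions. A teleport raycast hit is accepted only when the surface normal is close to world up:

```python
import math

UP = (0.0, 0.0, 1.0)     # assumed world-up axis
MAX_SLOPE_DEG = 30.0     # hypothetical limit on how steep a valid target may be

def is_teleport_target(hit_normal):
    """Accept a raycast hit only when the surface is close to horizontal."""
    cos_to_up = sum(n * u for n, u in zip(hit_normal, UP))
    return cos_to_up >= math.cos(math.radians(MAX_SLOPE_DEG))

print(is_teleport_target((0.0, 0.0, 1.0)))  # floor -> True
print(is_teleport_target((1.0, 0.0, 0.0)))  # wall  -> False
```

Anything steeper than the limit, such as a wall or an overhang, is rejected, which is exactly what makes parts of the scene unreachable by teleport alone.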

Because of the limitations mentioned above, a more general system of user movement is needed. Users need to be able to access every part of their design.

Early on we had a system of “joystick” navigation that used the VR controllers’ thumbsticks to reach places not covered by teleportation. Unfortunately, even though this method of navigation is intuitive for most users, the smooth visual motion without any matching physical motion caused immediate dizziness, and it had to be abandoned.

I considered and designed a “teleport to selected object” workflow, but I kept it in the backlog because I wanted to build on what users already knew from other VR applications. See the embedded PDF below.

TeleportToSelected

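The PDF itself is not reproduced here, but the core computation behind such a workflow is simple to sketch. This is a hypothetical illustration, assuming the selected object is described by a bounding sphere; teleport_to_selected and fit_factor are my own names, not a shipped API:

```python
import math

def teleport_to_selected(eye, center, radius, fit_factor=2.5):
    """Move the viewer to frame the selected object.

    eye and center are (x, y, z) positions; radius is the object's
    bounding-sphere radius; fit_factor is a hypothetical tuning value
    controlling how far back from the object the viewer stops.
    """
    view = tuple(e - c for e, c in zip(eye, center))
    length = math.sqrt(sum(v * v for v in view)) or 1.0
    unit = tuple(v / length for v in view)
    # Back off from the object's center along the existing view direction.
    return tuple(c + u * radius * fit_factor for c, u in zip(center, unit))

print(teleport_to_selected((10.0, 0.0, 2.0), (0.0, 0.0, 0.0), 1.0))
```

The viewer keeps their current viewing direction and simply lands at a comfortable framing distance from the selection.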

From my investigation of other VR applications, the preferred navigation solution was to use the motion of the vector drawn between the two VR controllers to adjust the user’s orientation, displacement, and relative scale in the scene: moving both controllers together translates the view, rotating the pair turns it, and changing the distance between the hands changes the scene scale.

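As a rough sketch of that technique (assuming per-frame controller positions, and simplified to a top-down 2D plane for clarity; this is not the production implementation):

```python
import math

def two_hand_navigation(prev_left, prev_right, left, right):
    """Per-frame navigation delta from the two controller positions.

    Returns (translation, yaw_radians, scale_factor) for the user's view.
    Positions are (x, y) tuples; a real editor would work in 3D, rotating
    about the world-up axis.
    """
    # Vector between the controllers, before and after this frame.
    prev_vec = (prev_right[0] - prev_left[0], prev_right[1] - prev_left[1])
    cur_vec = (right[0] - left[0], right[1] - left[1])

    # Relative scale: ratio of the controller separation between frames.
    scale = (math.hypot(*cur_vec) or 1e-6) / (math.hypot(*prev_vec) or 1e-6)

    # Orientation: change in the heading of the between-controller vector.
    yaw = math.atan2(cur_vec[1], cur_vec[0]) - math.atan2(prev_vec[1], prev_vec[0])

    # Displacement: motion of the midpoint between the two hands.
    translation = (
        (left[0] + right[0] - prev_left[0] - prev_right[0]) / 2.0,
        (left[1] + right[1] - prev_left[1] - prev_right[1]) / 2.0,
    )
    return translation, yaw, scale

# One frame: hands pull apart and rotate slightly around their midpoint.
t, yaw, s = two_hand_navigation((-0.2, 0.0), (0.2, 0.0), (-0.3, -0.05), (0.3, 0.05))
print(t, yaw, s)  # ~(0, 0), small positive yaw, scale > 1
```

Because all three quantities come from one gesture, the user can reorient, move, and rescale the scene in a single fluid motion.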

Based on this navigation technique, I created the mock-up below to guide development.


Here is the final implementation of the navigation system in Max Interactive’s VR Level editor. Note the “line” drawn between the controller cursors and the scene-scale value displayed. The scene-scale value has several snap points that correspond to a variety of familiar sizes:

  • mouse (5 cm)
  • small child (0.9 m)
  • adult (1.8 m)
  • t-rex (6 m)
  • bird’s eye (30 m)
  • large building (400 m)

In addition to these snap points, the user can adjust the scene to intermediate scale values. When the user reaches a snap point, they get haptic feedback from the controller; a sketch of that snapping logic follows.
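A minimal version of the snap-plus-haptics behavior, assuming a relative snapping tolerance and a haptic-pulse callback (both hypothetical details, not the shipped values):

```python
SNAP_POINTS_M = [0.05, 0.9, 1.8, 6.0, 30.0, 400.0]  # mouse ... large building
SNAP_TOLERANCE = 0.08  # hypothetical: snap within 8% of a snap point

def apply_scale_snapping(raw_scale, trigger_haptic_pulse):
    """Snap the continuous scene scale to a nearby reference size.

    raw_scale is the value the two-hand gesture produced;
    trigger_haptic_pulse is a callback that fires the controller haptics.
    """
    for snap in SNAP_POINTS_M:
        # Compare relatively so 5 cm and 400 m snap with the same feel.
        if abs(raw_scale - snap) / snap <= SNAP_TOLERANCE:
            trigger_haptic_pulse()
            return snap
    return raw_scale  # intermediate values pass through unchanged

print(apply_scale_snapping(1.75, lambda: print("buzz")))  # snaps to 1.8
print(apply_scale_snapping(12.0, lambda: print("buzz")))  # stays at 12.0
```

A relative tolerance matters here because the snap points span almost four orders of magnitude, from 5 cm to 400 m.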

