Virtual Reality Application UX and Product Design: Working With Objects
Max Interactive VR Level View
Designer: Clint Lewis
![](https://clintlewis.ca/wp-content/uploads/2017/12/ManusVR_Glove_2016.jpg)
Original Photo by ManusVR
Working With Objects
Working with objects in virtual reality is perhaps the most fundamental problem of the medium. Early VR experiments used “gloved” controllers, since the hand is how people prefer to interact with objects. As natural as this hand interface seems, latency and tracking have always been problems, and to this day a reliable hand interface remains the “Holy Grail” of VR human-computer interfaces. Currently, the most effective interfaces are the tracked controllers that ship with the HTC Vive and Oculus VR systems.
Using these controllers, there are three problems we need to solve for the user:
- How does my user’s virtual “hand” work?
- How does my user define a selection of objects to work on?
- How does my user move objects?
The Virtual Hand
In most VR applications there are two main methods of representing the user’s “virtual hands”. I have named these methods:
- direct manipulation
- remote manipulation
With direct manipulation, the user interacts with objects in close proximity; the VR controller is a proxy for the user’s hand. Some VR applications go as far as representing the controller with a realistic hand model, complete with fingers. The advantage of this method is that it is quite intuitive. See video below.
Direct manipulation has its pros and cons:
- Object selections are moved intuitively by movements of the hand with the controller.
- Selecting and moving objects can cause awkward wrist flexion leading to injury.
- It is difficult to move object selections precisely.
- The user needs to be in the vicinity of an object to select and move it.
- Requires a “cursor” as a selector on the controller.
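The “cursor” mentioned above can be sketched as a simple proximity test against the controller’s position. This is a minimal illustration, not Max Interactive’s actual implementation; all names here (`SceneObject`, `cursor_pick`, the 10 cm grab radius) are hypothetical.

```python
import math

class SceneObject:
    """Hypothetical scene object with a world-space position."""
    def __init__(self, name, position):
        self.name = name
        self.position = position  # (x, y, z) in metres

def distance(a, b):
    return math.sqrt(sum((p - q) ** 2 for p, q in zip(a, b)))

def cursor_pick(controller_pos, objects, grab_radius=0.1):
    """Return the closest object within the controller cursor's grab radius, or None."""
    in_reach = [o for o in objects if distance(controller_pos, o.position) <= grab_radius]
    return min(in_reach, key=lambda o: distance(controller_pos, o.position), default=None)
```

Once an object is picked this way, direct manipulation amounts to parenting it to the controller so it follows the hand’s movement.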
On the other hand (pardon the pun), with remote manipulation, interaction happens via a laser beam acting at a distance from the controller. There is no real-world analogy to this method; the closest is using a TV remote to change channels. The advantage of this method is that the user can affect objects without being in proximity to them, such as from a bird’s-eye view. See the video below for an implementation of remote manipulation in Max Interactive’s VR level editor.
Remote manipulation has its own pros and cons:
- User can move object selection at a distance.
- Rotation and scaling of object selections can be difficult.
- Requires a transform gizmo to control the axis of position, rotation and scale.
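The laser itself is essentially a raycast from the controller. A common way to implement laser picking (a sketch under the assumption that each object has a bounding sphere, not necessarily how Max Interactive does it) is a ray–sphere intersection test, keeping the nearest hit:

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Distance along a normalized ray to the first hit on a sphere, or None on a miss."""
    oc = tuple(o - c for o, c in zip(origin, center))
    b = sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - c
    if disc < 0:
        return None  # laser misses the bounding sphere
    t = -b - math.sqrt(disc)
    return t if t >= 0 else None

def laser_pick(origin, direction, objects):
    """Return the name of the nearest object hit by the laser, or None.

    `objects` is a list of (name, center, radius) bounding spheres.
    """
    hits = [(t, name) for name, center, radius in objects
            if (t := ray_sphere_hit(origin, direction, center, radius)) is not None]
    return min(hits, key=lambda h: h[0])[1] if hits else None
```

In a real editor the ray origin and direction would come from the tracked controller’s pose each frame.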
Editing the Object Selection
If you remember, in the first section of this design document I asked the question, “How does my user define a selection of objects to work on?” When designing the selection method in Max Interactive’s VR level editor, I tried to follow the conventions of most other 3D digital content creation (DCC) applications, namely the parent application 3ds Max. In 3ds Max, when a user selects an object with the mouse, it is highlighted. To deselect, the user selects empty space or another object in the scene. To multi-select objects, the user holds down the Shift key to add or remove objects from the selection. So, applying these conventions to the VR level editor in Max Interactive…
Single Selection and Deselection Workflow: See video below.
- The user selects an object via the laser.
- The user then selects another object, deselecting the first.
Multi-Selection and Deselection Workflow: See video below.
- The user activates the multi-selection mode button in the main menu. Multi-select mode could have been activated with a controller button press, but I felt this would be cumbersome and difficult for the user to remember.
- User selects the first object in the selection.
- User selects another object, thus adding to the selection.
- A bounding box is drawn around the entire selection, along with a transform gizmo.
- To remove an object from the selection, the user selects it a second time.
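The selection rules above boil down to a small state update. The sketch below is illustrative only; it assumes that picking empty space returns `None` and clears the current selection, which the single-select convention implies but the multi-select workflow doesn’t explicitly specify.

```python
def update_selection(selection, picked, multi_select):
    """Return a new selection set after the user picks `picked` (None = empty space)."""
    if picked is None:
        # Picking empty space deselects everything (3ds Max-style convention;
        # assumed to apply in multi-select mode too).
        return set()
    if not multi_select:
        return {picked}  # single-select: the new pick replaces the old selection
    new_selection = set(selection)
    if picked in new_selection:
        new_selection.discard(picked)  # a second pick removes the object
    else:
        new_selection.add(picked)      # otherwise the pick adds to the selection
    return new_selection
```

Keeping the selection in a set makes the add/remove toggle trivial and mirrors the Shift-click behaviour of the desktop application.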
An alternative method for multi-object selection could be accomplished by “painting” a selection from object to object as demonstrated here.
Transform Gizmo
This gizmo can be used with either the direct or the remote manipulation method, but it is more useful with the remote method. The laser can pick exact locations on an object, so the user can precisely select the transform axis they want to affect.
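Once the user has picked an axis handle with the laser, the gizmo constrains movement to that axis. A common way to do this (a sketch, not necessarily how Max Interactive implements it) is to project the controller’s raw world-space movement delta onto the chosen axis:

```python
def constrain_to_axis(delta, axis):
    """Project a world-space movement delta onto a single gizmo axis.

    `axis` is assumed to be a unit vector, e.g. (1, 0, 0) for the X handle.
    """
    amount = sum(d * a for d, a in zip(delta, axis))  # scalar projection onto the axis
    return tuple(amount * a for a in axis)            # movement restricted to that axis
```

For example, dragging with the X handle selected, a raw controller delta of (0.3, 0.5, −0.2) is reduced to movement of 0.3 along X only; the other components are discarded. The same projection idea extends to rotation and scale handles.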
The transform gizmo was designed ad hoc, with input from all team members. Early on the team drew inspiration from VR developers such as Vrtisan and their innovative gizmo design.
See below: Remote Manipulation Transform Gizmo