Interaction Methods for Virtual Reality Applications

The potential of Virtual Reality (VR) technology has not been fully realized because user interfaces are not designed to support the user effectively. One of the most critical interface aspects is the ability to interact with the system within the virtual environment. The initial idea for optimizing user interaction was to transfer the interaction that occurs in the real world as accurately as possible into the VR environment; in that case the user would, in theory, have no difficulty switching between the two worlds.

But VR technology is not yet mature enough to achieve this. Some elements of real-world interaction, such as haptic feedback, cannot be transferred into the VR environment at all, and others can be transferred only with insufficient precision. Common problem areas include haptic feedback, the operation of controls requiring fine motor skills (e.g., rotary controls), matching the movement of a real person to that of their virtual representation, and visualization quality.

Differences Between VR and Reality

Figure 1. Holding a ball with two virtual hands. The opaque hands show the positions of the virtual hands; the wireframe hands show the positions of the user's real hands, which, unlike the virtual hands, are able to penetrate the ball.
It is normally not possible to generate an exact copy of the real world; fundamental differences therefore exist between the real and the virtual worlds. To illustrate these differences, consider the simple example of holding a ball in two hands (Figure 1). In reality, the ball is held with two hands, the position of the hands is determined by the surface of the ball, and control of the ball is provided by the human's haptic and visual senses. In a virtual environment that provides no haptic feedback, the ball must be controlled exclusively through the visual system.
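To make this concrete, the following minimal sketch (in Python, with purely illustrative names, positions, and dimensions not taken from the article) shows the kind of constraint that produces the behavior in Figure 1: the tracked real hands may penetrate the ball, while the rendered virtual hands are projected back onto its surface.

```python
# Minimal sketch of the two-hands-on-a-ball example. The tracked (real)
# hand positions may penetrate the ball, so the rendered virtual hands
# are projected back onto the ball's surface. All values are illustrative.
import numpy as np

BALL_CENTER = np.array([0.0, 1.2, 0.5])   # ball position in world space (m)
BALL_RADIUS = 0.12                        # ball radius (m)

def constrain_hand_to_ball(tracked_pos: np.ndarray) -> np.ndarray:
    """Project a tracked hand position onto the ball surface if it
    penetrates the ball; otherwise return it unchanged."""
    offset = tracked_pos - BALL_CENTER
    dist = np.linalg.norm(offset)
    if dist >= BALL_RADIUS or dist == 0.0:
        return tracked_pos                # no penetration: render as tracked
    # Penetration: push the virtual hand out to the ball surface.
    return BALL_CENTER + offset / dist * BALL_RADIUS

# The wireframe hands in Figure 1 are drawn at the tracked positions; the
# opaque virtual hands are drawn at the constrained positions.
left_virtual = constrain_hand_to_ball(np.array([0.05, 1.2, 0.5]))   # penetrating
right_virtual = constrain_hand_to_ball(np.array([0.30, 1.2, 0.5]))  # outside
```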

There are several aspects that make it much more difficult to control the ball in the virtual environment than in reality:

  • The hands seen by the user are not his own hands, but more or less realistic reproductions.
  • The ball, too, is only a reproduction.
  • The positional accuracy of the reproductions of both the ball and the hands is in most cases not perfect.
  • The user is not really holding a ball (in fact, he or she is not holding anything); the user has to position his real hands according to the visual movement of the reproductions of his hands. This stimulates his muscles completely differently than a real ball would (no weight and no counterforces from the ball).

A task that is simple in reality turns out to be very challenging in a VR environment, and the only way to enable the user to cope with these issues is training. Where noticeable differences between reality and the virtual world exist that cannot be overcome, it is not worthwhile to invest great effort in interaction methods that try to mirror reality as exactly as possible. In these cases, other methods of interaction should be considered. Examples from aircraft design are given below.

VR in Cockpit Development

Figure 2. A precise visualization of the cockpit elements requires proper positioning of the “virtual camera,” which is accomplished by tracking the user’s head.
The basic requirement here is to ensure the accuracy of the data generated in the VR environment. The initial aim was to carry out cockpit assessments in VR just as they are carried out in real aircraft, which meant reproducing the real-aircraft interaction within the VR environment. Two problem areas were identified: first, the visualization error caused by improper placement of the virtual camera, which is the user’s eye into the virtual world; and second, the error arising from an inaccurate hand model and an imprecise finger-tracking method.

The relevant data is obtained by tracking the user’s head (Figure 2). The tracking was done by an optical system, and the reflectors necessary for this task were mounted on the data helmet. The distance and orientation of the reflectors relative to the user’s real eyes depend on how the user wears the helmet. A series of trials showed that with the standard calibration of the helmet, more than 40% of the test subjects complained about an unnatural visual perception of the cockpit; the way users wore the helmet differed by up to 16°. With additional calibration, the virtual camera was positioned much more accurately at the user’s real eye position, independently of how the user wears the helmet.
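The camera placement described above amounts to composing the tracked helmet pose with a per-user offset from the reflectors to the eyes. The sketch below illustrates this with 4×4 homogeneous transforms; the function names and the calibration interface are assumptions for illustration, not the authors' implementation.

```python
# Sketch of a head-tracked virtual camera with per-user helmet calibration,
# using 4x4 homogeneous rigid transforms. Names are illustrative.
import numpy as np

def pose(rotation: np.ndarray, translation: np.ndarray) -> np.ndarray:
    """Build a 4x4 rigid transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

def calibrate_eye_offset(tracker_pose: np.ndarray,
                         measured_eye_pose: np.ndarray) -> np.ndarray:
    """Measured once per session: the fixed transform from the helmet
    reflectors to the user's eyes, for the way *this* user wears the helmet."""
    return np.linalg.inv(tracker_pose) @ measured_eye_pose

def virtual_camera_pose(tracker_pose: np.ndarray,
                        eye_offset: np.ndarray) -> np.ndarray:
    """Place the virtual camera at the user's real eye position each frame."""
    return tracker_pose @ eye_offset

# With only a standard (average) eye_offset, users who wear the helmet
# differently view the cockpit from a wrong viewpoint; the per-user
# calibration above removes that error.
```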

Haptics play an important role in cockpit development. To be able to assess haptic aspects, we built a piece of hardware flexible enough to represent all types of military cockpits and tactical working environments. The challenge was to ensure that the visual representation of objects in VR stayed within ±2 mm of their real positions in order to give the user a realistic impression. Since we also wanted to include a representation of the user’s thumb and index finger, it was necessary to reduce the visualization error not only of the cockpit elements but also of the hand, including the index finger and thumb.

With a standard data glove and a standardized hand model, it was not possible to achieve the required precision. The data glove was therefore replaced by custom-made hand/finger/thumb trackers compatible with the optical tracking system. Instead of the standardized hand model, we developed a flexible model that adapts to the real geometry of the user’s hand. In addition, we implemented a calibration process that measures, in three steps, the rough dimensions of the hand, the precise lengths of the index finger and thumb, and the position and orientation of the tracking sensors on the hand, index finger, and thumb. Together, the two calibration methods contributed to a high level of user acceptance.
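The following sketch shows one plausible shape for the three-step calibration data and how it could be used to place a fingertip. The data structure and the bone-axis convention are assumptions made for illustration; the article does not publish the actual model.

```python
# Sketch of the three-step hand calibration described above. Illustrative
# only: field names and conventions are assumptions, not the real model.
from dataclasses import dataclass
import numpy as np

@dataclass
class HandCalibration:
    hand_dimensions: np.ndarray   # step 1: rough width/length of the hand (m)
    index_length: float           # step 2: precise index-finger length (m)
    thumb_length: float           # step 2: precise thumb length (m)
    sensor_offsets: dict          # step 3: pose of each tracking sensor
                                  # relative to the bone it is mounted on

def fingertip_position(sensor_pose: np.ndarray,
                       sensor_offset: np.ndarray,
                       bone_length: float) -> np.ndarray:
    """Fingertip = tracked sensor pose, corrected by the calibrated mount
    offset, extended along the bone axis by the calibrated length."""
    bone_pose = sensor_pose @ np.linalg.inv(sensor_offset)
    tip_local = np.array([bone_length, 0.0, 0.0, 1.0])  # bone x-axis assumed
    return (bone_pose @ tip_local)[:3]
```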

VR For Maintenance Applications

For maintenance applications, much more complex interaction with the VR environment is necessary than for cockpit development. The user is not a pilot but a technician who has to operate tools to turn screws, and has to install and remove objects in difficult-to-access positions, sometimes in a cramped environment and with an uncomfortable posture. We evaluated five methods to find out which interaction methods are adequate for a maintainability engineer. The task for each subject was to mount an LRU (Line Replaceable Unit, a box of electronics such as a radio or other auxiliary equipment) inside the avionics compartment of an aircraft.

  1. Contact Simulation without Haptic Feedback: The first alternative was to use a data glove for interaction in combination with a contact simulation. Visualization was provided by the data helmet, and no haptic feedback was available. To move the LRU, the user grabs it with his virtual hand; a grasp occurs if both the index finger and the thumb of the virtual hand are in contact with the LRU (a minimal sketch of this grasp test follows the list). Once held in the virtual hand, the position and orientation of the LRU are controlled by the points where the index finger and the thumb touch the LRU. It was also possible to grab the LRU with two hands, wearing two data gloves; in this case, the position and orientation of the LRU are controlled by the points where the palms touch the LRU. The contact simulation ensures that, if a collision between the LRU and the aircraft structure occurs, a compensating movement is calculated.
  2. LRU Tracking without Contact Simulation: The second option was to skip the contact simulation and use a real-sized model of the LRU made of polystyrene. It was fitted with an additional tracking sensor so that the movement of the real LRU and that of its virtual representation matched. The user wore two data gloves, which were visualized by two virtual hands. The user was able to both see and feel the LRU, and every test subject was able to move the LRU to its end position in a very short time.
  3. Force Feedback Device plus Contact Simulation: The third method applied an active force feedback device (FFD) comprising a robotic arm whose joints were controlled by electric motors. The FFD exerted forces on the user, and the range of the robotic arm was comparable to that of a human arm. A mockup of the LRU was mounted at the end of the FFD arm. The contact simulation linked to the FFD ensured that, in case of a collision between the LRU and the aircraft structure, the corresponding force was transmitted to the user. The LRU could be grabbed with one or both hands. The FFD proved suitable for supporting the movement of the LRU to its end position, but only if all six degrees of freedom (rotation and translation) were implemented.
  4. Space Mouse with Contact Simulation: The fourth method used a SpaceMouse to control the movement of the LRU in combination with a contact simulation. The SpaceMouse allows the user to move an object in all six degrees of freedom. This way of controlling objects is very different from the way it is done in reality, which makes it unsuitable for training applications, but it provides a promising option for the maintainability engineer to assess mounting paths.
  5. Flying Mouse with Contact Simulation: The last method used a FlyingMouse to control the movement of the LRU in combination with a contact simulation; the visualization was done in a four-sided CAVE system. The FlyingMouse has the form of a handle; it contains one or more triggers and a tracking sensor that picks up its position and orientation in space. By pressing a trigger, the user ties the virtual LRU to the FlyingMouse, and the LRU then moves with it: if the FlyingMouse moves one meter to the left, the LRU does the same. When the trigger is released, the LRU and the FlyingMouse are decoupled, and moving the FlyingMouse no longer moves the LRU (a sketch of this coupling also follows the list).
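As referenced in method 1, here is a minimal sketch of the grasp rule: the LRU counts as grasped while both the index fingertip and the thumb tip are in contact with it. The point-against-bounding-box contact test and the tolerance value are illustrative simplifications of a real contact simulation, not the authors' implementation.

```python
# Sketch of the two-finger grasp test from method 1. Contact is tested as
# a point within a small tolerance of the LRU's axis-aligned bounding box.
import numpy as np

def in_contact(point: np.ndarray, box_min: np.ndarray, box_max: np.ndarray,
               tolerance: float = 0.005) -> bool:
    """Treat a fingertip as touching the LRU if it is within `tolerance`
    meters of the LRU's bounding box (tolerance value is illustrative)."""
    clamped = np.clip(point, box_min, box_max)
    return np.linalg.norm(point - clamped) <= tolerance

def is_grasped(index_tip: np.ndarray, thumb_tip: np.ndarray,
               box_min: np.ndarray, box_max: np.ndarray) -> bool:
    """A grasp occurs only when both fingertips contact the LRU."""
    return (in_contact(index_tip, box_min, box_max)
            and in_contact(thumb_tip, box_min, box_max))

# While is_grasped() holds, the LRU pose is driven by the two contact
# points; on a collision with the aircraft structure, the contact
# simulation computes a compensating movement instead of penetration.
```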
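And, as referenced in method 5, a sketch of the trigger-based coupling between the FlyingMouse and the LRU. The transform bookkeeping shown is an assumption about how such a coupling is commonly implemented, using the same 4×4 rigid transforms as the earlier sketches.

```python
# Sketch of the FlyingMouse coupling from method 5: while the trigger is
# held, the LRU keeps the relative pose it had at the moment of the press;
# releasing the trigger decouples it. Names are illustrative.
import numpy as np

class FlyingMouseCoupling:
    def __init__(self):
        self.grab_offset = None   # LRU pose relative to the mouse, or None

    def trigger_pressed(self, mouse_pose: np.ndarray, lru_pose: np.ndarray):
        # Freeze the relative pose so the LRU follows the mouse rigidly.
        self.grab_offset = np.linalg.inv(mouse_pose) @ lru_pose

    def trigger_released(self):
        self.grab_offset = None   # mouse movement no longer moves the LRU

    def update(self, mouse_pose: np.ndarray, lru_pose: np.ndarray) -> np.ndarray:
        if self.grab_offset is None:
            return lru_pose                   # decoupled: LRU stays put
        return mouse_pose @ self.grab_offset  # moves 1:1 with the mouse
```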

Interaction should always be as simple as possible. It is easier for a user to work within the VR environment and manage the complexity of a particular task if the interaction is intuitive. The selection of the interaction method is therefore essential to the success of a VR application.

This article was written by Leonhard Vogelmeier, Harald Neujahr, and Dr. Peter Sandl of EADS Deutschland GmbH, Munich, Germany.