Active Interaction Devices

Author: Jack Hsu


Introduction

Traditionally, computers use input devices ranging from switches and punched cards to keyboards and mice. However, such devices fall short when it comes to treading the pathways of a virtual world. For example, how would you express the simple action of drinking a cup of water using a mouse? Typing a command with the keyboard may come to mind, but that is a cumbersome solution: you would have to specify which cup to drink from and how much to drink, and you would also have to learn the correct syntax to convey the information to the computer. This is definitely not a simple task to perform with a keyboard!

Virtual reality departs from conventional Human-Computer Interaction (HCI) and naturally requires a different set of user input tools. This paper will examine the various types of input devices that have been developed for use with virtual reality.

Gloves

Our hands are the main tools we use to manipulate and interact with our environment. Therefore it is natural that in the world of virtual reality, we should also be able to use our hands to explore and manipulate objects. A glove device is designed specifically to capture the movement and location of the hand for this purpose.

When you move your hand, the glove picks up the movement and sends an electrical signal to the computer, which then translates the movement from real space into virtual space. Often, you are able to see your virtual hand in the virtual world, which greatly aids the hand-eye coordination necessary for any kind of positioning in a three-dimensional world.

Besides using the glove to perform the tasks we normally use our hands for in the real world, researchers soon realized that the same glove could be used for architectural walk-throughs, scientific visualization, or any number of other applications. [AUKS92] This is done by assigning meaning to hand gestures.

DataGlove

A DataGlove is made of lightweight Lycra and consists of two measurement tools. The first tool measures the flex and extension of each finger. It does so using a set of fiber-optic cables that run along each finger, with a photo-sensor at one end and a light-emitting diode (LED) at the other. When a person wearing the glove bends a finger, light from the LED escapes through small holes in the cable's sheath. Less light therefore reaches the photo-sensor, which generates a weaker electrical signal. In this way, the computer receives input about which fingers are bent, and by how much.
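As a rough illustration, the sketch below (in Python, with invented sensor values and function names, not the DataGlove's actual software) shows how a photo-sensor reading might be mapped to a bend estimate once the glove has recorded readings for a flat hand and a clenched fist.

    def bend_fraction(reading, open_reading, fist_reading):
        """Map a raw photo-sensor value to 0.0 (straight) .. 1.0 (fully bent).

        Bending lets more light escape the fiber, so the reading drops."""
        span = open_reading - fist_reading
        fraction = (open_reading - reading) / span
        return max(0.0, min(1.0, fraction))  # clamp against sensor noise

    # Hypothetical per-user calibration readings for one finger.
    OPEN_HAND, CLENCHED_FIST = 950, 310

    print(bend_fraction(950, OPEN_HAND, CLENCHED_FIST))  # 0.0  -> finger straight
    print(bend_fraction(630, OPEN_HAND, CLENCHED_FIST))  # ~0.5 -> finger half bent
    print(bend_fraction(310, OPEN_HAND, CLENCHED_FIST))  # 1.0  -> finger fully bent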

The second tool measures the absolute position (X, Y, and Z) and the orientation (roll, pitch, and yaw) of the hand. This tool has two parts: a stationary transmitter and a receiver placed on the glove. Both the transmitter and the receiver are made up of three coils of wire set at right angles to one another. An electrical current passing through the transmitter coils creates a magnetic field. When the glove moves, the field induces three distinct electrical signals in the receiver coils. By measuring these signals, the computer can calculate the position and orientation of the glove. This system is known as the Polhemus magnetic positioning system. [AUKS92]
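The tracker's output can be thought of as a six-number sample: X, Y, Z plus roll, pitch, and yaw. The sketch below (hypothetical values, and one common roll-pitch-yaw convention chosen arbitrarily) shows how such a sample could be used to place a point of the virtual hand into the virtual world.

    import math

    def rotation_matrix(roll, pitch, yaw):
        """3x3 rotation matrix from roll (about X), pitch (about Y), yaw (about Z), in radians."""
        cr, sr = math.cos(roll), math.sin(roll)
        cp, sp = math.cos(pitch), math.sin(pitch)
        cy, sy = math.cos(yaw), math.sin(yaw)
        # Composition order chosen here: R = Rz(yaw) * Ry(pitch) * Rx(roll).
        return [
            [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
            [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
            [-sp,     cp * sr,                cp * cr],
        ]

    def place_on_virtual_hand(tracker_sample, local_point):
        """Transform a point defined relative to the glove into virtual-world coordinates."""
        x, y, z, roll, pitch, yaw = tracker_sample
        r = rotation_matrix(roll, pitch, yaw)
        px, py, pz = local_point
        return (
            r[0][0] * px + r[0][1] * py + r[0][2] * pz + x,
            r[1][0] * px + r[1][1] * py + r[1][2] * pz + y,
            r[2][0] * px + r[2][1] * py + r[2][2] * pz + z,
        )

    # Hypothetical sample: hand one meter in front of the transmitter, turned 90 degrees.
    sample = (0.0, 0.0, 1.0, 0.0, 0.0, math.pi / 2)
    print(place_on_virtual_hand(sample, (0.1, 0.0, 0.0)))  # a fingertip offset, placed in the world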

One problem with the DataGlove is that it requires recalibration for each user as it is very sensitive to knuckle position. Also, it is about one hundred times more expensive than the PowerGlove, which will be described next.

PowerGlove

A PowerGlove is a low-cost version of the DataGlove that performs the same functions using completely different methods. Originally, it was sold as a game controller for the Nintendo Entertainment System, but because of its relatively low cost, the PowerGlove has also quickly found its way into a number of virtual reality research facilities around the world [AUKS92].

For flex measurement, the PowerGlove has a strip of mylar plastic coated with electrically conductive ink running along each finger. When a finger is flexed, the strip's electrical resistance changes, and the change corresponds to how far the finger is bent.
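The strip behaves like a variable resistor, so it can be read with an ordinary voltage divider. The following sketch uses invented supply and resistor values to show the idea: the measured voltage changes with the strip's resistance, and that resistance indicates how far the finger is bent.

    def divider_voltage(v_supply, r_fixed, r_strip):
        """Voltage measured across the flex strip in a simple voltage divider."""
        return v_supply * r_strip / (r_fixed + r_strip)

    def strip_resistance(v_measured, v_supply, r_fixed):
        """Invert the divider to recover the strip's resistance from the measured voltage."""
        return r_fixed * v_measured / (v_supply - v_measured)

    # Invented values: 5 V supply, 10 kOhm fixed resistor,
    # strip swinging from 25 kOhm (flat) to 70 kOhm (fully bent).
    V_SUPPLY, R_FIXED = 5.0, 10_000
    for r_strip in (25_000, 45_000, 70_000):
        v = divider_voltage(V_SUPPLY, R_FIXED, r_strip)
        print(f"strip {r_strip // 1000} kOhm -> {v:.2f} V "
              f"-> recovered {strip_resistance(v, V_SUPPLY, R_FIXED) / 1000:.0f} kOhm")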

For absolute position and orientation, the PowerGlove uses a simpler ultrasonic positioning technique. Fixed receivers pick up signals from two ultrasonic transmitters on the glove, and the travel times of those signals are translated into a position in space.
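Ultrasonic positioning works by timing how long each ultrasonic "ping" takes to reach the fixed receivers; multiplying by the speed of sound gives a distance to each receiver, and the transmitter's location follows from trilateration. The sketch below uses made-up receiver positions; locating each of the glove's two transmitters this way also yields a rough orientation from the line between them.

    import numpy as np

    SPEED_OF_SOUND = 343.0  # meters per second, at room temperature

    def trilaterate(receivers, times_of_flight):
        """Estimate a transmitter's position from ping travel times to four or more fixed receivers."""
        d = SPEED_OF_SOUND * np.asarray(times_of_flight)
        r = np.asarray(receivers, dtype=float)
        # Subtracting the first receiver's distance equation from the others
        # turns the problem into a small linear system.
        a = 2.0 * (r[1:] - r[0])
        b = (np.sum(r[1:] ** 2, axis=1) - np.sum(r[0] ** 2)) - (d[1:] ** 2 - d[0] ** 2)
        position, *_ = np.linalg.lstsq(a, b, rcond=None)
        return position

    # Invented receiver positions (meters) around the monitor, and a glove transmitter at (0.3, 0.2, 0.5).
    receivers = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
    true_position = np.array([0.3, 0.2, 0.5])
    tof = [np.linalg.norm(true_position - np.array(rx)) / SPEED_OF_SOUND for rx in receivers]
    print(trilaterate(receivers, tof))  # approximately [0.3, 0.2, 0.5]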

Like the DataGlove, the PowerGlove needs recalibration for different users. Also, it is less accurate than the DataGlove. However, the PowerGlove is more rugged and easier to use than the DataGlove.

Dexterous Hand Master

A dexterous hand master (DHM) is an exoskeleton attached to the fingers with velcro straps. Attached to each finger joint is a device called a Hall effect sensor, whose purpose is to measure the finger-joint angle. Instead of optical fibers or resistive strips, a DHM uses mechanical linkages to track the movement of the hand.

Not only is a DHM more accurate than a PowerGlove or a DataGlove, it can also measure the radial-ulnar deviation (side-to-side motion) of each finger. In addition, it measures all three sections of a human finger, rather than the two that the DataGlove and PowerGlove can measure. This precision makes it extremely useful for any application that requires a high level of control, such as controlling dexterous robotic hands.
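Because a DHM reports an angle for every joint of a finger, the fingertip's position can be computed with straightforward forward kinematics. The sketch below is a simplified planar version with invented joint angles and segment lengths.

    import math

    def fingertip_position(joint_angles, segment_lengths):
        """Planar forward kinematics: accumulate the three joint angles along the finger's segments."""
        x = y = 0.0
        cumulative_angle = 0.0
        for angle, length in zip(joint_angles, segment_lengths):
            cumulative_angle += angle
            x += length * math.cos(cumulative_angle)
            y += length * math.sin(cumulative_angle)
        return x, y

    # Invented Hall-effect readings (converted to radians) and segment lengths in centimeters.
    angles = (math.radians(30), math.radians(45), math.radians(20))
    lengths = (4.5, 2.5, 2.0)
    print(fingertip_position(angles, lengths))  # fingertip location in the plane of the finger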

DHMs are also less sensitive than either the DataGlove or the PowerGlove to different hand sizes and to placement on the fingers. However, they are rather clunky to work with.

Mice and joysticks

Mice and joysticks are popular input devices for computer systems. They are sufficient for navigating around a simple virtual world in two dimensions and for performing simple tasks using the buttons on the devices. In fact, limiting the amount of control the user has may actually simplify a given task, as there are fewer variables to deal with.

Mice and joysticks usually have two degrees of freedom, although there are mice designed with six degrees of freedom. These 6D mice are tracked ultrasonically, electromagnetically, or gyroscopically.

Wands

A wand is like a joystick that is not fixed to a base, and it has six degrees of freedom. A wand also has buttons and a thumbwheel that allows scalar values to be entered.

In the virtual world, a wand does not necessarily have to appear as a pointing device. It can be represented as a drill, paintbrush, spray gun, or even an ice-cream cone. One of the strengths of virtual environments is that the appearance of an object can be whatever the designer chooses -- it need bear no resemblance to the object's physical appearance. [PIME93]

A wand is very easy and intuitive to use. Regardless of its representation in the virtual world, most of the actions involve just 'point and click' with the wand.
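Under the hood, "point and click" usually means casting a ray from the wand's tracked position along its pointing direction and selecting the first object the ray hits. The sketch below illustrates this with a hypothetical scene of sphere-shaped objects; it is only one possible way to implement wand picking.

    import math

    def ray_hits_sphere(origin, direction, center, radius):
        """Distance along the ray to a sphere, or None if the ray misses it.

        The direction vector is assumed to be unit length."""
        ox, oy, oz = (origin[i] - center[i] for i in range(3))
        b = 2.0 * (direction[0] * ox + direction[1] * oy + direction[2] * oz)
        c = ox * ox + oy * oy + oz * oz - radius * radius
        discriminant = b * b - 4.0 * c
        if discriminant < 0:
            return None
        t = (-b - math.sqrt(discriminant)) / 2.0
        return t if t > 0 else None

    def pick(wand_position, wand_direction, objects):
        """Select the nearest object the wand is pointing at, if any."""
        hits = []
        for name, center, radius in objects:
            t = ray_hits_sphere(wand_position, wand_direction, center, radius)
            if t is not None:
                hits.append((t, name))
        return min(hits)[1] if hits else None

    # Invented scene: the wand at the origin, pointing along the Z axis.
    scene = [("drill", (0.0, 0.0, 2.0), 0.3), ("paintbrush", (1.5, 0.0, 2.0), 0.3)]
    print(pick((0.0, 0.0, 0.0), (0.0, 0.0, 1.0), scene))  # -> drill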

Force (space) balls

A force ball has a ball that you can apply force to, although the ball itself does not actually move. The force you apply is picked up by sensors in the center of the ball, and the information is then relayed to the computer. A force ball has six degrees of freedom.

A force ball is easy and intuitive to use: you simply push the ball in the direction you want to move. Typically, a user becomes comfortable with the device after fifteen or twenty minutes of use. A force ball also requires very little space, since the ball itself does not move, and you do not have to hold it in midair as you would a 6D mouse. Most force balls have programmable buttons that a developer can configure to suit the needs of the application.
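One common way to turn the sensed forces into navigation is to treat them as velocities: the harder you push, the faster you move, and twisting the ball turns your viewpoint. The sketch below is a per-frame update with invented gains and a dead zone to keep the view from drifting.

    DEAD_ZONE = 0.5   # newtons; ignore tiny forces so the viewpoint does not drift
    MOVE_GAIN = 0.02  # meters per frame per newton
    TURN_GAIN = 0.5   # degrees per frame per newton-meter

    def filtered(value, threshold=DEAD_ZONE):
        return 0.0 if abs(value) < threshold else value

    def update_viewpoint(position, heading, forces, twist):
        """Advance the viewpoint one frame from the ball's sensed push and twist."""
        fx, fy, fz = (filtered(f) for f in forces)
        x, y, z = position
        new_position = (x + MOVE_GAIN * fx, y + MOVE_GAIN * fy, z + MOVE_GAIN * fz)
        new_heading = heading + TURN_GAIN * filtered(twist)
        return new_position, new_heading

    # Hypothetical frame: pushing forward along Z while twisting slightly to turn.
    print(update_viewpoint((0.0, 0.0, 0.0), 0.0, (0.0, 0.0, 4.0), 1.2))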

However, a force ball is limited to navigation and selection; it is not well suited to richer interaction or to issuing commands.

Biological input sensors

Biological input sensors use dermal electrodes to detect the activity of particular muscles. For example, electrodes can be placed near the eyes so that simple eye movements can be used to navigate through virtual worlds.
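As a toy illustration only (the signal levels, thresholds, and command names are invented), electrode readings for horizontal and vertical eye movement could be thresholded into discrete steering commands:

    THRESHOLD = 120.0  # microvolts; invented level separating a deliberate glance from noise

    def steering_command(horizontal_uv, vertical_uv):
        """Map horizontal and vertical eye-movement readings to a navigation command."""
        if horizontal_uv > THRESHOLD:
            return "turn right"
        if horizontal_uv < -THRESHOLD:
            return "turn left"
        if vertical_uv > THRESHOLD:
            return "look up"
        if vertical_uv < -THRESHOLD:
            return "look down"
        return "hold course"

    # Invented electrode samples.
    for sample in ((150.0, 10.0), (-20.0, -180.0), (30.0, 40.0)):
        print(sample, "->", steering_command(*sample))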

It may also become possible in the future to detect hand movement with a bracelet that senses hand-muscle activity, replacing the more cumbersome glove.

Voice recognition

Voice recognition has been under development for over twenty years, and it would be a very desirable input method for virtual reality. After all, studies have shown that speech is the most rapid form of communication, no matter how fast you can type. [AUKS92]

Although we have come a long way in bringing voice recognition technology from mainframes to personal computers, today's systems are still far from perfect, especially when it comes to understanding continuous speech. Most systems require training, in which the user repeats a word many times in various ways so that the computer can recognize the different patterns of the sound. It should be noted that the computer does not actually understand the word; it merely stores the digitized pattern of the word for later comparison against a voice command. In addition, most systems are limited to a vocabulary of a few hundred words. [RHEI91]
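That stored-pattern approach can be sketched as nearest-template matching: each training utterance is kept as a sequence of feature values, and an incoming word is assigned to the closest stored template. The features and vocabulary below are invented, and real systems compare spectral features; dynamic time warping is used here because it tolerates a word being spoken faster or slower than its template.

    def dtw_distance(a, b):
        """Dynamic time warping distance between two feature sequences, so a word
        spoken a little faster or slower still matches its stored template."""
        inf = float("inf")
        cost = [[inf] * (len(b) + 1) for _ in range(len(a) + 1)]
        cost[0][0] = 0.0
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                step = abs(a[i - 1] - b[j - 1])
                cost[i][j] = step + min(cost[i - 1][j], cost[i][j - 1], cost[i - 1][j - 1])
        return cost[len(a)][len(b)]

    def recognize(utterance, templates):
        """Return the vocabulary word whose stored pattern is nearest to the utterance."""
        return min(templates, key=lambda word: dtw_distance(utterance, templates[word]))

    # Invented one-dimensional "feature tracks" recorded during training.
    templates = {
        "fly":  [0.1, 0.4, 0.9, 0.4, 0.1],
        "stop": [0.8, 0.8, 0.2, 0.1, 0.1],
    }
    spoken = [0.1, 0.5, 0.8, 0.9, 0.5, 0.2]  # the user says "fly", slightly stretched
    print(recognize(spoken, templates))       # -> fly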

Voice recognition as an input method is limited in two ways. First, there are times when it actually decreases efficiency. Suppose you want to draw a line: it is faster to enter the line's coordinates with a keyboard than to say "Draw a line starting from coordinates such and such to coordinates such and such." Second, the computer may have difficulty understanding words in different contexts, or words that sound alike. For example, "You have an e-mail" may be understood as "You have any male."

Conclusion

Different input devices are suitable for different tasks, but one requirement for any device to work well in VR is that the lag between an action and its effect on the screen cannot be too great. For example, if you move your wand and the view changes direction only five seconds later, you can easily become confused and the device becomes hard to use.
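As an illustration, a VR system might time-stamp each input event and compare it with the moment the corresponding frame is displayed; the 100-millisecond budget below is an invented figure used only to make the sketch concrete.

    import time

    LAG_BUDGET = 0.100  # seconds; an invented upper bound before the device starts to feel sluggish

    def check_lag(input_time, display_time):
        """Compare when an input happened with when its effect reached the screen."""
        lag = display_time - input_time
        if lag > LAG_BUDGET:
            print(f"warning: {lag * 1000:.0f} ms motion-to-display lag exceeds the budget")
        return lag

    # Hypothetical frame: the wand moves, and the updated view is drawn about 30 ms later.
    moved_at = time.monotonic()
    time.sleep(0.030)  # stand-in for tracking, simulation, and rendering work
    drawn_at = time.monotonic()
    print(f"lag this frame: {check_lag(moved_at, drawn_at) * 1000:.0f} ms")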

References

[AUKS92] Steve Aukstakalnis and David Blatner, "Silicon Mirage: The Art and Science of Virtual Reality", Peachpit Press, 1992.

[PIME93] Ken Pimentel and Kevin Teixeira, "Virtual Reality: Through the New Looking Glass", Intel/Windcrest/McGraw-Hill, 1993.

[RHEI91] Howard Rheingold, "Virtual Reality", Touchstone, 1991.



