Military applications of virtual reality

Author: Jim Baumann


One of the first areas where virtual reality found practical application was military training and operations. In this article, we will explore three views of military applications of virtual reality: as a simulation of reality, as an extension of human senses through telepresence, and as an information enhancer through augmented reality.

Virtual Reality in military simulations

Some of the earliest simulators used in a military environment were the flight trainers built by the Link Company in the late 1920s and 1930s. These trainers looked like sawed-off coffins mounted on a pedestal, and were used to teach instrument flying. The darkness inside the trainer cockpit, the realistic readings on the instrument panel, and the motion of the trainer on the pedestal combined to produce a sensation similar to actually flying on instruments at night. The Link trainers were very effective tools for their intended purpose, teaching thousands of pilots the night flying skills they needed before and during World War II.

To move beyond the instrument flying domain, simulator designers needed a way to produce a view of the outside world. The first example of a simulator with an outside view appeared in the 1950s, when television and video cameras became available. With this equipment, a video camera could be 'flown' over a scale model of the terrain around an airport, and the resulting image was sent to a television monitor placed in front of the pilot in the simulator. His movement of the control stick and throttle produced corresponding movement of the camera over the terrain board. Now the pilot could receive visual feedback both inside and outside the cockpit.

The logical extension of the video camera/television monitor approach was to use multiple monitors to simulate the entire field of view from the airplane cockpit. This method is still in use for transport aircraft simulators, where the field of view needs to be only about 180 degrees horizontally and 60 degrees vertically. For fighter aircraft simulators, the field of view must be at least 180 degrees both horizontally and vertically. For these applications, the simulator consists of a cockpit placed at the center of a domed room, and the virtual images are projected onto the inside surface of the dome. These types of simulators have proven to be very effective training aids by themselves, and a newer innovation, a project called SIMNET, electronically connects two or more simulators to produce a distributed simulation environment. [McCarty, 1993] Distributed simulations can be used not only for training, but also to develop and test new combat strategy and tactics. A significant development in this area is an IEEE data protocol standard for distributed interactive simulations. [IEEE, 1993] This standard allows a distributed simulation to include not only aircraft, but also land-based vehicles and ships. Another recent development is the use of head-mounted displays (HMDs) to decrease the cost of wide field of view simulations. [McCarty, 1993]
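
The core idea behind such a protocol is that each simulator periodically broadcasts a small state message (a "protocol data unit") describing its own vehicle, and every other simulator on the network uses those messages to draw and extrapolate the remote vehicles locally. The sketch below illustrates that pattern with a deliberately simplified message; the field names, types, and packing format are assumptions chosen for clarity, not the actual IEEE PDU layout.

    import socket
    import struct
    from dataclasses import dataclass

    # Simplified, illustrative entity-state message. This is NOT the real
    # IEEE PDU layout; the fields and format are assumptions for clarity.
    @dataclass
    class EntityState:
        entity_id: int     # unique identifier of the simulated vehicle
        entity_kind: int   # e.g. 1 = aircraft, 2 = land vehicle, 3 = ship
        x: float; y: float; z: float               # position (m, shared world frame)
        vx: float; vy: float; vz: float            # velocity (m/s)
        heading: float; pitch: float; roll: float  # orientation (radians)

        FORMAT = "!IB9d"  # network byte order: uint32, uint8, nine doubles

        def pack(self) -> bytes:
            return struct.pack(self.FORMAT, self.entity_id, self.entity_kind,
                               self.x, self.y, self.z,
                               self.vx, self.vy, self.vz,
                               self.heading, self.pitch, self.roll)

        @classmethod
        def unpack(cls, data: bytes) -> "EntityState":
            return cls(*struct.unpack(cls.FORMAT, data))

    def broadcast_state(state: EntityState, port: int = 3000) -> None:
        # Each simulator broadcasts its own state over UDP; the others listen
        # and extrapolate (dead-reckon) remote entities between updates.
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)
        sock.sendto(state.pack(), ("<broadcast>", port))

Because every participant speaks the same message format, a flight simulator, a tank simulator, and a ship simulator built by different organizations can share one virtual battlefield.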

Telepresence for military missions

Two fairly obvious reasons have driven the military to explore and employ telepresence in its operations: to reduce exposure to hazards and to increase stealth. Many aspects of combat operations are very hazardous, and they become even more dangerous if the combatant seeks to improve his performance. Prime examples of this principle are firing weapons and performing reconnaissance. To perform either of these tasks well takes time, and this is usually time when the combatant is exposed to hostile fire. Smart weapons and remotely piloted vehicles (RPVs) were developed to address this problem.

Some smart weapons are autonomous, while others are remotely controlled after they are launched. This allows the shooter and weapon controller to launch the weapon and immediately seek cover, thus decreasing their exposure to return fire. In the case of RPVs, the operator not only has the advantage of being in a safer place; the RPV itself can also be made smaller than a vehicle that would carry a man, making it more difficult for the enemy to detect.

Military information enhancement

In a dynamic combat environment, it is imperative to supply the pilot or tank commander with as much of the necessary information as possible while reducing the amount of distracting information. This goal led the Air Force to develop the head-up display (HUD), which optically combines critical information (altitude, airspeed, heading) with an unobstructed view through the forward windscreen of a fighter aircraft. With the HUD, the pilot never has to look down at his instruments. When the HUD is coupled with the aircraft's radar and other sensors, a synthetic image of an enemy aircraft can be displayed on the HUD to show the pilot where that aircraft is, even though the pilot may not be able to see the actual aircraft with his unaided eyes. This combination of real and virtual views of the outside world can be extended to nighttime operations. Using an infrared camera mounted in the nose of the aircraft, an enhanced view of the terrain ahead of the aircraft can be projected onto the HUD. The effect is to give the pilot a 'daylight' window through which he has both a real and an enhanced view of the nighttime terrain and sky.
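
Placing such a synthetic target symbol amounts to a small geometry problem: the radar reports where the target is relative to the aircraft, and the display converts that relative position into azimuth and elevation angles mapped onto the HUD's field of view. The sketch below works through that conversion under simplifying assumptions (a flat HUD centered on the aircraft's boresight, a target position already expressed in the aircraft's body frame); it is an illustration only, not an actual avionics interface.

    import math
    from typing import Optional, Tuple

    def hud_symbol_position(x: float, y: float, z: float,
                            fov_h_deg: float = 20.0,
                            fov_v_deg: float = 20.0) -> Optional[Tuple[float, float]]:
        """Map a target position in the aircraft body frame (x forward, y right,
        z down, in meters) to normalized HUD coordinates in [-1, 1].

        Returns None when the target falls outside the HUD field of view; a
        real system would instead clamp the symbol to the HUD edge as a cue.
        """
        if x <= 0.0:
            return None  # target is behind the aircraft

        azimuth = math.degrees(math.atan2(y, x))     # positive = right of the nose
        elevation = math.degrees(math.atan2(-z, x))  # positive = above the nose

        u = azimuth / (fov_h_deg / 2.0)    # -1 .. 1 across the HUD width
        v = elevation / (fov_v_deg / 2.0)  # -1 .. 1 across the HUD height
        if abs(u) > 1.0 or abs(v) > 1.0:
            return None
        return (u, v)

    # Example: a radar contact 8 km ahead, 700 m to the right, 300 m above.
    print(hud_symbol_position(8000.0, 700.0, -300.0))

The same kind of mapping registers the infrared imagery described above to the pilot's forward view, so that the enhanced scene overlays the real one.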

In some cases, the pilot may need to focus totally on the virtual information and completely exclude the actual view. Work in this area has been pioneered by Thomas Furness III and others at Wright Laboratories, Wright-Patterson Air Force Base, Ohio. This work, dubbed the Super Cockpit, involved not only a virtual view of the outside world, but also of the cockpit itself, where the pilot would select and manipulate virtual controls using hand gestures. [Furness, 1986]

References

Furness, T. A. (1986). The Super Cockpit and Its Human Factors Challenges. Proceedings of the Human Factors Society. 30th Annual Meeting, (pp. 48-52). Santa Monica, CA: Human Factors Society.

IEEE. (1993). IEEE Standard for Information Technology - Protocols for Distributed Interactive Simulation Applications. 12 May 1993.

McCarty, W. D., Sheasby, S., Amburn, P., Stytz, M. R., & Switzer, C. (1993). A Virtual Cockpit for a Distributed Interactive Simulation Environment. Unpublished paper, Air Force Institute of Technology, 30 September 1993.

Platt, P. A., Dahn, D. A., & Amburn, P. (1991). Low-Cost Approaches to Virtual Flight Simulation. Proceedings of the IEEE 1991 National Aerospace and Electronics Conference NAECON 1991 Vol. 23, (pp. 940-6). New York, NY: IEEE.



