Real-time simulators

Authors: Claudio Esperanca and Ninad Jog


In a typical Virtual Reality system, the user is immersed in a simulated world that exists only as an abstraction that comes to life by means of sensory stimulation. His actions in that virtual world are, for the most part, innocuous with respect to the "real" world.

On the other hand, Virtual Reality lends itself to applications where the artificial environment perceived by the user is a simulacrum of some real environment. To maintain the correspondence between the two, changes in one environment must be mapped to the other. Changes in the real world are detected by sensors that feed the Reality Engine, which in turn modifies the world as seen by the user. In the same manner, user actions that cause some modification of the virtual world also trigger actuators that will effect the change in the real world. (See Figure II.B.1.)

One example of the situation we just described is Tom Furness' "Super-Cockpit" project. The virtual world in this case is an artificial environment containing objects such as the landscape over which the airplane is flying, controls, instruments, etc. As the plane travels, the real-world landscape changes and so must the mock-up landscape in the pilot's artificial world. If the pilot decides to turn left at some point, the instruments in his virtual cockpit must reflect the change in course and the airplane itself must be steered to the new direction.
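The sensor-to-actuator loop described above can be sketched in a few lines of code. This is only an illustrative sketch: the function and parameter names below are placeholders invented for this example, not part of any real Reality Engine, and the sensors and actuators are faked with plain lists.

```python
# A minimal, self-contained sketch of the real/virtual mapping loop.
# All names here are hypothetical placeholders, not a real VR toolkit.

def run_engine(sensor_readings, user_actions):
    """Map sensor readings into a virtual world model, and map user
    actions back out as actuator commands.
    Returns (virtual_world, actuator_commands)."""
    virtual_world = {}
    actuator_commands = []
    # Changes in the real world, detected by sensors, update the model:
    for name, value in sensor_readings:
        virtual_world[name] = value
    # User actions modify the virtual world and trigger actuators
    # that will effect the same change in the real world:
    for name, value in user_actions:
        virtual_world[name] = value
        actuator_commands.append((name, value))
    return virtual_world, actuator_commands

world, commands = run_engine(
    sensor_readings=[("heading", 90)],  # e.g. the plane's current course
    user_actions=[("heading", 45)],     # e.g. the pilot turns left
)
# The virtual cockpit now shows the new heading, and a command has been
# queued for the actuator that steers the real airplane.
```

In the Super-Cockpit terms used above, the sensor readings would come from the aircraft's instruments, and the actuator commands would go to its control surfaces.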

When the virtual world is merely a mirror of the real world, it is easy (at least in principle) to control the behavior of the objects as seen by the user, since they must ultimately behave exactly as their "real" counterparts. In most cases, though, there is no "real world" to speak of, and the way the objects behave must be explicitly programmed by the author of the VR system. This is what we call "simulation".

We may (loosely) characterize a simulation as being "real-time" if the user has the impression of seeing events happen at a rate compatible with continuous animation. The concept of "real-time" is thus connected with parameters such as the number of frames per second the system is capable of rendering, or the lag between a user action and the appearance of its effects in the scene.
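The frame-rate criterion amounts to a simple time budget. The sketch below makes this concrete; the 15 frames-per-second threshold is an illustrative assumption of ours, not a fixed standard, and different systems set different targets.

```python
# Rough real-time budget check. MIN_FPS is an assumed threshold for
# the impression of continuous animation; it is not a universal value.

MIN_FPS = 15  # assumed lower bound for "continuous" animation

def is_real_time(frame_time_ms):
    """True if rendering one frame fits within the per-frame budget."""
    budget_ms = 1000.0 / MIN_FPS  # ~66.7 ms per frame at 15 fps
    return frame_time_ms <= budget_ms

print(is_real_time(40))   # prints True: fits the budget
print(is_real_time(120))  # prints False: animation would look jerky
```

The same budget also bounds the acceptable lag: if incorporating a user action takes several frame times, the user perceives the delay even when the frame rate itself is adequate.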

Let's take, for instance, a virtual world consisting of a room with a rubber ball in it. The author of this virtual reality will probably program the object "ball" so that the user may grab it and throw it on the floor of the "room" and see it bounce. In this scenario, some properties of the component objects will have to be programmed: for example, the effect of gravity on the ball, the elasticity that makes it bounce, and the solidity of the floor it bounces against.

These (and possibly other) properties are usually set by the author as he/she programs the simulation. A more challenging project would be to let the user adjust these properties while "living" in the virtual world.
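How such programmed properties might drive the simulation can be seen in a minimal sketch of the bouncing ball. The gravity value and the restitution (bounciness) coefficient below are illustrative choices, standing in for the properties an author would set.

```python
# Minimal sketch of the bouncing-ball behavior described above.
# GRAVITY and RESTITUTION are assumed, author-set property values.

GRAVITY = -9.8     # m/s^2, pulling the ball down
RESTITUTION = 0.8  # fraction of speed kept after a bounce (rubber-like)

def step(height, velocity, dt=0.01):
    """Advance the ball one time step; bounce off the floor at height 0."""
    velocity += GRAVITY * dt
    height += velocity * dt
    if height <= 0.0:  # the ball hits the floor of the "room"
        height = 0.0
        velocity = -velocity * RESTITUTION  # reverse and damp the motion
    return height, velocity

# Drop the ball from 1 m and run the simulation for 10 seconds:
h, v = 1.0, 0.0
for _ in range(1000):
    h, v = step(h, v)
# Each bounce loses energy, so the ball eventually comes to rest.
```

A real-time simulator would execute one such step per rendered frame; letting the user adjust GRAVITY or RESTITUTION from inside the virtual world is exactly the "more challenging project" mentioned above.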




Human Interface Technology Laboratory