Virtual Reality and Scientific Computing at LBL




The VR Myth

Say the words "virtual reality" and most people will automatically form a mental image of a person wearing headgear, cybergloves, body suits, CAVEs, and so forth. When we analyze what makes up "VR," we can identify three general components:

  1. Input devices are used to specify six-dimensional information (where am I, where am I looking?) as well as discrete events such as picks, alongside the usual menu of keyboards, mice, and so on (a sketch of such a device sample appears after this list).
  2. Output devices are used to present information to the user. These include conventional monitors, such as garden-variety workstation RGB monitors, as well as sexier stereo hardware such as head-mounted displays, stereoscopic shutter glasses, and so forth. (Note that items 1 and 2 are hardware. The third, and really the most crucial, component is the software.)
  3. "VR Software" In "vr systems", the user interacts with some type of "virtual world" in a "virtual environment." Thus we can think of two broad classes of functionality in the software. The first is constructing, authoring, deriving, eetc. the virtual world itself. The second is the framework for establishing user interactions with the virtual world.

Consider an architectural walkthrough example. We have a building we want to "tour." Where did the building come from? It may come from an architect who may be doing research on the placement of windows, for example.

Then there are the interactions. When the person taking the tour "bumps" into the elevator, what happens? Does the door open and the elevator take them to another floor, or does it behave the same as a bare wall? One way to picture the software's role is that behavior is attached to geometry, as sketched below.
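
The following is a hedged sketch of that idea; the names (WorldObject, on_collide, and so on) are hypothetical and not drawn from any particular VR toolkit.

    /* Geometry plus optional behavior. A NULL callback behaves like a bare wall. */
    typedef struct WorldObject {
        const char *name;
        void (*on_collide)(struct WorldObject *self);
    } WorldObject;

    static void elevator_collide(WorldObject *self)
    {
        /* open the door, move the visitor to another floor, ... */
    }

    static WorldObject elevator = { "elevator", elevator_collide };
    static WorldObject wall     = { "wall",     NULL };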

The truth of the matter is that the essence of VR lies in the software, not the hardware. Whether fancier hardware actually provides an increased level of immersion is currently a topic of debate.

The VR interface, regardless of hardware implementation, can be thought of in loose terms as just another way of interacting with the user. The difference between VR interfaces and conventional interfaces can be likened to the difference between a Macintosh point-and-click, windows-based interface and a PC-DOS command-line interface. The same benefit can be derived: specification of six-dimensional data with a minimum of user effort. A tracker sample, for example, can be turned directly into a viewing or object transform, as sketched below.
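
The sketch below, assuming the hypothetical SixDofSample above and a unit quaternion stored as (x, y, z, w), shows how little work stands between the device and a usable transform; the function name and matrix layout are illustrative only.

    /* Build a row-major 4x4 transform from one device sample:
     * rotation from the quaternion, translation from the position. */
    void sample_to_matrix(const SixDofSample *s, float m[4][4])
    {
        float x = s->orientation[0], y = s->orientation[1];
        float z = s->orientation[2], w = s->orientation[3];

        m[0][0] = 1 - 2*(y*y + z*z);  m[0][1] = 2*(x*y - z*w);      m[0][2] = 2*(x*z + y*w);      m[0][3] = s->position[0];
        m[1][0] = 2*(x*y + z*w);      m[1][1] = 1 - 2*(x*x + z*z);  m[1][2] = 2*(y*z - x*w);      m[1][3] = s->position[1];
        m[2][0] = 2*(x*z - y*w);      m[2][1] = 2*(y*z + x*w);      m[2][2] = 1 - 2*(x*x + y*y);  m[2][3] = s->position[2];
        m[3][0] = 0;                  m[3][1] = 0;                  m[3][2] = 0;                  m[3][3] = 1;
    }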


History of VR in Scientific Computing

gif image of the Virtual Windtunnel project

The work that is perhaps most widely known is the Virtual Windtunnel project from NASA-Ames, circa 1991. The goal of this project was to demonstrate VR as an enabling technology, allowing aerospace engineers to study flow around airfoils. The system permitted the engineer to place, in a virtual space, a seed point from which particles were advected through the flow field or from which streamlines were computed (the core of such an advection loop is sketched below).
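
At its core, advecting a particle from a seed point is just repeated integration of the local velocity. The sketch below uses simple forward-Euler steps; sample_velocity() and emit() are hypothetical stand-ins for the flow-field interpolator and the renderer, and nothing here is taken from the Virtual Windtunnel code itself.

    /* Trace one particle from a seed point through a velocity field. */
    void advect(const float seed[3], float dt, int nsteps,
                void (*sample_velocity)(const float pos[3], float vel[3]),
                void (*emit)(const float pos[3]))
    {
        float p[3] = { seed[0], seed[1], seed[2] };
        float v[3];

        for (int i = 0; i < nsteps; i++) {
            sample_velocity(p, v);   /* velocity at the current position */
            p[0] += dt * v[0];
            p[1] += dt * v[1];
            p[2] += dt * v[2];
            emit(p);                 /* hand the new position to the display */
        }
    }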

The relevant lesson from this project, and from related work, is the usefulness of "direct manipulation interfaces." In the Virtual Windtunnel, the user sees an icon representing the seed point for the particle advection process, grabs it (a natural gesture), simply "moves" it to a new location while grasping it, and then lets go. This sort of interface is easy to use and requires no explanation. There is no concern for underlying coordinate systems or other particulars; the user can focus on studying the flow field rather than becoming bogged down in details. In code, such an interaction reduces to a small grab/drag/release loop, sketched below.
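
A minimal sketch of that grab/drag/release behavior, again assuming the hypothetical SixDofSample above; the grab radius, button mapping, and names are arbitrary illustrative choices, not the Virtual Windtunnel's actual implementation.

    #include <math.h>

    typedef enum { IDLE, GRABBED } GrabState;

    static int near_point(const float a[3], const float b[3])
    {
        float dx = a[0] - b[0], dy = a[1] - b[1], dz = a[2] - b[2];
        return sqrtf(dx*dx + dy*dy + dz*dz) < 0.05f;   /* arbitrary grab radius */
    }

    void update_seed(const SixDofSample *s, float seed[3], GrabState *state)
    {
        int grabbing = s->buttons & 0x1;               /* e.g. a glove "grasp" gesture */

        if (*state == IDLE && grabbing && near_point(s->position, seed)) {
            *state = GRABBED;                          /* grab */
        } else if (*state == GRABBED && grabbing) {
            seed[0] = s->position[0];                  /* drag: the icon follows the hand */
            seed[1] = s->position[1];
            seed[2] = s->position[2];
        } else if (*state == GRABBED && !grabbing) {
            *state = IDLE;                             /* release */
        }
    }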


Current Capabilities at LBL

gif image of the AVS visualization software package

At LBL, we have successfully interfaced low-cost (on the order of $1000) six-dimensional input devices to a wide variety of scientific computing processes. A scientific visualization software package called AVS allows a user to "write" a "program" in a visual programming language. This package is used at our lab on a day-to-day basis by discipline scientists studying subjects ranging from hydrology and geophysics to particle physics and biochemistry. We have built and released, for anonymous ftp, a set of tools that can be used to implement a desktop environment for scientific visualization that makes use of a VR input device; the glue between the device and the visualization network is sketched below.
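
The following is a hedged sketch of that glue, under the assumption that the device driver and the downstream connection look roughly like the declarations shown. read_tracker() and send_transform() are hypothetical stand-ins, not AVS API calls, and sample_to_matrix() is the conversion sketched earlier.

    extern int  read_tracker(SixDofSample *s);        /* hypothetical device driver: 0 on success */
    extern void send_transform(const float m[4][4]);  /* hypothetical downstream port */

    void tracker_loop(void)
    {
        SixDofSample s;
        float m[4][4];

        while (read_tracker(&s) == 0) {
            sample_to_matrix(&s, m);    /* device sample -> 4x4 transform */
            send_transform(m);          /* downstream modules see it as ordinary data */
        }
    }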

Using the combination of the VR input device and the visual programming language, one can:


Future Plans

Among the problems associated with immersive VR:

  1. Physiological dysfunction caused by low-resolution headgear, as well as neurological dysfunction caused by low frame rates.
  2. Really lousy pictures that present no semblance of "reality."

We can't really do much about item 1 except wait for technology to catch up and for prices to come down. At our site, $1000 is a lot of money, so the purchase of an $80,000 HMD is pretty much out of the question.

But we can do something about item 2 right now. We will be examining issues related to combining existing graphics hardware (polygon engines) with preprocessing software, primarily from the lighting-simulation groups at LBL, in an effort to create a better presentation of models by precomputing diffuse-light interactions within the model while leaving the specular portion of the lighting calculations to the graphics hardware. There is much promise in this approach for creating images that are pleasing to the eye. The split is sketched below: the view-independent diffuse term is baked in ahead of time, and only the view-dependent specular term is computed at display time.
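
A hedged sketch of that split, using an illustrative Phong-style specular term; the names, the per-vertex storage, and the shading model are assumptions for illustration, not a description of the actual LBL lighting tools.

    #include <math.h>

    typedef struct { float r, g, b; } Color;

    /* precomputed_diffuse comes from the preprocess (e.g. a radiosity pass);
     * only the view-dependent specular highlight is evaluated per frame.   */
    Color shade_vertex(Color precomputed_diffuse, Color light_color,
                       float shininess, float ndoth /* dot(N, H), clamped to [0,1] */)
    {
        float s = powf(ndoth, shininess);
        Color out = {
            precomputed_diffuse.r + s * light_color.r,
            precomputed_diffuse.g + s * light_color.g,
            precomputed_diffuse.b + s * light_color.b
        };
        return out;
    }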

Additionally, we have been working with the Khoros Group in New Mexico on a project, the goal of which is to make available for FREE, via anonymous ftp, the infrastructure for visual programming and data transport.


	Summer, 1995 
	Wes Bethel
	Information and Computing Sciences Division 
	Mail Stop 50-F, 129 
	Lawrence Berkeley Laboratory 
	The University of California
	Berkeley, California 94720 
	ewbethel at lbl dot gov 
	voice: (510) 486-7353