Implementing VR Interfaces for the Geosciences
Wes Bethel, Lawrence Berkeley National Laboratory, Information and Computing Sciences Division
Janet Jacobsen, Lawrence Berkeley National Laboratory, Earth Sciences Division
Andy Austin and Mark Lederer, BP Exploration
Todd Little, Western Atlas Software
Abstract
For the past few years, a multidisciplinary team of computer and earth scientists at Lawrence Berkeley National Laboratory has been exploring the use of advanced user interfaces, commonly called "Virtual Reality" (VR), coupled with visualization and scientific computing software. These efforts, carried out in close collaboration with industry, have resulted in an environment in which VR technology is coupled with existing visualization and computational tools.
VR technology may be thought of as a user interface. It is useful to think of a spectrum of interfaces, ranging from command-line interfaces to completely immersive environments. In the former, the user enters three- or six-dimensional parameters at the keyboard. In the latter, three- or six-dimensional information is provided by trackers, either held in the hand or attached to the user in some fashion, e.g. mounted on a head-mounted display. The advantage of the former is that rich, extensible, and often complex languages give the user precise control over the parameters that govern object position and orientation in a virtual world. The advantage of the latter is that the user manipulates these parameters directly, using highly developed motor skills.
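To make the two ends of this spectrum concrete, the following sketch (in C, and not taken from the paper) shows a single six-degree-of-freedom parameter, a position plus an orientation. Here it is filled in by parsing keyboard input; in an immersive environment the same structure would instead be refreshed continuously from a tracker. The Pose6D type and the function name read_pose_from_args are hypothetical, for illustration only.

/* Illustrative sketch only (not from the paper): one six-degree-of-freedom
 * parameter as it might be handled at either end of the interface spectrum. */
#include <stdio.h>
#include <stdlib.h>

typedef struct {
    double x, y, z;             /* position in world coordinates */
    double roll, pitch, yaw;    /* orientation, in degrees       */
} Pose6D;

/* Command-line end of the spectrum: six numbers typed by the user. */
static int read_pose_from_args(int argc, char **argv, Pose6D *p)
{
    if (argc < 7)
        return -1;
    p->x    = atof(argv[1]); p->y     = atof(argv[2]); p->z   = atof(argv[3]);
    p->roll = atof(argv[4]); p->pitch = atof(argv[5]); p->yaw = atof(argv[6]);
    return 0;
}

int main(int argc, char **argv)
{
    Pose6D pose;
    if (read_pose_from_args(argc, argv, &pose) == 0)
        printf("pose: (%g %g %g) (%g %g %g)\n",
               pose.x, pose.y, pose.z, pose.roll, pose.pitch, pose.yaw);
    /* Immersive end of the spectrum: the same six values would instead be
     * refreshed continuously by polling a tracker.                        */
    return 0;
}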
Two specific geoscience application areas will be highlighted. In the first, we have used VR technology to manipulate three-dimensional input parameters, such as the spatial location of injection or production wells in a reservoir simulator. In the second, we demonstrate how VR technology has been used to manipulate visualization tools, such as the position of a "rake," presented to the user as a virtual injection well and used to compute streamlines through the flux field.
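As a rough illustration of the second application area (a sketch under simplifying assumptions, not the authors' code), the fragment below spaces seed points along a line segment standing in for the virtual injection-well rake and advects each seed through a velocity field using simple Euler steps. The function sample_velocity is a hypothetical stand-in for interpolation of the simulator's flux field.

/* Illustrative sketch only: seed points spaced along a "rake" and advected
 * through a velocity field with Euler steps.                               */
#include <stdio.h>

typedef struct { double x, y, z; } Vec3;

/* Hypothetical flux-field lookup; a real tool would interpolate the
 * reservoir simulator's output on its grid.                           */
static Vec3 sample_velocity(Vec3 p)
{
    Vec3 v = { -p.y, p.x, 0.05 };   /* a simple swirling test field */
    return v;
}

static void trace_streamline(Vec3 seed, double dt, int nsteps)
{
    Vec3 p = seed;
    for (int i = 0; i < nsteps; i++) {
        Vec3 v = sample_velocity(p);
        p.x += dt * v.x;
        p.y += dt * v.y;
        p.z += dt * v.z;
    }
    printf("seed (%g %g %g) ends at (%g %g %g)\n",
           seed.x, seed.y, seed.z, p.x, p.y, p.z);
}

int main(void)
{
    /* Endpoints of the rake: the two ends of the virtual injection well. */
    Vec3 a = { 0.0, 1.0, 0.0 }, b = { 0.0, 1.0, 5.0 };
    int nseeds = 5;

    for (int i = 0; i < nseeds; i++) {
        double t = (double)i / (nseeds - 1);
        Vec3 seed = { a.x + t * (b.x - a.x),
                      a.y + t * (b.y - a.y),
                      a.z + t * (b.z - a.z) };
        trace_streamline(seed, 0.05, 200);
    }
    return 0;
}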
We will discuss some issues to be considered when implementing VR and will describe our software development and user-working environments. Much of our work has been to create a software infrastructure to support VR. This infrastructure is a collection of software "building blocks" used in a visual programming environment to construct a complete visualization "program." With it, our scientists can quickly and easily connect several different VR input devices, such as a Spaceball or magnetic trackers, to a variety of computational or visualization tools.
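The sketch below suggests, in greatly simplified form, what one such building block might look like; it is an illustration built on assumptions, not the framework described in the paper. A device module produces a transform, as a Spaceball or magnetic-tracker block would, and a downstream visualization module consumes it. All module and function names are hypothetical.

/* Illustrative sketch only: two "building blocks" exchanging a 4x4
 * transform, the way a device module and a visualization module might be
 * wired together in a visual programming network.                        */
#include <stdio.h>
#include <string.h>

typedef struct { double m[4][4]; } Transform;   /* rigid-body transform */

typedef struct Module {
    const char *name;
    int  (*produce)(struct Module *self, Transform *out);      /* device side */
    void (*consume)(struct Module *self, const Transform *in); /* viz side    */
} Module;

/* Stand-in device block: returns the identity where a real block would
 * poll the Spaceball or magnetic-tracker driver.                        */
static int dummy_device_produce(Module *self, Transform *out)
{
    (void)self;
    memset(out, 0, sizeof *out);
    for (int i = 0; i < 4; i++)
        out->m[i][i] = 1.0;
    return 0;
}

/* Stand-in visualization block: a real block might move a rake or a well. */
static void dummy_viz_consume(Module *self, const Transform *in)
{
    printf("%s received a transform (m[0][0] = %g)\n", self->name, in->m[0][0]);
}

/* One "tick" of the network: pull a transform from the device block and
 * push it into the visualization block.                                  */
static void run_network(Module *device, Module *viz)
{
    Transform xf;
    if (device->produce(device, &xf) == 0)
        viz->consume(viz, &xf);
}

int main(void)
{
    Module device = { "tracker", dummy_device_produce, NULL };
    Module rake   = { "rake",    NULL, dummy_viz_consume };
    run_network(&device, &rake);
    return 0;
}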
Our work on interface and visualization issues led us into two technical areas which receive elaboration in this paper. One is an optimized space-search algorithm that can be used by many scientific visualization tools, such as a streamline solver. The other is our experience in making practical use of magnetic trackers as an input device.
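The optimized space-search algorithm itself is elaborated in the body of the paper and is not reproduced here; the sketch below shows only one common way to accelerate the point-in-which-cell queries a streamline solver performs, by binning cell bounding boxes into a coarse uniform grid so that each query tests a handful of candidate cells rather than the whole mesh. All structure and function names are hypothetical.

/* Illustrative sketch only: a coarse uniform grid over cell bounding boxes,
 * so that locating the cell containing a point tests only a few candidates.
 * A real streamline solver would follow this with an exact containment test
 * against the cell geometry.                                                */
#include <stdio.h>

#define NB     4                 /* buckets per axis in the coarse grid */
#define MAXPER 64                /* cells indexed per bucket, at most   */

typedef struct { double lo[3], hi[3]; } Box;

typedef struct {
    Box        bounds;                    /* bounds of the whole mesh        */
    const Box *cell;                      /* bounding box of each mesh cell  */
    int        ncells;
    int        count[NB][NB][NB];         /* cells overlapping each bucket   */
    int        list[NB][NB][NB][MAXPER];
} SearchGrid;

/* Map a box to the (clamped) range of buckets it overlaps. */
static void bucket_range(const SearchGrid *g, const Box *b, int lo[3], int hi[3])
{
    for (int a = 0; a < 3; a++) {
        double span = g->bounds.hi[a] - g->bounds.lo[a];
        int i0 = (int)((b->lo[a] - g->bounds.lo[a]) / span * NB);
        int i1 = (int)((b->hi[a] - g->bounds.lo[a]) / span * NB);
        lo[a] = i0 < 0 ? 0 : (i0 >= NB ? NB - 1 : i0);
        hi[a] = i1 < 0 ? 0 : (i1 >= NB ? NB - 1 : i1);
    }
}

/* Register every cell with each bucket its bounding box overlaps. */
static void build_grid(SearchGrid *g)
{
    for (int c = 0; c < g->ncells; c++) {
        int lo[3], hi[3];
        bucket_range(g, &g->cell[c], lo, hi);
        for (int i = lo[0]; i <= hi[0]; i++)
            for (int j = lo[1]; j <= hi[1]; j++)
                for (int k = lo[2]; k <= hi[2]; k++)
                    if (g->count[i][j][k] < MAXPER)
                        g->list[i][j][k][g->count[i][j][k]++] = c;
    }
}

/* Return a cell whose bounding box contains p, or -1; only the cells
 * registered in p's bucket are examined, not all ncells of them.      */
static int find_cell(const SearchGrid *g, const double p[3])
{
    Box q = { { p[0], p[1], p[2] }, { p[0], p[1], p[2] } };
    int lo[3], hi[3];
    bucket_range(g, &q, lo, hi);
    for (int n = 0; n < g->count[lo[0]][lo[1]][lo[2]]; n++) {
        int        c = g->list[lo[0]][lo[1]][lo[2]][n];
        const Box *b = &g->cell[c];
        if (p[0] >= b->lo[0] && p[0] <= b->hi[0] &&
            p[1] >= b->lo[1] && p[1] <= b->hi[1] &&
            p[2] >= b->lo[2] && p[2] <= b->hi[2])
            return c;
    }
    return -1;
}

int main(void)
{
    static const Box cells[2] = {
        { { 0, 0, 0 }, { 1, 1, 1 } },
        { { 1, 0, 0 }, { 2, 1, 1 } },
    };
    static SearchGrid g;          /* static storage: counts start at zero */
    g.bounds = (Box){ { 0, 0, 0 }, { 2, 1, 1 } };
    g.cell   = cells;
    g.ncells = 2;
    build_grid(&g);

    double p[3] = { 1.5, 0.5, 0.5 };
    printf("point (%g %g %g) lies in cell %d\n", p[0], p[1], p[2],
           find_cell(&g, p));
    return 0;
}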
We will conclude with a summary of the motivation for this work and an example of near-future extensions to it.
Postscript version of the full paper (2177K), including color images.
Color images only from the paper (2105K).
An image produced using this system (not in the paper) is available as a 1k by 1k version.
Contact:
Wes Bethel
Lawrence Berkeley National Laboratory
Mail Stop 50-F
Berkeley, CA 94720 USA
email: ewbethel@lbl.gov
url: http://www-vis.lbl.gov/~wes
phone: (510) 486-7353
fax: (510) 486-5548