Scientists at the National Institute of Standards and Technology (NIST) have developed software that improves the accuracy of the tracking devices in its immersive, or virtual, research environment by at least 700 percent. The software can be used by scientists in other immersive environments with slight modifications for their individual laboratories. This advance is a step toward transforming immersive technology, traditionally a qualitative tool, into a scientific instrument capable of precision measurements.
Immersive environments such as NIST's are typically made up of two or more 8-foot-by-8-foot walls onto which three-dimensional images, ranging from larger-than-life bodies to actual-size buildings, are displayed on the walls and floor. Researchers wear 3-D glasses and hold a wand; the position of each device is tracked. With these devices and the underlying graphics system, the researcher can walk around and interact with the virtual world.
While these small virtual reality laboratories have been around for more than a decade, they have mainly been used to let a scientist get inside a project and develop a feel for the object of study, explained NIST mathematician John Hagedorn. Researchers can, for example, walk through the hallways of newly designed buildings before they are constructed to ensure the proportions are correct, or inspect microscopic structures.
The visuals in immersive environments are sometimes not quite accurate because of an inherent problem with the electromagnetic transmitters and receivers used to track where the user is in the space. Ferrous metals, such as rebar in the walls, other metal in the room, or metal walls, distort the signals between the stationary transmitter and the small receivers attached to the tracked devices. These distortions are especially obvious where an image with straight lines or edges crosses the 90-degree corners at which the walls and floor meet. They interfere with the "reality" aspect and limit the immersive environment's value as a measurement tool.
To improve the image's accuracy, Hagedorn and colleagues concentrated on the inaccuracy of the tracking devices. They knew there was a difference between where a tracking device reported it was and where it really was. The researchers mapped two sets of data points: where they knew the sensors actually were, and where the computer said they were. Using these data, they developed software that transforms the reported positions of the sensors into the actual positions. "Our program," Hagedorn said, "provides corrections of both the location and the orientation in the 3-D space." Average location errors were reduced by a factor of 22; average orientation errors, by a factor of 7.5.
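The underlying idea, pairing reported positions with independently surveyed true positions and interpolating the measured error in between, can be illustrated with a short sketch. This is not NIST's software: the synthetic distortion, the grid resolution, and all names here are illustrative assumptions, and orientation correction is omitted for brevity.

```python
"""Minimal sketch of tracker-position correction by interpolating a
measured error field. Requires numpy and scipy."""
import numpy as np
from scipy.interpolate import LinearNDInterpolator

# --- Calibration phase --------------------------------------------------
# True sensor positions measured independently (e.g., on a surveyed grid),
# here a synthetic 3-D grid spanning a 2.4 m (8 ft) working volume.
grid = np.stack(np.meshgrid(*[np.linspace(0.0, 2.4, 7)] * 3,
                            indexing="ij"), axis=-1).reshape(-1, 3)

def distort(p):
    """Stand-in for the metal-induced field warping: a smooth,
    position-dependent offset. A real lab would measure this, not model it."""
    return p + 0.05 * np.sin(2.0 * p[:, [1, 2, 0]])

reported = distort(grid)   # where the tracker *says* each sensor is
actual = grid              # where each sensor *really* is
error = actual - reported  # correction vector at each calibration point

# Interpolator for the vector-valued error field over the reported positions.
error_field = LinearNDInterpolator(reported, error)

def correct(raw_positions):
    """Map raw (n, 3) tracker readings to corrected positions by linearly
    interpolating the calibration error field at the raw locations."""
    return raw_positions + error_field(raw_positions)

# --- Use at run time ----------------------------------------------------
raw = distort(np.array([[1.0, 1.0, 1.0], [0.5, 2.0, 1.5]]))
print(correct(raw))  # approximately [[1, 1, 1], [0.5, 2, 1.5]]
```

In this scheme the calibration survey is done once per room, since the distortion depends on the fixed metal in the building; the run-time correction is a cheap table lookup plus interpolation.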
"This improvement in motion tracking has furthered our goal of turning the immersive environment from a qualitative tool into a quantitative one—a sort of virtual laboratory," Hagedorn explained. The first test with the new software was measuring a lattice structure with elements of about 2 to 3 millimeters in size designed to grow artificial skin replacements or bone. A 3-D image of the structure was constructed (see photo) using data obtained from a high-resolution microscope. NIST scientists interactively measured the diameters of the fibers and the spacing between the layers of fiber using the virtual lab. These precision measurements enabled the researchers to determine that the manufactured material substantially deviated from the design specification. On the other hand, additional measurements in the immersive environment showed that the angles between fibers in the manufactured material closely matched the design.