The term "motion tracking" refers to technology that enables real-time measurement of an object's location and orientation. Tracker calibration is the process of assessing the errors in the motion tracker's measurements of location and orientation and correcting those errors in real time.
At NIST, we use motion tracking devices in our immersive visualization environment to monitor the position of the user and of a hand-held interaction device. The image on this page shows NIST's immersive visualization system with the motion tracker's transmitter and sensors, as well as the three screens that together display a single three-dimensional scene.
Tracker calibration is important because errors in measured location and orientation can substantially compromise the effectiveness of the applications that use motion tracking. For example, in NIST's immersive visualization environment, errors in motion tracking result in problems such as: virtual objects move inappropriately as the user moves; straight lines appear bent when they cross screen boundaries; and virtual objects tied to the tracked hand-held device appear incorrectly positioned.
One of the figures on the multimedia page illustrates one of these effects. In this figure, the grid lines should all be straight. To the user in the immersive environment, the lines appear bent at the points where they cross the boundaries between screens because the images are being drawn based on an incorrect tracked location for the eyes of the observer. This figure is based on actual errors observed in the motion tracking system at NIST, and it is by no means the worst case. In informal observations made before initiating this project, we found location errors in excess of 50 cm and orientation errors that appeared to exceed 15 degrees.
To correct the raw tracker data, we first record the tracker's measurements at a large set of known locations and orientations that encompass the volume that we need to accurately track. This enables us to calculate corrections at each of these points. We then perform a Delaunay tetrahedralization of the points based on the measured locations. Then, in real time, as the tracker reports each location and orientation, we find the tetrahedron that contains the measured location and generate barycentric coordinates for this location relative to the containing tetrahedron. The barycentric coordinates are used as weights for performing weighted averaging of the corrections at the vertices of the containing tetrahedron.
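The lookup-and-interpolate step described above can be sketched in Python using SciPy's Delaunay tools. This is an illustrative sketch, not NIST's implementation: the calibration samples and corrections below are random placeholders standing in for real measured data.

```python
import numpy as np
from scipy.spatial import Delaunay

# Illustrative stand-ins for the calibration data: tracker-measured
# locations and the location correction computed at each sample point.
rng = np.random.default_rng(0)
measured = rng.uniform(0.0, 2.0, size=(200, 3))     # measured locations (m)
corrections = rng.normal(0.0, 0.05, size=(200, 3))  # corrections (m)

# One-time setup: Delaunay tetrahedralization of the measured locations.
tet = Delaunay(measured)

def correct_location(p):
    """Correct a raw tracker location by barycentric interpolation."""
    simplex = int(tet.find_simplex(p[np.newaxis])[0])
    if simplex < 0:
        return p  # outside the calibrated volume; leave uncorrected
    # Barycentric coordinates of p in the containing tetrahedron.
    T = tet.transform[simplex]
    b = T[:3] @ (p - T[3])
    weights = np.append(b, 1.0 - b.sum())  # four weights, summing to 1
    verts = tet.simplices[simplex]         # indices of the four vertices
    # Barycentric weights average the corrections at the vertices.
    return p + weights @ corrections[verts]
```

Because the tetrahedralization is built once and only point location and a small weighted sum happen per report, the per-frame cost is low enough for real-time use.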
For orientation averaging, we use these weights with a spherical weighted averaging technique to average the correction rotations at each of the four vertices of the tetrahedron. This use of barycentric coordinates with spherical weighted averaging has a much clearer geometric rationale than previous methods.
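One common way to realize spherical weighted averaging is with unit quaternions. The sketch below is an assumption about how such averaging might be implemented, not necessarily NIST's exact formulation: it aligns the four quaternions to a common hemisphere, takes the barycentric-weighted sum, and renormalizes, which closely approximates the spherical mean when the rotations are near one another, as neighboring calibration corrections typically are.

```python
import numpy as np

def average_rotations(quats, weights):
    """Weighted average of unit quaternions (w, x, y, z).

    Hemisphere-aligned weighted sum followed by renormalization: a
    standard approximation to spherical weighted averaging that is
    accurate when the input rotations are close together.
    """
    quats = np.asarray(quats, dtype=float)
    weights = np.asarray(weights, dtype=float)
    ref = quats[0]
    acc = np.zeros(4)
    for q, w in zip(quats, weights):
        if np.dot(q, ref) < 0.0:  # q and -q represent the same rotation
            q = -q
        acc += w * q
    return acc / np.linalg.norm(acc)
```

In a correction pipeline, `weights` would be the four barycentric coordinates and `quats` the correction rotations at the vertices of the containing tetrahedron. SciPy's `Rotation.mean` (which accepts per-sample weights) is an alternative if a library routine is preferred.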
For more images from this project, see the multimedia page or click the image to the right.