Author(s): Matthew L. Aronoff; John V. Messina
Title: Collaborative Augmented Reality for Better Standards
Published: August 15, 2007
Abstract: Concurrent engineering depends on clear communication between all members of the development process. As that communication becomes increasingly complex -- encompassing not only textual descriptions but also CAD models, diagnostic data, process control data, and so on -- the quality of the standards used to move and understand that information becomes correspondingly important. If a standard is incomplete, redundant, or ambiguous, most of the expected benefits are lost. Improving data exchange standards requires explicit data models, but creating those models demands active collaboration among domain experts familiar with the information being exchanged. What is needed is a solution that encourages interaction without requiring a high level of data modeling expertise. Focus is a software tool designed to provide such an environment. It is a distributed design and conferencing application that uses augmented reality to allow domain experts to come together in real time to create data models. Focus uses concrete, three-dimensional objects in place of abstract data modeling concepts such as domain classes, so any domain expert can create data models without first having to learn a modeling language such as UML. In addition, the networked nature of Focus makes it easier to ensure the participation of the best domain experts regardless of location. This paper details the development, features, and expected benefits of Focus in a collaborative engineering environment.
Citation: NIST Interagency/Internal Report (NISTIR) 7441
Keywords: 3D; data modeling; distributed computing; Ruby; standards development; UML
Research Areas: Product Data
PDF version: 272 KB
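The abstract's central idea is to let domain experts build data models by placing and editing concrete 3-D objects in a shared scene rather than writing UML. As a rough, hypothetical illustration of that idea only (Ruby is listed among the paper's keywords, but this sketch does not reflect the actual Focus implementation, and all names here are invented), a domain class might be represented as a positioned scene object that can still be exported in a UML-like text form:

```ruby
# Hypothetical sketch: a domain class represented as a concrete 3-D object
# that collaborators place and edit in a shared scene. Names and structure
# are illustrative only and are not taken from the Focus code base.

# A named attribute of a domain class (e.g., "serial_number : String").
Attribute = Struct.new(:name, :type)

# A domain class rendered as a 3-D node: it has a position in the shared
# scene and a list of attributes, and it can report itself in a simple
# UML-like text form for export.
class DomainNode
  attr_reader :name, :position, :attributes

  def initialize(name, position: [0.0, 0.0, 0.0])
    @name = name
    @position = position   # x, y, z coordinates in the shared scene
    @attributes = []
  end

  def add_attribute(name, type)
    @attributes << Attribute.new(name, type)
  end

  # Emit a minimal UML-like class description so the visual model can be
  # translated back into a conventional modeling notation.
  def to_uml
    lines = ["class #{@name} {"]
    @attributes.each { |a| lines << "  #{a.name} : #{a.type}" }
    lines << "}"
    lines.join("\n")
  end
end

# Usage: a domain expert places a "Part" object and attaches two attributes.
part = DomainNode.new("Part", position: [1.0, 0.5, -2.0])
part.add_attribute("serial_number", "String")
part.add_attribute("weight_kg", "Float")
puts part.to_uml
```

The point of such a mapping is that the expert manipulates only the named object and its attributes; the modeling notation is generated from the scene rather than authored directly.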