Abstract
The projects developed under the auspices of the Defense Advanced Research Projects Agency (DARPA) Information Management (IM) program take innovative approaches to the hard problems of delivering critical information to decision makers in a timely fashion. Because each of these information management systems interfaces with users, the systems must be tested with actual humans. The DARPA IM Evaluation project has developed an evaluation methodology that can help system developers assess the usability and utility of their systems. The key components of an evaluation plan are data, users, tasks, and metrics. The evaluation project recruited six IM project Principal Investigators (PIs) and devoted a year's effort to developing a method for moving from exploring and implementing systems to planning and performing structured, hypothesis-based evaluations. Five of the projects participated directly in this effort; the sixth was integrated into, and evaluated with, a larger effort. This report describes the component systems, the evaluation factors, and our experiences in constructing test plans.
Keywords
DARPA, evaluation, experimental design, human subjects, information management systems
Citation
Morse, E. (2002), Evaluation Methodologies for Information Management Systems, D-LIB Magazine, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=51044 (Accessed May 16, 2026)