Intelligent technologies within the military, law enforcement, and homeland security fields are continuously evolving. Testing these technologies is crucial to (1) inform technology developers of specific aspects for enhancement, (2) gather end-user feedback, and (3) verify the extent of the technology's capabilities. Test exercises provide valuable data that both update the state of the technology and present information to the evaluation design team to aid further testing. Evaluation designers have exerted substantial effort in creating methodologies to streamline the test plan development process, particularly when producing comprehensive test plans. The Multi-Relationship Evaluation Design (MRED) methodology is being developed to collect input from several source categories and automatically output evaluation blueprints that identify pertinent test characteristics. MRED captures input from three specific categories: personnel stakeholders, the technology state, and the available resources. This information and the relationships among these inputs are merged to feed an algorithm that outputs specific test plan elements. This paper proposes a model of a technology's state and of its influence on the MRED output. MRED defines the technology state input category to include the maturity, reliability, and repeatability of a technology under test. The states of these three characteristics evolve as a technology is developed from the conceptual stage to a fully functional system. Likewise, test characteristics evolve to capture the most pertinent data to enhance this development process. To ensure that appropriate test designs are generated, it is critical to understand the relationships between these input and output elements; these relationships are also described in this paper. Future efforts will formalize the full MRED model as the relationships among the inputs and the output are further investigated.
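The input/output structure described above can be sketched in code. This is a minimal illustrative model only: all class names, fields, threshold values, and decision rules below are assumptions for exposition, not part of the MRED specification.

```python
# Hypothetical sketch of MRED's three input categories (stakeholders,
# technology state, resources) feeding a toy stand-in for the algorithm
# that selects a test plan element. Names and rules are illustrative.
from dataclasses import dataclass
from enum import Enum


class Maturity(Enum):
    CONCEPTUAL = 1
    PROTOTYPE = 2
    FULLY_FUNCTIONAL = 3


@dataclass
class TechnologyState:
    maturity: Maturity
    reliability: float    # e.g., observed success rate in [0, 1]
    repeatability: float  # e.g., consistency of results in [0, 1]


@dataclass
class EvaluationInputs:
    stakeholders: list[str]       # personnel stakeholder roles
    technology: TechnologyState
    resources: list[str]          # available test resources


def select_test_focus(inputs: EvaluationInputs) -> str:
    """Toy stand-in for the MRED algorithm: map the technology state
    to a pertinent test characteristic. Thresholds are assumed."""
    tech = inputs.technology
    if tech.maturity is Maturity.CONCEPTUAL:
        return "concept walkthrough with stakeholders"
    if tech.reliability < 0.9 or tech.repeatability < 0.9:
        return "controlled reliability/repeatability trials"
    return "full operational field exercise"


inputs = EvaluationInputs(
    stakeholders=["evaluator", "end user", "developer"],
    technology=TechnologyState(Maturity.PROTOTYPE, 0.7, 0.8),
    resources=["test range", "data loggers"],
)
print(select_test_focus(inputs))  # controlled reliability/repeatability trials
```

The sketch reflects the abstract's claim that the technology state evolves from conceptual to fully functional and that the pertinent test characteristics evolve with it; a real MRED implementation would also exploit the relationships among all three input categories.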
Proceedings Title: 2011 International Test and Evaluation Association Technology Review (ITEA Tech Review 2011)
Conference Dates: July 19-21, 2011
Conference Location: Annapolis, MD
Pub Type: Conferences
Keywords: MRED, Performance Metrics, Evaluation Framework, Uncertainty