The need for effective human-robot interaction (HRI) continues to present challenges for the field of robotics. As new technologies are integrated into human-robot teams across a myriad of application domains, exposure to, and expectations of, robots are growing rapidly. A key factor limiting the success of human-robot teams is the lack of principled metrics for assessing the effectiveness of HRI.
Validated test methods and metrics for human-robot teaming are needed to enable repeatable, consistent evaluations of HRI methodologies. Such evaluations are critical for advancing underlying models of HRI, and for establishing traceable mechanisms by which vendors and consumers of HRI technologies can assess and assure functionality.
This full-day workshop will address the issues surrounding the development of test methods and metrics for evaluating the performance of human-robot teams across a multitude of human-centered application domains, including industrial, social, medical, field, and service robotics. The workshop will focus on establishing the diversity of approaches to HRI metrology across robotic domains, and on identifying the underlying issues of traceability, objective repeatability, and transparency in HRI metrology.
The workshop organizers are seeking full papers for presentation. Papers should be limited to 6 pages in the IEEE Proceedings format. Authors of accepted papers will also be invited to submit extended versions to a special journal issue to be published at a later date. Extended abstracts and posters will also be considered for presentation during a poster session.
Solicited paper topics include:
- Test methods and metrics for evaluating human-robot teams
- Best practices and real-world case studies in human-robot teaming
- HRI data set generation, formatting, and dissemination for human-robot teams
- Verification and validation of HRI studies involving human-robot teams
- Benchmarking performance of human-robot teams
- Evaluation of novel human-robot team designs and methods
Important dates:
- Paper submission deadline: Friday, 25 January 2019
- Notice of acceptance for presentation: Friday, 8 February 2019
- Submission of posters and extended abstracts for presentation: Friday, 8 February 2019
- Notice of acceptance of posters for presentation: Friday, 15 February 2019
- Workshop date: Monday, 11 March 2019
Organizers:
- Dr. Jeremy A. Marvel, National Institute of Standards and Technology (NIST), USA
- Shelly Bagchi, National Institute of Standards and Technology (NIST), USA
- Megan Zimmerman, National Institute of Standards and Technology (NIST), USA
- Murat Aksu, National Institute of Standards and Technology (NIST), USA
- Brian Antonishek, National Institute of Standards and Technology (NIST), USA
- Dr. Yue Wang, Clemson University, USA
- Dr. Ross Mead, Semio, USA
- Dr. Terry Fong, National Aeronautics and Space Administration (NASA), USA
- Dr. Heni Ben Amor, Arizona State University, USA