Dr. Yang Cai, Principal Investigator; Dr. Roberta Klatzky, Senior Member; Dr. Mel Siegel, Senior Member; Dr. Lenny Weiss, Senior Member; Sean Hackett, Research Engineer; Florian Alber, Research Engineer; Ben Graham, Research Assistant; Weizhe Guo, Summer Intern
The Visual Intelligence Studio at Carnegie Mellon University specializes in research and development in artificial intelligence, augmented reality, human-system interfaces, visualization, cyber-physical security, and video analytics.
Yang Cai, Ph.D., is Director of the Visual Intelligence Studio, Associate Research Professor of Biomedical Engineering, and Senior Systems Scientist at the CyLab Institute, Carnegie Mellon University. Yang has worked on artificial intelligence, human-computer interaction, and robotics for over twenty years. He is the author of the related books “Instinctive Computing” (Springer, 2016) and “Ambient Diagnostics” (CRC Press, 2014) and has organized multiple ACM CHI and ACM Multimedia workshops. He is PI of the current NIST PSCR project “Hyper-Reality Helmet for Localization and Mapping for First Responders.” He collaborates with local fire departments and EMS agencies, including the fire marshals. Yang coordinated the design, development, and demonstration efforts throughout the competition.
Roberta Klatzky, Ph.D. is the Charles J. Queenan, Jr. University Professor of Psychology at Carnegie Mellon University, where she is also on the faculty of the Human-Computer Interaction Institute and the Carnegie Mellon Neuroscience Institute. Her research investigates perception, spatial thinking and action from the perspective of multiple modalities, sensory and symbolic, in real and virtual environments. Klatzky's basic research has been applied to tele-manipulation, image-guided surgery, navigation aids for the blind, and neural rehabilitation. Her vast experience in haptics and navigation provided critical information for the navigation approach used during the live trials at the firefighter testing facility.
Mel Siegel, Ph.D., is Professor Emeritus in the Robotics Institute at Carnegie Mellon University, an IEEE Fellow, and an expert in sensors and instrumentation. Mel’s team developed a haptic sensing system for foot pressure profiles and a climbing robot for aircraft inspection. He has taught “Sensors and Sensing” graduate courses at CMU and other universities. Mel contributed his expertise in haptic perception and sensor technologies.
Lenny Weiss, MD, is Assistant Professor of Emergency Medicine at the University of Pittsburgh School of Medicine and Assistant Medical Director for City of Pittsburgh EMS, STAT MedEvac, Pittsburgh SWAT, and UPMC Prehospital Care. Lenny is Co-Founder of the University of Pittsburgh Resuscitation Logistics and Informatics Venture (ReLIVe). Lenny acted as an adviser during the project, offering medical advice and support for the VR EMS aspect of the challenge.
Sean Hackett is a Research Engineer with the Visual Intelligence Studio with an MEng degree in Electrical Engineering. Sean led the development of the haptic system, including the electronic hardware, communication systems, and software implementation for the VR and live challenges.
Florian Alber is a Research Engineer at Visual Intelligence Studio with an MSc degree in Mechatronics. Florian assisted in the mechanical implementation and testing of the haptic system for the live challenge.
Ben Graham is an Electrical and Computer Engineering undergraduate student at Carnegie Mellon University and a Research Assistant in the Visual Intelligence Studio working on augmented reality systems. Ben is a skilled developer of 3D VR systems using Unreal, Unity, and OpenGL.
Weizhe Guo was a Summer Intern at the Visual Intelligence Studio and is currently studying Computer Science at Tsinghua University in Beijing. He assisted in integrating the haptic helmet with VR in Unreal Engine.
The contestants modified a common first responder helmet and its harness to fit four haptic actuators and attached a small control box containing a radio module and a connector for the haptic interface. The four embedded haptic actuators deliver relative directional signals (left, right, forward, and backward). The haptic signals are sent in real time from a radio frequency remote controller, or from a virtual reality simulator via a cable.
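The directional cueing described above could be sketched as a simple mapping from a direction command to a per-actuator intensity pattern, which is then packed into a frame for the radio or cable link. This is a minimal illustrative sketch only; the actuator ordering, intensity scale, and frame format here are assumptions, not the team's actual implementation.

```python
# Hypothetical sketch: map a direction command to the four helmet actuators
# (front, back, left, right). Ordering and byte format are assumptions.

ACTUATORS = {"forward": 0, "backward": 1, "left": 2, "right": 3}

def direction_to_pattern(direction: str, intensity: int = 255) -> list:
    """Return a 4-element intensity vector, one entry per actuator."""
    if direction not in ACTUATORS:
        raise ValueError("unknown direction: %s" % direction)
    pattern = [0, 0, 0, 0]
    pattern[ACTUATORS[direction]] = intensity
    return pattern

def encode_command(direction: str, intensity: int = 255) -> bytes:
    """Pack the pattern into a 4-byte frame for the RF or cable link."""
    return bytes(direction_to_pattern(direction, intensity))
```

In this sketch, the same `encode_command` frame could be produced either by the remote controller firmware or by the VR simulator, matching the two signal sources mentioned above.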