
Hyper-Reality Helmet for Mapping and Visualizing Public Safety Data

Carnegie Mellon University


The team designed a hyper-reality helmet that provides on-demand information on a heads-up display, enabling first responders to work in indoor and hazardous environments where GPS signals are unavailable and visibility is poor. The team's second-generation helmet has multimodal interfaces, including sensors, pre-incident planning maps, stereo vision, and voice input. Our experiments showed the importance of pre-incident planning and sensor fusion in real-world problem solving. This research advances technology for pre-incident planning, firefighting, emergency medicine, and other first-response and risk-management missions. It also informs future research initiatives in emergency medicine, robotic rescue and recovery, and smart city technologies. - July 2019


Meet the Team

Principal Investigator: Yang Cai
Carnegie Mellon University


Project Overview

This project focuses on the development of a smart helmet to assist first responders in emergency environments such as fire, flood, shootings, cyber attacks, and medical distress, where GPS, cellular, and regular WiFi signals are not available. The hyper-reality technology superimposes on-demand information onto objects in the actual scene image so that the user can see in-depth information beyond what is directly visible. In contrast to many prevailing augmented reality technologies, our approach enhances reality with minimal graphical and textual highlighting, without obscuring the user's view. The hyper-reality helmet contains a holographic screen that mixes the actual visible image with mapping and visualization information. The helmet contains a visible-light camera, a thermal imaging camera, an Android smartphone processor, voice command, and a tactile interface. To be deployable to a broad spectrum of first responders, including firefighters, police, medical, and cyber professionals, it needs to be extremely affordable, reconfigurable, and robust across different applications.
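To make the minimal-highlighting idea concrete, the overlay described above can be sketched as a selective blend: highlight only the pixels where a sensor channel (here, a normalized thermal map) exceeds a threshold, leaving the rest of the view untouched. This is a hypothetical illustration, not the project's actual implementation; the function name, threshold, and red-tint scheme are assumptions for the sketch.

```python
import numpy as np

def overlay_thermal(visible, thermal, threshold=0.7, alpha=0.5):
    """Blend a normalized thermal map onto a visible frame, but only
    where the reading exceeds `threshold`, so most of the user's view
    stays unobscured (per the minimal-highlighting goal).

    visible: HxWx3 float array in [0, 1]; thermal: HxW float in [0, 1].
    """
    out = visible.copy()
    hot = thermal > threshold                 # mask of hot regions only
    # Tint hot pixels toward red, weighted by alpha; other pixels unchanged.
    out[hot, 0] = (1 - alpha) * visible[hot, 0] + alpha * 1.0
    out[hot, 1] = (1 - alpha) * visible[hot, 1]
    out[hot, 2] = (1 - alpha) * visible[hot, 2]
    return out

# Example: a 4x4 mid-gray frame with a single hot pixel
visible = np.full((4, 4, 3), 0.5)
thermal = np.zeros((4, 4))
thermal[1, 2] = 0.9
blended = overlay_thermal(visible, thermal)
```

In a deployed system the same masking idea would apply per frame after registering the thermal camera to the visible camera, with the blend rendered on the holographic screen.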

This project includes the following six tasks:

  1. rapid prototyping of the holographic display and data-processing helmet
  2. landmark recognition and tracking for correcting accumulated mapping errors and tagging landmarks on the map
  3. video simultaneous localization and mapping (SLAM) and smartphone ad-hoc network (SPAN) based triangulation, which map ego-motion and infrastructure components with other sensors for Location-Based Services (LBS)
  4. gesture and speech recognition for navigation and information retrieval
  5. superimposing live thermal images onto the visible view with articulated measurements, potential lasers, and case studies such as rendering invisible methane gas emissions as visible objects
  6. human-robot interaction interface development for 3D mapping of the infrastructure and the First-Person View (FPV) from drone cameras
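The SPAN-based triangulation in task 3 rests on a standard geometric idea: given range estimates to several nodes at known positions, the unknown position can be recovered by trilateration. The sketch below is a generic illustration of that idea under assumed 2D coordinates and exact ranges, not the project's implementation; real smartphone ad-hoc ranging is noisy and would feed a filter rather than a one-shot solve.

```python
import numpy as np

def trilaterate(anchors, distances):
    """Estimate a 2D position from ranges to >= 3 anchors at known
    positions. Subtracting the first circle equation from the others
    linearizes the system, which is then solved by least squares:
      2(xi-x0)x + 2(yi-y0)y = d0^2 - di^2 + (xi^2+yi^2) - (x0^2+y0^2)
    """
    anchors = np.asarray(anchors, dtype=float)
    d = np.asarray(distances, dtype=float)
    A = 2.0 * (anchors[1:] - anchors[0])
    b = (d[0] ** 2 - d[1:] ** 2
         + np.sum(anchors[1:] ** 2, axis=1) - np.sum(anchors[0] ** 2))
    pos, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pos

# Example: three anchor nodes and exact ranges to a point at (3, 4)
anchors = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
true_pos = np.array([3.0, 4.0])
dists = [np.linalg.norm(true_pos - np.asarray(a)) for a in anchors]
est = trilaterate(anchors, dists)
```

With noisy ranges, more than three anchors overdetermine the system and the same least-squares solve averages the error; fusing the result with the SLAM ego-motion estimate is what the task list refers to as sensor-assisted mapping for LBS.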

Benefits and Outcomes

Expected outcomes include a prototype of the hyper-reality helmet, measurable case studies in realistic environments such as an office building, an airport, and paramedic emergency response situations, and a system that can be integrated with available LBS technologies to enhance on-demand mapping and visualization of data. This approach can benefit not only individual responders but also future human-robot first responder forces.

<< Back to the PSIAP 2017 - Location-Based Services Page

Created September 28, 2017, Updated December 30, 2022