The purpose of the First Responder Unmanned Aircraft System (UAS) FastFind Challenge (UAS 3.1) was to advance UAS technologies by building and flying a UAS designed to support first responder search and rescue (SAR) operations. The goal was for contestants to design, build, and fly a complete UAS solution that helps a SAR team locate multiple missing persons in a densely forested area faster, by improving image detection and enhancing navigation techniques. Potential innovations included sensor technology to detect a person in an area with dense foliage; new algorithms to optimize aerial search patterns dynamically; and artificial intelligence to process images or video feeds and identify humans within the designated search area. First responders have suggested that such improvements would reduce total search time relative to current first responder practice.
Based on concept papers (Stage 1) and demonstration videos (Stage 2), the contestants who met the evaluation criteria and had an operational UAS were selected to participate in the final live competition (Stage 3), which took place in a forest near Mississippi State University in Starkville, MS. To simulate a real-life scenario in the final stage, the challenge implementer positioned several individuals at geolocated points throughout the forest to test each contestant's image detection solution across varying forest densities. Over the course of two days, contestants were given multiple search areas in which to test and demonstrate the efficacy of their solutions. These tests addressed the challenge's primary goal of evaluating technology to improve the speed and accuracy of locating "lost" individuals.
Proposed solutions leveraged an RGB camera, an infrared sensor, or a combination of the two to search for and identify the "lost" individuals. Some contestants dedicated a team member to watching the video feed on the controller, looking for heat signatures in infrared or for human clothing that contrasted with the forest in a standard RGB image. Some contestants developed algorithms or software to process the images in real time, while others required post-processing for analysis. Given the relative humidity and ambient temperatures, combined with the density of the forest, the solutions that paired an infrared camera with an RGB camera outperformed the RGB-only systems. Additionally, the systems that applied real-time artificial intelligence processing to infrared imagery outperformed all other solutions, achieving both the highest accuracy of positive identification and the shortest overall search times.
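The source does not publish the contestants' algorithms. As an illustration of the kind of infrared hotspot detection described above, the following is a minimal sketch, assuming a thermal frame represented as a 2-D grid of temperatures in degrees Celsius; the function name, threshold, and cluster-size parameters are hypothetical choices, not the teams' actual designs.

```python
def detect_hotspots(frame, threshold_c=30.0, min_pixels=4):
    """Return bounding boxes (row0, col0, row1, col1) of warm-pixel
    clusters with at least min_pixels pixels; frame is a 2-D grid of C."""
    rows, cols = len(frame), len(frame[0])
    seen, boxes = set(), []
    for r in range(rows):
        for c in range(cols):
            if frame[r][c] > threshold_c and (r, c) not in seen:
                stack, cluster = [(r, c)], []
                seen.add((r, c))
                while stack:  # flood-fill one 4-connected warm cluster
                    y, x = stack.pop()
                    cluster.append((y, x))
                    for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                        if (0 <= ny < rows and 0 <= nx < cols
                                and frame[ny][nx] > threshold_c
                                and (ny, nx) not in seen):
                            seen.add((ny, nx))
                            stack.append((ny, nx))
                if len(cluster) >= min_pixels:  # ignore tiny warm specks
                    ys, xs = zip(*cluster)
                    boxes.append((min(ys), min(xs), max(ys), max(xs)))
    return boxes

# Synthetic 8x8 "thermal frame": 22 C forest canopy with one warm blob
frame = [[22.0] * 8 for _ in range(8)]
for y in range(2, 5):
    for x in range(3, 5):
        frame[y][x] = 34.0  # roughly body temperature seen through foliage
print(detect_hotspots(frame))  # -> [(2, 3, 4, 4)]
```

In practice a real system would also fuse the thermal box with the co-registered RGB frame before alerting the operator, which is one reason the dual-sensor systems described above performed better than RGB alone.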
For more information on winning solutions, click on the team names in the Stage 3 chart below. For more information on the overarching First Responder UAS Triple Challenge, including the other two challenges, please visit: 2021 First Responder UAS Triple Challenge | NIST.
| Place | Team | Prize |
| --- | --- | --- |
| 1st place | Team AMAV from University of Maryland | $40,000 |
| 2nd place | Team ARCC from Penn State University | $20,000 |
| 3rd place | Team Aggie from NCAT/Purdue | $10,000 |
| First Responder's Choice Award | Team AMAV from University of Maryland | $5,000 |
The following winners each received $5,000 in prize awards and an invitation to participate in Stage 3: Live Test and Evaluation:
The following winners each received $3,000 in prize awards and an invitation to participate in Stage 2: Video Test and Safety Evaluation:
The following finalists each received an invitation to participate in Stage 2: Video Test and Safety Evaluation:
The following winners each received $7,000 in prize awards and an invitation to participate in Stage 2: Check-in Review:
UAS 3.1 was hosted by NIST’s PSCR Division and managed by Kansas State University, in partnership with Mississippi State University. More information about UAS 3.1 may be found on Challenge.gov.