University of Houston
Video camera systems for public safety monitoring have become pervasive across agencies large and small. While manual monitoring of video is essential to inform emergency response, the number of video streams often exceeds the number of people available to watch them, and this gap continues to grow as the use of video expands. At the same time, the academic discipline of computer vision is advancing rapidly, along with hardware that supports deployment of real-time AI applications both on devices and in operations centers. This surge in resources and interest in AI and computer vision is accelerating the development of tools that can be used within public safety. However, transitioning what has traditionally been academic, experimental prototype technology to the complex systems, policies, and highly variable video data of public safety monitoring environments remains a challenge.
Considerable research progress has been made in developing video analytics for monitoring video streams; for example, analytics that automatically detect a left object (baggage) or a line (perimeter) crossing are common today. More advanced and sophisticated analytics are being researched and can be made usably accurate for public safety environments and data, but ingesting the resulting information to facilitate communication and timely response from first responders requires integrating robust video analytic methods and AI-focused hardware with existing information management and communication systems. A typical video system uses a video management system (VMS) to record video from cameras and push event information into a physical security information management (PSIM) system. The PSIM is often the information management and communication layer that defines standard operating procedures for each event, which in turn facilitates planning and response. We have studied the existing infrastructure of public safety video systems, both to understand its impact on the design of video analytic solutions and to derive requirements for integrating developed solutions so that end users can adopt them.
Shishir Shah*, Principal Investigator
University of Houston
The University of Houston was awarded a separate, one-year award to complete a demonstration project with the City of Houston under funding opportunity 2020-NIST-PSIAP-TABA-01. The demonstration project will focus its live video analytics platform on three use cases: 1) crowd count alerts; 2) background changes (for example, graffiti on walls or left baggage); and 3) events in regions of interest or pre-defined boundaries (for example, a vehicle in a no-parking zone). The University of Houston will demonstrate the video analytics system to public safety personnel at the City of Houston, train them on it, and analyze the efficacy and usability of the analytics and software. A follow-up user questionnaire will capture the public safety participants’ experience using the software and analytics.
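At its core, the third use case reduces to a geometric test: is a detected object inside a marked zone? The sketch below shows one standard way to implement that test (ray casting); the zone coordinates and the helper function are illustrative assumptions, not the project's actual implementation.

```python
def point_in_polygon(x, y, polygon):
    """Ray-casting test: is point (x, y) inside the polygon?

    polygon is a list of (px, py) vertices in order. Counts how many
    polygon edges a horizontal ray from the point crosses; an odd
    count means the point is inside.
    """
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):  # edge spans the ray's y level
            x_cross = x1 + (y - y1) * (x2 - x1) / (y2 - y1)
            if x < x_cross:
                inside = not inside
    return inside

# Hypothetical no-parking zone (pixel coordinates) and a detected
# vehicle centroid from an object detector.
no_parking = [(100, 100), (400, 100), (400, 300), (100, 300)]
alert = point_in_polygon(250, 200, no_parking)  # True would raise an alert
```

In practice the zone polygons would be drawn by an operator on the camera view and stored per camera, with the test applied to each detection in every processed frame.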
The public safety community has made sizable investments in video technology, which this project will leverage. The objective is to develop an intelligent, non-obtrusive, real-time continuous monitoring system for assessing activity and predicting emergent suspicious and criminal behavior across a network of distributed cameras. The University of Houston has already developed an analytics engine capable of detecting a number of situations requiring an officer's attention and has evaluated it on its own campus. Building on this, the project has three key objectives.
Applying analytics to real-life systems
Applying enhanced analytics to real-life public safety video systems can be challenging. In real deployments, cameras are often backhauled via wireless links, where packet loss and signal jitter degrade video quality. This research will focus on tuning the analytics methodology to function in such environments and on developing guidelines for video system deployment and configuration. Analytics will be enabled within the City of Houston’s Public Safety Video System, in which more than 80% of the 850+ cameras are connected wirelessly in an urban setting and which includes most of the regional public safety stakeholders as collaborators and users.
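One simple form such tuning can take is detecting dropped-frame gaps from capture timestamps before running trackers, so a jump caused by packet loss is not misread as fast motion. The sketch below is a minimal illustration; the helper name, thresholds, and timestamps are assumptions, not the project's method.

```python
def find_frame_gaps(timestamps, fps=30.0, tolerance=1.5):
    """Flag gaps in a frame-timestamp sequence that suggest dropped frames.

    timestamps: monotonically increasing capture times in seconds.
    A gap larger than tolerance times the nominal frame interval is
    flagged so downstream trackers can reset instead of interpreting
    the jump as genuine motion.
    """
    expected = 1.0 / fps
    gaps = []
    for prev, curr in zip(timestamps, timestamps[1:]):
        if curr - prev > tolerance * expected:
            gaps.append((prev, curr))
    return gaps

# Nominal 30 fps stream with a burst of loss between t=0.066 and t=0.300.
ts = [0.000, 0.033, 0.066, 0.300, 0.333]
gaps = find_frame_gaps(ts)  # one flagged gap
```

Real deployments could derive these timestamps from the transport layer (e.g., RTP) rather than wall-clock arrival times, which jitter makes unreliable.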
Operations workflow integration
A major challenge in large systems such as this is getting the right information to the right people at the right time. Officers must be alerted once analytics identify an abnormal situation in a camera's view. Within the scope of this project, these alerts will be based on a multi-tiered video analytic framework, and its utility for integration into the City’s PSIM-based public safety video system operational workflow will be evaluated. Public safety agencies across the country use PSIM solutions from different vendors, making standardization very important.
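A multi-tiered framework of this kind can be sketched as a mapping from an analytic confidence score to a routing tier and its recipients. The tier names, thresholds, and recipient lists below are assumptions for illustration only; a real deployment would encode these in the PSIM's standard operating procedures.

```python
# Hypothetical tier thresholds: the analytics emit a confidence score,
# and the tier determines who is notified and how urgently.
ROUTING = {
    "tier1": {"min_score": 0.9, "notify": ["dispatch", "nearest_officer"]},
    "tier2": {"min_score": 0.6, "notify": ["operations_center"]},
    "tier3": {"min_score": 0.0, "notify": ["review_queue"]},
}

def route_alert(score):
    """Map an analytic confidence score to (tier, recipients)."""
    for tier in ("tier1", "tier2", "tier3"):
        if score >= ROUTING[tier]["min_score"]:
            return tier, ROUTING[tier]["notify"]
```

The design point is that low-confidence detections still land somewhere (a review queue) rather than being silently dropped, while only high-confidence events interrupt officers.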
Today, few standards exist for analytics metadata. Where possible, this research will identify guidelines to further the standards effort for analytics. Other U.S. public safety organizations have expressed interest and will be approached for trials of successful results of this technology.
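As an illustration of what such guidelines might cover, the sketch below builds a hypothetical analytics-event metadata record. The field names are assumptions for illustration, not an existing standard or a vendor schema.

```python
import json
from datetime import datetime, timezone

def build_event_metadata(camera_id, event_type, confidence, bbox=None):
    """Build a hypothetical analytics-event metadata record.

    Field names are illustrative assumptions; a real system would
    follow the PSIM vendor's schema or an emerging industry format.
    """
    return {
        "camera_id": camera_id,
        "event_type": event_type,            # e.g. "left_object", "perimeter_crossing"
        "confidence": round(confidence, 2),  # detector score in [0, 1]
        "bbox": bbox,                        # optional [x, y, w, h] in pixels
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

record = build_event_metadata("cam-042", "left_object", 0.913,
                              bbox=[120, 80, 140, 130])
payload = json.dumps(record)  # serialized for ingestion by a VMS or PSIM
```

Agreeing on even this small a vocabulary (camera identity, event type, confidence, location, time) is what lets one analytics engine feed PSIM solutions from multiple vendors.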
Fundamental analytics such as perimeter crossing are quite common and exist in most modern video analytic solutions. This project will explore key public safety use cases and develop enhanced behavioral analytics, encoding the instincts of an experienced public safety officer into an algorithm to help identify threats before disaster strikes. Ideally, every camera on the network feeding into the analytics engine becomes an additional “experienced officer”; these additional eyes on the street represent a massive force multiplier. The research will engage key members of the Houston public safety staff to develop and test these analytics.
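For reference, the core of a basic perimeter-crossing analytic is a segment-intersection test between a tracked object's motion and the operator-drawn virtual line. The sketch below uses a standard orientation test; the coordinates and helper names are illustrative, not drawn from any particular product.

```python
def segments_cross(p1, p2, q1, q2):
    """Do segments p1-p2 and q1-q2 intersect? (orientation test)

    orient returns the turn direction of the triple (a, b, c):
    +1 counter-clockwise, -1 clockwise, 0 collinear. The segments
    cross when each one's endpoints lie on opposite sides of the other.
    """
    def orient(a, b, c):
        v = (b[0] - a[0]) * (c[1] - a[1]) - (b[1] - a[1]) * (c[0] - a[0])
        return (v > 0) - (v < 0)
    return (orient(p1, p2, q1) != orient(p1, p2, q2) and
            orient(q1, q2, p1) != orient(q1, q2, p2))

# Virtual perimeter line and two consecutive track positions of a person.
perimeter = ((0, 5), (10, 5))
crossed = segments_cross((4, 2), (6, 8), *perimeter)  # movement crosses the line
```

A tracker supplies the consecutive positions; running this test per track per frame against each configured line yields the familiar perimeter-crossing event.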