NIST Gets HIP: A New Method for Testing Optical Imaging Sensors


Figure 1. Schematic of the Hyperspectral Image Projector (HIP). A spectral engine optically in series with a spatial engine projects a time-integrated 2D image to the sensor under test, in which each spatial pixel is presented with a spectrum that can be programmed to simulate the spectra found in realistic scenes.

A new type of scene projector in development at PML will enable the performance of future optical and infrared imaging instruments to be evaluated by having them "watch reality TV"—that is, projected scenes that are realistic both spectrally and spatially.

Detecting climate variation with remote sensing instruments, providing state-of-the-art surveillance imagery for security and defense, or getting the most out of medical imagers requires excellent knowledge of the sensor's performance. One of the challenges faced by scientists and engineers developing these instruments is effectively evaluating their performance. To address this challenge, we are developing the Hyperspectral Image Projector (HIP). The HIP will enable performance evaluation of cameras and other imaging instruments using realistic scenes.

Remote sensing instruments and medical imagers are designed to take images composed of many spectral bands, not just the minimum red, green, and blue components used by common digital cameras. These images are referred to as hyperspectral because each pixel contains information for hundreds or thousands of narrow spectral bands. The purpose of the Hyperspectral Image Projector (HIP) is to enable scientists to project hyperspectral images into sensors, simulating realistic scenes both spectrally and spatially, for performance testing and evaluation of the sensor instruments in the laboratory.
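To make the data structure concrete, the sketch below (not NIST code; the image dimensions and band count are illustrative assumptions) contrasts a conventional three-band color image with a hyperspectral cube, in which each spatial pixel carries a full spectrum:

```python
# A minimal sketch: an RGB image versus a hyperspectral cube as NumPy arrays.
# The sizes, band count, and wavelength range are illustrative assumptions.
import numpy as np

rows, cols = 256, 256

# Conventional color image: 3 broad bands (red, green, blue) per pixel.
rgb_image = np.zeros((rows, cols, 3))

# Hyperspectral image: every pixel holds a full spectrum, here 200 narrow
# bands sampled between 400 nm and 1000 nm.
n_bands = 200
wavelengths_nm = np.linspace(400, 1000, n_bands)
hyperspectral_cube = np.zeros((rows, cols, n_bands))

# The spectrum at one spatial pixel is a 1-D array over wavelength.
pixel_spectrum = hyperspectral_cube[100, 150, :]   # shape: (200,)
```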

For example, by using the HIP to test satellite sensor performance in controlled laboratory settings, scientists can reduce the need for expensive field testing, better separate environmental effects from instrument effects, and perform system-level performance testing and validation of space-flight instruments prior to launch [1].

Many realistic scenes of interest for testing defense and security sensors would be very difficult or dangerous to set up outside, but can be relatively easily simulated and projected into the sensors by the HIP. Similarly, tissue phantoms used to test medical optical and infrared imaging instruments are difficult to maintain and disseminate with known properties, whereas the HIP can present repeatable digital versions of tissue phantoms to these instruments [2].

The design of the HIP system is similar to commercially available Digital Light Processing (DLP) projection systems. In DLP systems, the projected image is made from a composite of grayscale images, one for each of the RGB colors (red, green, and blue). The individual grayscale images are generated by focusing light through a rotating color filter wheel (to obtain the spectral component) and illuminating a digital micromirror device (to obtain the spatial component). When the grayscale images are projected in rapid sequence at typical video frame rates, the eye integrates them into a full RGB-color image.
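The sketch below illustrates this time-sequential composition in a simplified numerical form; it is not a DLP vendor API, and the frame contents, sizes, and equal dwell times are assumptions made purely for illustration:

```python
# A minimal sketch: rapid, sequential grayscale frames, one per color of the
# filter wheel, average in time to a single full-color image. All values here
# are synthetic placeholders.
import numpy as np

rows, cols = 64, 64
rng = np.random.default_rng(0)

# One grayscale frame per color of the rotating filter wheel (values 0..1).
grayscale_frames = {
    "red":   rng.random((rows, cols)),
    "green": rng.random((rows, cols)),
    "blue":  rng.random((rows, cols)),
}

# Each color is shown for an equal fraction of the frame period, so the eye
# (or any slow detector) integrates the sequence into one RGB image.
dwell_fraction = 1.0 / len(grayscale_frames)
rgb_image = np.stack(
    [dwell_fraction * grayscale_frames[c] for c in ("red", "green", "blue")],
    axis=-1,
)  # shape: (rows, cols, 3)
```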

Figure 2. Original image of a coral reef (top), and the same image after being projected by the HIP and measured by a remote sensing imager under test in the laboratory (bottom).
In contrast to the DLP system, the HIP system has the ability to project composites of numerous spectra. Instead of using a filter as in the DLP system, the spectral components for the HIP system are generated with a spectral engine, composed of dispersive optics and a spatial light modulator such as a digital micromirror device (DMD) or a liquid crystal spatial light modulator. The spatial engine, composed of a second spatial light modulator, then determines the spatial component for each spectral component. Synchronized operation of both engines ensures that each spectral component is projected sequentially in the correct proportions in each spatial region to create a time-averaged hyperspectral image [3].
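As a rough numerical analogy (not the actual HIP control software; the component spectra, spatial maps, and dimensions below are invented for illustration), the time-averaged scene can be modeled as the sum of each programmable spectrum weighted by the fraction of the frame time its spatial pattern is displayed at each pixel:

```python
# A minimal sketch of the time-averaged projection: each basis spectrum from
# the spectral engine is shown with its own spatial map from the spatial
# engine, and the sensor integrates the sequence into a hyperspectral scene.
import numpy as np

rows, cols, n_bands, n_components = 64, 64, 200, 6
rng = np.random.default_rng(1)

# Spectral engine: each row is one programmable spectrum (band intensities).
basis_spectra = rng.random((n_components, n_bands))

# Spatial engine: for each spectrum, a map giving the fraction of the frame
# time that spectrum is displayed at each pixel.
spatial_maps = rng.random((n_components, rows, cols))
spatial_maps /= spatial_maps.sum(axis=0, keepdims=True)  # fractions sum to 1

# Time-averaged projected scene: sum of (spatial map x spectrum) terms.
projected_cube = np.zeros((rows, cols, n_bands))
for k in range(n_components):
    projected_cube += spatial_maps[k, :, :, np.newaxis] * basis_spectra[k]
```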

The advantage of the HIP system is not only its ability to project realistic, spectrally and spatially complex scenes, but also the user's ability to arbitrarily define and control the spectral distributions at each pixel in the spatial image. For example, the HIP can alter certain spectral components to reflect changing scenes. This means that the HIP can be used to test imagers under a wide range of conditions and for a variety of applications.

We recently demonstrated the capabilities of the HIP system by projecting a hyperspectral image of a coral reef [4]. The original image, acquired by an airborne hyperspectral sensor, was decomposed into six spectral components and then re-projected using the HIP into a laboratory imaging spectrometer. Figure 2 presents an RGB version of the original image and the HIP-projected image.
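The cited work does not prescribe a specific decomposition algorithm; the sketch below shows one common way such a decomposition into a small number of spectral components and spatial abundance maps could be performed, using non-negative matrix factorization on synthetic stand-in data:

```python
# A minimal sketch, assuming scikit-learn is available: decompose a
# hyperspectral cube into six spectral components plus spatial abundance
# maps via non-negative matrix factorization. The cube here is random
# placeholder data, not the actual reef image.
import numpy as np
from sklearn.decomposition import NMF

rows, cols, n_bands = 64, 64, 200
rng = np.random.default_rng(2)
cube = rng.random((rows, cols, n_bands))       # stand-in for the reef image

# Flatten to a (pixels x bands) matrix, then factor into 6 components.
pixels = cube.reshape(-1, n_bands)
model = NMF(n_components=6, init="nndsvda", max_iter=500, random_state=0)
abundances = model.fit_transform(pixels)       # (pixels, 6) spatial weights
spectra = model.components_                    # (6, bands) component spectra

# Reshape the abundances into six spatial maps, one per spectral component.
abundance_maps = abundances.reshape(rows, cols, 6)
```

The six component spectra would drive the spectral engine and the six abundance maps the spatial engine, matching the sequential projection scheme described above.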

The original research cited here was performed with a HIP operating in the visible spectral range, and served to prove the concept. NIST and collaborators are continuing to develop the HIP by extending the spectral range into the infrared and ultraviolet, increasing the spectral resolution and brightness, and enabling it to show dynamic scenes (i.e., hyperspectral image movies). Portable, rack-mounted prototypes are being built that will allow the HIP to be used by other scientists and test engineers in their own labs.

Acknowledgments

This work was funded in part by the Department of Defense Test Resource Management Center (TRMC) Test and Evaluation/Science and Technology (T&E/S&T) Program, and in part by the NIST Office of Law Enforcement Standards. A version of this article appeared in September in the SPIE online "Newsroom."

References

1. "Hyperspectral image compressive projection algorithm" J.P. Rice, D.W. Allen, Proc. SPIE 7334, 733414 (2009).

2. "Dynamically programmable  digital tissue phantoms" S.W. Brown, J.P. Rice, D.W. Allen, K. Zuzak, E. Livingston, M. Litorja, Proc. SPIE 6870, 687003 (2008).

3. "Development of hyperspectral image projectors," J. P. Rice, S. W. Brown, and J. E. Neira,  Proc. SPIE  6297, 629701 (2006).

4. "Hyperspectral projection of a coral reef scene using the NIST hyperspectral image projector," D.W. Allen, J.P. Rice, J.A. Goodman, Proc. SPIE 7334, 733415 (2009).

Released October 3, 2011, Updated June 19, 2018