
Information Access Division Highlights - 2013

 

NOVEMBER – DECEMBER 2013

NIST Researchers Publish Case Study of Usability Metrics

Usability researchers Mary Theofanos of the Office of Data and Informatics, Material Measurement Laboratory, and Yee-Yin Choong and Brian Stanton of ITL's Information Access Division recently published a case study entitled Measuring the Effect of Feedback on Fingerprint Capture. The work is included as Chapter 10 in a new book, Measuring the User Experience: Collecting, Analyzing, and Presenting Usability Metrics, by Tom Tullis and Bill Albert. The preeminent metrics book in the usability field, the new release is the second edition of a well-received and respected reference, Measuring the User Experience, which was the first book to focus on how to quantify the user experience. The second edition includes new material on how recent technologies have made it easier and more effective to collect a broader range of data about the user experience. It presents new metrics and examines how new technologies can refine user experience measurement. The book also contains new research, updated examples, and six new case studies, including the ITL case study.

Effects of Decomposition Levels and Quality Layers with JPEG 2000 Compression of 1000 ppi Fingerprint Images

By John M. Libert, Shahram Orandi, and John D. Grantham
NISTIR 7939
August 2013

As part of NIST's research efforts to support development of the FBI Next Generation Identification (NGI) systems, this study evaluates the effects of wavelet transform decomposition and quality layer options on image fidelity for JPEG 2000 compression of 1000 ppi fingerprint images. First, a suite of 1000 ppi fingerprints is subjected to encoding with three to eight levels of DWT decomposition. Decoded images are compared to non-compressed source images, and fidelity loss is evaluated by certified latent fingerprint examiners as well as by several automated computational fidelity metrics. Fidelity losses among the six decomposition level options result in no statistically significant differences among assessments of image degradation by trained fingerprint examiners. Computational metrics find statistically significant fidelity differences, with the lowest error for five and six DWT decompositions. A second experiment is also described in the report.
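
The experimental variable above can be illustrated with a short sketch. The example below assumes the Python glymur bindings to OpenJPEG, a hypothetical source file name, and PSNR as a stand-in fidelity metric (the study itself used examiner assessments and several other computational metrics); it encodes a fingerprint image at each decomposition level setting and compares the decoded result to the source.

    import numpy as np
    import glymur
    from PIL import Image

    # Hypothetical 1000 ppi grayscale source image
    source = np.asarray(Image.open("fingerprint_1000ppi.png").convert("L"))

    for levels in range(3, 9):  # three to eight DWT decomposition levels
        out = f"fp_{levels}levels.jp2"
        # In OpenJPEG/glymur the number of resolutions is decomposition levels + 1;
        # a single entry in cratios yields a single quality layer.
        glymur.Jp2k(out, data=source, numres=levels + 1, cratios=[10])
        decoded = glymur.Jp2k(out)[:]

        # Simple stand-in fidelity metric: peak signal-to-noise ratio vs. the source
        mse = np.mean((source.astype(float) - decoded.astype(float)) ** 2)
        psnr = float("inf") if mse == 0 else 10 * np.log10(255.0**2 / mse)
        print(f"{levels} decomposition levels: PSNR = {psnr:.2f} dB")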

SEPTEMBER – OCTOBER 2013

Patrick Grother, of ITL's Information Access Division, received a 2013 Leadership and Service Award from the American National Standards Institute (ANSI). He received the Edward Lohse Information Technology Medal, which "recognizes outstanding efforts to foster cooperation among the bodies involved in global IT standardization."

Biometric Specifications for Personal Identity Verification

By Patrick J. Grother, Wayne J. Salamon, and Ramaswamy Chandramouli
NIST Special Publication 800-76-2
July 2013

Homeland Security Presidential Directive (HSPD) 12, Policy for a Common Identification Standard for Federal Employees and Contractors, called for new standards to be adopted governing interoperable use of identity credentials to allow physical and logical access to federal government locations and systems. NIST developed Federal Information Processing Standard (FIPS) 201, Personal Identity Verification (PIV) of Federal Employees and Contractors, which defines procedures and specifications for issuance and use of an interoperable identity credential. This document, a companion to FIPS 201, describes technical acquisition and formatting specifications for the PIV system and establishes minimum accuracy specifications for deployed biometric authentication processes.

IREX VI: Temporal Stability of Iris Recognition Accuracy

By Patrick J. Grother, James R. Matey, Elham Tabassi, George W. Quinn, and Michael Chumakov
NISTIR 7948
July 2013

Stability is a required definitional property for a biometric to be useful. Quantitative statements of stability are operationally important as they dictate reenrollment schedules, e.g., of a face on a passport. We quantify time variation in iris recognition accuracy in two ways. First we produce rate-of-change estimates for up to 122,000 frequent travelers using a fixed iris recognition system for up to 9 years. Second, we apply iris recognition algorithms to the images of 217 individuals used in a Notre Dame study. The algorithms produce pupil dilation and exposed iris area measures which we relate to recognition outcomes.
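
As a loose illustration of a rate-of-change estimate (not the IREX VI methodology itself), one could regress genuine comparison scores against elapsed time since enrollment; the slope gives the score drift per year. The values below are placeholders, not data from the study.

    import numpy as np

    # Placeholder data: elapsed years since enrollment and genuine match scores
    elapsed_years = np.array([0.1, 0.5, 1.2, 2.0, 3.5, 5.0, 7.1, 9.0])
    genuine_scores = np.array([0.97, 0.96, 0.96, 0.95, 0.95, 0.94, 0.94, 0.93])

    # Ordinary least-squares fit: the slope is the estimated score change per year
    slope, intercept = np.polyfit(elapsed_years, genuine_scores, deg=1)
    print(f"estimated drift: {slope:.4f} score units per year")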

IREX IV: Part 1, Evaluation of Iris Identification Algorithms

By George W. Quinn, Patrick Grother, and Meilee L. Ngan
NISTIR 7949
July 2013

IREX IV aims to provide a fair and balanced scientific evaluation of the performance of automated iris recognition algorithms. IREX IV evaluated the performance of 66 identification (i.e., one-to-many) algorithms submitted by 12 companies and universities. IREX IV also investigated the use of cost models for application-specific algorithm optimization. The goal is to see whether algorithm developers can improve performance when given advance knowledge of the costs of identification errors.
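
A cost model of the kind described can be written as a weighted sum of the two identification error rates. The sketch below is illustrative only; the weights and rates are assumed values, not figures from IREX IV.

    def expected_cost(fnir, fpir, cost_fn=10.0, cost_fp=1.0):
        """Expected cost per search: false-negative identification rate (FNIR)
        and false-positive identification rate (FPIR), each weighted by the
        cost assigned to that error type."""
        return cost_fn * fnir + cost_fp * fpir

    # Comparing two hypothetical operating points of the same algorithm:
    print(expected_cost(fnir=0.015, fpir=0.001))  # conservative threshold
    print(expected_cost(fnir=0.008, fpir=0.010))  # more permissive threshold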

Toward a Shared Approach for Ensuring Patient Safety with Enhanced Workflow Design for Electronic Health Records (EHRs) – Summary of the Workshop

By Svetlana Lowry, Mala Ramaiah, A. Ozok, A.P. Gurses, M.C. Gibbons, D. Brick, E.S. Patterson, and V.R. Lewis
NISTIR 7952
July 2013

In April 2013, NIST sponsored the EHR Usability and Patient Safety Roundtable: Supporting Patient Safety through EHR Design. At the workshop, representatives of government, EHR developers, EHR users, and academics identified common ground on challenges and aspirations to improve patient safety, usability, and human factors regarding workflows in the use of electronic health records. Achieving these objectives is necessary to enhance safe and effective care for all patients and increase the rate of adoption of electronic health records in the United States.

JULY – AUGUST 2013

ITL-Organized Workshop Results in New 3D Shape Benchmark

A new 3D shape benchmark resulted from a conference track organized by ITL's Information Access Division researcher Afzal Godil. The track focused on Large-Scale Sketch-Based 3D Shape Retrieval in collaboration with Texas State University and the University of Konstanz in Germany. The objective of the sketch-based 3D model retrieval was to retrieve 3D models using a 2D sketch as input. This approach is intuitive and convenient for users to search for relevant 3D models and also important for several applications including sketch-based modeling and sketch-based shape recognition. The track was organized to foster this challenging research area by providing a common sketch-based retrieval dataset and soliciting retrieval results from current state-of-the-art retrieval methods for comparison. The new shape benchmark will provide a valuable contribution to the 3D shape retrieval and evaluation community.

Godil co-organized the Shape Retrieval Contest (SHREC) under the Eurographics Workshop on 3D Object Retrieval (3DOR'13). Four tracks were organized under SHREC; fourteen groups with 35 researchers from around the world participated in the 3D shape retrieval tracks and submitted 45 results based on different methods. The goals of the contest were to promote the development of shape retrieval methods and to evaluate and compare the effectiveness of different approaches. Papers on the shape contest were presented at 3DOR'13, held in Gerona, Spain, on May 11, 2013.
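
Retrieval tracks such as this one are typically scored with ranked-list measures. The sketch below shows two common ones, Nearest Neighbor and First Tier, applied to a made-up ranked result; it is an assumed illustration, not the track's actual evaluation code.

    def nearest_neighbor(ranked_classes, query_class):
        """1 if the top-ranked retrieved model belongs to the query's class, else 0."""
        return int(ranked_classes[0] == query_class)

    def first_tier(ranked_classes, query_class, class_size):
        """Fraction of the query's class retrieved within the top (class_size - 1) results."""
        top = ranked_classes[: class_size - 1]
        return sum(1 for c in top if c == query_class) / (class_size - 1)

    # Made-up ranked list for a query sketch of a chair (class of size 4):
    ranked = ["chair", "chair", "table", "chair", "lamp"]
    print(nearest_neighbor(ranked, "chair"), first_tier(ranked, "chair", 4))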

Incorporating Biometric Software Development Kits into the Development Process

By Karen Marshall, Ross J. Micheals, Kevin Mangold, and Kayee Kwong
NISTIR 7929
May 2013

The use of biometric devices has become critical to implementing various forms of security in a large number of organizations worldwide. The current state of biometric sensor integration is labor-intensive and prone to interoperability issues because of proprietary hardware and software, and a lack of standards in the Software Development Kit (SDK) installation process. Sensor integration incorporates the device hardware installation and intricate patchwork necessary to facilitate full communication between the device's software and the application that will ultimately command and control the target sensor. This document describes a process used to achieve more flexible and reliable integration of biometric sensor SDKs into the application development process.
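
One way to achieve the flexibility the report describes is to have applications code against a small, stable sensor interface and wrap each vendor SDK in an adapter. The class and method names below are illustrative assumptions, not an API defined in NISTIR 7929.

    from abc import ABC, abstractmethod

    class FingerprintSensor(ABC):
        """Stable interface the application codes against."""

        @abstractmethod
        def initialize(self) -> None: ...

        @abstractmethod
        def capture(self) -> bytes:
            """Return a captured fingerprint image as raw bytes."""

    class VendorXSensor(FingerprintSensor):
        """Adapter wrapping a hypothetical proprietary SDK behind the common interface."""

        def initialize(self) -> None:
            # A real adapter would load the vendor library and open the device here.
            self._ready = True

        def capture(self) -> bytes:
            if not getattr(self, "_ready", False):
                raise RuntimeError("sensor not initialized")
            return b""  # placeholder for the vendor SDK's capture call

    def acquire(sensor: FingerprintSensor) -> bytes:
        """Application code stays the same no matter which adapter is supplied."""
        sensor.initialize()
        return sensor.capture()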

MARCH – APRIL 2013

The Twenty-First Text REtrieval Conference Proceedings (TREC 2012)

Ellen M. Voorhees and Lori M. Buckland, Editors
NIST Special Publication 500-298
February 2013

This report constitutes the proceedings of the Twenty-First Text REtrieval Conference (TREC 2012) held in Gaithersburg, Maryland, on November 6-9, 2012. The conference was cosponsored by the National Institute of Standards and Technology (NIST), the Defense Advanced Research Projects Agency (DARPA), and the Advanced Research and Development Activity (ARDA).

Examination of Downsampling Strategies for Converting 1000 ppi Fingerprint Imagery to 500 ppi

By Shahram Orandi, John M. Libert, John D. Grantham, Margaret Lepley, Bruce Bandini, Kenneth Ko, Lindsay M. Petersen, Stephen S. Wood, and Stephen G. Harvey
NISTIR 7839
January 2013

Currently, the bulk of fingerprint data in operations is captured, processed, and stored at 500 ppi using the WSQ compressed digital format. With the transition to 1000 ppi, some systems will unavoidably contain an overlap between 500 ppi and 1000 ppi operational pathways. This overlap may be a result of legacy infrastructure limitations or other financial or logistical constraints. Additionally, there will still be a need to compare newly collected 1000 ppi images against legacy 500 ppi images, for both one-to-one and one-to-many scenarios. To bridge legacy and modern data, there needs to be a pathway for interoperability on equal footing, achieved by converting one image to the same resolution as the other. Downsampling the higher resolution 1000 ppi imagery to 500 ppi provides this pathway. The study compares several computational methods for downsampling modern fingerprint images from 1000 ppi to 500 ppi.
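
The comparison can be illustrated with two of the many possible downsampling approaches for the factor-of-two reduction. The libraries, file name, and parameters below are assumptions made for the sketch, not the methods or results of NISTIR 7839.

    import numpy as np
    from PIL import Image
    from scipy import ndimage

    img = np.asarray(Image.open("fingerprint_1000ppi.png").convert("L"), dtype=float)

    # Approach 1: Gaussian low-pass filter followed by 2x decimation
    decimated = ndimage.gaussian_filter(img, sigma=0.85)[::2, ::2]

    # Approach 2: bicubic resampling to half the original dimensions
    h, w = img.shape
    bicubic = np.asarray(
        Image.fromarray(img.astype(np.uint8)).resize((w // 2, h // 2), Image.BICUBIC),
        dtype=float,
    )

    print("Gaussian+decimate:", decimated.shape, "bicubic:", bicubic.shape)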

JANUARY – FEBRUARY 2013

ITL Research Improves the Usability of Mobile Biometric Systems

The FBI's Hostage Rescue Team (HRT) presented NIST with a challenge: to design fingerprint capture software capable of running on a small hardware platform and having a touch interface. The interface had to be able to enter biographic information, control the fingerprint capture device, and display the resulting prints, all on a screen the size of an index card. The system was to be used in high-stress situations by the HRT in fulfilling their anti-terrorism duties. These duties required a portable and intuitive system that could be transported, deployed, and operated quickly.

In response to this challenge, NIST's Mary Frances Theofanos, Matthew Aronoff (formerly of NIST), Yee-Yin Choong, Ross Micheals, and Brian Stanton conducted a number of requirements-gathering exercises. First they performed task analyses. These analyses consisted of documenting the HRT's actions as they performed biometric captures in a simulated environment. One-on-one interviews followed to document any individual requirements. Lastly, the researchers conducted a group brainstorming session to work out detailed operational requirements. The requirements-gathering effort resulted in high-fidelity prototypes and detailed operational use cases.

This work represents the first-of-its-kind design and demonstration of a mobile biometric system implemented on a smart phone and a successful user-centered design approach to meet user requirements. Based on this success, the FBI is adopting the user-centered design approach on other projects, and several government agencies are exploring the smart phone platform for mobile biometric applications. The work opened up a new era of handheld biometric devices for federal, state, and local law enforcement agencies. The research team was recognized for dramatically improving the usability of the biometrics acquisition user interface for handheld touch-screen mobile biometrics devices by receiving a 2012 Department of Commerce Bronze Medal Award. For more information, see the NIST Mobile ID website <http://zing.ncsl.nist.gov/mobile_id/>.

The Twentieth Text REtrieval Conference Proceedings (TREC 2011)

Ellen M. Voorhees and Lori P. Buckland, Editors
NIST Special Publication 500-296
October 2012

This report constitutes the proceedings of the Twentieth Text REtrieval Conference (TREC 2011) held at NIST on November 15-18, 2011. The conference was cosponsored by NIST, the Defense Advanced Research Projects Agency (DARPA), and the Advanced Research and Development Activity (ARDA).

Significance Test in Speaker Recognition Data Analysis with Data Dependency

By Jin Chu Wu, Alvin F. Martin, Craig S. Greenberg, Raghu N. Kacker, and Vincent M. Stanford
NISTIR 7884
October 2012

To evaluate the performance of speaker recognition systems, a detection cost function defined as a weighted sum of the probabilities of type I and type II errors is employed. The speaker datasets may exhibit data dependency due to multiple uses of the same subjects. Using the standard errors of the detection cost function computed by means of the two-layer nonparametric two-sample bootstrap method, a significance test is performed to determine whether the difference between the measured performance levels of two speaker recognition algorithms is statistically significant. In conducting the significance test, the correlation coefficient between the two algorithms' detection cost functions is taken into account. Examples are provided.
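
The quantities involved can be sketched briefly. The example below assumes conventional cost weights and illustrative numbers; the report's actual procedure additionally estimates the standard errors with the two-layer nonparametric two-sample bootstrap.

    import math

    def dcf(p_miss, p_fa, c_miss=10.0, c_fa=1.0, p_target=0.01):
        """Detection cost function: weighted sum of miss (type II) and
        false-alarm (type I) error probabilities."""
        return c_miss * p_target * p_miss + c_fa * (1.0 - p_target) * p_fa

    def z_statistic(dcf_a, dcf_b, se_a, se_b, rho):
        """Test statistic for the difference of two correlated DCF estimates,
        where rho is the correlation coefficient between them."""
        return (dcf_a - dcf_b) / math.sqrt(se_a**2 + se_b**2 - 2.0 * rho * se_a * se_b)

    # Illustrative comparison of two algorithms at the 95% confidence level:
    z = z_statistic(dcf_a=0.062, dcf_b=0.055, se_a=0.004, se_b=0.005, rho=0.6)
    print("significant" if abs(z) > 1.96 else "not significant", f"(z = {z:.2f})")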
