
Adaptive Representations for Video-based Face Recognition Across Pose

Published

Author(s)

P. Jonathon Phillips, Yi-Chen Chen, Vishal M. Patel, Rama Chellappa

Abstract

In this paper, we address the problem of matching faces across changes in pose in unconstrained videos. We propose two methods based on 3D rotation and sparse representation that compensate for changes in pose. The first is Sparse Representation-based Alignment (SRA), which generates pose-aligned features under a sparsity constraint. The mappings for the pose-aligned features are learned from a reference set of face images. The reference set is independent of the videos in an experiment; thus, the mappings generalize across data sets. The second is a Dictionary Rotation (DR) method that directly rotates video dictionary atoms in both their harmonic basis and 3D geometry to match the poses of the probe videos. We demonstrate the effectiveness of our approach over several state-of-the-art algorithms through extensive experiments on three challenging unconstrained video datasets: the video challenge of the Face and Ocular Challenge Series (FOCS), the Multiple Biometrics Grand Challenge (MBGC), and the Human ID datasets.
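
The sparse-coding step that underlies methods of this kind can be pictured with a minimal sketch. The example below is illustrative only: it assumes a hypothetical reference dictionary D and uses a generic L1-regularized solver (scikit-learn's Lasso); the paper's actual alignment mappings, dictionaries, and optimization are not reproduced here.

# Minimal sketch of generic sparse coding over a reference dictionary
# (illustrative assumption, not the paper's implementation).
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Hypothetical reference dictionary D: each column is a vectorized face
# feature drawn from a reference set spanning several poses.
n_features, n_atoms = 256, 100
D = rng.standard_normal((n_features, n_atoms))
D /= np.linalg.norm(D, axis=0)          # unit-norm atoms

# A probe feature vector y to be coded sparsely over D.
y = rng.standard_normal(n_features)

# Solve min_a 0.5*||y - D a||_2^2 + lambda*||a||_1 (up to scikit-learn's
# internal scaling); the L1 term enforces the sparsity constraint.
lasso = Lasso(alpha=0.1, fit_intercept=False, max_iter=10000)
lasso.fit(D, y)
a = lasso.coef_                          # sparse coefficient vector

# Reconstruct y from the reference dictionary using the sparse code.
y_reconstructed = D @ a
print(f"nonzero coefficients: {np.count_nonzero(a)} / {n_atoms}")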
Conference Dates
March 24-26, 2014
Conference Location
Steamboat Springs, CO
Conference Title
IEEE Winter Conference on Applications of Computer Vision (WACV)

Citation

Phillips, P., Chen, Y., Patel, V. and Chellappa, R. (2014), Adaptive Representations for Video-based Face Recognition Across Pose, IEEE Winter Conference on Applications of Computer Vision (WACV), Steamboat Springs, CO, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=914491 (Accessed October 11, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created March 24, 2014, Updated February 19, 2017