Two SVDs Suffice: Spectral decompositions for probabilistic topic modeling and latent Dirichlet allocation

Published

Author(s)

Yi-Kai Liu, Animashree Anandkumar, Dean P. Foster, Daniel Hsu, Sham M. Kakade

Abstract

The problem of topic modeling can be seen as a generalization of the clustering problem, in that it posits that observations are generated due to multiple latent factors (e.g. the words in each document are generated as a mixture of several active topics, as opposed to just one). This increased representational power comes at the cost of a more challenging unsupervised learning problem of estimating the topic probability vectors (the distributions over words for each topic), when only the words are observed and the corresponding topics are hidden. We provide a simple and efficient learning procedure that is guaranteed to recover the parameters for a wide class of mixture models, including the popular latent Dirichlet allocation (LDA) model. For LDA, the procedure correctly recovers both the topic probability vectors and the prior over the topics, using only trigram statistics (i.e. third order moments, which may be estimated with documents containing just three words). The method, termed Excess Correlation Analysis (ECA), is based on a spectral decomposition of low order moments (third and fourth order) via two singular value decompositions (SVDs). Moreover, the algorithm is scalable since the SVD operations are carried out on k * k matrices, where k is the number of latent factors (e.g. the number of topics), rather than in the d-dimensional observed space (typically d >> k).
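To make the abstract's two-decomposition recipe concrete, the sketch below illustrates the idea on a simpler single-topic mixture with exact population moments: one SVD of the pairwise moment matrix whitens the topic vectors, and a second decomposition of a randomly projected third-order moment, carried out entirely in the whitened k-dimensional space, separates them. This is not the paper's ECA algorithm (which works with excess/shifted moments and empirical trigram statistics for LDA); the variable names and the NumPy implementation are illustrative assumptions, not code from the paper.

    import numpy as np

    # Illustrative sketch only: a single-topic mixture with exact population
    # moments, not the paper's ECA procedure for LDA on empirical trigrams.
    rng = np.random.default_rng(0)
    d, k = 50, 5                               # vocabulary size d >> number of topics k

    mu = rng.dirichlet(np.ones(d), size=k).T   # d x k true topic-word distributions
    w = rng.dirichlet(np.ones(k))              # true mixing weights

    # Population moments of the mixture:
    #   M2      = sum_i w_i mu_i mu_i^T                (pairs)
    #   M3_eta  = sum_i w_i (eta^T mu_i) mu_i mu_i^T   (triples projected along eta)
    eta = rng.normal(size=d)
    M2 = (mu * w) @ mu.T
    M3_eta = (mu * (w * (eta @ mu))) @ mu.T

    # Decomposition 1: SVD of M2 gives a whitening map W with W^T M2 W = I_k,
    # under which the vectors v_i = sqrt(w_i) W^T mu_i become orthonormal.
    U, S, _ = np.linalg.svd(M2)
    U, S = U[:, :k], S[:k]
    W = U / np.sqrt(S)                         # d x k whitening matrix

    # Decomposition 2: the whitened, projected third moment is a k x k symmetric
    # matrix whose eigenvectors are the v_i (with eigenvalues eta^T mu_i).
    _, V = np.linalg.eigh(W.T @ M3_eta @ W)

    # Un-whiten: (W^T)^+ v_i = sqrt(w_i) mu_i, then fix signs and renormalize.
    B = (U * np.sqrt(S)) @ V
    B *= np.sign(B.sum(axis=0))
    mu_hat = B / B.sum(axis=0)                 # recovered topics, up to column order

    # Sanity check: each recovered topic matches some true topic in L1 distance.
    err = max(min(np.abs(mu_hat[:, j] - mu[:, i]).sum() for i in range(k)) for j in range(k))
    print(f"columns sum to 1: {np.allclose(mu_hat.sum(axis=0), 1)}, worst L1 match error: {err:.2e}")

The point mirrored from the abstract is that the second decomposition acts on a k-by-k matrix, so the expensive step scales with the number of topics k rather than the vocabulary size d.
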
Proceedings Title
Advances in Neural Information Processing Systems (NIPS)
Conference Dates
December 3-6, 2012
Conference Location
Lake Tahoe, NV
Conference Title
Neural Information Processing Systems (NIPS)

Keywords

Machine learning, unsupervised learning, mixture models, latent Dirichlet allocation, independent components analysis

Citation

Liu, Y., Anandkumar, A., Foster, D., Hsu, D. and Kakade, S. (2012), Two SVDs Suffice: Spectral decompositions for probabilistic topic modeling and latent Dirichlet allocation, Advances in Neural Information Processing Systems (NIPS), Lake Tahoe, NV (Accessed December 4, 2024)

Created December 6, 2012, Updated February 19, 2017