A Decade of Automatic Content Evaluation of News Summaries: Reassessing the State of the Art

Author(s)

Peter Rankel, John M. Conroy, Hoa T. Dang, Ani Nenkova

Abstract

How good are automatic content metrics for news summary evaluation? Here we provide a detailed answer to this question, with a particular focus on assessing the ability of automatic evaluations to identify the statistically significant differences present in manual evaluation of content. Using four years of TAC data, we analyze the performance of eight ROUGE variants in terms of accuracy, precision and recall in finding significantly different systems. Our experiments show that some of the neglected variants of ROUGE, based on higher-order n-grams and syntactic dependencies, are the most accurate across the years; the commonly used R-1 scores find too many significant differences. We also test combinations of ROUGE variants and find that they considerably improve the accuracy of automatic prediction.
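
The page carries no code, but the comparison the abstract describes — checking whether an automatic metric reproduces the statistically significant system differences found by manual evaluation, and scoring those decisions by accuracy, precision and recall — can be sketched briefly. The Python below is a minimal illustration under assumed inputs (per-topic score lists keyed by system name); the Wilcoxon signed-rank test stands in for whichever paired test the paper actually uses, and the function names are hypothetical, not from the paper.

    # Minimal sketch (not from the paper): for every pair of systems,
    # test for a significant score difference under both the manual
    # metric and a ROUGE variant, then score the automatic decisions
    # against the manual ones.
    from itertools import combinations
    from scipy.stats import wilcoxon

    def significant_pairs(scores, alpha=0.05):
        """scores: dict mapping system name -> list of per-topic scores
        (same topics, same order, for every system).
        Returns the set of pairs judged significantly different by a
        paired Wilcoxon signed-rank test at level alpha."""
        sig = set()
        for a, b in combinations(sorted(scores), 2):
            try:
                _, p = wilcoxon(scores[a], scores[b])
            except ValueError:
                continue  # identical score vectors: no detectable difference
            if p < alpha:
                sig.add((a, b))
        return sig

    def agreement(manual_scores, rouge_scores):
        """Accuracy, precision and recall of the automatic metric's
        significance decisions, taking manual decisions as ground truth."""
        all_pairs = set(combinations(sorted(manual_scores), 2))
        manual_sig = significant_pairs(manual_scores)
        auto_sig = significant_pairs(rouge_scores)
        tp = len(manual_sig & auto_sig)                   # both say "different"
        tn = len(all_pairs - manual_sig - auto_sig)       # both say "no difference"
        precision = tp / len(auto_sig) if auto_sig else 0.0
        recall = tp / len(manual_sig) if manual_sig else 0.0
        accuracy = (tp + tn) / len(all_pairs)
        return accuracy, precision, recall

Under this framing, a metric like R-1 that "finds too many significant differences" would show high recall but depressed precision, which is the trade-off the abstract alludes to.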
Proceedings Title
Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics
Conference Dates
August 4-9, 2013
Conference Location
Sofia, BG
Conference Title
51st Annual Meeting of the Association for Computational Linguistics

Keywords

evaluation, summarization

Citation

Rankel, P., Conroy, J., Dang, H. and Nenkova, A. (2013), A Decade of Automatic Content Evaluation of News Summaries: Reassessing the State of the Art, Proceedings of the 51st Annual Meeting of the Association for Computational Linguistics, Sofia, BG, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=914007 (Accessed October 11, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created August 4, 2013, Updated October 12, 2021