
The Effects of Human Variation in DUC Summarization Evaluation

Published

Author(s)

Donna K. Harman, Paul D. Over

Abstract

There is a long history of research in automatic text summarization systems by both the text retrieval and the natural language processing communities, but evaluation of such systems' output has always presented problems. One critical problem remains: how to handle the unavoidable variability in human judgments that lies at the core of all such evaluations.

Proceedings Title

Proceedings of the Text Summarization Branches Out Workshop

Conference Dates

July 25-26, 2004

Conference Location

Barcelona, Spain

Conference Title

Text Summarization Branches Out Workshop

Keywords

DUC, evaluation, human variability, summarization

Citation

Harman, D. and Over, P. (2004), The Effects of Human Variation in DUC Summarization Evaluation, Proceedings of the Text Summarization Branches Out Workshop, Barcelona, Spain

Issues

If you have any questions about this publication or are having problems accessing it, please contact [email protected].

Created July 1, 2004, Updated February 17, 2017