The Effects of Human Variation in DUC Summarization Evaluation
Published
Author(s)
Donna K. Harman, Paul D. Over
Abstract
There is a long history of research in automatic text summarization systems by both the text retrieval and the natural language processing communities, but evaluation of such systems' output has always presented problems. One critical problem remains: how to handle the unavoidable variability in the human judgments at the core of all the evaluations.
Proceedings Title
Proceedings of the Text Summarization Branches Out Workshop
Conference Dates
July 25-26, 2004
Conference Location
Barcelona, SP
Conference Title
Text Summarization Branches Out Workshop
Pub Type
Conferences
Keywords
DUC, evaluation, human variability, summarization
Citation
Harman, D. and Over, P. (2004), The Effects of Human Variation in DUC Summarization Evaluation, Proceedings of the Text Summarization Branches Out Workshop, Barcelona, SP (Accessed October 27, 2025)