The DUC Summarization Evaluations

Published

Author(s)

Donna K. Harman, Paul D. Over

Abstract

There has been a long history of research in text summarization by both the text retrieval and the natural language processing communities, but evaluation of this research has always presented problems. In 2001 NIST launched a new text summarization evaluation effort, guided by a roadmap from the research community and sponsored by the DARPA TIDES project. This paper is a report of the first formal evaluation in a new conference called the Document Understanding Conference (DUC).

Proceedings Title
Proceedings of HLT 2002 Second International Conference on Human Language Technology Research

Keywords

Document Understanding Conference, DUC, natural language processing, summarization evaluation, text retrieval

Citation

Harman, D. and Over, P. (2002), The DUC Summarization Evaluations, Proceedings of HLT 2002 Second International Conference on Human Language Technology Research.

Issues

If you have any questions about this publication or are having problems accessing it, please contact [email protected].

Created March 1, 2002, Updated February 17, 2017