Publications by Ellen M. Voorhees

Displaying 76–100 of 170

The TREC-5 Database Merging Track

October 24, 2005
Author(s)
Ellen M. Voorhees
This paper is the track report for the TREC-5 database merging track. The report motivates the database merging problem, defines the task performed by the participants in detail, and summarizes the main results of the track.

IT: The Thirteenth Text Retrieval Conference, TREC 2004

October 3, 2005
Author(s)
Ellen M. Voorhees
This report provides an overview of the thirteenth Text Retrieval Conference, TREC 2004. TREC 2004 was held at the National Institute of Standards and Technology (NIST) November 16–19, 2004. The conference was co-sponsored by NIST, the U.S. Department of …

Retrieval System Evaluation

September 26, 2005
Author(s)
C. E. Buckley, Ellen M. Voorhees
One of the primary motivations for TREC was to standardize retrieval system evaluation. Prior to TREC, there was little explicit discussion of what constituted a minimally acceptable experimental design, and no hard evidence to support any position. TREC …

Overview of the TREC 2004 Question Answering Track

August 1, 2005
Author(s)
Ellen M. Voorhees
The TREC 2004 Question Answering track contained a single task in which question series were used to define a set of targets. Each series contained factoid and list questions and related to a single target. The final question in the series was an Other …

Overview of the TREC 2004 Robust Retrieval Track

August 1, 2005
Author(s)
Ellen M. Voorhees
The robust retrieval track explores methods for improving the consistency of retrieval technology by focusing on poorly performing topics. The retrieval task in the track is a traditional ad hoc retrieval task where the evaluation methodology emphasizes a …

Overview of TREC 2004

August 1, 2005
Author(s)
Ellen M. Voorhees
This report provides an overview of the thirteenth Text REtrieval Conference, TREC 2004. TREC 2004 was held at the National Institute of Standards and Technology (NIST) November 16–19, 2004. The conference was co-sponsored by NIST, the U.S. Department of …

IT: The Twelfth Text Retrieval Conference, TREC 2003

October 25, 2004
Author(s)
Ellen M. Voorhees, Donna K. Harman
This chapter provides an executive summary of the TREC workshop series and the remainder of the volume. It explains the motivation for TREC and highlights TREC's accomplishments in improving retrieval effectiveness and fostering technology transfer.

Retrieving Noisy Text

September 26, 2004
Author(s)
Ellen M. Voorhees, John S. Garofolo
Two tracks within TREC have examined the problem of retrieving noisy documents: documents whose content is not necessarily a faithful representation of the author's intent. The confusion track tested the ability of systems to retrieve documents that were …

Retrieval Evaluation with Incomplete Information

July 1, 2004
Author(s)
C. E. Buckley, Ellen M. Voorhees
This paper examines whether the Cranfield evaluation methodology is robust to gross violations of the completeness assumption (i.e., the assumption that all relevant documents within a test collection have been identified and are present in the collection) …

Overview of the TREC 2003 Question Answering Track

March 1, 2004
Author(s)
Ellen M. Voorhees
The TREC 2003 question answering track contained two tasks, the passages task and the main task. In the passages task, systems returned a single text snippet in response to factoid questions; the evaluation metric was the number of snippets that contained …