
CLIR Evaluation at TREC

Author(s)

Donna K. Harman, M. Braschler, M. Hess, M. Kluck, C. Peters, P. Schauble, P. Sheridan

Abstract

Starting in 1997, the National Institute of Standards and Technology conducted three years of evaluation of cross-language information retrieval systems in the Text REtrieval Conference (TREC). Twenty-two participating systems used topics (test questions) in one language to retrieve documents written in English, French, German, and Italian. A large-scale multilingual test collection was built, and a new technique for building such a collection in a distributed manner was devised.

Keywords

evaluation, information retrieval, machine translation

Citation

Harman, D., Braschler, M., Hess, M., Kluck, M., Peters, C., Schauble, P. and Sheridan, P. (2001), CLIR Evaluation at TREC, Cross-Language Evaluation Forum, Vol. 2069 (Accessed October 11, 2025)

Issues

If you have any questions about this publication or are having problems accessing it, please contact [email protected].

Created January 1, 2001, Updated February 17, 2017