
OpenCLIR Evaluation

The goal of the OpenCLIR (Open Cross-Language Information Retrieval) evaluation is to develop methods that locate relevant content in low-resource-language "documents" (text or speech) using English queries. This capability is one of several expected to ultimately support effective triage and analysis of large volumes of data in a variety of less-studied languages. Successful systems will be able to adapt to new languages and new genres.
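At its simplest, cross-language retrieval of this kind can be done by translating the English query into the target language and ranking documents by term overlap. The following sketch illustrates that baseline idea only; the lexicon, document text, and function names are all invented for illustration and are not part of the OpenCLIR evaluation or any participant's system.

```python
# Toy sketch of dictionary-based cross-language information retrieval (CLIR).
# The bilingual lexicon and documents below are hypothetical examples.

# Hypothetical English -> target-language lexicon.
LEXICON = {
    "water": ["akvo"],
    "river": ["rivero"],
    "city": ["urbo"],
}

def translate_query(english_query):
    """Map each English query word to its target-language translations."""
    terms = []
    for word in english_query.lower().split():
        terms.extend(LEXICON.get(word, []))
    return terms

def score(document, query_terms):
    """Count how many translated query terms appear in the document."""
    tokens = set(document.lower().split())
    return sum(1 for term in query_terms if term in tokens)

def retrieve(documents, english_query):
    """Rank documents by overlap with the translated query, dropping non-matches."""
    terms = translate_query(english_query)
    ranked = sorted(documents, key=lambda d: score(d, terms), reverse=True)
    return [d for d in ranked if score(d, terms) > 0]

docs = ["la rivero fluas tra la urbo", "la hundo dormas"]
print(retrieve(docs, "river city"))  # only the first document matches
```

Real systems evaluated in OpenCLIR go well beyond this, handling machine translation, speech recognition for the audio documents, and more sophisticated relevance scoring.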

The OpenCLIR evaluation grew out of the IARPA MATERIAL program, which encompasses additional tasks, including domain classification and summarization, as well as more languages and query types. The purpose of OpenCLIR is to provide a simplified, smaller-scale evaluation open to all. Please see IARPA's MATERIAL website and NIST's MATERIAL website for more information on the MATERIAL program, and IARPA's OpenCLIR website for OpenCLIR specifically.

OpenCLIR 2019

The first OpenCLIR evaluation, OpenCLIR19, took place in January/February 2019. Details can be found in the evaluation plan linked under Documentation below.

The first OpenCLIR evaluation planned to declare a winner in each of two separate categories, text and audio data, with a monetary award of USD 10,000 for the winner in the text category and USD 20,000 in the audio category. Please see the documentation below for more details and rules regarding the prizes.


The winners of the OpenCLIR19 challenge were announced by IARPA on Twitter on November 8, 2019.

Text data track:

• 1st Place (winner): Elhuyar Foundation (Spain)
• 2nd Place: Dublin City University (Ireland)
• 3rd Place: Hunan University of Science and Technology (China)

Speech track:

• No submissions qualified to win.

Documentation and Resources

Schedule (updated March 18, 2019)

Milestone | Date
Release of evaluation plan | July 2018
Registration period | Mid-July 2018 - November 30, 2018
Development cycle | August 21, 2018 - May 31, 2019
    Release of Build Packs (training data) | August 21, 2018
    Release of ANALYSIS, DEV, QUERY-DEV (encrypted data, decryption keys) | August 21, 2018
Release of EVAL, QUERY-EVAL (encrypted data) | March 4, 2019
Evaluation period | March 11 - May 31, 2019
    Release of EVAL, QUERY-EVAL (decryption keys) | March 11, 2019
    System output due to NIST | May 31, 2019
System description due to NIST | July 12, 2019



• Please email with any questions or comments regarding the OpenCLIR19 evaluation.



    Created May 31, 2018, Updated January 28, 2020