TREC Deep Learning Track: Reusable Test Collections in the Large Data Regime

Author(s)

Ellen M. Voorhees, Ian Soboroff, Nick Craswell, Bhaskar Mitra, Emine Yilmaz, Daniel Campos

Abstract

The TREC Deep Learning (DL) Track studies ad hoc search in the large data regime, meaning that a large set of human-labeled training data is available. Results so far indicate that the best models with large data are likely deep neural networks. This paper supports the reuse of the TREC DL test collections in three ways. First, we describe the data sets in detail, documenting clearly and in one place details that are otherwise scattered across track guidelines, overview papers, and our associated MS MARCO leaderboard pages; we intend this description to make it easy for newcomers to use the TREC DL data. Second, because there is some risk of iteration and selection bias when reusing a data set, we describe best practices for writing a paper using TREC DL data without overfitting, and provide some illustrative analysis. Finally, we address a number of issues around the TREC DL data, including an analysis of reusability.
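For readers new to TREC-style evaluation: the DL Track ships graded relevance judgments ("qrels"), which tools such as trec_eval consume to score a ranked run, commonly reporting NDCG@10. The sketch below is an illustrative, self-contained version of that NDCG@k computation (linear gain, log2 discount, as in trec_eval's ndcg_cut); the document IDs and grades are invented toy values, not TREC DL data.

```python
import math

def ndcg_at_k(qrels, run, k=10):
    """NDCG@k for one query.

    qrels: dict mapping docid -> graded relevance (e.g. 0-3 in TREC DL judging)
    run:   list of docids in ranked order
    """
    # DCG: gain discounted by log2 of (1-based rank + 1)
    dcg = sum(qrels.get(doc, 0) / math.log2(rank + 2)
              for rank, doc in enumerate(run[:k]))
    # Ideal DCG: the same sum over the best possible ordering of the grades
    ideal = sorted(qrels.values(), reverse=True)[:k]
    idcg = sum(grade / math.log2(rank + 2) for rank, grade in enumerate(ideal))
    return dcg / idcg if idcg > 0 else 0.0

# Toy example (hypothetical docids and grades):
qrels = {"d1": 3, "d2": 0, "d3": 2}
run = ["d3", "d1", "d2"]
print(round(ndcg_at_k(qrels, run, k=10), 4))  # → 0.9134
```

In practice one would evaluate with trec_eval or pytrec_eval against the official qrels rather than reimplementing the metric; the sketch only makes the computation concrete.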
Proceedings Title
Proceedings of the ACM Special Interest Group on Information Retrieval 2021 Conference (SIGIR 2021)
Conference Dates
July 11-15, 2021
Conference Location
Virtual (originally planned for Montreal, Canada)

Keywords

deep learning, information retrieval evaluation, reusable benchmark, test collections

Citation

Voorhees, E., Soboroff, I., Craswell, N., Mitra, B., Yilmaz, E. and Campos, D. (2021), TREC Deep Learning Track: Reusable Test Collections in the Large Data Regime, Proceedings of the ACM Special Interest Group on Information Retrieval 2021 Conference (SIGIR 2021), virtual (originally planned for Montreal, Canada), [online], https://doi.org/10.1145/3404835.3463249, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=932336 (Accessed October 14, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created July 11, 2021, Updated February 14, 2023