Too many Relevants: Whither Cranfield Test Collections?
Ellen M. Voorhees, Nick Craswell, Jimmy Lin
This paper presents lessons learned from the TREC 2021 Deep Learning track regarding the construction and use of large Cranfield-style test collections. The corpus used in the 2021 edition of the track was much larger than the corpora used in previous years and contains many more relevant documents. The process for selecting documents to judge that had been used in earlier years of the track failed to produce a reliable collection because most topics have too many relevant documents. Judgment budgets were exceeded before an adequate sample of the relevant set could be found, so there are likely many unknown relevant documents in the unjudged portion of the corpus. As a result, the collection is not reusable, and recall-based measures are unreliable even for the retrieval system results used in building it. At the same time, early-precision measures cannot distinguish among system results because the maximum score is easily attained for many topics. And since the existing tools for appraising the quality of test collections depend on systems' scores, they also fail when there are too many relevant documents. Collection builders will need new strategies and tools to build reliable test collections if the Cranfield paradigm is to continue being applied to ever-larger corpora. Ensuring that the definition of 'relevant' truly reflects the desired system rankings is a provisional strategy for continued collection building.
Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval
July 11-15, 2022
Voorhees, E., Craswell, N. and Lin, J. (2022), Too many Relevants: Whither Cranfield Test Collections?, Proceedings of the 45th International ACM SIGIR Conference on Research and Development in Information Retrieval, Madrid, ES, [online], https://doi.org/10.1145/3477495.3531728, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=934359 (Accessed December 11, 2023)