D. Hawking, N. Craswell, P. Thistlewaite, D. Harman
A frozen snapshot of 18.5 million pages from part of the Web has been created to enable and encourage meaningful, reproducible evaluation of Web search systems and techniques. The collection is being used in an evaluation framework within the Text REtrieval Conference (TREC) and will, it is hoped, provide convincing answers to questions such as: Can link information result in better rankings? Do longer queries result in better answers? Do TREC systems work well on Web data? The snapshot and associated evaluation methods are described, and an invitation to participate is extended. Preliminary results are presented for an effectiveness comparison of six TREC systems working on the snapshot collection against five well-known Web search systems working over the current Web. These results suggest that the document rankings produced by public Web search engines are by no means state-of-the-art.
International World Wide Web Conference
information retrieval, web search engines
Hawking, D., Craswell, N., Thistlewaite, P. and Harman, D. Results and Challenges in Web Search Evaluation. International World Wide Web Conference. (Accessed February 25, 2024)