
Evaluation Campaigns and TRECvid

Author(s)

Alan Smeaton, Paul D. Over, Wessel Kraaij

Abstract

The TREC Video Retrieval Evaluation (TRECvid) is an international benchmarking activity to encourage research in video information retrieval by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. TRECvid completed its fifth annual cycle at the end of 2005, and in 2006 TRECvid will involve almost 70 research organizations, universities and other consortia. Throughout its existence, TRECvid has benchmarked both interactive and automatic/manual searching for shots from within a video corpus, automatic detection of a variety of semantic and low-level video features, shot boundary detection and the detection of story boundaries in broadcast TV news. This paper will give an introduction to information retrieval (IR) evaluation from both a user and a system perspective, highlighting that system evaluation is by far the most prevalent type of evaluation carried out. We also include a summary of TRECvid as an example of a system evaluation benchmarking campaign, and this allows us to discuss whether such campaigns are a good thing or a bad thing. There are arguments for and against these campaigns, and we present some of them in the paper, concluding that on balance they have had a very positive impact on research progress.
Conference Location
USA
Conference Title
ACM Multimedia Information Retrieval Workshop 2006

Keywords

benchmarking, evaluation, video retrieval

Citation

Smeaton, A., Over, P. and Kraaij, W. (2006), Evaluation Campaigns and TRECvid, ACM Multimedia Information Retrieval Workshop 2006, USA, [online], https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=50847 (Accessed June 25, 2024)

Issues

If you have any questions about this publication or are having problems accessing it, please contact reflib@nist.gov.

Created December 19, 2006, Updated October 12, 2021