Author(s): Paul D. Over; George M. Awad; Jonathan G. Fiscus; Brian Antonishek; Martial Michel; Alan Smeaton; Wessel Kraaij; Georges Quenot

Title: TRECVID 2010 - An Overview of the Goals, Tasks, Data, Evaluation Mechanisms, and Metrics

Published: April 15, 2011
Abstract: The TREC Video Retrieval Evaluation (TRECVID) 2010 was a TREC-style video analysis and retrieval evaluation whose goal remains to promote progress in content-based exploitation of digital video via open, metrics-based evaluation. Over the last 10 years this effort has yielded a better understanding of how systems can effectively accomplish such processing and how one can reliably benchmark their performance. TRECVID is funded by the National Institute of Standards and Technology (NIST) and other US government agencies. Many organizations and individuals worldwide also contribute significant time and effort. In 2010, TRECVID turned to new and different data and to some new tasks. Seventy-three teams from various research organizations (27 from Europe, 32 from Asia, 12 from North America, 1 from Africa, and 1 from South America) completed one or more of six tasks: content-based copy detection, instance search, known-item search, semantic indexing, surveillance event detection, and multimedia event detection.
Keywords: evaluation, information retrieval, multimedia, TRECVID, video analysis, video retrieval

Research Areas: Information Technology, Information Processing Systems, Information Delivery Systems, Software Testing Metrics

PDF version: available (3 MB)