The Multimedia Event Detection (MED) evaluation track is part of the TRECVID Evaluation. The 2013 evaluation will be the third MED evaluation, preceded by the 2011 and 2012 evaluations and the 2010 pilot evaluation.
The goal of MED is to assemble core detection technologies into a system that can search multimedia recordings for user-defined events based on pre-computed metadata. The metadata stores developed by the systems are expected to be sufficiently general to permit re-use for subsequent user defined events.
A user searching for events in multimedia material may be interested in a wide variety of potential events. Since it is intractable to build special-purpose detectors for each event a priori, technology is needed that can take as input a human-generated definition of the event that a system will use to search a collection of multimedia clips. The MED evaluation series will define events via an event kit, which consists of an event name, definition, explication (textual exposition of the terms and concepts), evidential descriptions, and illustrative video exemplars.
The major changes for the 2013 evaluation include:
- The 20 MED '12 test events will be this year's Pre-specified events
- 20 new AdHoc events will be released
- Four contrastive evaluation conditions will be defined
- A 1/3 subset of last year's test collection is a defined evaluation condition
- The training exemplar conditions will be 100, 10, and 0 exemplars.
MED Task Definitions
The 2013 evaluation will support two evaluation tasks:
- Pre-Specified Event MED: WITH knowledge of the pre-specified test event kits, construct a metadata store for the test videos and then for each pre-specified test event kit, search the metadata store to detect occurrences of the test event.
- Ad Hoc Event MED: WITHOUT knowledge of the ad hoc test event kits, construct a metadata store for the test videos and then for each ad hoc test event kit, search the metadata store to detect occurrences of the test event.
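Both tasks share a two-phase structure: metadata is pre-computed once over the test videos, and each event kit then drives a search over that metadata store. The sketch below illustrates the idea; all function names and the toy "evidential terms" scorer are hypothetical stand-ins, not part of any TRECVID-provided interface.

```python
# Hypothetical sketch of the two-phase MED pipeline (metadata generation,
# then per-event search). Real systems use audio/visual features; this toy
# version treats a clip as a list of observed terms.

def extract_features(clip):
    # Stand-in for real audio/visual feature extraction.
    return clip

def score_event(features, event_kit):
    # Toy scorer: fraction of the kit's evidential terms found in the clip.
    terms = event_kit["evidential_terms"]
    return sum(t in features for t in terms) / len(terms)

def build_metadata_store(clips):
    """Phase 1: pre-compute metadata for every clip, independent of any event."""
    return {cid: extract_features(clip) for cid, clip in clips.items()}

def search(metadata_store, event_kit, threshold=0.5):
    """Phase 2: given an event kit, score every clip and flag detections."""
    return {cid: (score_event(feats, event_kit),
                  score_event(feats, event_kit) >= threshold)
            for cid, feats in metadata_store.items()}
```

The key property the tasks test is that Phase 1 runs without (Ad Hoc) or with (Pre-Specified) knowledge of the event kits, while Phase 2 is always event-specific.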
The Pre-Specified event task is identical to the MED12 task. Participants must build a system for at least one of the test events in order to participate in the evaluation and TRECVID Conference.
NIST maintains an email discussion list to disseminate information. Send requests to join the list to med_poc at nist dot gov.
Evaluation Plan and Data Use Rules
MED system performance will be evaluated as specified in the evaluation plan. The evaluation plan contains the rules, protocols, metric definitions, scoring instructions, and submission instructions.
- The latest evaluation plan is MED '13 Evaluation Plan V2.
- To facilitate comparisons to V1, please see the PDF of V1 compared to V2.
- Telecon Notes: 20130419
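The evaluation plan defines the official metrics and scoring protocol, and the F4DE toolkit performs official scoring. As a simplified, unofficial illustration of threshold-based detection scoring, the miss and false-alarm rates for a single event could be computed as follows (the function and its details are hypothetical):

```python
def miss_and_fa_rates(scores, labels, threshold):
    """Compute P_miss and P_FA for one event at a given decision threshold.

    scores: per-clip system scores (higher = more likely the event occurred).
    labels: True if the clip actually contains the event.
    This is a generic detection-metric sketch, not the official MED scorer.
    """
    misses = sum(1 for s, l in zip(scores, labels) if l and s < threshold)
    false_alarms = sum(1 for s, l in zip(scores, labels) if not l and s >= threshold)
    n_pos = sum(labels)
    n_neg = len(labels) - n_pos
    p_miss = misses / n_pos if n_pos else 0.0
    p_fa = false_alarms / n_neg if n_neg else 0.0
    return p_miss, p_fa
```

Sweeping the threshold trades misses against false alarms, which is the basic trade-off the evaluation conditions probe.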
A collection of Internet multimedia (i.e., clips containing both audio and video streams) will be provided to registered MED participants. The data, which was collected by the Linguistic Data Consortium, consists of publicly available, user-generated content posted to various Internet video hosting sites.
MED13 participants will receive the following training resources:
- the MED10 data set,
- the MED11 DEV-T set,
- the MED11 Development and Test Collection,
- the Kindred Collection, and
- the "PROGRESS Test" Collection. This data set will be made available to previous MED participants who submitted results to NIST and to new participants who complete the MED13 Dry Run.
The evaluation plan and license information will specify usage rules of the data resources in full detail.
2013 Pre-Specified Event Kits
The twenty Pre-Specified events are drawn from the 2012 test events. The table below contains the event names.
| MED11 Events | MED12 Events |
|---|---|
| Changing a vehicle tire | Attempting a bike trick |
| Flash mob gathering | Cleaning an appliance |
| Getting a vehicle unstuck | Giving directions to a location |
| Grooming an animal | Renovating a home |
| Making a sandwich | Town hall meeting |
| Repairing an appliance | Winning a race without a vehicle |
| Working on a sewing project | Working on a metal crafts project |
- Event names need to be interpreted in the full context of the event definitions that will be made available as part of the event kits.
In order to obtain the corpora, ALL TRECVID-registered sites (including '12 participants) must complete an evaluation license with the LDC. Each site that requires access to the data, either as part of a team or as a standalone researcher, must complete a license.
To complete the evaluation license follow these steps:
- Download the license MED13 Evaluation License.
- Return the completed license to LDC's Membership Office via email at ldc at ldc dot upenn dot edu. Alternatively you may fax the completed license to the LDC at 215-573-2175.
- When you send the completed license to the LDC, include the following information:
- Registered TRECVID Team name
- Site/organization name
- Data contact person's name
- Data contact person's email
The designated data contact person for each site will receive instructions from the LDC about the specific procedures for obtaining the data packages when they are released.
Dry Run Evaluation
The dry run data resources are available from http://www-nlpir.nist.gov/projects/tv2013/med.data/. The username and password were provided during TRECVID registration. There are two relevant files:
- MED13DRYRUN_Files-v2.tar.bz2 - Contains the index files for three of the pre-specified events (E006, E009, and E013) tested against the MEDTest testing collection.
- MED13_testTEAM_MED13DRYRUN_PS_2.tar.bz2 - A demonstration submission file for the Dry Run
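Both dry-run packages are bzip2-compressed tar archives. One way to unpack them programmatically (the helper name is illustrative; any standard tar tool works equally well) is:

```python
import tarfile

def extract_archive(path, dest="."):
    # The dry-run packages are bzip2-compressed tar files (.tar.bz2),
    # so open them in "r:bz2" mode and extract into dest.
    with tarfile.open(path, "r:bz2") as tar:
        tar.extractall(dest)

# Example (after downloading with your TRECVID credentials):
# extract_archive("MED13DRYRUN_Files-v2.tar.bz2", "dryrun_data")
```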
In addition to the data files, F4DE_3.0.1 contains a scoring primer for MED '13 in the file DEVA/doc/TRECVid-MED13-ScoringPrimer.html of the release that demonstrates how to score a system and prepare a submission file.
Evaluation scripts to support the MED evaluation are within the NIST Framework for Detection Evaluations (F4DE) Toolkit Version 3.0.1 or later found on the NIST MIG tools page.
NIST has prepared a MER Workstation. The current release version is MERAPP13-v3.0.
Consult the TRECVID Master schedule.
- March 19, 2013 - Initial page created.