The Multimedia Event Detection (MED) and Multimedia Event Recounting (MER) evaluation tracks are part of the TRECVID Evaluation. The 2014 evaluation will be the fourth MED evaluation, preceded by the 2010 pilot evaluation and the 2011, 2012, and 2013 evaluations. This year, the MED and MER evaluation tracks have been combined.
The goal of MED is to assemble core detection technologies into a system that can search multimedia recordings for user-defined events based on pre-computed metadata. The metadata stores developed by the systems are expected to be sufficiently general to permit reuse for subsequent user-defined events. The goal of MER is to recount the evidence that led the MED system to conclude that a particular multimedia clip contains an instance of a specific MED event.
A user searching for events in multimedia material may be interested in a wide variety of potential events. Since it is intractable to build special-purpose detectors for each event a priori, technology is needed that can take as input a human-generated definition of the event that a system will use to search a collection of multimedia clips. The MED evaluation series defines events via an event kit, which consists of an event name, definition, explication (a textual exposition of the terms and concepts), evidential descriptions, and illustrative video exemplars.
A number of major changes have been made for the 2014 evaluation; please consult the evaluation plan for details.
NIST maintains an email discussion list to disseminate information. Send requests to join the list to med_poc at nist dot gov.
MED and MER system performance will be evaluated as specified in the evaluation plan. The evaluation plan contains the rules, protocols, metric definitions, scoring instructions, and submission instructions.
Frequently Asked Questions
A number of frequently asked questions have been collected and answered in the FAQ document.
A collection of Internet multimedia (i.e., clips containing both audio and video streams) will be provided to registered MED participants. The data, which was collected by the Linguistic Data Consortium, consists of publicly available, user-generated content posted to various Internet video-hosting sites.
MED14 participants will use the data resources specified in the license agreement.
The evaluation plan and license information will specify usage rules of the data resources in full detail.
In order to obtain the corpora, ALL TRECVID-registered sites (including previous participants) must complete an evaluation license with the LDC. Each site that requires access to the data, whether as part of a team or as a standalone researcher, must complete a license.
To complete the evaluation license follow these steps:
- Download the license MED14 Data License.
- Return the completed license to LDC's Membership Office via email at ldc at ldc dot upenn dot edu. Alternatively, you may fax the completed license to the LDC at 215-573-2175.
- When you send the completed license to the LDC, include the following information:
- Registered TRECVID Team name
- Site/organization name
- Data contact person's name
- Data contact person's email
The designated data contact person for each site will receive instructions from the LDC about the specific procedures for obtaining the data packages when they are released.
Dry Run Evaluation
The dry run data resources will be posted soon.
Evaluation scripts to support the MED evaluation are included in the NIST Framework for Detection Evaluations (F4DE) Toolkit, found on the NIST MIG tools page. A new version with support for the scoring server will be made available soon.
As per the evaluation plan, NIST will provide access to an I/O server to facilitate the evaluation. The client software for connecting to this server is available in the following archive.
XML schemas will be used to validate some submission files; they are included with the I/O Client software.
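Full schema validation requires the schemas bundled with the I/O Client software, but a quick well-formedness check before submission can be done with Python's standard library alone. The sketch below is a minimal example; the element and attribute names shown are hypothetical and do not reflect the actual MED submission format.

```python
import xml.etree.ElementTree as ET

def is_well_formed(xml_text):
    """Return True if the XML text parses cleanly, False otherwise.

    This only checks well-formedness, not conformance to the
    MED submission schemas.
    """
    try:
        ET.fromstring(xml_text)
        return True
    except ET.ParseError:
        return False

# Hypothetical submission fragments, for illustration only:
print(is_well_formed("<detectionList><det clip='c1' score='0.9'/></detectionList>"))  # True
print(is_well_formed("<detectionList><det>"))  # False (unclosed elements)
```

Catching malformed XML locally in this way avoids a round trip to the scoring server for errors the parser can find immediately.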
NIST will provide judges with access to a MER Workstation for MER assessment. The MER Workstation software is available in the following archive.
Consult the TRECVID Master Schedule.
- February 24, 2014 - Initial page created.
- March 26, 2014 - Evaluation plan added.
- April 9, 2014 - Evaluation plan updated.
- April 24, 2014 - Evaluation plan updated.
- May 1, 2014 - Schema archive added.
- May 8, 2014 - I/O Client software archive added.
- May 15, 2014 - Evaluation plan, FAQ, and I/O Client software updated.
- June 4, 2014 - FAQ updated, added question 22.
- June 9, 2014 - I/O Client software updated.
- June 13, 2014 - FAQ updated. Added questions 23-26.
- June 17, 2014 - I/O Client software updated.
- June 18, 2014 - FAQ updated. Added question 27.
- July 2, 2014 - FAQ updated. Added questions 28-32.
- July 3, 2014 - MER Workstation software archive added.
- July 14, 2014 - FAQ updated. Added questions 33-36.
- July 15, 2014 - I/O Client software updated.
- July 16, 2014 - FAQ updated. Updated question 22.
- July 30, 2014 - FAQ updated. Added questions 37 and 38.
- August 8, 2014 - I/O Client software updated.
- August 12, 2014 - MER Workstation software updated.