
MED 2017 Evaluation

The Multimedia Event Detection (MED) track is part of the TRECVID Evaluation. The 2017 evaluation will be the eighth MED evaluation, preceded by the 2010 pilot evaluation and the 2011 through 2016 evaluations.

The goal of MED is to assemble core detection technologies into a system that can search multimedia recordings for user-defined events based on pre-computed metadata. The metadata stores developed by the systems are expected to be sufficiently general to permit re-use for subsequent user-defined events.

A user searching for events in multimedia material may be interested in a wide variety of potential events. Since it is intractable to build special-purpose detectors for each event a priori, technology is needed that can take as input a human-generated definition of an event and use it to search a collection of multimedia clips. The MED evaluation series defines events via an event kit, which consists of an event name, definition, explication (a textual exposition of the terms and concepts), evidential descriptions, and illustrative video exemplars.

Information Dissemination

NIST maintains an email discussion list to disseminate information. Send requests to join the list to med_poc at nist dot gov.

This page will be updated periodically with additional information and resources.  When such updates occur, a notice will also be sent to the aforementioned mailing list.

Evaluation Plan

MED system performance will be evaluated as specified in the evaluation plan, MED17 Evaluation Plan.  Please note that some content (specifically the submission format and instructions) is subject to change as we update our submission/scoring infrastructure for 2017.

Data Resources

Portions of the HAVIC collection of Internet multimedia (i.e., clips containing both audio and video streams) will be provided to registered MED participants. The data, collected by the Linguistic Data Consortium (LDC), consists of publicly available, user-generated content posted to various Internet video hosting sites.

The Yahoo Flickr Creative Commons 100M dataset (YFCC100M) is a large collection of images and videos available on Yahoo! Flickr.  All photos and videos in the collection are licensed under one of the Creative Commons copyright licenses.  The dataset is available directly from Yahoo!.  Only a subset of the YFCC100M videos will be used for this evaluation; the subset is to be determined.

The evaluation plan and license information will specify usage rules of the data resources in full detail.

Video data

HAVIC clips will be provided as MPEG-4 formatted files. The video will be encoded to the H.264 standard, and the audio will be encoded using MPEG-4's Advanced Audio Coding (AAC) standard.  Please refer to the YFCC100M documentation regarding video file formats.
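
As an informal sanity check (not part of the evaluation procedure), the audio and video codecs of a downloaded clip can be inspected with ffprobe. The sketch below assumes Python 3 and that ffprobe (part of FFmpeg) is available on the PATH; the clip filename is a placeholder.

    import json
    import subprocess
    import sys

    def stream_codecs(path):
        """Return (codec_type, codec_name) pairs for every stream in the clip.

        Calls ffprobe, which must be available on the PATH.
        """
        result = subprocess.run(
            ["ffprobe", "-v", "error", "-print_format", "json", "-show_streams", path],
            capture_output=True, text=True, check=True,
        )
        streams = json.loads(result.stdout).get("streams", [])
        return [(s.get("codec_type"), s.get("codec_name")) for s in streams]

    if __name__ == "__main__":
        # "clip.mp4" is a placeholder filename; for a HAVIC clip the expected
        # result is roughly [('video', 'h264'), ('audio', 'aac')].
        print(stream_codecs(sys.argv[1] if len(sys.argv) > 1 else "clip.mp4"))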

Data Licensing

In order to obtain the HAVIC corpora, ALL TRECVID-registered sites (including previous participants) must complete an evaluation license with the LDC. Each site that requires access to the HAVIC data, whether as part of a team or as a standalone researcher, must complete a license.

To complete the evaluation license follow these steps:

  1. Download the license, MED17 Data License.
  2. Return the completed license to LDC's Membership Office via email at ldc at ldc dot upenn dot edu, and be sure to include med_poc at nist dot gov.  Alternatively, you may fax the completed license to the LDC at 215-573-2175.
  3. When you send the completed license to the LDC, include the following information:
    • Registered TRECVID Team name
    • Site/organization name
    • Data contact person's name
    • Data contact person's email
    • Specifically which corpora/data listed in the agreement you need from the LDC, as participants in last year's MED evaluation may already have some of the data on site

This year, NIST will make the HAVIC resources available to sites that have successfully submitted a signed MED17 Data License agreement to the LDC.  The data will be hosted on Amazon Web Services (AWS); you or your site will need an AWS account (create one if you do not already have one) in order to access the HAVIC data.  To request access to the data, please send a request to med_poc at nist dot gov and include the following information:

  • AWS root account canonical ID (an obfuscated form of your account number)

    • The canonical ID is found under the drop-down menu in the upper right-hand corner of the AWS console, under your username:
      [username] drop-down menu -> My Security Credentials -> Account identifiers
      It can also be retrieved programmatically, as sketched after this list.

  • Your site/organization name

  • Your team name (if different from site/organization name)
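
For reference, the canonical ID can also be retrieved from the command line. The sketch below is illustrative only and is not part of the official request procedure; it assumes the boto3 library is installed and that AWS credentials are already configured locally.

    import boto3

    # Assumes AWS credentials are already configured (e.g., via `aws configure`)
    # and boto3 is installed (`pip install boto3`).
    s3 = boto3.client("s3")

    # list_buckets() reports the requesting account's owner information; the
    # "ID" field is the canonical ID (the obfuscated account identifier shown
    # under "Account identifiers" in the AWS console).
    canonical_id = s3.list_buckets()["Owner"]["ID"]
    print("Canonical ID to include in the access request:", canonical_id)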

Input Files

The following archives contain the input files for both the Pre-Specified and Ad-Hoc Event portions of the MED17 evaluation.

Dry Run Evaluation

There are two relevant archives for the dry run evaluation:

In addition to the data files, F4DE-3.5.0 contains a scoring primer for MED '17 in the file DEVA/doc/TRECVid-MED17-ScoringPrimer.html that demonstrates how to score a system and prepare a submission file.

Evaluation Tools

Evaluation scripts to support the MED evaluation are included in the NIST Framework for Detection Evaluations (F4DE) toolkit; a link to the latest release can be found on the NIST MIG tools page.
MED17 requires F4DE version 3.5.0 or later. The list of F4DE releases can be found on the F4DE GitHub page.
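
As an illustrative aside, the version requirement amounts to a simple numeric comparison. The sketch below only shows that comparison and makes no assumption about where the installed release string comes from; the version strings used are examples.

    # Minimal check that an F4DE release string satisfies the MED17
    # requirement of version 3.5.0 or later.
    REQUIRED = (3, 5, 0)

    def meets_requirement(version: str) -> bool:
        parts = [int(p) for p in version.split(".")]
        parts += [0] * (3 - len(parts))   # pad e.g. "3.5" -> (3, 5, 0)
        return tuple(parts[:3]) >= REQUIRED

    if __name__ == "__main__":
        for v in ("3.4.1", "3.5.0", "3.6.0"):
            print(v, "OK" if meets_requirement(v) else "too old for MED17")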

Schedule

Consult the TRECVID Master Schedule

Revision History

  • August 15, 2017 - Updated Input Files to include Ad-Hoc materials
  • July 6, 2017 - Updated Data Licensing section.  Added Input Files, Dry Run Evaluation, and Evaluation Tools sections
  • March 30, 2017 - Initial page created

Disclaimer

Any mention of commercial products within NIST web pages is for information only; it does not imply recommendation or endorsement by NIST.
