SHREC 2009 - Shape Retrieval Contest on a New Generic Shape Benchmark

Call for Participation

Introduction

The objective of this shape retrieval contest is to evaluate the performance of 3D shape retrieval approaches on a new generic 3D shape benchmark.

Task description

Given a set of query models, the task is to compute similarity scores against the target models and, for each query, return a ranked list of target models together with their similarity scores.
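
As an illustration of the task (not part of the official call), the sketch below ranks the target models for one query from precomputed descriptor vectors; the descriptor itself is whatever each participant's method produces, and similarity is taken here, purely for illustration, as negative Euclidean distance.

    import math

    # Illustrative only: rank all target models for one query by a
    # similarity score, given precomputed descriptor vectors.  Similarity
    # is taken as negative Euclidean distance between descriptors.
    def rank_targets(query_desc, target_descs):
        """Return (target_id, similarity) pairs, best match first."""
        scored = [(tid, -math.dist(query_desc, desc))
                  for tid, desc in target_descs.items()]
        return sorted(scored, key=lambda pair: pair[1], reverse=True)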

Data set

The new generic benchmark contains 800 3D models in total. The target database consists of 720 complete 3D models, categorized into 40 classes of 18 models each. The 3D models are provided in the ASCII Object File Format (*.off).
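
For reference, a minimal reader for such ASCII OFF files might look as follows (a sketch that assumes a well-formed file starting with the "OFF" keyword, with no comment lines or per-face colors); participants are of course free to use any existing mesh library instead.

    # Minimal ASCII OFF reader (sketch); returns vertex coordinates and
    # face index tuples.
    def read_off(path):
        with open(path) as f:
            tokens = f.read().split()
        if tokens[0] != "OFF":
            raise ValueError(f"not an ASCII OFF file: {path}")
        n_verts, n_faces = int(tokens[1]), int(tokens[2])  # tokens[3] is the (unused) edge count
        pos = 4
        vertices = [tuple(map(float, tokens[pos + 3 * i : pos + 3 * i + 3]))
                    for i in range(n_verts)]
        pos += 3 * n_verts
        faces = []
        for _ in range(n_faces):
            k = int(tokens[pos])  # number of vertices in this face
            faces.append(tuple(map(int, tokens[pos + 1 : pos + 1 + k])))
            pos += 1 + k
        return vertices, faces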

Query set

The query set will contain a total of 80 3D models, two from each class.

Evaluation Methodology

We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain (DCG); Nearest Neighbor, First-Tier (Tier1), and Second-Tier (Tier2).
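
As a rough illustration of three of these measures (not the official evaluation code), the sketch below computes Nearest Neighbor, First-Tier, and Second-Tier scores from the submitted ranked lists, assuming every class contains the same number of relevant target models (18 in this benchmark) and that queries are not themselves part of the target set.

    # `ranked` maps a query id to its ranked list of target ids (best first);
    # `label` maps any model id to its class label.
    def nn_ft_st(ranked, label, class_size=18):
        nn = ft = st = 0.0
        for query, targets in ranked.items():
            relevant = [label[t] == label[query] for t in targets]
            nn += relevant[0]                                   # Nearest Neighbor
            ft += sum(relevant[:class_size]) / class_size       # First Tier
            st += sum(relevant[:2 * class_size]) / class_size   # Second Tier
        n = len(ranked)
        return nn / n, ft / n, st / n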

Procedure

The following list is a step-by-step description of the activities:

  • The participants must register by sending a message to shrec[at]nist[dot]gov. Early registration is encouraged, so that we get an early impression of the number of participants.
  • The target database will be made available via this website, as will the query set against which the evaluation will be run.
  • Participants will submit the ranked lists and similarity scores for each query by email. Up to 5 ranked lists may be submitted, resulting from different runs; each run may use a different algorithm or a different parameter setting. More information on the rank list file format will be provided (an illustrative sketch of writing such a file follows this list).
  • The evaluations will be done automatically.
  • The organization will release the evaluation scores of all the runs.
  • The participants write a short paper describing their method and commenting on the evaluation results.
  • The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
  • The description of the tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (March 29, 2009).
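
Since the official rank list file format is defined by the organizers, the following is only a hypothetical illustration of how one run's ranked lists and similarity scores could be written to a plain-text file; the actual submission format may differ.

    # Hypothetical writer for one run's results (illustrative format only).
    # Each output line is "<query_id> <target_id> <rank> <similarity_score>".
    def write_run(path, results):
        """results: dict mapping query_id -> list of (target_id, score), best first."""
        with open(path, "w") as f:
            for query_id, ranked in results.items():
                for rank, (target_id, score) in enumerate(ranked, start=1):
                    f.write(f"{query_id} {target_id} {rank} {score:.6f}\n")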

Schedule

November 25 - Call for participation.
December 8 - The target data set and a sample query set will be available online.
January 3 - Please register before this date.
January 5 - Distribution of the final query sets. Participants can start the retrieval.
January 12 - Submission of results (ranked lists) and a short paper draft describing the method(s).
January 13 - Distribution of relevance judgments and evaluation scores.
January 25 - Submission of final short papers for the contest proceedings.
February 1 - Track is finished, and results are ready for inclusion in a track report.
February 15 - Camera ready track papers submitted for printing.
March 29 - EUROGRAPHICS Workshop on 3D Object Retrieval including SHREC'09.