SHREC 2011 - Shape Retrieval Contest of Range Scans

Call For Participation:

Introduction

The objective of this shape retrieval contest is to retrieve 3D models that are relevant to a query range scan. This task corresponds to a real-life scenario where the query is a 3D range scan of an object acquired from an arbitrary view direction, and the algorithm must retrieve the relevant 3D objects from a database.

Task description

In response to a given set of queries, the task is to evaluate similarity scores against the target models and return an ordered ranked list, along with the similarity scores, for each query. The query set consists of range images.
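The expected output per query can be sketched as follows. This is a minimal illustration, not a prescribed method: the descriptors and the similarity measure (here a toy negated Euclidean distance) are placeholders for whatever each participant's algorithm computes.

```python
import math

def rank_targets(query_desc, target_descs):
    """Return (model_id, similarity) pairs, best match first.

    query_desc: feature vector for the query range scan (hypothetical).
    target_descs: dict mapping target model id -> feature vector.
    """
    def similarity(a, b):
        # Toy measure: negated Euclidean distance, so higher = more similar.
        return -math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    scored = [(mid, similarity(query_desc, d)) for mid, d in target_descs.items()]
    # The ordered ranked list with similarity scores, as required per query.
    return sorted(scored, key=lambda t: t[1], reverse=True)

ranked = rank_targets([0.0, 1.0], {"m1": [0.0, 1.1], "m2": [3.0, 4.0]})
```

Each submitted run would contain one such ranked list over all 1000 target models for every query.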

Data set

The query set is composed of 150 range images, acquired by capturing 3 range scans of 50 models from arbitrary view directions. The range images were captured with a Minolta laser scanner. The files are in the ASCII Object File Format (*.off), representing each scan as a triangular mesh.

The target database contains 1000 complete 3D models, categorized into 50 classes of 20 models each. The 3D models are likewise stored in the ASCII Object File Format (*.off).
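A minimal reader for the ASCII OFF files, covering the common layout (an `OFF` header line, a counts line, vertex lines, then face lines), might look like this. This is a sketch for orientation only; robust loaders should handle format variants such as headers with counts on the same line.

```python
def read_off(lines):
    """Parse ASCII OFF text (iterable of lines) into (vertices, faces)."""
    tokens = []
    for line in lines:
        line = line.split('#', 1)[0].strip()  # drop comments and blank lines
        if line:
            tokens.extend(line.split())
    if tokens[0] != 'OFF':
        raise ValueError('not an OFF file')
    # Counts line: number of vertices, faces, edges (edge count often unused).
    n_verts, n_faces = int(tokens[1]), int(tokens[2])
    pos = 4
    vertices = []
    for _ in range(n_verts):
        vertices.append(tuple(float(t) for t in tokens[pos:pos + 3]))
        pos += 3
    faces = []
    for _ in range(n_faces):
        k = int(tokens[pos])  # vertex count for this face (3 for triangles)
        faces.append(tuple(int(t) for t in tokens[pos + 1:pos + 1 + k]))
        pos += 1 + k
    return vertices, faces
```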

Evaluation Methodology

We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain; Nearest Neighbor, First-Tier (Tier1) and Second-Tier (Tier2).
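Several of these measures can be sketched for a single query, assuming a ranked list of retrieved models, the query's class label, and the class size (20 in this database, so all 20 class members count as relevant). The definitions below follow common SHREC usage; the official evaluation code may differ in details.

```python
def retrieval_stats(ranked_classes, query_class, class_size):
    """ranked_classes: class labels of retrieved models, best match first."""
    rel = [c == query_class for c in ranked_classes]
    C = class_size  # number of relevant models per query

    nearest_neighbor = 1.0 if rel[0] else 0.0
    first_tier = sum(rel[:C]) / C        # recall within the top C results
    second_tier = sum(rel[:2 * C]) / C   # recall within the top 2C results

    # Average precision: mean of precision at each relevant rank,
    # normalized by the total number of relevant models.
    hits, precisions = 0, []
    for i, r in enumerate(rel, start=1):
        if r:
            hits += 1
            precisions.append(hits / i)
    average_precision = sum(precisions) / C

    return nearest_neighbor, first_tier, second_tier, average_precision
```

Mean Average Precision (MAP) is then the mean of the average precision over all 150 queries.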

Procedure

The following list is a step-by-step description of the activities:

  • Participants must register by sending a message to shrec [at] nist.gov. Early registration is encouraged, so that we get an impression of the number of participants at an early stage.
  • The database will be made available via this website. Similarly, the query set, against which the evaluation will be run, will be made available.
  • Participants will submit the ranked lists and similarity scores for the query set. Up to 5 ranked lists per query set may be submitted, resulting from different runs. Each run may be a different algorithm, or a different parameter setting. More information on the ranked-list file format will be provided.
  • The evaluations will be done automatically.
  • The organization will release the evaluation scores of all the runs.
  • Participants write a one-page description of their method, commenting on the evaluation results, with two figures.
  • The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
  • The description of the tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (April 10, 2011).

Schedule

January 24 - Call for participation.
January 28 - Sample query scans and target models will be available on line.
February 1 - Please register before this date.
February 1 - Distribution of the final query set and target database. Participants can start the retrieval.
February 9 - Submission of results (ranked lists) and a one-page description of the method(s).
February 11 - Distribution of relevance judgments and evaluation scores.
February 13 - Submission of final descriptions for the contest proceedings.
February 15 - Track is finished, and results are ready for inclusion in a track report.
February 22 - Camera-ready track papers submitted for printing.
April 10 - EUROGRAPHICS Workshop on 3D Object Retrieval including SHREC'2011
Created May 6, 2019