
SHREC 2009 - Shape Retrieval Contest of Partial 3D Models

Call For Participation:



There are two objectives of this shape retrieval contest:

  • a) To evaluate partial similarity between query and target objects and retrieve complete 3D models that are relevant to a partial query object.
  • b) To retrieve 3D models that are relevant to a query depth map. This task corresponds to a real-life scenario where the query is a 3D range scan of an object acquired from an arbitrary view direction. The algorithm should retrieve the relevant 3D objects from a database.

Task description

In response to a given set of queries, the task is to evaluate similarity scores against the target models and return a ranked list, along with the similarity scores, for each query. The queries are either partial 3D models or range images. Participants may submit ranked lists for either query set or for both; there is no obligation to cover both sets.
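As a minimal sketch of what a submission pipeline computes, the snippet below scores every target model against a query and returns a ranked list with similarity scores. The descriptors and the cosine-similarity scoring are purely illustrative stand-ins (each participant's own partial-matching method defines the real score); the function and variable names are hypothetical.

```python
import math

def rank_targets(query_desc, target_descs):
    """Return [(model_id, similarity)] sorted by descending similarity.

    query_desc: a feature vector for the query (illustrative only).
    target_descs: dict mapping model id -> feature vector.
    """
    def similarity(a, b):
        # Cosine similarity as a placeholder for any real matching score.
        dot = sum(x * y for x, y in zip(a, b))
        na = math.sqrt(sum(x * x for x in a))
        nb = math.sqrt(sum(y * y for y in b))
        return dot / (na * nb) if na and nb else 0.0

    scores = [(mid, similarity(query_desc, d)) for mid, d in target_descs.items()]
    return sorted(scores, key=lambda s: s[1], reverse=True)
```

For example, `rank_targets([1, 0], {"m1": [1, 0], "m2": [0, 1], "m3": [1, 1]})` ranks `m1` first, then `m3`, then `m2`.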

Data set

The first query set consists of 20 partial 3D models, obtained by cutting parts from complete models. The objective is to retrieve the complete models to which the query part may belong. The file format for the partial query models is the ASCII Object File Format (*.off).

The second query set is composed of 20 range images, acquired by capturing range data of 20 models from arbitrary view directions using a desktop 3D scanner. The file format is the ASCII Object File Format (*.off), representing each scan as a triangular mesh.

The target database is the same for both of the query sets and it contains 720 complete 3D models, which are categorized into 40 classes. In each class there are 18 models. The file format to represent the 3D models is the ASCII Object File Format (*.off).
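All three data sets use the ASCII Object File Format. A minimal reader for the common OFF layout (an `OFF` header line, a counts line, then vertex coordinates and face index lists) might look like the following; this is a sketch for orientation, not the contest's official tooling, and it ignores optional extensions such as per-vertex colors.

```python
def read_off(lines):
    """Parse ASCII OFF data into (vertices, faces).

    Assumes the common layout: an 'OFF' header line, a line with
    vertex/face/edge counts, vertex coordinate triples, then faces as
    'k i1 ... ik'. Lines after a '#' are treated as comments.
    """
    tokens = []
    for line in lines:
        line = line.split('#', 1)[0].strip()
        if line:
            tokens.extend(line.split())
    if tokens[0] != 'OFF':
        raise ValueError('not an ASCII OFF file')
    n_verts, n_faces = int(tokens[1]), int(tokens[2])
    pos = 4  # skip 'OFF' and the three counts (vertices, faces, edges)
    verts = []
    for _ in range(n_verts):
        verts.append(tuple(float(t) for t in tokens[pos:pos + 3]))
        pos += 3
    faces = []
    for _ in range(n_faces):
        k = int(tokens[pos])  # number of indices in this face
        faces.append(tuple(int(t) for t in tokens[pos + 1:pos + 1 + k]))
        pos += 1 + k
    return verts, faces
```

For a single triangle, `read_off("OFF\n3 1 0\n0 0 0\n1 0 0\n0 1 0\n3 0 1 2\n".splitlines())` returns three vertices and one face `(0, 1, 2)`.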

Evaluation Methodology

We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain; Nearest Neighbor, First-Tier (Tier1) and Second-Tier (Tier2).


The following list is a step-by-step description of the activities:

  • The participants must register by sending a message to shrec [at]. Early registration is encouraged, so that we get an impression of the number of participants at an early stage.
  • The database will be made available via this website, as will the query sets against which the evaluation will be run.
  • Participants will submit the ranked lists and similarity scores for one of the two query sets or both; there is no obligation to submit results for both sets. Up to 5 ranked lists per query set may be submitted, resulting from different runs. Each run may be a different algorithm or a different parameter setting. More information on the rank list file format will be made available via this website.
  • The evaluations will be done automatically.
  • The organization will release the evaluation scores of all the runs.
  • The participants write a short paper describing their method and commenting on the evaluation results.
  • The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
  • The description of the tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (March 29, 2009).


Schedule

November 25 - Call for participation.
December 8 - The target data set and a sample query set will be available online. For 3D partial models we provide 11 sample queries, and for range scans we provide 5 sample queries. The final query set will not contain the sample queries.
January 3 - Please register before this date.
January 5 - Distribution of the final query sets. Participants can start the retrieval.
January 12 - Submission of results (ranked lists) and a short paper draft describing the method(s).
January 13 - Distribution of relevance judgments and evaluation scores.
January 25 - Submission of final short papers for the contest proceedings.
February 1 - Track is finished, and results are ready for inclusion in a track report.
February 15 - Camera-ready track papers submitted for printing.
March 29 - EUROGRAPHICS Workshop on 3D Object Retrieval including SHREC'09
Created May 7, 2019