
SHREC 2010 - Shape Retrieval Contest of Range Scans

Call For Participation:

SHREC 2010 - Shape Retrieval Contest of Range Scans


The objective of this shape retrieval contest is to retrieve the 3D models that are relevant to a query range scan. This task corresponds to a real-life scenario in which the query is a 3D range scan of an object acquired from an arbitrary view direction, and the algorithm must retrieve the relevant 3D objects from a database.

Task description

Given a set of queries, the task is to evaluate similarity scores against the target models and return an ordered ranked list, along with the similarity scores, for each query. The query set consists of range images.
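As a sketch of the required output, ranking the target models for one query might look like the following. This is a hypothetical helper, not the official submission format: the actual ranked-list file format is specified by the organizers, and the `model_id score` layout here is only illustrative.

```python
def write_ranked_list(scores, path):
    """Write an ordered ranked list for one query.

    scores: dict mapping target model id -> similarity score
            (higher = more similar). The exact file layout below
    ("model_id score" per line, descending) is an assumption for
    illustration only; consult the official format description.
    """
    ranked = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    with open(path, "w") as f:
        for model_id, score in ranked:
            f.write(f"{model_id} {score:.6f}\n")
    return [model_id for model_id, _ in ranked]
```

Each of the up to 5 submitted runs would then produce one such ranked list per query.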

Data set

The query set is composed of 120 range images, acquired by capturing 3 range scans of each of 40 models from arbitrary view directions. The range images were captured with a Minolta laser scanner. Each scan is stored as a triangular mesh in the ASCII Object File Format (*.off).

The target database contains 800 complete 3D models, categorized into 40 classes of 20 models each. The 3D models are also stored in the ASCII Object File Format (*.off).
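A minimal reader for the ASCII OFF files used by both the query and target sets can be sketched as follows, assuming well-formed files with the standard `OFF` header followed by vertex/face/edge counts, vertex coordinates, and face index lists:

```python
def read_off(path):
    """Parse an ASCII OFF mesh into (vertices, faces).

    Minimal sketch: assumes a well-formed file. Tokenizing the whole
    file makes the parser robust to line-break variations.
    """
    with open(path) as f:
        tokens = f.read().split()
    assert tokens[0] == "OFF", "not an ASCII OFF file"
    n_verts, n_faces = int(tokens[1]), int(tokens[2])
    idx = 4  # skip "OFF" and the three counts (verts, faces, edges)
    verts = []
    for _ in range(n_verts):
        verts.append(tuple(float(t) for t in tokens[idx:idx + 3]))
        idx += 3
    faces = []
    for _ in range(n_faces):
        k = int(tokens[idx])  # vertices in this face (3 for triangles)
        faces.append(tuple(int(t) for t in tokens[idx + 1:idx + 1 + k]))
        idx += 1 + k
    return verts, faces
```

Since the scans are triangular meshes, every face index list should have length 3 for this data set.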

Evaluation Methodology

We will employ the following evaluation measures: Precision-Recall curve; Average Precision (AP) and Mean Average Precision (MAP); E-Measure; Discounted Cumulative Gain (DCG); and Nearest Neighbor, First-Tier (Tier1), and Second-Tier (Tier2).


The following list is a step-by-step description of the activities:

  • The participants must register by sending a message to SHREC[at]nist[dot]gov. Early registration is encouraged, so that we get an early impression of the number of participants.
  • The target database will be made available via this website, as will the query set against which the evaluation will be run.
  • Participants will submit ranked lists and similarity scores for the query set. Up to 5 ranked lists per query set may be submitted, each resulting from a different run; a run may use a different algorithm or a different parameter setting. More information on the ranked-list file format will be made available.
  • The evaluations will be done automatically.
  • The organization will release the evaluation scores of all the runs.
  • The participants write a one-page description of their method, commenting on the evaluation results, with two figures.
  • The track results are combined into a joint paper, published in the proceedings of the Eurographics Workshop on 3D Object Retrieval.
  • The tracks and their results are presented at the Eurographics Workshop on 3D Object Retrieval (May 2, 2010).


Schedule:

January 26 - Call for participation.
January 29 - The target data set and a sample query set will be available online. We will provide 5 sample queries; the final query set will not contain the sample queries.
February 3 - Please register before this date.
February 3 - Distribution of the final query sets. Participants can start the retrieval.
February 13 - Submission of results (ranked lists) and a one-page description of their method(s).
February 17 - Distribution of relevance judgments and evaluation scores.
February 20 - Submission of final descriptions (two pages) for the contest proceedings.
February 24 - Track is finished, and results are ready for inclusion in a track report.
March 7 - Camera-ready track papers submitted for printing.
May 2 - EUROGRAPHICS Workshop on 3D Object Retrieval, including SHREC 2010.
Created May 7, 2019