NIST 2020 CTS Speaker Recognition Challenge


Following the success of the 2019 Conversational Telephone Speech (CTS) Speaker Recognition Challenge, which received 1347 submissions from 67 academic and industrial organizations, the US National Institute of Standards and Technology (NIST) will be organizing a 2020 CTS Challenge, the next iteration of an ongoing series of speaker recognition evaluations conducted by NIST since 1996.

Similar to the 2019 CTS Challenge, the 2020 evaluation will feature a leaderboard-style challenge offering an open/unconstrained training condition, but using CTS recordings extracted from multiple data sources containing multilingual speech. In addition, unlike the 2019 CTS Challenge, no Development set will be released.


 The objectives of the evaluation series are (1) for NIST to effectively measure system-calibrated performance of the current state of technology, (2) to provide a common test bed that enables the research community to explore promising new ideas in speaker recognition, and (3) to support the community in their development of advanced technology incorporating these ideas. The evaluations are intended to be of interest to all researchers working on the general problem of text-independent speaker recognition. To this end, the evaluations are designed to focus on core technology issues and to be simple and accessible to those wishing to participate.

Participation in the 2020 CTS Challenge is open to all who find the evaluation of interest and are able to comply with the evaluation rules set forth in the evaluation plan. This page will be updated once the evaluation plan becomes available.

Evaluation Plan

2020 NIST CTS Challenge Evaluation Plan



Results (updated Nov 30)

Results on the Test set will be published here periodically (see Disclaimer below).


Rank  Team           Set   Submission        EER (%)  Min Cost  Act Cost
 1    JHU-MIT        TEST  20201116-093757     3.20     0.087     0.090
 2    TEAM-CDPE-28   TEST  20201125-191214     3.25     0.093     0.099
 3    THUEE          TEST  20201030-215029     3.06     0.106     0.107
 4    AAP            TEST  20201106-132023     3.63     0.165     0.167
 5    BiometricVox   TEST  20201031-061334     3.95     0.144     0.203
 6    I2R            TEST  20201023-040028     4.24     0.177     0.208
 7    Veridas        TEST  20201128-025536     4.69     0.211     0.222
 8    ROXANNE        TEST  20201023-131702     4.29     0.174     0.273
 9    TEAM-QSYF-27   TEST  20201019-043832     5.70     0.283     0.381
10    Elektronika    TEST  20201010-060518    12.55     0.570     0.625
11    BGU            TEST  20201124-104658     5.88     0.307     0.866
12    GRD            TEST  20201021-022551    20.31     0.733     0.951
13    dBLab          TEST  20201102-052612    24.58     0.950     1.000
14    XMUSPEECH      TEST  20201117-011447    13.40     0.582     3.522
15    TEAM-QTUY-05   TEST  20201030-141025    21.83     0.828    11.951
16    STC            TEST  20201114-155620     3.49     0.147    19.000
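The numeric columns follow standard NIST speaker-detection metrics: equal error rate (EER) and minimum/actual normalized detection cost. As a minimal sketch of how such metrics are computed from trial scores (the exact 2020 cost parameters are defined in the evaluation plan; this example assumes a single target prior P_target = 0.01 with unit miss and false-alarm costs):

```python
import numpy as np

def error_rates(scores, labels, threshold):
    # labels: 1 = target trial (same speaker), 0 = non-target trial
    tgt = scores[labels == 1]
    non = scores[labels == 0]
    p_miss = np.mean(tgt < threshold)   # targets incorrectly rejected
    p_fa = np.mean(non >= threshold)    # non-targets incorrectly accepted
    return p_miss, p_fa

def eer(scores, labels):
    # Sweep candidate thresholds; the EER lies where the miss and
    # false-alarm rates cross.
    thresholds = np.sort(np.unique(scores))
    best = min(thresholds,
               key=lambda t: abs(np.subtract(*error_rates(scores, labels, t))))
    p_miss, p_fa = error_rates(scores, labels, best)
    return (p_miss + p_fa) / 2

def detection_cost(p_miss, p_fa, p_target=0.01, c_miss=1.0, c_fa=1.0):
    # Normalized NIST detection cost at a single operating point.
    c_det = c_miss * p_miss * p_target + c_fa * p_fa * (1 - p_target)
    c_default = min(c_miss * p_target, c_fa * (1 - p_target))
    return c_det / c_default
```

The "actual" cost applies each system's own decision threshold, so it also reflects calibration quality; the "minimum" cost uses the best threshold in hindsight, which is why the two diverge for poorly calibrated systems (e.g. rank 16 above).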


Participants are allowed to publish the leaderboard results unaltered, but they must not make advertising claims about their standing/ranking in the evaluation, or winning the evaluation, or claim NIST or the U.S. Government endorsement of their system(s) or commercial product(s). See the evaluation plan for more details regarding the participation rules in the NIST CTS Challenge.


For more information about the challenge, please send questions to the organizers. For CTS Challenge discussion, please visit our Google Group.


Created July 15, 2020, Updated January 14, 2021