Challenge Design and Lessons Learned from the 2018 Differential Privacy Challenges
Diane Ridgeway, Mary Theofanos, Terese Manley, Christine Task
The push for open data has made a multitude of datasets publicly available, enabling researchers to analyze them using various statistical and machine learning methods in support of policy development. Public safety data is an area of increasing interest, and it can include both sensitive information and Personally Identifiable Information (PII). Release of sensitive data and PII can lead to individual and organizational harm; however, removing PII alone is insufficient to prevent linkage attacks -- the process of combining unrelated data to identify individuals and entities. A growing body of academic research in differential privacy claims strict mathematical guarantees of data privacy, though potentially at a greater loss of dataset utility. In 2017, the National Institute of Standards and Technology (NIST) Public Safety Communications Research (PSCR) Division initiated efforts to test, evaluate, and strengthen research in differential privacy and to add to its growing body of knowledge by making open source algorithms available for public safety use. This publication describes the design and results of PSCR's multi-phased innovation prize challenge and makes recommendations for conducting future challenges in differential privacy.
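For readers unfamiliar with how differential privacy trades privacy for utility, the sketch below illustrates the standard Laplace mechanism for releasing a count. This is general background, not code from the NIST challenge: a query with sensitivity 1 (one person changes the count by at most 1) is perturbed with Laplace noise of scale 1/epsilon, so a smaller epsilon gives a stronger privacy guarantee but a noisier, less useful answer.

```python
import math
import random

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Release a count with epsilon-differential privacy by adding
    Laplace noise of scale sensitivity/epsilon (the Laplace mechanism).
    Lower epsilon -> stronger privacy, but a noisier (less useful) result."""
    scale = sensitivity / epsilon
    u = random.random() - 0.5                    # uniform on [-0.5, 0.5)
    # Inverse-CDF sampling from the Laplace(0, scale) distribution.
    noise = -scale * math.copysign(1.0, u) * math.log(1.0 - 2.0 * abs(u))
    return true_count + noise
```

The noise is unbiased, so aggregate statistics over many noisy releases still converge to the truth, while any single individual's presence is masked; this is exactly the privacy/utility tension the challenge algorithms were scored on.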
Ridgeway, D., Theofanos, M., Manley, T. and Task, C., Challenge Design and Lessons Learned from the 2018 Differential Privacy Challenges, Technical Note (NIST TN), National Institute of Standards and Technology, Gaithersburg, MD, [online], https://doi.org/10.6028/NIST.TN.2151, https://tsapps.nist.gov/publication/get_pdf.cfm?pub_id=931343 (Accessed September 20, 2021)