Create guidelines for agencies to evaluate the efficacy of differential-privacy-guarantee protections, including for AI (EO Sec. 9(b))
The comment period has closed on the Initial Public Draft of the Guidelines for Evaluating Differential Privacy Guarantees. This publication fulfills one of NIST's assignments under EO 14110.
NIST released draft Guidelines for Evaluating Differential Privacy Guarantees (SP 800-226) for public comment on December 11, 2023.
The document is intended to help agencies and practitioners of all backgrounds—policy makers, business owners, product managers, IT technicians, software engineers, data scientists, researchers, and academics—better understand how to evaluate promises made (and not made) when deploying differential privacy, including for privacy-preserving machine learning. The draft also provides a supplemental interactive software archive that illustrates how to achieve differential privacy and other concepts described in the publication.
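The kind of guarantee the guidelines teach readers to evaluate can be illustrated with a minimal sketch of the Laplace mechanism, the classic way to achieve epsilon-differential privacy for a counting query. This example is illustrative only and is not drawn from SP 800-226 or its supplemental software archive; the function name and data are hypothetical.

```python
import numpy as np

def laplace_count(data, predicate, epsilon, rng=None):
    """Release a differentially private count of records matching predicate.

    A counting query has sensitivity 1 (adding or removing one record
    changes the count by at most 1), so adding Laplace noise with scale
    1/epsilon satisfies epsilon-differential privacy.
    """
    rng = rng or np.random.default_rng()
    true_count = sum(1 for record in data if predicate(record))
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Hypothetical data: how many individuals are 40 or older?
ages = [34, 41, 29, 58, 63, 22, 47]
noisy_count = laplace_count(ages, lambda a: a >= 40, epsilon=1.0)
```

Smaller values of epsilon give stronger privacy but noisier answers; evaluating whether a chosen epsilon delivers a meaningful protection in practice is exactly the kind of question the guidelines address.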
Comments were accepted until January 25, 2024. Visit the publication page for additional details.