The Center for AI Standards and Innovation (CAISI) has signed a collaborative research and development agreement (CRADA) with OpenMined, a 501(c)(3) non-profit that develops open-source software infrastructure for secure computation across organizational boundaries. Under this agreement, CAISI and OpenMined will collaborate on research into privacy-preserving methods for conducting AI evaluations, enabling rigorous measurement of AI systems even when the underlying data, models, or benchmarks must remain confidential due to, for example, intellectual property concerns, data protection requirements, or national security considerations.
As AI evaluations are increasingly intended to reflect or predict real-world deployments, researchers need access to real-world and often sensitive data. At the same time, it is crucial that such data be shared in a secure and decentralized manner, in order to safeguard intellectual property, encourage innovation, and maintain privacy.
This collaboration will leverage OpenMined’s software infrastructure, including PySyft and subsequent advances, to conduct evaluations that address both the security requirements of AI developers and data owners and the scientific rigor demanded by researchers and evaluators.
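The core pattern behind such infrastructure can be sketched in plain Python. This is an illustrative sketch only, not the PySyft API: all class and function names below are hypothetical. The idea it demonstrates is that the data owner keeps the benchmark and model behind their own boundary, runs evaluation code submitted by an external evaluator, and releases only an aggregate metric rather than individual examples or model outputs.

```python
# Hypothetical sketch of privacy-preserving evaluation (not the PySyft API):
# the evaluator submits a metric function; the data owner runs it locally and
# returns only an aggregate score, so private data never crosses the boundary.

from typing import Callable, List, Tuple


class DataOwner:
    """Holds a private benchmark and model; runs submitted evaluation code."""

    def __init__(self, benchmark: List[Tuple[str, str]], model: Callable[[str], str]):
        self._benchmark = benchmark   # private: (prompt, expected_answer) pairs
        self._model = model           # private: the system under evaluation

    def run_evaluation(self, metric: Callable[[str, str], float]) -> float:
        # Score every private example, but release only the aggregate.
        scores = [metric(self._model(prompt), expected)
                  for prompt, expected in self._benchmark]
        return sum(scores) / len(scores)


# Evaluator's side: define a metric and receive only the aggregate result.
def exact_match(prediction: str, expected: str) -> float:
    return 1.0 if prediction.strip() == expected.strip() else 0.0


if __name__ == "__main__":
    # Toy stand-ins for a confidential benchmark and a proprietary model.
    private_benchmark = [("2+2", "4"), ("capital of France", "Paris"), ("3*3", "9")]
    toy_model = lambda p: {"2+2": "4", "capital of France": "Paris"}.get(p, "?")

    owner = DataOwner(private_benchmark, toy_model)
    accuracy = owner.run_evaluation(exact_match)
    print(f"aggregate accuracy: {accuracy:.2f}")  # evaluator sees only this number
```

Real systems such as PySyft add the pieces this sketch omits: authenticated remote execution, owner review and approval of submitted code, and protections against metrics that would leak individual records.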
The insights generated from this collaboration will support NIST's efforts in AI security and applied AI evaluation. This research will inform the development of voluntary standards, best practices, and future recommendations for AI practitioners and adopters on how to effectively measure AI systems, e.g., for workforce or productivity uplift and other impacts.
For questions, please contact caisi-metrology [at] nist.gov.