NIST works with the AI community to identify the building blocks needed to cultivate trust and to ensure that AI systems are accurate, reliable, safe, secure, resilient, robust, explainable, and interpretable, and that they mitigate bias while taking privacy and fairness into account. To foster collaboration and develop a shared understanding of what constitutes trustworthy AI, NIST has been organizing a series of workshops that bring together government, industry, academia, and other stakeholders from the US and around the world. The workshops focus on advancing the development of AI standards, guidelines, and related tools.
Recent Workshops and Events
Video on NIST's Responsibilities Under the President's Executive Order (EO) on Safe, Secure, and Trustworthy Artificial Intelligence, issued October 30, 2023
NIST released a video with more information about its role in the EO on November 9, 2023. Watch now.
November 17, 2023 | Washington, DC (Hybrid)
NIST hosted a workshop on November 17, 2023, to engage in a conversation about artificial intelligence (AI) safety. The hybrid workshop was held at the Department of Commerce in Washington, DC, with options for virtual or in-person attendance. View recording, agenda, and slides.
NIST relies on and encourages robust interactions with companies, universities, nonprofits, and other government agencies in driving and carrying out its AI agenda. There are multiple ways to engage with NIST, including:
NIST Draft Reports: NIST counts on stakeholders to review drafts of reports on a variety of AI issues. Drafts typically are prepared based on inputs from private and public sector individuals and organizations and then offered for public review on NIST’s AI website and via email alerts. Public comments help to improve them. Sign up for AI-related emails here.
Workshops: NIST convenes experts for single-day, multi-day, and multi-week sessions to tackle key characteristics of AI trustworthiness and other AI-related topics. All workshops are virtual for now and readily accessible. Most use online discussion forums to encourage two-way communication.
Requests for Information (RFIs): NIST sometimes uses formal RFIs to alert the public about its AI activities and to gain insights into specific AI issues. For example, an RFI was issued to help develop the AI Risk Management Framework.
AI Visiting Fellows: Accomplished fellows bring thought leadership to foundational research for trustworthy and responsible AI, use-inspired AI research, AI hardware research, and AI-related standards and evaluations conducted in NIST laboratories.
Student Programs: NIST offers a range of opportunities for students to engage with NIST on AI-related work. That includes the Professional Research Experience Program (PREP), which provides valuable laboratory experience and financial assistance to undergraduate, graduate, and post-graduate students.
Grants: NIST offers some financial assistance to support collaborative research including AI projects.
Sign up for AI email alerts here. If you have questions or ideas about how to engage with us on AI topics, or ideas about NIST's AI activities, send us an email: ai-inquiries [at] nist.gov