
Executive Order on Safe, Secure, and Trustworthy Artificial Intelligence

NIST and the Engineering Biology Research Consortium (EBRC) will develop safety tools for synthetic biology to defend against potential misuse of AI. Read the announcement.

The comment period has closed on the NIST Request for Information (RFI) related to its responsibilities under Executive Order 14110 on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence. NIST will continue to accept responses to the issues cited in the RFI, although the tight timetables in the Executive Order will determine whether late comments can be considered. View comments received. Responses may be sent to ai-inquiries@nist.gov and should follow the formatting guidance outlined in Federal Register Notice 88 FR 88368.

NIST’s Responsibilities Under the October 30, 2023 Executive Order

Overview

The President’s Executive Order (EO) on Safe, Secure, and Trustworthy Artificial Intelligence (14110) issued on October 30, 2023, charges multiple agencies – including NIST – with producing guidelines and taking other actions:

  • Develop a companion resource to the NIST AI Risk Management Framework (NIST AI 100-1) for generative AI.
  • Develop a companion resource to the NIST Secure Software Development Framework to incorporate secure-development practices for generative AI and dual-use foundation models.
  • Launch a new initiative to create guidance and benchmarks for evaluating AI capabilities, with a focus on capabilities that could cause harm.
  • Develop and help to ensure the availability of testing environments in coordination with the Department of Energy (DoE) and the National Science Foundation (NSF).
  • Establish guidelines and processes – except for AI used as a component of a national security system – to enable developers of generative AI, especially dual-use foundation models, to conduct AI red-teaming tests that support the deployment of safe, secure, and trustworthy systems.
  • Initiate an effort to engage with industry and relevant stakeholders to develop guidelines for possible use by synthetic nucleic acid sequence providers. 
  • Develop a report to the Director of OMB and the Assistant to the President for National Security Affairs identifying existing standards, tools, methods, and practices, as well as the potential development of further science-backed standards and techniques, for authenticating, labeling, or detecting synthetic content; preventing generative AI from producing child sexual abuse material or non-consensual intimate imagery of real individuals; and testing software for the above-mentioned purposes.
  • Create guidelines for agencies to evaluate the efficacy of differential-privacy-guarantee protections, including for AI.
  • Establish a plan for global engagement on promoting and developing AI standards.

Most of the EO tasks assigned to NIST have a 270-day deadline.

NIST will seek public comment on draft documents produced under the EO.

For EO-related questions, email ai-inquiries@nist.gov.

NIST's due dates under Executive Order 14110

