
Draft White Paper on Combinatorial Methods for Explainability in AI and Machine Learning

NIST has released a draft white paper for public comment: "An Application of Combinatorial Methods for Explainability in Artificial Intelligence and Machine Learning." Comments are due by July 3, 2019.

This short paper introduces an approach to producing explanations or justifications of decisions made in some artificial intelligence and machine learning (AI/ML) systems, using methods derived from those for fault location in combinatorial testing. We show that validation and explainability issues are closely related to the problem of fault location in combinatorial testing, and that certain methods and tools developed for fault location can also be applied to this problem. The approach is particularly useful in classification problems, where the goal is to determine an object's membership in a set based on its characteristics. We use a conceptually simple scheme to make classification decisions easy to justify: identifying combinations of features that are present in members of the identified class but absent or rare in non-members. The method has been implemented in a prototype tool called ComXAI, and examples from a range of application domains illustrate its utility.
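The core idea described above can be sketched in a few lines of code. The following is an illustrative example only, not the ComXAI implementation: it enumerates t-way combinations of (feature, value) pairs and keeps those that appear in at least one member of the class but in no non-member, yielding candidate explanations. All function and variable names here are hypothetical.

```python
from itertools import combinations

def distinguishing_combinations(members, non_members, t):
    """Return t-way (feature, value) combinations that occur in at
    least one class member but in no non-member. Each example is a
    dict mapping feature name -> value."""
    def t_way(example):
        # Sort pairs so combinations are order-independent.
        pairs = sorted(example.items())
        return set(combinations(pairs, t))

    member_combos = set()
    for ex in members:
        member_combos |= t_way(ex)

    non_member_combos = set()
    for ex in non_members:
        non_member_combos |= t_way(ex)

    # Combinations seen only in members serve as justifications
    # for classifying an object into the class.
    return member_combos - non_member_combos

# Toy example: explain why an animal was classified as a bird.
birds = [{"wings": "yes", "legs": 2, "fur": "no"}]
mammals = [{"wings": "no", "legs": 4, "fur": "yes"},
           {"wings": "no", "legs": 2, "fur": "yes"}]
explanations = distinguishing_combinations(birds, mammals, t=2)
# e.g. (("fur", "no"), ("wings", "yes")) is present in the bird
# but in neither mammal, so it justifies the classification.
```

In practice, a fault-location tool would also rank combinations by how rarely they occur in non-members rather than requiring strict absence, mirroring the "absent or rare" criterion in the paper's description.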

The public comment period for this document ends on July 3, 2019.  See the document details for a copy of the paper and instructions for submitting comments.

Released May 22, 2019, Updated June 7, 2019