
SAMATE Tool Taxonomy



1 What We Want of a Taxonomy
   1.1 What Do We Mean by Tool, Function, etc.?
   1.2 Classification Scheme Desiderata
   1.3 Questions a Taxonomy Should Address
2 A Taxonomy of Tools
   2.1 Life Cycle Process or Activity
   2.2 Automation Level
   2.3 Approach
   2.4 Viewpoint
3 Other Useful Data
   3.1 Assessment vs. Development
   3.2 Sponsor
   3.3 Price
   3.4 Platforms
   3.5 Languages/Formats
   3.6 Assessment/Quality
   3.7 Run time
4 References

What We Want of a Taxonomy 

The SAMATE project needs a taxonomy, or classification, of software assurance (SwA) tools and techniques to

  • consistently group and discuss SwA tools and techniques,
  • identify gaps and research needs, and
  • help decide where to put effort. 

What Do We Mean by Tool, Function, etc.? 

Here we use tool to mean a single distinct function or (manual) technique. Function means something producing a (software assurance) result for the user. A Source Code Security Analysis tool looking for weaknesses is a function. A parser is not (unless it, too, reports flaws while parsing).

  • Although we may speak of "a tool", a single program may perform many different functions. Thus one program may be classified under several tool classes.
  • This taxonomy encompasses processes and manual techniques, too. For instance, quality is best designed in at the start, and cannot be effectively "tested in" later. Correct-by-construction may be far better than debugging later. Manual code reviews have a place, too.
  • This taxonomy doesn't classify underlying algorithms: it doesn't matter to tool testing how a checker catches, say, buffer overflows; only if it does. (Of course, different processing techniques may make a huge difference in the accuracy of results or scalability.) But classifying a tool shouldn't depend on possibly proprietary information about how it works.

Classification Scheme Desiderata 

As far as possible, a classification scheme should be

  • Unambiguous: one unique classification for each tool
  • Objective: classify by mechanically comparing attributes
  • Orthogonal: few nonsensical classes
  • Comprehensive: few "other" or "unclassified" entries
  • Easy to use: classes correspond to common notions. Also, one can find relevant classes quickly
  • Usefully distinctive: doesn't combine intuitively different tools or separate tools in the same class. Also, one specification covers all tools in a class. Tools in the same class should work on the same kind of flaws.

Questions a Taxonomy Should Address 

What questions need to be answered to complete the SA Tool/Function taxonomy?

Regarding Tool Classes

  • What classes of tools are currently used to identify potential vulnerabilities in applications?
  • What is the order of importance of those tools (which have the highest impact in identifying or precluding application vulnerabilities)?
  • What tool classes are most mature?
  • What tool classes are most common?
  • What are the gaps in capabilities amongst tools of the same class?
  • What are the gaps in capabilities for a tool class in general?
  • What classes of tools are missing from the taxonomy of SA tools below?

Regarding Tool Capabilities

  • What are the capabilities that define a particular tool class?
  • What capabilities are required for a particular class of tool?
  • What is the domain for each capability?
  • How would each capability be described in a functional specification for that SwA tool?

One way to validate a taxonomy is to try classifying actual tools; this is done in the Tool Survey.

A Taxonomy of Tools

This is a proposed taxonomy. We welcome your comments and suggestions.

This taxonomy is a faceted classification, possibly with further hierarchical organization within each class. There are four facets: life cycle process, automation level, approach, and viewpoint.

Life Cycle Process or Activity 

Primary tools and techniques are used at different times in the software life cycle. Support tools and techniques, such as management and configuration tools, apply across the life cycle. This taxonomy unifies IEEE/EIA 12207-1996 [1], Kornecki and Zalewski [2], and SWEBOK 2004 [3].

The notation [n a.b.c] means section a.b.c of reference n below.

Primary Processes

Requirements [1 5.3.2 & 5.3.4] [2] [3 1.1]
Design [1 5.3.3, 5.3.5, & 5.3.6] [2] [3 1.2]
Construction [1 5.3.7] [2] [3 1.3]
Acquisition [1 5.1]
Maintenance [1 5.5] [3 1.5]
Testing [1 5.3.7 - 5.3.11] [2] [3 1.4 & 1.9]
Operation [1 5.4]

Supporting Processes

SWEBOK [3] lists other categories: Miscellaneous Tools and Software Engineering Methods (Heuristic, Formal, and Prototyping).

Automation Level 

How much does the tool do by itself, and how much does the human need to do?

0. Manual procedure

e.g., code review

1. Analysis aid

e.g., call graph extractor

2. Semi-automated

automated results with manual interpretation, e.g., a static analyzer reporting potential flaws, or an intrusion detection system

3. Automated

e.g., firewall


Approach 

What approach does this tool or technique take to software assurance?

  • Preclude
proactively make flaws impossible, e.g., correct by construction
  • Detect
find flaws, e.g., checkers, testers
  • Mitigate
reduce or eliminate flaw impact, e.g., security kernel, MLS
  • React
take actions upon an event
  • Appraise
report information, e.g., complexity metrics or call graphs


Viewpoint 

Can we see or "poke at" the internals? External tools do not have access to application software code or configuration and audit data. Internal tools do.

  • External (black box)
e.g., acceptance testing of COTS packages or a Web site penetration tester
  • Internal (white box)
    • Static
      e.g., code scanners
    • Dynamic
      e.g., wrappers, execution monitoring
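The four facets above can be sketched as a simple record type. This is only a minimal illustration, not part of the taxonomy itself; the class and field names are hypothetical, and the enumeration values abbreviate the classes listed above:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical enumerations abbreviating the facet values defined above.
class Process(Enum):
    REQUIREMENTS = "requirements"
    DESIGN = "design"
    CONSTRUCTION = "construction"
    TESTING = "testing"
    MAINTENANCE = "maintenance"
    OPERATION = "operation"

class Automation(Enum):
    MANUAL = 0          # e.g., code review
    ANALYSIS_AID = 1    # e.g., call graph extractor
    SEMI_AUTOMATED = 2  # e.g., static analyzer for potential flaws
    AUTOMATED = 3       # e.g., firewall

class Approach(Enum):
    PRECLUDE = "preclude"
    DETECT = "detect"
    MITIGATE = "mitigate"
    REACT = "react"
    APPRAISE = "appraise"

class Viewpoint(Enum):
    EXTERNAL = "external (black box)"
    INTERNAL_STATIC = "internal (white box), static"
    INTERNAL_DYNAMIC = "internal (white box), dynamic"

@dataclass
class ToolFunction:
    """One distinct function, classified along the four facets."""
    name: str
    process: Process
    automation: Automation
    approach: Approach
    viewpoint: Viewpoint

# A source code security analyzer, classified along all four facets:
scanner = ToolFunction("source code security analyzer",
                       Process.TESTING, Automation.SEMI_AUTOMATED,
                       Approach.DETECT, Viewpoint.INTERNAL_STATIC)
print(scanner.automation.value)  # 2
```

Orthogonality of the facets shows up directly here: any combination of the four enumeration values is a legal classification, and one program performing several functions would simply yield several such records.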

Other Useful Data 

Tools would not be classified by these (one wouldn't separate commercial from academic tools functionally), but such information would be useful.

Assessment vs. Development 

"DO-178B differentiates between verification tools that cannot introduce errors but may fail to detect them and development tools whose output is part of airborne software and thus can introduce errors." [2, page 19] (Emphasis in the original.)


Sponsor 

Who fixes it? Can I get it?

  • Academic
  • Commercial
  • Open
  • Proprietary
used within a company, either as a service or on their own products.


Price 

  • 0
  • $ (nominal, e.g., up to about $17)
  • $$ (up to a few hundred dollars)
  • $$$ (significant, thousands of dollars)

Cost of use is a related, but separate, consideration.


Platforms 

What does it run on? Linux, Windows, Solaris, ...


Languages/Formats 

What is the target language or format? C++, Java, bytecode, UML, ...


Assessment/Quality 

How well does it work? Number of bugs found. Number of false alarms. Tool pedigree. Maturity of the tool. Performance on benchmarks.
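Several of these quality measures can be computed mechanically from a benchmark run. As a hedged illustration (the function name and the benchmark figures are hypothetical), precision reflects the false-alarm rate and recall the fraction of known flaws found:

```python
def precision_recall(reported: set, actual: set) -> tuple:
    """Score a tool's reported findings against a benchmark's known flaws."""
    true_pos = len(reported & actual)
    precision = true_pos / len(reported) if reported else 0.0  # 1 - false-alarm rate
    recall = true_pos / len(actual) if actual else 0.0         # fraction of flaws found
    return precision, recall

# Hypothetical benchmark: known flaws at lines 10, 20, 30; tool reports 10, 20, 99.
p, r = precision_recall({10, 20, 99}, {10, 20, 30})
print(round(p, 2), round(r, 2))  # 0.67 0.67
```

Pedigree and maturity, by contrast, resist this kind of mechanical scoring and remain judgment calls.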

Run time 

How long does it take per unit (LOC, module, requirement)? Is it quick enough to run after every edit? Every night? Every month? For manual methods, how often are reviews held? Is it scalable?

Computational complexity might be separate or a way of quantifying run time.

  • Simple
  • Decidable
    • P
    • NP
  • Undecidable
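The scalability questions above lend themselves to a back-of-envelope estimate. A minimal sketch, assuming a hypothetical tool throughput measured in seconds per KLOC:

```python
def run_time_hours(kloc: float, seconds_per_kloc: float) -> float:
    """Estimate wall-clock analysis time for a code base of the given size."""
    return kloc * seconds_per_kloc / 3600.0

# Hypothetical figures: a 2,000 KLOC code base, a tool analyzing 1 KLOC in 30 s.
hours = run_time_hours(2000, 30)
print(f"{hours:.1f} h")  # 16.7 h: feasible nightly, far too slow after every edit
```

Note that a linear seconds-per-KLOC model only holds for tools with roughly linear complexity; for analyses in the higher complexity classes above, run time can grow much faster than code size.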


References 

[1] IEEE/EIA Std 12207.0-1996, Software life cycle processes

[2] Andrew J. Kornecki and Janusz Zalewski, The Qualification of Software Development Tools From the DO-178B Certification Perspective, CrossTalk, pages 19-23, April 2006

[3] Guide to the SWEBOK, Chapter 10, 2004. Accessed January 2015.

Created March 23, 2021, Updated May 17, 2021