
Static Analysis Tool Exposition (SATE) 2010 Workshop



Looking for Needles in BIG Haystacks

SAMATE meeting 

October 1, 2010

co-located with the
13th semi-annual Software Assurance Forum
National Institute of Standards and Technology
Gaithersburg, Maryland, USA

 

Overview

Software must be developed to have high quality: quality cannot be "tested in." However, auditors, certifiers, and others must assess the quality of the software they receive. "Black-box" software testing cannot realistically find maliciously implanted Trojan horses or subtle errors that have many preconditions. For maximum reliability and assurance, static analysis must be used in addition to good development and testing practices. Static analyzers are quite capable and are improving quickly, yet developers, auditors, and examiners could use far more capabilities.
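As a hypothetical illustration (not drawn from the SATE test material), the C sketch below shows the kind of subtle flaw meant here: the copy overflows its buffer only when a specific mode is selected and the supplied length is oversized, a combination black-box tests are unlikely to exercise, whereas a static analyzer can flag the unchecked memcpy from the code alone.

    #include <stdio.h>
    #include <string.h>

    /* Hypothetical fragment: the overflow occurs only when mode == 7 and
     * len exceeds sizeof(buf), a precondition combination that random or
     * black-box testing rarely reaches. A static analyzer can report the
     * unchecked copy (CWE-120) on every path, independent of test inputs. */
    void handle_record(int mode, const char *data, size_t len) {
        char buf[64] = {0};
        if (mode == 7) {
            /* len is externally controlled and never checked against sizeof(buf) */
            memcpy(buf, data, len);      /* potential buffer overflow */
            printf("record: %.63s\n", buf);
        }
    }

    int main(void) {
        handle_record(7, "hello", 5);    /* benign call; the unsafe path still exists */
        return 0;
    }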

The goals of the Static Analysis Tool Exposition (SATE) 2010 are to:

  • Enable empirical research based on large test sets
  • Encourage improvement of tools
  • Speed adoption of tools by objectively demonstrating their use on real software

Briefly, participating tool makers run their tools on a set of programs, and researchers led by NIST analyze the tool reports. This workshop is the public's first opportunity to hear SATE 2010 observations and conclusions.

This workshop has two goals. The first is to gather participants and organizers of SATE to share experiences, report interesting observations, and discuss lessons learned. The workshop is also an opportunity for attendees to help shape the next SATE in 2011.

The second goal is to convene researchers, tool developers, and government and industrial users of software assurance tools to define obstacles to urgently needed software assurance capabilities and to identify engineering or research approaches to overcome them. We solicit contributions describing basic research, applications, experience, or proposals relevant to software assurance tools, techniques, and their evaluation. Questions and topics of interest include, but are not limited to:

  • Contribution of static analysis to software security assurance
  • Issues in applying static analysis to binaries
  • System assurance at the design or requirements level
  • Integration of, or tradeoffs between, static and dynamic analysis
  • Issues in scaling static analysis to deal with large systems
  • Flaw catching vs. sound analysis
  • Benchmarks or reference datasets
  • Formal descriptions of weaknesses and vulnerabilities in the CWE
  • User experience drawing useful lessons or comparisons
  • Synergies of pre- and post-production assurance
  • Case studies on real applications
  • Temporal and inter-tool information sharing

Who Should Attend?

Those who develop, use, purchase, or review software assurance tools should attend. Academicians who are working in the area of semi- or completely automated tools to review or assess the security properties of software are especially welcome. We encourage participation from researchers, students, developers, and assurance tool users in industry, government, and universities.

This workshop follows the SATE 2009 Workshop, the Static Analysis Tool Exposition 2008 (at SAW), the Static Analysis Summit II (at SIGAda 2007), and the first Static Analysis Summit in 2006.

Important Date

  •   1 October: Workshop

Registration

Registration is closed.

Program

The program will consist of presentations by participants in and organizers of the 2010 Static Analysis Tool Exposition.

This is the final program.

8:30 AM Welcome to SATE 2010 - Paul E. Black, NIST, SATE organizer

8:40 SATE 2010 background, Vadim Okun, NIST, SATE organizer

9:10 Bringing Static Analysis to the Masses, Tucker Taft, SofCheck, Inc., SATE participant

9:35 Running Goanna for SATE - What we found, how and why, Ansgar Fehnker, Red Lizard, SATE participant

10:00 Coverity Analysis: Improving Quality in the Software Supply Chain, Peter Henriksen, Coverity, SATE participant

10:25 break

11:00 Choosing SATE Test Cases Based on CVEs, Sue Wang, SATE organizer 

11:30 Bugs that Matter - Static Analysis True Positives and False Negatives, Paul Anderson, GrammaTech, SATE participant

11:55 Static Analysis Software Assurance Tools and SATE 2010, Nat Hillary, LDRA, SATE participant

12:20 PM Lunch

1:30 Observations From Analysis, Aurelien Delaitre, NIST, SATE organizer

1:50 Improving Static Analysis Results Accuracy, Chris Wysopal, Veracode, SATE participant

2:15 Our Sparrow Experience with Abstract Interpretation and Impure Catalysts, Kwangkeun Yi, Seoul National University, SATE participant

2:40 break

3:00 Discussion session: Planning SATE 2011, Paul E. Black, NIST, SATE organizer

4:30 The use of machine learning with signal- and NLP processing of source code to detect and classify vulnerabilities and weaknesses with MARFCAT, Serguei Mokhov, Concordia, SATE participant

4:55 Closing remarks - Paul E. Black, NIST, SATE organizer

Organization

General Chair

Paul E. Black (NIST), paul.black [at] nist.gov

Program Planning Committee

Redge Bartholomew (Rockwell-Collins)
Steve Christey (MITRE)
Romain Gaucher (Cigital)
Raoul Jetley (FDA)
Scott Kagan (Lockheed-Martin)
Michael Lowry (NASA)
Jaime Merced (DoD)
Frédéric Painchaud (DRDC Canada)
Ajoy Kumar (DTCC)
