The Software Assurance Metrics and Tool Evaluation (SAMATE) Project


  • 8/14/2019 The Software Assurance Metrics and Tool Evaluation

    1/21

    This is a work of the U.S. Government and is not subject to copyright protection in the United States.

    The OWASP Foundation
    OWASP AppSec DC, October 2005
    http://www.owasp.org/

    The Software Assurance Metrics and Tool Evaluation (SAMATE) Project

    Paul E. Black

    Computer Scientist, NIST

    [email protected]

    +1 301-975-4794


    Outline

    Overview of the NIST SAMATE project
    Purposes of tool and technique evaluation

    Software and effectiveness metrics

    Report of workshop on Defining the State of the Art in Software Security Tools

    Final comments


    The NIST SAMATE Project

    Surveys
      Tools
      Researchers and companies

    Workshops & conference sessions
      Taxonomy of software assurance (SwA) functions & techniques
      Order of importance (cost/benefit, criticalities, ...)
      Gaps and research agendas

    Studies to develop metrics

    Enable tool evaluations
      Write detailed specification
      Develop test plans and reference material
      Collect tool evaluations, case studies, and comparisons

    http://samate.nist.gov/


    Taxonomy of SwA Tool Functions and Techniques

    Concept or business need
      Use cases
      Changes to current systems

    Requirements and design
      Consistency
      Completeness
      Compliance

    Implementation
      The usual

    Assessment and acceptance
      External
        Automated vulnerability scanners
        Penetration test assistants
        Other standard testing techniques: usage, spec-based, statistical, worst-case/criticality, etc.
      Insider
        Automated code scanners
          Syntactic, e.g., grep
          Semantic
        Code review assistants
          Source code
          Virtual machine code (e.g., Java bytecode or .NET intermediate code)
          Binary (debugger, decompiler)

    Operation
      Operator training
      Auditing
      Penetration test
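The "syntactic, e.g., grep" style of automated code scanner in the taxonomy above can be sketched in a few lines; the pattern list and example input here are illustrative, not taken from the SAMATE project:

```python
import re

# Minimal grep-style syntactic scanner: flag calls to C library
# functions with well-known buffer-overflow risks. The list of
# risky functions is an illustrative assumption.
RISKY_CALLS = re.compile(r"\b(gets|strcpy|strcat|sprintf)\s*\(")

def scan(source: str):
    """Return (line number, line text) for each risky call found."""
    hits = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        if RISKY_CALLS.search(line):
            hits.append((lineno, line.strip()))
    return hits

example = "int main(void) {\n  char buf[8];\n  gets(buf);\n  return 0;\n}"
print(scan(example))  # the gets() call on line 3 is flagged
```

A purely syntactic scan like this cannot see data flow, which is why the taxonomy lists "semantic" scanners as a separate class.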


    Planned Workshops

    Enumerate SwA functions and techniques
      Approach (code vs. spec, static vs. dynamic)
      Software type (distributed, real time, secure)
      Type of fault detected

    Recruit focus groups

    Which are the most important?
      Highest cost/benefit ratio?
      Finds highest-priority vulnerabilities?
      Most widely used?

    Critique reference dataset

    Identify gaps in functions and recommend research

    Plan and initiate studies for metrics


    Outline

    Overview of the NIST SAMATE project
    Purposes of tool and technique evaluation

    Software and effectiveness metrics

    Report of workshop on Defining the State of the Art in Software Security Tools

    Final comments


    Purposes of Tool or Technique Evaluations

    Precisely document what a tool does (and doesn't do)

    in order to

    Provide feedback to tool developers
      Simple changes to make
      Directions for future releases

    Inform users
      Match the tool or technique to a particular situation
      Understand significance of tool results
      Know how techniques work together


    Developing a Specification

    After a technique or tool function is selected by the working group:

    NIST and a focus group develop a clear, testable specification

    Specification posted for public comment; comments incorporated

    Develop a measurement methodology:
      Test cases
      Procedures
      Reference implementations and data
      Scripts and auxiliary programs
      Interpretation criteria
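The methodology pieces above (test cases, procedures, interpretation criteria) might be wired together roughly as follows; the case format, the toy tool, and the pass criterion are assumptions for illustration, not the SAMATE specification:

```python
# Illustrative test-plan runner: each case pairs an input with the
# verdict a conforming tool is expected to produce. Case format and
# field names are hypothetical.
CASES = [
    {"name": "overflow-01", "input": "strcpy(dst, src);", "expect": True},
    {"name": "clean-01", "input": "strncpy(dst, src, n);", "expect": False},
]

def naive_tool(code: str) -> bool:
    # Stand-in for the tool under test: flags unbounded strcpy calls.
    return "strcpy(" in code

def run_plan(tool):
    # Interpretation criterion: the tool passes a case when its verdict
    # matches the expected one.
    return {c["name"]: tool(c["input"]) == c["expect"] for c in CASES}

print(run_plan(naive_tool))  # both cases pass for this toy tool
```

Real interpretation criteria would also have to handle partial credit, e.g. a correct flaw class reported at the wrong location.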

    SAMATE Project Timeline

    [Figure: project timeline over months 1-24. The tool survey and function taxonomy feed Workshop 1 (SwA classes), Workshop 2 (research gaps), and Workshop 3 (metrics studies); focus groups for classes 1 and 2 select functions and take strawman specs through drafts Spec0 and Spec1, with test-plan drafts, a tool-testing matrix, and publications along the way.]


    Why Look at Checking First?

    Vital for software developed outside, i.e., when the process is not visible

    Applicable to legacy software

    Feedback for process improvement

    Process experiments are expensive

    Many are working on process (SEI, PSP,etc.)


    Outline

    Overview of the NIST SAMATE project
    Purposes of tool and technique evaluation

    Software and effectiveness metrics

    Report of workshop on Defining the State of the Art in Software Security Tools

    Final comments


    But is the tool or methodology effective?

    Is this program secure (enough)?
    How secure does tool X make a program?

    How much more secure does technique X make a program after techniques Y and Z?

    Do they really find or prevent bugs and vulnerabilities?

    Dollar for dollar, does methodology P or S give more reliability?
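Answering such questions needs a quantitative footing. One plausible starting point (an assumption here, not something the slides prescribe) is scoring a tool's findings against known flaws with precision and recall:

```python
def precision_recall(reported: set, actual: set):
    """Effectiveness of a tool's findings against known flaws.

    precision: fraction of reported findings that are real flaws
    recall: fraction of real flaws the tool reported
    """
    tp = len(reported & actual)
    precision = tp / len(reported) if reported else 1.0
    recall = tp / len(actual) if actual else 1.0
    return precision, recall

# Toy numbers: the tool reports 4 findings, 3 of which are among
# the 6 known flaws (flaws identified here by arbitrary IDs).
reported = {1, 2, 3, 99}
actual = {1, 2, 3, 4, 5, 6}
print(precision_recall(reported, actual))  # (0.75, 0.5)
```

This captures "do they really find bugs," but not the cost/benefit and comparative questions above, which need cost data as well.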


    Toward Software Metrics

    Qualitative comparison         | warmer, colder       | buggy, secure
    Formally defined quantity      | temperature          | quality? confidence?
    Unit and scale                 | degree, Kelvin       | ?
    Measured value, derived units  | heat energy = s·m·t  | software assurance = ?


    Benefits of SAMATE Project

    Define metrics for evaluating SwA tools

    Users can make more informed tool choices

    Neutral test program

    Tool creators make better tools


    Outline

    Overview of the NIST SAMATE project
    Purposes of tool and technique evaluation

    Software and effectiveness metrics

    Report of workshop on Defining the State of the Art in Software Security Tools

    Final comments


    Workshop on Defining the State of the Art in Software Security Tools

    Workshop characteristics
      NIST, Gaithersburg, 10 & 11 August 2005
      http://samate.nist.gov/softSecToolsSOA
      45 people came from government, universities, vendors and service providers, and research companies
      Proceedings, including discussion notes and submitted material, should be available from the above URL when you see this.

    Goals
      Understand the state of the art of software security assurance tools in detecting security flaws and vulnerabilities.
      Discuss metrics to evaluate the effectiveness of such tools.
      Collect flawed and clean software for a reference dataset.
      Publish a report on classes of software security vulnerabilities.


    Outcomes of Workshop I

    Understand the state of the art of software security assurance tools in detecting security flaws and vulnerabilities.
      A report is being written.

    Discuss metrics to evaluate tool effectiveness.
      All agreed that software metrics and tool effectiveness metrics are a good idea.
      No consensus on how to approach the challenge.


    Outcomes of Workshop II

    Collect flawed and clean software to be a reference.
      Several collections emerged: MIT, Fortify, etc.
      Attendees agreed that a shared reference dataset would help.
      NIST reference dataset in development; prototype available at (URL forthcoming)

    Report on classes of software security vulnerabilities
      Discussed several existing flaw taxonomies: CLASP, PLOVER (CVE), etc.
      Attendees agreed a common taxonomy would help.
      Discussions continuing on the samate email list.
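One way a shared reference-dataset entry could look is a flawed program paired with a corrected variant; every field name below is a hypothetical illustration, not the NIST format:

```python
# Purely illustrative record layout for one reference-dataset case:
# a flawed program paired with a corrected ("clean") variant.
case = {
    "id": "rds-0001",
    "language": "C",
    "flaw_class": "buffer overflow",
    "flawed_source": "gets(buf);",
    "flaw_line": 1,
    "clean_source": "fgets(buf, sizeof buf, stdin);",
}

def required_fields_present(entry: dict) -> bool:
    # A shared dataset needs a minimum schema so tools can be scored
    # uniformly; this required-field set is an assumption.
    required = {"id", "language", "flaw_class", "flawed_source", "clean_source"}
    return required <= entry.keys()

print(required_fields_present(case))  # True
```

Pairing each flawed case with a clean twin lets an evaluation measure false alarms as well as detections.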


    Outline

    Overview of the NIST SAMATE project
    Purposes of tool and technique evaluation

    Software and effectiveness metrics

    Report of workshop on Defining the State of the Art in Software Security Tools

    Final comments


    Society has 3 options:

    Learn how to make software that works

    Limit size or authority of software

    Accept failing software


    Contact to Participate

    Paul E. Black

    Project Leader

    Software Diagnostics & Conformance Testing Division, Software Quality Group, Information Technology Laboratory, NIST

    [email protected]