
Testing

CMPUT 401—Module 05

Department of Computing Science

University of Alberta

Ken Wong, 2007

2

Testing

! Goals:

! find problems in a program

! try to refute that the program works correctly

! track and fix problems

! to assist management and improve quality

! verification

! check that requirements are satisfied

! avoid regression

! check for re-occurrence of previous problems

! establish due diligence

! reference in case of product liability litigation

3

What Can Go Wrong?

! Some examples:

! algorithm

! code logic does not produce the proper output

! accuracy

! calculated result is not to desired level of accuracy

! overload

! data structure is filled past its specified capacity

! timing

! problem in coordinating concurrent processes

! system

! problem with operating system or hardware

! performance

! does not perform at the required speed

4

Testing

! Scientific approach:

! correct way to test a theory is not to verify it, but to seek to refute it

! avoid ignoring data that throws off the theory

! cannot “prove” correctness by testing

! cannot test a program completely

! cannot reveal the absence of bugs

5

Testing

! Not always adversarial:

! the best tester is not the one who finds the most defects or who embarrasses the most programmers

! the best tester is one who gets the most defects fixed

! quality comes from those who construct the product, not only from testers

6

Testing

! Note:

! never be the gatekeeper

! testers should not have veto control over a product release

! rest of the team could relax, assuming quality problems will be caught by testers

! share the authority to release

7

Testing Strategies

! Incremental strategy:

! top down

! implement and test highest-level modules first

! need stubs for lower-level functionality

! do not need test drivers

! bottom up

! implement and test lowest-level modules first

! need to write test drivers

! typical to combine both

8

Testing Strategies

! White box testing:

! create test cases while referring to the source code (how does that help?)

! also known as glass box testing

! Black box testing:

! create test cases without looking at the source code

! but not “ignorance-based” testing

9

Black Box Testing

! Examples:

! beta testing

! get feedback from users

! release and integrity testing

! gather all materials and release

! certification

! testing by a third party

! final acceptance testing

! meets the requirements

10

Acceptance Testing

! Goal:

! flag serious problems in basic functionality

! mainstream tests of the required features (not boundary cases)

! refer to use cases or service descriptions to create acceptance tests

11

Testing

! Note:

! testing requires inference, not just the comparison of output to expected results

! tester must design tests and determine the expectations

12

Test Matrix for Input Fields

! Typical tests for an integer field:

! nothing

! empty field (clear default value)

! 0

! valid value

! at lower bound (LB) of value – 1

! at LB of value

! at upper bound (UB) of value

! at UB of value + 1

! …

13

Test Matrix for Input Fields

! Typical tests for an integer field:

! far below the LB of value

! far above the UB of value

! at LB number of digits

! at LB – 1 number of digits

! at UB number of digits

! at UB + 1 number of digits

! far more than UB number of digits

! negative

! …

14

Test Matrix for Input Fields

! Typical tests for an integer field:

! non-digits

! wrong data type (like floating point)

! expressions

! leading space(s)

! leading zero(s)

! leading + sign(s)

! non-printing character

! filename reserved characters, like / and :

! …

15

Test Matrix for Input Fields

! Typical tests for an integer field:

! language reserved characters

! upper ASCII (128–254) characters

! ASCII 255

! uppercase characters, like “O”

! lowercase characters, like “l”

! modifiers, like ctrl, alt, shift-ctrl, etc.

! function keys

! enter nothing but wait (timeout?)

! …

16

Test Matrix for Input Fields

! Typical tests for an integer field:

! enter one digit and wait

! enter digits and edit

! enter digits while the system is reacting to other interrupts, like clock events, mouse movement, file saving, etc.

! enter a digit, change focus to another application, return to this application, what happens?
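As a concrete illustration (not from the original slides), here is a minimal C sketch of boundary tests for a hypothetical validate_count() field validator that accepts integers in the range 1 to 100; the function, its bounds, and the chosen values are assumptions for illustration only.

#include <assert.h>
#include <stdio.h>

/* Hypothetical input-field validator: accepts integers in [1, 100]. */
static int validate_count(int value)
{
    return (value >= 1 && value <= 100);
}

int main(void)
{
    assert(!validate_count(0));      /* LB - 1: reject */
    assert( validate_count(1));      /* LB: accept */
    assert( validate_count(50));     /* a mainstream valid value */
    assert( validate_count(100));    /* UB: accept */
    assert(!validate_count(101));    /* UB + 1: reject */
    assert(!validate_count(-50000)); /* far below LB, negative */
    assert(!validate_count(50000));  /* far above UB */
    printf("all boundary tests passed\n");
    return 0;
}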

17

Test Matrix for Input Fields

! Matrix (situations versus tests):

! one row per input field, one column per test; column headings:

Field | Nothing | Empty | 0 | LB–1 | LB | … | UB | UB+1 | ucase | lcase | modifier | fkey

18

Test Matrix for Writing a File to Disk

! Question:

! What are some situations where a program would attempt to write a file?

19

Test Matrix for Writing a File to Disk

! Example situations:

! saving a new file

! overwriting a file with the same name

! appending to a file

! replacing a file under edit, same name

! exporting to a new file format

! printing to disk

! logging messages or errors to disk

! saving a temporary file

! …

20

Test Matrix for Writing a File to Disk

! Question:

! What are some interesting test cases for the situation of trying to save a file?

21

Test Matrix for Writing a File to Disk

! Example test cases:

! save to a full disk

! also on local network, remote network

! save to an almost full disk

! also on local network, remote network

! save to a write-protected disk

! also on local network, remote network

! save to a file, directory, or disk where you do not have write permissions

! …

22

Test Matrix for Writing a File to Disk

! Example test cases:

! save to a damaged disk

! also on local network, remote network

! save to an unformatted disk

! also on local network, remote network

! remove disk or mount after opening the file for writing

! timeout waiting for disk to come back online

! …

23

Test Matrix for Writing a File to Disk

! Example test cases:

! create keyboard and mouse I/O during a save

! generate some other interrupt during a save

! power off the local computer during a save

! power off the drive or remote computer during a save
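A minimal sketch (not from the slides) of one such test in C: it checks that a hypothetical save_text() routine reports failure rather than silently losing data when the target cannot be written; the routine and the unwritable path stand in for the product's save function and a full or write-protected disk.

#include <stdio.h>

/* Hypothetical save routine: returns 0 on success, -1 on any failure. */
static int save_text(const char *path, const char *text)
{
    FILE *f = fopen(path, "w");
    if (f == NULL)
        return -1;
    int failed = (fputs(text, f) == EOF);
    /* fclose() can also fail when buffered data cannot be flushed
       (e.g., disk full), so its result must be checked too. */
    if (fclose(f) == EOF)
        failed = 1;
    return failed ? -1 : 0;
}

int main(void)
{
    /* A directory path cannot be opened as a writable file, so this save
       must fail; the test passes only if the failure is reported. */
    if (save_text("/", "hello") == 0) {
        printf("FAIL: write to an unwritable target reported success\n");
        return 1;
    }
    printf("PASS: failed write was detected and reported\n");
    return 0;
}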

24

Testing

! Note:

! do not think “no one would do that”

! extreme-looking bugs are potential security flaws

! buffer overruns

! excess data runs into other memory spaces

! skilled cracker can use flaw as back door into the system

25

Testing

! Note:

! “non-reproducible” bugs are reproducible

! must find the right context under which the failure occurs

! bug on delayed fuse, due to memory leak, wild pointer, corrupted stack

! bug might only occur at a specific time

! bug might be the remnant of another failure

26

Traceability Matrix

! Another matrix:

! each row contains a test case

! each column contains a specification item

! e.g., function, variable, compatible device, claim

! each cell shows which test cases test which items

! if a feature changes, use the matrix to see which tests need to be redone
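A minimal sketch (not from the slides) of a traceability matrix kept as a 2-D array in C and queried for the tests to rerun when a specification item changes; the test and item counts and the coverage pattern are invented for illustration.

#include <stdio.h>

#define NUM_TESTS 4
#define NUM_ITEMS 3

/* matrix[t][s] is 1 if test case t exercises specification item s. */
static const int matrix[NUM_TESTS][NUM_ITEMS] = {
    /* S1 S2 S3 */
    {  1,  0, 1 },   /* T1 */
    {  1,  1, 0 },   /* T2 */
    {  0,  1, 1 },   /* T3 */
    {  0,  0, 1 },   /* T4 */
};

/* Print the tests that must be redone when item 'changed' is modified. */
static void tests_to_redo(int changed)
{
    printf("S%d changed; rerun:", changed + 1);
    for (int t = 0; t < NUM_TESTS; t++)
        if (matrix[t][changed])
            printf(" T%d", t + 1);
    printf("\n");
}

int main(void)
{
    tests_to_redo(2);   /* suppose specification item S3 changed */
    return 0;
}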

27

Traceability Matrix

! Example: test cases T1–T6 as rows, specification items S1–S6 as columns; an x in a cell marks that the test covers that item, and the bottom row gives the number of tests covering each item (S1: 3, S2: 2, S3: 3, S4: 3, S5: 1, S6: 6)

28

Combination Testing

! Idea:

! testing several variables together

! major problem is the number of test cases to consider

! e.g., 3 variables, each with 100 possible values, leads to 1 000 000 test cases

29

Combination Testing

! Domain partitioning:

! reduce the number of values that will be tested for each variable

! e.g., upper and lower bounds, representative values

! may still be too many cases to be practical

! e.g., 3 variables, 5 tests each, leads to 125 tests

30

Combination Testing

! All singles technique:

! simplest set of combination tests that ensures you cover every value of interest

! create tests so that every value of every variable must appear in at least one test

31

Combination Testing

! All singles example:

! 3 variables each with 5 values

! V1 for operating system (A, B, C, D, E)

! V2 for web browser (F, G, H, I, J)

! V3 for attached printer (V, W, X, Y, Z)

! try to reduce the number of configurations to a manageable number

32

Combination Testing

! All singles example (5 cases):

      OS   browser   printer
T1:   A    F         V
T2:   B    G         W
T3:   C    H         X
T4:   D    I         Y
T5:   E    J         Z
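A minimal sketch (not from the slides) of how the five all-singles cases above can be generated mechanically: when every variable has the same number of values, pairing the i-th value of each variable covers every value of interest exactly once.

#include <stdio.h>

int main(void)
{
    /* Value sets from the example: V1 = OS, V2 = browser, V3 = printer. */
    const char os[]      = { 'A', 'B', 'C', 'D', 'E' };
    const char browser[] = { 'F', 'G', 'H', 'I', 'J' };
    const char printer[] = { 'V', 'W', 'X', 'Y', 'Z' };

    /* All singles: one test per index suffices when every variable has
       the same number of values, since each value then appears once. */
    for (int i = 0; i < 5; i++)
        printf("T%d: OS=%c browser=%c printer=%c\n",
               i + 1, os[i], browser[i], printer[i]);
    return 0;
}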

33

Combination Testing

! Issue:

! but some popular and important configurations could be missed

! address this problem by adding test cases

34

Combination Testing

! All pairs technique:

! create tests so every value of every variable is paired with every value of every other variable in at least one test case

! e.g., …

35

Combination Testing

! All pairs example (25 cases; T1–T13):

      V1   V2   V3
T1:   A    I    V
T2:   A    J    W
T3:   A    K    X
T4:   A    L    Y
T5:   A    M    Z
T6:   B    I    W
T7:   B    J    Z
T8:   B    K    Y
T9:   B    L    V
T10:  B    M    X
T11:  C    I    X
T12:  C    J    Y
T13:  C    K    Z

36

Combination Testing

! All pairs example (continued; T14–T25):

      V1   V2   V3
T14:  C    L    W
T15:  C    M    V
T16:  D    I    Y
T17:  D    J    X
T18:  D    K    V
T19:  D    L    Z
T20:  D    M    W
T21:  E    I    Z
T22:  E    J    V
T23:  E    K    W
T24:  E    L    X
T25:  E    M    Y
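A minimal sketch (not from the slides) of how a candidate test set can be checked for all-pairs coverage; the example is shrunk to 3 variables with 2 values each so the full check fits in a few lines, but the same loop structure applies to the 25-case set above.

#include <stdio.h>

#define NUM_TESTS 4

/* A tiny example: 3 variables with 2 values each ('0'/'1').  The 4 rows
   below happen to cover every value pair of every two variables. */
static const char tests[NUM_TESTS][3] = {
    { '0', '0', '0' },
    { '0', '1', '1' },
    { '1', '0', '1' },
    { '1', '1', '0' },
};

/* Check that every value pair of variables (a, b) appears in some test. */
static int pairs_covered(int a, int b)
{
    for (char x = '0'; x <= '1'; x++)
        for (char y = '0'; y <= '1'; y++) {
            int found = 0;
            for (int t = 0; t < NUM_TESTS; t++)
                if (tests[t][a] == x && tests[t][b] == y)
                    found = 1;
            if (!found)
                return 0;
        }
    return 1;
}

int main(void)
{
    int ok = pairs_covered(0, 1) && pairs_covered(0, 2) && pairs_covered(1, 2);
    printf("all pairs covered: %s\n", ok ? "yes" : "no");
    return 0;
}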

37

White Box Testing

! Examples:

! automated unit testing

! automatically test the program in isolated pieces

! issue of automating only “easy” tests

! control flow

! tester can track execution, changes of variables, entered functions in a debugger

! data integrity

! backward and forward slice on a variable in a statement

! …

38

White Box Testing

! Examples:

! internal boundaries

! see special constants used, for forcing overflows or special processing

! use coverage monitors

! see which statements, paths, or branches have not been tested

! algorithm-specific testing

! e.g., inverting an ill-conditioned matrix, geometric degeneracies

39

White Box Testing

! Deliberate errors:

! mutations

! make a small change to the program

! effect of change should show up in some test; if not, probably inadequate test set

! bebugging

! introduce bugs to check the efficiency of the testers

! estimate bugs left

! need representative bugs
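A minimal sketch (not from the slides) of the mutation idea: a one-character change to a trivial add() function should be caught ("killed") by at least one test; if the whole test set still passes against the mutant, the set is probably inadequate. The functions and tests are invented for illustration.

#include <assert.h>
#include <stdio.h>

/* Original code under test. */
static int add(int a, int b)
{
    return a + b;
}

/* Mutant: the same function with one small change ('+' became '-').
   A mutation tool would normally generate such variants automatically. */
static int add_mutant(int a, int b)
{
    return a - b;
}

/* The same test set, run against either version of the function. */
static int test_set(int (*fn)(int, int))
{
    int passed = 1;
    passed &= (fn(2, 0) == 2);   /* weak test: cannot tell + from - when b is 0 */
    passed &= (fn(2, 3) == 5);   /* this test kills the mutant */
    return passed;
}

int main(void)
{
    assert(test_set(add));   /* the original must pass the test set */
    printf("mutant %s by the test set\n",
           test_set(add_mutant) ? "survived (tests inadequate)" : "was killed");
    return 0;
}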

40

Code Analysis

! Static:

! read the code

! e.g., type checking, compiler analyses, lint, metrics, model checkers, slicers, etc.

! Dynamic:

! run the code and observe/record/count events or conditions

! e.g., debuggers, profilers, tracers, heap viewers, coverage tools, crash dumps, assertions, etc.

41

Analysis

! Question:

! How do you figure out what affected the value of a variable?

42

Program Slicing

! Idea:

! isolate the behavior of a set of variables

! Backward slice:

! using criterion V = { a, b, c, d } on statement z = a;

43

Program Slicing

! Forward slice:

! using criterion V = { z } on statement z = a;

44

Program Slicing

! Backward slice:

#include <stdio.h>

static int add( int a, int b );   /* declared before its use in main() */

int main( void )
{
    int i = 1;
    int sum = 0;

    while (i < 10) {
        sum = add( sum, i );
        i = add( i, 1 );
    }
    printf( "i = %d\n", i );
    return 0;
}

static int add( int a, int b )
{
    return (a + b);
}

45

Dynamic Analysis

! Instrumentation:

! special monitoring, control, or library codeenabled, inserted, or replaced in thesoftware to support a dynamic analysis

! needs to be relatively non-intrusive

! to the runtime performance of the application

! to the development process

46

Dynamic Analysis

! Instrumentation:

! typically done by operating system, code transformer, aspect weaver, compiler, linker, within interpreter, or manually

! e.g., insert code in function bodies to record their invocation

! e.g., replace production malloc library with debugging version

! e.g., inserting printf statements
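A minimal sketch (not from the slides) of the simplest kind of manual instrumentation: a counter and a trace statement inserted into a function body to record its invocations; the function is hypothetical, and a real project would more likely rely on a profiler, compiler support, or a replacement library.

#include <stdio.h>

static long compress_calls = 0;   /* instrumentation state */

/* Hypothetical application function with instrumentation inserted. */
static int compress_block(int size)
{
    compress_calls++;                              /* count each invocation */
    fprintf(stderr, "compress_block(%d)\n", size); /* trace the call */
    return size / 2;                               /* stand-in for real work */
}

int main(void)
{
    for (int i = 1; i <= 3; i++)
        compress_block(i * 1024);
    fprintf(stderr, "compress_block called %ld times\n", compress_calls);
    return 0;
}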

47

Analysis

! Question:

! How do you know where to start looking when fixing a bug?

48

A Dynamic Analysis

! Software reconnaissance:

! locate program features or bugs

! focus attention on important areas

! provide “good places to start looking”

— Wilde

49

Software Reconnaissance

! Idea:

! study differences in execution profiles

50

Software Reconnaissance

! Idea:

! points of divergence are starting places for local exploration

! starting place for setting a breakpoint in a debugger

! finding code unique to the feature
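A minimal sketch (not from the slides) of the reconnaissance idea using per-function hit counts: run the program once with the feature and once without, then report the functions executed only in the feature run as the places to start looking. The feature and function names are invented for illustration.

#include <stdio.h>
#include <string.h>

#define NUM_FUNCS 3
static const char *names[NUM_FUNCS] = { "parse", "render", "export_pdf" };
static int hits[NUM_FUNCS];

static void hit(const char *name)          /* crude execution profile */
{
    for (int i = 0; i < NUM_FUNCS; i++)
        if (strcmp(names[i], name) == 0)
            hits[i]++;
}

static void run(int use_export)            /* stand-in for exercising the app */
{
    hit("parse");
    hit("render");
    if (use_export)
        hit("export_pdf");                 /* only runs when the feature is used */
}

int main(void)
{
    int with[NUM_FUNCS], without[NUM_FUNCS];

    memset(hits, 0, sizeof hits);
    run(1);                                /* run that exercises the feature */
    memcpy(with, hits, sizeof hits);

    memset(hits, 0, sizeof hits);
    run(0);                                /* run that avoids the feature */
    memcpy(without, hits, sizeof hits);

    printf("functions unique to the feature run:\n");
    for (int i = 0; i < NUM_FUNCS; i++)
        if (with[i] > 0 && without[i] == 0)
            printf("  %s\n", names[i]);
    return 0;
}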

51

Software Reconnaissance

! Issues:

! features that look different but are executed by the same code

! features used in every test case

! level of granularity

52

Defect Analysis

! Grady (1992):

! approach from Hewlett Packard

! statistical quality control

53

Defect Analysis

! Rate of defects found/fixed over time:

54

Defect Analysis

! Use:

! detecting a problematic release

! software barrier (creating as many defects as you are fixing)

55

Defect Analysis

! Counts of pre- and postrelease defects by subsystem:

56

Defect Analysis

! Use:

! improve testing and inspection?

! reengineer?

! replace?

57

When to Release

! Question:

! Is the software good enough to release now?

! When will the software be good enough to release?

58

When to Release

! Estimate remaining defect correction work:

! use number of open defects and average effort per defect

! e.g., 250 open defects, 4 hours on average per defect fix, need 1000 hours of work to fix just these defects

! refine estimate to allow for newly found defects

! refine estimate to account for severity of defects

59

Defect Density

! Defect density =

! defects found / size of code

60

Predicting Latent Defects

! Defect density example:

! version 1 has 100 KLOC, 700 defects found prerelease, 100 defects found postrelease

! defect density is ((700+100)/100) = 8 defects per KLOC

! …

61

Predicting Latent Defects

! Example:

! version 2 has an additional 50 KLOC, 400 defects found in this code prerelease, 75 defects found postrelease

! defect density is ((400+75)/50) = 9.5 defects per KLOC in just this code

! …

62

Predicting Latent Defects

! Example:

! has version 3 been tested/inspected/reviewed enough to release?

! version 3 has an additional 100 KLOC, 600 defects found in this code so far

! defect density is (600/100) = 6 defects per KLOC in just this code

63

Predicting Latent Defects

! Example:

! unless the development process has improved, expect about 8 to 9.5 defects per KLOC, or 800 to 950 defects to be found

! to find and fix only 90% of the defects prerelease, expect 720 to 855 defects

! thus, not ready to release
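The arithmetic above can be captured in a short sketch (not from the slides); the figures are the ones in the example, and the 90% find-and-fix target is the slide's assumption.

#include <stdio.h>

int main(void)
{
    /* Historical defect densities (defects per KLOC). */
    double v1_density = (700.0 + 100.0) / 100.0;   /* version 1: 8.0 */
    double v2_density = (400.0 + 75.0) / 50.0;     /* version 2: 9.5 */

    /* Version 3: 100 new KLOC, 600 defects found so far. */
    double new_kloc = 100.0, found = 600.0;

    double low  = v1_density * new_kloc;           /* 800 expected defects */
    double high = v2_density * new_kloc;           /* 950 expected defects */

    /* Target: find and fix 90% of the expected defects before release. */
    printf("expected defects: %.0f to %.0f\n", low, high);
    printf("90%% target:      %.0f to %.0f (found so far: %.0f)\n",
           0.9 * low, 0.9 * high, found);
    return 0;
}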

64

Defect Density

! Industry average for structured programming:

! 15 to 50 defects / KLOC

65

Defect Density

! Code reading/inspection and independent testing:

! 10 to 20 defects / KLOC prerelease and 0.5 defects / KLOC postrelease [Microsoft]

66

Defect Density

! Cleanroom:

! 0 defects in 500 KLOC postrelease (shuttle onboard flight software) [Federal Systems Company]

67

Selected Testing Techniques

! Decisions to make:

! testers

! who does the testing

! coverage

! what gets tested

! potential problems

! why you are testing

! activities

! how you test

! evaluation

! how to tell if test passed or failed

68

Tester-Based Techniques

! Who does the testing:

! user testing

! alpha testing

! beta testing

! bug bashes

! subject-matter expert testing

! paired testing

! eat your own dogfood

69

Coverage-Based Techniques

! What gets tested:

! function testing

! test every function, one by one

! unit testing if glass box

! feature or functional testing if black box

! menu tour

! walk through all the menus and dialogs in a GUI product, taking every available choice

! …

70

Coverage-Based Techniques

! What gets tested:

! equivalence class analysis

! a set of values for a variable that you consider equivalent because they all test the same thing

! if one catches a bug, the others likely will too

! if one does not catch a bug, the others likely will not either

! test only a few values of the class (e.g., at the boundary, best representative)

! …

71

Coverage-Based Techniques

! What gets tested:

! input field test catalogs

! for each type of input field, develop a fairly standard set of test cases and reuse it for similar fields in the product

! statement and branch coverage

! test cases execute every statement and branch

! just having 100% coverage is not enough

! configuration testing

! testing on many different hardware configurations

72

Problem-Based Techniques

! Why you are testing:

! risk-based testing

! prioritize testing on the probability that some feature will fail and the probable cost of failure

! constraint violations

! limits on the input values, output values, intermediate values, memory, and storage

73

Activity-Based Techniques

! How you test:

! regression testing (reuse of tests)

! try to show a bug fix is no good

! try to show that an old bug became unfixed

! try to show that a change caused something that used to work to now become broken

! scripted testing

! manual testing, done by a junior tester who follows a step-by-step procedure written by a senior tester

74

Activity-Based Techniques

! How you test:

! smoke testing

! try to show that the new build is not worth testing further

! often automated, to check that the build was done correctly

! guerrilla testing

! a fast and vicious attack on the program, often time-limited, focused, and done by an experienced tester familiar with the product

! if no significant bugs are found, only lightly test this area later

75

Activity-Based Techniques

! How you test:

! installation testing

! install the software in various ways on various systems

! check which files are added or changed

! try to uninstall

! load testing

! program is stressed by being run on a system facing many resource demands

! look for vulnerabilities from the pattern of events leading to a failure

76

Activity-Based Techniques

! How you test:

! performance testing

! determine whether optimization is needed

! also, a significant change in performance in some area can indicate a bug

77

Evaluation-Based Techniques

! How to tell if the test passed or failed:

! comparison with saved results

! if the result was correct before but is different now, perhaps a new bug

! comparison with a specification

! a mismatch could be a bug

! oracle-based testing

! check with a more trusted program
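A minimal sketch (not from the slides) of an oracle check: a fast implementation is compared against a slower but simpler, more trusted one over many inputs; both functions are invented for illustration.

#include <assert.h>
#include <stdio.h>

/* Fast implementation under test: compute n*(n+1)/2 directly. */
static long sum_fast(int n)
{
    return (long)n * (n + 1) / 2;
}

/* Trusted oracle: the obvious loop, slower but easy to get right. */
static long sum_oracle(int n)
{
    long s = 0;
    for (int i = 1; i <= n; i++)
        s += i;
    return s;
}

int main(void)
{
    /* Compare the two implementations over a range of inputs. */
    for (int n = 0; n <= 10000; n++)
        assert(sum_fast(n) == sum_oracle(n));
    printf("fast implementation agrees with the oracle\n");
    return 0;
}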

78

Evaluation-Based Techniques

! How to tell if the test passed or failed:

! heuristic consistency (evaluation)

! consistent with past behavior

! consistent with company image

! consistent with comparable products

! consistent with claims

! consistent with user’s expectations

! consistent within product

! consistent with purpose

79

References

! Testing Computer Software
! C. Kaner, J. Falk, H.Q. Nguyen; Wiley, 1999

! Lessons Learned in Software Testing
! C. Kaner, J. Bach, B. Pettichord; Wiley, 2002

! Software Engineering
! S. L. Pfleeger; Prentice-Hall, 1998

! The Risks Digest
! http://catless.ncl.ac.uk/Risks

! Fatal Defect
! I. Peterson; Vintage, 1995

! JUnit framework
! http://www.junit.org/