[HCMC STC Jan 2015] Risk-based Software Testing Approaches
TRANSCRIPT
Risk-based Testing Approach
STC 2015
VU NGUYEN
Faculty of Information Technology, University of Science, VNU-HCMC
Agenda
• Introduction
• Typical Risk-based Testing Process
• Assessing risks of test areas
• Advantages and caveats
• Conclusions
Introduction - 1
• Typical testing scenario
– At the beginning of a project, there is not much testing work to do
– When the software is ready for testing
• Limited time and resources are available for testing
• But testers need to test the software as thoroughly as possible
As a test manager, what do you do to survive?
Introduction - 2
• Objectives of testing
– Finding bugs
– Establishing confidence in the software
• But you can never test everything to find every bug, because of limited time and resources
• Realistic goals of testing
– Find the worst bugs, as many as possible
– Find bad bugs that need fixing before release
Introduction - 3
• General strategy
– As you cannot test everything, you can decide
• What to test
• What not to test
• What to test more
• What to test less
• What to test often
• Risk-based testing (RBT) helps prioritize tests
Agenda
• Introduction
• Typical Risk-based Testing Process
• Assessing risks of test areas
• Advantages and caveats
• Conclusions
A Risk-based Testing Process
[Process diagram] Requirements, design, code, and other artifacts, together with bug data, feed into Risk Identification and Assessment, which produces a list of Prioritized Test Areas. These drive Test Planning, Test Design, and Test Execution, ending with the Test Report.
Agenda
• Introduction
• Typical Risk-based Testing Process
• Assessing risks of test areas
• Advantages and caveats
• Conclusions
Risk-based Testing
• RBT: a type of testing that tackles the risks associated with test areas
• Test area
– A function or group of functions
– Product properties (from non-functional requirements)
• Risk of a test area
– The chance of bugs causing damage if they are left undetected
– Quantified as the product of the damage and the chance of that damage occurring
Risk Identification & Assessment
Risk exposure (of a test area) combines two groups of factors:
• Damage: criticality, visibility, usage frequency
• Chance of damage: complexity, changed area, new technologies, area with prior bugs, bug-prone developers
Determine Damage – 1/2
• Damage or loss occurs if bugs are left undetected
• Bugs in important areas result in higher damage
• Key factors
– Criticality: critical areas of testing
– Visibility: areas visible to users through user interface
– Usage frequency: how often areas are used
• Criticality levels: Serious, Major, Minor, Trivial
• Usage frequency levels: Always, Frequent, Occasional, Rare
Determine Damage – 2/2
• Quantify damage
– Each factor is assigned a weight from 0 to 10
– Each level is assigned a number from 0 to 3
• Example
Damage = Criticality*w1 + Visibility*w2 + Frequency*w3
Test Areas                Criticality (w1=10)  Visibility (w2=5)  Frequency (w3=8)  Damage
User Registration         2                    1                  1                 2*10+1*5+1*8 = 33
Search and view products  2                    3                  3                 59
Checkout                  3                    3                  3                 69
Product maintenance       1                    1                  3                 39
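The weighted-sum damage score above can be sketched in a few lines of Python. This is a minimal illustration, not part of the original talk; the area names, weights, and levels are taken from the example table:

```python
# Weighted-sum damage score per test area.
# Weights from the slide: criticality w1=10, visibility w2=5, frequency w3=8.
# Each factor level is a number from 0 to 3.
WEIGHTS = {"criticality": 10, "visibility": 5, "frequency": 8}

def damage(levels):
    """Damage = Criticality*w1 + Visibility*w2 + Frequency*w3."""
    return sum(levels[factor] * w for factor, w in WEIGHTS.items())

areas = {
    "User Registration":        {"criticality": 2, "visibility": 1, "frequency": 1},
    "Search and view products": {"criticality": 2, "visibility": 3, "frequency": 3},
    "Checkout":                 {"criticality": 3, "visibility": 3, "frequency": 3},
    "Product maintenance":      {"criticality": 1, "visibility": 1, "frequency": 3},
}

for name, levels in areas.items():
    print(name, damage(levels))  # User Registration -> 33, Checkout -> 69, ...
```

Changing a weight here re-scores every area at once, which is why the approach is easy to automate in a spreadsheet or script.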
Determine Chance of Damage – 1/2
• Chance of damage: probability of damage caused by bugs left undetected in a test area
• Dependent on many factors, such as
– Area with many bugs found earlier (bug-prone area)
– Complexity of area
– Change in area
– Area with new technologies
– Area developed by bug-prone developers
• Other factors: turnover, number of people involved, time pressure
Determine Chance of Damage – 2/2
• Quantify chance of damage
– Each factor is assigned a weight from 0 to 10
– Each level is assigned a number from 0 to 3
– 5 factors, each with a maximum weight of 10 and a maximum level of 3, so the maximum score is 5*10*3 = 150; divide by 150 to get a percentage
• Example
Chance = (Bug-prone area*w1 + Complexity*w2 + Change*w3 + New tech*w4 + Bug-prone dev*w5) / 150
Test Areas                Bug-prone area (w1=10)  Complexity (w2=10)  Change (w3=5)  New tech (w4=5)  Bug-prone dev (w5=5)  Chance of Damage (%)
User Registration         1                       0                   1              0                0                     (1*10+0*10+1*5+0*5+0*5)/150 = 10
Search and view products  2                       2                   1              0                1                     33
Checkout                  2                       1                   1              1                1                     30
Product maintenance       2                       0                   0              0                1                     17
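The chance-of-damage percentage can be sketched the same way. Again a minimal illustration with the weights and levels taken from the example table; the factor names are shorthand introduced here:

```python
# Chance of damage as a percentage: weighted sum over five likelihood
# factors, divided by the maximum possible score of 150
# (5 factors x max weight 10 x max level 3), as on the slide.
WEIGHTS = {"bug_prone_area": 10, "complexity": 10, "change": 5,
           "new_tech": 5, "bug_prone_dev": 5}
MAX_SCORE = 150

def chance_percent(levels):
    """Chance = weighted sum of factor levels / 150, as a rounded percent."""
    score = sum(levels[factor] * w for factor, w in WEIGHTS.items())
    return round(100 * score / MAX_SCORE)

checkout = {"bug_prone_area": 2, "complexity": 1, "change": 1,
            "new_tech": 1, "bug_prone_dev": 1}
print(chance_percent(checkout))  # -> 30
```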
Determine Risk Exposure – 1/2
• Risk exposure determines the prioritized test areas
– Areas with high risk exposure need more attention than others
• Example
Risk Exposure = Damage * Chance of Damage
Test Areas                Damage  Chance  Risk Exposure
User Registration         33      10%     3.3
Search and view products  59      33%     19.5
Checkout                  69      30%     20.7
Product maintenance       39      17%     6.6
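Combining the two scores and sorting yields the test priority order. A minimal sketch using the damage and chance values from the example tables:

```python
# Risk exposure = damage * chance of damage; sorting by exposure gives
# the order in which test areas deserve attention.
areas = [
    ("User Registration",        33, 0.10),
    ("Search and view products", 59, 0.33),
    ("Checkout",                 69, 0.30),
    ("Product maintenance",      39, 0.17),
]

ranked = sorted(((name, round(dmg * chance, 1)) for name, dmg, chance in areas),
                key=lambda pair: pair[1], reverse=True)

for name, exposure in ranked:
    print(f"{name}: {exposure}")
# Checkout (20.7) and "Search and view products" (19.5) top the list,
# matching the table above.
```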
Determine Risk Exposure – 2/2
• Caveats
– The weights assigned to the damage and chance-of-damage factors are based on experience; there is no formal validation
– The numerical values are subjective
• But
– Imprecision can be alleviated by applying the scores consistently
– One can still differentiate areas whose risk values differ widely
– A review is needed when risk values are close
Revise Risk Assessment
• Use test results to verify risk assessment
• Use test results to revise risks of test areas
– Bug-prone areas
– Bug-prone developers
– Bug density
[Process diagram, revisited] Bug data from Test Design, Test Execution, and the Test Report feed back into Risk Identification and Assessment, refreshing the Prioritized Test Areas.
Agenda
• Introduction
• Typical Risk-based Testing Process
• Assessing risks of test areas
• Advantages and caveats
• Conclusions
Advantages and Caveats
• Advantages of RBT
– Helps make informed decisions about where to place more or fewer testing resources
– Supports test planning
– Maximizes the value of testing
– Increases test effectiveness
• Caveats
– RBT does not fit all kinds of projects
• It may not be suitable for safety-critical systems
– Risk of unrecognized or hidden bad bugs
– Subjective quantification of factors
Conclusions
• One can never test everything, so it is smart to decide what to test more and what to test less
• RBT is a useful approach to "test smarter"
– It helps improve testing effectiveness
– It helps cope with high pressure and limited resources
• RBT can be automated easily, e.g., using Excel
• One should apply this approach with careful justification
References
• Hans Schaefer, "Risk Based Testing, Strategies for Prioritizing Tests against Deadlines", http://www.methodsandtools.com/archive/archive.php?id=31
• Vu Nguyen, "Value-based Test Prioritization", http://www.hcmc-stc.org/blog/value-based-test-prioritization
• Barry Boehm, "Value-Based Software Engineering", ACM Software Engineering Notes, 28(2), 2003
• Barry Boehm and Victor Basili, "Software Defect Reduction Top 10 List", IEEE Computer, 34(1), pp. 135-137, Jan. 2001 (http://csse.usc.edu/csse/TECHRPTS/2001/usccse2001-510/usccse2001-510d.doc)
© 2014 HCMC Software Testing Club
THANK YOU