
Page 1: Software Engineering Practice - Software Quality Management

Software quality management

McGill ECSE 428 Software Engineering Practice

Radu Negulescu

Winter 2004

Page 2: Software Engineering Practice - Software Quality Management


About this course module

Quality assurance is:

• A necessary condition for success in a software project

• Complex and costly: explicit costs (testing, review) and hidden costs (lumped with dev or planning)

SQA requires elaborate techniques and planning

• In general, a vast topic

• Here we discuss the main points most likely to impact your practice

• A bit more depth than 321

Recommended reading:

• Jalote ch. 7 “Quality planning and defect estimation”, ch. 12 “Peer review”

• McConnell Survival Guide, ch. 9 “Quality assurance”, ch. 15 “System testing”

• McConnell Rapid Dev., ch. 18 “Daily build and smoke test”

Page 3: Software Engineering Practice - Software Quality Management


Overview

Basic concepts

Test case design

• Engineering principles

• Advanced techniques

Formal reviews

QA planning

Test automation

Page 4: Software Engineering Practice - Software Quality Management


QA notions

Recall the following notions

• Bugs: failure, fault, defect, error

• Major QA types: verification vs. validation

Quality may refer to:

• System crashes

• Stated specifications

• Unstated user expectations

• “Degree to which the software satisfies both stated and implied requirements” [McC, survival guide]

• Delivered defect density: de facto standard

Page 5: Software Engineering Practice - Software Quality Management


What is considered a defect?

Defect = cause of inconsistency with requirements or needs of the customer

• In several artifacts (SRS, DD, code)

• Several types: Jalote p. 269

• Possible severity scheme [after Jalote p. 270]:

Critical = show stopper: the user cannot carry out a function, or the defect affects the whole project schedule

Major = high impact, but not a show stopper: spans many modules, or stops the user from proceeding but has a workaround

Minor = inconveniences the user, but does not stop them from proceeding

Cosmetic = in no way affects performance or function; example: grammar mistakes or misaligned buttons

Page 6: Software Engineering Practice - Software Quality Management


Purpose of QA

Enhance quality

• Detect/manifest defects (search for failures)

• Locate/identify defects (debugging)

• Eliminate defects (a dev. job, but supported by QA followup)

Measure quality

• E.g. estimate reliability

Communicate about quality

• Raise issues

• Provide baseline data

• Psychological effect

• Morale (developer and customer)

Page 7: Software Engineering Practice - Software Quality Management


A business perspective

We pay our QA department to:

• Detect defects in-process (before our customers do!)

• Provide objective input for business decisions

• Keep stakeholders aware of concerns related to shipping a product

Page 8: Software Engineering Practice - Software Quality Management


QA activities

[Survival guide]

Debugging, code tracing

• E.g. trace just before an integration

Defect tracking

• For each defect, record dates, conditions, etc.

• Statistics on code, developers, QA activities

Unit testing: executed by developer of unit

Technical reviews: usually by peers

Integration testing: by the developer of the new code

System testing: by independent QA group

Acceptance testing: as specified; done for the customer

Page 9: Software Engineering Practice - Software Quality Management


QA dynamics

Most organizations err on the side of self-defeating QA shortcuts

• Schedule optimum at ~95% of defects removed! [Rapid dev., p. 69]

• Most organizations: < 50%!

• Repairs cost more downstream

• Repeat defect detection is a waste

• Low quality costs more in customer support

• Users remember low quality rather than timely delivery

On the other hand, QA can theoretically continue forever

• Infinite or huge number of input combinations

• Hidden QA costs often exceed development costs

• Need to put a cap on it

Main question:

• Exactly how much QA suffices? What kind? When? How?

Page 10: Software Engineering Practice - Software Quality Management


Test cases

Elements of a test case:

• Input data

• Execution conditions

• Expected output

• Link to a specific test objective

Page 11: Software Engineering Practice - Software Quality Management


Test cases

Example: web-based reservation system

• Reserve by clicking on time slots at least 24 hours in advance

• Up to 100 time slots can be reserved

(Columns up to "slot status" are inputs; hrsLeft and slot status after the arrow are the expected outputs.)

EQUIVALENCE TESTS - SELECT OPERATION

Test #    Description                   studentID  hrsLeft  time-crt  time-slot  slot status  ->  hrsLeft  slot status
sel.eq.1  Select free slot              1234       38       3/day14   5/day22    free         ->  37       res.1234
sel.eq.2  Select slot owned by other    1234       38       3/day14   5/day22    res.5678     ->  38       res.5678
sel.eq.3  Select same day               1234       38       3/day14   3/day14    free         ->  38       free
sel.eq.4  Select <24hrs next day        1234       38       3/day14   2/day15    free         ->  38       free
sel.eq.5  Select with no hours left     1234       0        3/day14   5/day22    free         ->  0        free

BOUNDARY TESTS - SELECT OPERATION

Test #    Description                   studentID  hrsLeft  time-crt  time-slot  slot status  ->  hrsLeft  slot status
sel.bd.1  Select slot owned by oneself  1234       38       3/day14   5/day22    res.1234     ->  38       res.1234
sel.bd.2  Select first day              1234       100      1/day0    3/day1     free         ->  99       res.1234
sel.bd.3  Select last day               1234       12       3/day5    3/day100   free         ->  11       res.1234
sel.bd.4  Select first slot             1234       12       3/day5    1/day22    free         ->  11       res.1234
sel.bd.5  Select last slot              1234       12       3/day5    8/day22    free         ->  11       res.1234
sel.bd.6  Select exactly 24h before     1234       12       3/day5    3/day6     free         ->  11       res.1234
sel.bd.7  Select with one hour left     1234       1        3/day14   5/day22    free         ->  0        res.1234
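As a sketch, a few of the equivalence tests above could be automated as follows. The ReservationSystem class and its select interface are hypothetical stand-ins invented for illustration; only the test values come from the table.

```python
import unittest

class ReservationSystem:
    """Hypothetical stand-in for the reservation back end (invented interface)."""
    def __init__(self, student_id, hrs_left, current_time):
        self.student_id = student_id
        self.hrs_left = hrs_left
        self.current = current_time   # (hour, day), e.g. (3, 14) for 3/day14
        self.slots = {}               # (hour, day) -> owner studentID

    def select(self, slot):
        hour, day = slot
        cur_hour, cur_day = self.current
        hours_ahead = (day - cur_day) * 24 + (hour - cur_hour)
        # Reject: slot already owned, less than 24h in advance, or no hours left.
        if slot in self.slots or hours_ahead < 24 or self.hrs_left < 1:
            return
        self.slots[slot] = self.student_id
        self.hrs_left -= 1

class SelectEquivalenceTests(unittest.TestCase):
    def test_sel_eq_1_select_free_slot(self):
        rs = ReservationSystem(1234, 38, (3, 14))
        rs.select((5, 22))
        self.assertEqual((rs.hrs_left, rs.slots[(5, 22)]), (37, 1234))

    def test_sel_eq_2_select_slot_owned_by_other(self):
        rs = ReservationSystem(1234, 38, (3, 14))
        rs.slots[(5, 22)] = 5678
        rs.select((5, 22))
        self.assertEqual((rs.hrs_left, rs.slots[(5, 22)]), (38, 5678))

    def test_sel_eq_4_select_less_than_24h_next_day(self):
        rs = ReservationSystem(1234, 38, (3, 14))
        rs.select((2, 15))            # only 23 hours ahead
        self.assertEqual((rs.hrs_left, rs.slots.get((2, 15))), (38, None))

if __name__ == "__main__":
    unittest.main()
```

Note how each test carries its input data, execution conditions, and expected output, and its name links it to a test objective from the table.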

Page 12: Software Engineering Practice - Software Quality Management


Test cases

Test cases may be interactive

Example: black-box test of the memory functions in the Windows calculator

Entry: calculator window displayed

Steps: type in 26, click MS, type in 38, click MR, check output 26, click MC, click MR, check output 0
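Scripted, the same interactive test might look like the sketch below; the Calculator class is a hypothetical stand-in for driving the real UI.

```python
class Calculator:
    """Hypothetical stand-in for driving the calculator UI."""
    def __init__(self):
        self.display = 0   # entry condition: calculator window displayed
        self.memory = 0

    def type_in(self, value):
        self.display = value

    def click_MS(self):    # Memory Store
        self.memory = self.display

    def click_MR(self):    # Memory Recall
        self.display = self.memory

    def click_MC(self):    # Memory Clear
        self.memory = 0

def test_memory():
    calc = Calculator()
    calc.type_in(26)
    calc.click_MS()
    calc.type_in(38)
    calc.click_MR()
    assert calc.display == 26   # check output 26
    calc.click_MC()
    calc.click_MR()
    assert calc.display == 0    # check output 0

test_memory()
print("memory test passed")
```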

Page 13: Software Engineering Practice - Software Quality Management


Test procedures

Test procedure = detailed instructions for setup, execution and evaluation of test results

• Same for a large group of test cases

• Often in form of checklists

• Example:

Platform: Windows 2000, 128 MB RAM

Calculator is the only application running

Read test case from file CalcTest112.xls

Apply input data; record actual output; report differences from expected output

Test script = program that automates execution of a test procedure

• Often in a specialized testing language akin to Visual Basic or C++ (Rational suite: “SQABasic”)

Page 14: Software Engineering Practice - Software Quality Management


Overview

Basic concepts

Test case design

• Engineering principles

• Advanced techniques

Formal reviews

QA planning

Test automation

Page 15: Software Engineering Practice - Software Quality Management


Tests must be repeatable

Why? A number of reasons

• Avoid judgment calls after test execution: pass or fail, bug or feature, validity of input data, execution conditions, …

• Inform the tester of what needs to be done: testers shouldn’t have to understand the inner workings of the program, and different people understand different things…

• Enable automated test execution

• Report failures back to development: “What do you mean ‘it doesn’t work’ when I know it works!”

How? Full details!

• Bad: “save number, type number, restore number, see first number”. What if the numbers are the same? What if the tester types −0, or a number over the limit, or one that is truncated? What if “restore” is interpreted as “store another number”?

• Good: type 12, click MS, type 33, click MR, see 12

• Keep the principle even if interaction or dynamic input is required!

Page 16: Software Engineering Practice - Software Quality Management


Complete input coverage is not feasible

Example: purchase something online

How many possible inputs?

• Text fields

• Button clicks

• Hyperlink clicks

And, how many possible input combinations?

• One string: 26 x 26 x 26 x …

• Infinite sequences of mouse clicks

• Delays, ...

And, how many possible interleavings?

• Client events vs. server events

• Different threads on server

Page 17: Software Engineering Practice - Software Quality Management


Testing cannot prove complete absence of bugs

Misconception: “my code is fully tested so it has no problems”

Reality: “testing can only prove presence of bugs” [Dijkstra]

Edsger Wybe Dijkstra 1930-2002

• A pioneer of structured and disciplined programming

• “Goto considered harmful”

• Key contributions to concurrency, system design, formal methods,software engineering, ...

Page 18: Software Engineering Practice - Software Quality Management


Coverage

Coverage = percentage of a class of IUT (implementation under test) elements that have been exercised at least once by a set of tests

• Code coverage

• State coverage

• Input coverage

• Use case coverage

• ...

A compromise must be reached between coverage and effort

• What is a reasonable compromise?

Page 19: Software Engineering Practice - Software Quality Management


Targeted tests are more efficient!

Example: typo-like faults

• Simplifying assumptions: faults uniformly distributed in the code; perfect detection

• Assume random tests: the probability of detecting a fault at time t ≈ defect concentration ≈ remaining number of faults

Exponential decay: probability ≈ (1/f0) · e^(−f0·t); the effort to detect a new fault ≈ 1/probability, i.e., exponential in t

• Assume tests targeted to code coverage: each test covers a few new statements and is equally likely to discover a fault; the effort per fault stabilizes to a constant, and all such faults are eventually discovered

• The simplifying assumptions hold imperfectly, but well enough

• Similar ones hold well enough for other types of faults
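A toy simulation, under the slide’s simplifying assumptions (uniformly distributed faults, perfect detection), makes the contrast concrete; all numbers are arbitrary.

```python
import random

def simulate(n_statements=10_000, n_faults=100, stmts_per_test=50, seed=1):
    rng = random.Random(seed)
    faults = set(rng.sample(range(n_statements), n_faults))

    # Random testing: each test exercises an arbitrary sample of statements,
    # so late tests mostly re-cover code whose faults were already found.
    found, random_tests = set(), 0
    while len(found) < n_faults and random_tests < 100_000:
        random_tests += 1
        found |= faults & set(rng.sample(range(n_statements), stmts_per_test))

    # Coverage-targeted testing: each test exercises only new statements,
    # so the effort per fault stays roughly constant.
    uncovered = list(range(n_statements))
    rng.shuffle(uncovered)
    found, targeted_tests = set(), 0
    while uncovered and len(found) < n_faults:
        targeted_tests += 1
        batch, uncovered = uncovered[:stmts_per_test], uncovered[stmts_per_test:]
        found |= faults & set(batch)

    print(f"random:   {random_tests} tests to find all {n_faults} faults")
    print(f"targeted: {targeted_tests} tests (bounded by "
          f"{n_statements // stmts_per_test} total)")

simulate()
```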

Page 20: Software Engineering Practice - Software Quality Management


Get a feel for coverage effects!

Naval combat game: shooting in a pattern is much more effective than shooting at random

[Figure: two battleship-style grids contrasting scattered random shots with shots laid out in a regular pattern]

Page 21: Software Engineering Practice - Software Quality Management


Typical test targets

Code coverage

• Statement

• Condition

Black-box coverage

• Code coverage won’t detect omission faults

• Use case coverage

• UI state transitions

• Activate / not activate each particular output, e.g., buy a book or not; change address or not

Invalid data: one invalid test vector for each constraint (see the sketch after this list)

• Preconditions

• Invalid formats (text in number fields)

Boundary conditions: one test for each boundary

Configurations of the deployment platform

All of the above are typical fault targets.
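A minimal sketch of “one invalid vector per constraint”, using a hypothetical input validator for the reservation form; field names and error messages are invented.

```python
def validate(student_id: str, hours: str):
    """Hypothetical form validator; returns a list of constraint violations."""
    errors = []
    if not student_id.isdigit():
        errors.append("studentID must be numeric")
    if not hours.isdigit():
        errors.append("hours must be numeric")
    elif not 0 <= int(hours) <= 100:
        errors.append("hours must be between 0 and 100")
    return errors

# One valid baseline vector, then one invalid vector per constraint.
cases = [
    (("1234", "38"),  []),
    (("12a4", "38"),  ["studentID must be numeric"]),        # invalid format
    (("1234", "abc"), ["hours must be numeric"]),            # text in number field
    (("1234", "101"), ["hours must be between 0 and 100"]),  # boundary violated
]
for (sid, hrs), expected in cases:
    assert validate(sid, hrs) == expected, (sid, hrs)
print("all invalid-data vectors handled as expected")
```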

Page 22: Software Engineering Practice - Software Quality Management


Typical test targets

A frequently encountered pattern:

• For each use case: a “best test” that exercises most of the code with typical values, plus one test for each invalid or boundary condition

Example: testing “select” in web reservation system

• For each non-functional requirement: prove that it is met, using typical data

Page 23: Software Engineering Practice - Software Quality Management


Typical test targets

What else to test

• Install, setup

• Help

• OS versions and locales

• Screen resolutions and depths

• Multiple instances

• Window minimize, drag/redraw, close

• Memory leaks, invasiveness

• User manual and help: go through each keystroke in the manual; check the result

Page 24: Software Engineering Practice - Software Quality Management


Multidimensional coverage

Same test case may count in several targets

• E.g. several memory functions in Windows calculator

• Keep error and invalid conditions separate

Example: platform coverage

• Platform = hardware and software environment; may include other concurrently running applications (why?)

• Pattern approach: list all likely combinations; for each combination, assign a projected usage value and a technical risk value

• Cycle through the list, changing platform periodically, so that different tests are run on different platforms (see the sketch below)

Use-based: most tests should run on the most heavily used platform

Risk-based: test the highest risk exposure first

Requires an organized, swap-ready lab

In the end we reasonably cover all platforms and all tests
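A sketch of the cycling pattern; platform names and share counts are invented for illustration.

```python
# Hypothetical platform list; each platform appears as many times as its
# usage/risk share, so the heavily used platform gets the most test runs.
platforms = ["Win2000/IE6", "Win2000/IE6", "Win2000/IE6",
             "Win98/IE5", "Win98/IE5",
             "WinNT4/Mozilla"]
tests = [f"test_{i:03d}" for i in range(6)]

# Each pass rotates the assignment, so over several passes every test
# eventually runs on every platform (requires a swap-ready lab).
for pass_no in range(3):
    print(f"-- test cycle {pass_no + 1} --")
    for i, test in enumerate(tests):
        print(f"{test} on {platforms[(i + pass_no) % len(platforms)]}")
```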

Page 25: Software Engineering Practice - Software Quality Management


And finally...

Test cases must be easy to maintain!

• Will evolve with the system

• Executed in an organized manner

• Coverage estimation

Not too simple, not too complex

• Depending on needs

Structured, linked to some test goal

• Grouped by function tested

• Grouped by type of coverage / type of testing

Page 26: Software Engineering Practice - Software Quality Management


Scenario-based testing

Based on

• User experience: create account, purchase, access affiliate, etc.

• User type: novice user, power user, intermittent user

Elaboration

• In parallel with development

• Separate test procedure from test data

Procedure: instructions to follow to run the test; navigation instructions; same for all tests

Test data: any data typed into a form or field by a user; the variable part of a test

Other data: any other field; any decision

Execution

• QA, contractors, students, support staff

• Test script

Page 27: Software Engineering Practice - Software Quality Management


Example: purchase a book from Amazon.com

Procedure: can be applied to many test cases

• Go to URL www.amazon.com

• Wait for home page to load

• Type in the ISBN of the book

• Add book to shopping cart

• Proceed to checkout

• Log in with user name and password

• Place order

Data: make sure the test case is repeatable

• Book ISBN

• Customer user name, password

• Exactly which buttons to click, in order
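A sketch of the procedure/data separation; the Browser class is a hypothetical stand-in for a UI-driving tool, and all data values are placeholders.

```python
class Browser:
    """Hypothetical stand-in for a UI-driving test tool."""
    def go(self, url):
        print(f"open {url}")
    def type_into(self, field, value):
        print(f"type {value!r} into {field!r}")
    def click(self, button):
        print(f"click {button!r}")

def purchase_procedure(browser, data):
    """The fixed part of the test: the same navigation for every test case."""
    browser.go("http://www.amazon.com")
    browser.type_into("search", data["isbn"])
    browser.click("Add to Shopping Cart")
    browser.click("Proceed to Checkout")
    browser.type_into("username", data["user"])
    browser.type_into("password", data["password"])
    browser.click("Place Order")

# Test data: the variable part, kept out of the procedure (placeholder values).
test_data = [
    {"isbn": "0-123456-78-9", "user": "tester1", "password": "pw1"},
    {"isbn": "0-987654-32-1", "user": "tester2", "password": "pw2"},
]
for data in test_data:
    purchase_procedure(Browser(), data)
```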

Page 28: Software Engineering Practice - Software Quality Management


Exploratory testing

Example: explore statistical functions in Windows calculator

• Press “Sta”

• Load statistics box

• This activates the blue buttons

• Cover all the activated functions: exercise each function; exercise all mouse events

Right-click brings up help!

• Use help: Help → Index → “statistical calculations” → “using key sequences as functions”

• Surprise: interoperability with Notepad, e.g., type 123:m for the memory function

• High priority? Explore that!

Test cases defined and executed on the fly

Dynamically targeted!

Page 29: Software Engineering Practice - Software Quality Management


Result of exploratory testing

During exploratory test we must capture

• Functions, options or sub-functions being explored

• Test cases attempted

• State of the application: comments, notes, images

• Hints, reminders and observations that may be useful to future testers

• Date, platform, configuration under test

(Test must be repeatable!)

• Oracles, a “strategy to assess correctness”: a specified property; checks such as no OS crash or help active; or just an input-to-output mapping

• Other relevant details about the decisions taken, the state of the product, and the result of the test

Records: concise, tabular, chronological

Page 30: Software Engineering Practice - Software Quality Management


Daily build and smoke test

Build daily

• Keep development in sync

• Ensure few open defects

Smoke test

• Exercise the system end-to-end

• Not exhaustive: detect major problems

• Evolve as the system grows

• Build only ready-to-build code, after private builds

Managing the daily builds

• Run by QA, not DEV

• Stop work for broken builds

• Release in the morning
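A minimal daily-build driver, sketched below; the make release build command, the smoke_tests/ suite location, and the pytest runner are assumptions for illustration, not prescribed by the slides.

```python
import datetime
import subprocess
import sys

def daily_build_and_smoke_test():
    stamp = datetime.date.today().isoformat()

    # 1. Build from ready-to-build sources (build command is a placeholder).
    build = subprocess.run(["make", "release"], capture_output=True, text=True)
    if build.returncode != 0:
        print(f"[{stamp}] BUILD BROKEN - stop other work until it is fixed")
        return 1

    # 2. Smoke test: end-to-end but not exhaustive; suite path is a placeholder.
    smoke = subprocess.run([sys.executable, "-m", "pytest", "smoke_tests/"],
                           capture_output=True, text=True)
    if smoke.returncode != 0:
        print(f"[{stamp}] SMOKE TEST FAILED - do not release this build")
        return 1

    print(f"[{stamp}] build and smoke test OK - release in the morning")
    return 0

if __name__ == "__main__":
    sys.exit(daily_build_and_smoke_test())
```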

Page 31: Software Engineering Practice - Software Quality Management


Overview

Basic concepts

Test case design

• Engineering principles

• Advanced techniques

Formal reviews

QA planning

Test automation

Page 32: Software Engineering Practice - Software Quality Management


Reviews

Manually examine a software artifact according to specified criteria

• Inspections, reviews, walkthroughs, ...

Reviews are more effective than testing

• Number of defects found

• Effort per defect

• See Jalote p. 247: review capability baseline

• Point to cause, not just symptom

• Can be used for artifacts other than code, earlier in the process

• Training value for novices

• Good leverage for experts

Reviews cannot replace testing

• Subjective interpretation not completely avoidable

• Little or no automation

• Different types of defects

• Different expertise

Page 33: Software Engineering Practice - Software Quality Management


Typical review process

[After Jalote]

Review meeting

• Moderated

• Focused on defects, not fixes

• Focused on artifact, not author

• Check preparedness

• Go through the artifact line by line

• Reconcile defect reports

[Process diagram: Entry criteria → Planning → Individual review → Review meeting → Rework & follow-up. Planning produces the schedule and review team; individual review produces defect logs and time spent; the review meeting produces the defect list, issue list, and metrics.]

Page 34: Software Engineering Practice - Software Quality Management


Variations

Different focus for different artifacts

• Jalote p. 242

Level of formality

• Inspections, walkthroughs, etc.

One-person reviews

• Reduce costs, but also effectiveness

• Keep psychological effects

Active design reviews

• Really, a type of inspection

• No meeting

• Looks like several one-person reviews

• Opportunity to compare defects found

Page 35: Software Engineering Practice - Software Quality Management


Overview

Basic concepts

Test case design

• Engineering principles

• Advanced techniques

Formal reviews

QA planning

Test automation

Page 36: Software Engineering Practice - Software Quality Management


Test task breakdown

Establish test goals

• Cohesive functionality

• Easy to communicate

Task granularity

• Aim for 90-minute chunks

• Time will go down to 45 and 30 minutes on second and third pass

Test breakdown should match design breakdown

• Test a piece of code as soon as it is created

• Test cases evolve with code

Risk dictates order of testing

• Important bugs can be fixed while lower-risk ones are being tested

• Technical risks based on work done and possible impacts

• Commercial risks based on known business opportunities or current customers

• Risks should be re-evaluated on every test cycle! (situations change)

Page 37: Software Engineering Practice - Software Quality Management


Example

Windows calculator

• Test objectives / tasks: UI components; arithmetic and logic; statistical functions; typical usage scenarios; platforms, install, ...

Page 38: Software Engineering Practice - Software Quality Management


QA stages

Every major activity should have sign-off criteria

• Sound motivation, visibility, control

Recommended QA activities from [Survival guide, p. 217]:

• UI prototype review

• User manual/requirements specification review *

• Architecture review

• Detailed design review

• Code review

• Unit testing **

• Integration testing *

• Daily smoke test

• System testing **

Widespread (but not best) practice:

• items marked * are sometimes used; items marked ** are always used

Page 39: Software Engineering Practice - Software Quality Management


Varying levels of QA

If more stages are needed

• More detailed testing: more criteria to cover; higher coverage of each criterion

• More artifacts under review: test plan / test suite review

• More critical reviewing: extended review checklist; cross-reviewing by more experienced reviewers

If fewer stages are needed

• Merge several QA steps

• Be careful not to decrease development efficiency

Page 40: Software Engineering Practice - Software Quality Management


QA allocation

How much QA effort is needed, and how much is too much?

#stages = log(#injected / #on-release) / log(1 / (1 − DRE))

Numbers depend on

• Effort and size metrics for DEV and QA activities

• Baseline metrics

• Definition and type of defects

• Organization capability

• Developer training and experience

• Technology used

Page 41: Software Engineering Practice - Software Quality Management


Defect injection and removal

DIR = defect injection rate

• # defects introduced at each development stage

• Normalized, per function point or per person-hour

Fault potential = Sum((DIR per stage)*(FP or effort))

DRE = defect removal efficiency

• % of existing defects removed per stage of testing or review

• Fraction of defects escaping in-process removal = Product(1 − DRE per stage)

Page 42: Software Engineering Practice - Software Quality Management


Example

A small project

• 200 FP

• Aim for fewer than 2 defects/KLOC on release

• Assume that black-box, structural, integration, and system testing will be performed.

• How many inspection steps do we need to perform?

Use Jones’ rules of thumb

• Fault potential = FP^1.25 = 753

• Size = 100 LOC per FP × 200 FP = 20 KLOC

• Injected faults/KLOC = 753 / 20 ≈ 37.65

• 4 types of testing (black-box, structural, integration, system) leave in 0.7^4 ≈ 24% of the faults

• Inspections need to leave in less than 2 / (37.65 × 0.24) ≈ 22% of the faults

• Two types of inspection leave in 0.4^2 = 16% of the faults

• Therefore, two inspection steps (e.g. SRS and design inspections) suffice

Coefficients should be adapted to project type / organization

• Also see Jalote, ch. 12 for calculations using baselines
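The same arithmetic, scripted; the coefficients (100 LOC/FP, 0.7 leave-in per testing stage, 0.4 per inspection) are the slide’s Jones rules of thumb.

```python
import math

fp = 200                     # project size in function points
target_density = 2.0         # allowed defects/KLOC on release

fault_potential = fp ** 1.25                  # ~753 defects (Jones)
kloc = 100 * fp / 1000                        # ~100 LOC per FP -> 20 KLOC
injected_per_kloc = fault_potential / kloc    # ~37.65 defects/KLOC

leave_in_testing = 0.7 ** 4                   # 4 testing stages, ~24% left in
needed = target_density / (injected_per_kloc * leave_in_testing)
print(f"inspections must leave in at most {needed:.0%} of faults")   # ~22%

# Each inspection leaves in ~40% of faults: smallest n with 0.4**n <= needed.
n = math.ceil(math.log(needed) / math.log(0.4))
print(f"inspection steps required: {n}")      # -> 2 (e.g. SRS and design)
```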

Page 43: Software Engineering Practice - Software Quality Management


Quality management

Procedural: defined procedures but no measures

• Hard to tell whether the QA procedures were executed well or not

Quantitative: quality measures, procedure independent

• Setting a quality goal: defect concentration; reliability

• Monitoring and control: set intermediate goals; refer to them as the project runs; take action on deviations

Page 44: Software Engineering Practice - Software Quality Management


Setting a quality goal

Defect concentration on release

• Industry averages, past performance: # defects per FP or KLOC; estimated size → total # defects on release

Needs to be a tangible goal

• Convert to in-process DRE, based on estimated DIR

• Convert to # defects in acceptance testing, based on past data (Infosys: 5–10% of total defects! [Jalote p. 155])

Page 45: Software Engineering Practice - Software Quality Management


Monitoring

Good quality management should send warning signs early

• Set intermediate goals and take action if not met

Intermediate project goals based on

• Rayleigh curve of defect discovery (see the sketch below)

• Baseline data on relative effectiveness [Jalote p.150; p.155]

• Baseline data per type of QA activity

If current data is out of norm, determine cause and take action

• Look into related factors: effort, experience, etc. [Jalote, table 12.3, p. 249]

Action:

• Additional QA steps, as seen

• Review QA procedure & redo

• Training, prototyping, ...
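A sketch of turning the Rayleigh curve into intermediate goals; the total-defect estimate and the peak month are placeholder numbers.

```python
import math

def rayleigh_cumulative(t, total_defects, t_peak):
    """Expected cumulative defects discovered by time t, assuming discovery
    follows a Rayleigh profile whose rate peaks at t_peak."""
    return total_defects * (1.0 - math.exp(-(t * t) / (2.0 * t_peak * t_peak)))

total, peak = 750, 4.0    # placeholder: 750 estimated defects, peak in month 4
for month in range(1, 9):
    goal = rayleigh_cumulative(month, total, peak)
    print(f"month {month}: intermediate goal ~{goal:.0f} defects found")
# If actual counts deviate widely from these goals, determine the cause
# (effort, experience, ...) and take corrective action.
```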

Page 46: Software Engineering Practice - Software Quality Management


Release criteria

Should be based on objective, measurable criteria

• E.g. “no open high-severity defects” and “less than 10 defects on AT”

• Avoid the need for a judgment call, and the associated pressures!

The decision lies with product management

• DEV may be oblivious to shortcomings of its brainchild

• QA can propose alternatives but not decide what testing to skip

• MGMT knows best the business goals and priorities

Read the trend of defect discovery to estimate remaining defects

• Use the failure data from the final stages of testing: in the simplest form, remaining defects ≈ # new defects found recently, extrapolated (a bit too simplistic)

• Compare actual defect data to quality goals: the actual-vs.-planned defect ratio is probably the same for discovered and remaining defects; apply corrections for actual development effort and number of QA stages

Page 47: Software Engineering Practice - Software Quality Management


Overview

Basic concepts

Test case design

• Engineering principles

• Advanced techniques

Formal reviews

QA planning

Test automation

Page 48: Software Engineering Practice - Software Quality Management


Applying regression tests

Needed to support refactoring and other processes

Run a test harness / test script

• Read tests from a user-edited test manifest

• Can be a spreadsheet file

• CSV (comma-separated values) → a handy format supported by many databases

• Open an ODBC or JDBC link to read test cases directly into the test harness program

Rational Robot
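A sketch of such a harness in Python; the manifest columns (id, input, expected) are an invented format, not a standard one.

```python
import csv

def run_manifest(path, function_under_test):
    """Read test cases from a user-edited CSV manifest and run them."""
    passed, failed = 0, 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):          # columns: id,input,expected
            actual = str(function_under_test(row["input"]))
            if actual == row["expected"]:
                passed += 1
            else:
                failed += 1
                print(f"{row['id']}: expected {row['expected']!r}, got {actual!r}")
    print(f"{passed} passed, {failed} failed")

# Self-contained demo: write a tiny manifest, then run it against str.upper.
with open("manifest.csv", "w", newline="") as f:
    f.write("id,input,expected\n")
    f.write("t1,abc,ABC\n")
    f.write("t2,x1z,X1Z\n")

run_manifest("manifest.csv", str.upper)
```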

Page 49: Software Engineering Practice - Software Quality Management


Test case elaboration

Rational TestFactory

• Produce a “best test” using typical values

• Target tests to uncovered code

• Low coverage rates

UI mapping

• Based on component registry / OS API

Coverage analysis

• Based on automated code instrumentation

Page 50: Software Engineering Practice - Software Quality Management


Static analysis

Automated tools to study code

• Compiler warnings

• Lint

• W3C validators to check that HTML conforms to standards

• POSIX compliance checking

• Win API checking

• Code complexity measures, including LOC, McCabe cyclomatic complexity, etc.
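As an illustration of the last item, a crude cyclomatic-complexity estimate can be scripted against a parse tree; this is a simplified variant of McCabe’s measure (one decision point per branch construct), not a full implementation.

```python
import ast

BRANCHES = (ast.If, ast.For, ast.While, ast.ExceptHandler, ast.BoolOp, ast.IfExp)

def mccabe_estimate(source):
    """Rough cyclomatic complexity per function: 1 + number of branch points."""
    results = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            branches = sum(isinstance(n, BRANCHES) for n in ast.walk(node))
            results[node.name] = 1 + branches
    return results

sample = """
def classify(x):
    if x < 0:
        return "negative"
    elif x == 0:
        return "zero"
    return "positive"
"""
print(mccabe_estimate(sample))   # {'classify': 3}
```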

Page 51: Software Engineering Practice - Software Quality Management


Discussion

Thank you!

Any questions?