
Software Quality

CIS 375

Bruce R. Maxim

UM-Dearborn


Software Quality Principles

• Conformance to software requirements is the foundation from which quality is measured.

• Specified standards define a set of development criteria that guide the manner in which software is engineered.

• Software quality is suspect when a software product conforms to its explicitly stated requirements but fails to conform to the customer's implicit requirements (e.g., ease of use).


McCall’s Quality Factors

• Product Operation
  – Correctness
  – Efficiency
  – Integrity
  – Reliability
  – Usability

• Product Revision
  – Flexibility
  – Maintainability
  – Testability

• Product Transition
  – Interoperability
  – Portability
  – Reusability



McCall’s Software Metrics

• Auditability
• Accuracy
• Communication commonality
• Completeness
• Consistency
• Data commonality
• Error tolerance
• Execution efficiency
• Expandability
• Generality
• Hardware independence
• Instrumentation
• Modularity
• Operability
• Security
• Self-documentation
• Simplicity
• Software system independence
• Traceability
• Training


FURPS Quality Factors

• Functionality

• Usability

• Reliability

• Performance

• Supportability


ISO 9126 Quality Factors

• Functionality

• Reliability

• Usability

• Efficiency

• Maintainability

• Portability


Measurement Process - 1

• Formulation
  – derivation of software measures and metrics appropriate for the software representation being considered

• Collection
  – the mechanism used to accumulate the data used to derive the software metrics

• Analysis
  – computation of the metrics


Measurement Process - 2

• Interpretation
  – evaluation of the metrics to gain insight into the quality of the work product

• Feedback
  – recommendations derived from interpretation of the metrics are transmitted to the software development team


Technical Metric Formulation

• The objectives of measurement should be established before collecting any data.

• Each metric should be defined in an unambiguous manner.

• Metrics should be based on a theory that is valid for the application domain.

• Metrics should be tailored to accommodate specific products and processes.


Software Metric Attributes

• Simple and computable

• Empirically and intuitively persuasive

• Consistent and objective

• Consistent in use of units and measures

• Programming language independent

• Provides an effective mechanism for quality feedback


Representative Analysis Metrics

• Function-based metrics

• Bang metric
  – function-strong or data-strong

• Davis specification quality metrics


Specification Quality Metrics - 1

nr = nf + nnf

nr = total # of requirements in the specification.

nf = # of functional requirements.

nnf = # of non-functional requirements.

Specificity: Q1 = nai / nr

nai = # of requirements for which all reviewers agreed on their interpretation.


Specification Quality Metrics - 2

Completeness: Q2 = nu / (ni * ns)

nu = # of unique function requirements.

ni = # of inputs specified.

ns = # of states specified.

Overall completeness: Q3 = nc / (nc + nnv)

nc = # of requirements validated as correct.

nnv = # of requirements not yet validated.
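These ratios are straightforward to compute once the counts are in hand. Below is a minimal Python sketch; the function names and all counts are invented for illustration and would normally come from a requirements review.

```python
# Davis specification quality metrics: a minimal sketch.
# All counts are invented; in practice they come from a requirements review.

def specificity(n_ai: int, n_r: int) -> float:
    """Q1 = nai / nr: fraction of requirements all reviewers interpret identically."""
    return n_ai / n_r

def completeness(n_u: int, n_i: int, n_s: int) -> float:
    """Q2 = nu / (ni * ns): unique functions vs. input/state combinations."""
    return n_u / (n_i * n_s)

def overall_completeness(n_c: int, n_nv: int) -> float:
    """Q3 = nc / (nc + nnv): validated-correct vs. all reviewed requirements."""
    return n_c / (n_c + n_nv)

n_f, n_nf = 90, 30      # functional and non-functional requirements
n_r = n_f + n_nf        # total requirements
print(f"Q1 = {specificity(102, n_r):.2f}")    # 102 requirements with full agreement
print(f"Q2 = {completeness(48, 10, 6):.2f}")  # 48 unique functions, 10 inputs, 6 states
print(f"Q3 = {overall_completeness(100, 20):.2f}")
```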


Representative Design Metrics - 1

• Architectural design metrics
  – Structural complexity (based on module fan-out)
  – Data complexity (based on module interface inputs and outputs)
  – System complexity (sum of structural and data complexity)
  – Morphology (number of nodes and arcs in the program graph)
  – Design structure quality index (DSQI)


Representative Design Metrics - 2

• Component-level design metrics
  – Cohesion metrics (data slice, data tokens, glue tokens, superglue tokens, stickiness)
  – Coupling metrics (data and control flow, global, environmental)
  – Complexity metrics (e.g., cyclomatic complexity)

• Interface design metrics (e.g., layout appropriateness)


Halstead’s Software Science: Source Code Metrics

• Overall program length
• Potential minimum algorithm volume
• Actual algorithm volume
  – number of bits used to specify the program
• Program level
  – software complexity
• Language level
  – constant for a given language
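For concreteness, here is a sketch of the standard Halstead computations from operator and operand counts. All counts (including the n2* parameter count used for the potential volume) are invented; a real tool would extract them by parsing the source code.

```python
import math

# Halstead software science measures; all counts below are invented.
n1, n2 = 10, 16        # distinct operators, distinct operands
N1, N2 = 40, 55        # total operator uses, total operand uses
n2_star = 5            # the algorithm's input/output parameters

N = N1 + N2                                       # overall program length
vocab = n1 + n2                                   # program vocabulary
V = N * math.log2(vocab)                          # actual volume, in bits
V_star = (2 + n2_star) * math.log2(2 + n2_star)   # potential minimum volume
L = V_star / V                                    # program level (lower = more complex)
lam = L * V_star                                  # language level, roughly constant per language

print(f"N = {N}, V = {V:.1f} bits, V* = {V_star:.1f}, L = {L:.4f}, lambda = {lam:.3f}")
```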


Testing Metrics

• Metrics that predict the likely number of tests required during various testing phases

• Metrics that focus on test coverage for a given component


Estimating Number of Errors: Error Seeding - 1

(s / S) = (n / N)

S = # of seeded errors

s = # seeded errors found

N = # of actual errors

n = # of actual errors found so far
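Rearranging (s / S) = (n / N) gives the estimate N ≈ n * S / s directly. A minimal sketch with invented numbers:

```python
# Error seeding: estimate the total number of real errors, N ≈ n * S / s.

def estimate_total_errors(S: int, s: int, n: int) -> float:
    """S errors seeded, s of them found, n real errors found so far."""
    return n * S / s

# If 10 errors were seeded, 8 of them found, and 20 real errors found
# so far, the estimate is 25 real errors in total (5 still undiscovered).
print(estimate_total_errors(S=10, s=8, n=20))  # 25.0
```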


Estimating Number of Errors: Error Seeding - 2

x = # of real errors found by tester 1.

y = # of real errors found by tester 2.

q = # of real errors found by both.

n = total # of real errors.

E(1) = x / n (tester 1's detection rate), estimated by q / y

E(2) = y / n (tester 2's detection rate), estimated by q / x

n = q / (E(1) * E(2)) = (x * y) / q


Estimating Number of Errors: Error Seeding - 3

Assume:

x = 25

y = 30

q = 15

E(1) = (15 / 30) = .5

E(2) = (15 / 25) = .6

n = 15 / ((.5)(.6)) = 50 errors
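The estimate simplifies algebraically to n = (x * y) / q, which is easy to check against the slide's numbers:

```python
# Two-tester error estimate, using the values from the example above.

def estimate_errors(x: int, y: int, q: int) -> float:
    E1 = q / y            # estimated detection rate of tester 1
    E2 = q / x            # estimated detection rate of tester 2
    return q / (E1 * E2)  # algebraically the same as (x * y) / q

print(estimate_errors(x=25, y=30, q=15))  # 50.0 errors, as computed above
```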


Software Confidence

S = # of seeded errors.

N = # of actual errors claimed to remain.

n = # of actual errors found.

C (confidence level) = 1, if n > N

C (confidence level) = S / (S – N + 1), if n <= N
(assuming all S seeded errors have been found)

Example: N = 0 and S = 10
C = 10 / (10 – 0 + 1) = 10/11 ≈ 91%


Confidence Example

How many seeded errors need to be used and found to have 98% confidence that a program is bug free?

N = 0

C = S / (S – 0 + 1) = 98/100

Solving for S: S = 49
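Both examples can be verified with a short sketch; seeds_needed is a hypothetical helper that searches for the smallest S meeting a target confidence:

```python
# Confidence from error seeding, assuming all S seeded errors were found.

def confidence(S: int, N: int = 0) -> float:
    """Confidence that no more than N real errors remain."""
    return S / (S - N + 1)

def seeds_needed(target: float, N: int = 0) -> int:
    """Smallest S such that confidence(S, N) >= target (hypothetical helper)."""
    S = N + 1
    while confidence(S, N) < target:
        S += 1
    return S

print(f"{confidence(10):.2f}")  # 0.91 -- the ten-seed example above
print(seeds_needed(0.98))       # 49  -- matches the solved example
```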


Failure Intensity

• Suppose failure intensity is proportional to the # of faults or errors present at the start of testing (call the proportionality constant K).
  – function A has 90% of duty time
  – function B has 10% of duty time

• Suppose there are 100 total errors: 50 in A and 50 in B.
  Total failure intensity = (.9)(50K) + (.1)(50K) = 50K.
  Removing B's 50 errors reduces intensity by only (.1)(50K) = 5K,
  while removing A's 50 errors reduces it by (.9)(50K) = 45K.
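The same arithmetic in a short sketch, with the proportionality constant K factored out so values are expressed in units of K:

```python
# Failure-intensity arithmetic with the constant K factored out.

duty = {"A": 0.9, "B": 0.1}    # fraction of duty time per function
faults = {"A": 50, "B": 50}    # faults present at the start of testing

total = sum(duty[f] * faults[f] for f in duty)
print(total)                   # 50.0 -> total intensity of 50K

# Fixing all of B's faults removes only 5K of intensity; fixing A's
# removes 45K, so debugging effort pays off most in high-duty code.
print(duty["B"] * faults["B"])  # 5.0
print(duty["A"] * faults["A"])  # 45.0
```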


Maintenance Metrics: Software Maturity Index

SMI = [Mt – (Fa + Fc + Fd)] / Mt

Mt = # of modules in the current release.

Fa = # of modules added.

Fc = # of modules changed.

Fd = # of modules deleted.

• SMI approaches 1.0 as the product begins to stabilize.
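A minimal sketch with invented release counts:

```python
# Software Maturity Index for a hypothetical release.

def smi(Mt: int, Fa: int, Fc: int, Fd: int) -> float:
    """SMI = [Mt - (Fa + Fc + Fd)] / Mt."""
    return (Mt - (Fa + Fc + Fd)) / Mt

# 940 modules in the current release; 40 added, 60 changed, 12 deleted.
print(f"{smi(940, 40, 60, 12):.3f}")  # 0.881 -- closer to 1.0 means more stable
```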