A Comparison of Release Readiness Approaches


Release Readiness Measurement

A Comparison of Best Practices
8th Malaysian Software Engineering Conference (MySEC 2014), Resort World Langkawi, Malaysia
24-09-2014
Nico Koprowski, M. Firdaus Harun, Horst Lichter (Universiti Teknologi Malaysia)

Agenda:
- Introduction: Release Readiness
- Measurement for Release Readiness: Defect Tracking, Software Readiness Index, ShipIt
- Discussion
- Comparison
- Summary

Introduction

When should the software product be released? Software readiness has to balance schedule, cost and quality:
- A software company aggressively looks for ways to deliver a finished software product at the right time, considering schedule and cost.
- Delivering very high-quality software may take a long time, overrun the schedule, or increase the cost of hiring developers.
- Making sure the software is delivered on time may mean dropping important features, which leads to low-quality software.

Release Readiness

Is the software product ready to be released? This property is called release readiness.
[Graph: degree of completion over time, up to 100% finished; when is the time of release?]
- When releasing a software product we usually cannot complete all the features promised to the customer, and at the same time we have to respect the delivery date.
- Therefore, a number of approaches exist that help the release manager or lead developer decide when to release the software to the customer, considering quality and cost.

Defect Tracking

Q: When is a software product ready to be released?
Defect Tracking: When the number of remaining defects is sufficiently low!
Reliability: the probability of executing a software system without failure for a specified time period.

Defect Density (DD)

Version   LOC    Defects   DD (defects/KLOC)
V1.0      100K   700       7
V2.0      50K    475       9.5
V3.0      100K   600       6
Present   140K   ?         ?

DD is the number of defects per KLOC. The defect densities of the historical releases (V1.0 to V3.0) are used to set a target for the present release (140 KLOC), whose defect count is still unknown.

Point: the more historical data you have, the more confident you can be in your pre-release defect density targets.
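To make the defect density idea concrete, the following sketch (not part of the slides) derives a DD target from the historical releases in the table above and projects the expected defect count for the upcoming 140 KLOC release; using the average historical DD as the target is an assumption.

    # Minimal sketch of a defect-density projection. The release data mirrors
    # the table above; averaging the historical DD values into a target is an
    # assumption, not something stated on the slides.
    historical_releases = [
        {"version": "V1.0", "kloc": 100, "defects": 700},
        {"version": "V2.0", "kloc": 50,  "defects": 475},
        {"version": "V3.0", "kloc": 100, "defects": 600},
    ]

    def defect_density(release):
        """Defects per KLOC."""
        return release["defects"] / release["kloc"]

    # Historical DD values: 7.0, 9.5 and 6.0 defects/KLOC.
    dd_history = [defect_density(r) for r in historical_releases]
    dd_target = sum(dd_history) / len(dd_history)   # simple average as the target

    present_kloc = 140
    expected_defects = dd_target * present_kloc

    print(f"DD target: {dd_target:.2f} defects/KLOC")
    print(f"Expected defects for the 140 KLOC release: {expected_defects:.0f}")

A real project would typically weight recent releases more heavily or pick a stricter percentile; the point from the slide stands either way: more history means more confidence in the target.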

Defect Pooling

Two teams, Team A and Team B, test the same software product independently and their findings are pooled: some defects are found by both teams (common findings), others by only one team (unique findings). The more unique findings there are, the more defects are likely to remain.
Points:
- The two pools are distinct but arbitrary.
- The teams operate independently.
- Both teams test the full scope of the product.
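The slides state only the qualitative rule; a commonly used way to turn pooled findings into a number is the capture-recapture (Lincoln-Petersen) estimator, sketched below with invented figures. Treat both the formula and the numbers as assumptions.

    # Sketch of a defect-pooling estimate using the capture-recapture
    # (Lincoln-Petersen) formula: total ~= (found_a * found_b) / found_common.
    def pooled_defect_estimate(found_a, found_b, found_common):
        """Estimate total and remaining defects from two independent test pools."""
        if found_common == 0:
            raise ValueError("No common findings: the estimate is unbounded.")
        total = (found_a * found_b) / found_common
        found_so_far = found_a + found_b - found_common   # distinct defects found
        remaining = total - found_so_far
        return total, remaining

    # Example: Team A finds 40 defects, Team B finds 35, and 20 are common.
    total, remaining = pooled_defect_estimate(found_a=40, found_b=35, found_common=20)
    print(f"Estimated total defects: {total:.0f}, estimated remaining: {remaining:.0f}")

Many unique findings relative to common findings drive the estimate up, which matches the rule stated above.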

Defect Seeding

One team (Team A) seeds a known number of artificial defects into the software product; the other team (Team B) tests it and detects defects without knowing which ones were seeded.
Idea: the rate of remaining real defects is assumed to be proportional to the rate at which the seeded defects were detected, so the share of seeded defects found indicates how many real defects are still hidden.
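A minimal sketch of the resulting estimate, under the proportionality assumption stated above; the seeded and found counts are invented for illustration.

    # Sketch of a defect-seeding estimate: assume real defects are detected at
    # the same rate as seeded defects. All numbers are illustrative.
    def seeded_defect_estimate(seeded_total, seeded_found, real_found):
        """Estimate total and remaining real defects from seeded-defect detection."""
        if seeded_found == 0:
            raise ValueError("No seeded defects found yet: detection rate unknown.")
        detection_rate = seeded_found / seeded_total        # e.g. 20 of 25 -> 0.8
        estimated_real_total = real_found / detection_rate  # same rate assumed for real defects
        remaining = estimated_real_total - real_found
        return estimated_real_total, remaining

    total, remaining = seeded_defect_estimate(seeded_total=25, seeded_found=20, real_found=120)
    print(f"Estimated real defects: {total:.0f}, estimated remaining: {remaining:.0f}")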

Architectural Defect Tracking (ADT)

ADT assumes an N-tier architecture (presentation tier, business tier, data access tier) and estimates and predicts the defects in each layer using a neural-network prediction model.
Parameters per layer:
- User interfaces: # UIs, # UI messages
- Classes: # parents, # children, depth of inheritance, coupling
- SQL: # selects, # inserts/updates, # deletes, # sub-queries
Two-layered neural-network predictive models predict the defect class in each tier:
- A Kohonen network yields Ready / Not Ready based on the defect class.
- A general regression neural network relates the LOC changed to the time required to change.
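The slides name specific model types (a Kohonen network and a general regression neural network) that are not reproduced here; as a rough stand-in for the per-tier prediction step, the sketch below trains a small feed-forward classifier on hypothetical per-tier metrics and maps the predicted defect class to Ready / Not Ready. All metric values and labels are invented.

    # Rough stand-in for ADT's per-tier defect prediction. This is NOT the
    # Kohonen / general-regression network setup from the slides; it only
    # illustrates the shape of the approach: per-tier metrics in, defect class out.
    from sklearn.neural_network import MLPClassifier

    # Hypothetical features per tier: [#UIs, #classes, depth of inheritance,
    # coupling, #SQL statements]
    X_train = [
        [12, 40, 3, 15, 0],    # presentation tier, past release
        [0, 80, 5, 30, 10],    # business tier, past release
        [0, 25, 2, 10, 60],    # data access tier, past release
        [20, 55, 4, 22, 0],
        [0, 95, 6, 35, 14],
        [0, 30, 2, 12, 75],
    ]
    y_train = [0, 1, 0, 1, 1, 0]   # observed defect class: 0 = low, 1 = high

    model = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
    model.fit(X_train, y_train)

    # One metric vector per tier of the current release.
    current_tiers = {
        "presentation": [15, 48, 3, 18, 0],
        "business": [0, 88, 5, 33, 12],
        "data access": [0, 28, 2, 11, 70],
    }
    for tier, metrics in current_tiers.items():
        defect_class = model.predict([metrics])[0]
        status = "Not Ready" if defect_class == 1 else "Ready"
        print(f"{tier}: predicted defect class {defect_class} -> {status}")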

Software Readiness Index (SRI)

Q: When is a software product ready to be released?
SRI: When it obtains the desired amount of quality!
Quality is broken down into reliability, functionality, efficiency and usability.

SRI Criteria

Taken from: A. Asthana and J. Oliveri, "Quantifying software reliability and readiness".
Points:
- SRI contains five vectors, each consisting of different variables.
- Each vector is computed as a weighted sum of its constituent variables; the magnitude of each vector is normalized to 1.0.
- The weights can be chosen according to the goal, e.g. assessing release readiness or measuring the risk in a software development project.
- The computed vector values (combined with an additive, multiplicative or hybrid model) indicate the decision: Ready to Go, Go with Condition, or No. The decision is based on a threshold value chosen for each vector (see Thresholds below).
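A minimal sketch of that computation: a weighted sum per vector normalized to the range 0..1, an additive combination, and a threshold-based decision. The vector names, variables, weights, sample values and threshold levels are all assumptions for illustration.

    # SRI-style sketch: weighted sums per vector, normalized to 1.0, combined
    # additively, then mapped to a decision. All names and numbers are assumed.
    sri_vectors = {
        # vector: {variable: (value in [0, 1], weight)}
        "reliability":   {"mtbf_score": (0.80, 0.6), "defect_trend": (0.70, 0.4)},
        "functionality": {"features_done": (0.90, 0.7), "acceptance_pass": (0.85, 0.3)},
        "efficiency":    {"response_time": (0.75, 1.0)},
        "usability":     {"ux_review": (0.60, 0.5), "support_calls": (0.70, 0.5)},
        "documentation": {"user_guide": (0.90, 1.0)},
    }

    def vector_score(variables):
        """Weighted sum of the variables, normalized so the maximum is 1.0."""
        weighted = sum(value * weight for value, weight in variables.values())
        total_weight = sum(weight for _, weight in variables.values())
        return weighted / total_weight

    def decision(score, green=0.8, yellow=0.6):
        """Map a score to the green/yellow/red threshold scheme."""
        if score >= green:
            return "Ready to Go"
        if score >= yellow:
            return "Go with Condition"
        return "No"

    scores = {name: vector_score(vs) for name, vs in sri_vectors.items()}
    overall = sum(scores.values()) / len(scores)     # additive combination

    for name, score in scores.items():
        print(f"{name:14s} {score:.2f} -> {decision(score)}")
    print(f"{'overall':14s} {overall:.2f} -> {decision(overall)}")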

Thresholds

For each criterion, the measurement (0% to 100%) is mapped to a colour:
- Green: good
- Yellow: ok, sufficient
- Red: bad, not tolerable

ShipIt

Q: When is a software product ready to be released?
ShipIt: When the overall development progress is sufficiently advanced!
Progress is tracked over time for activities such as requirements, design and testing.
Progress: the ratio of the already spent effort to the overall planned effort.

ShipIt Criteria

Seven components (Requirements, Coding, Testing, Quality, Documentation, Supervision, Support), each with sub-components such as: requirements gathered, analysed and designed; modules and objects coded; build times; test coverage; open issues; zero-failure test hours; COCOMO estimates; requirements, design and code documentation; test plan and user guide; installation and training; beta-test bugs.
Points:
- Seven components, each with its sub-components.
- The criteria follow the waterfall model.
- The values of all components are accumulated.
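A minimal sketch of the progress accumulation: per component, progress is the ratio of spent to planned effort, and the components are accumulated into an overall figure. The component names mirror the slide; the effort numbers, the equal weighting and the 90% cut-off are assumptions.

    # ShipIt-style progress sketch. Effort numbers and the release cut-off are
    # invented; the component names come from the slide.
    components = {
        # component: (effort spent, effort planned), e.g. in person-days
        "Requirements":  (40, 40),
        "Coding":        (150, 160),
        "Testing":       (60, 90),
        "Quality":       (20, 30),
        "Documentation": (15, 25),
        "Supervision":   (10, 12),
        "Support":       (5, 20),
    }

    def progress(spent, planned):
        """Progress = ratio of already spent effort to overall planned effort."""
        return min(spent / planned, 1.0)

    per_component = {name: progress(*efforts) for name, efforts in components.items()}
    overall = sum(per_component.values()) / len(per_component)   # simple accumulation

    for name, value in per_component.items():
        print(f"{name:14s} {value:6.0%}")
    print(f"{'Overall':14s} {overall:6.0%} -> {'ship it' if overall >= 0.9 else 'not yet'}")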

Discussion

- Defect Tracking covers only reliability and is only applicable while testing.
- SRI covers quality, including reliability.
- ShipIt covers the whole progress, including quality and reliability, but is only suitable for the waterfall model.

Comparison

The approaches are compared along five criteria: availability, universality, concreteness, simplicity and scope.
- Defect Tracking: the fewest release criteria; mainly used in the final steps; ADT only where an N-tier architecture applies.
- SRI: universally applicable; the most comprehensive and most concrete approach, thanks to its thresholds.
- ShipIt: the broadest scope (the measurement covers reliability, quality and progress); good for communicating progress, but less concrete.

Summary

Question: Is the software product ready to be released?
Suggestion: Quantify release-readiness properties via metrics (applicable to all software designs) and project progress (applicable to any kind of software development), i.e. a holistic approach.
- Reliability measurement with Defect Tracking: the fewest criteria
- Quality measurement with SRI: the most comprehensive approach
- Progress measurement with ShipIt: the most complex approach

The End

Thanks for your attention!

