SOFTWARE PROJECT MONITORING AND CONTROL
QUALITATIVE AND QUANTITATIVE DATA
Software project managers need both qualitative and quantitative data to make decisions and to control software projects, so that control can be exerted whenever there are deviations from the plan.
Control actions may include:
Extending the schedule
Adding more resources
Using superior resources
Improving software processes
Reducing scope (product requirements)
MEASURES AND SOFTWARE METRICS
Measurements enable managers to gain insight for objective
project evaluation.
If we do not measure, judgments and decision making can be
based only on our intuition and subjective evaluation.
A measure provides a quantitative indication of the extent,
amount, dimension, capacity, or size of some attribute of a
product or a process.
IEEE defines a software metric as “a quantitative measure of
the degree to which a system, component, or process
possesses a given attribute”.
MEASURES AND SOFTWARE METRICS
Engineering is a quantitative discipline, and direct measures such as
voltage, mass, velocity, or temperature are routinely taken.
But unlike other engineering disciplines, software engineering is not
grounded in the basic laws of physics.
Some members of the software community argue that software is not
measurable. There will always be qualitative assessments, but project
managers need software metrics to gain insight and control.
“Just as temperature measurement began with an index
finger...and grew to sophisticated scales, tools, and techniques, so too is
software measurement maturing”.
DIRECT AND INDIRECT MEASURES
A direct measure is obtained by applying measurement rules directly to the phenomenon of interest. For example, using specified counting rules, a software program's "Lines of Code" can be measured directly (http://sunset.usc.edu/research/CODECOUNT/).
An indirect measure is obtained by combining direct measures. For example, the number of "Function Points" is an indirect measure determined by counting a system's inputs, outputs, queries, files, and interfaces.
SOFTWARE METRICS TYPES
Product metrics, also called predictor metrics, are measures of the software product, used mainly to determine the quality of the product, such as performance.
Process metrics, also called control metrics, are measures of the software process, used mainly to determine the efficiency and effectiveness of the process, such as defects discovered during unit testing. They are used for Software Process Improvement (SPI).
Project metrics are measures of effort, cost, schedule, and risk. They are used to assess the status of a project and track risks.
RISK TRIGGERS
As the project proceeds, risk monitoring activities commence. The project manager monitors factors that may indicate whether a risk is becoming more or less likely. For example, if a risk of "staff turnover" is identified, the general attitude of team members, how they get along, interpersonal relationships, and potential problems with compensation and benefits should be monitored. The effectiveness of risk mitigation strategies should also be monitored. If mitigation plans fail, contingency plans should be executed.
SOFTWARE METRICS USAGE
Product, process, and project metrics feed a metrics repository, which supports:
Project time
Project cost
Product scope/quality
Risk management
Future project estimation
Software process improvement
PROJECT MANAGEMENT LIFE CYCLE
MONITORING AND CONTROLLING
The Monitoring and Controlling process group consists of those
processes required to track, review, and orchestrate the
progress and performance of the project.
KNOWLEDGE AREAS
SCOPE CONTROL
Monitoring project and product scope and
managing changes to the scope baseline.
Scope is controlled by using traceability
management techniques.
TRACEABILITY MANAGEMENT
• An item is traceable if we can fully figure out
– WHERE it comes from, WHY it is there
– WHAT it will be used for, HOW it will be used
• Objectives of traceability are to assess impact of proposed
changes
• For example, how does an item such as the SDD document get
affected if we change a Use Case in the SRS document?
• We have to have a traceability link between the SRS and the SDD.
TRACEABILITY MANAGEMENT
• Traceability links must be identified, recorded, and retrievable
• Bidirectional: for accessibility from ...
– source to target (forward traceability)
– target to source (backward traceability)
• Within the same phase (horizontal) or among phases (vertical)
(Figure: traceability links run from objectives, domain concepts, requirements, and assumptions, through architectural components & connectors, down to source code, test data, and the user manual — horizontal/vertical, forward/backward.)
TRACEABILITY MANAGEMENT
• Backward traceability
– Why is this here? (and recursively)
– Where does it come from? (and recursively)
• Forward traceability
– Where is this taken into account? (and recursively)
– What are the implications of this? (and recursively)
• Localize & assess the impact of changes along horizontal/vertical links
(Figure: same traceability diagram — objectives, domain concepts, requirements, and assumptions; architectural components & connectors; source code, test data, user manual.)
TRACEABILITY MATRIX
• Matrix representation of a single-relation traceability graph, e.g. a dependency graph

Traceable item | T1 | T2 | T3 | T4 | T5
T1             |  0 |  1 |  0 |  1 |  0
T2             |  0 |  0 |  1 |  0 |  1
T3             |  1 |  0 |  0 |  0 |  1
T4             |  0 |  0 |  1 |  0 |  1
T5             |  0 |  0 |  0 |  0 |  0

Across Ti's row: forward retrieval of elements depending on Ti
Down Ti's column: backward retrieval of elements which Ti depends on
Pros: forward/backward navigation; simple forms of analysis, e.g. the cycle T1→T4→T3→T1 can be detected
Cons: unmanageable and error-prone for large graphs; single relation only
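A minimal Python sketch (not from the slides) of navigating the matrix above: rows give forward retrieval, columns give backward retrieval, and a depth-first search finds the cycle the slide mentions. The item names T1..T5 and the 0/1 entries are the example's.

```python
# Traceability matrix from the slide: row Ti lists the items Ti points to.
M = {
    "T1": [0, 1, 0, 1, 0],
    "T2": [0, 0, 1, 0, 1],
    "T3": [1, 0, 0, 0, 1],
    "T4": [0, 0, 1, 0, 1],
    "T5": [0, 0, 0, 0, 0],
}
items = list(M)

def forward(i):
    """Across Ti's row: forward retrieval of elements depending on Ti."""
    return [items[j] for j, v in enumerate(M[i]) if v]

def backward(i):
    """Down Ti's column: backward retrieval of elements Ti depends on."""
    col = items.index(i)
    return [k for k in items if M[k][col]]

def has_cycle():
    """Detect a dependency cycle with a depth-first search."""
    WHITE, GREY, BLACK = 0, 1, 2
    color = {k: WHITE for k in items}

    def dfs(u):
        color[u] = GREY
        for v in forward(u):
            if color[v] == GREY or (color[v] == WHITE and dfs(v)):
                return True
        color[u] = BLACK
        return False

    return any(color[k] == WHITE and dfs(k) for k in items)
```

For this matrix, `forward("T1")` yields T2 and T4, and the search detects the cycle T1→T4→T3→T1.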
SCHEDULE AND COST CONTROL
Monitoring project activities and taking corrective
actions if there are deviations from what is
planned, such as project crashing.
BINARY TRACKING
Binary tracking requires that progress on a
work package, change requests, and problem
reports be counted as:
0% complete until the associated work
products pass their acceptance criteria
100% complete when the work products
pass their acceptance criteria
BINARY TRACKING
Assume a 20,000 LOC system (estimated), with development
metrics:
270 of 300 requirements designed: 90%
750 of 1000 modules reviewed: 75%
500 of 1000 modules through CUT: 50%
200 of 1000 modules integrated: 20%
43 of 300 requirements tested: 14%
CUT: Code and Unit Test
These numbers are obtained using binary tracking of work
packages
BINARY TRACKING
Also assume our typical distribution of effort is:
• Arch. Design: 17%
• Detailed Design: 26%
• Code & Unit Test: 35%
• Integration Test: 10%
• Acceptance Test: 12%
Percent complete is therefore:
90(.17) + 75(.26) + 50(.35) + 20(.10) + 14(.12) = 56% complete
BINARY TRACKING
The project is 56% complete; 44% remains
Effort to date is 75 staff-months
Estimated effort to complete is therefore: (44 / 56) * 75 ≈ 59 staff-months
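The weighted roll-up and effort-to-complete estimate above can be sketched in a few lines of Python; the phase weights and completion percentages are the slides' numbers.

```python
# Typical distribution of effort per phase (weights sum to 1.0).
weights = {
    "arch_design": 0.17,
    "detailed_design": 0.26,
    "code_unit_test": 0.35,
    "integration_test": 0.10,
    "acceptance_test": 0.12,
}
# Percent complete per phase, obtained from binary tracking of work packages.
done = {
    "arch_design": 90,
    "detailed_design": 75,
    "code_unit_test": 50,
    "integration_test": 20,
    "acceptance_test": 14,
}

# Weighted percent complete: about 56%.
percent_complete = sum(done[p] * w for p, w in weights.items())

# Effort-to-complete: (remaining% / complete%) * effort to date.
effort_to_date = 75  # staff-months
remaining = 100 - percent_complete
etc = (remaining / percent_complete) * effort_to_date  # about 59 staff-months
```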
EARNED VALUE MANAGEMENT
EVM compares PLANNED work to COMPLETED work to determine
whether work accomplished, cost, and schedule are progressing as
planned. It compares:
The amount of work actually completed and resources
actually consumed at a certain point in a project
TO
The amount of work planned (budgeted) to be
completed and resources planned to be consumed at
that same point in the project
EARNED VALUE MANAGEMENT
Budgeted Cost of Work Scheduled (BCWS): The cost
of the work scheduled or planned to be completed in a
certain time period per the plan. This is also called the
PLANNED VALUE.
Budgeted Cost of Work Performed (BCWP): The
budgeted cost of the work done up to a defined point in
the project. This is called the EARNED VALUE.
Actual Cost of Work Performed (ACWP): The actual
cost of work up to a defined point in the project.
EARNED VALUE MANAGEMENT
Schedule Variance:
SV = BCWP – BCWS
Schedule Performance Index:
SPI = BCWP / BCWS
Cost Variance:
CV = BCWP - ACWP
Cost Performance Index:
CPI = BCWP / ACWP
EARNED VALUE MANAGEMENT
SV = 0 and CV = 0: project on schedule and on budget
SV < 0: behind schedule; CV < 0: over budget
SV > 0: ahead of schedule; CV > 0: under budget

SPI = 1 and CPI = 1: project on schedule and on budget
SPI < 1: behind schedule; CPI < 1: over budget
SPI > 1: ahead of schedule; CPI > 1: under budget
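The four indicators above are one-line arithmetic; a minimal sketch as Python helpers (function names are my own, not from the slides):

```python
def schedule_variance(bcwp, bcws):
    """SV = BCWP - BCWS; negative means behind schedule."""
    return bcwp - bcws

def cost_variance(bcwp, acwp):
    """CV = BCWP - ACWP; negative means over budget."""
    return bcwp - acwp

def spi(bcwp, bcws):
    """SPI = BCWP / BCWS; less than 1 means behind schedule."""
    return bcwp / bcws

def cpi(bcwp, acwp):
    """CPI = BCWP / ACWP; less than 1 means over budget."""
    return bcwp / acwp
```

With BCWP = $2400, BCWS = $3000, ACWP = $4000, these give SV = -$600, CV = -$1600, SPI = 0.8, CPI = 0.6.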
EARNED VALUE MANAGEMENT
Project description:
We are supposed to build 10 units of equipment
We are supposed to complete the project within 6 weeks
We estimated 600 man-hours to complete the 10 units
It costs us $10/hour to build the equipment

Our plan:
We are supposed to build 1.67 units each week
Each unit costs $600
We will spend $1,000 each week
EARNED VALUE MANAGEMENT
Project status at Week 3:
4 units of equipment completed
400 man-hours spent

How are we doing? Are we ahead of or behind schedule? Are we under or over budget?

Results:
Accomplished work: 4/10 = 40% complete
Schedule: 3/6 = 50% elapsed
Budget: 400/600 = 67% spent
EARNED VALUE MANAGEMENT
BCWS=(600 man-hours*$10/hour)*(3/6 weeks) = $3000
BCWP=(600 man-hours*$10/hour)*(4/10 units) = $2400
ACWP=400 man-hours*$10/hour = $4000
The price of the job that we have done is only $2400 (4
units)
Schedule: in 3 weeks, the price of the job that we should
have done was $3000
Cost: We spent much more; we spent $4000
EARNED VALUE MANAGEMENT
SV = BCWP – BCWS = $2400 - $3000 = -$600
SV is negative; we are behind schedule
CV = BCWP – ACWP = $2400 - $4000 = -$1600
CV is negative; we are over budget
SPI = BCWP / BCWS = $2400 / $3000 = 0.8
SPI is less than 1; we are behind schedule
CPI = BCWP / ACWP = $2400 / $4000 = 0.6
CPI is less than 1; we are over budget
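The whole worked example above can be reproduced from the project parameters in a few lines of Python:

```python
# Budget at completion: 600 man-hours at $10/hour.
bac = 600 * 10                        # $6000

# Status at week 3: 4 of 10 units built, 400 man-hours spent.
bcws = bac * (3 / 6)                  # planned value at week 3: $3000
bcwp = bac * (4 / 10)                 # earned value for 4 of 10 units: $2400
acwp = 400 * 10                       # actual cost: $4000

sv = bcwp - bcws                      # -$600: behind schedule
cv = bcwp - acwp                      # -$1600: over budget
spi = bcwp / bcws                     # 0.8
cpi = bcwp / acwp                     # 0.6
```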
EARNED VALUE MANAGEMENT
Earned Value analysis results are used to predict the future
performance of the project
Budget At Completion (BAC) = The total budget (PV or
BCWS) at the end of the project. If a project has Management
Reserve (MR), it is typically added to the BAC.
Amount expended to date (AC)
Estimated cost To Complete (ETC)
ETC = (BAC – EV) / CPI
Estimated cost At Completion (EAC)
EAC = ETC + AC
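A quick forecasting sketch using these formulas, fed with the earlier example's numbers (BAC = $6000, EV = $2400, CPI = 0.6, AC = $4000):

```python
bac = 6000          # budget at completion
ev = 2400           # earned value (BCWP)
cpi = 0.6           # cost performance index
ac = 4000           # actual cost to date (ACWP)

etc = (bac - ev) / cpi   # estimated cost to complete: $6000
eac = etc + ac           # estimated cost at completion: $10000
```

At CPI = 0.6, the $6000 project is forecast to cost $10,000 in total.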
RISK CONTROL
Implementing risk response plans, tracking
identified risks, identifying new risks, and
evaluating risk process effectiveness.
RISK CONTROL
RISK EXPOSURE
Risk exposure is the product of
PROBABILITY x POTENTIAL LOSS
A project with a 30% probability of late delivery and a penalty of $100,000 for late delivery has a risk exposure of:
0.3 x 100,000 = $30,000
RISK LEVERAGE FACTOR
RLF = (REb - REa) / RMc
where REb is the risk exposure before risk mitigation,
REa is the risk exposure after risk mitigation, and
RMc is the cost of the risk-mitigating actions
RISK LEVERAGE FACTOR
Suppose we are considering spending $25,000 to reduce the probability of a risk factor with a potential impact of $500,000 from 0.4 to 0.1. The RLF is then:
(200,000 - 50,000) / 25,000 = 6.0
Larger RLFs indicate better investment strategies. RLFs can be used to prioritize risks and determine mitigation strategies.
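The exposure and leverage computations above can be sketched directly (function names are my own):

```python
def risk_exposure(probability, potential_loss):
    """Risk exposure = probability x potential loss."""
    return probability * potential_loss

def risk_leverage_factor(re_before, re_after, mitigation_cost):
    """RLF = (REb - REa) / RMc."""
    return (re_before - re_after) / mitigation_cost

# The slide's example: a $500,000 impact, probability cut from 0.4 to 0.1
# by spending $25,000 on mitigation.
re_b = risk_exposure(0.4, 500_000)   # $200,000 before mitigation
re_a = risk_exposure(0.1, 500_000)   # $50,000 after mitigation
rlf = risk_leverage_factor(re_b, re_a, 25_000)   # 6.0
```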
RISK REGISTER
A risk register contains the following information for each
identified risk factor:
Risk factor identifier
Revision number & revision date
Responsible party
Risk category (schedule, resources, cost, technical, other)
Description
Status (Closed, Action, Monitor)
RISK REGISTER
If closed: date of closure and disposition (disposition: avoided, transferred, removed from watch list, immediate action or contingent action completed, crisis managed)
If active: action plan number or contingency plan number & status of the action (status: on plan; or deviating from plan, and risk factors for completing the plan)
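As a sketch only, the register's fields map naturally onto a record type; the field names below follow the slides' list but are a hypothetical schema, not a standard one.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RiskRegisterEntry:
    """One risk factor's row in the register (hypothetical schema)."""
    risk_id: str
    revision: int
    revision_date: str
    responsible_party: str
    category: str                        # schedule, resources, cost, technical, other
    description: str
    status: str                          # Closed, Action, Monitor
    disposition: Optional[str] = None    # set only when status == "Closed"
    plan_number: Optional[str] = None    # action/contingency plan when active

# Illustrative entry (all values invented for the example).
r = RiskRegisterEntry("R-01", 1, "2024-01-15", "PM", "schedule",
                      "Key staff turnover", "Monitor")
```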
QUALITY CONTROL
Monitoring and controlling project and product
quality.
REVIEWS AND INSPECTIONS
The review process in agile software development is usually informal. In Scrum, for example, there is a review meeting after each iteration of the software has been completed (a sprint review), where quality issues and problems may be discussed.
In extreme programming, pair programming ensures that code is constantly being examined and reviewed by another team member.
XP relies on individuals taking the initiative to improve and refactor code. Agile approaches are not usually standards-driven, so issues of standards compliance are not usually considered.
REVIEWS AND INSPECTIONS
Inspections are peer reviews in which engineers examine the source code of a system with the aim of discovering anomalies and defects.
Inspections do not require execution of a system, so they may be used before implementation.
They may be applied to any representation of the system (requirements, design, configuration data, test data, etc.).
They have been shown to be an effective technique for discovering program errors.
REVIEWS AND INSPECTIONS
Agile processes rarely use formal inspection or peer review processes.
Rather, they rely on team members cooperating to check each other's code, and on informal guidelines, such as 'check before check-in', which suggest that programmers should check their own code.
Extreme programming practitioners argue that pair programming is an effective substitute for inspection because it is, in effect, a continual inspection process: two people look at every line of code and check it before it is accepted.
SOFTWARE PROCESS IMPROVEMENT
SPI encompasses a set of activities that lead to a
better software process and, as a consequence,
higher-quality software delivered in a more timely
manner.
SPI helps software engineering companies find their
process inefficiencies and improve them.
SOFTWARE PROCESS IMPROVEMENT
SOFTWARE PROCESS IMPROVEMENT
Process measurement
Attributes of the current process are measured. These
are a baseline for assessing improvements.
Process analysis
The current process is assessed, and bottlenecks and weaknesses are identified.
Process change
Changes to the process that have been identified during the analysis are introduced. For example, a process change can be better UML tools, improved communications, a changed order of activities, etc.
SOFTWARE PROCESS IMPROVEMENT
SOFTWARE PROCESS IMPROVEMENT
There are 2 different approaches to SPI:
Process Maturity: It is for “Plan-Driven” development and
focuses on improving process and project management.
Agile: Focuses on iterative development and reduction
of overheads.
SPI frameworks are intended as a means to assess the
extent to which an organization’s processes follow best
practices and help to identify areas of weakness for
process improvement.
CMMI PROCESS IMPROVEMENT FRAMEWORK
There are several process maturity models:
SPICE (ISO/IEC 15504)
Bootstrap
Personal Software Process (PSP)
Team Software Process (TSP)
TickIT
SEI CMMI
CMMI PROCESS IMPROVEMENT FRAMEWORK
Capability Maturity Model Integrated (CMMI) framework is the current stage of work on process assessment and improvement that started at the Software Engineering Institute (SEI) in the 1980s.
The SEI’s mission is to promote software technology transfer particularly to US defense contractors.
It has had a profound influence on process improvement:
The Capability Maturity Model was introduced in the early 1990s
A revised maturity framework (CMMI) was introduced in 2001
CMMI PROCESS IMPROVEMENT FRAMEWORK
CMMI allows a software company’s development and
management processes to be assessed and assigned a
score.
There are 4 process groups which include 22 process
areas.
These process areas are relevant to software process
capability and improvement.
CMMI PROCESS IMPROVEMENT FRAMEWORK
Category: Process management
Organizational process definition (OPD)
Organizational process focus (OPF)
Organizational training (OT)
Organizational process performance (OPP)
Organizational innovation and deployment (OID)

Category: Project management
Project planning (PP)
Project monitoring and control (PMC)
Supplier agreement management (SAM)
Integrated project management (IPM)
Risk management (RSKM)
Quantitative project management (QPM)
CMMI PROCESS IMPROVEMENT FRAMEWORK
Category: Engineering
Requirements management (REQM)
Requirements development (RD)
Technical solution (TS)
Product integration (PI)
Verification (VER)
Validation (VAL)

Category: Support
Configuration management (CM)
Process and product quality assurance (PPQA)
Measurement and analysis (MA)
Decision analysis and resolution (DAR)
Causal analysis and resolution (CAR)
CMMI MODELS There are 2 different CMMI models:
Staged CMMI: Assesses a software company’s maturity from 1 to 5.
Process improvement is achieved by implementing practices at each
level and moving from lower levels to higher levels. Used to assess a
company as a whole.
Continuous CMMI: Assesses each process area separately, and
assigns a capability assessment score from 0 to 5. Normally
companies operate at different maturity levels for different process
areas. A company may be at level 5 for Configuration Management
process, but at level 2 for Risk Management process.
STAGED CMMI
STAGED CMMI
Initial: Essentially uncontrolled
Managed: Product management procedures defined and used
Defined: Process management procedures and strategies
defined and used
Quantitatively Managed: Quality management strategies
defined and used
Optimizing: Process improvement strategies defined and used
STAGED CMMI (LEVELS 1-2)
Performed (Level 1): no specific process areas required
Managed (Level 2): focus on basic project management
Requirements management
Project planning
Project monitoring and control
Supplier agreement management
Measurement and analysis
Process and product quality assurance
Configuration management
STAGED CMMI (LEVEL 3)
Defined (Level 3): focus on process standardization
Requirements development
Technical solution
Product integration
Verification
Validation
Organizational process focus
Organizational process definition
Organizational training
Integrated project management
Integrated supplier management
Risk management
Decision analysis and resolution
Organizational environment for integration
Integrated teaming
STAGED CMMI (LEVELS 4-5)
Quantitatively Managed (Level 4): focus on quantitative management
Organizational process performance
Quantitative project management
Optimizing (Level 5): focus on continuous process improvement
Organizational innovation and deployment
Causal analysis and resolution
CONTINUOUS CMMI
CONTINUOUS CMMI
The maturity assessment is not a single value but a
set of values showing the organization's maturity in each
area.
It examines the processes used in an organization and
assesses their maturity in each process area.
The advantage of the continuous approach is that
organizations can pick and choose process areas to
improve according to their local needs.
PEOPLE CMM
Used to improve the workforce.
Defines a set of 5 organizational maturity levels that
provide an indication of the sophistication of workforce
practices and processes.
People CMM complements any SPI framework.
PEOPLE CMM
SOFTWARE PRODUCT METRICS
Product metrics fall into two classes:
Dynamic metrics, such as response time, availability, or
reliability, are collected by measurements made of a program
in execution. These metrics can be collected during testing or
after the system has gone into use.
Static metrics are collected by measurements made of
representations of the system, such as the design, program, or
documentation.
SOFTWARE PRODUCT METRICS
Unfortunately, direct product measurements cannot be made
of some quality attributes, such as understandability and
maintainability.
Therefore, you have to assume that there is a relationship
between an internal attribute and the quality attribute.
Model formulation involves identifying the functional form
(linear or exponential) by analysis of collected data, identifying
the parameters to be included in the model, and calibrating
those parameters.
SOFTWARE QUALITY ATTRIBUTES
The subjective quality of a software system is largely based on its non-functional system attributes:
Safety, Understandability, Portability
Security, Testability, Usability
Reliability, Adaptability, Reusability
Resilience, Modularity, Efficiency
Robustness, Complexity, Learnability
SOFTWARE QUALITY ATTRIBUTES
(Figure: quality-factor triangle grouping the attributes)
Product revision: maintainability, flexibility, testability
Product transition: portability, reusability, interoperability
Product operation: correctness, reliability, efficiency, integrity, usability
SOFTWARE QUALİTY ATTRİBUTES
REQUIREMENTS MODEL METRICS
Technical work in software engineering begins with the
creation of the requirements model.
These metrics examine the requirements model with the
intent of predicting the size of the resultant system.
Size is an indicator of increased coding, integration,
and testing effort.
Function-Based Metrics: The Function Point (FP) metric is used to measure the functionality delivered by a software system.
Specification Metrics: Quality attributes of the analysis model and its specification, such as ambiguity, completeness, correctness, understandability, verifiability, consistency, and traceability.
Although many of these characteristics are qualitative, there are quantitative specification metrics.
REQUIREMENTS MODEL METRICS
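The classic FP calculation can be sketched as follows. The domain weights are the standard "average" complexity weights from the FP model; the counts and the fourteen value-adjustment ratings are hypothetical example data.

```python
# Sketch of a Function Point calculation using the standard formula
# FP = count_total * (0.65 + 0.01 * sum(Fi)).
# Weights are the usual "average" complexity weights; counts and the
# 14 value-adjustment ratings below are hypothetical example data.

AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_files": 10,
    "external_interfaces": 7,
}

def function_points(counts, value_adjustment_factors):
    """counts: domain -> number of instances; factors: 14 ratings, each 0..5."""
    count_total = sum(counts[d] * w for d, w in AVERAGE_WEIGHTS.items())
    return count_total * (0.65 + 0.01 * sum(value_adjustment_factors))

counts = {"external_inputs": 3, "external_outputs": 2,
          "external_inquiries": 2, "internal_files": 1,
          "external_interfaces": 4}
fi = [3] * 14  # all 14 adjustment questions rated "average"
print(round(function_points(counts, fi), 2))
```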
![Page 69: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/69.jpg)
Number of total requirements: Nr = Nf + Nnf, where
Nr = number of total requirements
Nf = number of functional requirements
Nnf = number of non-functional requirements
Ambiguity: Q = Nui / Nr, where
Nui = number of requirements for which all reviewers had identical interpretations
The closer the value of Q is to 1, the lower the ambiguity of the specification.
REQUIREMENTS MODEL METRICS
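The ambiguity metric above can be computed directly from reviewer interpretations; the requirement IDs and reviewer readings below are hypothetical example data.

```python
# Minimal sketch of the ambiguity metric Q = Nui / Nr from the slide:
# Nui = number of requirements for which all reviewers gave identical
# interpretations. Reviewer readings are hypothetical example data.

def ambiguity_q(interpretations):
    """interpretations: requirement id -> list of each reviewer's reading."""
    nr = len(interpretations)
    nui = sum(1 for readings in interpretations.values()
              if len(set(readings)) == 1)
    return nui / nr

reqs = {
    "R1": ["login via email", "login via email", "login via email"],
    "R2": ["export to PDF", "export to PDF", "print report"],  # disagreement
    "R3": ["audit log kept", "audit log kept", "audit log kept"],
}
print(ambiguity_q(reqs))  # 2 of 3 requirements read identically -> 2/3
```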
![Page 70: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/70.jpg)
The adequacy of use cases can be measured using an ordered triple (Low, Medium, High) to indicate:
Level of granularity (excessive detail) specified
Level of detail in the primary and secondary scenarios
Sufficiency of the number of secondary scenarios in specifying alternatives to the primary scenario
The semantics of analysis UML models such as sequence, state, and class diagrams can also be measured.
REQUIREMENTS MODEL METRICS
![Page 71: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/71.jpg)
OO DESIGN MODEL METRICS
Coupling: Physical connections between elements of
OO design such as number of collaborations between
classes or the number of messages passed between
objects.
Cohesion: Cohesiveness of a class is determined by
examining the degree to which the set of properties it
possesses is part of the problem or design domain.
![Page 72: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/72.jpg)
CLASS-ORIENTED METRICS – THE CK METRICS SUITE
Depth of the inheritance tree (DIT) is the maximum length from a node to the root of the tree. As DIT grows, it is likely that lower-level classes will inherit many methods. This leads to potential difficulties when attempting to predict the behavior of a class, but large DIT values also imply that many methods may be reused.
Coupling between object classes (CBO). The CRC model may be used to determine the value for CBO: it is the number of collaborations listed for a class on its CRC index card. As CBO increases, it is likely that the reusability of the class will decrease.
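DIT can be computed mechanically from a class hierarchy. A minimal sketch for Python classes, walking `__bases__` up to the root (the toy `Component`/`Widget`/`Button` hierarchy is a hypothetical example, and we count `object` as the root at depth 0):

```python
# Sketch of the DIT (depth of inheritance tree) metric for Python classes,
# walking __bases__ up to object. The class hierarchy is a toy example.

def dit(cls):
    """Maximum length from the class to the root of its inheritance tree."""
    if cls is object:
        return 0
    return 1 + max(dit(base) for base in cls.__bases__)

class Component: pass
class Widget(Component): pass
class Button(Widget): pass

print(dit(Component), dit(Widget), dit(Button))  # 1 2 3
```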
![Page 73: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/73.jpg)
USER INTERFACE DESIGN METRICS
Layout Appropriateness (LA) measures the user's movements from one layout entity to another, where layout entities include icons, text, menus, and windows.
Web page metrics include the number of words, links, graphics, colors, and fonts contained within a Web page.
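One common way to operationalize LA is to score a layout by the frequency-weighted cost of transitions between layout entities and compare the proposed layout against an optimal one. The sketch below assumes that formulation; the transition frequencies and costs are hypothetical example data.

```python
# Hedged sketch of Layout Appropriateness (LA): a layout's cost is the
# frequency-weighted sum of transition costs between layout entities,
# and LA compares an optimal layout against the proposed one (as a
# percentage). Transition data below is hypothetical example data.

def layout_cost(transitions):
    """transitions: list of (frequency, cost_of_transition) pairs."""
    return sum(freq * cost for freq, cost in transitions)

proposed = [(120, 3.0), (45, 5.0), (10, 2.0)]   # e.g. menu -> icon moves
optimal  = [(120, 2.0), (45, 4.0), (10, 2.0)]

LA = 100 * layout_cost(optimal) / layout_cost(proposed)
print(round(LA, 1))  # closer to 100 means closer to the optimal layout
```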
![Page 74: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/74.jpg)
USER INTERFACE DESIGN METRICS
Does the user interface promote usability?
Are the aesthetics of the WebApp appropriate for the application
domain and pleasing to the user?
Is the content designed in a manner that imparts the most
information with the least effort?
Is navigation efficient and straightforward?
Has the WebApp architecture been designed to accommodate the
special goals and objectives of WebApp users, the structure of content
and functionality, and the flow of navigation required to use the system
effectively?
Are components designed in a manner that reduces procedural
complexity and enhances the correctness, reliability and performance?
![Page 75: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/75.jpg)
SOURCE CODE METRICS
Length of identifiers is the average length of identifiers (names for variables, methods, etc.).
The longer the identifiers, the more likely they are to be meaningful, and thus more maintainable.
Depth of conditional nesting measures the nesting of if-statements.
Deeply nested if-statements are hard to understand and potentially error-prone.
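Depth of conditional nesting is straightforward to measure with Python's standard `ast` module; the snippet being measured is a hypothetical example.

```python
# Sketch: measuring depth of conditional nesting with the ast module.
import ast

def max_if_nesting(source):
    """Return the deepest chain of nested if-statements in the source."""
    tree = ast.parse(source)
    def depth(node, current=0):
        best = current
        for child in ast.iter_child_nodes(node):
            inc = 1 if isinstance(child, ast.If) else 0
            best = max(best, depth(child, current + inc))
        return best
    return depth(tree)

code = """
if a:
    if b:
        if c:
            pass
"""
print(max_if_nesting(code))  # 3
```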
![Page 76: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/76.jpg)
SOURCE CODE METRICS
Cyclomatic Complexity is a measure of the number of independent paths through the code (e.g., through if-else branches) and measures structural complexity.
![Page 77: SOFTWARE PROJECT MONITORING AND CONTROL](https://reader036.vdocuments.mx/reader036/viewer/2022062323/56816356550346895dd4022d/html5/thumbnails/77.jpg)
TESTING METRICS
The majority of metrics focus on the process of testing, NOT the actual technical characteristics of the tests themselves.
Testers rely on analysis, design, and code metrics to guide them in the design and execution of test cases.
For example, Cyclomatic Complexity, a design metric, lies at the core of basis path testing.
Each if-else statement (normal and alternate path) must be tested.
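As a worked example of basis path testing, a function with V(G) = 3 (an if/elif chain with two decisions) needs one test case per independent path; the function and inputs are hypothetical.

```python
# Worked example: basis path testing derives one test per independent path.
# classify() has V(G) = 3 (two decisions), so three test cases suffice
# to exercise every basis path. Function and inputs are hypothetical.

def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"

# One test per basis path:
assert classify(-5) == "negative"   # path through the first branch
assert classify(0) == "zero"        # path through the second branch
assert classify(7) == "positive"    # fall-through path
```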