


Identifying high perceived value practices of CMMI level 2: An empirical study

Mahmood Niazi a,*, Muhammad Ali Babar b

a School of Computing and Mathematics, Keele University, Staffordshire ST5 5BG, UK
b Lero, University of Limerick, Ireland

Article info

Article history:
Received 17 November 2008
Received in revised form 22 January 2009
Accepted 9 March 2009
Available online xxxx

Keywords:
CMMI
Empirical study
Perceived value

doi:10.1016/j.infsof.2009.03.001

* Corresponding author.
E-mail addresses: [email protected] (M. Niazi), [email protected] (M.A. Babar).

Abstract

Objective: In this paper, we present findings from an empirical study that was aimed at identifying the relative "perceived value" of CMMI level 2 specific practices based on the perceptions and experiences of practitioners of small and medium size companies. The objective of this study is to identify the extent to which a particular CMMI practice is used in order to develop a finer-grained framework, which encompasses the notion of perceived value within specific practices.
Method: We used face-to-face questionnaire based survey sessions as the main approach to collecting data from 46 software development practitioners from Malaysia and Vietnam. We asked practitioners to choose and rank CMMI level 2 practices against the five types of assessments (high, medium, low, zero or do not know). From this, we have proposed the notion of 'perceived value' associated with each practice.
Results: We have identified three 'requirements management' practices as having a 'high perceived value'. The results also reveal the similarities and differences in the perceptions of Malaysian and Vietnamese practitioners with regard to the relative values of different practices of CMMI level 2 process areas.
Conclusions: Small and medium size companies should not be seen as being "at fault" for not adopting CMMI – instead the Software Process Improvement (SPI) implementation approaches and their transition mechanisms should be improved. We argue that research into "tailoring" existing process capability maturity models may address some of the issues of small and medium size companies.

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

One of the major challenges in the software industry is to help organisations, especially small-to-medium size, to design and implement appropriate control structures for managing software development activities. That is why the software engineering community (researchers and practitioners) has been allocating a large amount of resources to Software Process Improvement (SPI) since the early nineties. There has been a proliferation of SPI standards and models all claiming to increase the likelihood of success of process improvement initiatives. Some of the most notable efforts in this line of research and practice have resulted in process capability maturity models such as Capability Maturity Model (CMM), Capability Maturity Model Integration (CMMI) [11] and ISO/IEC 15504 (SPICE). These SPI frameworks can be used for defining and measuring the processes and practices that can be used by software developing organisations that are improving and/or assessing their software development processes.

However, a large majority of software development organisations appear to be unwilling to follow the SPI initiatives based on process capability maturity models like CMMI [53], which is the successor to CMM and is consistent with the international standard ISO/IEC 15504. An SPI programme is quite an expensive undertaking as organisations need to commit significant resources over a long time period. Even organisations who are willing to commit the resources and time do not always achieve their desired results [16,24]. Hence, the significant investment required and limited success are considered two of the main reasons for many organisations being reluctant to embark on a long path of systematic process improvement [53]. Moreover, in the case of small and medium size organisations, there are more concerns about the relevance and applicability of the models like CMM or CMMI [8]. Case studies reporting small organisations' experience with CMM [5] invariably discuss the difficulties that small organisations have in using and benefiting from CMM. Based on a systematic review of 45 studies reporting SPI experiences of 122 companies, Pino et al. have found that only two medium size companies (with 70 and 150 employees) could be formally assessed as CMM-SW level 2 [42].

Consequently, there is growing realization of the importance of gaining an in-depth understanding of the business drivers for SPI in order to make SPI approaches more widely and easily used [12]. Researchers have also been emphasising the important role of identifying the relative "perceived value" (i.e. the extent to which a CMMI level 2 specific practice is perceived to add value to a project or an organisation based on the perceptions and experiences of practitioners) of different CMMI practices [61]. It is argued that a better understanding of the relative value of CMMI practices perceived by organisations and practitioners is expected to enable SPI program managers to concentrate more on "high perceived value" practices. Other researchers have also studied the relative "perceived value" of CMMI practices with the aim of helping practitioners [61]. This research purports to extend the findings of Wilkie et al. [61] by conducting a similar study in two different countries, i.e. Malaysia and Vietnam, which have been attracting a significant number of software outsourcing contracts from Ireland, where Wilkie et al. carried out their study. Wilkie et al. [61] have described the results of CMMI software process appraisal with six small-to-medium sized software development companies. They have identified "commonly practiced" or "not practiced" elements of the model that lead them to the notion of perceived value associated with each specific CMMI practice. However, we have decided to identify the relative value of different practices of CMMI level 2 process areas based on practitioners' perceptions and experiences rather than based on process appraisal like Wilkie et al. [61]. We believe that software practitioners usually associate different values to different CMMI practices and the relative value of a practice may encourage or discourage them from fully supporting a particular practice.

This paper presents the results of an empirical study aimed at identifying the relative "perceived value" of CMMI level 2 practices, based on the perceptions and experiences of practitioners in two developing countries, Malaysia and Vietnam, who are involved in outsourced software development. Since we expected to have small and medium size companies [59], we limited our research to the practices of CMMI level 2 process areas, as it has been observed that when small and medium size companies start process improvement based on the CMM model, they set out to achieve level 2 [42]. Our investigation has revealed several interesting findings which enabled us to identify and explain the relative "perceived value" of different practices of the CMMI level 2 process areas. We have also identified a set of research questions that need to be explored in this line of research. This paper makes the following contributions to the SPI discipline:

• It presents the design and results of a first of its kind study in developing countries to identify an important aspect of CMMI-based SPI practices.

• It provides information about how practitioners perceive different practices of CMMI level 2 process areas.

• It identifies further research areas that need to be explored to support an effective and successful SPI program.

The following section discusses the background and motivation of this research. Section 3 describes the study design and logistics. Section 4 presents and discusses findings. The research and findings are summarized in Section 5. Limitations of the study are described in Section 6. The concluding remarks and future work are presented in Section 7.

2. Research background and motivation

Research shows that the effort put into SPI can assist in producing high quality software, reducing cost and time-to-market, and increasing productivity [2,21,43]. Despite the different advances made in the development of SPI standards and models, the failure rate for SPI programs has been reported to be up to 70% in a report from the Software Engineering Institute (SEI) [31,47]. It can be argued that one of the reasons for this situation is the lack of attention being paid to gaining a fine-grained understanding of the relative perceived value of various SPI practices prescribed by well-known SPI/SPA standards and models. There is a growing realization that organisations and practitioners usually pursue a particular practice or activity based on the perceived value/benefit and the required effort and cost [1,30]. From the value-based software engineering perspective [7], it can be argued that the relative value of a practice may be the key consideration in management's decision to implement a particular practice. This is a particular issue for small and medium size companies, who need to be value sensitive in order to remain competitive in a tough global market.

Recently, a large number of software development companies have started moving their software development jobs/centres to relatively low cost countries like Malaysia and Vietnam, an emerging paradigm called Global Software Development (GSD). However, this paradigm shift has created a wide variety of new challenges for software development process management researchers and practitioners. One of the main challenges is the vital need of gaining an in-depth understanding of different practices of the SPI frameworks (i.e. CMMI Level 2) in organisations involved in software outsourcing and/or off-shoring, which are parts of the GSD phenomenon. We have noticed that an SPI program in the context of GSD usually requires more initial investment than a similar program in a non-GSD environment. Moreover, managing an SPI program in the GSD context is expected to be far more complex and difficult because of the geographically distributed locations and lack of reliable infrastructure for GSD teams [9]. That is why SPI practitioners face new challenges such as developing global SPI practices, creating confidence and trust among the vendor and customer companies, managing the expectations of what can and what cannot be done in a distributed setting, and understanding the human and cultural specific aspects of SPI initiatives. To successfully address these challenges, SPI practitioners need to gain a solid understanding of the changing mechanics of designing and implementing better SPI practices in the context of GSD.

Despite the increasing importance of an empirically tested body of knowledge on different aspects of successfully implementing SPI initiatives, little research has been carried out to identify the relative "perceived value" of CMMI level 2 practices based on the perceptions of practitioners in developing countries, which are fast becoming the prime destinations for outsourcing. However, implementing models like CMMI is considered quite difficult in such developing countries as many of the companies are not only small-to-medium but also have resource constraints; hence, SPI becomes quite challenging [5].

Previously we have published, using Malaysian data, a primary study of only three process areas of CMMI level 2 [34]. This paper is a revised and substantially extended version in which we present findings from an empirical study, with Malaysian and Vietnamese practitioners, that was aimed at further identifying the relative "perceived value" of practices of six CMMI level 2 process areas. In this paper we have critically analysed and discussed each practice of CMMI level 2 process areas with a detailed description of the research methodology used. In addition, we have used statistical analysis in comparing perceived values given by Malaysian and Vietnamese practitioners. Our long-term research goal is to provide SPI practitioners with a body of knowledge that can help them to design and implement successful SPI initiatives in developing countries.

Our research is aimed at not only extending the findings of existing studies [61] by conducting a similar study in a different culture, but also at expanding this type of research by identifying the relative value of each CMMI practice perceived by practitioners. Moreover, it is also possible that CMMI practices may vary from one geographical region to another. As part of a large project about SPI implementation, we have been carrying out a research project to investigate the CMMI practices in the Asia-Pacific region. The results of this project are expected not only to help software practitioners to understand the perceived values of CMMI practices in the Asia-Pacific region, but also to help them to compare these practices with those identified in other regions [61].

3. Study design

3.1. Perceived value

In this study, we define 'perceived value' to mean the extent to which a CMMI level 2 specific practice is perceived to add value to a project or an organisation based on the perceptions and experiences of practitioners who have been working in the area of SPI in their respective organisations. This may be considered to be a subjective view as it relies on the self-reported data. However, the respondents of this study are considered to be active members of SPI teams within their organisations. Hence, we are confident that their perceptions are grounded in significant experience of designing and deploying CMMI-based practices in real world SPI initiatives.

In order to describe the notion of perceived value of CMMI practices, it is essential to decide on the "importance" of a perceived value. For this purpose, we have used the following definition:

• If the majority of respondents (≥50%) perceive that a CMMI practice has high value then we treat that practice as critical.

A similar approach has been used in the literature [39,46]. Rainer and Hall [46] identified important factors in SPI with the criterion that if 50% or more participants perceive that a factor has a major role in SPI efforts then that factor should be treated as having a major impact on SPI. However, SPI practitioners can define their own criterion in order to decide the importance of a CMMI practice. Like Wilkie et al. [61], we assert that the "perceived value" of a particular practice can be used as a judgement criterion for determining activities that organisations need to pursue. We believe that where respondents from different organisations perceive a practice as having a high perceived value then that practice should be considered for its importance in an SPI program. The information about relative "perceived value" can help practitioners and researchers to better understand various practices detailed within the CMMI for the purpose of developing more appropriate implementation strategies and appraisal procedures for small-to-medium sized organisations.
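To make the criterion concrete, the following minimal sketch (our illustration, not part of the original study instrument; the function name and threshold parameter are ours) classifies a practice as critical when at least 50% of all respondents rate it as having 'high' value. The example counts are the combined Malaysian and Vietnamese responses for SP1.1-1 taken from Table 1.

```python
from collections import Counter

def is_critical(ratings, threshold=0.5):
    """Treat a CMMI practice as 'critical' when at least `threshold`
    (here 50%) of all respondents rate it as having 'high' value."""
    counts = Counter(ratings)
    return counts["high"] / len(ratings) >= threshold

# SP1.1-1, Malaysia + Vietnam combined (Table 1): 28 high, 16 medium, 2 not sure
sp1_1_1 = ["high"] * 28 + ["medium"] * 16 + ["not sure"] * 2
print(is_critical(sp1_1_1))  # True -> 28 of 46 (~61%) rate it 'high'
```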

3.2. Research method

Considering the available resources, we decided to use face-to-face questionnaire-based survey sessions to identify and understand the relative "perceived value" of CMMI level 2 practices based on the perceptions and experiences of Malaysian and Vietnamese practitioners. A survey research method can use one or more data gathering techniques such as interviews and self-administered questionnaires [26]. A survey research method is considered suitable for gathering self-reported quantitative and qualitative data from a large number of respondents [25]. We decided to use a questionnaire as a data collection instrument in face-to-face meeting sessions. Our choice of combining a questionnaire and face-to-face meeting sessions as a means of collecting data was motivated by several factors, such as collecting data from respondents whose native language is not English, the possibility of clarifying the objectives of the research and the different terms used in the questionnaire, explaining the purpose of different questions included in the questionnaire, and ensuring data validation just before finishing each survey session. The negotiated survey session duration was an hour.

3.3. Data collection

We collected data from 46 software development practitioners of 23 Malaysian and 8 Vietnamese software development organisations. Appendix A shows the demographics of participants' organisations.

More than two dozen software development companies were identified in Vietnam with the help of the Vietnam Competitive Initiative (VNCI), a US-Aid funded organisation working to improve the competitiveness of Vietnamese companies. An email invitation to participate in our research was sent to the Directors or Managing Directors of the identified companies. Our invitation letter included a brief description of the research project and the nature of the commitment required. In return, we offered to make the findings of the research available to the participating companies and, in addition, to offer their staff half-day training in software architecture evaluation methods and techniques. As one of the researchers was experienced in architecture evaluation, this offer was intended to provide an incentive for the companies to participate. Eight Vietnamese organisations agreed to allow their practitioners to participate in the research project. Each of the participating companies nominated a person to liaise with the researchers to plan and execute the research. Potential participants of this research were identified with the help of the liaison people. One or more relevant practitioners in each company were identified based on their direct involvement in SPI implementation programs in their respective organisations. Twenty-three practitioners from eight Vietnamese companies participated in our research. The survey sessions were conducted in the participants' offices.

More than 70 software development companies were identified in Malaysia with the help of the Centre for Advanced Software Engineering (CASE), University of Technology Malaysia. A letter of invitation was sent to these companies for participation in this research project. In return, CASE organised a free seminar on "software process improvement: a road to success" for these companies. Twenty-three practitioners from 23 Malaysian companies agreed to participate in our research. The survey sessions were conducted after the SPI seminar in a hotel.

Based on the organisation size definition provided by the Australian Bureau of Statistics (SMALL: 0–19 employees, MEDIUM: 20–199 employees and LARGE: 200+ employees) [59], our Malaysian sample contains 2 practitioners from small sized organisations and 21 practitioners from medium sized organisations. Our Vietnamese sample contains 21 practitioners from medium sized organisations and 2 practitioners from large sized organisations. Each company nominated employees who were involved in tackling real SPI implementation issues on a daily basis in their respective organisations. It is important to acknowledge that the practitioners nominated within organisations are representative of practitioners in organisations as a whole. A truly representative sample is impossible to attain and researchers should try to remove as much of the sample bias as possible [13]. In order to make the sample fairly representative of SPI practitioners in particular organisations, different groups of practitioners from each organisation were nominated to participate in this research. The sample of practitioners involved in this research includes developers, quality analysts, Software Quality Analysis (SQA) team leaders, SQA managers, project managers, and senior management. Thus the sample is not random but a convenience sample, because we sought a response from a person with a specific role within a software development organisation. We consider that the practitioners who participated in this study fall into two main categories:

• "Developers", consisting of programmers/analysts/SQA coordinators.

• "Managers", consisting of team leaders/project managers and senior managers.

Our sample of data contains 19 developers and 27 managers. We planned the data collection procedure and process to meet the ethics requirements set for this research, i.e. protection of subjects from harm, deception and loss of privacy. The dignity and interests of participants were respected at all times. Approval from the host companies was gained prior to conducting the research. In addition, the participating companies gave us consent to publish the findings provided that confidentiality and privacy would be maintained.

[Fig. 1: stacked bar chart of the percentage of respondents rating each requirements management specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Obtain an understanding of requirements
SP1.2-2                Obtain commitment to requirements
SP1.3-1                Manage requirements changes
SP1.4-2                Maintain bidirectional traceability of requirements
SP1.5-1                Identify inconsistencies between project work and requirements

Fig. 1. Requirements management practices.

3.4. Data collection instrument and analysis method

We used a closed-ended questionnaire as an instrument to collect self-reported data. Our questionnaire was based on the CMMI level 2 practices of six process areas. The questionnaire was also designed to elicit the perceived importance of each identified CMMI practice (perceived value). In order to indicate the importance of CMMI practices, the respondents were asked to rate each identified practice's relative value on a five-point scale (i.e. high, medium, low, zero, or not sure). In this questionnaire we have precisely described all CMMI level 2 practices with their related process areas. The questionnaire was tested in a pilot study for clarity and the required time.

In order to analyse the perceived value of each identified CMMI level 2 specific practice, the occurrence of each perceived value (high, medium, low, zero) in each questionnaire was counted. By comparing the occurrences of one CMMI practice's perceived values against the occurrences of other CMMI practices' perceived values, the relative importance of each CMMI practice has been identified. We have also used this approach to identify high and low valued requirements engineering practices and SPI de-motivators in our previous research [33,36]. For most of the data analysis, we used frequency analysis. We believe that the presentation of data along with their frequencies is an effective mechanism for comparing and contrasting within, or across, groups of variables.
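As an illustration of this frequency analysis (a sketch under our own assumptions; the study's analysis was not published as code, and the function and variable names here are ours), the fragment below tallies the ratings given for a practice and reports them as percentages of the respondents, which is how the values plotted in Figs. 1–6 can be derived.

```python
from collections import Counter

SCALE = ["High", "Medium", "Low", "Zero", "Not Sure", "No Answer"]

def frequency_table(responses):
    """responses: mapping of practice id -> list of ratings (one per respondent).
    Returns, per practice, the percentage of respondents giving each rating."""
    table = {}
    for practice, ratings in responses.items():
        counts = Counter(ratings)
        n = len(ratings)
        table[practice] = {r: round(100 * counts.get(r, 0) / n) for r in SCALE}
    return table

# Combined counts for SP1.1-1 taken from Table 1 (28 High, 16 Medium, 2 Not Sure)
responses = {"SP1.1-1": ["High"] * 28 + ["Medium"] * 16 + ["Not Sure"] * 2}
print(frequency_table(responses)["SP1.1-1"])
# {'High': 61, 'Medium': 35, 'Low': 0, 'Zero': 0, 'Not Sure': 4, 'No Answer': 0}
```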

Though all the participants were well-versed in English and the questionnaire was in English, the research team had a Malaysian-speaking researcher in Malaysia and a Vietnamese-speaking researcher in Vietnam, who provided the necessary interpretation and explanation whenever required.

In order to find significant differences between the two data sets (i.e. Malaysian and Vietnamese practitioners) we have used the linear by linear association chi-square test. The linear by linear association test is preferred when testing for a significant difference between two ordinal variables because it is more powerful than the Pearson chi-square test [29].
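For readers who wish to reproduce this kind of comparison, the sketch below implements one common formulation of the linear-by-linear association statistic (the Mantel–Haenszel trend test), M² = (N − 1)·r², where r is the Pearson correlation between the scores assigned to the two ordinal variables and M² is referred to a chi-square distribution with 1 degree of freedom. The category scoring, the handling of 'not sure' answers, and the example data are our assumptions, so the output is illustrative only and is not intended to reproduce the exact figures in Tables 1–6.

```python
import numpy as np
from scipy.stats import chi2

def linear_by_linear(country_codes, value_scores):
    """Linear-by-linear (Mantel-Haenszel) association test for two ordinal
    variables: M^2 = (N - 1) * r^2, where r is the Pearson correlation
    between the assigned scores. Returns (M^2, df, p)."""
    x = np.asarray(country_codes, dtype=float)  # e.g. 0 = Malaysia, 1 = Vietnam
    y = np.asarray(value_scores, dtype=float)   # e.g. high=3, medium=2, low=1, zero=0
    r = np.corrcoef(x, y)[0, 1]
    m2 = (len(x) - 1) * r ** 2
    return m2, 1, chi2.sf(m2, df=1)

# Hypothetical ratings of one practice by 23 Malaysian and 23 Vietnamese respondents
country = [0] * 23 + [1] * 23
scores = [3] * 14 + [2] * 8 + [1] * 1 + [3] * 8 + [2] * 13 + [1] * 2
m2, df, p = linear_by_linear(country, scores)
print(f"X2 = {m2:.3f}, df = {df}, p = {p:.3f}")
```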

4. Findings

The questionnaire was designed to gather data about the "perceived value" of practices of six of the seven CMMI maturity level 2 process areas:

1. Requirements management.
2. Project planning.
3. Project monitoring and control.
4. Measurement and analysis.
5. Process and product quality assurance.
6. Configuration management.

We did not include the "Supplier Agreement Management" process area in our study as our participants were not managing the acquisition of products from suppliers.


4.1. Requirements management

The requirements management process area focuses on the practices that are considered important to manage the requirements of the products and to identify inconsistencies between those requirements and the project's plans and work products [11].

Fig. 1 presents the specific practices for the requirements management process area reported by the Malaysian and Vietnamese practitioners. The results show that the most common 'high' value practice (61%) is 'obtain an understanding of requirements – SP1.1-1'. Research shows that if a system's requirements are not fully understood, it can have a huge impact on the effectiveness of the software development process for that system [15,18,23,41,45,51]. When requirements are poorly defined, the end result is a poor product or a cancelled project [23,51]. An industry survey in the UK reported that only 16% of software projects could be considered truly successful: "Projects are often poorly defined, codes of practice are frequently ignored and there is a woeful inability to learn from past experience" [58, p. 17]. Sommerville and Ransom [49] have reported that well defined requirements should lead to business benefits. The evidence is clear: problems in the requirements phase can have a large impact on the success of software development projects [18,23,45,48,49]. The majority of the participants of this study perceive this practice to be of 'high' value, which shows the critical importance of this practice for improving software development processes using SPI models like CMMI.

The requirements management practice 'obtain commitment to requirements – SP1.2-2' is also a frequently reported 'high' value practice by the Malaysian and Vietnamese practitioners. One can obtain commitment to the requirements from the project stakeholders by involving them in the systems development process [14,23,45]. Involving stakeholders in the development process can reduce their fear, for example, that the development of a software system will have negative consequences such as loss of job or loss of organisational power. It is also a common observation that if a new system is installed in an organisation without consulting the stakeholders who would be affected by the system, then they may feel that the new system is unnecessary and therefore may not co-operate in its successful execution.

The other frequently cited 'high' value practice is 'manage requirements changes – SP1.3-1'. Our results have confirmed the previous findings of several researchers that highlight the importance of this requirements practice [18,32,36,37,49]. It is widely reported that requirements often change during the software/system development process. These changes are inevitable and driven by several factors including errors in original requirements, evolving customer needs, constant changes in software and system requirements, business goals, work environment and government regulation [4]. Volatile requirements are regarded as a factor that causes major difficulties during system development for most organisations in the software development industry [62]. Volatile requirements contribute to the problems of software project schedule overruns and may ultimately contribute to software project failure [54,62,63]. Furthermore, ad hoc change management can lead to escalating requirements, uncontrolled scope creep and potentially uncontrolled system development [15,60]. Only 41% of our respondents cited 'requirements traceability – SP1.4-2' as a 'high' value practice. Traceability of requirements is, in our opinion, one of the most important parts of the requirements management process. Without agreed and implemented traceability mechanisms, it will simply be impossible to manage requirements relationships, except perhaps in very simple projects.

Hence, using the criterion described in Section 3.1, we can conclude that this research has identified three frequently cited (>50%) 'requirements management' practices (SP1.1-1, SP1.2-2, SP1.3-1) as having a 'high' perceived value by Malaysian and Vietnamese practitioners.
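Applied to the 'high' percentages reported in Fig. 1, the ≥50% criterion from Section 3.1 selects exactly these three practices (a quick check we added; the percentages are read off Fig. 1):

```python
# 'High' percentages for the requirements management practices (Fig. 1)
high_pct = {"SP1.1-1": 61, "SP1.2-2": 52, "SP1.3-1": 50, "SP1.4-2": 41, "SP1.5-1": 48}
critical = [p for p, pct in high_pct.items() if pct >= 50]
print(critical)  # ['SP1.1-1', 'SP1.2-2', 'SP1.3-1']
```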

4.2. Project planning

The project planning process area incorporates the practices considered important to establish and maintain project plans and activities [11].

There are 14 specific practices in the 'project planning' process area, as shown in Fig. 2. Our results indicate that none of the specific practices of this process area has been singled out as having a 'high' value by the majority of the participants of this study. These results show that limited attention is being paid to project planning activities in Malaysia and Vietnam. Given that effective project planning is considered a key to the success of information systems development projects [22,28,44,55], it is quite interesting to find that the majority of the participants do not appear to place much value on most of the practices of this process area. Different studies have reported the importance of project planning in project management activities: Zwikael and Sadeh [64] have found that improving project planning is an effective tool in dealing with high-risk projects; Sun-Jen and Wen-Ming [55] have reported the impact of project planning on project duration; and Linda et al. [27] have described the lack of project planning as a risk to software projects.

[Fig. 2: stacked bar chart of the percentage of respondents rating each project planning specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Estimate the scope of the project
SP1.2-1                Establish estimates of work product and task attributes
SP1.3-1                Define project life cycle
SP1.4-1                Determine estimates of effort and cost
SP2.1-1                Establish the budget and schedule
SP2.2-1                Identify project risks
SP2.3-1                Plan for data management
SP2.4-1                Plan for project resources
SP2.5-1                Plan for needed knowledge and skills
SP2.6-1                Plan stakeholder involvement
SP2.7-1                Establish the project plan
SP3.1-1                Review plans that affect the project
SP3.2-1                Reconcile work and resource levels
SP3.3-1                Obtain plan commitment

Fig. 2. Project planning practices.

Other studies have reported several problems due to the lack of project management activities: Taylor [57] reported that only 12.7% of IT projects were successful (130 of 1027 surveyed); a study conducted by a group of Fellows of the Royal Academy of Engineering and the British Computer Society shows that despite spending 22.6 billion pounds on IT projects in the UK during 2003/2004, a significant number of projects failed to deliver key benefits on time and to targeted cost and specification [58]; the CHAOS Report shows that on average the percentage of software projects completed on time and on budget improved from 16.2% in 1995 [50] to 35% in 2006 [52]. Despite this improvement, nearly two-thirds of the projects examined in the 2006 CHAOS report were 'challenged' (i.e. only partially successful), with the authors observing that one of the main reasons for project failure is poor project management activities, of which project planning is an important part.

These findings provide useful information about different aspects of the SPI initiatives in the Malaysian and Vietnamese organisations which participated in this research. Lack of project planning is considered a risk to software projects [27,55]. Despite this, we have found that the majority of the participants of this study appear to be unconvinced of the importance of project planning practices. One explanation of this finding can be that Malaysia and Vietnam are developing countries, where a large majority of the software development companies are small-to-medium sized organisations, which may not be able to allocate sufficient resources for training their employees in good project management practices.

The majority of the participants of the current study perceived eight practices of project planning (i.e. SP1.1-1, SP1.3-1, SP2.2-1, SP2.3-1, SP2.4-1, SP2.5-1, SP2.6-1 and SP3.2-1) as "medium value" practices. If this situation prevails then it will help Malaysian and Vietnamese practitioners to successfully implement the project planning process area of CMMI level 2 in their respective organisations [40]. Hence, using the criterion described in Section 3.1, we have not identified any frequently cited (>50%) 'project planning' practice which has a 'high' perceived value.

[Fig. 3: stacked bar chart of the percentage of respondents rating each project monitoring and control specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Monitor project planning parameters
SP1.2-1                Monitor commitments
SP1.3-1                Monitor project risks
SP1.4-1                Monitor data management
SP1.5-1                Monitor stakeholder involvement
SP1.6-1                Conduct progress reviews
SP1.7-1                Conduct milestone reviews
SP2.1-1                Analyse issues
SP2.2-1                Take corrective action
SP2.3-1                Manage corrective action

Fig. 3. Project monitoring and control practices.


4.3. Project monitoring and control practices

The purpose of this process area is to provide an understanding of the project's progress in order to perform corrective actions when the project's performance deviates from the original plan [11].

Fig. 3 shows that there are 10 specific practices in the project monitoring and control process area. It is clear from Fig. 3 that none of the specific practices of this process area has been singled out as having a 'high' value by the majority of the participants of this study. Rather, four practices (i.e. SP1.1-1, SP1.2-1, SP1.4-1 and SP2.2-1) of this process area are frequently cited (52%) 'medium' value practices. Project monitoring and control is an important activity in any project, in which project managers make sure that the team is making satisfactory progress towards the project deliverables [10,27,28], and it is interesting to find that a majority of the participants do not place high value on the practices designed for ensuring satisfactory progress in project monitoring and control. Different studies have reported the need to monitor project progress: John [22] reported that "if you don't take time to figure out what happened during a project, both the good and the bad, you're doomed to repeat it"; Sun-Jen and Wen-Ming [55] have reported the impact of a lack of project monitoring on project duration; and Linda et al. [27] have reported a lack of project control as a risk to software projects. We assert that if this situation prevails in the Malaysian and Vietnamese software development industry, a significant number of future projects may fail to deliver key benefits on time and to targeted cost and specifications. Such a situation can also have a negative impact on the software development economy of these countries, as it will be hard for Malaysian and Vietnamese practitioners to get CMMI level 2 certification for their respective organisations. Hence, using the criterion described in Section 3.1, this study has not identified any frequently cited (>50%) 'project monitoring and control' practice which has a 'high' perceived value. However, four practices (i.e. SP1.1-1, SP1.2-1, SP1.4-1 and SP2.2-1) of this process area are frequently cited (52%) 'medium' value practices in Malaysia and Vietnam.

4.4. Measurement and analysis

This process area is used to develop a measurement capability in order to support the information needs of the management [11].

[Fig. 4: stacked bar chart of the percentage of respondents rating each measurement and analysis specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Establish measurement objectives
SP1.2-1                Specify measures
SP1.3-1                Specify data collection and storage procedures
SP1.4-1                Specify analysis procedures
SP2.1-1                Collect measurement data
SP2.2-1                Analyse measurement data
SP2.3-1                Store data and results
SP2.4-1                Communicate results

Fig. 4. Measurement and analysis.


Fig. 4 shows that none of the specific practices of this process area were perceived as 'high' value practices by our respondents. One reason for this could be that small and medium sized companies, which have closer contact with their employees, are less likely to use measurement and analysis. However, gathering and analysing data for a suitable set of metrics is considered an important activity in any project as it provides insight into issues and risks [44,45,56]. Such information is vital to enable project managers to detect and resolve problems as early as possible [17]. The measurement and analysis process area provides management with the visibility that organisations need to develop measurement guidelines in order to improve their project management efforts [19,44,45]. Goldenson et al. [17] have described the need to focus on measurement and analysis from the very beginning of a process improvement effort in order to provide value to both projects and organisations. Well designed measurement guidelines are expected to help project managers to identify the problems, the size and scope of the problems, the sources of problems and any alternative courses of action [17]. In other studies, the majority of the project managers emphasised the need to use accurate estimation techniques for project success [41,45]. Our results show that none of the measurement and analysis practices was perceived to be of 'high' value; however, all practices were perceived as 'medium value' practices. The results show that the most common 'medium' value practices (61%) are 'analyse measurement data – SP2.2-1' and 'store data and results – SP2.3-1'. Most of our respondents (59%) consider 'specify measures' (SP1.2-1) and 'collect measurement data' (SP2.1-1) as 'medium' value practices. The other measurement and analysis practices were also frequently reported as 'medium' value practices in Malaysia and Vietnam.

However, the majority of the study's participants do not place high value on the practices designed for measurement and analysis. Based on these findings and using the criterion described in Section 3.1, this study has not identified any frequently cited (>50%) 'measurement and analysis' practice which has a 'high' perceived value.

4.5. Process and product quality assurance

The purpose of this process area is to provide staff and management with useful insights into process and product quality assurance activities [11]. There are four specific practices in the 'process and product quality assurance' process area, as shown in Fig. 5.

[Fig. 5: stacked bar chart of the percentage of respondents rating each process and product quality assurance specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Objectively evaluate processes
SP1.2-1                Objectively evaluate work products and services
SP2.1-1                Communicate and ensure resolution of non-compliance issues
SP2.2-1                Establish records

Fig. 5. Process and product quality assurance practices.

Our results indicate that none of the practices of this process area were perceived as a 'high' value practice by the Malaysian and Vietnamese practitioners. Fewer than 30% of the participants cited each of the 'process and product quality assurance' practices as a 'high' value practice. These findings can be interpreted as an indication of the limited attention being paid to process and quality assurance activities in the Malaysian and Vietnamese software development organisations of the participants of this study. Other reasons could be limited resources and limited technical expertise, as most of the participants of our study belonged to small and medium sized companies and this is a particular issue for such companies [5]. Given that organisations need to demonstrate the ability to implement process and product quality practices in order to achieve a certain level of CMMI maturity, it is an interesting revelation that the majority of our study's participants do not place a high value on the practices designed for ensuring process and product quality. We argue that quality issues should be fixed as early as possible in any project because "by the time you figure out you have a quality problem, it is probably too late to fix it" [22]. Process and product quality assurance is very important for companies that either develop software or buy software. This is because process and product quality assurance is the activity which provides evidence that the methods and techniques used are integrated, consistent and correctly applied [20].

However, the bar chart in Fig. 5 shows that more than half of the respondents perceived all practices of the 'process and product quality assurance' process area as having 'medium value'. It will be interesting to find out the detailed reasons for this situation. This finding may also be considered as an indicator of an increasing realization of the critical role of the practices of this process area in achieving CMMI maturity level 2.

Based on these findings and using the criterion described in Section 3.1, this study has not identified any frequently cited (>50%) 'process and product quality assurance' practice which has a 'high' perceived value. However, this study identified all practices of the 'process and product quality assurance' process area as 'medium' value practices (>50%).

[Fig. 6: stacked bar chart of the percentage of respondents rating each configuration management specific practice as High, Medium, Low, Zero, Not Sure or No Answer.]

CMMI Practice Number   CMMI Practice Description
SP1.1-1                Identify the configuration items
SP1.2-1                Establish a configuration management system
SP1.3-1                Create or release baselines
SP2.1-1                Track change requests
SP2.2-1                Control configuration items
SP3.1-1                Establish configuration management records
SP3.2-1                Perform configuration audits

Fig. 6. Configuration management practices.


4.6. Configuration management

The configuration management process area is used to establish the integrity of work products using configuration identification, configuration control, configuration status accounting, and configuration audits [11].

Fig. 6 shows that there are in total 7 specific practices in the configuration management process area. Fig. 6 also presents the percentages of the participants' responses about the relative 'value' of each of the specific practices of this process area. It is clear that, as with the specific practices of the process and product quality assurance process area, none of the specific practices of this process area has been singled out as having 'high' value by the majority of the participants of this study. However, a large majority of the participants perceived that each of the practices of this process area has either 'high' or 'medium' value in their process improvement program. That means there were only a small number of participants who perceived that the practices of this process area were of 'low value'.

Such findings cannot be considered as a problem specific to Malaysian and Vietnamese companies, as many companies in Western countries, with a relatively longer history of software development, have been found not to be able to adequately control their configuration items [61]. Another reason could be that configuration management may not be so relevant to small and medium sized organisations. With the passage of time, the importance of the configuration management practices may grow among Malaysian and Vietnamese practitioners as they learn from their Western colleagues.

However, more than half of the Malaysian and Vietnamese practitioners have cited four practices as 'medium' value practices, as shown in Fig. 6. Hence, based on the criterion described in Section 3.1, we have not identified any practice in this process area that has been perceived as of 'high' value by more than 50% of the participants of our study.

4.7. Comparing Malaysian and Vietnamese perceptions

We have also decided to analyse the findings based on the participants' country, i.e. Malaysia and Vietnam. We contend that an understanding of the similarities and differences found among practitioners' perceptions, from different countries, about the relative value of the CMMI practices can help managers to design better SPI strategies, as they can place more focus on the practices that are considered of 'high' value by practitioners in both countries. We believe that when respondents from different countries consider a practice of 'high' value, that practice tends to have a significant impact on the success of an SPI program. Tables 1–6 show the relative 'perceived value' of the CMMI practices reported by Malaysian and Vietnamese practitioners.

Table 1 shows the list of requirements management practicesalong with their respective relative ‘value’ cited by Malaysianand Vietnamese practitioners. It is evident from Table 1 that fourpractices were considered as having a ‘high’ value by the majorityof the Malaysian practitioners (P50%), while majority of the Viet-namese practitioners considered three requirements managementpractices as having a ‘high’ value. Out of five requirements man-agement practices two are common between Malaysian and Viet-namese practitioners (i.e. obtain an understanding ofrequirements – SP1.1-1 and obtain commitment to requirements– SP1.2-2). This shows that the Malaysian and Vietnamese practi-tioners give importance to stakeholders’ participation in the sys-tems development process as this is one of the motivations toobtain commitment and understanding to requirements from pro-ject participants [14]. More than 50% of the Malaysian practitioners

ived value practices of CMMI level 2: An empirical study, Inform. Softw.

Page 8: Identifying high perceived value practices of CMMI level 2: An

Table 1Requirements management practices.

Requirements management practices Malaysia (n = 23) Vietnam (n = 23) Linear by linear association chi-square test a = 0.05

H M L Z NS/NR H M L Z NS/NR X2 df p

SP1.1-1 14 8 0 0 1 14 8 0 0 1 0.000 1 1.000SP1.2-2 12 9 2 0 0 12 8 1 0 2 0.610 1 0.435SP1.3-1 12 6 5 0 0 11 11 1 0 0 0.388 1 0.534SP1.4-2 12 6 3 1 1 7 15 0 0 1 0.000 1 1.000SP1.5-1 10 11 2 0 0 12 11 0 0 0 1.023 1 0.312

H = High, M = medium, L = low, Z = zero, NS/NR = not sure/no response.

Table 2Project planning practices.

Project planning practices Malaysia (n = 23) Vietnam (n = 23) Linear by linear association chi-square test a = 0.05

H M L Z NS/NR H M L Z NS/NR X2 df p

SP1.1-1 8 12 0 1 2 12 11 0 0 0 3.624 1 0.057SP1.2-1 12 6 3 0 2 7 15 0 0 1 0.021 1 0.886SP1.3-1 7 14 0 1 1 9 10 2 0 2 0.020 1 0.887SP1.4-1 7 14 1 0 1 13 5 1 0 4 0.132 1 0.717SP2.1-1 8 9 3 0 3 9 9 0 0 5 0.010 1 0.920SP2.2-1 6 11 2 0 4 5 13 3 0 2 0.334 1 0.564SP2.3-1 7 10 4 0 2 6 15 2 0 0 1.296 1 0.255SP2.4-1 7 11 3 0 2 9 13 0 0 2 1.501 1 0.221SP2.5-1 7 12 3 0 1 6 15 1 0 1 0.028 1 0.867SP2.6-1 6 9 5 0 3 3 15 2 0 3 0.014 1 0.906SP2.7-1 9 9 3 3 2 12 10 0 0 1 1.239 1 0.266SP3.1-1 9 11 2 0 1 8 11 1 0 3 0.388 1 0.533SP3.2-1 7 12 2 0 2 5 13 3 0 2 0.066 1 0.797SP3.3-1 7 12 1 0 3 7 10 2 0 4 0.172 1 0.678

H = High, M = medium, L = low, Z = zero, NS/NR = not sure/no response.

Table 3Project monitoring and control.

Project monitoring and control Malaysia (n = 23) Vietnam (n = 23) Linear by linear association chi-square test a = 0.05

H M L Z NS/NR H M L Z NS/NR X2 df p

SP1.1-1 10 8 3 0 2 5 15 0 0 3 0.225 1 0.635SP1.2-1 8 9 4 0 2 5 16 0 0 2 0.066 1 0.798SP1.3-1 10 8 2 1 2 10 11 2 0 0 1.526 1 0.217SP1.4-1 7 11 2 0 3 7 13 1 0 2 0.357 1 0.550SP1.5-1 5 9 5 0 4 8 12 0 0 3 1.682 1 0.195SP1.6-1 9 10 1 0 3 9 12 1 0 1 0.763 1 0.382SP1.7-1 10 8 3 0 2 8 11 0 0 4 0.192 1 0.662SP2.1-1 10 8 2 0 3 10 9 1 0 3 0.047 1 0.829SP2.2-1 9 12 1 0 1 10 12 1 0 0 0.734 1 0.392SP2.3-1 10 10 2 0 1 8 12 1 0 2 0.162 1 0.688

H = High, M = medium, L = low, Z = zero, NS/NR = not sure/no response.

Table 4Measurement and analysis.

Measurement and analysis Malaysia (n = 23) Vietnam (n = 23) Linear by linear association chi-square test a = 0.05

H M L Z NS/NR H M L Z NS/NR X2 df p

SP1.1-1 8 12 2 0 1 7 14 1 0 1 0.023 1 0.879SP1.2-1 7 13 2 0 1 6 14 2 0 1 0.000 1 1.000SP1.3-1 6 14 2 0 1 6 12 2 0 3 0.418 1 0.518SP1.4-1 6 12 4 0 1 6 13 3 0 1 0.089 1 0.765SP2.1-1 7 14 1 0 1 8 13 0 0 2 0.000 1 1.000SP2.2-1 8 13 1 0 1 6 15 1 0 1 0.024 1 0.876SP2.3-1 7 13 2 0 1 7 15 0 0 1 0.218 1 0.641SP2.4-1 4 16 2 0 1 10 9 1 0 3 0.065 1 0.798

H = High, M = medium, L = low, Z = zero, NS/NR = not sure/no response.


More than 50% of the Malaysian practitioners consider 'maintain bidirectional traceability of requirements – SP1.4-2' as a 'high' value practice, while only 30% of the Vietnamese practitioners consider this practice as being of 'high' value. However, 65% of the Vietnamese practitioners consider SP1.4-2 as a 'medium' value practice. Our results show no statistically significant difference between the two data sets at the α = 0.05 level.
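The chi-square column in Tables 1–6 refers to a linear-by-linear association (test of trend) statistic with one degree of freedom. As a rough illustration of how such a statistic can be computed, the sketch below uses the common Mantel–Haenszel formulation M² = (N − 1)r², where r is the Pearson correlation between equally spaced row and column scores. The function, the score coding, and the handling of the Zero and NS/NR categories are assumptions made here for illustration, so it will not necessarily reproduce the exact values reported above.

```python
# A sketch of a linear-by-linear association (Mantel-Haenszel trend) chi-square
# test for a contingency table with ordered categories. Equally spaced scores
# and the exclusion of the Z and NS/NR columns are illustrative assumptions.
import numpy as np
from scipy.stats import chi2


def linear_by_linear_chi2(table, row_scores=None, col_scores=None):
    """Return (M2, df, p) where M2 = (N - 1) * r**2 and r is the Pearson
    correlation between row and column scores, weighted by cell counts."""
    table = np.asarray(table, dtype=float)
    n = table.sum()
    r_scores = np.arange(table.shape[0]) if row_scores is None else np.asarray(row_scores, float)
    c_scores = np.arange(table.shape[1]) if col_scores is None else np.asarray(col_scores, float)

    # Weighted means, covariance, and variances of the ordinal scores.
    row_mean = (r_scores * table.sum(axis=1)).sum() / n
    col_mean = (c_scores * table.sum(axis=0)).sum() / n
    cov = ((r_scores[:, None] - row_mean) * (c_scores[None, :] - col_mean) * table).sum()
    var_r = (((r_scores - row_mean) ** 2) * table.sum(axis=1)).sum()
    var_c = (((c_scores - col_mean) ** 2) * table.sum(axis=0)).sum()
    r = cov / np.sqrt(var_r * var_c)

    m2 = (n - 1) * r ** 2
    return m2, 1, chi2.sf(m2, df=1)


# Illustrative input: High/Medium/Low counts for SP1.2-2 (Table 1), one row per
# country; because the Z and NS/NR columns are dropped here, the output will
# not necessarily match the value reported in the table.
print(linear_by_linear_chi2([[12, 9, 2],    # Malaysia
                             [12, 8, 1]]))  # Vietnam
```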


Table 5. Process and product quality assurance practices.

Practice    Malaysia (n = 23)            Vietnam (n = 23)             Linear-by-linear association chi-square test (α = 0.05)
            H   M   L   Z   NS/NR        H   M   L   Z   NS/NR        χ²      df   p
SP1.1-1     6   13  2   0   2            7   16  0   0   0            2.467   1    0.116
SP1.2-1     5   15  0   0   3            8   13  1   0   1            1.391   1    0.238
SP2.1-1     6   14  1   0   2            7   13  1   0   2            0.068   1    0.795
SP2.2-1     6   14  2   0   1            6   16  0   0   1            0.227   1    0.634

H = high, M = medium, L = low, Z = zero, NS/NR = not sure/no response.

Table 6. Configuration management practices.

Practice    Malaysia (n = 23)            Vietnam (n = 23)             Linear-by-linear association chi-square test (α = 0.05)
            H   M   L   Z   NS/NR        H   M   L   Z   NS/NR        χ²      df   p
SP1.1-1     9   10  2   0   2            6   16  0   0   1            0.173   1    0.677
SP1.2-1     10  8   3   0   2            7   14  0   0   2            0.016   1    0.901
SP1.3-1     9   9   3   0   2            7   12  0   0   4            0.198   1    0.657
SP2.1-1     8   12  1   0   2            9   12  1   0   1            0.466   1    0.495
SP2.2-1     8   11  2   0   2            8   14  0   0   1            0.682   1    0.409
SP3.1-1     9   10  2   0   2            5   14  1   0   3            0.357   1    0.550
SP3.2-1     9   10  2   0   2            5   13  1   0   4            0.817   1    0.366

H = high, M = medium, L = low, Z = zero, NS/NR = not sure/no response.


Table 2 shows the list of project planning practices along with their respective relative 'value' cited by Malaysian and Vietnamese practitioners. We have identified only one practice (SP1.2-1) in Malaysia that has been perceived as being of 'high' value by more than 50% of the practitioners. However, we have identified three practices (SP1.1-1, SP1.4-1 and SP2.7-1) which have been perceived as being of 'high' value by more than 50% of the Vietnamese practitioners. The other practices have been perceived as being of 'medium' value by most of the Malaysian and Vietnamese practitioners. We have not identified any significant difference between the two data sets.

Table 3 shows the list of project monitoring and control practices along with their respective relative 'value' cited by Malaysian and Vietnamese practitioners.

We have not identified any practice in this process area that has been perceived as being of 'high' value by more than 50% of the Malaysian or Vietnamese practitioners. However, Table 3 shows that most of the practices were perceived as 'medium' value practices by the Vietnamese practitioners, whereas the Malaysian practitioners appear to lag behind in recognising the importance of project monitoring and control practices. Our results show no statistically significant difference between the two data sets at the α = 0.05 level.

Table 4 shows the list of measurement and analysis practices cited by the Malaysian and Vietnamese practitioners. It is clear that none of the specific practices of this process area was perceived as a 'high' value practice by more than 50% of the Malaysian or Vietnamese practitioners. Measurement and analysis is an important activity in any project; however, Table 4 shows that most of the practices were perceived as 'medium' value practices by the Malaysian and Vietnamese practitioners. The results show that the most frequently cited 'medium' value practice among the Malaysian practitioners (16 out of 23) is SP2.4-1 (communicate results). Table 4 also shows that the most frequently cited 'medium' value practices among the Vietnamese practitioners (15 out of 23) are SP2.2-1 (analyse measurement data) and SP2.3-1 (store data and results). Our results show no statistically significant difference between the two data sets at the α = 0.05 level.

Table 5 presents the relative 'value' of each of the practices of the process and product quality assurance process area as reported by our respondents. It is interesting to note that no practice in this process area has been frequently cited (≥50%) as a 'high' perceived value practice by the Malaysian and Vietnamese practitioners. However, the majority (≥50%) of the Malaysian and Vietnamese practitioners consider all of the practices to be of 'medium' value. We have already provided some possible explanations for this situation in Section 4.5. Moreover, we did not find any significant difference between the two data sets.

Table 6 shows the specific practices of the configuration management process area along with their respective perceived 'value' as cited by our respondents. Our results show that no specific practice of this process area has been frequently (≥50%) cited by the Malaysian and Vietnamese practitioners as being a 'high' value practice. We argue that this is because Malaysia and Vietnam are developing countries, and it may take their software industries some time to learn from their Western colleagues. Only one practice (track change requests – SP2.1-1) was considered a 'medium' value practice by more than 50% of the Malaysian practitioners, whereas all of the configuration management practices were considered 'medium' value practices by the majority of the Vietnamese practitioners. These results suggest a growing commitment among Vietnamese practitioners to developing configuration management practices.

5. Summary

Our research has been motivated by the challenges faced by companies, especially small-to-medium sized ones, in implementing CMMI-based process improvement programs. Moreover, there have also been calls to understand the business drivers and the relative 'perceived value' of the different practices of a process improvement model (such as CMMI) in order to make SPI methods and technologies more widely used [12]. We believe that a better understanding of the relative value of SPI practices, as perceived by practitioners, should be taken into consideration when designing and implementing an SPI program.

Other researchers have also conducted a study to identify the relative value of each of the practices of CMMI level 2 process areas, with the aim of helping practitioners to pay more attention to the 'high perceived value' practices [61]. The relative 'perceived value' of CMMI practices can act as a guide for SPI program


managers when designing and implementing SPI initiatives. The assertion behind this argument is that it will be easier to encourage the use of those SPI practices that are commonly used elsewhere and known to be perceived as being of high value by practitioners.

This study has identified the relative perceived value of different practices of CMMI level 2 process areas. Our results are based on the analysis of self-reported data gathered to explore the experiences, opinions, and views of Malaysian and Vietnamese software development practitioners. In order to describe the notion of perceived value of CMMI level 2 practices, we defined in Section 3.1 the criterion for the "criticality" of a perceived value.

Using this criterion during the analysis of the responses of all the participants of this study, we have identified (as summarized in Table 7):

• three 'requirements management' practices as having a 'high' perceived value;

• a number of 'medium' value practices in the different process areas of CMMI level 2.

However, our study has not identified any frequently cited 'high' value practice for the other process areas of CMMI level 2.
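To make the classification rule concrete, the short sketch below applies the ≥50% citation criterion used throughout this section: a practice is labelled a 'high' (or 'medium') perceived value practice when at least half of the respondents rated it at that level. The function name, the dictionary layout, and the high-before-medium ordering are illustrative assumptions rather than a description of the survey instrument itself.

```python
# A minimal sketch of the ">= 50% of respondents" criterion used above.
# The function name, dictionary layout, and the High-before-Medium ordering
# are illustrative assumptions, not taken from the survey instrument.
def perceived_value(counts, n_respondents):
    """counts: mapping of rating -> frequency for one CMMI specific practice,
    with keys 'H', 'M', 'L', 'Z' and 'NS/NR' as in Tables 1-6."""
    threshold = n_respondents / 2.0
    for rating, label in (('H', 'high'), ('M', 'medium'), ('L', 'low'), ('Z', 'zero')):
        if counts.get(rating, 0) >= threshold:
            return label
    return 'no majority'


# Combined Malaysian and Vietnamese counts for 'obtain an understanding of
# requirements' (SP1.1-1) from Table 1: 28 of the 46 respondents rated it High.
print(perceived_value({'H': 28, 'M': 16, 'L': 0, 'Z': 0, 'NS/NR': 2}, 46))  # -> 'high'
```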

We have also analysed the responses based on the participants' country. We suggest that understanding the similarities in SPI practices across practitioners from different developing countries can help to develop effective SPI implementation strategies in these countries. This is because, where respondents from different countries consider a practice to be of 'high' value, that practice is likely to have a significant impact on the success of an SPI program in developing countries. Based on this analysis, we have found (as summarized in Table 8):

Table 7. Summary of perceived values of CMMI level 2 practices in Malaysia and Vietnam.

Requirements management
  High: SP1.1-1, SP1.2-2, SP1.3-1
  Medium: –
Project planning
  High: –
  Medium: SP1.1-1, SP1.3-1, SP2.2-1, SP2.3-1, SP2.4-1, SP2.5-1, SP2.6-1 and SP3.2-1
Project monitoring and control
  High: –
  Medium: SP1.1-1, SP1.2-1, SP1.4-1 and SP2.2-1
Measurement and analysis
  High: –
  Medium: SP1.1-1, SP1.2-1, SP1.3-1, SP1.4-1, SP2.1-1, SP2.2-1, SP2.3-1 and SP2.4-1
Process and product quality assurance
  High: –
  Medium: SP1.1-1, SP1.2-1, SP2.1-1 and SP2.2-1
Configuration management
  High: –
  Medium: SP1.1-1, SP2.1-1, SP2.2-1 and SP3.1-1

Table 8. Comparing Malaysian and Vietnamese perceptions and experiences of CMMI level 2 practices.

Requirements management
  High (Malaysia): SP1.1-1, SP1.2-2, SP1.3-1 and SP1.4-2
  High (Vietnam): SP1.1-1, SP1.2-2 and SP1.5-1
  Medium (Malaysia): –
  Medium (Vietnam): SP1.4-2
Project planning
  High (Malaysia): SP1.2-1
  High (Vietnam): SP1.1-1, SP1.4-1 and SP2.7-1
  Medium (Malaysia): SP1.1-1, SP1.3-1, SP1.4-1, SP2.5-1, SP3.2-1 and SP3.3-1
  Medium (Vietnam): SP1.2-1, SP2.2-1, SP2.3-1, SP2.4-1, SP2.5-1, SP2.6-1 and SP3.2-1
Project monitoring and control
  High (Malaysia): –
  High (Vietnam): –
  Medium (Malaysia): SP2.2-1
  Medium (Vietnam): SP1.1-1, SP1.2-1, SP1.4-1, SP1.5-1, SP1.6-1, SP2.2-1 and SP2.3-1
Measurement and analysis
  High (Malaysia): –
  High (Vietnam): –
  Medium (Malaysia): SP1.1-1, SP1.2-1, SP1.3-1, SP1.4-1, SP2.1-1, SP2.2-1, SP2.3-1 and SP2.4-1
  Medium (Vietnam): SP1.1-1, SP1.2-1, SP1.3-1, SP1.4-1, SP2.1-1, SP2.2-1 and SP2.3-1
Process and product quality assurance
  High (Malaysia): –
  High (Vietnam): –
  Medium (Malaysia): SP1.1-1, SP1.2-1, SP2.1-1 and SP2.2-1
  Medium (Vietnam): SP1.1-1, SP1.2-1, SP2.1-1 and SP2.2-1
Configuration management
  High (Malaysia): –
  High (Vietnam): –
  Medium (Malaysia): SP2.1-1
  Medium (Vietnam): SP1.1-1, SP1.2-1, SP1.3-1, SP2.1-1, SP2.2-1, SP3.1-1 and SP3.2-1


• Four specific practices of the 'requirements management' process area have been reported as 'high' value practices by the Malaysian practitioners, whereas three practices of this process area have been reported as being of 'high' value by the Vietnamese practitioners.

• We have identified only one practice (SP1.2-1) of the 'project planning' process area which has been perceived as being of 'high' value by more than 50% of the Malaysian practitioners. However, we have identified three practices (SP1.1-1, SP1.4-1 and SP2.7-1) which have been perceived as being of 'high' value by more than 50% of the Vietnamese practitioners.

• We have not identified any practice in the 'project monitoring and control' process area that has been perceived as being of 'high' value by more than 50% of the Malaysian or Vietnamese practitioners.

• None of the specific practices of the 'measurement and analysis' process area was perceived as a 'high' value practice by more than 50% of the Malaysian or Vietnamese practitioners. However, most of the practices were perceived as 'medium' value practices by our respondents.

• None of the practices of the 'process and product quality assurance' process area has been perceived as being of 'high' value by the Malaysian or Vietnamese practitioners. However, all of the practices were perceived as 'medium' value practices by our respondents.

• None of the practices of the 'configuration management' process area can be considered as having a 'high' perceived value.

These findings indicate that our respondents are aware of the importance of the 'requirements management' process area, as most of its practices were perceived as 'high' value practices by more than 50% of our respondents.


Our results indicate that the Vietnamese practitioners have started realizing the importance of the other process areas of CMMI level 2, as most of the practices in these process areas were perceived as 'medium' value practices by the majority of the Vietnamese practitioners. Our results also indicate that the Malaysian practitioners are not fully aware of the importance of two process areas of CMMI level 2, i.e. 'project monitoring and control' and 'configuration management'. This may be because practitioners in organisations are often under enormous time pressure, typically in the form of meeting project deadlines [35]. Hence, they may have little time in which to participate in project monitoring and control, and configuration management, activities. These findings enable us to suggest that Malaysian organisations should design SPI awareness initiatives for their practitioners so that they fully understand the benefits of these two process areas. These SPI awareness initiatives can be organised in the form of training and coaching programs. In addition, Malaysian organisations should establish procedures to shield their developers from excessive time pressure.

Our long-term research goal is to build an empirically tested body of knowledge of different aspects of SPI initiatives and assessment. We are approaching this by first focusing on complementing and/or extending the current understanding of practitioners' attitudes toward, and opinions of, different aspects of SPI models and programs. We plan to develop appropriate support mechanisms and tools to facilitate the design and implementation of suitable SPI strategies. In this study, we have gained important insights into the relative 'perceived value' of each practice of the six process areas of CMMI maturity level 2. We found that practitioners view certain practices as being of 'high' value, and these should be paid more attention in any SPI program initiative.

6. Limitations

Construct validity is concerned with whether or not the measurement scales represent the attributes being measured. Our survey instruments were based on the specific practices of the six process areas of CMMI level 2 [11]. During the survey sessions, the researchers observed that the participants were able to recognize each of the practices without any difficulty. Hence, their responses provided us with confidence that all the practices included in the survey instrument were relevant to the participants' workplace.

Another issue is that questionnaire surveys are usually based on self-reported data that reflect what people say they believe or do, not necessarily what they actually believe or practice. Hence, our results are limited to the respondents' knowledge, attitudes, and experiences regarding the relative 'value' of each of the specific practices of the different process areas of CMMI maturity level 2. The participants' perceptions and experiences have not been directly verified through another mechanism, such as appraising their organisational practices as done by Wilkie et al. [61]. This situation can cause at least one problem: practitioners' perceptions and experiences may not be fully accurate, as the relative 'values' assigned to different practices may actually be different. However, like the researchers of many other studies based on opinion data [3,6,38], we have confidence in our findings because we collected data from practitioners who were directly involved in SPI efforts in their respective organisations, and their perceptions and experiences were explored without any direction from the researchers.

Sample size may be another issue, as 46 practitioners participated in this study. In this respect, our research was limited by the available resources and the number of companies that could be convinced to participate in the reported study. However, to gain a broader representation of the views of practitioners in developing countries on this topic, more practitioners and companies would need to be included in a study.


Another limitation of the study is the absence of a proven theory about the human and organisational aspects of SPI efforts to guide research similar to ours. That is why we consider our research an exploratory approach, aimed at gathering facts in the hope of drawing some general conclusions. We expect that the findings from this research can help us and the wider SPI community to identify a research direction for developing and validating a theory of the mechanics of SPI based on a particular maturity model.

7. Conclusion and future work

Small and medium size companies should not be seen as being "at fault" for not adopting CMMI; instead, the SPI implementation approaches and their transition mechanisms should be improved. We argue that, to achieve broader impact, SPI approaches must target small and medium size software developing companies and require very little cost and time to adopt. Radically different approaches to SPI may be required. One size may not fit all, and it is possible that research into "tailoring" existing process capability maturity models may address some of the issues faced by small and medium size companies.

Our results indicate that practitioners usually support a specific SPI practice on the basis of the relative value of that practice as perceived by them. Based on these findings, we suggest that the relative perceived value is often grounded in relevant real-world experience and in practitioners' roles in different organisations. Information about perceived value can provide organisations with new opportunities for targeting the areas which need real attention. By paying attention to the weak areas of SPI initiatives, we can develop a finer-grained framework which can help organisations to better implement SPI initiatives.

We encourage independent studies on this topic. This could increase confidence in our findings and also track changes in attitudes to SPI over time. We encourage further research into the extent and nature of the variation in business context, needs, and constraints of small and medium size software developing organisations, and their differences from larger organisations. We believe that a good understanding of these issues is vital for improving and creating effective SPI approaches. From the findings of this study, we have identified the following goals that we plan to pursue in future work:

• Collect additional data on the perceived value of different SPI practices by exploring the basis of practitioners' perceptions and experiences through more in-depth interviews and case studies.

• Conduct empirical studies to determine the relationship between the relative 'perceived value' of each practice and how its implementation is justified by return on investment.

• Determine the mechanics of encouraging practitioners to support those practices which have been perceived as being of 'low value' but are required to achieve a certain level of process maturity.

• Conduct a comparative study of the Malaysian and Vietnamese practitioners' views on SPI practices against those of their counterparts in the UK [61]. Such a comparative study will provide SPI managers with insight into the views of practitioners in a developed country (i.e. the UK) and in developing countries (i.e. Malaysia and Vietnam).

Acknowledgements

We are grateful to the participants and their companies for participating in this study. Lero is funded by Science Foundation Ireland under Grant Number 03/CE2/I303-1.


Appendix A. Demographics

Malaysia

ID   Company age (yrs)   Number of employees   Primary function                       Domain
1    >5                  20–199                In-house development                   Business systems and telecommunications
2    >5                  20–199                In-house development                   Business systems and real time systems
3    >5                  20–199                In-house and outsourced development    Software systems and windows based
4    3–5                 20–199                In-house and outsourced development    Real time systems
5    >5                  20–199                IT service provider                    Business systems
6    >5                  20–199                In-house development                   Windows based systems
7    >5                  20–199                In-house and outsourced development    Security systems
8    3–5                 20–199                In-house and outsourced development    Safety critical
9    >5                  20–199                In-house development                   Windows based systems and embedded systems
10   >5                  20–199                In-house development                   Data processing and systems software
11   >5                  20–199                In-house development                   Business systems
12   >5                  20–199                In-house development                   Real time systems
13   >5                  <20                   In-house development                   Windows based systems
14   3–5                 <20                   In-house and outsourced development    Safety critical
15   >5                  20–199                Outsourced development                 Business systems
16   >5                  20–199                In-house development                   Telecommunications
17   >5                  20–199                In-house and outsourced development    IT security
18   >5                  20–199                Other                                  Software systems
19   >5                  20–199                In-house development                   Software systems
20   >5                  20–199                In-house development                   Telecommunications
21   >5                  20–199                In-house development                   Telecommunications
22   >5                  20–199                In-house development                   Windows based systems
23   >5                  20–199                In-house development                   Windows based systems

Vietnam

ID   Number of employees   Number of participants   Titles of participants
1    80                    2                        Project manager, Team leader
2    70                    6                        Developer, Test leader, Programmer, Divisional head, Developer, QA manager
3    150                   2                        Chief Technology Officer, QA manager
4    150                   3                        Design team leader, R&D team leader, QA team leader
5    700                   2                        Project manager, Process quality manager
6    150                   2                        QA manager, Operation manager
7    50                    4                        QA manager, Project engineer, Project leader, Project leader
8    199                   2                        QA coordinator, QA manager

References

[1] M. Ali-Babar, L. Bass, I. Gorton, Factors influencing industrial practices of software architecture evaluation: an empirical investigation, in: Proceedings of the Third International Conference on Quality of Software Architectures (QoSA), 2007.
[2] N. Ashrafi, The impact of software process improvement on quality: in theory and practice, Information & Management 40 (7) (2003) 677–690.
[3] N. Baddoo, T. Hall, De-motivators of software process improvement: an analysis of practitioners' views, Journal of Systems and Software 66 (1) (2003) 23–33.
[4] E.J. Barry, T. Mukhopadhyay, S.A. Slaughter, Software project duration and effort: an empirical study, Information Technology and Management 3 (1–2) (2002) 113–136.
[5] J. Batista, D.F. Dias, Software process improvement in a very small team: a case with CMM, Software Process – Improvement and Practice (5) (2000) 243–250.
[6] S. Beecham, T. Hall, A. Rainer, Software process problems in twelve software companies: an empirical analysis, Empirical Software Engineering 8 (2003) 7–42.
[7] S. Biffl, A. Aurum, B. Boehm, H. Erdogmus, P. Grunbacher, Value-Based Software Engineering, Springer, 2005.
[8] J.G. Brodman, D.L. Johnson, What small businesses and small organizations say about the CMMI, in: Proceedings of the 16th International Conference on Software Engineering (ICSE), IEEE Computer Society, 1994.
[9] G. Caprihan, Managing software performance in the globally distributed software development paradigm, in: International Conference on Global Software Engineering, 2006, pp. 83–91.
[10] CCTA, Managing Successful Projects with Prince 2, Central Computer and Telecommunications Agency, 2001.
[11] M. Chrissis, M. Konrad, S. Shrum, CMMI Guidelines for Process Integration and Product Improvement, Addison Wesley, 2003.
[12] R. Conradi, A. Fuggetta, Improving software process improvement, IEEE Software (July/August) (2002) 92–99.
[13] H. Coolican, Research Methods and Statistics in Psychology, Hodder and Stoughton, London, 1999.
[14] M. DeBillis, C. Haapala, User-centric software engineering, IEEE Expert 10 (1) (1995) 34–41.
[15] K. El Emam, H.N. Madhavji, A field study of requirements engineering practices in information systems development, in: Second International Symposium on Requirements Engineering, 1995, pp. 68–80.
[16] A. Florence, Lessons learned in attempting to achieve software CMM level 4, CrossTalk (August) (2001) 29–30.
[17] D. Goldenson, J. Jarzombek, T. Rout, Measurement and analysis in capability maturity model integration models and software process improvement, CrossTalk, July 2003.
[18] T. Hall, S. Beecham, A. Rainer, Requirements problems in twelve software companies: an empirical analysis, IEE Proceedings – Software (August) (2002) 153–160.
[19] B. Hughes, Practical Software Measurement, McGraw-Hill, 2000.
[20] A. Jarvis, V. Crandall, INROADS to Software Quality, Prentice-Hall Inc., 1997. ISBN: 0132384035.
[21] J. Jiang, G. Klein, H.-G. Hwang, J. Huang, S.-Y. Hung, An exploration of the relationship between software development process maturity and project performance, Information & Management (41) (2004) 279–288.
[22] S.R. John, Critical success factors in software projects, IEEE Software (May/June) (1999) 18–23.
[23] V. June, C. Karl, B. Steven, C. Narciso, Requirements engineering and software project success: an industrial survey in Australia and the US, AJIS 13 (1) (2005) 225–238.
[24] K. Kautz, P.A. Nielsen, Implementing software process improvement: two cases of technology transfer, in: Proceedings of the 33rd Hawaii Conference on System Sciences, vol. 7, Maui, USA, 2000, pp. 1–10.
[25] B. Kitchenham, S.L. Pfleeger, Principles of Survey Research, Parts 1 to 6, Software Engineering Notes, 2001–2002.
[26] T.C. Lethbridge, Studying software engineers: data collection techniques for software field studies, Empirical Software Engineering 10 (2005) 311–341.
[27] W. Linda, K. Mark, R. Arun, Understanding software project risk: a cluster analysis, Information & Management 42 (1) (2004) 115–125.
[28] R. Louis, B. François, Project management information systems: an empirical study of their impact on project managers and project success, International Journal of Project Management 26 (2) (2008) 213–220.
[29] B. Martin, An Introduction to Medical Statistics, third ed., Oxford Medical Publications, 2000.
[30] F. McCaffery, G. Coleman, Analysing the cost of lightweight SPI assessments, in: Proceedings of the European Software Process Improvement (EuroSPI), Dublin, Ireland, 2008.
[31] O. Ngwenyama, P.A. Nielsen, Competing values in software process improvement: an assumption analysis of CMM from an organizational culture perspective, IEEE Transactions on Engineering Management 50 (1) (2003) 100–112.
[32] M. Niazi, An empirical study for the improvement of requirements engineering process, in: The 17th International Conference on Software Engineering and Knowledge Engineering, July 14–16, 2005, Taipei, Taiwan, Republic of China, 2005, pp. 396–399.
[33] M. Niazi, M. Ali-Babar, De-motivators for software process improvement: an empirical investigation, Software Process Improvement and Practice Journal (Perspectives on Global Software Development: Special Issue on PROFES 2007) 13 (3) (2008) 249–264.
[34] M. Niazi, M. Ali-Babar, S. Ibrahim, An empirical study identifying high perceived value practices of CMMI level 2, in: International Conference on Product Focused Software Process Improvement PROFES 2008, Italy, LNCS, vol. 5089, 2008, pp. 427–441.
[35] M. Niazi, M. Ali Babar, De-motivators for software process improvement: an analysis of Vietnamese practitioners' views, in: International Conference on Product Focused Software Process Improvement PROFES 2007, LNCS, vol. 4589, 2007, pp. 118–131.
[36] M. Niazi, K. Cox, J. Verner, An empirical study identifying high perceived value requirements engineering practices, in: Fourteenth International Conference on Information Systems Development (ISD'2005), Karlstad University, Sweden, 2005, pp. 15–17.
[37] M. Niazi, S. Shastry, Role of requirements engineering in software development process: an empirical study, in: IEEE International Multi-Topic Conference (INMIC03), 2003, pp. 402–407.
[38] M. Niazi, D. Wilson, D. Zowghi, A framework for assisting the design of effective software process improvement implementation strategies, Journal of Systems and Software 78 (2) (2005) 204–222.
[39] M. Niazi, D. Wilson, D. Zowghi, A maturity model for the implementation of software process improvement: an empirical study, Journal of Systems and Software 74 (2) (2005) 155–172.
[40] M. Niazi, D. Wilson, D. Zowghi, Critical success factors for software process improvement: an empirical study, Software Process Improvement and Practice Journal 11 (2) (2006) 193–211.
[41] J. Pereira, J.M. Verner, N. Cerba, M. Rivas, J.D. Procaccino, What do software practitioners really think about project success: a cross-cultural comparison, Journal of Systems and Software 81 (6) (2008) 897–907.
[42] F.J. Pino, F. Garcia, M. Piattini, Software process improvement in small and medium software enterprises: a systematic review, Software Quality Journal 16 (2) (2008) 237–261.
[43] B. Pitterman, Telcordia Technologies: the journey to high maturity, IEEE Software 17 (4) (2000) 89–96.
[44] J.D. Procaccino, J.M. Verner, Software project managers and project success: an exploratory study, The Journal of Systems and Software 79 (2) (2006) 1541–1551.
[45] J.D. Procaccino, J.M. Verner, K.M. Shelfer, D. Gefen, What do software practitioners really think about project success: an exploratory study, The Journal of Systems and Software 78 (2) (2005) 194–203.
[46] A. Rainer, T. Hall, Key success factors for implementing software process improvement: a maturity-based analysis, Journal of Systems & Software 62 (2) (2002) 71–84.
[47] SEI, Process Maturity Profile of the Software Community, Software Engineering Institute, 2002.
[48] I. Sommerville, Software Engineering, fifth ed., Addison-Wesley, 1996.
[49] I. Sommerville, J. Ransom, An empirical study of industrial requirements engineering process assessment and improvement, ACM Transactions on Software Engineering and Methodology 14 (1) (2005) 85–117.
[50] Standish-Group, Chaos – The State of the Software Industry, Standish Group International Technical Report, 1995, pp. 1–11.
[51] Standish-Group, Chaos: A Recipe for Success, Standish Group International, 1999.
[52] Standish-Group, Chaos Report, 2006.
[53] M. Staples, M. Niazi, R. Jeffery, A. Abrahams, P. Byatt, R. Murphy, An exploratory study of why organizations do not adopt CMMI, Journal of Systems and Software 80 (6) (2007) 883–895.
[54] G. Stark, A. Skillicorn, R. Ameele, An examination of the effects of requirements changes on software maintenance releases, Journal of Software Maintenance: Research and Practice 11 (1999) 293–309.
[55] H. Sun-Jen, H. Wen-Ming, Exploring the relationship between software project duration and risk exposure: a cluster analysis, Information & Management 45 (3) (2008) 175–182.
[56] A. Sweeney, D.W. Bustard, Software process improvement: making it happen in practice, Software Quality Journal 6 (4) (1997) 265–273.
[57] A. Taylor, IT projects: sink or swim, Computer Bulletin (January), 2000.
[58] The-Royal-Academy-of-Engineering, The Challenges of Complex IT Projects, The Report of a Working Group from The Royal Academy of Engineering and The British Computer Society, UK, 2004. ISBN 1-903496-15-2.
[59] D. Trewin, Small Business in Australia: 2001, Australian Bureau of Statistics Report 1321.0, 2002.
[60] J. Verner, W.M. Evanco, In-house software development: what software project management practices lead to success?, IEEE Software 22 (1) (2005) 86–93.
[61] F.G. Wilkie, D. McFall, F. McCaffery, An evaluation of CMMI process areas for small to medium-sized software development organisations, Software Process Improvement and Practice 10 (2005) 189–201.
[62] D. Zowghi, N. Nurmuliani, A study of the impact of requirements volatility on software project performance, in: Ninth Asia–Pacific Software Engineering Conference, 2002, pp. 3–11.
[63] D. Zowghi, N. Nurmuliani, S. Powell, The impact of requirements volatility on software development lifecycle, in: Software Engineering Conference, Proceedings 2004, Australian, 2004, pp. 28–37.
[64] O. Zwikael, A. Sadeh, Planning effort as an effective risk management tool, Journal of Operations Management 25 (4) (2007) 755–767.
