
DEVELOPMENT AND VALIDATION OF AN INSTRUMENT TO OPERATIONALIZE

INFORMATION SYSTEM REQUIREMENTS CAPABILITIES

Alex Z. Pettit

Dissertation Prepared for the Degree of

DOCTOR OF PHILOSOPHY

UNIVERSITY OF NORTH TEXAS

May 2014

APPROVED:

Leon Kappelman, Major Professor
Robert Pavur, Committee Member
Brian O’Connor, Committee Member
Suliman Hawamdeh, Chair of the Department of Library and Information Sciences
Harold Totten, Dean of the College of Information
Mark Wardell, Dean of the Toulouse Graduate School


Pettit, Alex Z. Development and Validation of an Instrument to Operationalize

Information System Requirements Capabilities. Doctor of Philosophy (Information

Science), May 2014, 146 pp., 46 tables, 5 figures, references, 223 titles.

As a discipline, information systems (IS) has struggled with the challenge of aligning its product (primarily software and the infrastructure needed to run it) with the needs of the organization it supports. This has been characterized as the pursuit of alignment of information technology (IT) with the business or organization: the requirements of the organization are gathered first; these guide the creation of the IS requirements, which in turn guide the creation of the IT solution itself. This research is primarily focused on developing and validating an instrument to operationalize such requirements capabilities.

Requirements capabilities applied during the development of software or the implementation of a specific IT solution are referred to as software requirements capabilities or, more commonly, systems analysis and design (SA&D) capabilities. This research describes and validates an instrument for SA&D capabilities, assessing content validity, construct validity, and internal consistency, and performing an exploratory factor analysis. SA&D capabilities were expected to coalesce strongly around a single dimension. Yet in validating the SA&D capabilities instrument, it became apparent that SA&D capabilities are not the unidimensional construct traditionally perceived. Instead, four dimensions appear to underlie SA&D capabilities, and these are associated with alignment maturity (governance, partnership, communications, and value). These subfactors of requirements capabilities are described in this research and represent distinct capabilities critical to the successful alignment of IT with the business.


Copyright 2014

by

Alex Z. Pettit


ACKNOWLEDGEMENTS

To my mother (Sophie) and father (Alton), my foundation and strength.

To my children: Zachary, Alton, Emma, AJ, and Jenna, my happiness and joy.

To Jim and Barbara Hruby and Cordell and Gloria Ingman, my friends

and mentors.

Thank you, all of you, for all you have done.


TABLE OF CONTENTS

Page

LIST OF TABLES ............................................................................................................ vi

LIST OF FIGURES .......................................................................................................... ix

Chapters

1. INTRODUCTION ....................................................................................... 1

Theoretical Background .................................................................. 5

Problem Statement ......................................................................... 9

Primary and Secondary Research Questions ............................... 11

Research Design ........................................................................... 12

Conclusion .................................................................................... 13

2. LITERATURE REVIEW ........................................................................... 15

The Gap within RA ........................................................................ 16

Where this Fits in the Literature Stream ........................................ 17

The Differences between Software Development Practices and Requirements Capabilities ............................................................ 18

RA through Reductionism ............................................................. 20

RA through Systems Theory ......................................................... 23

Efforts to Balance Reductionism and Systems Approaches ......... 26

Enterprise Architecture as an Approach to Requirements Decomposition in Context ............................................................. 26

Relevance of Holistic RA Decomposition for Commercial

Software Implementation .............................................................. 30

The Gap Remains Between IT and the Business ......................... 33

Evolution of Maturity Models in Software Development ................ 35


Requirements as Distinct from Software Development Capabilities ................................................................................... 36

RA Maturity Defined ...................................................................... 38

Better Articulating the Gap in RA .................................................. 39

Separating RA Capabilities from Software Development Capabilities ................................................................................... 41

Research Proposal ........................................................................ 42

Research Questions ...................................................................... 44

Summary ....................................................................................... 47

3. METHODOLOGY .................................................................................... 48

Population and Sample ................................................................. 50

Research Strategy ........................................................................ 51

The Importance of Instrument Validation ...................................... 52

Instrument Validation Process....................................................... 52

Content Validity: RA and Design Capabilities Construct ............... 53

Summary ....................................................................................... 61

4. DATA ANALYSIS AND RESULTS ........................................................... 62

Introduction ................................................................................... 62

Survey Execution and Data Collection .......................................... 63

Demographics ............................................................................... 65

Nonresponse Bias ......................................................................... 76

H1: Content Validity ...................................................................... 78

H1: Criterion Validity ..................................................................... 79

H1: Reliability Analysis .................................................................. 80

H1: Construct Validity .................................................................... 83


H2: Exploratory Analysis ............................................................... 85

H2: Analysis of Dimensionality ...................................................... 87

Summary ..................................................................................... 100

5. DISCUSSION ........................................................................................ 101

Discussion of Research Findings ................................................ 101

Theoretical Implications .............................................................. 101

Limitations ................................................................................... 103

Areas for Future Research .......................................................... 105

Conclusion .................................................................................. 106

APPENDIX – SURVEY INSTRUMENT ....................................................................... 108

REFERENCES ............................................................................................................ 128


LIST OF TABLES

Page

1. Question Mapping Matrix .................................................................................... 58

2. How Will You be Answering the Questions? ...................................................... 66

3. Organization Contains Organizational Change Management Practice ............... 67

4. Extent of Centralization for IT Budget and IT Operations in Organization .......... 67

5. Does the Organization Have an EA Practice? .................................................... 67

6. Is Organization’s EA Practice a Part of the IT Department? ............................... 67

7. Approximate Budget of Organization’s EA Practice ............................................ 68

8. Number of Employees in the Organization’s EA Practice ................................... 68

9. Number of Years Organization’s EA Practice Has Existed ................................. 68

10. Organization In-house Software Development/Maintenance Practice ................ 69

11. In-house Development Percentage of Total Software Development and Maintenance ...................................................................................... 69

12. IS Specifies a Comprehensive Procedure for Software Development and Maintenance ....................................................................................... 69

13. Organizational IT Department Utilization ............................................ 71

14. IT Department Manager’s Supervisor ................................................................. 71

15. Number of IT Departments in Organization ........................................................ 72

16. Total IT Operating Budget in Last Fiscal Year .................................................... 72

17. Number of IT Department Employees ................................................................ 73

18. IS Department Aspires to Software Development Practices of Software Engineering Institute’s (SEI) CMMI for Software Development .......................... 73

19. SEI CMMI Practice Level for Organization ......................................... 73

20. Description of Organization ................................................................................ 74


21. Number of Employees in Organization ............................................................... 74

22. Organization’s Gross Revenue in the Last Fiscal Year ...................................... 74

23. Organization’s Location ...................................................................................... 75

24. Participant’s Primary Job Title ............................................................................ 75

25. Supervisor’s Primary Job Title ............................................................................ 75

26. How Will You be Answering the Questions? ...................................................... 77

27. The Organization Has an Organizational Change Management Practice .......... 77

28. Degree of Centralization for IT Budget and IT Operations in the Organization ... 77

29. Does the Organization Have an EA Practice? .................................................... 77

30. Reliability Statistics ............................................................................................. 80

31. Summary Items Statistics ................................................................................... 81

32. Organization’s Requirements, Capabilities, and Practices for SA&D Context .... 81

33. Force 1 Factor with Varimax Rotation ................................................................ 83

34. Commonalities .................................................................................................... 84

35. KMO and Bartlett’s Test ..................................................................................... 86

36. Total Variance Explained Force 1 Factor ........................................................... 88

37. Initial Eigenvalues ............................................................................................... 89

38. Total Variance Explained .................................................................................... 90

39. Rotated Component Matrix ................................................................................. 91

40. Second Factor Analysis ...................................................................................... 92

41. Third Factor Analysis .......................................................................................... 94

42. SEM Fit Indices .................................................................................................. 95

43. Governance Dimension Mean, Standard Deviation, and Factor Load ................ 99

44. Partnership Dimension Mean, Standard Deviation, and Factor Load ................. 99


45. Communication Dimension Mean, Standard Deviation, and Factor Load .......... 99

46. Factor: Value/Competency Measured ................................................................ 99


LIST OF FIGURES

Page

1. Federated model of the enterprise ........................................................................ 6

2. Department/service driven architecture (with permission of John A. Zachman) ..................................................................................................... 28

3. The research instrument operationalizing requirements practices ...................... 54

4. Scree plot ........................................................................................................... 90

5. Confirmatory model ............................................................................................ 98


CHAPTER 1

INTRODUCTION

“Alignment, alignment, my kingdom for alignment” summarizes the top

information technology (IT) management concerns from the 2003 to 2013 surveys of the

Society for Information Management. Technology advancement, changes to business

requirements, and the evolution of business management practices have not changed

the need (and, by extension, the perceived failure) of internal IT organizations to align

with the business.

Yet with this obsession over alignment, IT departments seem less a core

competency of the organization than ever before. This is surprising since much of what

we consider to be software development is actually application domain problem solving

using software as the way to provide the solution (Blum, 1989; Carroll & Swatman,

1998). Many believe IT no longer provides a lasting strategic advantage (Carr, 2003).

Application implementation approaches and methods emphasize changing the process

to match the software product (Young, 1996). The growth of global outsourcing and

cloud computing has led many to suggest that the role of the chief information officer

(CIO) will disappear (Smith & McKeen, 2006). The diminishment of the strategic

importance of IT is further debated in studies of changes in the reporting relationship of

the CIO within the organization (Luftman, Kempaiah, & Nash, 2006). The failure rate of

information system (IS) development projects continues to contribute to this perception.

Numerous studies and popular news stories articulate the failures of software products

and the associated cost to the business community and the public (Jørgensen &

Moløkken-Østvold, 2006). Custom software application development has given way to

application implementation approaches and methods, placing the commercial software


vendor as the trusted IT partner of the business and diminishing the role of the internal

IT organization.

The most important part of software development is deciding what to build

(Brooks, 1987). This is easy to state, but when making choices in uncertainty under

conditions of scarcity, we need the capability to understand the overall goal in the

largest context to define value and make correct decisions. Due to the scarcity of

resources (time, money, manpower), system boundaries (defined as the line separating

what is to be considered within the development scope of a system and what is

considered exogenous) have been narrowed, with more specialized systems developed

that provide more features, fewer defects, and better matching of the narrowly defined

specifications than ever before (Jones, 2008).

Much of this improvement is a consequence of the various IT process

improvement efforts such as the Software Engineering Institute’s (SEI’s) capability

maturity model integration (CMMI), the IT services improvement efforts such as the

Information Technology Infrastructure Library (ITIL), the IT governance improvement

efforts found in the Control Objectives for Information and Related Technology (COBIT),

and the project management improvement efforts defined by the Project Management

Body of Knowledge (PMBOK; Gibson, Goldenson, & Kost, 2006; Hill & Turbitt, 2006;

Jugdev & Muller, 2005). In each case, the system boundary is defined as the set of elements which describe a discrete business process, reduced to as narrow a scope as

possible, leaving those things outside the system boundary as actors to be interacted

with in a pre-defined way.

This decomposition (or reductionism) approach to problem solving is illustrated in

the implementation of popular software development methodologies, best represented


in the adoption of the process improvement program of CMMI, first developed in the late

1980s. The definition of CMMI is one in which contains essential elements of effective

processes and a defined improvement plan, producing improvements in quality and

effectiveness (Chrissis, Konrad, & Shrum, 2003; Paulk, 1995). This model integrates the

organization’s business requirements and the design capabilities of an IS with the more

narrowly defined scope to produce higher quality software. In part, this is due to

deemphasizing the larger requirements analysis (RA) and design processes which precede the creation of software, as these are considered exogenous.

Highly decentralized organizations tend to optimize process performance at the

expense of overall organizational effectiveness. Thus, the impact of the omission of

explicit requirements capabilities in the CMMI model is lessened by the tendency to

optimize performance of the smallest process. This narrowing of the system boundary

makes requirements definition simpler, improving both the quality of the software and

the efficiency of producing it. This is not to say requirements capabilities are not

recognized as essential aspects of software development (Nguyen & Swatman, 2003).

Many researchers write that the challenge to correctly articulate requirements is the

most significant problem facing software development today (Brooks, 1987; Jones,

1996; Kappelman, 2010).

Requirements articulation comes in two general contexts: that of the process or

sub-process to be defined, and that of the organizational context the system exists

within. Brooks (1975, 1995) captured this when he wrote of the essential importance of

getting requirements right in both these contexts, labeling them as the management of

accidents versus the management of essence. Efforts to improve software

development can be seen in the reduction of accidents, defined as coding bugs or


software defects seen in software products (Jones, 2000). This improvement is a result

of the application of software process improvement plans for either compliance needs or

to address software quality issues (Layman, 2005). From the Society for Information

Management (SIM) survey of 2007, more organizations have aspired to the

development practices of CMMI since the same assessment in 1997 (Kappelman &

Salmans, 2008). The number of CMMI Levels 2, 3, and 4 shops has increased, with the

percentage of software defects falling proportionally (Jones, Subramanyam, &

Bonsignour, 2011). Software has improved in quality and delivery speed, significantly

improving the automation of specific business processes (Berry, 2008).

This virtuous cycle of reducing accidents and narrowing scope has taken the

focus away from the other challenge of software development, that of capturing the

essence. Brooks (1987) states that “the hardest single part of building a software

system is deciding precisely what to build. No other part of the work so cripples the

system if done wrong. No other part is as difficult to rectify later” (p. 199). Many other

researchers have articulated the difficulties of capturing requirements with the

identification of the system boundary as the first step in that process (Cheng & Atlee,

2007; Damian & Chisan, 2006). Others have found that although tools of higher quality

are being developed, these tools do not meet the needs of the business (Kappelman &

Salmans, 2008). This is further supported in the continued alignment gap between the

organization and the IT department, the failure of most organizations to articulate a

strategic plan in the form of an enterprise design, and the accentuation of business silos

which articulate plans out of context to the overall needs of the enterprise (Luftman &

Kempaiah, 2008; Ross, Weill, & Robertson, 2006; Zachman, 1997). In the practice of


software development, it has been stated that delivering the wrong software is often

worse than delivering no software at all (Gerush, 2010).

Theoretical Background

The classic production function defined by Cobb and Douglas in 1928 (as cited in

Meeusen & van den Broeck, 1977) defines the relationship between inputs and outputs,

where labor and capital combine to create what is to be produced as represented by the

equation LK=F, where L represents labor, K represents capital, and F represents the

product to be created. With the advent of IT, this equation has been altered to

incorporate information, which can substitute for either labor or capital depending upon

the product, to form LK(i)=F. This has motivated companies to consume more

information and fewer things (labor or capital), creating a struggle in classical production

theory (Cleveland, 1982). Consequently, investment in IS has increased in importance

and represents an increasingly significant percentage of corporate budgets (Harter &

Slaughter, 2000; Kappelman, McLean, Luftman, & Johnson, 2013).
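For reference, a sketch of the conventional Cobb-Douglas form, of which the LK=F shorthand above is a simplification, can be written as

\[ F = A\,L^{\alpha}K^{\beta}, \qquad \alpha + \beta = 1 \text{ under constant returns to scale,} \]

and one possible reading of the information-augmented notation LK(i)=F (an interpretation offered here, not a formula from the cited sources) is

\[ F = A(i)\,L^{\alpha}K^{\beta}, \]

where A is total factor productivity, \(\alpha\) and \(\beta\) are the output elasticities of labor and capital, and A(i) treats information i as a productivity-augmenting factor able to substitute for labor or capital.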

Not all IT services are equal, however. Through the evolution of IT, some services have become commodities, while others (like software development) remain a

special order process, where the needs of the end user are met through the creation of

software solutions (Carr, 2003; Fowler & Highsmith, 2001).

Using the State of Oklahoma IT services as an illustration, a categorization of IT

services appears in Figure 1. At the bottom are the enterprise services, defined as those for which differences among solutions make no difference to the end user.


Figure 1. Federated model of the enterprise. [The diagram shows three layers: an agency layer, where agency-specific services remain in the agency; an enterprise layer of shared application services and centralized applications utilized by multiple agencies; and an infrastructure layer of centralized technology services, all supported by information technology processes (ITIL and PMBOK).]

The classic example of this is telephone service, where dial tone provided by one carrier

is indistinguishable from another carrier’s dial tone. As much as 60% of the budget for

the State of Oklahoma’s IT services fits into this category. This would include compute, storage, network connectivity, security, email, messaging, and user support services (Capgemini, 2010). Service quality in this category equates to service availability,

where if the service is too often unavailable it is deemed to be low quality, leaving price

as the sole distinguishing factor among solutions. In the middle are shared services,

which are most often the back-office processes of accounting, procurement, and human

resource management support systems, common to all functions but not as universal as

the services at the enterprise level. At the very top are those IT solutions which


specifically support a discrete and unique business service, most often a bespoke

software solution written specifically to optimize a function unique to the delivery of the

product or service the business performs.

It is at the top of the pyramid where alignment of IT and the business primarily

happens, with software development as the primary IT service which develops and

supports those business-specific activities which really make a difference. The capture

of the essence in the overall organizational context is essential to achieving IT to

business alignment. Yet with the gap between software delivery and the specific

business need cited as the weakest link in the chain of software engineering practices,

this seems to be a continuing problem in software development and cited as the cause

of most IT project failures (Jones et al., 2011; Kamata & Tamai, 2007). This alignment

begins with defining the business requirements which represent the high-level

enterprise objectives and include the past, present, and future conditions or capabilities

in an enterprise (Gerush, 2010). One approach to accomplishing this is through

enterprise architecture (EA) management, using the structure of EA to frame and

capture requirements of the larger context of the overall enterprise, defining a larger

overall system boundary which the smaller system boundaries of the discrete business

process solutions must observe (Lange, Mendling, & Recker, 2012; Langenberg &

Wegmann, 2004). EA, the noun, is the output of the RA process when applied to

capturing the objectives of the larger organization; it defines the present and future state

in a formalized way and maps out the integration of all the parts for the purpose of

managing change (Aziz, Obitz, Modi, & Sarkar, 2006; Kappelman, 2010; Zachman,

1999).


EA can be thought of as the IT tool for establishing this larger system boundary

and framing a learning organization. The importance of establishing a learning

organization is a relatively recent field of research in management science, with few

methodologies available for measuring organizational learning capabilities (Goh &

Richards, 1997; Marsick & Watkins, 2003). One of the best known methodologies has

been described by Senge (1990), articulating the essential elements of systems thinking

within the contexts of personal mastery, mental models, shared vision, and team

learning. All but personal mastery relate to elements of requirements capabilities, with

mental models, shared vision, and team learning represented as artifacts, practices, or

benefits of an organization wishing to produce an aligned IT services organization from

a systems thinking approach. EA provides the tools to actualize the IT learning in the

larger organizational context, and it is expected that organizations with better tools will

also have higher relative organizational learning maturity and enjoy greater IT to

business alignment.

EA and system analysis and design (SA&D) capabilities together define the

requirements capabilities needed to achieve IT-business alignment, IS success, and the development of complementary IT and business strategies, and to provide the models

and processes used to support the learning organization. The difference between SA&D

as applied to software and EA is the difference in context. Although many of the same

capabilities and practices are used in both, the development of requirements in the

practice of software development is generally done out of context with the overall goals

of the enterprise, akin to how overall goals of the enterprise are developed without

consideration of previously derived requirements. Requirements live outside the specific

software development project (before, during, and after completion), with no checklist


available to ensure delivery of quality software requirements, resulting in relatively

immature requirements capabilities (Gerush, 2010). Simplification of requirements to an

analysis of fit between user activity and software support may represent the gap in the

alignment of IT to the business, the difference between successful IT projects and those

which are deemed unsuccessful, a missing antecedent in user acceptance, and even

offer insight into the development of a learning organization. Development of models to

articulate the gap between SA&D in identifying software development requirements and

EA requirements of the overall enterprise could provide the bridge for organizations to

track and operationalize the vision of enterprise transformation.

Problem Statement

As exploratory research moves to explanatory research, it is essential that

instruments be validated to strengthen earlier empirical findings and further the research

stream (Straub, 1989). As our maturity models for software development have improved

and have increasingly been applied to software application development activities, we

have seen a corresponding improvement in the quality of software development, with

software containing more features and fewer bugs produced as a direct consequence of

the accountability applied through benchmarking against maturity models (Beecham,

Hall, & Rainer, 2003a; Harter & Slaughter, 2000).

However, this improvement in the quality, timeliness, and cost of software

development (the management of the accidents in the creation of software) has not

improved the development of software which meets the business needs (alignment of

the technology that supports or actualizes the strategy of the organization). Improved

capabilities to elicit functional requirements (those which indicate how the system

should work) have not translated to improvement in capabilities to develop non-


functional requirements (those which identify what organizational priorities the software

is to support; Charette, 2005).

This gap between the maturity of developing functional requirements and non-

functional requirements may be correlated with other gaps identified in other research

streams. Alignment between users and IT tools has been studied by several authors

who have created alignment maturity instruments (Luftman & Brier, 1999),

measurements of IS success (DeLone & McLean, 2003; Seddon, 1997), and user

acceptance models (Mathieson, 1991; Venkatesh, Morris, Davis, & Davis, 2003). The

gap between user acceptance and non-acceptance, between the successful and

unsuccessful IT solutions, between business-IT alignment and misalignment may be

traced to this gap in requirements.

Additionally, this gap may correspond with levels in maturity in EA capabilities

and practices (where organizational requirements are captured) and maturity in the

development of the learning organization (how companies establish a culture of

continuous improvement). It is expected that a narrower gap between functional and

non-functional requirements will correspond with a higher maturity level in EA and the

learning organization.

Maturity models are derived from stage theories and assume that all

organizations go through distinct stages over time (Nolan, 1973). Stage hypothesis is an

important concept in this research, with stage models proposed to describe a variety of

phenomena. Stage hypothesis assumes predictable patterns exist as an organization

matures much as those found in the growth of biological entities (Greiner, 1972). This

concept has been extended to include models for software development maturity such

as the CMMI (Chrissis et al., 2003), EA maturity (Ross et al., 2006), IT and business


alignment maturity (Luftman & Kempaiah, 2007), and learning organizations (Goh &

Richards, 1997). The relationship between these maturity levels and RA capabilities in

the small (SA&D) and large (EA) context has not been researched.

Building upon an SA&D instrument used in a previous study (Kappelman &

Salmans, 2008), this research will provide a comparison between theory and practice

while validating the instrument and refining the sub-constructs within the overall

construct of RA capabilities (Bagozzi, 1980). It is the lack of validated measures in new

fields of research like RA where confirmatory research can dispel concerns about the

validity of the findings and clarify the next exploratory steps researchers should

take toward establishing a comprehensive descriptive maturity model (Bagozzi, 1980;

Straub, 1989).

This study focused on validating the operationalization of SA&D capabilities in

the small requirements context of software development. It is through this

operationalization of SA&D requirements efforts and activities that the relationships among IS success, IT alignment, software engineering capabilities, user acceptance, and the organization’s learning maturity level were clarified, providing the foundation for future research.
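To make this operationalization concrete, a minimal sketch of one standard validation step, internal consistency via Cronbach’s alpha, follows. The function and the sample matrix are illustrative assumptions, not the study’s actual analysis code or data:

import numpy as np

def cronbach_alpha(items):
    """Internal consistency of a scale; items is an (n_respondents, n_items) array."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1)      # variance of each item
    scale_variance = items.sum(axis=1).var(ddof=1)  # variance of the summed scale
    return (k / (k - 1)) * (1 - item_variances.sum() / scale_variance)

# Hypothetical Likert-scale responses (5 respondents x 4 items), for illustration only.
responses = np.array([[4, 5, 4, 4],
                      [2, 3, 2, 3],
                      [5, 5, 4, 5],
                      [3, 3, 3, 2],
                      [4, 4, 5, 4]], dtype=float)
print(f"Cronbach's alpha = {cronbach_alpha(responses):.2f}")

By convention, values of roughly 0.7 or higher are taken as acceptable internal consistency for a scale.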

Primary and Secondary Research Questions

The primary research question is: can requirements efforts and activities be

operationalized within the SA&D context of developing a discrete software solution?

The secondary research question is: do requirements efforts and activities in the context

of developing a discrete software solution constitute a unidimensional or

multidimensional construct?


These questions were answered by building on an earlier instrument (Kappelman

& Salmans, 2008) used to measure SA&D efforts and activities and were expanded to

ask questions in two contexts: first, with respect to the goals of the specific software solution (SA&D);

second, with respect to the overall business objectives of the organization (EA).

Although focused on the former, this research proposed a new paradigm that RA is, in

fact, the combination of RA in an organization-wide (EA) context combined with the RA

for the discrete software solutions being developed (SA&D). Although IT historically has

conducted requirements activities in the context of a particular system development

project, EA is basically the expansion of that activity so that it is conducted in the

context of the entire enterprise (Chen & Pozgay, 2002; Finkelstein, 2004). “EA is

software requirements on steroids” (Kappelman, 2010, p. 166).

This research only focused on requirements efforts and activities within the

context of developing a discrete software solution. Such efforts and activities are a

subset of the overall RA construct as not all dimensions may yet be identified. As part of

the instrument validation, the instrument was tested for unidimensionality; however, it was

hypothesized that, at best, unidimensionality would be weakly supported.
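A common test of unidimensionality examines the eigenvalues of the item correlation matrix; under the Kaiser criterion, each eigenvalue greater than 1 indicates a factor to retain. The sketch below illustrates that general technique on simulated data and is not the study’s actual analysis:

import numpy as np

def kaiser_factor_count(responses):
    """Count factors with eigenvalue > 1 in the item correlation matrix."""
    corr = np.corrcoef(responses, rowvar=False)             # item-by-item correlations
    eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]   # descending order
    print("Eigenvalues:", np.round(eigenvalues, 2))
    return int((eigenvalues > 1.0).sum())

# Simulated Likert responses (100 respondents x 12 items), for illustration only.
rng = np.random.default_rng(0)
responses = rng.integers(1, 6, size=(100, 12)).astype(float)
print("Factors retained:", kaiser_factor_count(responses))

With real survey data, a count of one would support unidimensionality, whereas a count of four would be consistent with the multidimensional result this study ultimately reports.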

Research Design

The instrument used in the 2007 Society for Information Management Enterprise

Architecture Working Group (SIMEAWG) study of RA and design capabilities was

refined and used to collect data from system professionals in SIM. The SIMEAWG

provided the Delphi and pilot testing group, and the results and comments were used to

evaluate question validity prior to general distribution and actual data collection.

Traditional software requirements development produces too much detail for

stakeholders to understand and too little detail for developers to have a fully articulated


requirements specification (Gerush, 2009). Prioritization of feature development and

alignment of IT with business objectives and value have not been explicitly tied

together, with neither methodological approach nor maturity index available to measure

how well this is performed. Consequently, poor requirements often result in project

failure or system rejection (Beecham, Hall, & Rainer, 2003b; Cheng & Atlee, 2007).

Developing requirements which explicitly define requisite features while linking to

business outcome and value is difficult to do, with business requirements living on long

past the end of the development of the application (Gerush, 2010). The cost of finding

and fixing bugs or defects in software is the largest single expense element in the

history of software, with the majority of delivered defects originating in requirements

activities and development, and bug repairs starting with requirements and continuing

through development and deployment (Jones, 2002). Moreover, prioritization of features in the context of business priorities is seldom done, and as the business reprioritizes, so too must development projects if alignment is to be maintained. It is

not uncommon for executive leadership changes in an organization to have no impact

upon the system development activities, and yet the change of executive leadership

signals a change in the desired outcomes of the organization, as it is the executive

board who is held accountable for the performance of the organization (Carver, 2001).

This lack of impact illustrates the gap between the requirements development of

systems and the overall requirements of the organization. Traditional software project-

focused performance metrics of quality, cost, and timeliness of solution delivery

contribute to this alignment gap and may even exacerbate it.
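The cost dynamics behind this gap can be made concrete with a small worked example. The phase multipliers below are assumed for illustration, in the spirit of the cost escalation described by Boehm (1991) and Jones (2002); they are not figures from this study:

# Assumed relative cost to fix one defect, by the phase in which it is found.
REPAIR_COST = {"requirements": 1, "design": 5, "coding": 10,
               "testing": 50, "production": 150}

def repair_bill(defects_by_phase):
    """Total relative repair cost given defect counts per discovery phase."""
    return sum(REPAIR_COST[phase] * count
               for phase, count in defects_by_phase.items())

# The same 100 requirements defects, caught early versus escaping to production.
print(repair_bill({"requirements": 100}))  # 100
print(repair_bill({"production": 100}))    # 15000

Under these assumed multipliers, a requirements defect that escapes to production costs two orders of magnitude more to repair than one caught during requirements work.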


Conclusion

This study and the data collected help fill these gaps in our understanding of the

links between software development and IT to business alignment. It accomplished this

conceptually by separating RA capabilities into two distinct categories (SA&D for the application and EA for the organization, including how the organization’s applications “fit” together) and by

operationalizing those capabilities in the context of the development of a discrete

software solution (SA&D). Identification of potential dimensions or subfactors in such

requirements efforts and activities in the context of software development, and the

exploration of these factors using existing models of IT to business alignment, will

surface potential leverage points to ensure successful alignment and possibly a

structure which requirements capabilities must satisfy to be considered comprehensive.

This introduction is followed by a comprehensive literature review to identify the

key constructs of the research model (IT alignment, EA maturity, RA capabilities in small and large contexts, organizational maturity, and IS success). Theoretical

foundations for these concepts and the articulation of the instrument will be discussed,

with the proposed methodology for the execution and analysis of the survey results.

Results of the survey are presented with limitations and methodological issues

identified, concluding with an interpretation of the results and implications for both the

research community and IT practitioners.


CHAPTER 2

LITERATURE REVIEW

For four decades, information technology’s (IT’s) greatest good has been achieving and sustaining alignment of technology with the needs of the business

(Luftman & Kempaiah, 2007). Many factors have affected the business-IT relationship

over the years, including the increased focus on compliance support, high-profile

failures of IT initiatives, disappointing returns on IT investment, and the perception that

technology is no longer a strategic advantage (Carr, 2003; Jørgensen & Moløkken-

Østvold, 2006; Melville, Kraemer, & Gurbaxani, 2004). Most of the popular information

system development (ISD) and implementation methods emphasize changing the

business processes to match the software product, resulting in the perception that IT

should not be a core competency of the business (Brown, Hagel, & Carr, 2003). The

growth of global IT outsourcing as a practice supports this perception, with IT practice

leading research into identification of value or the development of performance metrics

to ascertain IT success (DiRomauldo & Gurbaxani, 1998; Leiblein, Reuer, & Dalsace,

2002). Others believe that in the hyper-competitive marketplace of today an

organization’s ability to develop high quality software at low cost is one of the most

important success factors in business (Harter & Slaughter, 2000). From a chief

information officer (CIO) survey, IT professionals believe they are building better tools

but not the tools the business needs (Kappelman & Salmans, 2008). Yet software

project success metrics continue to revolve around the three traditional measurements

of cost, quality, and timeliness (Melville, Kraemer, & Gurbaxani, 2004).

Software itself has become one of the most widely used products in human

history but suffers one of the highest failure rates due to poor quality (Jones et al.,


2011). Authors have estimated the cost of IT project failure at as much as $70 billion

each year, with an IT project success rate of 34% across 13,000 software projects undertaken

at Fortune 500 firms (Charette, 2005; Nelson, 2005, 2007). Studies of causes of these

failures due to quality, cost, or timeliness name deficiencies in the defining of the user’s

requirements as the primary contributing factor (Kamata & Tamai, 2007; Kashanchi &

Toland, 2006). The age-old challenge of requirements analysis (RA) holds the key to IT

value and relevance in the 21st century.

The Gap within RA

Brooks characterized the essence of software as a complex construct of

interlocking concepts and hypothesized that requirements development difficulties can be framed as those addressing the essence of the need being met or the

accidents which occur in the process of software development (Brooks, 1987, 1995;

Mays, 1994). Historically, these two focuses have been researched independently.

Software development process improvement research has concentrated on identifying

how to make software development easier and with fewer defects (Xia & Lee, 2004).

User acceptance, task-technology fit, and information system (IS) success models have

focused on the achievement of the essence (or outcome) of the development process

(Benbasat & Barki, 2007; Dillon & Morris, 1996; Goodhue & Thompson, 1995). This

division in academic research mirrors that in applied practices, with metrics focused on

software quality separate from metrics focused on user satisfaction with the resulting technology solutions and in the categorization of information systems (IS) research

articles (McNee, Riedl, & Konstan, 2006; Sidorova, Evangelopoulos, Valacich, &

Ramakrishnan, 2008). Brooks (1987) and others identify this duality without referring to

it as a gap, seeing both of these together as the elements of complexity associated with


the development of effective requirements necessary for optimal software development

(Sawyer, Sommerville, & Viller, 1998). Software development practitioners have also

identified this duality without recognizing it as a gap and have attempted to fix this using

integrated software development methodologies, such as seen in Agile methods

(Wendorff, 2002).

Where This Fits in the Literature Stream

Orlikowski and Iacono (2001), and later Benbasat and Zmud (2003) wrote of the

classification and core properties of the IS discipline. They called for a more disciplined

approach to IS research to set boundaries and structure. This is necessary because IS

research is both behavioral and analytical, with social needs being defined with out-of-

context technical requirements of systems and development practices (Davis, 1982).

“We conceptualize the IT artifact as the application of IT to enable or support some

task(s) embedded within a structure(s) that itself is embedded within a context(s)”

(Benbasat & Zmud, 2003, p. 188). Benbasat and Zmud go on to categorize IS research into capabilities, practices, usage, and impact surrounding the IT artifact itself: the

technology solution used to accomplish a task within a larger task, structure, and

context (Alter, 2002). In literature where this classification is used to categorize

academic articles by subject matter, a majority of the articles pertain to managerial

issues, with fewer addressing the activities of designing and building the hardware and

software at the core of the artifact (Sidorova et al., 2008). Though not the only type of

research of the IT artifact, requirements capability studies clearly fall within the bounds

and are at the very core of this definition of the IT artifact.


The Differences between Software Development Practices and

Requirements Capabilities

A distinction is necessary between practices (method and process) and

capabilities (use), as often these terms are used interchangeably but really have

different meanings. Practices describe how to do something. A process is a prescriptive

set of steps meant to be specifically followed to achieve an outcome (a recipe), whereas

a method can be thought of as an approach, style, or set of assumptions used to

achieve a goal but is not prescriptive in nature (Neufeldt & Guralnik, 1997). Capabilities,

on the other hand, refer to how well or capably one performs a method or process.

Practices provide a way to do something. Capabilities reflect the ability to apply or use

practices to accomplish a goal. To illustrate, the scientific method is a method for

general problem solving which is not prescriptive to solving any single problem, but the

specific process applied to a particular experiment comprises the steps that must be adhered to for the experiment to be replicated by other researchers. The term “methodology”

when applied to software development can include processes, procedures, methods,

and tools for RA, software engineering, project management, and even deployment and

training, to achieve the desired outcome. With over 1,000 software development

methodologies, each as a collection of various practices to accomplish software

development, it is sometimes difficult to distinguish between a methodology or a

discrete practice or procedure within an overall methodology (Hirschheim & Klein,

1989). Nevertheless, a distinction is undoubtedly clear between one’s ability to perform

them, and the methodologies, practices, processes, and methods themselves.

Software development is often explained as a series of model transformations

from concept to design to construction, where each stage should improve the match


between the user’s desired outcome and the developer’s software product (Hu, Mao, &

Ning, 2009). As such, software development methodologies have focused much

attention upon how to manage or eliminate errors in programming, establishing a field of

predictive approaches to eliminating errors in software development (Matson, Barrett, &

Mellichamp, 1994). This focus on programming error elimination has had considerable

influence on the practice of RA, integrating RA into nearly every aspect of the overall

software developmental process (Wasserman, Freeman, & Porcella, 1983). This focus

continues today in the improvement of elicitation of requirements, RA, communication of

requirements, agreement on the requirements, and changing of the requirements

through the software development process (Nuseibeh & Easterbrook, 2000). This has

also provided the basis for measurement of software development capabilities,

establishing the metrics by which maturity of the development of software or the

implementation process can be measured (Chrissis et al., 2003; Tyson, Albert, &

Brownsword, 2003). These approaches have focused on RA to improve the software

development or software implementation process, but they do not address the problems

associated with the integration of the software solution requirements into the overall

organizational goals.

RA in the simplest terms is the elicitation of user information needs and the

design of the IS to satisfy those needs (Byrd, Cossick, & Zmud, 1992). As previously

discussed, RA is an essential element of creating the IT artifact, a seminal model to

categorize IT literature into relevance within the field of IS (Orlikowski & Iacono, 2001).

Viewed as a component of ISD, the study of RA often has been relegated to the role

of a unidimensional component in the overall methodologies of ISD (Avison & Wood-

Harper, 1991). Two approaches to RA have established themselves in the research


stream: a reductionist approach, focusing on decomposing the process of elicitation,

representation, and validation of the user requirements (Carroll & Swatman, 1998); and

a systems theory approach, viewing the RA process as the development of the mental

model and the simulation of that model as the process to refine the requirements

(Vessey & Conger, 1993). The essence of the difference is the question of whether RA

is a process for developing mental models or simply a supporting methodology as a part

of the overall ISD process (Davis, 2010).

RA through Reductionism

The reductionist approach is the most intuitive approach to determining

requirements, and was widely popularized by DeMarco (1979) and others through the

adoption of procedures to capture requirements. In the more sequential, less iterative

waterfall methodologies, users are interviewed repeatedly with subsequent iterations

producing more refined requirements. Sometimes referred to as the analysis phase, this

process focuses on capturing the logical requirements the software is to deliver

(Whitten, Barlow, & Bentley, 1997). At the conclusion the user confirms the capture of

the requirements by signing off on them, and software development begins,

transforming the logical requirements derived in the analysis phase into a physical

design representation of the system to be developed and finally into the system itself.

This has been adapted to incorporate prototyping in the more iterative spiral

methodologies, but RA remains a distinctly separate process from system design and

software development (Maciaszek, 2007; Vessey & Conger, 1993). Reductionist

approaches to RA have contributed greatly to the science of RA, and many

methodologies, including object-oriented, iterative and incremental, waterfall, and others,


demonstrate the influence of the reductionist approach (Kulonda, Arif, & Luce, 2003;

Sommerville & Sawyer, 1997).

This reductionist approach can be summarized as a tendency to explain a

complex set of facts, entities, or structures by a simpler set described in the elements

produced by RA. Just as science has advanced by reductionism, it would seem RA can

advance using the same methods: by examining smaller and smaller pieces, one can

understand a complex system, and when the pieces are reassembled, explain the whole (Holland,

1999).

Some argue that these forms of RA are not truly engineering practices, producing

requirements specifications consisting merely of nice sounding words (Gilb, 2010).

There is an old joke among IS developers that requirements analysts are the people

who program in PowerPoint, never linking what they produce to anything that can

be empirically measured, with success defined simply as user sign-off on

requirements. Others have taken the view that the problem in the reductionist approach

to RA is that not enough emphasis is placed upon finding errors early in the

development lifecycle, resulting in exponential cost growth when later correcting errors

in software development or implementation. The value of good RA practices thus comes

through cost avoidance: errors detected early in the development process cost far

less to repair than those detected later (Berry, 2008; Hay, 2003; Jones,

2008). Others have defined reductionist RA as a practice in risk management with

agreed upon requirements either transferring, mitigating, or avoiding risks in software

development (Boehm, 1991). Additionally, the emergence in other studies of a number

of themes related to the interaction between developers and clients including conflict

resolution, education, technical and business advice, and management of the RA

process support the hypothesis that RA is something more than a unidimensional

construct (Carroll & Swatman, 1998).

Part of the disagreement lies in the definition of software quality and value,

because perceptions are often subjective and contextual to the situation, with quality not

an intrinsic property to the software product itself (Jones et al., 2011). One very

beneficial area of research has been in the sharpening of the definitions of software

quality and value, providing a constant metric for evaluating quality from differing

development processes (Ashrafi, 2003; Harter & Slaughter, 2000). This pertains to the

accident management aspect of Brooks’ (1987) observation and is an essential metric

to software development methodologies and maturity models. Much progress has been

made in improving software quality through better software development methodologies,

with programming errors falling as reductionist RA capabilities and practices have

improved (Chrissis et al., 2003).

Another challenge is the increasing importance of code re-use in software

development. Object-oriented code practices offer more rapid development of solutions

through the construction and integration of reusable programming objects, binding data

and processes together, and re-using this object repetitively (Mylopoulos, Chung, & Yu,

1999). However, siloed organizations produce siloed programming objects, reducing

flexibility in re-use outside the business process for which the object was designed. It is

only by reducing RA artifacts to the most elemental level, and then keeping those

artifacts separated for as long as possible, that the resulting software objects remain

reusable in new incremental development contexts (Boehm, 2006; Sowa & Zachman,

1992; Zachman, 1987, 2004).
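A minimal (and hypothetical) sketch of the siloed-object problem: the first class binds address handling to one department's payroll process, while the second decomposes the same requirement into an elemental object that other business processes can re-use.

    # Siloed object: address logic is bound to one department's process
    # and cannot be re-used outside the payroll context.
    class PayrollEmployeeRecord:
        def __init__(self, name, street, city):
            self.name, self.street, self.city = name, street, city

        def payroll_mailing_label(self):
            return f"{self.name}\n{self.street}\n{self.city}"

    # Elemental, reusable objects: data and behavior are decomposed and
    # kept separate from any single business process, so payroll,
    # procurement, and licensing systems can all re-use Address.
    class Address:
        def __init__(self, street, city):
            self.street, self.city = street, city

        def label(self):
            return f"{self.street}\n{self.city}"

    class Employee:
        def __init__(self, name, address):
            self.name, self.address = name, address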

This reductionism through decomposition of tasks in turn has led to reductionism

in the system boundary to be supported with software. A recent example is the

Healthcare.gov website, where responsibility for integrating the discretely developed

components was never assigned, producing components that could not interoperate

(Lawson, 2013). Individual elements function, but overall the outcome is not achieved. Agile and other methodologies

which blend RA into system analysis and design (SA&D) fail to address these facets of

RA, instead transferring the burden of completeness from the developer to the subject matter

experts.

RA through Systems Theory

General systems theory (GST) was originally proposed by von Bertalanffy in

1926 to focus research away from specialization and back towards addressing the

wholeness of a system, using biology as his exemplar (Boulding, 1956; von Bertalanffy, 1926). In his

example, perfect knowledge of the nervous system, skeletal system, muscular system,

and all other systems of a frog does not explain the actions or activities of a live frog.

Ackoff (1971) picked up on this in his system of systems concepts, organizing systems

into a more defined hierarchy in levels of complexity, expanding upon Boulding’s (1956)

work. All of these authors identified the limitation of reductionism (the narrowing focus or

specialization of knowledge) when studying a complicated system: the whole is

greater than the sum of its parts (Neufeldt & Guralnik, 1997). In Boulding’s (1956)

writings, GST was to be a tool that would enable researchers to effectively move

between the world of theory about a complex system (or entity) and the messy world of

practice. It was also recognized that we pay for generality by sacrificing content

specificity, but this tradeoff is desirable when dealing with messy (or wicked) problems

(Winter & Checkland, 2003). This was best characterized by Weaver in 1948, who

categorized problems into ones of simplicity (where, with few variables to consider,

reductionism is the best approach), disorganized complexity (where reductionism can

be applied through the application of statistical techniques to generalize a large number

of variables), and organized complexity (where a systems theory approach is required).

The adoption of GST to the analysis of requirements represents a true paradigm

shift. A paradigm constitutes the essence of a discipline; its core assumptions and the

implicit or explicit view of reality (Kuhn, 1970). Unresolved questions running counter to

the anticipated results (particularly those regarding the user acceptance of technology)

demanded a new approach to how these issues should be addressed (van Gigch,

2003). Reductionism, by focusing on the quality of the system or how well the system

fits the task to be performed, failed to explain why users chose to adopt or reject a

technology (Barki, Titah, & Boffo, 2007). Systems theory provided a new way to view

the man-machine interface, and many specific research processes have evolved into

the discipline of IS including the application of system dynamics (Forrester, 1998;

Repenning, Goncalves, & Black, 2001), soft systems methodology (Winter &

Checkland, 2003), and the identification of system archetypes to describe system

behavior (Senge, 1990; Wolstenholme, 2004).

Specific to the study of RA, the GST approach was the first general paradigm

applied to elicit user requirements, emphasizing the need to focus on the strategic

purpose for the system, not just the tactical design of the software (Bostrom & Heinen,

1977). The implementation of a management information system (MIS) was viewed as

an intervention strategy, in the belief that the major reason for IS failures was

the neglect of the system-within-a-system integration challenges first articulated by

Ackoff (Lucas, 1971). This systems approach contributed to user acceptance theory,

inspiring the task-technology fit model (Goodhue & Thompson, 1995), the technology

acceptance model (Davis, 1989), and the IS success model (DeLone & McLean, 2003).

In each of these, the IS is viewed holistically and in context with the overall system of

the business (or enterprise), and gaps between the software system and the enterprise

use are evaluated. Thus, RA is the IT discipline of knowing one’s customer and

communicating that knowledge, often through the use of models.

After a departure from a systems approach to RA and software development,

efforts have been made to return to a more holistic approach. Joint application

development, rapid application development, extreme programming, Agile software

development methodologies, and others have been proposed to overcome problems

experienced with traditional methods of software development through reductionism

(Tuunanen, 2003; Wendorff, 2002). All of these methodologies are designed to engage

the user into the software development process to make the refinement of requirements

a continuous process and reduce the errors of essence (Fruhling & de Vreede, 2006).

As previously discussed, this is simply old wine in new bottles, with the same emphasis

on delivering narrowly focused solutions without regard for the overall structure of the

organization, blending SA&D into software engineering without addressing the

limitations of the core requirements solicitation process (Hilkka, Tuure, & Matti, 2005).

Said another way, RA is not the same as software development, and the intimate

entwining of the subject matter expert with the programmer as an application is

developed cannot change this. Additionally, there is the problem of binding too early

when RA is entwined with SA&D (as in the case of Agile methods) and done as a part of

coding, producing code that is not reusable for future purposes.

Efforts to Balance Reductionism and Systems Approaches

The process of requirements modeling most closely resembles the creation of

shared mental models. Human activity, often in the form of user and developer

meetings, provides the forum in which the artifacts which manifest the system are created

(Wolf, Rode, Sussman, & Kellogg, 2006). This process is where the user and the

developer learn together how the system can support the activity of the user through the

development of models on paper, in modeling applications, in system prototypes, and

finally in the creation of a working system. This is demonstrated in DeLone and

McLean’s (2003) IS success model, where it can be supposed that poor quality

requirements produce a poor quality system and, ultimately, user dissatisfaction.

From practice, we know that the application of RA through an exclusively

reductionist approach simplifies performance measurements and produces technically

better tools (applications with fewer defects) but does not guarantee the achievement of

the larger mission or objectives of the organization, leading instead, at best, to

stovepipes, excessive complexity, disintegration, redundancy, high cost, and slow

change. New development methodologies to incorporate the user into the development

process also have not ensured that the resulting systems support the overall objectives

of the enterprise. The challenge, then, is how to achieve holistic reductionism,

decomposing requirements in the context of the enterprise, and thus retaining the

connection to the outcomes desired by the organization.

Enterprise Architecture as an Approach to

Requirements Decomposition in Context

The concept of RA capabilities is important in that such capabilities are considered essential

to sustained competitive advantage (Brits, Botha, & Herselman, 2006). As a component

of software development, RA is core to the IS discipline, albeit a subjective and

evolutionary process running throughout the ISD processes (Hirschheim & Klein, 1989).

RA capabilities within ISD can be thought of as the development of the IT models

representing the business, a subset of the systems theory approach of Zachman (1997)

where requirements would describe all business related models. ISD is the instantiation

of requirements into a tangible model through the development of software (Nguyen &

Swatman, 2003). RA is regarded as an iterative process of discovering requirements

and redefining these over time through iterations of decomposition (Daugulis, 2000).

This decomposition produces requirements which are specific to a narrowly defined

business process (represented as slivers in columns of the ontology known as the

Zachman enterprise framework) and defined to an excruciating level of detail (see

Figure 2).

As proposed by the Institute of Electrical and Electronics Engineers (IEEE) in

2000, enterprise architecture (EA) is the fundamental organization of a system

embodied in its components, their relationship to each other and to the environment,

and the principles guiding its design and evolution (Jen & Lee, 2000). This was further

expanded upon by Zachman in 2007 as “the set of descriptive representations that are

required in order to create an object” (p. 2). A central goal of EA is the alignment of the

IS with the goals and objectives of the organization (Ross et al., 2006).

EA links the essence (the goals, objectives, mission, and purpose of the

organization) with the IS optimized for the execution of the discrete business

processes (Ross, 2003).

[Figure 2 appears here: Zachman's enterprise architecture framework, a grid of perspectives (Planner/Scope, Owner/Enterprise Model, Designer/System Model, Builder/Technology Model, Sub-Contractor/Detailed Representations, and the Functioning Enterprise) crossed with the interrogatives (What/Data, How/Function, Where/Network, Who/People, When/Time, Why/Motivation), annotated with department/service-specific technology services and tools.]

Figure 2. Department/service driven architecture (with permission of John A. Zachman).

EA can be thought of as a flashlight illuminating gaps among various organizational

components in either the current or future state of the enterprise (Gregor, Hart, &

Martin, 2007). At the very highest and most strategic level of the EA model can be found

the overarching requirements of the enterprise, the essence of what every software

solution used by the organization should fulfill in some aspect, in the overall context of the

organization (Ross, 2003). Beneath that are the lower-level viewpoints on how this is to

be logically and physically accomplished, down to the actual functioning enterprise at

the very lowest level, as demonstrated above in Figure 2. RA decomposition to optimize

a single process can produce fragmentation and redundancy of the systems supporting

an enterprise.
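As a rough illustration (one possible encoding, not Zachman's own notation), the framework's grid can be represented so that architecture artifacts are indexed by perspective (row) and interrogative (column):

    # The framework's perspectives (rows) and interrogatives (columns).
    ROWS = ["Planner (Scope)", "Owner (Enterprise Model)",
            "Designer (System Model)", "Builder (Technology Model)",
            "Sub-Contractor (Detailed Representations)",
            "Functioning Enterprise"]
    COLUMNS = ["What (Data)", "How (Function)", "Where (Network)",
               "Who (People)", "When (Time)", "Why (Motivation)"]

    # Each cell holds the artifacts for one perspective/interrogative
    # pair; e.g., the Designer row of the Data column holds the logical
    # data model.
    architecture = {(row, col): [] for row in ROWS for col in COLUMNS}
    architecture[("Designer (System Model)", "What (Data)")].append(
        "logical data model")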

Highly decentralized organizations tend to optimize departmental performance at

the expense of the overall organizational effectiveness, and technology applications

selected based upon process alignment tend to exacerbate these silos (Jeffers,

Muhanna, & Nault, 2008). This is graphically illustrated above in Figure 2, showing how

department- or service-specific solutions in the City of Denton, TX, define each

essential aspect of the system uniquely for specific departmental user needs and out of

context with the overall needs of the city. These may represent commercial off-the-shelf

software solutions or software developed for a specific purpose unique to the

organization using any popular software development methodology. As optimal

solutions are defined by users for accomplishing a discrete process, disaggregation of

the processes of the whole is inevitable, leaving the internal technology services

department with the job of developing middleware to achieve higher-level outcomes,

supporting unanticipated integration requirements among these narrowly defined

software solutions.

It is the limitations of the reductionist RA process, and the resulting system

stovepipes, which contribute to deficiencies in achieving the outcomes (essence) of the

software which in the aggregate are the IS used by the organization. Generating a

holistic view of the organization is essential to purposeful organizational design and

defining organization-wide information requirements (Goh & Richards, 1997). This

necessitates development of new tools to evaluate requirements in an organization-

wide context and new metrics to measure the successful application of these tools

(Armour, Kaisler, & Liu, 1999). This presupposes the articulation and separation of

requirements into two constructs: those requirements pertaining to the development of

the specific software solution (SA&D) and those requirements pertaining to the overall

objectives of the organization (EA).

Relevance of Holistic RA Decomposition for

Commercial Software Implementation

Holistic RA decomposition is also important to the selection and implementation

of commercial-off-the-shelf (COTS) software packages. It has been estimated that the

largest 1,000 companies use COTS packages to satisfy half of their requirements (Jones

et al., 2011). However, even the smallest IT departments support dozens of process-

specific COTS applications selected by the department process owner to perform a

specific stovepipe function of the enterprise. With more applications available as cloud-

based solutions, the number of applications supported by the internal IT department is

not the complete set of software solutions used by the organization. In the process of

selecting a COTS, a “best fit” defined around how the software supports the existing

business processes is used to evaluate options at a functional level (Maiden & Ncube,

1998; Tyson et al., 2003). Application complexity (measured in function points, the

logical functional requirements a software solution provides a user; Kemerer, 1993),

reporting capabilities, and generic interfacing abilities encapsulate the majority of

specific service deliverables (requirements) of systems purchased today (Maiden &

Ncube, 1998).
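For readers unfamiliar with the function point measure, the sketch below computes an unadjusted count using the five standard IFPUG component types and their published average-complexity weights; the component counts themselves are invented for the example.

    # Unadjusted function point (UFP) count in the IFPUG style.
    # Weights are the published average-complexity weights; the counts
    # below are hypothetical.
    AVERAGE_WEIGHTS = {
        "external_inputs": 4,
        "external_outputs": 5,
        "external_inquiries": 4,
        "internal_logical_files": 10,
        "external_interface_files": 7,
    }

    def unadjusted_function_points(counts):
        return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

    # e.g., a small payroll module:
    print(unadjusted_function_points({
        "external_inputs": 6,
        "external_outputs": 4,
        "external_inquiries": 3,
        "internal_logical_files": 2,
        "external_interface_files": 1,
    }))  # 24 + 20 + 12 + 20 + 7 = 83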

These process-centric evaluations ignore other requirements including data,

roles, events, locations, and rules (Tyson et al., 2003). Once selected, these solutions

are shoved into the enterprise, conceptually from the bottom up in a Zachman ontology

context, with implementation specialists discouraging modification to the product,

pressuring the employees to change the existing processes to conform to the process

coded into the COTS (Zairi & Sinclair, 1995). With the rising popularity of software as a

service and cloud computing, the ability to articulate requirements which support

business needs becomes more critical as the dependence upon external software

vendors increases. What would appear to be an alignment of IT with the business may

in fact be an alignment with the vendor or the tool created for the discrete business

process, resulting in a shifting of the burden of requirements capability away from the

enterprise and to the vendor, a process well-documented by Senge and others as an

archetype with similarities to models depicting dependencies of an addiction (Senge,

1990; Wolstenholme, 2004). Taken to the extreme, this can leave organizations in the

situation where they no longer have the capacity to articulate requirements or match

software to facilitate achievement of their own unique objectives.

This is not to say that COTS systems are inherently bad fits for an organization.

Some processes are necessary but not core to the business and benefit greatly from a

ready-made vendor solution (Strassman, 1997). This implementation of COTS

technologies has brought faster automation to processes that would have been done by

an internal IT department writing custom code. Not only do vendors have economies of

scale, they can also dedicate technical staff to becoming process specialists on par with

the business process specialists of an organization, developing a one-size-fits-many

solution, homogenizing differences which do not make a difference in standardizing

common business processes (such as payroll), while providing solutions to meet

externally imposed requirements (e.g., Sarbanes-Oxley reporting, changing tax tables, and the

Basel II Accords), across multiple clients, thereby simultaneously reducing cost and

improving quality (Hui, Liaskos, & Mylopoulos, 2003). Overall, the vendor may have

more customers to support than the internal IT department, but the vendor’s customers

all generally speak the same specialized language of a particular business process or

function (e.g., accounting, human resources, materials requirement planning,

procurement). It is in the interest of the vendors to improve communication between

technologists and business process specialists, reducing the product defects

(accidents) as the vendors focus on improving development processes (Jones, 1996;

van der Raadt, Hoorn, & van Vliet, 2005). However, this improvement of the fit between

the business process and the COTS system comes at the expense of the overall

enterprise infrastructure integration, where the requirements of the enterprise are left

undefined or even ignored, resulting in the failure of software to support the larger

outcome the enterprise is trying to achieve (the essence). This may also manifest itself

in an increase in the cost of enterprise integration, compounded by the perspective that

more IT is better. It is difficult to measure business application integration in business

scorecards, project management updates, functional requirement lists, or vendor

materials, whereas it is relatively easy to measure the cost, quality, and timeliness of

the development or implementation of a specific IT project (Jiang, Klein, Hwang, Huang,

& Hung, 2004). It is easy to count the business functions which have been automated

with technology and to define this as the value of IT to the organization. Technology

applications selected using narrowly defined, process-centric RA approaches tend to

accelerate disintegration and may actually raise total cost through added integration and

maintenance expense. This is exacerbated by the practice of empowering the employee

responsible for the performance of the business process to select from a short listing of

“best practice” solutions to improve their performance (Kurtz & Snowden, 2004). It is

this sub-optimization which contributes to the misalignment of IT, producing the

counterintuitive result of alignment with the process at the expense of supporting the

overall goals of the organization (Aziz et al., 2006). Often, business productivity

improvements are not harvested, with headcount increasing after the implementation of

a software solution justified as a productivity tool.

The Gap Remains between IT and the Business

Numerous studies have found that IT alignment is still a challenge for the

enterprise (Chan, Sabherwal, & Thatcher, 2006; Luftman & Brier, 1999; Sauer, 2004).

These studies and others articulate how large a gap exists between the RA research

focusing on the essence and the practice of RA focusing on software development

(Berry & Lawrence, 1998). In “No Silver Bullet: Essence and Accidents of Software

Engineering,” Brooks (1987) articulated it best when he wrote that

[N]o part of the conceptual work is so difficult as establishing the detailed

technical requirements, no other part so cripples the resulting system if done

wrong, no other part is more difficult to rectify later, and no silver bullet exists to

deliver one order-of-magnitude improvement in productivity, reliability, or

simplicity. (p. 10)

In part this is due to our success in solving previous problems through advances in our

technology; as quickly as we solve one problem (e.g., intuitive user interfaces), we

address the next frontier of not-easily-solved problems (Berry, 2008). Thus, this gap is

the next critical problem in software development to be addressed, and so we focus on

the evolution of RA from a bureaucratic and contractual process to a systems-focused

activity encompassing both the application and larger enterprise context it serves

(Nuseibeh & Easterbrook, 2000). Yet the linkages elude us, evidenced by the lack of

tools and the lack of metrics employed to measure the compatibility of these two

contexts for requirements capabilities.

In part, this gap between essence and accidents is a by-product of the

tendency to equate software development with classical manufacturing. A Communications

of the ACM editorial stated, “In the end, and in the beginning, it is all about

programming” (Crawford, 2001, p. 2). This represents a perpetual bias in engineering,

where value can only be measured when something tangible is created. IT is still

considered by many to be primarily a tool-making discipline, with the value proposition

centered on the capability to build high-quality tools. This defies experience, where

companies that produce great products do so while outsourcing the manufacturing

process. It would not be possible to build a car without first understanding the

requirements of the driver. If the wrong design or a bad requirement is manufactured or

programmed, the result fails to meet the user need regardless of the quality of the

production process. The paradox is that although the ability to identify and learn about

new requirements methods no longer presents a barrier, it is the difficulty of implementing

them and measuring their impact which inhibits the use of better methods (Glass, 2006). There

persists a bias that RA and design capabilities are useless without corresponding

capabilities to build the software, and that the capabilities of design are linked to the

capabilities to build (Lyytinen & Robey, 1999). The continued emphasis on operational

metrics may contribute to this paradox, underpinned by the belief that the measurement

of development capability is only possible by simultaneously measuring design

capability (Kappelman et al., 2013). Blending RA with software development is no

shortcut to improving RA capabilities. Therefore, it is necessary to look first at the

evolution of maturity models of software development to better understand how RA

practices have been influenced by and evolved through this linkage to development

capabilities.

Evolution of Maturity Models in Software Development

Systems engineering capability measurement has evolved with the development

of software capability maturity model integration (CMMI), benchmarking the extent to

which a specific software development process is explicitly defined, managed,

measured, controlled, and effective (Paulk, 1995). In practice, the level of maturity

indicates how successful a company is in software development practices and reflects

the adage that people care about what they are held accountable for through

measurement (Beecham et al., 2003a). When the measure of RA is tied to the measure

of software development, correlation would be expected, which is borne out in the

literature (Ashrafi, 2003; Fitzgerald & O'Kane, 1999; Jiang et al., 2004; Salmans, 2009).

This reflects a belief that there is little point to having mature RA capabilities if all other

software engineering processes are at a lower maturity level (Beecham et al., 2003a).

Models for measuring the growth of capabilities have been used in

organizational research and IS research to describe patterns which occur in a

hierarchical succession and generally in some sequence (King & Teo, 1997). These

models can generally be labeled as maturity models, where the effectiveness of a

process is reflected in the relative maturity of the organizational practice, as in the case

of the Software Engineering Institute’s (SEI’s) software CMMI (Kaner & Karni, 2004).

Maturity models provide a basis for the control of software development processes and

practices by benchmarking performance against agreed-upon standards for processes,

outcomes, and practices (Steghuis, Daneva, & van Eck, 2005). Specific to software

development, the success of CMMI has had great influence on research in RA and has

become the de facto industry standard for measuring software quality processes

(Leffingwell, 1996). Studies confirm that software process maturity is positively

associated with project performance, finding that the integrity of software work products

can be correlated to the integrity of the development processes (Fitzgerald & O'Kane,

1999; Jiang et al., 2004; Jones, 2000; Jones et al., 2011).
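For reference, the five staged maturity levels of the SEI's CMMI can be sketched as an ordered scale; the benchmarking helper below is a hypothetical illustration of the benchmark-against-agreed-standards idea, not part of the CMMI itself.

    from enum import IntEnum

    # The five staged maturity levels defined by the SEI's CMMI.
    class MaturityLevel(IntEnum):
        INITIAL = 1
        MANAGED = 2
        DEFINED = 3
        QUANTITATIVELY_MANAGED = 4
        OPTIMIZING = 5

    # Hypothetical helper: benchmark an assessed process area against an
    # agreed-upon target level.
    def meets_benchmark(assessed, target):
        return assessed >= target

    print(meets_benchmark(MaturityLevel.DEFINED, MaturityLevel.MANAGED))  # True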

Many authors have proposed maturity models for requirements engineering and

have proffered a variety of constructs to measure the capabilities of an organization to

elicit and capture requirements (Cheng & Atlee, 2007; Niazi, Wilson, & Zowghi, 2005;

Sommerville & Sawyer, 1997). All of these emphasize the relationship between RA

capabilities and the ability to build software as measured by the CMMI, measuring RA

as a component within the process of creating software (Beecham, Hall, & Rainer,

2005). As discussed previously, this is because there is an association between the

capability or maturity of a process and what is measured (where there is accountability)

(Beecham et al., 2003a). This implies that RA performance is primarily held accountable

through the development of the application, and thus is tightly coupled with the

capabilities of the organization to write software in a vicious or virtuous cycle (Lyytinen

& Robey, 1999). This is most clearly illustrated in methodologies to decompose the RA

process and directly associate it with the CMMI, identifying RA sub-processes to

augment specific CMMI levels (Beecham et al., 2005).

Requirements as Distinct from Software Development Capabilities

Many researchers and practitioners write of how much more important it is to get

the requirements correct, finding that requirements maturity is more important to

outcome than the capacity to develop software (Mylopoulos et al., 1999). This echoes

Brooks’ (1975, 1987, 1995, 2003) observation that the most difficult part of system

development is getting the requirements right. Research in the field has identified how

much more expensive errors are to fix when discovered after requirements have been

set and development is undertaken (Hay, 2003; Jones, 1996). This implies that

requirements development capabilities are separate from software development

capabilities (Salmans, 2009). Requirements drive the development process, defining

unambiguous testable statements to describe the processing, information, performance,

error handling, or capacity needed by the user to perform some function (Halbleib,

2004). This has led to the proposal of new requirements engineering maturity

frameworks without regard to correlation to development capabilities (Niazi, Cox, &

Verner, 2008). However, this has done little to advance the independent measurement of RA

capabilities or towards associating RA capabilities with anything other than software

development capabilities.
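The notion of an unambiguous, testable requirement statement can be made concrete. A hypothetical requirement such as "gross pay for hourly employees equals hours worked times the hourly rate, with hours over 40 paid at 1.5 times the rate" translates directly into an executable acceptance test:

    import unittest

    def gross_pay(hours, rate):
        """Hypothetical requirement: hours beyond 40 are paid at 1.5x."""
        overtime = max(hours - 40, 0)
        return (hours - overtime) * rate + overtime * rate * 1.5

    class GrossPayRequirement(unittest.TestCase):
        def test_no_overtime(self):
            self.assertEqual(gross_pay(40, 10.0), 400.0)

        def test_overtime_paid_at_time_and_a_half(self):
            self.assertEqual(gross_pay(45, 10.0), 475.0)

    if __name__ == "__main__":
        unittest.main()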

Despite the recognition that requirements capabilities are more important or at

least equal to development capabilities, alignment between specific software

requirements and overall business needs continues to be difficult, with little done to

isolate and measure RA capabilities (Goodhue & Thompson, 1995; Luftman &

Kempaiah, 2007; Niazi et al., 2008). In part, this is due to the fact that requirements

reside in the problem space whereas development resides in the solution space, with

the task of problem definition being less constrained than that of solutions development

(Cheng & Atlee, 2007). This parallels the Zachman (2007, 2009) ontology, where the

scope is set for the enterprise (row 1), and the remaining rows define the requirements,

schematics, specifications, configurations, and instances of implementations within

those boundaries. This also relates to Luftman’s IT-business alignment maturity criteria,

where different levels reflect the degree of linkage between the problem definition (a

business need) and the articulated solution (the IT solutions employed; Luftman &

Kempaiah, 2007).

RA Maturity Defined

Both the decomposition and systems approaches in software development can be

found in the RA literature. Decomposition in RA can be summarized as ensuring that

the needs, goals, and objectives of one component are consistent with the attributes of

other components (Oh & Pinsonneault, 2007). Systems approaches in RA articulate that

excessive focus on technical points instead of business alignment is a contributing

factor to development failures (Kim & Peterson, 2003). Capabilities in both are

essential, and are possibly correlated, but should be viewed as distinct capabilities.

Constructs of maturity in RA must mirror the duality expressed by Brooks (1987)

of essence and accidents and reflected in the bifurcation of RA into analysis of user

requirements and the alignment of systems to support business objectives. As

demonstrated by Jones (2008), accident control is process related, with improved

processes resulting in fewer accidents (errors) in coding. Essence, however, is

dependent upon the integration of IS planning and business planning, the insufficiency

or absence of either resulting in the misalignment of IT with business objectives (King &

Teo, 1997).

Stages of growth models have been used in both organizational research and IS

research to illustrate development of maturity and process improvement in predictable

order (Greiner, 1997; Kaner & Karni, 2004). As previously identified, a gap exists

between software development maturity and outcome quality models, where

improvements in software development (as exemplified in the CMMI model) do not

necessarily translate to improved user acceptance or IS success. When trying to explain

perceived usefulness, users address it in the larger context of the

environment of which they are aware, such that a solution perceived to be useful to

one user may not be perceived to be useful to another user working within a different

context (Venkatesh & Davis, 2000). This can be narrowed to deficiencies in accounting

for the role of an overarching set of organizational requirements often invisible in system

development, in the selection of a commercial application, or in the alignment of IT with

business objectives (Jones et al., 2011). We must better articulate the gap between

essence and accidents and isolate RA capabilities for each context.

Better Articulating the Gap in RA

Several studies support the existence of this gap between process improvement

and outcome management, in part because the requirements process is more

heterogeneous than the development processes (Sommerville & Sawyer, 1997). Other

researchers have found that where process improvement focuses on actualization of

agreed-upon requirements, little attention has been paid to ascertaining if those

requirements are the right ones (Hutchings & Knox, 1995). Studies also support Brooks’

essence and accidents duality in finding that organizational and technical requirement

deficiencies are not completely correlated, with technical problems diminishing with

CMMI maturity but organizational problems remaining static (Beecham et al., 2005).

This can be attributed in part to the failure of organizations to develop strategies for

addressing changes to the organization or to make any attempt at tying

organizational requirements to software features or functionality (Zachman, 2004).

This deficiency is apparent in the alignment gap, with the majority of

organizations reporting that their IT capabilities are neither aligned nor effective, and IT

is kept largely separate from the organization’s core business functions (Kappelman &

Salmans, 2008; Shpilberg, Berez, Puryear, & Shah, 2007). Alignment matters, as

demonstrated in studies where organizations realize up to 40% greater return than their

competitors for the same IT investment (Ross et al., 2006; Weill, 2004). Much has been

written on tools to support strategic IS planning, but little exists on addressing IT

requirements in the context of the planning process for the enterprise (Rogerson &

Fidler, 1994). One approach has been to involve business departments in one-on-one

relationships with IT executives in the belief this will ensure alignment with strategies

(Weill, 2004). In another approach, business strategy is viewed as the leader to IT

strategy, providing the input to IT without including IT in the process to formulate

strategy (Versteeg & Bouwman, 2006). This may in practice be a recipe for alignment

with tactics and a misalignment with the organization, an extension of the vendor-driven

alignment previously discussed. This has reduced the role of IT from a partner helping

to shape requirements to an integrator not directly connected to the function of the

organization (McKeen & Smith, 2008). Vendors now target the business manager more

than the internal IT specialists, knowing that is where the RA and fit evaluation occurs.

This gap is also apparent in the EA community, with disconnection between the

technical architecture and the business architecture despite the articulation of the

importance of complementary business and IT strategy (Weill, 2004). It is much the

same problem: ensuring that all solutions complement one another to

achieve the overall goals of the organization is much more difficult to measure than

the performance of a specific function or business process out of context with the rest of

the organization (Alter, 2002). Put another way, organizational culture determines much

of what can be done to manage change (Ngwenyama & Nielsen, 2003). Software

implementation is by definition an exercise in organizational change. If this is to achieve

alignment with the business goals, it is essential that RA activities incorporate the

elements of alignment and organizational change models.

Separating RA Capabilities from Software Development Capabilities

To advance our discipline, it is necessary to separate RA capabilities from

software development capabilities. This is much easier to state than to accomplish, with

the effectiveness of RA activities to address issues of essence questioned by some as

being too complex and variable to measure to any useful degree (Palyagar, 2006). This

perspective is supported by those who believe that the very nature of application

development is to be a change agent to the organization, imposing a machine-like order

on a human environment (Bostrom & Heinen, 1977). This line of thinking has helped

spur the school of thought that the practice of separating process definition

(requirements gathering) from process execution (programming) is flawed, arguing that

architecture and engineering are inseparable, lending support to the software

development methodologies of concurrent design-and-build as we see in Agile and

eXtreme (Fowler & Highsmith, 2001; Wendorff, 2002). Still others write that due to the

very complexity of understanding human and organizational needs, requirements

determination will always be an ad-hoc process (Browne & Ramesh, 2002). This has

been extended by some to argue that good requirements practices are neither

necessary nor sufficient, producing neither reductions in the cost of development nor

increases in quality of the resulting product (Davis & Zowghi, 2006).

One of the other challenges is that defining requirements which bind the

organization is difficult work, requiring a comprehensive understanding of ends as well

as means, recognizing the border between them and their relationship to one another

(Farrell & Connell, 2010). This challenge often goes unmet, as it is easier to optimize and

measure performance of stovepipe processes out of context of the whole (Kim et al.,

2006). Even within EA, Zachman (1987, 1997, 2004, 2007) often has opined that

requirements development is hard work, which is also reflected in governance models

where policy alignment with desired outcomes is neither easy nor quick. It is this

disconnection which runs through both the research and practice of IS development,

and which this study helped articulate and measure.

Research Proposal

To transform IT back into a strategic advantage, an RA capabilities definition and

measurement instrument must be developed, benchmarking and connecting both the

accident and essence aspects of success. This need is articulated in the CMMI

research stream, recognizing that RA is in need of further support but placing it outside

the scope of process improvement models (Beecham et al., 2005; Sommerville &

Ransom, 2005). Interestingly, this same need has been articulated in the EA research

stream, with no comprehensive model for the measurement of EA benefits,

benchmarking performance to maturity in terms of payoffs for the organization (Tamm,

Seddon, Shanks, & Reynolds, 2011). Thus, outcome measurements assessing IT

value need to be incorporated, with other models measuring the success and alignment

of IS used to provide the attributes and causal linkages to measure relationships

between good RA practices and value to the organization.

One approach would be to research the relationship between maturity in software

development capabilities and RA capabilities, where empirical evidence is needed to

demonstrate improved RA correlates to improvements in software engineering (Damian

& Chisan, 2006). Another approach is to separate RA maturity from software

engineering maturity, allowing that factors which motivate an organization to pursue

software engineering improvements may not motivate improvements in RA capabilities

or in alignment (Mylopoulos et al., 1999). Although RA processes are addressed in

CMMI, it does not allow for the independent measurement of RA capabilities from the

process of system development (Beecham et al., 2005; Hay, 2003).

Previous research has supported a positive correlation between RA and design

practices and artifacts, and has shown that these can be validly operationalized (Salmans, 2009). This is

related to Brooks’ (1987) accident versus essence hypothesis, but it is also illustrated in

Luftman’s (2004) IT-business alignment gap and the Zachman (as cited in Kappelman,

2010) EA gap between stovepipe solutions and enterprise integration. In all three

instances, context is the central issue, where capabilities at the level of instantiation

(software development) risk disconnection from the level of organizational planning

(EA). Although it is apparent from this synthesis of the research that ISD maturity

includes RA capabilities, requirements capabilities out of context of software

development have seldom been studied. This provides the foundation to examine the

differentiation between RA capabilities for an SA&D and RA capabilities for EA, a

distinction which has not yet been clearly articulated or operationalized. It is necessary

to extend previous research on RA to separate RA capabilities in the practice of system

development from the RA within the enterprise context. This isolation allows the

analysis of RA in the narrow context of software development separate from the

software development capabilities. In the future, this will allow comparison of the degree

of maturity of requirements capabilities to meet the out-of-context user requirement with

the maturity of requirements capabilities used to address the overall organizational

framework. Finally, the relation of RA activities with the overall value produced by the

whole of an organization’s IS needs to be examined and sub factors and relations

identified using attributes such as those of Luftman’s (2004) business–IT alignment

maturity model, Senge’s (1990) learning organization model, and the DeLone and

McLean (2003) IS success model.

Research Questions

The primary research question was: can requirements efforts and activities be

operationalized within the context of developing a discrete software solution? The

secondary research question was: do requirements efforts and activities in the context

of developing a discrete software solution constitute a unidimensional or

multidimensional construct?

The focus of this study was the primary research question. As part of the

instrument validation, the results for requirements efforts and activities were tested for

unidimensionality. However, as illustrated in the literature review, alignment and

organizational learning are clearly multidimensional constructs. As these are often

stated goals of specific system development projects, it was expected that RA would be

consistent with these constructs and exhibit multiple dimensions. This analysis was

accomplished by building on a previously developed instrument and asking

requirements-related questions in two contexts: in development of the software for a

specific purpose (SA&D); and in the context of the overall business objectives and

technology environment of the organization (EA).
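A dimensionality check of this kind is conventionally performed with an exploratory factor analysis of the item responses. The sketch below (assuming the third-party factor_analyzer package and a placeholder data file, neither of which is part of this study's materials) shows the general procedure:

    import pandas as pd
    from factor_analyzer import FactorAnalyzer  # assumed installed

    # One row per respondent, one Likert-scored column per SA&D
    # requirements item (placeholder file name).
    responses = pd.read_csv("sad_capability_items.csv")

    # Eigenvalues suggest how many dimensions underlie the items; a
    # single dominant eigenvalue would support unidimensionality.
    fa = FactorAnalyzer(rotation=None)
    fa.fit(responses)
    eigenvalues, _ = fa.get_eigenvalues()
    print(eigenvalues)

    # If several factors emerge, a rotated solution shows which items
    # load on which dimension.
    fa4 = FactorAnalyzer(n_factors=4, rotation="varimax")
    fa4.fit(responses)
    print(pd.DataFrame(fa4.loadings_, index=responses.columns))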

RA capabilities which support the development of a specific software solution

(SA&D capabilities) were separated and measured distinctly from system development

capabilities (SA&D use). Historically, essence and accidents have been confounded as

the requirements component has been incorporated into ISD capabilities, comprised of

both the ability to elicit requirements (RA practices and capabilities) and to instantiate

requirements (use of RA outputs).

RA capability for SA&D is defined as the ability or maturity of requirements

development practices used to describe a functional process or a product or service and

design a solution in order to develop software to achieve a specific business objective

(e.g., the requirements elicitation process performed in the development or selection of

a payroll system). In this case, the purpose of SA&D was to precisely clarify and define

the goals, processes, structure, function, behavior, hardware, software, interfaces, and

data for a proposed software-based solution. A system is defined as an arrangement or

collection of objects which operate together for a common purpose. SA&D is typically

followed by the application of software development methods to perform at least the

partial automation of complex human and/or mechanical systems.

Future papers will analyze the data collected to contrast requirements practices

in the system development context with the practice of RA to define the EA in the

context of the overall enterprise objectives, using, for example, enterprise-wide data

models, enterprise-wide process models, application portfolio models, and enterprise-

wide IT services catalogues. The purpose of EA is to provide a clear and

comprehensive picture of an organization to serve as

a blueprint for organizational change defined in models (of words, graphics, and

other depictions) that describe (in both business and technology terms) how the

entity operates today and how it intends to operate in the future; it may also

include a plan for transitioning to this future state. (Kappelman, 2010, p. 2)

An EA is the holistic set of artifacts describing an enterprise over time, including the

strategy, objectives, business entities, technologies, procedures, purpose, jobs and

positions, processes, timings, policies, and data models (Zachman, 2009). An EA

includes and serves as the guiding principles for the RA and design of systems which

serve as part of the transformation of the organization from the current to a desired

future state. Misalignment of IT is then defined as the failure to understand

requirements in the larger context, that of the overall enterprise.

Two distinct RA capabilities were articulated: those performed in the small

context (to mitigate accidents and essence mismatches in the context of IT support for a

business process) and those performed in the large context (to mitigate mismatches of

essence in the overall context of enterprise planning and architecture of the

organization). RA in the large context is defined as an element of EA; RA in the small

context is defined as an element of SA&D. This research focused on requirements

efforts and activities within the context of SA&D.

The hypotheses derived from this literature review and in the research model

include:

Primary Research Hypothesis:

H1: Requirements efforts and activities within the context of SA&D can be validly

operationalized.

Secondary Research Hypothesis:

H2: Requirements efforts and activities within the context of SA&D are a

multidimensional construct.

From Luftman’s (2004) IT–business alignment maturity model,

measurements of the effectiveness of communications, the user’s perception of IT value

provided, the governance model employed, partnership between business and the IT

organization, the respondent’s evaluation of the scope and architecture of IT solutions

employed, and the measurement of the requisite skills to achieve alignment provided

the theoretical constructs to identify potential sub factors or dimensions. A dimension

may have significance in the SA&D context but not in the overall enterprise context or

vice versa, and there may be unique dimensions in each context. Good practices in RA

in the SA&D context may not translate well in an overall enterprise context, resulting in

an IT shop that produces high quality solutions but that fail to align with the overall

business objectives. It is possible that dimensions can be identified which appear to

correspond with Luftman’s alignment model.
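As a toy illustration only (not the instrument developed in this study), a Luftman-style assessment rates each of the six criteria just listed on the model's 1-to-5 maturity scale and summarizes them:

    # Hypothetical scoring sheet for a Luftman-style alignment
    # assessment; each criterion is rated on a 1-5 maturity scale.
    LUFTMAN_CRITERIA = ["communications", "value", "governance",
                        "partnership", "scope_and_architecture", "skills"]

    def alignment_maturity(scores):
        """Overall maturity as the mean of the six criterion ratings."""
        return sum(scores[c] for c in LUFTMAN_CRITERIA) / len(LUFTMAN_CRITERIA)

    print(alignment_maturity({
        "communications": 3, "value": 2, "governance": 3,
        "partnership": 2, "scope_and_architecture": 3, "skills": 2,
    }))  # 2.5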

Summary

In both theory and practice, RA capabilities have been considered as an element

of both the software development process and as an intrinsic component to system

implementation. As witnessed in other research areas, the overall measurements of IS

success, IT alignment, and user acceptance all demonstrate a gap between what the IT

organization strives to accomplish and what the business wishes IT to provide. This gap

in alignment, user acceptance, and IS success may in fact all be the same gap, the gap

between RA in the out-of-context development of a system or software to support a

specific business process, and the overall requirements or objectives the organization

values and strives to achieve. This disconnection between RA capabilities in a narrow

and a broad context exists in the literature stream and manifests itself in practice every

day. Articulating, measuring, and addressing this disconnection may hold the key to

achieving IT alignment, user acceptance, and ultimate IS success. Identifying and

operationalizing the dimensions of RA may reveal where our understanding of the need is incomplete, and could help develop a more

comprehensive approach to alignment of IT with the business.

CHAPTER 3

METHODOLOGY

The intent of this research is to investigate whether requirements efforts and

activities related to the development of a specific software product (requirements

analysis [RA] efforts and activities in the small context of systems analysis and design

[SA&D]) can be validly operationalized. Validation is the process of ensuring the model

is sufficiently accurate for the purpose intended (Beecham, Hall, Britton, Cottee, &

Rainer, 2005). A previously used instrument developed in 2007 for the Society for

Information Management (SIM) provided the basis to develop a refined instrument to

collect data from current senior information technology (IT) professionals who are

members of SIM.

Working with the members of the Society for Information Management Enterprise

Architecture Working Group (SIMEAWG) as an expert panel, we started with a

discussion of the 2007 study, the survey questionnaire, and the findings with a goal of

determining how the next study could fill in gaps in both the professional and academic

knowledge stream. It was determined that a deeper examination of actual practices was

necessary, bifurcating the requirements efforts and activities in a narrow context

(systems analysis and design [SA&D]) from those in the larger context (enterprise

architecture [EA]), and both from the practice of software development capabilities

(software engineering and project management). Beginning with the 2007 instrument,

an extensive literature study was conducted and a new instrument developed which

was then returned to the SIMEAWG members for feedback. This recursive process

proceeded through several iterations, and a final pilot test of the online questionnaire

was performed with all members of the SIMEAWG. The differences between the 2007

and 2012 questionnaire and instrumentation included:

1. shortening the demographic questions section regarding the respondent,

the organization, and their respective IT department

2. inclusion of the 2007 study’s software engineering and project

management questions, which were developed by Kappelman (1997) and

first used in SIM’s first Y2K study in 1996

3. adding questions about software development and maintenance practices

4. expanding the scope of the RA capabilities questions

5. adding specific questions about the organization’s EA practices (e.g., does

your organization have an EA group, do you use a digital repository for EA

artifacts)

6. dropping questions regarding the perceptions of EA

7. adding questions about the use of certain enterprise-wide requirements-

related practices and artifacts (e.g., data model, services bus)

8. revising and expanding EA and system analysis and design (SA&D)

questions from the 2007 survey in order to assess specific practices

instead of just perceptions

In summary, the focus of this instrument was to determine the state of actual

practices and capabilities and to identify specific actions taken in the performance

of requirements gathering, requirements validation, requirements documentation, and

the utilization of the requirements as distinct from system development capabilities and

practices. Moreover, RA capabilities were expanded and defined in two distinctly

different dimensions or contexts: EA, being the holistic context; and SA&D, being the

narrow context. Specifically, SA&D and EA were defined in the questionnaire as follows:

1. the purpose of SA&D is to precisely clarify and define the goals,

processes, structure, function, behavior, hardware, software, interfaces,

and data for a proposed IT system solution

2. an EA is the holistic set of artifacts describing an enterprise over time,

including the strategy, objectives, business entities, technologies,

purpose, jobs and positions, processes, timings, policies, and data models

employed; an EA includes and serves as the holistic context for SA&D.

The survey divided requirements practices and capabilities into these two

separate categories, dimensions, or contexts and summarized the difference as SA&D

being about the parts and EA about how all the parts fit together in an overall,

enterprise-wide context. The questionnaire also asked about the extent to which certain

enterprise-wide technologies are used, the extent organizations are developing their

own software, and the nature of their software development capabilities.

All of the requirements practice and capabilities questions were answered with a

5-point Likert-type scale ranging from 1 – “Strongly Disagree” to 5 – “Strongly Agree,”

with 3 being “Neutral” and a “Don’t Know/Not Applicable” option. In software process

assessment research, this approach provides the empirical information necessary to

support maturity level assignments and value realized by the organization (Goldenson,

El Emam, Herbsleb, & Deephouse, 1999).
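For illustration, a minimal sketch of how such responses could be coded for analysis (Python; the mapping and function names are hypothetical), with the "Don't Know/Not Applicable" option treated as missing rather than as a sixth scale point:

    # Hypothetical coding of the 5-point Likert-type responses; "N/A / Don't Know"
    # maps to None so it is excluded from scale statistics rather than treated as
    # an additional scale point.
    LIKERT_CODES = {
        "Strongly Disagree": 1,
        "Disagree": 2,
        "Neither Agree Nor Disagree": 3,  # labeled "Neutral" in the scale description
        "Agree": 4,
        "Strongly Agree": 5,
        "N/A / Don't Know": None,
    }

    def code_response(label):
        """Return the numeric code for a response label, or None if not codable."""
        return LIKERT_CODES.get(label.strip())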

Population and Sample

IT professionals from SIM were polled directly from the membership lists. SIM

membership numbers 3,730 IT professionals worldwide, but members are located

primarily in North America. The membership list was made available through the

SIMEAWG. Members of SIM include mostly IT executives and about 10% IT vendors,

consultants, and academics, primarily from for-profit organizations. The SIM group has

provided survey populations for many IS research studies, the earliest dating back to

the 1980s to study key issues in information systems (IS) development (Ball & Harris,

1982; Dickson, Leitheiser, Wetherbe, & Nechis, 1984). Additionally, SIM has contributed

to many studies for IT executives and researchers (Brancheau, Janz, & Wetherbe,

1996; Brancheau & Wetherbe, 1987; Luftman, 2005; Luftman et al., 2006; Niederman,

Brancheau, & Wetherbe, 1991). SIM has also contributed to several books on topics as

diverse as information management (Choo, 2002), Year 2000 activities and issues

(Kappelman, 1997), and EA (Kappelman, 2010).

Research Strategy

Similar to previous instrument validation research, a field study was conducted using an electronically delivered survey instrument. Participants were identified

from the SIM membership list, and a unique identifier was created for each participant.

An email was sent to each member with a link to automatically authenticate them

uniquely to the survey instrument. If the participant was disconnected or had to leave

the survey before completion, they would simply select the link again, and they would

be taken to where they had previously stopped.
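A minimal sketch of how such individualized, resumable links can be generated (Python; the URL and names are hypothetical and this is not the actual survey platform's mechanism):

    # Map each member's email address to a unique, non-guessable survey link so a
    # respondent can be authenticated uniquely and can resume an interrupted session.
    import secrets

    def make_links(emails, base_url="https://survey.example.edu/s"):
        return {email: f"{base_url}?token={secrets.token_urlsafe(16)}"
                for email in emails}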

The SIMEAWG was instrumental in providing subject matter expertise in

reviewing and improving the survey questions, and provided the population to pilot test

the questionnaire and the instrumentation in order to perform rudimentary analysis of

question statistical validity. Results of the pilot survey were used to assist in evaluation

of content validity, construct validity, and concurrent validity through the analysis of

correlation between scores from the pilot and the application of the general survey

(Kerlinger & Lee, 1999; Nunnally, 1975).

The Importance of Instrument Validation

Instrument validation has been a subject for discussion in positivist research for

decades, but it has seldom been addressed in Information Science (IS) research, where exploratory, qualitative, and non-empirical methodologies, which do not require the same level of validation, have dominated the field (Straub, 1989). Survey research is a quantitative method,

primarily concerned with relationships between variables or with projecting findings

descriptively to a population, particularly where research and theory are in the formative

stages (Benbasat, Goldstein, & Mead, 1987). With academic study of IS dependent

upon methods and application of instruments to produce knowledge, validation of

instruments used to collect, analyze, and interpret data is essential for the progress of

the discipline (Pinsonneault & Kraemer, 1993). As research evolves from exploratory to

confirmatory, it becomes necessary to verify the findings that research instruments produce

(Vessey & Weber, 1984). Furthermore, the quality of survey research varies

considerably among studies with explanatory studies of generally better quality than

exploratory and descriptive studies (Boudreau, Gefen, & Straub, 2001). When fully

validated instruments are available, studies can be replicated in different settings,

measuring the same research constructs in the same way (Cook, 1993). Thus, the

validation of survey questionnaire instruments used to gather data is itself an essential

contribution to the IS discipline.

Instrument Validation Process

Even though research instruments are often adopted by other researchers, the

original instrument itself often was not validated, or the adopting researcher made major

alterations to the validated instrument without retesting it, thereby losing the rigor of the

previous work (Straub, 1989). Instrument validation is essentially about determining if

the instrument measures what it is supposed to be measuring and whether different

respondents view the questions as representative of the content measured without bias

(Cronbach, 1971). Instrument validation is key to generalizing the findings and

contributing to the research stream. Content validity, construct validity, and reliability

were explored and support for internal validity was demonstrated. Construct validity,

content validity, and reliability needed to be established or the instrument risked being

deemed invalid, particularly when measuring unobservable constructs such as RA

capabilities (Godfrey & Hill, 1995).

Content Validity: RA and Design Capabilities Construct

Working with the data collected from the previous application of the survey

instrument in 2007, validation that the questions were measuring what they were

supposed to be measuring was much less subjective than would be possible with a

newly formed instrument. The previous instrument’s measurement of RA capabilities

was expanded to separate instruments for the requirements efforts and activities for

both the larger organizational objectives, represented by EA, and the smaller

application objectives, represented as SA&D.

Although SA&D was the focus of this research, as illustrated in Figure 3, a dual

scale (requirements in the small vs. large context) was developed, and the respondent

was asked to contrast the aspect of RA in either the work of software development or in

the articulation of overall software alignment with the business objective simultaneously,

thereby giving the respondent the ability to articulate a contrast (if one exists) between

the same activities in the small or large context.

For each context (SA&D and EA), my organization’s requirements, capabilities, and practices __________________

7. System Analysis and Design context:  SD   D   NAND   A   SA   N/A, DK
8. Enterprise Architecture context:     SD   D   NAND   A   SA   N/A, DK

a. are measured.
b. are benchmarked against other organizations.
c. are aligned with the organization’s objectives.
d. are highly disciplined.
e. are valued by non-IT leadership.
f. have non-IT leadership buy-in and support.
g. are characterized by communication between stakeholders and the requirements team.
h. describe our present “as is” or current environment.
i. describe our “to be” or desired environment.
j. further innovation in our organization.
k. are viewed strictly as an IT initiative.
l. improve our ability to manage risk.
m. contribute directly to the goals and objectives of our business plan.
n. are well prioritized by non-IT leadership.
o. are valued by IT leadership.
p. have IT leadership buy-in and support.
q. are well prioritized by IT leadership.
r. serve to establish stakeholder agreement on requirements.
s. serve to establish quality goals with stakeholders.
t. are highly developed and/or mature.
u. save time on projects.
v. save money on projects.
w. help us manage complexity.
x. help us manage change.

Figure 3. The research instrument operationalizing requirements practices. Each item was rated on both scales. SD = strongly disagree. D = disagree. NAND = neither agree nor disagree. A = agree. SA = strongly agree. N/A, DK = not applicable, don’t know.

The purpose of doing requirements is to provide a clear and comprehensive

picture to serve as a blueprint for organizational change defined in models (of words,

graphics, and other depictions) that describe (in both business and technology terms)

how the entity operates today and how it intends to operate in the future. We proposed

there were two separate contexts in developing requirements: (a) those associated with

describing a specific IS solution, and (b) those associated with the goals and objectives

of the overall organization. For the purposes of this survey we called these SA&D

practices and EA practices, but they may have other names in your organization.

The purpose of SA&D is to precisely clarify and define the goals, processes,

structure, function, behavior, hardware, software, interfaces, and data for a proposed IT

system solution. A system is an arrangement or collection of objects which operate

together for a common purpose. SA&D typically involves the application of software

development methods to at least the partial automation of complex human and/or

mechanical systems.

An EA is the holistic set of artifacts describing an enterprise over time, including

the strategy, objectives, business entities, technologies, procedures, purpose, jobs and

positions, processes, timings, policies, and data model(s). An EA includes and serves

as the holistic context for SA&D.

In short, SA&D is about describing the parts, and EA is about describing how all

those parts fit together. For example, the data schema for payroll is an SA&D output or

artifact while the data schema for the entire organization is an EA artifact (that

incidentally contains many SA&D artifacts).

Although a subject for future research, it was expected that this would help ascertain the degree of independence and interdependence of the two dimensions of RA capabilities: that of the small versus the large context. The questions themselves were

largely unaltered from the original 2007 instrument and had been mapped to Zachman’s

(1997) ontology, Senge’s (1990) learning organization model, DeLone and McLean’s

(2003) IS success model, Luftman’s (2004) maturity model, the Software Engineering

Institute (SEI) capability maturity model integration (CMMI; Chrissis et al., 2003), Ross’

(2003) model of EA maturity, the Government Accountability Office (GAO; Martin & Robertson, 2003) framework for EA improvement, and the Office of Management and Budget (OMB; Bellman & Rausch, 2004) framework for EA assessment. Sub factors in

factor analysis may be traced to these mappings, indicating a relationship between RA

capabilities and attributes of alignment or organizational learning. The question mapping

matrix can be seen in Table 1; each of these mappings is briefly discussed below.

The Zachman (1997) framework or enterprise ontology is one of the authoritative

models of EA and has provided a formal and highly structured way of viewing an

organization. The schema of categorizing elements by primitive questions (what, how,

where, who, when, and why) and by views or paradigms (executive, business

management, architect, engineer, technician, and the as-built organization) provides a

taxonomy for categorizing elements of an EA. Instrument questions are categorized as

to which primitive question they address, if they identify artifacts created by EA

activities, or if they assess value of performing EA functions that the organization

realizes. This addresses the capture of requirements in the larger sense of meeting the

organizational purpose and objectives, or to paraphrase Brooks (1987), whether the right tools are being developed for the organization. The Zachman (1997)

framework addresses the capture of the requirements in the larger context, associating

the essence of the overall objectives of the organization to the out-of-context

requirements.

Senge’s (1990) learning organization model identifies the five main technologies

or characteristics an organization exhibits in its development toward becoming a

learning organization. Systems thinking, personal mastery, mental models, shared

vision, and team learning together comprise the aspects developed by an organization

to be successful in a changing environment. Questions are categorized into which of

these characteristics they best fit. This aspect explores the relationship between the

maturity of the software development processes and the maturity of the organization as

a learning entity. In some cases, questions may fit more than one category, and

consequently are not categorized.

Table 1

Question Mapping Matrix

It is hypothesized that maturity as a learning organization is positively correlated with

maturity in RA.

DeLone and McLean’s (2003) model of IS success has long been a model to

provide perspectives of success for IS. Six contributing factors are identified:

information quality, system quality, service quality, intention to use/use, user

satisfaction, and net benefits. These have been further defined by DeLone and McLean

and other researchers who have used the IS success model. Questions pertaining to

each factor are marked above in Table 1. This aspect explores the relationship between

mature development processes and perceived IS success. System success has been

called the central performance variable in management information systems (MIS) research,

but the association of IS success with software development capabilities has been

identified as a relationship needing clarification (Straub, 1989; Taylor & Todd, 1995).

Luftman’s (2004) alignment maturity model identifies six alignment criteria:

communications, value measurements, governance, partnership, scope and

architecture, and skills. Depending upon the development of the identified attributes, an

organization is classified into one of five maturity categories: initial, committed, focused,

improved, and optimized. Questions pertaining to each criterion are marked, and it may

be possible through analysis of responses to judge a level of maturity from the results.

This aspect explores the relationship between the maturity of the development

capabilities and the alignment of the organization with the IT products developed.

The Carnegie Mellon SEI’s CMMI was introduced in Chapter 2 and is probably

the best known maturity model in IT, having been used in a variety of applications

(Chrissis et al., 2003). CMMI is one of the most popular process improvement training

and appraisal programs and is required by many U.S. Government contracts,

particularly those for software development. CMMI can be used to guide processes

across a project or division, and is rated according to maturity levels (initial, repeatable,

defined, quantitatively managed, and optimized). This model is the base model for

measuring software development maturity. It is focused on development capabilities in

the context of defined software development project activities, thus providing an

information system development (ISD) metric analog for RA capabilities in the small, or

to describe it in the context of Brooks (2003), in the management of accidents.

Traditionally, requirements capabilities are assessed as a component of software

engineering. CMMI defines ISD as including RA, software engineering, and project

management (Gibson et al., 2006).

Ross et al.’s (2006) model of EA maturity was developed through MIT’s Center

for Information Systems Research comprising four stages: business silos, standardized

technology, optimized core, and business modularity. Similar to the Luftman (2004)

model, this model provides an evolutionary approach to maturing EA practices of a firm.

Including this model explores the relationship between the maturity of the development

practices with the maturity of the EA practices. Unlike the Luftman model, however,

aspects of this model are more demographic in nature than practices and artifacts.

The GAO framework for assessing EA management is a stage model evolved

from the federal EA framework (FEAF; Martin & Robertson, 2003). The five stages are

awareness, foundation, development of the EA, completing the EA, and leveraging the

EA to manage change. Thus, should a respondent report the existence of either SA&D

or EA artifacts or practices, it corresponds to a higher level of maturity. Similar to the

other models, the dual context questions help analyze independence and

interdependence of RA and EA activities, artifacts, and practices.

The OMB framework for EA assessment consists of five levels of maturity (initial,

managed, utilized, results-oriented, and optimized), and thus it is similar to the GAO

framework (Bellman & Rausch, 2004; Martin & Robertson, 2003). However, OMB has

extended it to a higher level of defined maturity. Starting with the base framework of

GAO which focuses on completion of specific activities, OMB articulates more use and

results aspects, making it more of a true maturity scale rather than a completion metric

for project management activities of establishing an EA practice.

It was expected that RA capabilities in an SA&D context would be more mature

than RA capabilities in an EA context, that they would be correlated with one another,

and that any sub factors to RA capabilities would correspond to factors of IS success,

learning organizational capability, and IT alignment. The survey was generically titled as

an “Information Management Practices Study” to reduce the self-selection bias that a more technical title might induce.

Summary

“IS cannot get alignment right if we cannot get the requirements right” (as cited

in Kappelman, 2010, p. 89). Validating an instrument to operationalize requirements

efforts and activities in the SA&D context is the first step toward defining the RA

capabilities artifact and constructs. Identification of sub factors underlying RA capabilities

in different contexts can then shed light upon the challenges of alignment, user

acceptance, and IS success. The question mapping matrix in Table 1 suggests these

dimensions exist, and analysis of the collected data determined whether this hypothesis was supported.

CHAPTER 4

DATA ANALYSIS AND RESULTS

Introduction

This study focused on developing and validating an instrument to measure the

requirements efforts and activities focused on the systems analysis and design (SA&D)

activities related to a specific software application. Secondarily, and not the subject of

this research, an instrument was also developed to ascertain if requirements efforts and

activities in the smaller context of SA&D could be measured separately from the larger

enterprise architecture (EA) context. A research model was hypothesized, and an

electronic survey was developed and distributed to information technology (IT)

professionals. The following are the results of the statistical analysis of the data

collected.

The nature of the survey and demographic data of the respondents are

presented, followed by an examination of the content validity, criterion validity,

concurrent validity, predictive validity, reliability, construct validity, and internal

consistency which, taken together, supported or failed to support the first hypothesis:

requirements practices (i.e., efforts and activities) within the context of system analysis

and design (SA&D) can be validly operationalized. Next, the dimensionality of the

instrument is examined, and the results of exploratory factor analysis on the dimensions

associated with SA&D efforts and activities are presented and discussed. This then

supports or fails to support the second hypothesis: requirements efforts and activities within the

context of SA&D are a multidimensional construct.

Survey Execution and Data Collection

Qualtrics survey software was used to develop and distribute the final survey

instrument. The survey was distributed to the Society for Information Management

(SIM) membership mailing list, with completed survey results stored on a server at the

University of North Texas (UNT) Institutional Research Center. The results were

converted to Excel spreadsheets for analysis by a variety of statistical software

packages. The survey invitation was distributed by email with an embedded individual

hyperlink to connect each participant to the hosting server. The authentication by each

respondent produced anonymous but contiguous answers, supporting a variety of

analysis approaches to the individual questions and the collective whole.

Additionally, uniquely identifying each respondent ensured every respondent

could only complete one questionnaire. All respondents affirmed understanding of the

obligatory informed consent notice consisting of the title of the survey, the principal

researcher, the purpose for the study, the procedures for maintaining the confidentiality

of responses and participation records, and the UNT Institutional Review Board

approval statement before proceeding to the survey instrument. A statement of

participants’ rights was also provided to each respondent. Respondents were offered a

copy of the research findings if they were willing to provide an email address, allowing

them to see the overall findings.

The SIM provided a listing of email addresses for the entire SIM membership.

This represents a convenience sampling of the population of IT professionals. A

convenience sample was defined as drawing representative data by selecting people

because of the ease of their volunteering or selecting units because of their availability

or easy access. The advantages of this type of sampling were the availability and the

quickness with which data could be gathered. The disadvantages were the risk that the

sample might not represent the population as a whole, and it might be biased by

volunteers (Hinkle, Wiersma, & Jurs, 2003). However, it should be noted that the overall

population of IT professionals was not known, making a completely random sample

impossible to ensure. Therefore, convenience sampling was perhaps the best that

could be done.

This listing was programmed into the survey software package to provide an

email notification, a cover letter, and the link to the online survey instrument. Survey

invitations were electronically distributed, and follow-up reminders were sent to participants who had not responded. Each invitation contained an individualized link to connect the respondent to the web server. Respondents

were authenticated to ensure only one response per individual and to permit

respondents to re-connect to their survey should the connection be terminated during

their session.

Surveys were sent out on Monday, August 27, 2012, with follow-up reminders sent to

those who had not completed the survey on September 7, September 16, and

September 24, 2012. The survey closed on Sunday, September 30, 2012. A total of

3,730 invitations were sent out with 231 completing the entire survey, producing a fairly

low response rate of about 6%. This was significantly fewer than the EA work group’s

2007 survey of SIM membership, which received 377 completed responses out of 2,863

members for approximately a 13% response rate. There were 161 incomplete

responses which were not included in the analysis; demographics of these respondents

did not indicate any dissimilarities from those who completed the questionnaire. It was

expected that the response rate would be reduced by the in-depth nature of the

requirements analysis (RA; SA&D and EA) practices of the questionnaire, which was

where the majority of the respondents dropped out.
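Expressed as simple ratios, the two response rates work out to

\[
\frac{231}{3730} \approx 6.2\% \ \text{(2012)}
\qquad \text{versus} \qquad
\frac{377}{2863} \approx 13.2\% \ \text{(2007)}.
\]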

Demographics

The vast majority of respondents worked in for-profit organizations (80%), with

11% from governmental agencies and the remaining 9% from not-for-profit

organizations. Over 90% of the organizations were primarily located in the United

States, with a few respondents from Canada, Asia, South America, and Australia. From

the initial respondent group of 231, 33% held the title of director, 24% held the title of

chief information officer (CIO), 16% vice president, 3% enterprise architect, and 2% chief executive

officer (CEO). A little fewer than 20% of respondents reported themselves as holding a

title of something other than one of those listed above.

Of those responding, 25% reported to the CIO, 19% to the CEO, 12% to the chief

financial officer (CFO), 11% to a director, and 10% or fewer reported to either the chief

technology officer (CTO), a deputy director or undersecretary, or to a member of the

board of directors. Respondents reported that 39% of their top IT managers reported to

their CEO or president, with 54% responding that they reported to some other functional

or business unit executive. Other noteworthy demographics were that half the

organizations (114, or 49%) reported they had an EA practice, with the majority as a

component of the IT department (88%), and half of the respondents reported that their

firms had an organizational change management practice.

Context was an important consideration when interpreting the responses, and

respondents were asked from what perspective they would take the survey. Nearly

three quarters answered as an employee from the perspective of all business units, and

11% as an employee from the perspective of a single business or geographic unit.

Thirteen percent indicated they would respond from the point of view of a consultant or

service provider and 3% from some other viewpoint not listed. Over half (58%) of the

respondents worked with or in IT departments of greater than 50 people, and 81%

worked with firms whose gross revenue exceeded $50 million.

Surprisingly, 87% reported they had an in-house software development or

maintenance practice, and 48% responded that half or more of all software

development was performed by their in-house group. Only 24% of respondents

indicated they aspired to the Software Engineering Institute’s Capability Maturity Model

Integration (CMMI) for software development, and overall 60% assessed their own

capabilities at a Level 2 (repeatable) or greater. This compared favorably to previous

self-assessments. Tables 2–25 present demographic information collected concerning

the respondents and the characteristics of the IT shops and organizations in which they

worked.

Table 2

How Will You be Answering the Questions?

Job Function                                                                Frequency   Percent
As an employee from the perspective of all business units                         168       73%
As an employee from the perspective of a single business/geographic unit           26       11%
As a consultant or service provider                                                30       13%
Other                                                                               7        3%
Total Number of Responses                                                         231

Table 3

Organization Contains Organizational Change Management Practice

Range                         Frequency   Percent
Strongly Disagree                    34       15%
Disagree                             65       28%
Neither Agree Nor Disagree           19        8%
Agree                                83       36%
Strongly Agree                       30       13%
N/A / Don’t Know                      0        0%
Total Number of Responses           231

Table 4

Extent of Centralization for IT Budget and IT Operations in Organization

Range                         Frequency   Percent
1 (Totally Decentralized)             6        3%
2                                    10        4%
3 (Half & Half)                      52       23%
4                                    98       42%
5 (Totally Centralized)              65       28%
Total Number of Responses           231

Table 5

Does the Organization Have an EA Practice?

Response                      Frequency   Percent
Yes                                 114       49%
No                                  117       51%
Total Number of Responses           231

Table 6

Is Organization’s EA Practice a Part of the IT Department?

Response                      Frequency   Percent
Yes                                 100       88%
No                                   14       12%
Total Number of Responses           114

Table 7

Approximate Budget of Organization’s EA Practice

Range in Dollars                Frequency   Percent
Less than $100,000                     24       21%
$100,000 - $249,999                    18       16%
$250,000 - $499,999                    21       18%
$500,000 - $999,999                    13       12%
$1 million - $9.9 million              30       26%
$10 million - $49.9 million             5        4%
$50 million - $99.9 million             2        2%
$100 million - $499.9 million           0        0%
$500 million - $1 billion               0        0%
Greater than $1 billion                 1        1%
Total Number of Responses             114

Table 8

Number of Employees in the Organization’s EA Practice

Range in Numbers              Frequency   Percent
Less than 10                         84       74%
10-20                                14       12%
21-50                                 8        7%
51-100                                6        5%
Greater than 100                      2        2%
Total Number of Responses           114

Table 9

Number of Years Organization’s EA Practice Has Existed

Range in Numbers              Frequency   Percent
Less than 5                          58       51%
5-10                                 50       44%
11-15                                 4        3%
Greater than 15                       2        2%
Total Number of Responses           114

Table 10

Organization In-house Software Development/Maintenance Practice

Response                      Frequency   Percent
Yes                                 200       87%
No                                   31       13%
Total Number of Responses           231

Table 11

In-house Development Percentage of Total Software Development and Maintenance

Range in Numbers              Frequency   Percent
Less than 10                          7        3%
10-20                                24       12%
21-30                                18        9%
31-40                                15        7%
41-50                                42       21%
51-60                                12        6%
61-70                                10        5%
71-80                                35       18%
81-90                                17        9%
91-100                               20       10%
Total Number of Responses           200

Table 12

IS Specifies a Comprehensive Procedure for Software Development and Maintenance

Procedure                                             SD         D        NAND        A          SA      N/A / DK
                                                     #   %     #   %     #   %     #    %     #    %     #   %
1) Establishes stakeholder requirements agreement    4   2%   25  13%   16   8%    84   42%   71   35%   0   0%
2) Identifies IS professional training needs         8   4%   53  26%   39  20%    70   35%   30   15%   0   0%
3) Establishes quality goals with stakeholders       3   2%   30  15%   33  16%    92   46%   42   21%   0   0%
4) Estimates resource needs                          2   1%   27  14%   31  15%    98   49%   42   21%   0   0%
5) Tracks progress and resource use                  4   2%   24  12%   32  16%    84   42%   56   28%   0   0%
6) Software quality assurance                        5   2%   26  13%   26  13%    87   44%   56   28%   0   0%
7) Continuous process improvement                    8   4%   35  18%   50  25%    71   35%   36   18%   0   0%
8) Coordination and communication among
   stakeholders                                      2   1%   13   7%   24  12%   101   50%   60   30%   0   0%
9) Selects, contracts, tracks, and reviews
   software contractors                              4   2%   22  11%   29  14%    83   42%   58   29%   4   2%
10) Analyzes problems and prevents reoccurrence      2   1%   31  15%   35  18%    82   41%   48   24%   2   1%
11) Tailors process to project specific needs        1   1%   19   9%   29  14%    98   49%   52   26%   1   1%
12) Continuous productivity improvements             4   2%   38  19%   53  26%    75   38%   29   14%   1   1%
Total Responses                                    200

Note. SD = strongly disagree; D = disagree; NAND = neither agree nor disagree; A = agree; SA = strongly agree; N/A / DK = not applicable / don’t know.

Table 13

Organizational IT Department Utilization

Technology or practice                               SD         D        NAND        A         SA      N/A / DK
                                                     #   %     #   %     #   %     #   %     #   %     #   %
a) Enterprise-wide data model(s)                    24  10%   75  33%   28  12%   74  32%   26  11%    4   2%
b) Enterprise-wide process model(s)                 24  10%   75  33%   37  16%   67  29%   22   9%    6   3%
c) Enterprise-wide rules model(s)                   30  13%   94  41%   49  21%   37  16%   15   6%    6   3%
d) Enterprise-wide information service bus          25  11%   75  33%   40  17%   55  24%   24  10%   12   5%
e) Enterprise-wide scheduling model(s)              24  10%   83  36%   44  19%   54  23%   18   8%    8   4%
f) Enterprise-wide library of reusable components   25  11%   94  41%   38  16%   53  23%   15   6%    6   3%
g) Single digital repository for models of the
   business and its technologies                    36  16%   95  41%   32  14%   44  19%   19   8%    5   2%
Total Responses                                    231

Note. SD = strongly disagree; D = disagree; NAND = neither agree nor disagree; A = agree; SA = strongly agree; N/A / DK = not applicable / don’t know.

Table 14

IT Department Manager’s Supervisor

Job Function                                         Frequency   Percent
Deputy Director / Under Sec / Deputy Admin                   9        4%
Agency Director / Sec / Admin                                3        1%
Chairman of Board                                            1        1%
Chief Executive Officer / President                         90       39%
Chief Enterprise Architect                                   0        0%
Chief Financial Officer / Treasurer / Controller            56       24%
Chief Operating Officer                                     42       18%
CTO                                                          1        1%
Vice President of Business Unit / Division                  17        7%
Other                                                       12        5%
Total Number of Responses                                  231

Table 15

Number of IT Departments in Organization

Range in Numbers              Frequency   Percent
0                                     7        3%
1                                   120       52%
2                                    24       10%
3                                    11        4%
4                                     9        4%
5                                    18        8%
6                                    12        5%
7                                     3        1%
8                                     7        3%
9                                     2        1%
10                                    4        2%
11-20                                10        4%
21-50                                 2        1%
51-100                                1        1%
Greater than 100                      1        1%
Total Number of Responses           231

Table 16

Total IT Operating Budget in Last Fiscal Year

Range in Dollars                Frequency   Percent
Less than $100,000                      4        2%
$100,000 - $249,999                     3        1%
$250,000 - $499,999                     6        3%
$500,000 - $999,999                    14        6%
$1 million - $9.9 million              82       35%
$10 million - $49.9 million            62       27%
$50 million - $99.9 million            22       10%
$100 million - $499.9 million          20        8%
$500 million - $1 billion               9        4%
Greater than $1 billion                 9        4%
Total Number of Responses             231

Table 17

Number of IT Department Employees

Number Range                  Frequency   Percent
Less than 50                         97       42%
50-99                                34       15%
100-499                              62       27%
500-999                              10        4%
1,000-4,999                          18        8%
5,000-9,999                           3        1%
10,000-19,999                         5        2%
20,000-30,000                         0        0%
Greater than 30,000                   2        1%
Total Number of Responses           231

Table 18

IS Department Aspires to Software Development Practices of SEI’s CMMI for Software

Development

Range                         Frequency   Percent
Strongly Disagree                    29       13%
Disagree                             57       25%
Neither Agree Nor Disagree           59       25%
Agree                                40       17%
Strongly Agree                       17        7%
N/A / Don’t Know                     29       13%
Total Number of Responses           231

Table 19

SEI CMMI Practice Level for Organization

Level Range                   Frequency   Percent
Initial (level 1)                    52       23%
Repeatable (level 2)                 70       30%
Defined (level 3)                    48       21%
Managed (level 4)                    17        7%
Optimizing (level 5)                  4        2%
N/A / Don’t Know                     40       17%
Total Number of Responses           231

Table 20

Description of Organization

Organization Type             Frequency   Percent
Profit-making                       183       79%
Governmental                         26       11%
Not-for-profit                       22       10%
Total Number of Responses           231

Table 21

Number of Employees in Organization

Number Range                  Frequency   Percent
Less than 100                        22       10%
100-499                              42       18%
500-999                              34       15%
1,000-4,999                          56       24%
5,000-9,999                          28       12%
10,000-19,999                        12        5%
20,000-29,999                        12        5%
30,000-50,000                        14        6%
Greater than 50,000                  10        4%
Don’t Know                            1        1%
Total Number of Responses           231

Table 22

Organization’s Gross Revenue in the Last Fiscal Year

Range in Dollars              Frequency   Percent
Less than $50 million                30       13%
$50-$100 million                     18        8%
$101-$500 million                    40       17%
$501-$999 million                    23       10%
$1-$4.9 billion                      60       26%
$5-$9.9 billion                      12        5%
$10-$14.9 billion                     7        3%
$15-$24.9 billion                     9        4%
$25-$50 billion                       8        4%
Greater than $50 billion              9        4%
Don’t Know                           15        6%
Total Number of Responses           231

Table 23

Organization’s Location

Location                      Frequency   Percent
United States                       212       91%
North America                        10        4%
South America                         1        1%
Africa                                0        0%
Asia                                  4        2%
Australia                             1        1%
Europe                                3        1%
Middle East                           0        0%
Total Number of Responses           231

Table 24

Participant’s Primary Job Title

Job Title                                            Frequency   Percent
Chief Executive Officer                                      4        2%
CIO                                                         55       23%
Chief Financial Officer / Treasurer / Controller             0        0%
CTO                                                          4        2%
Director                                                    77       33%
Enterprise Architect                                         8        4%
Member of the Board                                          1        1%
Vice President                                              36       15%
Other                                                       46       20%
Total Number of Responses                                  231

Table 25

Supervisor’s Primary Job Title

Title Function                                       Frequency   Percent
Agency Director / Sec / Admin                                1        1%
Chief Executive Officer                                     43       18%
Chief Enterprise Architect                                   0        0%
Chief Financial Officer / Treasurer / Controller            27       11%
CIO                                                         57       25%
Chief Operating Officer                                     10        4%
CTO                                                          8        4%
Deputy Dir / Under Sec / Dep Admin                           6        3%
Director                                                    25       11%
Enterprise Architect                                         0        0%
Member of Board of Directors                                 3        1%
Other                                                       51       22%
Total Number of Responses                                  231

Nonresponse Bias

To assess nonresponse bias, the answers of the 161 study participants who started but did not complete the questionnaire were compared with those of the 231 who did complete it. This was done as it best represented a

potential population of respondents whose characteristics might have predisposed them

to not respond to the survey.

Only the first four questions of the survey were answered by all 161 incomplete

respondents: how the respondent will be answering the questionnaire, if their

organization has a change management practice, the extent to which the IT services

are centralized or decentralized, and if the organization has an EA practice. This gave

insight into both the individuals who did not complete the survey and the

organizations they represented. As illustrated in Tables 26–29 (represented by

percentages), there did not appear to be any significant difference in the characteristics of incomplete and completed responses.
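As an illustration of how such a comparison could be tested formally, a minimal sketch (Python) applies a chi-square test of independence to the EA-practice question in Table 29; the cell counts are reconstructed from the reported percentages and totals, so the incomplete-group counts are approximate:

    # Complete respondents: n = 231 (49% yes); incomplete: n = 161 (52% yes).
    from scipy.stats import chi2_contingency

    table = [[114, 117],  # complete:   yes, no
             [84, 77]]    # incomplete: yes, no (approximate counts)
    chi2, p, dof, expected = chi2_contingency(table)
    print(f"chi2({dof}) = {chi2:.2f}, p = {p:.3f}")  # a large p is consistent with no detectable bias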

Table 26

How Will You be Answering the Questions?

              All business   Single business/    Consultant or
              units          geographic unit     service provider   Other
Complete           72%             11%                 13%            3%
Incomplete         62%             27%                  9%            2%

Table 27

The Organization Has an Organizational Change Management Practice

              Strongly                Neither Agree              Strongly   N/A /
              Disagree    Disagree    nor Disagree      Agree    Agree      Don’t Know
Complete         15%         28%           8%            36%       13%          0%
Incomplete        9%         31%          11%            32%       15%          2%

Table 28

Degree of Centralization for IT Budget and IT Operations in the Organization

              1 (totally                     3 (half              5 (totally
              decentralized)       2         & half)       4      centralized)
Complete            3%            4%           23%        42%         28%
Incomplete          2.5%          2.5%         28%        34%         33%

Table 29

Does the Organization Have an EA Practice?

              Yes     No
Complete      49%    51%
Incomplete    52%    48%

H1: Content Validity

Validity is crucial in the development of an instrument as it illustrates the extent to

which the questions accurately measure the constructs they are intended to measure. Over 35 terms have been used to connote kinds of validity, but only three remain in common use

today: content, criterion-related, and construct (Lynn, 1986). Content validity considered

the individual items’ representativeness and comprehensiveness relative to the

targeted construct and was assessed by examining the process used to generate the

questions (Kappelman, 1995). Beginning with the original 2007 survey and using the

members of the SIM EA Work Group (SIMEAWG) as an expert panel, a recursive

process of refinement was performed. By focusing on specific questions and related

findings, the panel determined the extent to which individual items (questions) targeted

a specific aspect of a construct. This process initiated the design of a refined survey

instrument, and specific concerns were expressed and recorded. A literature review was

then conducted focusing on both these expressed concerns and on more current

research in software development methodologies. This material was incorporated into a

new questionnaire that was returned to the SIMEAWG members for feedback and

shifted focus to other areas, which informed another draft questionnaire. This process

was repeated through several iterations.

This process was finalized with a pilot test of the online questionnaire. Nine

individuals completed the pilot survey, which informed us about the structure of the questionnaire itself within the confines of the survey application used. It also helped us

to sharpen the definitions of the small and large context paradigms. These nine results

were not tabulated as the pilot exercise was to ensure that the research intent was not

affected by how the question itself was constructed within Qualtrics. Thus, face validity

was addressed with the SIMEAWG in a modified Delphi study approach, in meetings with the

working group to discuss the questionnaire, and through refinements suggested after a

pilot test of the instrument. At the conclusion of the pilot, it was the consensus of the

working group that the instrument did measure the constructs intended.

H1: Criterion Validity

There are two different kinds of criterion validity: concurrent and predictive. Criterion

validity is also referred to as concrete validity and has to do with how well the

measurements correspond with real-world criteria. Concurrent validity was established

by comparing collected responses with scores from 2007 for the same constructs,

allowing validation of the refined questions against the previous instrument.
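A minimal sketch of the kind of comparison involved (Python; the values shown are placeholders, not the study’s data):

    # Correlate per-item scores from the 2007 and 2012 administrations of the
    # shared constructs; a strong positive correlation supports concurrent validity.
    from scipy.stats import pearsonr

    item_means_2007 = [3.4, 2.9, 3.8, 3.1]  # placeholder values
    item_means_2012 = [3.5, 3.0, 3.7, 3.2]  # placeholder values
    r, p = pearsonr(item_means_2007, item_means_2012)
    print(f"r = {r:.2f}, p = {p:.3f}")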

As previously illustrated in Table 1, the questions posed to the respondents had

their roots in aspects of the Carnegie Mellon maturity model (Chrissis et al., 2003),

DeLone and McLean’s (2003) IS success model, and Luftman’s (2004) alignment

maturity model, all established research models which have demonstrated how specific

elements influence the ability of organizations to develop quality software, achieve

business success with IS, or realize overall alignment of IT with organizational

goals and objectives. Other well-known models from both practice and research were

referenced, including the Office of Management and Budget (OMB; Bellman & Rausch,

2004) EA maturity model, the Government Accountability Office (GAO; Martin &

Robertson, 2003) EA maturity framework, Ross et al.’s (2006) EA maturity framework,

Senge’s (1990) model of organizational learning, and Zachman’s (1997) framework for

EA. Previous research of these models supported predictive validity, particularly in

CMMI, one of the most researched models in IS. As previously discussed, the success

of CMMI has had great influence on research in RA, and studies confirm that software

development process maturity is positively associated with project performance, finding

that the integrity of software work products can be correlated to the integrity of the

development processes (Fitzgerald & O’Kane, 1999; Jiang et al., 2004; Jones, 2000;

Jones et al., 2011; Leffingwell, 1996). Thus, the instrument demonstrated criterion

validity.

H1: Reliability Analysis

Reliability indicates the repeatability or consistency of the measurement when

applied to different samples of the same population. A measure is regarded as reliable if

it would give the same result on repeated use, assuming what is being measured does

not change as it is measured, or that it does not change between measurements. This

is generally supported by a high Cronbach’s alpha score, which measures internal

consistency. Internal consistency is defined as the inter-correlation of test items. A

score of 0.700 or higher is considered adequate. Mean and variance

values indicative of a normal distribution also support question reliability. Tables 30 and

31 provide the reliability statistics and the summary of item statistics. Table 32 presents

the tabulated responses for RA capabilities.
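For reference, a minimal sketch of the computation (Python; it assumes the 24 SA&D items sit in a pandas DataFrame coded 1-5, with "N/A / Don’t Know" stored as missing, and the column names are hypothetical):

    import pandas as pd

    def cronbachs_alpha(items):
        """alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total score)."""
        items = items.dropna()                     # listwise deletion of incomplete cases
        k = items.shape[1]                         # number of items (24 here)
        item_vars = items.var(axis=0, ddof=1)      # per-item sample variances
        total_var = items.sum(axis=1).var(ddof=1)  # variance of the summed scale score
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    # Usage: cronbachs_alpha(responses.loc[:, "q7a":"q7x"])  # reported value: .946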

Table 30

Reliability Statistics

Cronbach’s Alpha    Cronbach’s Alpha Based on Standardized Items    Number of Items
     .946                            .948                                 24

Table 31

Summary Item Statistics

                 Mean    Minimum   Maximum   Range   Maximum /   Variance   Number of
                                                      Minimum                 Items
Item Means       3.564    2.580     4.323    1.743     1.676       .166         24
Item Variances   1.230     .794     1.661     .867     2.092       .056         24

Table 32

Organization’s Requirements, Capabilities, and Practices for SA&D Context

Procedure                                          SD         D        NAND        A          SA      N/A / DK
                                                  #    %     #   %     #   %     #    %     #    %    #   %
a) Measured                                       26  11%   70  30%   34  15%    85   37%   12    5%   4   2%
b) Benchmarked against other organizations        38  16%   95  41%   40  17%    39   17%   13    6%   6   3%
c) Aligned with the organization’s objectives      3   1%   11   5%   37  16%   117   51%   59   25%   4   2%
d) Highly disciplined                             23  10%   57  25%   59  25%    59   25%   29   13%   4   2%
e) Valued by non-IT leadership                    14   6%   46  20%   49  21%    89   38%   29   13%   4   2%
f) Have non-IT leadership buy-in and support       9   4%   32  14%   47  20%   102   44%   37   16%   4   2%
g) Characterized by effective communication
   between stakeholders and requirements team      9   4%   26  11%   49  21%   107   47%   35   15%   5   2%
h) Describe present “as is” or current
   environment                                     8   4%   28  12%   41  18%   118   51%   31   13%   5   2%
i) Describe “to be” or desired environment         8   4%   30  13%   47  20%   104   45%   38   16%   4   2%
j) Further innovation in organization             11   5%   37  16%   56  24%    86   37%   37   16%   4   2%
k) Viewed strictly as IT initiative               10   4%   58  25%   34  15%    70   30%   50   22%   9   4%
l) Improve ability to manage risk                  3   1%   27  12%   47  20%    93   41%   54   23%   7   3%
m) Contribute directly to goals and objectives
   of business plan                                4   2%   14   6%   45  19%    99   43%   63   27%   6   3%
n) Well prioritized by non-IT leadership          14   6%   68  30%   54  23%    61   26%   30   13%   4   2%
o) Valued by IT leadership                         2   1%   12   5%   19   8%    94   41%   97   42%   7   3%
p) IT leadership buy-in and support                2   1%    9   4%   14   6%    97   42%  101   43%   8   4%
q) Well prioritized by IT leadership               2   1%   17   7%   47  20%    86   37%   73   32%   6   3%
r) Serve to establish stakeholder agreement on
   requirements                                    6   3%   16   7%   36  16%   109   47%   60   25%   4   2%
s) Serve to establish quality goals with
   stakeholders                                    9   4%   28  12%   64  28%    91   39%   34   15%   5   2%
t) Highly developed and mature                    28  12%   58  25%   67  29%    47   20%   27   12%   4   2%
u) Save time on projects                           9   4%   36  16%   58  25%    85   37%   38   16%   5   2%
v) Save money on projects                         10   4%   32  14%   62  27%    86   37%   37   16%   4   2%
w) Helps manage complexity                        10   4%   28  12%   33  14%   102   44%   52   23%   6   3%
x) Helps manage change                             9   4%   30  13%   39  17%    97   42%   49   21%   7   3%
Total Responses                                  231

Note. SD = strongly disagree; D = disagree; NAND = neither agree nor disagree; A = agree; SA = strongly agree; N/A / DK = not applicable / don’t know.

The alpha score of 0.946 was consistent with the alpha score from the 2007

instrument. The item and summary statistics indicated a normal distribution, supporting

instrument reliability and internal consistency.

H1: Construct Validity

In assessing construct validity, convergent and discriminant validity must be addressed. The purpose of discriminant validity is to ensure that measurements which are supposed to be unrelated actually are, by minimizing the cross-loading within a factor. For convergent validity, all factor loadings should be above 0.50 to demonstrate that the items measure one individual factor or construct (Aladwani & Palvia, 2002). Most items loaded strongly on a single factor, with 12 of 24 questions loading above the recommended loading of 0.70, 20 of 24 with loadings above 0.60, and 23 of 24 above 0.50 (Hair, Black, Babin, Anderson, & Tatham, 2009). The complete listing of factor loadings for the forced one-factor solution appears in Table 33. The loadings of all items except question 7k demonstrated convergent validity.

Table 33

Force 1 Factor with Varimax Rotation

Question Number   Question Description                         Factor Load
7a                SA_D_Are_Measured                            .645
7b                SA_D_Are_Benchmarked                         .514
7c                SA_D_Are_Aligned                             .777
7d                SA_D_Are_Disciplined                         .764
7e                SA_D_Valued_By_Non_IT                        .684
7f                SA_D_Have_Non_IT_Buy_In                      .699
7g                SA_D_Characterized_by_Communication          .778
7h                SA_D_Describes_As_Is                         .664
7i                SA_D_Describes_To_Be                         .776
7j                SA_D_Furthers_Innovation                     .788
7k                SA_D_Viewed_As_Strictly_IT                   -.171
7l                SA_D_Improves_Risk_Management                .646
7m                SA_D_Contributes_To_Goals_and_Objectives     .687
7n                SA_D_Prioritized_by_Non_IT_Leadership        .678
7o                SA_D_Valued_by_IT_Leadership                 .604
7p                SA_D_Have_IT_Leadership_Buy_In               .612
7q                SA_D_Prioritized_by_IT_Leadership            .642
7r                SA_D_Serves_for_Stakeholder_Buy_In           .771
7s                SA_D_Establishes_Quality_Goals               .752
7t                SA_D_Are_Highly_Developed                    .784
7u                SA_D_Saves_Project_Time                      .765
7v                SA_D_Saves_Project_Money                     .759
7w                SA_D_Helps_Manage_Complexity                 .774
7x                SA_D_Helps_Manage_Change                     .766

Principal component factor analysis with a Varimax orthogonal rotation was

chosen because it is appropriate when the goal is to simplify the factors while

maximizing the variance of the loadings across the variables (Hooper, Coughlan, &

Mullen, 2008; Tabachnick & Fidell, 2007). Varimax is the most popular rotation method,

simplifying the interpretation by ideally associating each variable to one or only a few

factors (Abdi, 2003). Orthogonal rotations are the most commonly employed methods

and are preferred for data reduction (Hooper et al., 2008).
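For readers who want to reproduce this step outside SPSS, the extraction and rotation can be sketched as follows. This is a minimal illustration under stated assumptions: the random matrix merely stands in for the (231 x 24) survey responses, and the varimax routine is the standard published algorithm, not SPSS's implementation.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
X = rng.integers(1, 6, size=(231, 24)).astype(float)  # placeholder Likert data
Z = (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)      # standardize the items

pca = PCA(n_components=4).fit(Z)
# Loadings = eigenvectors scaled by the square roots of the eigenvalues.
loadings = pca.components_.T * np.sqrt(pca.explained_variance_)

def varimax(L, max_iter=100, tol=1e-6):
    """Orthogonal varimax rotation of a p x k loading matrix."""
    p, k = L.shape
    R, d = np.eye(k), 0.0
    for _ in range(max_iter):
        LR = L @ R
        u, s, vt = np.linalg.svd(
            L.T @ (LR ** 3 - LR @ np.diag((LR ** 2).sum(axis=0)) / p))
        R, d_old = u @ vt, d
        d = s.sum()
        if d_old != 0 and d / d_old < 1 + tol:
            break
    return L @ R

rotated = varimax(loadings)   # analogous to the rotated matrix in Table 39
```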

Initial communalities are estimates of the variance in each variable accounted for by all components or factors. Extraction communalities are estimates of the variance in each variable accounted for by the factors (or components) in the factor solution. Small values indicate variables that do not fit well or fail to converge with the factor solution, indicating a need for additional analysis. Communalities are provided in Table 34.
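Concretely, an item's extraction communality is the sum of its squared loadings on the retained components; with standardized items the initial communality is 1.000. Continuing the hypothetical sketch above:

```python
# Extraction communality per item: variance reproduced by the four components.
# (An orthogonal rotation such as varimax leaves communalities unchanged.)
communalities = (rotated ** 2).sum(axis=1)
```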

Table 34

Communalities

Question Number   Question Description                         Initial   Extraction
7a                SA_D_Are_Measured                            1.000     .516
7b                SA_D_Are_Benchmarked                         1.000     .328
7c                SA_D_Are_Aligned                             1.000     .654
7d                SA_D_Are_Disciplined                         1.000     .661
7e                SA_D_Valued_By_Non_IT                        1.000     .707
7f                SA_D_Have_Non_IT_Buy_In                      1.000     .720
7g                SA_D_Characterized_by_Communication          1.000     .636
7h                SA_D_Describes_As_Is                         1.000     .446
7i                SA_D_Describes_To_Be                         1.000     .616
7j                SA_D_Furthers_Innovation                     1.000     .628
7k                SA_D_Viewed_As_Strictly_IT                   1.000     .335
7l                SA_D_Improves_Risk_Management                1.000     .547
7m                SA_D_Contributes_To_Goals_and_Objectives     1.000     .476
7n                SA_D_Prioritized_by_Non_IT_Leadership        1.000     .580
7o                SA_D_Valued_by_IT_Leadership                 1.000     .806
7p                SA_D_Have_IT_Leadership_Buy_In               1.000     .840
7q                SA_D_Prioritized_by_IT_Leadership            1.000     .647
7r                SA_D_Serves_for_Stakeholder_Buy_In           1.000     .595
7s                SA_D_Establishes_Quality_Goals               1.000     .584
7t                SA_D_Are_Highly_Developed                    1.000     .672
7u                SA_D_Saves_Project_Time                      1.000     .737
7v                SA_D_Saves_Project_Money                     1.000     .743
7w                SA_D_Helps_Manage_Complexity                 1.000     .772
7x                SA_D_Helps_Manage_Change                     1.000     .715

Note: Extraction Method: Principal Component Analysis.

When evaluating communalities, scores below 0.200 are considered poor, scores of 0.200 to 0.300 fair, 0.300 to 0.400 good, 0.400 to 0.500 very good, and above 0.500 excellent (Segars & Grover, 1993). By this categorization, two scores were good, two were very good, and all the rest were excellent. Thus, the instrument demonstrated construct validity.

H2: Exploratory Analysis

Exploratory factor analysis is a data-driven technique that uses correlations between variables to form factors. In this study, the primary question was designed as a one-factor model, and most items loaded well on the one factor for the data collected. As an exploratory analysis, principal factor analysis was performed on the construct of requirements capabilities within the SA&D context, producing unexpected results. In the works cited and researched, RA capabilities were considered a unidimensional construct, so it was expected that a single primary factor would emerge from the analysis (Gilb, 2010; Leffingwell, 1996). High factor loadings, as illustrated in Table 33, indicated the presence of a single overall factor and supported unidimensionality. However, it appeared possible to operationalize four distinct dimensions or sub-factors, given that there were four eigenvalues greater than one and only 49% of the variance was explained by a single factor.

First, the Kaiser-Meyer-Olkin (KMO) measure of sampling adequacy (MSA) and

Bartlett’s test of sphericity were run to assess the suitability of the data for a factor

analysis. These results appear in Table 35. KMO measures of appropriateness in factor

analysis have been categorized in the 0.90s as marvelous, in the 0.80s as meritorious,

in the 0.70s as middling, in the 0.60s as mediocre, in the 0.50s as miserable, and below

0.50 as unacceptable (Hinkle et al., 2003). The KMO MSA value of 0.914 fell well into

the marvelous category, above the thresholds recommended for factor analysis (Hair et

al., 2009; Hair, 2006; Hooper et al., 2008; Tabachnick & Fidell, 2007).


Table 35

KMO and Bartlett's Test

KMO Measure of Sampling Adequacy                 .914
Bartlett's Test of Sphericity
  Approx. Chi-Square                             2779.838
  df                                             231
  Sig.                                           .000

Bartlett's test of sphericity tests the hypothesis that the correlation matrix is an identity matrix, that is, that all diagonal elements are 1 and all off-diagonal elements are 0, implying that all of the variables are uncorrelated (Hinkle et al., 2003). If the significance (Sig.) value for this test is less than the alpha level, the null hypothesis that the population matrix is an identity matrix is rejected. The Sig. value for this analysis led to rejection of the null hypothesis, indicating correlations in the data set appropriate for factor analysis.
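Both statistics can be computed directly from the item correlation matrix. The sketch below implements the textbook formulas rather than the SPSS procedure; the correlation matrix `R` and sample size `n` are assumed inputs.

```python
import numpy as np
from scipy import stats

def bartlett_sphericity(R: np.ndarray, n: int):
    """H0: the population correlation matrix is an identity matrix."""
    p = R.shape[0]
    chi2 = -(n - 1 - (2 * p + 5) / 6) * np.log(np.linalg.det(R))
    df = p * (p - 1) / 2
    return chi2, df, stats.chi2.sf(chi2, df)

def kmo(R: np.ndarray) -> float:
    """Kaiser-Meyer-Olkin measure of sampling adequacy."""
    inv = np.linalg.inv(R)
    d = np.sqrt(np.outer(np.diag(inv), np.diag(inv)))
    partial = -inv / d                  # partial correlations among items
    r2, q2 = R ** 2, partial ** 2
    np.fill_diagonal(r2, 0)
    np.fill_diagonal(q2, 0)
    return r2.sum() / (r2.sum() + q2.sum())

# With the study's data, kmo(R) would be near .914 and the Bartlett
# significance well below .05, consistent with Table 35.
```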

H2: Analysis of Dimensionality

It was expected that, in the pursuit of the primary research question (can requirements efforts and activities related to the development of a specific software system [in the smaller context of SA&D] be operationalized through a scale of measurement?), unidimensionality would be supported. How strong that support might be was questionable, although, as discussed previously, SA&D is discussed in the literature as a unidimensional construct (Berry & Lawrence, 1998; Cheng & Atlee, 2007; Davis, 1982; Schreiber, Nora, Stage, Barlow, & King, 2006). Concern for unidimensionality was cited by many researchers and was founded upon cautions against combining several independent measurements into a single composite scale (Kappelman, 1995). To that end, a factor analysis forcing a single factor was run, with the factor load results displayed previously in Table 33. All but one item loaded well on a single factor, with 12 of 24 questions above the recommended loading of 0.70, 20 of 24 with loadings above 0.60, and 23 of 24 above 0.50 (Hair et al., 2009). High loadings by all these questions would indicate the presence of a single overall factor. However, forcing one factor explained only 48.9% of the total variance detected, as illustrated in Table 36. Although a unidimensional construct is supported by a forced one-factor analysis, the low variance explained and the four eigenvalues greater than one strongly suggest that other forces are at work. What seemed to be present in the collected data was the representation of sub-factors or dimensions within the overall construct of RA and design capabilities.

Table 36

Total Variance Explained Force 1 Factor

            Initial Eigenvalues                     Extraction Sums of Squared Loadings
Component   Total    % of Variance   Cumulative %   Total    % of Variance   Cumulative %
1           11.745   48.936          48.936         11.745   48.936          48.936

Note: Extraction Method: Principal Component Analysis.

As previously hypothesized, and from the data collected, it would seem the possibility of being able to operationalize independent dimensions exists and should be acknowledged. Principal component analysis is a way of identifying patterns in data and expressing the data in such a way as to highlight their similarities and differences. Beginning with an analysis of the initial eigenvalues of the results collected, the eigenvalues can be compared to determine the contribution of a particular feature or component in the data set. The eigenvalues are displayed in Table 37, and the associated scree plot is illustrated in Figure 4. As clearly illustrated in the scree plot, after the fourth factor the eigenvalues fell to nearly a flat line. As illustrated in Table 37, eigenvalues beyond the fourth component fell to less than 1.00. Components with eigenvalues less than 1.00 were not considered stable, as they accounted for less variability than a single variable, and were generally not retained in the analysis (Girden & Kabacoff, 2010).
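The Kaiser criterion applied here (retain components with eigenvalues above 1.00) is straightforward to verify from the correlation matrix; a brief continuation of the hypothetical sketch introduced earlier:

```python
eigenvalues = np.sort(np.linalg.eigvalsh(np.corrcoef(Z, rowvar=False)))[::-1]
retained = eigenvalues[eigenvalues > 1.0]       # Kaiser criterion
pct = 100 * eigenvalues / eigenvalues.sum()     # % of variance per component
# In the study's data, four eigenvalues exceeded 1.00, together
# explaining 67.5% of the variance (Tables 37 and 38).
```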


Table 37

Initial Eigenvalues

Component   Total    % of Variance   Cumulative %
1           11.745   48.936           48.936
2            1.681    7.006           55.941
3            1.535    6.395           62.337
4            1.240    5.166           67.503
5             .828    3.449           70.951
6             .733    3.053           74.005
7             .686    2.858           76.863
8             .612    2.552           79.415
9             .527    2.198           81.613
10            .504    2.100           83.713
11            .495    2.063           85.775
12            .451    1.878           87.653
13            .380    1.582           89.235
14            .377    1.569           90.804
15            .342    1.425           92.229
16            .309    1.286           93.515
17            .281    1.172           94.687
18            .272    1.135           95.822
19            .246    1.024           96.846
20            .216     .901           97.747
21            .179     .746           98.493
22            .146     .607           99.100
23            .121     .505           99.605
24            .095     .395          100.000

Figure 4. Scree plot (y-axis: eigenvalue, 0 to 12; x-axis: component number, 1 to 24).

Table 38

Total Variance Explained

            Initial Eigenvalues                     Extraction Sums of Squared Loadings
Component   Total    % of Variance   Cumulative %   Total    % of Variance   Cumulative %
1           11.745   48.936          48.936         11.745   48.936          48.936
2            1.681    7.006          55.941          1.681    7.006          55.941
3            1.535    6.395          62.337          1.535    6.395          62.337
4            1.240    5.166          67.503          1.240    5.166          67.503

As illustrated in Table 38, expansion to four factors explained 67.5% of the variance; although not ideal, this represented a significant improvement over the 48.9% explained through the forced single-factor analysis (Table 36). As hypothesized, it would seem the possibility of operationalizing four dimensions or sub-factors exists and should be acknowledged. Principal component extraction using SPSS Version 21 on all 24 items was performed for 231 respondents, retaining components with eigenvalues that exceeded one. The results of the complete factor analysis on the requirements efforts and activities in an SA&D context are displayed in Table 39. Four primary factors emerged from the analysis, all with significant factor loadings. These factors were closely correlated and appeared to be sub-factors of the overarching construct of SA&D capabilities. Loadings less than 0.300 are not shown in Table 39.

Table 39

Rotated Component Matrix

Question Number   Question Description                     Component Loadings
7a                SA_D_Are_Measured                        .806
7b                SA_D_Are_Benchmarked                     .777
7c                SA_D_Are_Aligned                         .354   .508   .325   .329
7d                SA_D_Are_Disciplined                     .439   .611   .396
7e                SA_D_Valued_By_Non_IT_Management         .395   .643
7f                SA_D_Have_Non_IT_Leader_Buy_In           .491   .648
7g                SA_D_Char_By_Communications              .414   .401   .422
7h                SA_D_Describes_As_Is                     .378   .323
7i                SA_D_Describes_To_Be                     .507   .325   .338
7j                SA_D_Furthers_Innovation                 .475   .335   .441
7k                SA_D_Strictly_IT                         -.776
7l                SA_D_Improve_Risk_Management             .716
7m                SA_D_Contributes_To_Goals                .557   .370
7n                SA_D_Prioritized_By_Non_IT_Leadership    .370   .304   .405
7o                SA_D_Valued_By_IT_Leadership             .840
7p                SA_D_Have_IT_Buy_In_And_Support          .881
7q                SA_D_Prioritized_By_IT_Leadership        .728
7r                SA_D_Est_Stakeholder_Requirements        .576
7s                SA_D_Establish_Quality_Goals             .549   .445
7t                SA_D_Highly_Developed_and_Mature         .445   .632   .307
7u                SA_D_Saves_Time                          .822
7v                SA_D_Saves_Money                         .822
7w                SA_D_Helps_Manage_Complexity             .819
7x                SA_D_Helps_Manage_Change                 .760

Note: Extraction Method: Principal Component Analysis. Rotation Method: Varimax with Kaiser Normalization. Each row lists its printed loadings across components 1-4; loadings less than 0.300 are not shown.


From the 2007 survey results, questions 7h, 7i, 7j, and 7k exhibited low factor loadings from respondents. Although these loadings improved with the narrowing of the question context to focus on SA&D, these questions continued to exhibit significant cross-loading among factors. Additionally, these four questions exhibited bimodal tendencies from respondents in the 2007 survey. Interestingly, these tendencies were muted in the narrower context of SA&D and produced normal distributions. More analysis is necessary to fully appreciate the meaning of this cross-loading, which could indicate nuances or potentially other dimensions not fully articulated in this data set. However, in the analysis of the data set, it was decided to remove these four questions and re-run the factor analysis to more clearly and cleanly isolate the dimensions and possibly associate them with other IS models. This second analysis appears in Table 40.

Table 40

Second Factor Analysis

Rotational Method: Varimax

Orthogonal Transformation Matrix

        1          2          3          4
1    0.64436    0.48787    0.42580    0.40679
2   -0.51348    0.40549    0.66190   -0.36580
3   -0.55477    0.33646   -0.23684    0.72314
4   -0.11567   -0.69596    0.56964    0.42164

Rotated Factor Pattern

Question Number   Question Description        Factor1   Factor2   Factor3   Factor4
Q7u               save time                   0.83681
Q7v               save money                  0.82287
Q7w               complexity                  0.80762
Q7x               manage change               0.74623
Q7l               manage risk                 0.70918
Q7r               stake agree                 0.56050
Q7m               goals object                0.55946
Q7s               quality goals               0.50878
Q7e               valued non-IT                         0.82171
Q7f               non-IT leader buy in                  0.80419
Q7n               prioritized non-IT                    0.68616
Q7g               effective communications              0.54795
Q7p               IT buy-in                                       0.87427
Q7o               valued IT                                       0.84958
Q7q               prioritized by IT                               0.73014
Q7c               aligned org
Q7a               measured                                                  0.83285
Q7b               benchmarked                                               0.78923
Q7t               mature                                                    0.61387
Q7d               highly disciplined                                        0.57725

Note: Values less than 0.5 are not printed.

Four components clearly came through in the analysis with the removal of the

four previously discussed questions. Question 7c was the only question to load across

all four factors, which would be expected with a question asking if SA&D activities were

aligned with the business. In this second analysis, question 7c did not significantly load

on any factor.

The decision to keep an item is based upon relevant theory and practical experience. A statistical rule of thumb recommends a factor loading of 0.50 as the minimal significance threshold for retention. Questions 7g, 7k, 7m, 7r, and 7s exhibited factor loadings below 0.6000. Question 7v was analyzed in the pursuit of a parsimonious model demonstrating independent dimensions. In the structural equation model (SEM), when question 7v was included, the goodness of fit index (GFI) was 0.8600 and the root mean squared error of approximation (RMSEA) was 0.0936. When question 7v was removed, the SEM model had a GFI of 0.9198 and an RMSEA of 0.0632, as indicated in Table 42. As a statistical rule of thumb, a GFI of at least 0.9000 is indicative of a well-fitting model; therefore, the inclusion of item 7v reduced the SEM fit.

It should be noted that this pruning is exploratory and needs verification with other data sets. It is not the only approach possible, but it is an example that supports the hypothesis of more than one dimension, which was already supported by the eigenvalues greater than one, the scree plot, and the increase in variance explained by four factors. Although this pruning does support four independent dimensions, there is no reason to believe the four dimensions are in fact independent, so the pruning is merely exploratory and an example of how to measure the four dimensions independently. Further research is needed, and now that a valid measure of SA&D capabilities has been provided, such research is facilitated.

These additional questions were dropped, and the principal component

extraction was run again (see Table 41). Five questions had loading values above

0.8000, six loadings were between 0.7000 and 0.8000, three questions loaded between

0.6000 and 0.7000, and one question loaded slightly below 0.6000.

Table 41

Third Factor Analysis

Rotational Method: Varimax

Orthogonal Transformation Matrix

        1          2          3          4
1    0.58708    0.48930    0.44299    0.46870
2   -0.43406    0.34540    0.68831   -0.46745
3   -0.60970    0.52933   -0.30864    0.50281
4   -0.30853   -0.60091    0.48448    0.55587

Rotated Factor Pattern

Question Number   Question Description     Factor1   Factor2   Factor3   Factor4
Q7w               complexity               0.84275
Q7u               save time                0.78379
Q7x               manage change            0.78272
Q7l               manage risk              0.74693
Q7e               valued non-IT                      0.84831
Q7f               non-IT leader                      0.79372
Q7n               prioritized non-IT                 0.69394
Q7p               IT buy-in                                    0.88524
Q7o               valued IT                                    0.86331
Q7q               prioritized by IT                            0.72406
Q7a               measured                                               0.83257
Q7b               benchmarked                                            0.79854
Q7t               mature                                                 0.63290
Q7d               highly disciplined                                     0.59754

Note: Values less than 0.5 are not printed.

SEM is a general statistical technique for testing and estimating causal relations

using statistical data and qualitative causal assumptions (Hinkle et al., 2003). It is used

in multivariate analysis to evaluate how well the proposed model is supported by the

data. This analysis is displayed in Table 42.

Table 42

SEM Fit Indices

Index                                                    Value
N Variables                                              14
N Parameters                                             34

Absolute Index
  Chi-Square                                             131.4135
  Chi-Square DF                                          71
  Pr > Chi-Square                                        <.0001
  Root Mean Square Residual (RMSR)                       0.0625
  Standardized RMSR (SRMSR)                              0.0537
  GFI                                                    0.9198

Parsimony Index
  Adjusted GFI (AGFI)                                    0.8815
  Parsimonious GFI                                       0.7177
  RMSEA Estimate                                         0.0632
  RMSEA Lower 90% Confidence Limit                       0.0460
  RMSEA Upper 90% Confidence Limit                       0.0799

Incremental Index
  Bentler Comparative Fit Index                          0.9674
  Bentler-Bonett Normed Fit Index (NFI)                  0.9323
  Bentler-Bonett Non-normed Index                        0.9582
  Bollen Normed Index RHO1                               0.9132
  Bollen Non-normed Index Delta2                         0.9677

Beginning with the absolute fit indices, the Chi-square value is the traditional

measure for evaluating overall fit and assesses the discrepancy between the sample

and the fitted covariance matrices (Hu & Bentler, 1999). The value of the Chi-square

divided by its degrees of freedom was 1.85 (131.41 / 71), where a value of less than 2

supported a good fitting model (Schreiber et al., 2006).

The root mean square residual (RMSR) and standardized root mean square

residual (SRMSR) are the square root of the difference between the residuals of the

sample covariance matrix and the hypothesized model. Values range from zero to 1.

Well-fitted models are expected to score less than 0.05, with models of less than 0.08

deemed acceptable (Hu & Bentler, 1999). MacCallum, Browne and Sugawara (1996)

used 0.01, 0.05, and 0.08 to indicate excellent, good, and mediocre fit respectively,

using the root mean square error of approximation (RMSEA). The RMSR value of

0.0625 and the SRMSR value of 0.0537 both scored close to a well-fitted model. The

goodness of fit index value of 0.9198 also supported a well-fitted model. To address

limitations of less parsimonious models, adjusted goodness of fit, parsimonious

goodness of fit, and the RMSEA were presented. The adjusted goodness of fit value of

0.8815 was slightly below the 0.90 desired, as was the parsimonious GFI of 0.7177.

These measurements are highly dependent upon the size of the population and are

expected to increase as further analysis is done with other data sets.


It is not uncommon for fit indices to contradict one another, and a good rule is to report several indicators to demonstrate the overall fit of the model. The root mean square error of approximation (RMSEA) measures the lack of fit in a model when compared to a perfectly fitting model, with values under .06 considered a good fit (Ullman, 2006). The RMSEA was calculated to be 0.0460 at the lower 90% confidence limit and 0.0799 at the upper limit, for an overall estimate of 0.0632. In the incremental indices section, all fit measurements were calculated to be above 0.9, which was considered a good fit (Hu & Bentler, 1999).
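Two of these fit calculations are simple arithmetic on the reported values. A quick check, assuming the standard RMSEA point-estimate formula (software implementations vary slightly in their N convention):

```python
chi2, df, n = 131.4135, 71, 231

print(chi2 / df)  # normed chi-square, about 1.85 (< 2 supports a good fit)

# RMSEA point estimate: sqrt(max(chi2 - df, 0) / (df * (n - 1))).
rmsea = (max(chi2 - df, 0.0) / (df * (n - 1))) ** 0.5
print(round(rmsea, 4))  # about .061, in the neighborhood of the reported .0632
```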

As discussed previously, the process of eliciting software requirements is an exercise in aligning IT with business activities, strategies, and desired outcomes. This process of elicitation (elemental in software development capabilities) appears to have several sub-factors or dimensions, which in turn can be associated with the dimensions of alignment identified by Luftman (2004). All questions were mapped to Luftman's IS alignment model, which in turn correlated to the four proposed SA&D maturity dimensions of governance, partnership, value/competency measures, and communications. These provided the theoretical constructs to identify the potential dimensions consistent with the question mapping proposed in Figure 5. Although broader in scope than this instrument was designed to measure, Luftman's maturity model had applicability to the out-of-context SA&D activities.

Significant factors in measuring maturity in IT-business alignment would be expected to be significant in the practices of building IT systems. Moreover, where Luftman's (2004) IT-business alignment maturity is a single construct with multiple dimensions, it is consistent to find requirements efforts and activities to contain multiple dimensions as well. Figure 5 illustrates the confirmatory model, associating each question with its proposed dimension or sub-factor. Cronbach's alpha scores for each dimension (0.88, 0.82, 0.86, and 0.85) were all above the generally recommended lower limit of 0.70 (Hair et al., 2009). Tables 43, 44, 45, and 46 present the mean value, standard deviation, and rotated factor loading for each dimension of SA&D practices and the associated questions. Thus, the secondary hypothesis that requirements efforts and activities within the context of SA&D are a multidimensional construct is supported.

Figure 5. Confirmatory model associating each question with its proposed dimension of requirements capabilities: Governance (α = .88): Q7W (.84275), Q7U (.78379), Q7X (.78272), Q7L (.74693); Partnership (α = .82): Q7E (.84831), Q7F (.79372), Q7N (.69394); Communications (α = .86): Q7P (.88524), Q7O (.86331), Q7Q (.72406); Value (α = .85): Q7A (.83257), Q7B (.79854), Q7T (.63290), Q7D (.59754).

Table 43

Governance Dimension Mean, Standard Deviation, and Factor Load

Question Description                        Mean   Std. Deviation   Rotated Factor Loading
W_SA_D_Helps_Manage_Complexity              3.70   1.081            0.84275
U_SA_D_Saves_Time                           3.45   1.061            0.78379
X_SA_D_Helps_Manage_Change                  3.66   1.075            0.78272
L_SA_D_Improve_Risk_Management              3.73    .996            0.74693

Note: Cronbach α = .88

Table 44

Partnership Dimension Mean, Standard Deviation, and Factor Load

Question Description                        Mean   Std. Deviation   Rotated Factor Loading
E_SA_D_Are_Valued_by_Non_IT                 3.32   1.118            0.84831
F_SA_D_Have_Non_IT_Leader_Buy_In            3.56   1.050            0.79372
N_SA_D_Prioritized_by_Non_IT_Leadership     3.11   1.154            0.69394

Note: Cronbach α = .82

Table 45

Communication Dimension Mean, Standard Deviation, and Factor Load

Question Description                        Mean   Std. Deviation   Rotated Factor Loading
P_SA_D_Have_IT_Buy_In_And_Support           4.29   0.825            0.88524
O_SA_D_Valued_By_IT_Leadership              4.22   0.877            0.86331
Q_SA_D_Prioritized_By_IT_Leadership         3.94   0.957            0.72406

Note: Cronbach α = .86

Table 46

Factor: Value/Competency Measured

Question Description                        Mean   Std. Deviation   Rotated Factor Loading
A_SA_D_Are_Measured                         2.95   1.164            0.83257
B_SA_D_Are_Benchmarked                      2.53   1.132            0.79854
T_SA_D_Highly_Developed_and_Mature          2.94   1.197            0.63290
D_SA_D_Are_Disciplined                      3.07   1.195            0.59754

Note: Cronbach α = .85


Summary

This chapter presented the statistical analysis of the data from the survey of SIM members during August and September of 2012. The results indicated support for both hypotheses: requirements efforts and activities within the context of SA&D can be validly operationalized, and requirements efforts and activities within the context of SA&D are a multidimensional construct. In addition, these dimensions within RA capabilities could be associated with the overall maturity and alignment of IT with the business, mirroring other models of alignment maturity. Overall, the instrument demonstrated strong reliability and internal consistency. Those questions with significant cross-loading should be explored further to see whether the interactions of these dimensions are indicative of other dynamics yet to be determined; this in turn may lead to additional pruning or refinement of the instrument to further enhance its validity. This research represented only one data set and one approach, and it is recommended that the entire instrument be used for additional future validation. Further discussion of the research findings, limitations of this study, and areas of future research will be presented in the next chapter.


CHAPTER 5

DISCUSSION

Discussion of Research Findings

This research demonstrated a valid instrument to measure requirements

practices in the context of specific discrete software development activities (i.e., system

analysis and design [SA&D] practices). One of information technology’s (IT’s) most

essential capabilities can now be benchmarked and improved. Previous research

identified requirements work as a unidimensional construct. Separating requirements

into two contexts of SA&D and enterprise architecture (EA) expanded this concept,

although further research is needed to test the validity of this division. Moreover, when SA&D was validly operationalized in the narrowly defined context of software development, this research indicated the existence of four dimensions of SA&D which appeared to be related to Luftman's (2004) alignment maturity model of governance, partnership, communication, and value or competency measurement.

Theoretical Implications

The existence of dimensions to SA&D activities is significant for many reasons. First, dimensions may indicate critical success factors for requirements efforts and activities which were unknown before this time. Studies articulating critical success factors in requirements capabilities are rare, and findings are fragmented, leaving improvement in this area more an art than a science (Brown & Vessey, 1999; West, Grant, Gerush, & D'Silva, 2010). Second, poor requirements continue to be cited as among the greatest determining factors in the success of any IT project, be it development, implementation, or maintenance (Gilb, 2010; Kappelman, McKeeman, & Zhang, 2006; McKeen, 1983; Young, 2004). Sub-factors or dimensions may be indicative of antecedents which could influence the development of practices or methodologies to improve requirements practices and capabilities. Finally, the separation of requirements activities into two contexts may point to separate dimensions and/or antecedents for each, necessitating a different methodology depending upon the desired purpose. These dimensions may be critical for effective software development, may be antecedents which influence the development of requirements, or may be indicative of a requirements capability maturity which might inform IT to business alignment, information systems (IS) success, and end-user acceptance.

It was the need for improved requirements capabilities and the possibility of

articulating requirements in a large and small context which launched this research.

Valid measurement of requirements practices for development of a system has been

successfully demonstrated. Such requirements efforts and activities to develop software

or implement a specific IT solution were expected to coalesce strongly around a single

factor, mirroring previous discussions of requirement capabilities. Although

unidimensionality was supported, four sub dimensions were indicated, and it appears

that success factors or dimensions underlying requirements efforts and activities are

nuanced in the narrower context of a specific system development.

Operationalizing requirements also establishes the foundation for performance

metrics. Governance, partnership, communications, and competency are four of the six

alignment maturity attributes from Luftman’s (2004) alignment maturity model.

Requirements capabilities and alignment would be expected to correlate with one

another, and activities to increase requirements capabilities focused on improving

specific elements associated with these four attributes should produce a corresponding

improvement in alignment. This could possibly articulate a path toward improving

103

Page 115: Development and Validation of an Instrument to .../67531/metadc... · Pettit, Alex Z. Development and Validation of an Instrument to Operationalize Information System Requirements

alignment beginning with the development and implementation of tools to support the

business processes, improving IT-business alignment with every system developed and

deployed.

Critical success factors may also be articulated by this association with such

improvements. The refinement and maturity of the operational metrics employed, and IT

governance activities such as the involvement of the IT steering committees with the

requirements gathering process and the role of the IT function in the development of

business strategy, may predict problems ahead if outside certain tolerances. With the

persistent refusal of Agile development groups to measure using standard metrics,

critical success factors are more important than ever to ensure successful development

efforts and to protect jobs (Jones et al., 2011). Additionally, this expands upon Brooks’

(1987) assertion that requirements are the essence of the IS development process and

that they may in fact be the essence of IT to business alignment. Feedback is the single

greatest determinant of human behavior, and metrics provide feedback in IT (Sterman,

2002). IT performance metrics often focus on operational measurements, such as on

time and on budget (Kappelman et al., 2013). Interestingly, the dimensions of SA&D

practices appear to concentrate upon the alignment of IT and the business. Thus, IT’s

misalignment with the business may be rooted in the immaturity of requirements

capabilities. By establishing a way to improve requirements, we may also find our way

to alignment.

Limitations

This study presented the analysis from the survey given to the Society for Information Management (SIM) in 2012. Although not statistically insignificant, the low response rate, coupled with the relatively small sample of 231 respondents, suggests that additional studies with this instrument are needed for further validation.

Additionally, SIM's membership can generally be categorized as senior IT professionals, as was reflected in the demographics of the participants. Often, it is not the senior IT professionals who actively engage in the development of requirements, and they may not have the same perspective as those who do. Generally speaking, it could be that senior IT professionals view the process of eliciting requirements as a separate activity from the development of software, while those who develop software (particularly those using Agile or other requirements-focused practices) see them as one and the same activity. Contrasting these results with those of a different population of IT professionals could bring greater clarity to both the validity of the measurement of requirements analysis (RA) capabilities for a specific software development and the dimensionality of such requirements capabilities.

Analysis around demographic descriptives such as industry or location was not conducted, while other analyses (e.g., not-for-profit vs. for-profit organizations) were not possible due to the overwhelming representation of for-profit companies based in the United States. Future data sets will be used to ascertain whether dimensions in requirements efforts and activities by context also vary by organization demographics.

Another limitation was the breadth of the instrument intended to support future research. Decisions were made as to how to structure the instrument in the interest of more clearly articulating the differences between requirements efforts and activities in a small context and requirements efforts and activities in a larger context. This breadth possibly fatigued potential respondents and reduced the response rate.


Finally, in exploring the multidimensionality of SA&D practices, the process of

eliminating questions to exhibit a parsimonious model could be construed as trying to fit

the data to a predetermined model. Research with other data sets will either confirm or

refute the specific dimensions proposed, but the existence of dimensions within SA&D

and EA is worthy of additional study and could provide another lens through which to

view the age-old question of IT to business alignment.

Areas for Future Research

The gathering and analysis of additional data sets, as well as further analysis of the SIM data set, will be the first area of further research, specifically around the distinction between requirements in the small (SA&D) context and in the larger (EA) context. These will provide insight into how context (enterprise or specific software solution) matters in requirements practices, and how associated activities at the micro level (the specific system development) relate to those same activities at the macro level (the context of the entire enterprise). Additionally, dimensionality along Luftman's (2004) alignment model or Senge's (1990) learning organization model may give insight into additional sub-factors, which may or may not be the same in both contexts. Additional data collected using this same instrument from Zachman's (1997) EA, the National Association of State Chief Information Officers (CIOs), and Chinese IT organizations should be analyzed and differences explored.

Discriminant validity measurements should be analyzed to establish differences between RA practices in software development and RA practices in a larger EA context, articulating artifacts and procedures for each. It is expected that correlations between RA practices in small and large contexts will provide insight into how metrics and motivations drive both requirements capabilities and IT alignment maturity. RA capabilities are very important not only to IS development success (Brooks, 2003; McKeen & Smith, 2008) but to IT success in general (Luftman & Kempaiah, 2007). The identification of critical success factors for requirements capabilities implies that IT shops that do align and engage business process owners in their SA&D activities may have a better chance of project success, but this correlation will need to be explored further in the small and large contexts. The multidimensional aspect of RA capabilities has implications for training, required skills for IT professionals, the hiring of IT people for specific functions, and team building within IT departments and project groups.

Finally, it would be helpful to associate perceived alignment by the business with

development capabilities of the IT departments in a large context. With IT’s focus on

requirements and system development in the small context, it could be extremely

valuable to associate specific requirements capabilities to differences in the business’

perceived IT-business alignment, and thereby perhaps identify critical success factors

and chart a path of specific practices to foster greater value from and importance of IT.

Conclusion

This study supported the development and validation of a requirements practices

instrument that validly operationalized requirements efforts and activities in the smaller

context of software development activities. For practitioners, the identification of critical

success factors for requirements efforts and activities implies that IT shops that do align

and engage business process owners, users, and business subject matter experts in

their SA&D activities may have a better appreciation of the alignment gap between IT

and the business and a better chance for IT project success. Further research will be

necessary to establish the correlation between IT project success and requirements practices and capabilities in both a small and large context. Further analysis of the multidimensional aspect of RA capabilities has implications for training, required skills for IT professionals performing analysis and design activities, the hiring of IT people for specific functions, and team building within IT departments and project groups.

For academics, the identification of sub-factors could be the bridge between essence and accidents and provide more clarity to efforts to improve software

development capabilities. By dividing requirements efforts into two contexts and

potentially identifying different critical success factors for each, there may be a maturity

model implied in this work, with more mature IT organizations emphasizing business

engagement capabilities equally with build-and-run capabilities and following some

logically ordered approach to achieving both. Whether a viable maturity model can be

discerned from this research stream remains to be seen.

This research study contributes to the body of research by showing that requirements efforts and activities can be validly measured in a systems development context. It also proposes that these SA&D practices are distinct from enterprise-wide ones and associates requirements efforts and activities with the alignment of IT with the business. In the pursuit of alignment, improvements in requirements capabilities may not represent the total solution, but this research supports improvements in requirements practices in the overall context of aligning IT with business outcomes.


APPENDIX

SURVEY INSTRUMENT


REFERENCES

Abdi, H. (2003). Factor rotations in factor analyses. In M. Lewis-Beck, A. Bryman, & T. Futing (Eds.), Encyclopedia for Research Methods for the Social Sciences (pp. 792-795). Thousand Oaks, CA: Sage.

Ackoff, R. L. (1971). Towards a system of systems concepts. Management Science,

17(11), 661-671. Aladwani, A. M., & Palvia, P. C. (2002). Developing and validating an instrument for

measuring user-perceived web quality. Information & Management, 39(6), 467-476.

Alter, S. (2002). Sidestepping the IT artifact, scrapping the IS silo, and laying claim to

“systems in organizations.” Communications of the Association for Information Systems, 12, 494-526.

Armour, F., Kaisler, S., & Liu, S. (1999). A big-picture look at enterprise architectures. IT

Professional, 1(1), 35-42. Ashrafi, N. (2003). The impact of software process improvement on quality: In theory

and practice. Information & Management, 40(7), 677-690. Avison, D., & Wood-Harper, A. (1991). Information systems development research: An

exploration of ideas in practice. The Computer Journal, 34(2), 98-112. Aziz, S., Obitz, T., Modi, R., & Sarkar, S. (2006). Enterprise architecture: A governance

framework—Part II: Making enterprise architecture work within the organization. Retrieved from http://www.infosys.com/consulting/architecture-services/white-papers/Documents/EA-governance-2.pdf.

Bagozzi, R. P. (1980). Causal modeling in marketing. New York: Wiley & Sons. Ball, L., & Harris, R. (1982). SMIS members: A membership analysis. MIS Quarterly,

6(1), 19-38. Barki, H., Titah, R., & Boffo, C. (2007). Information system use-related activity: An

expanded Behavioral conceptualization of individual-level information system use. Information Systems Research, 18(2), 173-192. doi: 10.1287/isre.1070.0122

Beecham, S., Hall, T., Britton, C., Cottee, M., & Rainer, A. (2005). Using an expert

panel to validate a requirements process improvement model. Journal of Systems and Software, 76(3), 251-275.

129

Page 141: Development and Validation of an Instrument to .../67531/metadc... · Pettit, Alex Z. Development and Validation of an Instrument to Operationalize Information System Requirements

Beecham, S., Hall, T., & Rainer, A. (2003a). Building a requirements process improvement model (Technical Report TR 379). Hertforshire, UK: University of Hertfordshire Hatfield.

Beecham, S., Hall, T., & Rainer, A. (2003b). Software process improvement problems in

twelve software companies: An empirical analysis. Empirical Software Engineering, 8(1), 7-42.

Beecham, S., Hall, T., & Rainer, A. (2005). Defining a requirements process

improvement model. Software Quality Journal, 13(3), 247-279. Bellman, B., & Rausch, F. (2004). Enterprise architecture for e-government. In R.

Traunmüller, Lecture Notes in Computer Science: Electronic Government (Vol. 3183, pp. 48-56), Berlin, Germany: Springer-Verlag. doi: 10-1007/b99836

Benbasat, I., & Barki, H. (2007). Quo vadis, TAM? Journal of the Association for

Information Systems, 8(4), 211-218. Benbasat, I., Goldstein, D. K., & Mead, M. (1987). The case research strategy in studies

of information systems. MIS Quarterly, 11(3), 369-386. doi: 10.2307/248684 Benbasat, I., & Zmud, R. W. (2003). The identity crisis within the is discipline: Defining

and communicating the discipline's core properties. MIS Quarterly, 27(2), 183-194.

Berry, D. M. (2008). The software engineering silver bullet conundrum. IEEE Software,

25(2), 18-19. doi: 10.1109/MS.2008.51 Berry, D. M., & Lawrence, B. (1998). Requirements engineering. IEEE Software, 15(2),

26-29. doi: 10.1190/MS.1998.663780 Blum, B. I. (1989). A paradigm for the 1990s validated in the 1980s. Paper presented at

the Proceedings of the AIAA Conference, Monterey, CA. doi: 10.2514/MCOMP89 Boehm, B. (2006). The future of software processes. In M. L. B. Boehm & L. J.

Osterweild (Eds.), Lecture Notes in Computer Science: Unifying the Software Process Spectrum (Vol. 3840, pp. 10-24). Berlin, Germany: Springer-Verlag. doi: 10.1007-11608035_2

Boehm, B. W. (1991). Software risk management: Principles and practices. IEEE

Software, 8(1), 32-41. doi: 1109/52.62930 Bostrom, R. P., & Heinen, J. S. (1977). MIS problems and failures: A socio-technical

perspective. Part I: The causes. MIS Quarterly, 1(3), 17-32. doi: 10.2307/248710

130

Page 142: Development and Validation of an Instrument to .../67531/metadc... · Pettit, Alex Z. Development and Validation of an Instrument to Operationalize Information System Requirements

Boudreau, M. C., Gefen, D., & Straub, D. W. (2001). Validation in information systems research: A state-of-the-art assessment. MIS Quarterly, 25(1), 1-16. doi: 10.2307/3250956

Boulding, K. (1956). General systems theory: The skeleton of science. Management

Science, 2(3), 197-208. doi: 10.1287/mnsc.2.3.197 Brancheau, J. C., Janz, B. D., & Wetherbe, J. C. (1996). Key issues in information

systems management: 1994-95 SIM Delphi results. MIS Quarterly, 20(2), 225-242.

Brancheau, J. C., & Wetherbe, J. C. (1987). Key issues in information systems

management. MIS Quarterly, 11(1), 23-45. Brits, J., Botha, G., & Herselman, M. (2006). Conceptual framework for modeling

business capabilities (Unpublished master's thesis). Tshwane University of Technology, South Africa.

Brooks, F. (1987). No silver bullet: Essence and accidents of software engineering.

Computer, 20(4), 10-19. doi: 10.1109/MC.1987.1663532 Brooks, F. P. (1975). The mythical man-month: Essays on software engineering.

Reading, MA: Addison-Wesley. Brooks, F. P. (1995). The mythical man-month: Essays on software engineering,

anniversary edition. Boston, MA: Addison-Wesley. Brooks, F. P. (2003). Three great challenges for half-century-old computer science.

Journal of the ACM, 50(1), 25-26. doi: 10.1145/602382.602397 Brown, C., & Vessey, I. (1999). ERP implementation approaches: Toward a contingency

framework. Paper presented at the Proceedings of the 20th International Conference on Information Systems, Charlotte, NC.

Brown, J. S., Hagel, J., & Carr, N. (2003). Does IT matter? Letters to the editor. Harvard

Business Review, 81(7), 109-112. Browne, G. J., & Ramesh, V. (2002). Improving information requirements determination:

A cognitive perspective. Information & Management, 39(8), 625-645. doi: 10.1016/S0378-7206(02)00014-9

Byrd, T., Cossick, K., & Zmud, R. (1992). A synthesis of research on requirements

analysis and knowledge acquisition techniques. MIS Quarterly, 16(1), 117-138. doi: 10.2307/249704

131

Page 143: Development and Validation of an Instrument to .../67531/metadc... · Pettit, Alex Z. Development and Validation of an Instrument to Operationalize Information System Requirements

Capgemini. (2010). Information technology and telecommunications transfer, coordination, and modernization study. Retrieved from http://www.ok.gov/cio/documents/ITModernizationStudy01042011.pdf

Carr, N. (2003). IT doesn't matter. Harvard Business Review, 81(5), 41-49.

Carroll, J., & Swatman, P. (1998). The process of deriving requirements: Learning from practice. Proceedings of the 9th Australian Conference on Information Systems, University of NSW, Sydney, Australia.

Carver, J. (2001). Carver's Policy Governance® model in nonprofit organizations. Gouvernance: Revue Internationale, 2(1), 30-48.

Chan, Y., Sabherwal, R., & Thatcher, J. (2006). Antecedents and outcomes of strategic IS alignment: An empirical investigation. IEEE Transactions on Engineering Management, 53(1), 27-47.

Charette, R. N. (2005). Why software fails. IEEE Spectrum, 42(9), 42-49. doi: 10.1109/MSPEC.2005.1502528

Chen, P., & Pozgay, A. (2002). Architecture practice: A fundamental discipline for information systems. Paper presented at ACIS 2002 Proceedings, Australia.

Cheng, B. H. C., & Atlee, J. M. (2007). Research directions in requirements engineering. Future of Software Engineering (FOSE '07), 285-303. doi: 10.1109/FOSE.2007.17

Choo, C. (2002). Information management for the intelligent organization: The art of scanning the environment (3rd ed.). Medford, NJ: Information Today.

Chrissis, M., Konrad, M., & Shrum, S. (2003). CMMI guidelines for process integration and product improvement. Boston, MA: Addison-Wesley Longman.

Cleveland, H. (1982). Information as a resource. The Futurist, 16(6), 34-39.

Cook, T. D. (1993). A quasi-sampling theory of the generalization of causal relationships. In T. D. Cook, New directions for program evaluation: Understanding causes and generalizing about them (pp. 39-82). San Francisco, CA: Jossey-Bass. doi: 10.1002/ev.1638

Crawford, D. (2001). Forum. Communications of the ACM, 44(10), 11-12. doi: 10.1145/383845.383848

Cronbach, L. J. (1971). Test validation. In R. L. Thorndike (Ed.), Educational measurement (2nd ed., pp. 443-507). Washington, DC: American Council on Education.


Damian, D., & Chisan, J. (2006). An empirical study of the complex relationships between requirements engineering processes and other processes that lead to payoffs in productivity, quality, and risk management. IEEE Transactions on Software Engineering, 32(7), 433-453. doi: 10.1109/TSE.2006.61

Daugulis, A. (2000). Time aspects in requirements engineering: Or ‘every cloud has a silver lining.’ Requirements Engineering, 5(3), 137-143. doi: 10.1007/s007660070005

Davis, A. M., & Zowghi, D. (2006). Good requirements practices are neither necessary nor sufficient. Requirements Engineering, 11(1), 1-3. doi: 10.1007/s00766-004-0206-4

Davis, F. D. (1989). Perceived usefulness, perceived ease of use, and user acceptance of information technology. MIS Quarterly, 13(3), 319-340. doi: 10.2307/249008

Davis, G. B. (1982). Strategies for information requirements determination. IBM Systems Journal, 21(1), 4-30. doi: 10.1147/sj.211.0004

Davis, G. B. (2010). Strategies for information requirements determination. IBM Systems Journal, 21(1), 4-30.

DeLone, W. H., & McLean, E. R. (2003). The DeLone and McLean model of information systems success: A ten-year update. Journal of Management Information Systems, 19(4), 9-30.

DeMarco, T. (1979). Structured analysis and system specification. Englewood Cliffs, NJ: Prentice-Hall.

Dickson, G. W., Leitheiser, R. L., Wetherbe, J. C., & Nechis, M. (1984). Key information systems issues for the 1980's. MIS Quarterly, 8(3), 135-159. doi: 10.2307/248662

Dillon, A., & Morris, M. G. (1996). User acceptance of information technology: Theories and models. In M. Williams (Ed.), Annual review of information science and technology (Vol. 31, pp. 3-32). Medford, NJ: Information Today.

DiRomauldo, A., & Gurbaxani, V. (1998). Strategic intent for IT outsourcing. Sloan Management Review, 39(4), 67-80.

Farrell, P. S. E., & Connell, D. (2010). Organizational agility. Paper presented at the 15th International Command and Control Research and Technology Symposium, Santa Monica, CA.

Finkelstein, C. (2004). Systems development strategies for 21st century enterprises. The Data Administration Newsletter, 1-9.


Fitzgerald, B., & O'Kane, T. (1999). A longitudinal study of software process improvement. IEEE Software, 16(3), 37-45. doi: 10.1109/52.765785

Forrester, J. W. (1998). Designing the future (D-4726). Paper presented at Universidad de Sevilla, Sevilla, Spain.

Fowler, M., & Highsmith, J. (2001). The agile manifesto. Software Development, 9(8), 28-35.

Fruhling, A., & de Vreede, G. J. (2006). Field experiences with eXtreme programming: Developing an emergency response system. Journal of Management Information Systems, 22(4), 39-68. doi: 10.2753/MIS0742-1222220403

Gerush, M. (2009). Just do it: Modernize your requirements practices. Retrieved from www.forrester.com

Gerush, M. (2010). Best practices: Your ten-step program to improve requirements and deliver better software. Retrieved from http://www.forrester.com/

Gibson, D. L., Goldenson, D. R., & Kost, K. (2006). Performance results of CMMI-based process improvement. Pittsburgh, PA: Software Engineering Institute.

Gilb, T. (2010). What's wrong with requirements specification? An analysis of the fundamental failings of conventional thinking about software requirements, and some suggestions for getting it right. Journal of Software Engineering and Applications, 3(9), 827-838. doi: 10.4236/jsea.2010.39096

Girden, E. R., & Kabacoff, R. (2010). Evaluating research articles from start to finish (3rd ed.). Thousand Oaks, CA: Sage.

Glass, R. (2006). The Standish report: Does it really describe a software crisis? Communications of the ACM, 49(8), 15-16. doi: 10.1145/1145287.1145301

Godfrey, P. C., & Hill, C. W. L. (1995). The problem of unobservables in strategic management research. Strategic Management Journal, 16(7), 519-533. doi: 10.1002/smj.4250160703

Goh, S., & Richards, G. (1997). Benchmarking the learning capability of organizations. European Management Journal, 15(5), 575-583. doi: 10.1016/S0263-2372(97)00036-4

Goldenson, D., El Emam, K., Herbsleb, J., & Deephouse, C. (1999). Empirical studies of software process assessment methods. In K. El Emam & N. H. Madhavji (Eds.), Elements of software process assessment and improvement. Los Alamitos, CA: IEEE CS Press.


Goodhue, D., & Thompson, R. (1995). Task-technology fit and individual performance. MIS Quarterly, 19(2), 213-236. doi: 10.2307/249689

Gregor, S., Hart, D., & Martin, N. (2007). Enterprise architectures: Enablers of business strategy and IS/IT alignment in government. Information Technology & People, 20(2), 96-120. doi: 10.1108/09593840710758031

Greiner, L. (1972). Evolution and revolution as organizations grow. Harvard Business Review, 50(4), 1-11.

Greiner, L. (1997). Evolution and revolution as organizations grow: A company's past has clues for management that are critical to future success. Family Business Review, 10(4), 397-409. doi: 10.1111/j.1741-6248.1997.00397.x

Hair, J. F., Black, W., Babin, B., Anderson, R. E., & Tatham, R. L. (2006). Multivariate data analysis (6th ed.). Upper Saddle River, NJ: Pearson Education.

Hair, J. F., Black, W., Babin, B., Anderson, R. E., & Tatham, R. L. (2009). Multivariate data analysis (7th ed.). Upper Saddle River, NJ: Pearson Education.

Halbleib, H. (2004). Requirements management. IS Management, 21(1), 8-14.

Harter, D. E., & Slaughter, S. A. (2000). Process maturity and software quality: A field study. Paper presented at the ICIS '00 Proceedings of the Twenty-First International Conference on Information Systems, Atlanta, GA.

Hay, D. C. (2003). Requirements analysis: From business views to architecture. Upper Saddle River, NJ: Prentice Hall.

Hilkka, M. R., Tuure, T., & Matti, R. (2005). Is extreme programming just old wine in new bottles: A comparison of two cases. Journal of Database Management, 16(4), 41-61. doi: 10.4018/jdm.2005100103

Hill, P., & Turbitt, K. (2006). Combine ITIL and COBIT to meet business challenges. BMC Software.

Hinkle, D., Wiersma, W., & Jurs, S. (2003). Applied statistics for the behavioral sciences (5th ed.). Boston, MA: Houghton Mifflin.

Hirschheim, R., & Klein, H. K. (1989). Four paradigms of information systems development. Communications of the ACM, 32(10), 1199-1216.

Holland, J. H. (1999). Emergence: From chaos to order. Cambridge, MA: Perseus Books.


Hooper, D., Coughlan, J., & Mullen, M. (2008). Structural equation modelling: Guidelines for determining model fit. Electronic Journal of Business Research Methods, 6(1), 53-60.

Hu, C., Mao, X., & Ning, H. (2009). Integrating model transformation in agent-oriented software engineering. Paper presented at the Proceedings of the 2009 IEEE/WIC/ACM International Joint Conference on Web Intelligence and Intelligent Agent Technology (Vol. 2), Milan, Italy. doi: 10.1109/WI-IAT.2009.127

Hu, L. T., & Bentler, P. M. (1999). Cutoff criteria for fit indexes in covariance structure analysis: Conventional criteria versus new alternatives. Structural Equation Modeling: A Multidisciplinary Journal, 6(1), 1-55. doi: 10.1080/10705519909540118

Hui, B., Liaskos, S., & Mylopoulos, J. (2003). Requirements analysis for customizable software: A goals-skills-preferences framework. Paper presented at the RE '03 Proceedings of the 11th IEEE International Conference on Requirements Engineering, Washington, DC.

Hutchings, A. F., & Knox, S. T. (1995). Creating products customers demand. Communications of the ACM, 38(5), 72-80. doi: 10.1145/203356.203370

Jeffers, P., Muhanna, W., & Nault, B. (2008). Information technology and process performance: An empirical investigation of the interaction between IT and non-IT resources. Decision Sciences, 39(4), 703-735. doi: 10.1111/j.1540-5915.2008.00209.x

Jen, L., & Lee, Y. (2000). IEEE recommended practice for architectural description of software-intensive systems (IEEE Std 1471-2000). IEEE Architecture Working Group.

Jiang, J. J., Klein, G., Hwang, H. G., Huang, J., & Hung, S. Y. (2004). An exploration of the relationship between software development process maturity and project performance. Information & Management, 41(3), 279-288.

Jones, C. (1996). Patterns of software system failure and success. London, England: International Thomson Computer Press.

Jones, C. (2000). Software assessments, benchmarks, and best practices. Boston, MA: Addison-Wesley Longman.

Jones, C. (2002). Software quality in 2002: A survey of the state of the art. Burlington, MA: Six Lincoln Knoll Lane.

Jones, C. (2008). Measuring defect potentials and defect removal efficiency. CrossTalk: The Journal of Defense Software Engineering, 21(6), 11-13.


Jones, C., Subramanyam, J., & Bonsignour, O. (2011). The economics of software quality. Upper Saddle River, NJ: Addison-Wesley.

Jugdev, K., & Muller, R. (2005). A retrospective look at our evolving understanding of project success. Project Management Journal, 36(4), 19-31.

Jørgensen, M., & Moløkken-Østvold, K. (2006). How large are software cost overruns? A review of the 1994 CHAOS report. Software Practitioner, 16(4/5), 13-14.

Kamata, M. I., & Tamai, T. (2007). How does requirements quality relate to project success or failure? Paper presented at the RE '07 15th IEEE International Requirements Engineering Conference, Delhi, India.

Kaner, M., & Karni, R. (2004). A capability maturity model for knowledge-based decisionmaking. Information Knowledge Systems Management, 4(4), 225-252.

Kappelman, L. (1997). Year 2000 problem: Strategies and solutions from the Fortune 100. Boston, MA: International Thomson Computer Press.

Kappelman, L. A. (1995). Measuring user involvement: A diffusion of innovation perspective. Data Base Advances, 26(2-3), 65-86.

Kappelman, L. A. (2010). The SIM guide to enterprise architecture. Boca Raton, FL: CRC Press.

Kappelman, L. A., McKeeman, R., & Zhang, L. (2006). Early warning signs of IT project failure: The dominant dozen. Information Systems Management, 23(4), 31-36.

Kappelman, L., McLean, E., Luftman, J., & Johnson, V. (2013). Key issues of IT organizations and their leadership: The 2013 SIM IT trends study. MIS Quarterly Executive, 12(4).

Kappelman, L., & Salmans, B. (2008). Information management practices survey 2007 preliminary report: The state of EA: Progress, not perfection. Paper presented at the Society for Information Management Enterprise Architecture Working Group, Chicago, IL.

Kashanchi, R., & Toland, J. (2006). Can ITIL contribute to IT/business alignment? An initial investigation. Wirtschaftsinformatik, 48(5), 340-348.

Kemerer, C. F. (1993). Reliability of function points measurement: A field experiment. Communications of the ACM, 36(2), 85-97.


Kerlinger, F. N., & Lee, H. B. (1999). Foundations of behavioral research (4th ed.). Fort Worth, TX: Harcourt Brace.

Kim, C. S., & Peterson, D. K. (2003). A comparison of the perceived importance of information systems development strategies by developers from the United States and Korea. Information Resources Management Journal, 16(2), 1-18.

Kim, J. W., Kwon, J. H., Kim, Y. G., Song, C. Y., Kim, H. S., & Baik, D. K. (2006). EAFoC: Enterprise architecture framework based on commonality. Journal of Computer Science and Technology, 21(6), 952-964.

King, W. R., & Teo, T. S. H. (1997). Integration between business planning and information systems planning: Validating a stage hypothesis. Decision Sciences, 28(2), 279-308.

Kuhn, T. (1970). The structure of scientific revolutions. Chicago, IL: University of Chicago Press.

Kulonda, D., Arif, M., & Luce, J. (2003). Legacy systems: Must history repeat itself? Journal of Knowledge Management Practice, 4, 1-18.

Kurtz, C. F., & Snowden, D. J. (2004). The new dynamics of strategy: Sense-making in a complex and complicated world. IBM Systems Journal, 42(3), 462-483.

Lange, M., Mendling, J., & Recker, J. (2012). A comprehensive EA benefit realization model: An exploratory study. Paper presented at the 2012 45th Hawaii International Conference on System Science (HICSS), Maui, HI. doi: 10.1109/HICSS.2012.50

Langenberg, K., & Wegmann, A. (2004). Enterprise architecture: What aspects is current research targeting? (Technical Report IC/2004/77). Switzerland: EPFL.

Lawson, L. (2013). HealthCare.gov lesson? Ignoring integration will bite you every time. Retrieved from http://www.itbusinessedge.com/

Layman, B. (2005). Implementing an organizational software process improvement program. In R. H. Thayer & M. Dorfman (Eds.), Software engineering (Vol. 2): The supporting processes (3rd ed., pp. 279-288). Hoboken, NJ: John Wiley and Sons.

Leffingwell, D. A. (1996). A field guide to effective requirements management under SEI's Capability Maturity Model. Rational Software Corporation.

Leiblein, M. J., Reuer, J. J., & Dalsace, F. (2002). Do make or buy decisions matter? The influence of organizational governance on technological performance. Strategic Management Journal, 23(9), 817-833. doi: 10.1002/smj.259


Lucas, H. C., Jr. (1971). A user-oriented approach to systems design. Paper presented at the ACM '71 Proceedings of the 1971 26th Annual Conference, New York, NY. doi: 10.1145/800184.810503

Luftman, J. (2004). Assessing business-IT alignment maturity. Communication technology governance, 4(1), 1-51.

Luftman, J. (2005). Key issues for IT executives 2004. MIS Quarterly Executive, 4(2), 269-285.

Luftman, J., & Brier, T. (1999). Achieving and sustaining business-IT alignment. California Management Review, 42(1), 110.

Luftman, J., & Kempaiah, R. (2007). An update on business-IT alignment: "A line" has been drawn. MIS Quarterly Executive, 6(3), 13.

Luftman, J., & Kempaiah, R. (2008). Key issues for IT executives 2007. MIS Quarterly Executive, 7(2), 99-112.

Luftman, J., Kempaiah, R., & Nash, E. (2006). Key issues for IT executives 2005. MIS Quarterly Executive, 5(2), 81-99.

Lynn, M. R. (1986). Determination and quantification of content validity. Nursing Research, 35(6), 382-386.

Lyytinen, K., & Robey, D. (1999). Learning failure in information systems development. Information Systems Journal, 9(2), 85-101.

MacCallum, R. C., Browne, M. W., & Sugawara, H. M. (1996). Power analysis and determination of sample size for covariance structure modeling. Psychological Methods, 1, 130-149.

Maciaszek, L. (2007). Requirements analysis and system design (3rd ed.). New York, NY: Addison-Wesley.

Maiden, N. A., & Ncube, C. (1998). Acquiring COTS software selection requirements. IEEE Software, 15(2), 46-56.

Marsick, V. J., & Watkins, K. E. (2003). Demonstrating the value of an organization's learning culture: The dimensions of the learning organization questionnaire. Advances in Developing Human Resources, 5(2), 132-151.

Martin, R., & Robertson, E. (2003). A comparison of frameworks for enterprise architecture modeling. Proceedings of Conceptual Modeling - ER 2003, 2813, 562-564. doi: 10.1007/978-3-540-39648-2_43


Mathieson, K. (1991). Predicting user intentions: Comparing the technology acceptance model with the theory of planned behavior. Information Systems Research, 2(3), 173-191.

Matson, J. E., Barrett, B. E., & Mellichamp, J. M. (1994). Software development cost estimation using function points. IEEE Transactions on Software Engineering, 20(4), 275-287. doi: 10.1109/32.277575

Mays, R. G. (1994). Forging a silver bullet from the essence of software. IBM Systems Journal, 33(1), 20-45. doi: 10.1147/sj.331.0020

McKeen, J. D. (1983). Successful development strategies for business application systems. MIS Quarterly, 7(3), 47-65.

McKeen, J. D., & Smith, H. A. (2008). Developments in practice XXIX: The emerging role of the enterprise business architect. Communications of AIS, 2008(22), 261-274.

McNee, S. M., Riedl, J., & Konstan, J. A. (2006). Being accurate is not enough: How accuracy metrics have hurt recommender systems. Paper presented at the CHI '06 Extended Abstracts on Human Factors in Computing Systems, New York, NY. doi: 10.1145/1125451.1125659

Meeusen, W., & van den Broeck, J. (1977). Efficiency estimation from Cobb-Douglas production functions with composed error. International Economic Review, 18(2), 435-444.

Melville, N., Kraemer, K., & Gurbaxani, V. (2004). Information technology and organizational performance: An integrative model of IT business value. MIS Quarterly, 28(2), 283-322.

Mylopoulos, J., Chung, L., & Yu, E. (1999). From object-oriented to goal-oriented requirements analysis. Communications of the ACM, 42(1), 31-37.

Nelson, R. (2007). IT project management: Infamous failures, classic mistakes, and best practices. MIS Quarterly Executive, 6(2), 67-78.

Nelson, R. R. (2005). Project retrospectives: Evaluating project success, failure and everything in between. MIS Quarterly Executive, 4(3), 361-372.

Neufeldt, V., & Guralnik, D. (1997). Webster's new world college dictionary. Cleveland, OH: Macmillan.

Nguyen, L., & Swatman, P. A. (2003). Managing the requirements engineering process. Requirements Engineering, 8(1), 55-68.


Ngwenyama, O., & Nielsen, P. A. (2003). Competing values in software process improvement: An assumption analysis of CMM from an organizational culture perspective. IEEE Transactions on Engineering Management, 50(1), 100-112.

Niazi, M., Cox, K., & Verner, J. (2008). A measurement framework for assessing the maturity of requirements engineering process. Software Quality Journal, 16(2), 213-235.

Niazi, M., Wilson, D., & Zowghi, D. (2005). A maturity model for the implementation of software process improvement: An empirical study. Journal of Systems and Software, 74(2), 155-172.

Niederman, F., Brancheau, J. C., & Wetherbe, J. C. (1991). Information systems management issues for the 1990s. MIS Quarterly, 15(4), 475-500. doi: 10.2307/249452

Nolan, R. L. (1973). Managing the computer resource: A stage hypothesis. Communications of the ACM, 16(7), 399-405.

Nunnally, J. C. (1975). Psychometric theory: 25 years ago and now. Educational Researcher, 4(10), 7-21.

Nuseibeh, B., & Easterbrook, S. (2000). Requirements engineering: A roadmap. Paper presented at the ICSE '00 Proceedings of the Conference on The Future of Software Engineering, New York, NY. doi: 10.1145/336512.336523

Oh, W., & Pinsonneault, A. (2007). On the assessment of the strategic value of information technologies: Conceptual and analytical approaches. Management Information Systems Quarterly, 31(2), 3.

Orlikowski, W., & Iacono, C. (2001). Research commentary: Desperately seeking the "IT" in IT research: A call to theorizing the IT artifact. Information Systems Research, 12(2), 121-134.

Palyagar, B. (2006). A framework for validating process improvements in requirements engineering. 12th International RE, 33-36.

Paulk, M. C. (1995). The capability maturity model: Guidelines for improving the software process (Vol. 66). New York, NY: Addison-Wesley.

Pinsonneault, A., & Kraemer, K. L. (1993). Survey research methodology in management information systems: An assessment. Journal of Management Information Systems, 10(2), 75-105.

Repenning, N., Goncalves, P., & Black, L. (2001). Past the tipping point. California Management Review, 43(4), 44.


Rogerson, S., & Fidler, C. (1994). Strategic information systems planning: Its adoption and use. Information Management & Computer Security, 2(1), 12-17. doi: 10.1108/09685229410058740

Ross, J. W. (2003). Creating a strategic IT architecture competency: Learning in stages. Center for Information Systems Research Working Paper, 335, 1-15.

Ross, J. W., Weill, P., & Robertson, D. C. (2006). Enterprise architecture as strategy. Boston, MA: Harvard Business School Press.

Salmans, B. R. (2009). Accident versus essence: Investigating the relationship among information systems development and requirements capabilities and perceptions of enterprise architecture (Unpublished doctoral dissertation). University of North Texas, Denton, TX.

Sauer, C. (2004). Strategic alignment revisited: Connecting organizational architecture and IT infrastructure. Paper presented at the Proceedings of the 37th Annual Hawaii International Conference on System Sciences, Maui, HI.

Sawyer, P., Sommerville, I., & Viller, S. (1998). Improving the requirements process. Paper presented at the Fourth International Workshop on Requirements Engineering: Foundation for Software Quality, Pisa, Italy.

Schreiber, J. B., Nora, A., Stage, F. K., Barlow, E. A., & King, J. (2006). Reporting structural equation modeling and confirmatory factor analysis results: A review. The Journal of Educational Research, 99(6), 323-338.

Seddon, P. (1997). A respecification and extension of the DeLone and McLean model of IS success. Information Systems Research, 8(3), 240-253.

Segars, A. H., & Grover, V. (1993). Re-examining perceived ease of use and usefulness: A confirmatory factor analysis. MIS Quarterly, 17(4), 517-525.

Senge, P. (1990). The fifth discipline: The art and practice of the learning organization. New York, NY: Doubleday.

Shpilberg, D., Berez, S., Puryear, R., & Shah, S. (2007). Avoiding the alignment trap in IT. MIT Sloan Management Review, 49(1), 51.

Sidorova, A., Evangelopoulos, N., Valacich, J., & Ramakrishnan, T. (2008). Uncovering the intellectual core of the information systems discipline. MIS Quarterly, 32(3), 467-482.

Smith, H., & McKeen, J. D. (2006). IT in 2010: The next frontier. MIS Quarterly Executive, 5(3), 125-136.


Sommerville, I., & Ransom, J. (2005). An empirical study of industrial requirements engineering process assessment and improvement. ACM Transactions on Software Engineering and Methodology (TOSEM), 14(1), 85-117.

Sommerville, I., & Sawyer, P. (1997). Requirements engineering: A good practice guide. West Sussex, England: John Wiley & Sons.

Sowa, J., & Zachman, J. (1992). Extending and formalizing the framework for information systems architecture. IBM Systems Journal, 31(3), 590-616.

Steghuis, C., Daneva, M., & van Eck, P. (2005). Correlating architecture maturity and enterprise systems usage maturity to improve business/IT alignment. Paper presented at the First International Workshop on Requirements Engineering for Business Need and IT Alignment, Paris, France.

Sterman, J. D. (2002). All models are wrong: Reflections on becoming a systems scientist. System Dynamics Review, 18(4), 501-531.

Strassman, P. A. (1997, September). Computers have yet to make companies more productive. Computerworld.

Straub, D. W. (1989). Validating instruments in MIS research. MIS Quarterly, 13(2), 147-169.

Tabachnick, B., & Fidell, L. (2007). Profile analysis: The multivariate approach to repeated measures. In B. Tabachnick & L. Fidell, Using multivariate statistics (5th ed., pp. 311-374). Boston, MA: Pearson/Allyn & Bacon.

Tamm, T., Seddon, P. B., Shanks, G., & Reynolds, P. (2011). How does enterprise architecture add value to organisations? Communications of the Association for Information Systems, 28(1), 10.

Taylor, S., & Todd, P. A. (1995). Understanding information technology usage: A test of competing models. Information Systems Research, 6(2), 144-176.

Tuunanen, T. (2003). A new perspective on requirements elicitation methods. Journal of Information Technology Theory and Application (JITTA), 5(3), 7.

Tyson, B., Albert, C., & Brownsword, L. (2003). Interpreting capability maturity model integration (CMMI) for COTS-based systems (Report No. CMU/SEI-2003-TR-022). Pittsburgh, PA: Software Engineering Institute, Carnegie Mellon University.

Ullman, J. B. (2006). Structural equation modeling: Reviewing the basics and moving forward. Journal of Personality Assessment, 87(1), 35-50.


van der Raadt, B., Hoorn, J., & van Vliet, H. (2005). Alignment and maturity are siblings in architecture assessment. Lecture Notes in Computer Science, 3520, 357-371.

van Gigch, J. (2003). The paradigm of the science of management and of the management science disciplines. Systems Research and Behavioral Science, 20(6), 499-506.

Venkatesh, V., & Davis, F. D. (2000). A theoretical extension of the technology acceptance model: Four longitudinal field studies. Management Science, 46(2), 186.

Venkatesh, V., Morris, M. G., Davis, G. B., & Davis, F. D. (2003). User acceptance of information technology: Toward a unified view. MIS Quarterly, 27(3), 425-478.

Versteeg, G., & Bouwman, H. (2006). Business architecture: A new paradigm to relate business strategy to ICT. Information Systems Frontiers, 8(2), 91-102.

Vessey, I., & Conger, S. (1993). Learning to specify information requirements: The relationship between application and methodology. Journal of Management Information Systems, 10(2), 177-201.

Vessey, I., & Weber, R. (1984). Research on structured programming: An empiricist's evaluation. IEEE Transactions on Software Engineering, SE-10(4), 397-407.

von Bertalanffy, L. (1926). Fechner und das Problem der Integration höherer Ordnung [Fechner and the problem of higher-order integration] (Unpublished doctoral dissertation). University of Vienna, Vienna, Austria.

Wasserman, A., Freeman, P., & Porcella, M. (1983). Characteristics of software development methodologies. In T. W. Olle, H. G. Sol, & C. J. Tully (Eds.), Information systems design methodologies: A feature analysis (pp. 37-62). New York, NY: North-Holland.

Weaver, W. (1948). Science and complexity. American Scientist, 36(4), 536-544.

Weill, P. (2004). Don't just lead, govern: How top-performing firms govern IT. MIS Quarterly Executive, 3(1), 1-17.

Wendorff, P. (2002). An essential distinction of agile software development processes based on systems thinking in software engineering management. In M. Marchesi & G. Succi (Eds.), Proceedings of the 3rd International Conference on eXtreme Programming and Agile Processes in Software Engineering (XP; pp. 216-128), Alghero, Italy.

West, D., Grant, T., Gerush, M., & D'Silva, D. (2010). Agile development: Mainstream adoption has changed agility. Retrieved from http://www.heacademy.ac.uk/


Whitten, J. L., Barlow, V. M., & Bentley, L. (1997). Systems analysis and design methods. New York, NY: Irwin.

Winter, M., & Checkland, P. (2003). Soft systems: A fresh perspective for project management. Civil Engineering, 156(4), 187-192.

Wolf, T. V., Rode, J. A., Sussman, J., & Kellogg, W. A. (2006). Dispelling design as the black art of CHI. Paper presented at the Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, New York, NY. doi: 10.1145/1124772.1124853

Wolstenholme, E. (2004). Using generic system archetypes to support thinking and modelling. System Dynamics Review, 20(4), 341-356.

Xia, W., & Lee, G. (2004). Grasping the complexity of IS development projects. Communications of the ACM, 47(5), 68-74.

Young, E. (1996). Package enabled re-engineering solution definition. Ernst & Young Fusion Series. Retrieved from http://www.processence.com/

Young, R. R. (2004). The requirements engineering handbook. Boston, MA: Artech House.

Zachman, J. (1997). Enterprise architecture: The issue of the century. Database Programming and Design, 10(3), 44-53.

Zachman, J. (1999). A framework for information systems architecture. IBM Systems Journal, 38(2/3), 454-470.

Zachman, J. (2009). The Zachman Institute for Framework Architecture. Retrieved from www.zachman.com

Zachman, J. A. (1987). A framework for information systems architecture. IBM Systems Journal, 26(3), 276-292.

Zachman, J. A. (2004). The challenge is change: A management paper. Available from Information Engineering Services Pty. Ltd. via http://members.ozemail.com.au/~visible/papers/zachman2.htm

Zachman, J. A. (2007). Architecture is architecture is architecture. EIM Insight Magazine, 1(1). Retrieved from http://www.eiminstitute.org/

Zachman, J. A., & Sowa, J. (1992). Extending and formalizing the framework for information systems architecture. IBM Systems Journal, 31(3), 590-616.


Zairi, M., & Sinclair, D. (1995). Business process re-engineering and process management: A survey of current practice and future trends in integrated management. Business Process Re-engineering & Management Journal, 1(1), 8-30.
