
Source: worldcomp-proceedings.com/proc/p2011/SER3744.pdf

Nurturing Systems Thinking: An Empirically Based Framework to Improve Systems Development Processes

Arjun Vijayanarayanan and Kelly Neville Embry-Riddle Aeronautical University

Department of Human Factors and Systems 600 S. Clyde Morris Blvd.

Daytona Beach, FL

Abstract - In systems development, engineering teams tend to focus on schedule, cost, and technology components, all of which are salient elements of the system. Importance and attention must also be given to nonsalient system elements, the neglect of which may contribute to the failure of large systems engineering projects. In this effort, a framework is developed to improve quality in systems engineering processes. The framework can be used to help developers expand their attention and efforts beyond the salient aspects of the system development process. Qualitative research methods were used to identify the technical, teamwork, management, and organizational factors that can affect the quality of a system development project. More specifically, interview data were assessed in a bottom-up manner to identify emergent patterns and in a top-down manner to evaluate the relevance of Human Factors Analysis and Classification System (HFACS) factors, a framework used to improve safety in the aviation industry. Keywords: Systems development, systems thinking, qualitative research, quality

1 Background

Late in his career, accomplished computer scientist and mathematician Joseph Goguen turned his attention to the relations between social and technical factors in systems development. Goguen [8] noted that “large projects have an embarrassingly high failure rate” (p. 165), and his observation is supported by statistics produced by the Standish Group’s “CHAOS Report”, which provides information on software project failures and the factors that lead to failure. The project failure rates reported during 1994-2009 can be seen in Table 1.

The statistics are especially alarming in light of the costs of large systems and software engineering projects. For example, the US Internal Revenue Service (IRS) spent $4 billion on a system that, as described by an IRS official, does “not work in the real world” [16], and the US Federal Bureau of Investigation spent $170 million on a system overhaul that it ultimately abandoned [5]. Other well-known system development failures involved upgrades to the London emergency dispatch system [7] and the US air traffic control system [4].

Table 1

The Standish Group’s “CHAOS Report” Showing Information Technology (IT) Project Failures and Successes from 1994-2009

Year   Failed Projects   Challenged Projects   Successful Projects
1994   31%               53%                   16%
1996   40%               33%                   27%
1998   28%               46%                   26%
2000   23%               49%                   28%
2004   18%               53%                   29%
2006   19%               46%                   35%
2009   24%               44%                   32%

Note. Adapted from [6].
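The three outcome categories in Table 1 partition all surveyed projects, so each year's row should sum to 100%. A minimal Python sketch (the data-structure and function names below are ours, not the Standish Group's) makes that consistency check explicit:

```python
# Table 1 as reproduced above. Each row: (year, failed %, challenged %, successful %).
CHAOS_ROWS = [
    (1994, 31, 53, 16),
    (1996, 40, 33, 27),
    (1998, 28, 46, 26),
    (2000, 23, 49, 28),
    (2004, 18, 53, 29),
    (2006, 19, 46, 35),
    (2009, 24, 44, 32),
]

def check_rows(rows):
    """Verify each year's outcome percentages partition all projects (sum to 100)."""
    return all(failed + challenged + ok == 100 for _, failed, challenged, ok in rows)

def failure_trend(rows):
    """Return (year, failed %) pairs, e.g. to confirm the failure rate stays high."""
    return [(year, failed) for year, failed, _, _ in rows]
```

Running the check over the reproduced rows confirms that each year's percentages account for all projects surveyed.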

Goguen [9] implicates the neglect of human and social aspects of systems during engineering projects as a root cause of this very high failure rate. Additional evidence of this tendency can be found in the effort to add a human view to the Department of Defense Architectural Framework (DoDAF) [1], [3], [10]. Similarly, there is no accepted engineering practice, procedure, or modeling formalism for designing user interfaces that support the work and work practices of the users. Robert Hoffman, a human factors methodologist, notes that popular software and system engineering approaches (e.g., waterfall, spiral, and agile development frameworks) are noticeably lacking in guidance for integrating technology designs with their users and use environments (for example, see [11]). In the following sections we discuss two approaches that illustrate the importance of the social aspects of systems: Goguen’s ethnomethodological approach and the systems thinking approach.

1.1 Goguen’s Ethnomethodological Approach

Goguen used ethnomethodology as a tool to study the social aspects of a system. Ethnomethodology is a method for understanding the ways in which culture shapes and imbues the work practices, goals, and priorities of people in an organization by analyzing their accounts of their day-to-day experiences, their work artifacts, and their communications. It is a descriptive method; it does not explain or evaluate the culture under study. Goguen [9] states, “Ethnomethodology tries to reconcile radical empiricism with the situatedness of social data, by looking closely at how competent members of a group actually organize their interactions” (p. 10).

Goguen uses ethnomethodology to explain the principles of accountability and orderliness. According to the principle of accountability, the members of a group are held accountable for their actions depending on where their group is placed in society or, in our context, in the organization. The behavior and interaction of members of a work group are therefore constrained by the nature of the accountability imposed on the group. According to the principle of orderliness, social interaction and behavior are orderly and can be understood with respect to contextual and cultural constraints. A group’s interaction can thus only be fully understood in the context of that particular group, which is the essence of its ‘qualities of situatedness’. To understand the nature of a system, then, the work group should be analyzed. Failure to take this human side of the system into account leads to system designs that are poorly matched to an organization’s work culture and practices. Ethnomethodological approaches consequently play a vital role in the development and achievement of systems thinking, an important systems engineering perspective that is explained in the next section.

1.2 Systems Thinking Approach

According to the International Council on Systems Engineering (INCOSE) [13], a system is “a construct or collection of different elements that together produce results not obtainable by the elements alone. The elements, or parts, can include people, hardware, software, facilities, policies, and documents; that is, all things required to produce systems-level results" (“Definition of a System”, para. 1). Note that people are included in this definition as a core system element. Thus, although people and related social factors tend to be neglected by systems developers, the systems engineering profession identifies them as key factors to be addressed. The profession emphasizes the importance of systems thinking: the consideration of a system in a holistic way, in terms of all its components and their relationships. To help developers engineer systems in ways consistent with the goal of systems thinking, a framework called the Systems Thinking Framework for Systems Development (STFSD) is being developed.

The idea of the systems thinking framework came from the Human Factors Analysis and Classification System (HFACS) [22]. HFACS is a safety assessment framework widely used in aviation accident investigation because it helps analysts consider the latent failures within a web of causal influences. A similar framework in systems development could help developers monitor whether they are engineering the system in a balanced way; that is, whether they are comprehensively addressing the factors that affect the quality of a system development team’s work and whether they are using processes, methods, tools, etc. that support a balanced and holistic approach. The framework should help developers attend to development activities that address human aspects of systems as well as they attend to the technical aspects. The framework’s development is rooted in the premise that helping developers attend comprehensively to development activities, including activities that address the human (or social) aspects of systems, will lead to products in which humans are well integrated.

2 Method

2.1 Data Collection

In a recent issue of the journal Systems Engineering, Valerdi and Davidz [21] lament the lack of empirical research in the field of systems engineering. Among the methods they advocate for studying systems engineering are semi-structured and unstructured interviews, qualitative research methods that were used in the present research because they produce rich, relatively unbiased (i.e., less influenced by the researcher) data. This empirical research was conducted by analyzing the transcripts of six semi-structured interviews with experienced systems engineers.

The semi-structured interviews analyzed for this research were collected by Hoffman, Neville, and Fowlkes [12]. Hoffman et al. analyzed these data to identify challenges faced by systems engineering teams; in the present work, the data were re-analyzed to identify systems development practices, activities, and other factors involved in the conduct of a development effort. The identified factors serve as an initial solution for the proposed framework.

Hoffman et al. conducted semi-structured interviews with six systems development practitioners. Four were trained as software engineers and the other two as systems engineers. Five of the engineers had 20 or more years of experience; one had 15 years. Each engineer was asked to recall a challenging engineering project and to talk through it, describing the challenges faced and the strategies used to overcome them. This approach is based on the Critical Decision Method of knowledge elicitation [15]. Interviews were approximately two hours in duration; they were audio recorded and subsequently transcribed.

2.2 Data Analysis

To analyze the interviews, the transcripts were broken down into data elements, where each element represented a low-level unit of information—a specific idea, claim, or description. Data analysis methods focused on identifying development teams’ goals, activities, concerns, and other influences on development work. Data analysis was conducted using bottom-up and top-down coding approaches. The top-down approach allowed us to evaluate the relevance of organizational factors used in HFACS; the bottom-up approach allowed us to identify factors that seem to be suggested by the data.

To conduct the bottom-up coding, the data were first reviewed broadly to identify emergent themes. An initial set of data-driven codes (categories of information found in the data themselves) was developed based on these early themes. Part of the data was coded using this initial set; as the codes were used, they were adapted to achieve a better fit with the patterns in the data. The adapted set was then used to code all the data. These codes were relatively stable but nevertheless continued to evolve throughout the process: when a data element could not be coded using the existing bottom-up codes, a new code was identified and added to the set. Data-driven codes fell into categories that included project management, work conditions, and teamwork strategies.
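The iterative bottom-up procedure can be sketched as a simple loop. This is an illustrative abstraction, not the authors' tooling: `match` stands in for the analyst's judgment about whether an existing code fits a data element, and the minted code names are placeholders.

```python
def code_elements(elements, match, initial_codes):
    """Bottom-up coding sketch: assign each data element an existing code,
    or mint a new data-driven code when none fits.

    `match(element, codes)` returns a fitting code or None; here it is a
    placeholder for the analyst's judgment."""
    codes = list(initial_codes)
    assignments = {}
    for element in elements:
        code = match(element, codes)
        if code is None:                      # no existing code fits ...
            code = f"new-code-{len(codes)}"   # ... so extend the code set
            codes.append(code)
        assignments[element] = code
    return assignments, codes
```

The returned code set is larger than the initial one whenever the data contained information the initial codes did not anticipate, mirroring how the emergent tier described in Section 3.2 arose.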

For the top-down coding, HFACS categories representing organizational influences were used as codes. HFACS-based codes were dropped if they were not found to be relevant to any interview data. The HFACS categories used as codes included factors such as organizational process, organizational climate, and resource management.

Three coders (Coders A, B, and C) were trained to code the data. Coder A performed both the data-driven coding and the coding using HFACS categories; Coder B performed only the data-driven coding and Coder C only the HFACS coding. To assess the reliability of the coding, the percent agreement between the coders in each pair was determined. Differences in codes were reconciled by Coder A, who used the second set of codes in each pair to help him consider alternative interpretations and perspectives.
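The reliability measure used here, percent agreement, is the share of data elements to which both coders in a pair assigned the same code. A minimal sketch (the function name is ours, not from the paper):

```python
def percent_agreement(codes_a, codes_b):
    """Inter-coder reliability as simple percent agreement: the percentage of
    data elements to which both coders assigned the same code."""
    assert len(codes_a) == len(codes_b), "coders must code the same elements"
    matches = sum(a == b for a, b in zip(codes_a, codes_b))
    return 100.0 * matches / len(codes_a)
```

For example, two coders agreeing on three of four elements yields 75.0%; the values reported in Section 3 (74.71% and 81.81%) were computed in this spirit over all coded data elements.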

3 Results and Discussion

Percentages of data elements assigned the same code by both coders in a pair were 74.71% for data-driven codes and 81.81% for HFACS codes. The final sets of codes that emerged from the coding analyses became the factors that comprise the first version of the systems thinking framework.

3.1 Organization of the Framework

The Systems Thinking Framework for Systems Development consists of four tiers: organizational influences, project management, team processes, and technical activities. Each tier has a set of factors associated with it, and the factor external pressures envelops all four tiers. The hierarchical representation of the framework shown in Figure 1 specifies four major sources of influence on the technical activities tier: changes and associated pressures in the three tiers above it (organizational influences, project management, and team processes), together with external pressures. These sources of influence are organized in the framework according to the extent to which they interface directly with, and directly impact, the technical activities. The structure represents the way in which different layers of an organization can influence how work is conducted, a concept borrowed from the HFACS framework.
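As a hypothetical sketch, the tier ordering just described can be captured in a small data structure, with external pressures kept outside the tier list because it envelops every tier rather than sitting within one. The structure and names below are illustrative assumptions, not an artifact of the study.

```python
# Sketch of the STFSD structure: four tiers ordered from least to most
# direct contact with technical activities; external pressures envelops all.
STFSD = {
    "envelope": "external pressures",
    "tiers": [                       # outermost (least direct) to innermost
        "organizational influences",
        "project management",
        "team processes",
        "technical activities",
    ],
}

def influence_distance(tier):
    """How many tiers separate `tier` from the technical activities it shapes."""
    tiers = STFSD["tiers"]
    return tiers.index("technical activities") - tiers.index(tier)
```

Under this representation, organizational influences sit three tiers removed from the technical work they ultimately shape, while team processes interface with it directly.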


3.2 The Emergent Tier

When people think of the structure of a system development organization, it is common to think of an organization led by executives, in which project managers help oversee or coordinate the technical activities. Support for this conceptualization was found in the interviews; however, the data suggested a fourth, unanticipated tier. In particular, the data contained a great deal of information regarding teamwork. Teamwork taxonomies by [17] and [20] were therefore used to perform a finer-grained analysis of the teamwork data. From this analysis, the codes listed in Table 2 emerged as important.

Table 2

List of Added Codes

Team Coordination and Communication
Mission Analysis, Formulation, and Planning
Goal Specification
Team Monitoring and Back-Up Behavior
Coordination
Conflict Management
Initiative
Information Exchange
Build Common Ground and Awareness

Figure 1 presents the high-level factors derived from the coding analysis. All codes used in the analysis are included as factors in the framework, provided they mapped to data in at least one interview. Codes that mapped to data in only one or two of the interviews were nevertheless kept in the framework; weak evidence for them in this particular study does not disprove their relevance to systems engineering. Future research can shed more light on their potential roles and, in the meantime, their inclusion may enrich the framework’s support for systems thinking. Factors supported by mappings to data in three or more of the interviews are considered more firmly established.
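The inclusion rule just described amounts to a small classification by support count. A hedged sketch (the status labels are ours, chosen to mirror the dashed-line convention used in Figure 1):

```python
def factor_status(support_count):
    """Classify a candidate factor by the number of interviews containing
    supporting data, per the inclusion rule described above."""
    if support_count < 1:
        return "excluded"      # no mapping to any interview: not a factor
    if support_count <= 2:
        return "tentative"     # kept, but drawn with dashed lines in Figure 1
    return "established"       # supported by three or more interviews
```

Applying the rule, a factor seen in a single interview is retained as tentative, while one seen in four interviews counts as firmly established.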


Figure 1. The Systems Thinking Framework for Systems Development (STFSD). Note. Dashed lines around nodes indicate that a factor was found in only one or two of the six interviews. All other factors were found in at least three of the interviews.

3.3 Encouraging Attention to Human Aspects of Systems

A main goal behind developing the systems thinking framework was to help developers attend to the large number of factors that can influence the outcome of their work and, especially, the factors that can influence how well a development team attends to social aspects of systems and to the integration of social and technical aspects. To this end, framework factors that can contribute directly to the integration of social and technical aspects have been identified post hoc. These factors consist of ‘Acquire Domain Knowledge’ on the ‘Technical Activities’ level, all teamwork strategies identified under ‘Team Processes’, and the factor ‘Complexity Management’ under the ‘Project Management’ node. Justification for selecting these particular factors is as follows:

Acquire Domain Knowledge – Acquiring domain knowledge is a type of technical work that involves learning about the environment and activities in which a system will be used. Acquiring Domain Knowledge is essential to both understanding the human aspects of a system and integrating them with the technical aspects. This component of systems development is where Goguen’s recommendation for using ethnomethodology can be brought to bear.

Teamwork strategies – In the interviews, teamwork strategies were typically cited as supporting teamwork among development team members as they worked on a system’s technical components; the same strategies could facilitate teamwork between members working on technical and human components of a system. Because people working on technical and human system aspects often are trained in different disciplines guided by differing systems development philosophies, the teamwork strategies may be especially valuable—even vital—to the success of their working relationships.

Complexity Management - Complexity management encompasses strategies that are used to either reduce or cope with the complexity of a system being developed or of the systems development process. When the human aspects of a system are considered, their consideration tends to introduce a great deal of complexity into the development process and those aspects furthermore represent complexity that is often ignored within the system being developed. Consequently, complexity management activities may be critical to the success of a team that addresses a system’s human elements.

Notably, with very few exceptions, the interview data did not address human aspects of systems or human-technology integration. Similarly, the interviewees tended not to relate the three factors listed above to human-technology integration. For example, a domain analysis of the environment and activities in which a system would be used was described in only one interview. One other interviewee used the term ‘domain analysis’, but in the context of gaining an understanding of the engineering domain and engineering precedents. The teamwork factors, when described by the interviewees, tended to relate to conflicts, tensions, team monitoring, information exchange, and shared understanding among team members focused on technical aspects of a system. Complexity management typically involved technology decomposition and technology requirements management.

Support at the organizational and project management levels of the systems thinking framework would likely be required for the three factors above to be used in support of human-focused development activities and human-technology integration. Future research could evaluate this hypothesis, and future versions of the systems thinking framework may further develop the roles and influence of these levels. More human-related activities may also be added to the framework as additional research is conducted and the relationships between specific activities and the probability of a successful outcome are better understood. Generally speaking, the integration of human and technology elements tends to occur well when development work includes opportunities for human and technology elements to shape each other (for example, see [2], [18]).

4 Conclusions

This research involved the development of a framework for drawing attention to both the straightforward factors and the underlying, latent, more abstract factors that can affect a system development project’s quality and chances for success. The framework makes explicit the influences that contribute to tunnel vision during development and thereby put a project at risk. Use of the framework should help development teams better understand causal factors that can lead to development failure and watch for them in the future. Systems engineering processes and projects should consequently become more effective and more likely to result in a system in which all elements integrate well. Based on this research, we suggest a set of factors that systems development teams should monitor; other factors may be added based on future work. Future work should additionally investigate the risks and strategies associated with the framework factors so that more specific and useful guidance can be built into the framework.

5 References

[1] Baker, K., Stewart, A., Pogue, C., & Ramotar, R. (2008). Human views: Extensions to the Department of Defense Architecture Framework (DRDC CR-2008-001). Defense Technical Information Center, Ft. Belvoir, VA. Retrieved from http://www.dtic.mil

[2] Benbya, H., & McKelvey, B. (2006). Using coevolutionary and complexity theories to improve IS alignment: A multi-level approach. Journal of Information Technology, 21, 284-298.

[3] Bruseberg, A. (2008). Human views for MODAF as a bridge between human factors integration and systems engineering. Journal of Cognitive Engineering and Decision Making, 2, 220-248.

[4] Carr, D. F., & Cone, E. (2002, April 8). Can FAA salvage its IT disaster? Baseline Magazine. Retrieved from http://www.baselinemag.com/article2/0,1540,656862,00.asp

[5] Eggen, D. (2005, June 6). FBI pushed ahead with troubled software. Retrieved from washingtonpost.com

[6] Eveleens, J. L., & Verhoef, C. (2010). The rise and fall of the Chaos report figures. IEEE Software, 6, 30-36.

[7] Finkelstein, A., & Dowell, J. (1996). A comedy of errors: The London Ambulance Service case study. IEEE Proc. 8th Int’l Workshop on Software Specification and Design, 2-4.

[8] Goguen, J. A. (1994). Requirements engineering as the reconciliation of social and technical issues. In M. Jirotka & J. A. Goguen (Eds.), Requirements Engineering: Social and Technical Issues (pp. 165-199). San Diego, CA: Academic Press Professional.

[9] Goguen, J. A. (1997). Towards a social, ethical theory of information. In G. Bowker, L. Gasser, L. Star, & W. Turner (Eds.), Social Science Research: Vol. 115. Technical Systems and Cooperative Work: Beyond the Great Divide (pp. 27-56). Mahwah, NJ: Erlbaum.

[10] Handley, H. A. H., & Smillie, R. J. (2008). Architecture framework human view: The NATO approach. Systems Engineering, 11, 156-164.

[11] Hoffman, R. R., & Elm, W. C. (2006). HCC implications for the procurement process. IEEE Intelligent Systems, January/February, 74-81.

[12] Hoffman, R. R., Neville, K., & Fowlkes, J. (2009). Using cognitive task analysis to explore issues in the procurement of intelligent decision support systems. Cognition, Technology & Work, 11, 57-70.

[13] INCOSE (2006). A consensus of the INCOSE Fellows. Retrieved from http://www.incose.org/practice

[14] Johnson, R. B. (1997). Examining the validity structure of qualitative research. Education, 118, 282-292.

[15] Klein, G., Calderwood, R., & MacGregor, D. (1989). Critical decision method for eliciting knowledge. IEEE Transactions on Systems, Man, and Cybernetics, 19(1), 462-472.

[16] Marketplace (1997, January 31). Marketplace for January 31st, 1997. National Public Radio.

[17] Marks, M. A., Mathieu, J. E., & Zaccaro, S. J. (2001). A temporally based framework and taxonomy of team processes. Academy of Management Review, 26(3), 356-376.

[18] Norman, D. O. (2004, July). Engineering a complex system: A study of the AOC (MITRE Technical Report No. 04-0527). MITRE. Retrieved from http://www.mitre.org/work/tech_papers/tech_papers_04/norman_aoc/norman_aoc.pdf

[19] Paletz, B. F., Bearman, C., Orasanu, J., & Holbrook, J. (2009). Socializing the Human Factors Analysis and Classification System: Incorporating social psychological phenomena into a human factors error classification system. Human Factors, 51, 435-445.

[20] Smith-Jentsch, K. A., Zeisig, R. L., Acton, B., & McPherson, J. A. (1998). Team dimensional training: A strategy for guided team self-correction. In J. A. Cannon-Bowers & E. Salas (Eds.), Making Decisions under Stress: Implications for Individual and Team Training (pp. 271-297). Washington, DC: APA Press.

[21] Valerdi, R., & Davidz, H. L. (2009). Empirical research in systems engineering: Challenges and opportunities of a new frontier. Systems Engineering, 12, 169-181.

[22] Wiegmann, D. A., & Shappell, S. A. (2003). A Human Error Approach to Aviation Accident Analysis: The Human Factors Analysis and Classification System. Burlington, VT: Ashgate Publishing Company.