
Creating A Collaborative Virtual Command Center Among Four Separate Organizations In The United States Army: An Exploratory Case Study

Anne Kohnke, PhD
Lawrence Technological University

College of Management

Teresa Gonda
Defense Acquisition University

Anne Kohnke, PhD
248-204-3085
Assistant Professor of IT
Lawrence Technological University
College of Management
21000 West Ten Mile Rd
Southfield, MI 48075
[email protected]

Teresa Gonda
586-925-2601
Defense Acquisition University
Midwest Region
Sterling Hts, MI Campus
[email protected]

Abstract

While individual leadership skills are an important factor in transforming organizations, leading through a common mission and shared purpose of collaborative and efficient interaction requires leaders to participate in different ways. The purpose of this study was to determine how to create a collaborative and efficient virtual Command Center within the new leadership structure of a Life Cycle Management Command within the United States Army. This case study employed a variety of organization development tools to bring together leadership, change agents, and customers to create a new vision and strategic plan of how to operate effectively. Leadership used the results of this study to implement the structural changes and new processes required for a successful transformation to a virtual business.

Keywords: organization development, virtual organization, lean six sigma

_______________

Disclaimer: Reference herein to any specific commercial company, product, process, or service by trade name, trademark, manufacturer, or otherwise, does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States Government or the Department of the Army (DoA). The opinions of the authors expressed herein do not necessarily state or reflect those of the United States Government or the DoA, and shall not be used for advertising or product endorsement purposes.

Gonda and Kohnke

Organization Development Journal | Winter 2013

Introduction

The United States Army is currently undergoing significant changes in the way it does acquisition, which is the business of acquiring equipment for the services (Carter, 2010). Historically, the size and cost of the Army have fluctuated based on the country's need. Because of the current state of the economy and the drawdown from two wars, the Army now finds itself in an urgent situation of having to reduce costs by cutting programs and becoming more efficient (Obama, 2012).

Within the Army, there are two separate chains of command that have authority in the area of providing and maintaining equipment for the military. The first is a civilian chain of command, headed by the Army Acquisition Executive, a political appointee responsible for controlling funds provided by Congress to acquire new equipment. The second is a military chain of command that reports to the Chief of Staff of the Army and is responsible for the readiness and logistics of equipment. This second chain of command is led by the Army Materiel Command, which is also the home of the Research, Development, and Engineering function. There is overlap between the chains of command because the civilian Army Acquisition Executive also holds the title of Assistant Secretary of the Army for Acquisition, Logistics, and Technology [ASA(ALT)]. Additionally, coordination and communication are critical between these two chains of command because the Research, Development, and Engineering Command, which reports up through the military command, is funded by the civilian command.

In order to better synchronize the two chains of command and create a more collaborative and efficient method of delivering equipment to the field, the Army created the concept of a Life Cycle Management Command (LCMC). The function of this new command center is to integrate the acquisition functions, the technology support functions, and the logistics and sustainment functions into a more holistic business model under the care of a commanding general (Kern & Bolton, 2004). Traditionally, command centers are managed by standard command and control and/or control of funding; however, since few of the organizations brought together in this new concept actually report to the Commanding General, the new leadership is open to leading through a common mission and a shared purpose of collaborative and efficient interaction.

The purpose of this study was to determine how to create a collaborative and efficient virtual business organization within the new leadership structure of the Life Cycle Management Command within the United States Army.

Literature Review

There are several change models in the literature that look at the process from a variety of aspects and highlight different challenges (Cooperrider, Sorensen, Yaeger, & Whitney, 2001; Anderson & Anderson, 2001; Kirkpatrick, 2001; Mento, Jones, & Dirndorfer, 2002; Leppitt, 2006; Light, 2005). There are also competing philosophies and descriptions of the different roles and types of change agents (Pettigrew & Whipp, 1991; Quinn, 1993; Dawson, 2003; Kotter, 1995; Pendlebury, Grouard, & Meston, 1998). Palmer, Dunford, and Akin (2009) caution against thinking that change can be neatly categorized in terms of size and type and that there are simple recipes to follow. They do not subscribe to a distinction between


managing change and leading change, but see aspects of both as part of the change equation. Senge, Kleiner, Ross, Roth, and Smith (1999) give four distinct challenges to initiating change: (a) not enough time, (b) no help, (c) not relevant, and (d) walking the talk. They provide an analysis and a series of articles by contributing authors on each of these topics. Time pressure has many contributors, not the least of which is multiple concurrent change initiatives; but the fact is, if there is no time for reflection, for planning the initiative, or for working on the initiative, the initiative will stall. In terms of the help needed, an initiative requires coherent, consistent, knowledgeable coaching, guidance, and support or it will fail. If the initiative does not have a clear and compelling business case for learning, it will not survive. Finally, if there is a gap between espoused values and the actions of those championing the change, the initiative is vulnerable.

Van de Ven and Sun (2011) also state that unprecedented change unfolds in ambiguous and uncertain ways and requires continuous reflective communication among the change agents. Frahm and Brown (2007) state that management theory is firm on the notion that communication is the very foundation of change efforts and that a formal change communications strategy is critical throughout an effort. They add that the first step is making certain that change communication is receiver-oriented and that, in continuous change, the communication is continuously reframed. Those leading emergent change in continuous change efforts need to ensure that managers and employees have expectations aligned with the change goal and the accompanying styles of communication. They recommend participative types of communication approaches, especially since some contend that in a traditional organization information can be seen as a scarce commodity that is blocked at certain levels; a good strategy will take this into consideration. Kotter and Cohen (2002) provide innovative examples of change communication that target not only getting to the right people, but getting to the heart of the receiver in a meaningful way, so that it will have an impact.

Caffrey and Medina (2011) describe why wholesale continuous improvement efforts generally fail. They cite a lack of leadership engagement to drive the vision and ride the change process; overlooking the need for change management in pushing past what they describe as nine signs of resistance; inadequate communication, including the absence of a change management communications plan and its implementation; poor selection and scoping of projects and people; and the failure to establish a good metrics and measurement system.

Because relationships are important in interventions, trust can be an issue in change, and transforming to a virtual organization can raise even more significant issues of trust (Msanjila & Afsarmanesh, 2008). A virtual organization (VO) has been defined as an "association of (legally) independent organizations (VO partners) that come together and share resources and skills to achieve a common goal, such as acquiring and executing a market/society opportunity" (Msanjila & Afsarmanesh, 2008). A virtual business environment (VBE) has been defined as "an alliance of organizations (VBE members) and related supporting institutions, adhering to a long term base cooperation agreement, and adopting common operating principles and infrastructures, with the main goal of increasing both their chances and preparedness toward collaboration in potential VOs" (Msanjila & Afsarmanesh, 2008). Msanjila and Afsarmanesh (2008) provide an analysis and a set of tools to assess trust in VBEs. They look at causality (with an analysis of past performance and behavior), transparency, fairness, and complexity; they also look at trust between members, trust with external stakeholders, and trust between VBE members and the VBE administration. Trust between members enhances efficiency and success; trust between members and the administration enhances the chance of a member remaining loyal to the VBE; and external stakeholders need to trust the VBE if they are to become members or customers.

But what happens when trust breaks down across virtual organizations? Gillespie and Dietz (2009) say the research suggests that trust can be repaired if the perceived violator responds appropriately, but that most do not; most act too late, with more attention to external relations than internal ones. Drawing from the literature, they describe three dimensions of trustworthiness: (a) ability, an organization's collective competencies and characteristics to function reliably; (b) benevolence, organizational action demonstrating care and concern for the well-being of stakeholders; and (c) integrity, organizational action that consistently adheres to moral principles and a code of conduct accepted as honest and fair. They contend that failures at an organizational level have multiple contributory causes, which they say is supported by systems and multilevel theory and by the research. They acknowledge that once trust is broken, there is a cognitive tendency to privilege the negative. This cognitive bias toward the negative leads to sinister assumptions about motive, character, and competence, so the primary focus of repair means addressing that bias. They suggest distrust regulation (avoiding the behaviors that caused the distrust) and trustworthiness demonstration (positively promoting behaviors and verbal responses that actively demonstrate benevolence and integrity). This process involves continual cycles of evaluation, intervention, and re-evaluation. They conclude that, while the influence of the external environment should not be overstated, organizational level failures provoking trust issues often occur in response to structurally embedded pressures and constraints; signals about which behaviors are rewarded and punished are sent from many directions.

Methodology

The new leadership of the Life Cycle Management Command (LCMC) agreed to govern themselves by creating an LCMC Executive Committee chaired by the Commanding General. The Executive Committee decided to use Lean Six Sigma (LSS) practices to run the interventions required to create the new LCMC operating construct. Using the LSS model from Frigon and Jackson's Enterprise Excellence (2008), an executive champion was elected for each intervention, and a Black Belt was selected to lead it. One of the interventions chosen was to focus on the technology planning and insertion aspects of the acquisition process and to determine how to align the Research, Development, and Engineering Center (RDEC) with two separate Program Executive Offices (PEOs) into a common LCMC construct. The RDEC and the PEOs had separate reporting chains, separate funding streams, and separate cultures, and no one reported directly to the LCMC Commanding General. This intervention was called RDE Alignment.


Unit of Analysis

Four organizations within this Army LCMC were the focus of this study. The four organizations are as follows:

1. Commanding General and staff
2. Program Executive Office 'A'
3. Program Executive Office 'B'
4. Research, Development, and Engineering Center (RDEC)

Participants

The participants consisted of five groups:

1. LCMC Executive Committee: this committee comprised the Commanding General, the PEOs, the RDEC Director, and other executives in charge of additional functions within the LCMC, such as Logistics/Sustainment, Contracting, and Legal.

2. RDE Alignment Executive Steering Group: an RDE Alignment Executive Steering Group was created to guide all activities and initiatives under this intervention. The group consisted of the Executive Champion (Director of the RDEC), the Executive Deputies of PEO 'A' and PEO 'B', and the Executive for Research in the RDEC. The deputies in this instance also filled the role of sponsor; while the role of the executive champion is to remove obstacles, the sponsors interact on a more regular basis to steer the effort. In this effort, PEO 'A' was also regularly involved and an active participant.

3. Action Research Team: the Action Research Team consisted of three individuals who worked as mentors and consultants in the role of participant-observer action researchers. This team was given a large conference room as a "war room" to use full time through the duration of the eighteen-month initiative. This room had video teleconferencing capabilities, a large white board, easels with large newsprint, and room to put material up on the walls.

4. Core Team: a core team was created by the RDE Alignment Executive Steering Group, who chose key individuals from each of their organizations. The core team had 12 members and consisted of the Action Research Team plus chief engineers from some of the major Project Management Offices within the PEOs and Science and Technology Planning Leads from the RDEC.

5. Subject Matter Experts: additional subject matter experts were called in when necessary with support from the Core Team and the Executive Steering Group.

Analytical Framework

In its classic form, Lean Six Sigma (LSS) is meant to improve existing processes (George, Rowlands, Price, & Maxey, 2005). LSS includes an approach of hierarchical mentoring and training, a philosophy of continuous strategic improvement, and an analytical framework for solving problems. The LSS analytical framework for improving existing processes consists of five steps: Define, Measure, Analyze, Improve, and Control. Design for LSS is an accepted practice in industry used to create new processes (George, 2003), but it has not yet been adopted wholesale within the Army. Additional tools used throughout this project included qualitative structured interviews, an affinity diagram and scoring matrix to understand and prioritize issues, a Pareto diagram to examine interdependencies, a fishbone structure to identify cause and effect, a failure modes and effects analysis (FMEA) to evaluate the processes, and the Labovitz and Rosansky Organizational Alignment model to develop a solution set of recommendations.

As the Define phase started, the problem was scoped out; this included identifying the leadership team and team members, creating a team charter, and developing a problem statement. Initial data were collected to help understand the problem. Structured interviews were conducted to determine the priorities, concerns, and insights of the leaders and process owners. The data were gathered, and the core group then performed a brainstorming and Affinity Diagram process to further analyze, understand, and prioritize the issues around the problem. The results were reported back to the Executive Steering Group for validation. The issues were scored in a decision matrix to examine any Pareto effect, and an interdependency analysis was performed to assess root causes. A charter that clearly defined the problem, the roles and responsibilities, and the timeline was developed and signed during this phase.

The Measure phase was the data-gathering phase and included more in-depth customer interviews based on the refined issues. A literature review was performed, best practices were investigated, and the existing processes were mapped. The data collected during the Measure phase were then analyzed: a Cause and Effect (Fishbone) exercise was conducted, a failure modes and effects analysis (FMEA) was performed, and an interdependency analysis was conducted on root causes. The FMEA was then used as a predictive tool by having subject matter experts rescore projected results based on the recommended improvements. Based on the recommended improvements, implementation and sustainment metrics were then developed; communications plans and an implementation plan were created, process owners were identified, and the process was transferred to the process owners.

Results

The Define Phase

To help scope the change effort, the project team participated in an exercise that included brainstorming, the creation of an affinity diagram, and a scoring matrix. The core team assembled and, using sticky notes, brainstormed issues affecting alignment of the organizations. The notes were then assembled into affinities, or themes, using a group thematic analysis process whereby the team quietly moves the notes into common themes and the group then names each theme. The team then created a matrix with the themes written across the top. Next, the team placed scoring criteria down the left-hand side of the matrix. The scoring criteria were: program performance, cost risk, schedule risk, and overall risk. The items were then scored by the group through facilitated consensus. The results were laid out as a Pareto diagram and sorted in order from highest to lowest. The desired result was that a few items would account for eighty percent or more of the problems and would stand out from the rest, in what is called a Pareto effect (George, Rowlands, Price, & Maxey, 2005). These items became the priority problems addressed in this initiative.
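The scoring-and-sorting check described above can be sketched in a few lines of Python. The theme names and scores below are hypothetical placeholders, not the team's actual consensus data; the sketch only illustrates the mechanics of testing for a Pareto effect.

```python
# Illustrative Pareto-effect check on facilitated consensus scores.
# Theme names and score values are hypothetical, not the study's data.
themes = {
    "Align priorities": 42,
    "Improve communication": 35,
    "Data overload": 28,
    "Roles and responsibilities": 24,
    "Response time": 18,
    "Credibility with PMs": 15,
}

def pareto(scores, threshold=0.8):
    """Sort themes by score (highest first) and return the 'vital few'
    that together account for at least `threshold` of the total."""
    ordered = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
    total = sum(scores.values())
    cumulative, vital_few = 0, []
    for theme, score in ordered:
        cumulative += score
        vital_few.append(theme)
        if cumulative / total >= threshold:
            break
    return vital_few

print(pareto(themes))
```

With these illustrative scores, five of the six themes are needed to reach eighty percent of the total, i.e., no small subset dominates; a genuine Pareto effect would concentrate eighty percent of the scoring weight in just a few themes.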

Figure 1 below shows the Pareto chart resulting from the brainstorming and affinity exercise. Several issues became apparent, such as the need to align priorities, improve communication, address the data overload problem, define roles and responsibilities on each side, and improve response time and credibility with the project managers.


Figure 1: Pareto chart for RDE Alignment Define Phase

While "alignment of priorities" emerged as the top priority, no clear Pareto effect is seen in the plot. To gain clarity on the distinctions between the issues, the Action Research Team performed an interdependency analysis. An interdependency analysis can provide valuable insights into the relationships and dependencies of complex analytical problems. The results of the interrelationship digraph are shown below in Figure 2.

Figure 2: Interdependency digraph of problems from the Pareto chart.

The interdependency numbered '7' (roles and responsibilities and improve communication) validated what the participants already understood: communication is improved when organizations have clearly defined roles and responsibilities.

Another relationship revealed by the digraph was among the three items numbered '2': align ATOs with the life cycle, the need to improve RDE response time, and the need to improve credibility with project managers. What became clear from the interdependency digraph was the need for a quality management system to address both the individual and the whole-scale themes that emerged.
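An interrelationship digraph like the one above is conventionally analyzed by tallying the arrows touching each item. The following sketch uses hypothetical cause-and-effect arrows loosely named after the themes in this study, not the team's actual diagram, to show how such connection counts are computed.

```python
# Illustrative interrelationship-digraph tally. Each (cause, effect) pair
# below is a hypothetical arrow; the study's actual diagram is not shown.
from collections import Counter

edges = [
    ("roles and responsibilities", "communication"),
    ("roles and responsibilities", "response time"),
    ("align priorities", "communication"),
    ("align priorities", "credibility with PMs"),
    ("communication", "credibility with PMs"),
    ("response time", "credibility with PMs"),
]

def connection_counts(edges):
    """Count total arrows (incoming plus outgoing) touching each item.
    In conventional use, high out-degree flags drivers (root causes)
    and high in-degree flags outcomes."""
    out_deg = Counter(cause for cause, _ in edges)
    in_deg = Counter(effect for _, effect in edges)
    items = set(out_deg) | set(in_deg)
    return {item: out_deg[item] + in_deg[item] for item in items}

counts = connection_counts(edges)
print(sorted(counts.items(), key=lambda kv: kv[1], reverse=True))
```

The items with the highest totals are the ones worth scrutinizing first, which is how a number such as the '7' in Figure 2 directs attention to a key interdependency.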

At this stage of the process, the Action Research Team determined that the data suggested the first project in the initiative should be to establish a formal alignment process between the RDEC and the PEOs to align priorities. This was called a portfolio management process, and the project was formally created as a Green Belt project. The core team performed a stakeholder analysis and then interviewed key stakeholders about what they wanted to see in a portfolio management process. The stakeholders were asked what aspects were critical to quality for the portfolio management process and what information they desired. The data were collected, and the core team, along with additional subject matter experts, performed a new affinity process on this information. The result was a list of over fifty key insights into information needs, such as the need for a centralized database of existing technology and technology under development, links to detailed project information, process flows, and a conflict resolution process. The group then prioritized this information using the same scoring matrix exercise.

Best practices of portfolio management were investigated, and the Action Research Team found related processes in place at two other Army Commands and a commercial portfolio management IT solution that could facilitate the process. The research suggested that a much higher level of organizational and process maturity was required to perform portfolio management than what was currently in place. The data collected also showed the desire for a mature portfolio management process that allowed top decision makers the capability to understand the prioritized needs across the LCMC in the acquisition, logistics, and technology communities and to see where the resources were being applied and whether changes needed to be made. The desired future state process would then allow the leaders to leverage one another's resources and move them to top priorities to gain efficiencies. However, the data collected also showed that there were no integrative processes, existing process owners, or specialized skills currently in place for this desired process to be built. At this point, the team faced a dilemma in the scope of the project: either the team could produce some sort of capability in the short time frame allowed for the Green Belt project, or the team could pause and define the bigger picture that would allow for the creation of the mature process being called for by the results of the initiative thus far. The team recommended a pause to define the larger systemic environment needed to allow for portfolio management, and this decision was supported by the Executive Steering Group.

The Measure Phase

The Action Research Team, along with four project managers, process mapped the business planning process for each of the PMs. These processes ranged from collecting needs and data from the user and the representatives in the field to putting a solution on contract. The team then process mapped the two science and technology project planning processes. There was no overarching business process identified within the RDEC. Research revealed a project from another RDEC where an overarching business process was defined. The team also uncovered DoD references on acquisition and technology management that provided guidance on working collaboratively across the training and doctrine community of the Army (referred to as the "user"), acquisition (PEO/PM), technology, and logistics across the phases of the Life Cycle (Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, 2003). From the literature, the Action Research Team looked at the McKinsey 7-S organizational alignment model (Waterman, Peters, & Phillips, 1980) and the Labovitz and Rosansky model (Labovitz & Rosansky, 1997).

In follow-up interviews and in meetings of the Executive Steering Group, the PEOs (A & B) began to speak of a desired future state. They wanted the ability to tap into a government center of excellence for engineering analysis services that could be used across Project Management Offices and cost shared. Key aspects of the vision included expertise in modeling and simulation, sophisticated systems analysis (e.g., design of experiments), systems engineering (particularly risk management, configuration management, and requirements analysis), systems technology integration, and subject matter expertise in particular areas such as survivability and specific platform engines and transmissions. Critical to making this work were a demonstration of good project management skills by the systems and technology community and transparency in the cost of operations.

The Program Executive Office’s intent with this future state was to save day-to-day acquisition costs, as well as to provide the specialized analysis expertise to make the individual project managers within the PEOs smarter buyers when they went out to seek industry services. A solution to the problem

83

in processes across the LCMC, the Science and Technology and Program Management process time lines are not synchronized and do not feed into one another. Concerns identified in Policy included: there is no priority placed on an integrated event driven schedule, policies from higher headquarters are forcing a disconnect between the RDEC and PEOs. Examples of Communication difficulties were: RDECs and PEOs do not understand or communicate on roles and responsibilities needed for transitioning technology, RDECs and Project Managers not aware of each other’s needs or capabilities. Some key issues found in People were: people on both sides lack an understanding of the big picture, no strategic plan to develop people with the right skill mix, some people resist change. Themes found in Requirements/Priorities included: No data to prioritize (only pet projects), priority process and criteria vary by organization and are not shared, there is no synchronization of the users who create requirements and conflicting requirements result.

To capture more fidelity and to put the issues into context, the Action Research Team next guided the core team in a Failure Modes and Effects Analysis (FMEA) of the existing processes. In an FMEA for transactional processes, the process step under evaluation was listed first. Each of the key inputs to the process was evaluated separately for potential defects. The key or importance was determined by the process team working with the FMEA data. The question “what could go wrong?” was asked and the potential failure mode was recorded. The next question asked was “what is the impact on the key output variables as they relate to the customer (the one receiving the output)? This was then recorded. The severity, sometimes referred to as “impact” in standard risk analysis, was then scored. A ranking

statement began to emerge. In order to create a collaborative and efficient organization, one of the answers was to create an organization that could consistently, strategically, and collaboratively plan together.

The Analysis PhaseThe Action Research Team began to

construct the necessary pieces for the future state vision. The decomposition resulted in the need for an overarching collaborative planning process, a central planning function to manage the process, a center of excellence for systems engineering and integration, and the governance structure to bring the pieces together since the virtual organization had no other means of centralized decision making. In order to better understand the missing pieces and problems, the team decided to analyze the problem using a Cause and Effects (Fishbone) exercise on the general issues around alignment and to perform a failure modes and effects analysis on the existing processes before they designed the future state process.

For the Cause and Effects exercise, a large eight foot fish bone was created on the war room wall with masking tape. The branch labels were chosen based on the results of the previous affinity process. The core team and other subject matter experts were invited to place sticky notes on the fishbone during a two hour exercise and then throughout a three week time period. The Action Research Team performed a preliminary analysis of the findings in the Cause and Effects exercise and saw that communication was a significant factor. The branches in the fishbone were Process, Policy, Communication, People, Requirements/Priorities, and Miscellaneous. Examples of problems found in process were: no metrics to measure alignment, no consistency

Gonda and Kohnke

Organization Development Journal l Winter 201384

For the FMEA, a ranking system and criteria were agreed upon ahead of time. The values for the ranking system ranged from 1 to 5, with 1 considered minor and 5 a hazard. Table 1 below shows the severity levels and a description of the criteria.

Table 1: Severity Criteria and Ranking

Next, each cause that could lead to the failure was listed. The correct process was to continue asking "what would cause that to happen" until a root cause was determined. Several root causes could be listed, but only one was listed per line. The next question asked was "how often does the failure mode or the cause occur?" The FMEA process team had to decide how to score this depending on the situation, and the team chose to score the cause. This score is called the "occurrence." Table 2 below shows the criteria values and definitions that were assigned to each criterion.

Table 2: Occurrence Criteria and Definition

Next, the participants were asked to describe the current controls in place to prevent the cause and/or failure from occurring. These were recorded, and the participants were asked "how detectable is the cause or the failure mode?" This value is the likelihood that the existence of a defect will be detected before the product advances to the next or subsequent process. Once again, these are scored against pre-determined criteria. Table 3 below shows the detectability rankings of 1 to 5, with 1 meaning the defect is almost certain to be detected and 5 almost impossible to detect.

Table 3: Detectability of the Existence of a Defect Ranking

Since a robust analysis of every step of each process was determined to be too lengthy, the Action Research Team guided the core team in a prioritization of the key steps. The steps determined to be in scope and of top priority were Collect and Analyze Data, Clarification of Requirements, Development of Priorities, and Select Valid Approach in the PM processes, and Develop Science and Technology Proposal and Research, Development, and Engineering Command (RDECOM) Proposal Review in the Science and Technology development process.

The FMEA of the process steps resulted in 117 issues or causes of risk. The group also added issues from the Cause and Effects results that were not already accounted for in the FMEA. In the end, the Master FMEA had 170 issues that were scored for severity, occurrence, and detectability. The Action Research Team performed a thematic analysis on the data. Figure 3 below outlines the categories of the thematic analysis.
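In conventional FMEA practice, the three scores are multiplied into a risk priority number (RPN = severity × occurrence × detectability) so that issues and themes can be ranked. The sketch below illustrates that roll-up and a thematic share of cumulative risk; the issue names and scores are hypothetical, not the team's actual data.

```python
# Combine FMEA scores into RPNs and roll them up by theme.
# Issues and 1-5 scores below are illustrative only.
from collections import defaultdict

issues = [
    # (theme, severity, occurrence, detectability)
    ("Unclear Roles and Responsibilities", 4, 5, 4),
    ("No Formal Process", 4, 4, 3),
    ("Poor Communication", 3, 3, 2),
]

def rpn(severity, occurrence, detectability):
    """Risk Priority Number: the conventional product of the three scores."""
    return severity * occurrence * detectability

totals = defaultdict(int)
for theme, s, o, d in issues:
    totals[theme] += rpn(s, o, d)

# Express each theme's share of the cumulative risk, highest first.
grand_total = sum(totals.values())
for theme, total in sorted(totals.items(), key=lambda kv: -kv[1]):
    print(f"{theme}: RPN {total} ({100 * total / grand_total:.0f}% of cumulative risk)")
```

Ranking themes by their share of cumulative RPN, as above, is one way a 170-issue Master FMEA can be condensed into a handful of prioritized risk categories.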


Figure 3: Thematic Analysis of the Master FMEA Data

The nine categories of risk from the thematic analysis were: Unclear Roles and Responsibilities, No Formal Process, No Strategy, Incorrect Skills, Lack of Collaboration, Lack of Coordinated Users, Funding Problems, Poor Communication, and Organizational Structure Problems.

Unclear Roles and Responsibilities accounted for 29% of the cumulative risk and No Formal Process for 17%. No Strategy accounted for 13%, Incorrect Skills 11%, Lack of Collaboration 11%, Lack of Coordinated Users 7%, Funding Problems 4%, Poor Communication 4%, and Organizational Structure Problems 3% of the cumulative risk.

An interdependency analysis of the nine issues revealed that they were all roughly equally dependent on one another, suggesting that these issues were part of a larger systemic problem. In fact, the top four issues proved to be highly interdependent. The literature suggests that in order to determine clear roles and responsibilities, processes need to be mapped and clear roles assigned in RACI charts (Frigon & Jackson, 2008; George, Rowlands, Price, & Maxey, 2005). In order to stop, process map, and set aside the resources to accomplish this task, a strategic planning process was required. In order for all of this to happen, the right skill sets needed to exist to facilitate this process. But it was not clear whose role or responsibility it was to make this happen. The picture was a cyclical loop of interdependency. A comparative analysis of these results with the data collected from the qualitative interviews, best practices, the original Affinity Exercise, and the Interdependency Analysis exercise suggested that a quality management system addressing the issues of basic organizational alignment was missing.

The Improvement Phase

Based on the results of the analysis phase, the Action Research Team used the Labovitz and Rosansky Organizational Alignment Model to develop a solution set of recommendations for planning together consistently, strategically, and collaboratively (Labovitz & Rosansky, 1997). This solution set created the foundation for a quality management system of interdependent elements designed to align the organization and deliver the organizational behavior aspects the leadership had determined critical to quality. Figure 4 below outlines the solutions identified from the model.

Figure 4: The RDE Alignment Solution Set Based on the Labovitz and Rosansky Alignment Model

The Action Research Team developed several conceptual products of a quality management system to construct a blueprint of the future state based on this solution set. The products were:

• RDE Alignment Process Framework
• Roles and Responsibilities Map for the Framework
• RDE Alignment Synchronized Time Map (Annual Operating Cycle)
• New Integrative Roles and Responsibilities (including Governance)
• Implementation and Success Metrics

The Action Research Team analyzed the business processes of the PMs and those found during the "best practices" research and determined that a generic collaborative business process was evident in the data. The team created a future state collaborative planning process with the following steps:

1. Capture and vet requirements
2. Develop actionable prioritized needs
3. Strategically plan
4. Develop project plans
5. Execute, monitor, and control projects

The team outlined detailed steps to three levels for each of these steps and created a Roles and Responsibility Matrix (RACI chart). Figure 5 below shows the process steps down to the second level and specifies who is Responsible, Accountable, Consulted, and Informed (RACI) for each step.

Figure 5: The Role and Responsibility Matrix (RACI Chart)

The Action Research Team then developed a proposed annual operating cycle by laying out the key milestones of the proposed framework over the year, including other existing milestones for key processes. The synchronization map shows what information needs to flow, and in what order, from process to process in order for the framework to function effectively and efficiently.

The team then identified and developed new integrative roles to execute and manage the processes. The first priority in achieving the vision laid out by the Program Executive Office was for the Research, Development, and Engineering Center to create a Chief Systems Engineer and a supporting organization to facilitate bringing together the systems engineering and integration activities. Next, the team coined the term "RDE Facilitator" to describe a new master planning facilitator role that also included resources to collect and integrate the planning information of the different organizations and present it to the appropriate bodies. The team then identified the need for an RDE Alignment working group, defined to include the Chief Systems Engineers of the Program Executive Office and the Research, Development, and Engineering Center plus the RDE Facilitator, who would also chair the group. The team suggested that the Deputy Executives of the Program Executive Office and the Research, Development, and Engineering Center come together as an RDE Alignment Executive Steering group to resolve issues and provide guidance to the working group. Finally, the team suggested an RDE Alignment Executive Board consisting essentially of the executives at the LCMC leadership level, at least the Program Executive Office's and the Research, Development, and Engineering


Center and the Commanding General.

Finally, the team developed leading and lagging metrics for the new system. The leading metrics measured whether initiatives were being put in place to create the pieces of the blueprint, such as creating a Chief Systems Engineer and supporting group in the RDEC, establishing the other new roles, and creating the framework. Next, metrics were defined to measure whether the new processes were delivering the expected results. The team then recommended some basic portfolio alignment metrics. Finally, the team recommended that the overall RDE Alignment metric of success be a measurement of how many collaborative agreements were being executed on time and within budget.

As a final step, the Action Research Team assembled a subset of subject matter experts who went back to the original FMEA, rescored the top eight issues/risks, and projected an 84% reduction in risk if the new blueprint were implemented.

Control Phase

The purpose of the Control Phase was to lay the foundation for a successful implementation phase. The intervention thus far had developed a blueprint for creating a new aligned Command Center in the area of technology planning and insertion. However, in order for alignment to occur, the blueprint elements needed process owners who could take ownership of each piece and go forward to build, implement, and refine the concepts into reality. Then the governance structure needed to be built to guide the process of tying all of the elements together into an aligned new Command Center. All of this required a new phase of implementation. The team developed an implementation schedule, proposed process owners for the different elements of the blueprint, and assembled a presentation for the LCMC Executive Committee to close out the project. The project was briefed to the Executive Committee, which approved it and concurred that the RDEC Director was the process owner and implementer for the LCMC.

Discussion

While the analytical framework in Lean Six Sigma (LSS) is designed for improving existing stable processes (George, Rowlands, Price, & Maxey, 2005), the Action Research Team was able to effectively use the LSS framework and a variety of tools to perform the key organization development tasks required to establish a new organizational design.

Key Lessons Learned

The hierarchical training and mentoring model within LSS proved effective for developing change agents. This was true both for the Green and Black Belts trained and for the leadership who attended specialized executive leadership training. The Champion and Sponsor roles proved effective in establishing empowered leaders who could drive aspects of the intervention. In this case study, the LSS Black Belt Team Leader model was very effective in creating the conditions necessary to transform a technology researcher into a participant action researcher who was supported in leading a team that included chief engineers of higher organizational rank.

The toolsets used were very effective in the organization development tasks of gathering and refining requirements, gathering organizational information to perform diagnostics, and diagnosing the key problems.
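An interdependency analysis of the kind applied to the risk themes can be sketched as a small scoring matrix: rate how strongly each theme depends on each other theme, then sum rows and columns to see which themes are most dependent and most driving. The theme names and scores below are hypothetical illustrations, not the team's actual ratings.

```python
# Interdependency analysis sketch: dependency[i][j] is how strongly
# theme i depends on theme j (0 = none, 3 = strong). Illustrative data.
themes = ["Unclear Roles", "No Formal Process", "No Strategy", "Incorrect Skills"]

dependency = [
    [0, 3, 3, 2],
    [3, 0, 3, 2],
    [2, 3, 0, 2],
    [3, 2, 3, 0],
]

for i, theme in enumerate(themes):
    depends_on_others = sum(dependency[i])                    # row sum: how dependent
    others_depend_on_it = sum(row[i] for row in dependency)   # column sum: how driving
    print(f"{theme}: depends={depends_on_others}, drives={others_depend_on_it}")
```

When the row and column sums come out roughly equal across all themes, as in the case described above, the matrix points to a cyclical loop of interdependency rather than a single root cause.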



The Action Research Team and participants found that conducting a failure modes and effects analysis (FMEA) on transactional processes is very tedious, but the discussions that occurred while coming to consensus proved very valuable. The Action Research Team learned that this exercise was an exceptional method of aligning the thought processes and perceptions of a group of people from different backgrounds.

Performing a thematic analysis on the Master FMEA (which combined the process FMEA and the Cause and Effects data) allowed for a breakthrough in understanding the systemic nature of the problems facing the LCMC. By performing a thematic analysis and using cumulative risk to rank the themes in terms of risk priority, the Action Research Team was able to see a pattern emerge. When the team performed an interdependency analysis on the themes, it became clear that a quality management system was needed to address the themes both individually and holistically.

The Action Research Team found scoring matrices and interdependency analysis to be valuable companion tools during the Affinity processes of the Define Phase and in the thematic analysis of the Cause and Effects and FMEA exercises. The lesson learned was that qualitative data can be examined through the lens of scoring criteria and interdependency to search for patterns and to prioritize the importance of the data.

Challenges

There were several challenges with using the LSS and DMAIC approach for this intervention. An important aspect of the LSS leader development and strategic change process is the selection of projects. Traditional projects are meant to be small, have quantifiable improvements, and be completable within a reasonable time frame: for Green Belts, usually around two to four months, and for Black Belt projects, approximately six to eight months (George, Rowlands, Price, & Maxey, 2005). At the end of these projects, Green and Black Belts are awarded to the persons who demonstrated leadership and proficiency in the process and tools. In this way, continuous improvement occurs alongside the development of knowledgeable, experienced change agents who progress in proficiency to become mentors within the organization.

This intervention was beyond the scope of a Black Belt project, as it lasted eighteen months and was designed to create a new process (and hence could not demonstrate actual improvements at the end of the project). While the objectives of the intervention were met, there is no mechanism for rewarding Black Belts and Green Belts within the Army's defined LSS deployment policies for this type of project. Without a mechanism to reward the participants of such a large, complex project, enthusiasm waned temporarily for further deployment efforts. If the Army had a formal Design for Lean Six Sigma deployment, this project would have fit the criteria and this challenge could have been avoided.

Yet the scope of the project was still a problem, and this pattern of choosing projects that were too large to complete in a timely manner ultimately stalled the establishment of the LCMC. The LCMC had established the Executive Committee itself as the LSS Review Board for the efforts launched to address establishing the LCMC operating construct. This Review Board was not well trained in organization development interventions, and the leaders did not bring in outside consultants to help them through the process. As a result, each


of the efforts was too large and ill-defined, and the teams and the Review Board became frustrated with the pace of progress and eventually stopped the formal deployment after two years. Of the original six interventions, only RDE Alignment and one other project were closed out and successfully moved into an implementation phase. RDE Alignment was also the only project that had dedicated, full-time resources brought in from outside the organization. One of the efforts was later resurrected with a full-time resource brought in from the outside and was carried to a successful completion.

Finally, one of the most important lessons learned was in the area of communication and buy-in for the solution. The Action Research Team utilized the Core Team for many of the exercises, but once the Improvement Phase began, the Action Research Team performed much of the work independently. Because the resulting future state was such a departure from the current state, it proved difficult to communicate a clear vision to the Core Team and to the leadership. In order to achieve buy-in and clarity from the Core Team and the Executive Steering Group, the Action Research Team determined that it should have designed exercises within the Improvement Phase to allow these groups to better take ownership of and refine the concept.

The results of this study were used by the Director of the RDEC to shape a transformation strategy designed to significantly alter the RDEC to meet the intent of the blueprint. The Black Belt Lead was transitioned to the RDEC to continue the participant observer action research effort by leading the implementation phase. PEO A and PEO B immediately began holding communication "summits" with the RDEC that aligned topics with the proposed annual operating cycle, and each organization began building the roles and processes necessary to define the new quality management system.

Conclusion

In today's challenging economic times, leaders are faced with ensuring the sustainable competitive edge of their organizations while having to do so with fewer resources, both human and capital. All types of organizations have been deeply impacted by the economy, whether for-profit, nonprofit, or in the government sector, and are forced to evaluate the effectiveness and efficiencies within their own operating environments. Organization development has proven itself effective as leaders and managers attempt to build collaborative teams, productive workplaces, and sustainable organizations.

This study demonstrates that care must be taken to properly scope interventions and to communicate effectively throughout the entire intervention. Asking what conversations need to happen is critical at the start of any complex intervention. A variety of effective models and tools to aid communication are available to organization development practitioners, including what Dannemiller Tyson Associates (2000) call "converge – diverge" in their action learning model. This model includes questions after each step of the process to ensure the right people have the right conversations and to ensure wholeness of the system.

A more recent framework is the SOAR Framework. SOAR stands for strengths, opportunities, aspirations, and results. It is intended to be a "positive approach to strategic thinking and planning that allows organizations to construct its future through collaboration, shared understanding and a



commitment to action" (Stavros, 2013; Stavros & Hinrichs, 2009, p. 3). A valuable attribute of SOAR is that it "nurtures a culture of strategic learning and leadership by building a widespread appreciative intelligence—the capability for high performance, creativity and innovation in people by reframing the present view, appreciating the positive possibilities in any situation and envisioning how the future unfolds from the present moment" (Stavros, 2013). As mentioned, the Review Board became frustrated with the pace of progress and eventually stopped the formal deployment of the project because of ill-defined and outsized efforts. By using the SOAR framework, with its potential to energize the organization to leverage its strengths, the focus of the dialogue and the project milestones could have been vastly different.

Change models and diagnostic instruments have been instrumental in aiding change agents through the process of creating new visions and strategic plans that allow organizations to think and collaborate in new ways. Carefully choosing the framework and obtaining the skill sets necessary for successful facilitation throughout the change intervention is key. This study demonstrates that by effectively employing and adapting a structured change management framework and tools, the teams were able to create a new vision, blueprint, strategic plan, and implementation plan to build a collaborative and efficient virtual business organization and Command Center. The results helped four separate organizations within the United States Army identify several issues and implement the structural changes and new processes required for this transformation to a virtual business organization to succeed.

~~~~~~~~~~

References

Anderson, D., & Anderson, L. (2001). Beyond change management: Advanced strategies for today's transformation leaders. San Francisco: Jossey-Bass/Pfeiffer.

Caffrey, N., & Medina, C. (2011). Doomed to fail. Quality Progress, 44(1), 40-45.

Carter, A. (2010, September 14). Better buying power: Guidance for obtaining greater efficiency and productivity in defense spending. Retrieved February 20, 2012, from ACQWeb - Offices of the Under Secretary of Defense for Acquisition, Technology and Logistics: http://www.acq.osd.mil/docs/USD_ATL_Guidance_Memo_September_14_2010_FINAL.PDF

Cooperrider, D., Sorensen, P., Yaeger, T., & Whitney, D. (2001). Appreciative inquiry: Rethinking human organization toward a positive theory of change. Champaign, IL: Stipes Publishing.

Dannemiller Tyson Associates. (2000). Whole-scale change: Unleashing the magic in organizations. San Francisco: Berrett-Koehler.

Dawson, P. (2003). Understanding organizational change: The contemporary experience of people at work. London: Sage.

Frahm, J., & Brown, K. (2007). First steps: Linking change communication to change receptivity. Journal of Organizational Change Management, 370-387.

Frigon, N. L., & Jackson, H. K. (2008). Enterprise excellence: A practical guide to world class competition. Hoboken, NJ: John Wiley & Sons.

George, M. (2003). Lean Six Sigma for service. New York: McGraw-Hill.

George, M., Rowlands, D., Price, M., & Maxey, J. (2005). The Lean Six Sigma toolbook. New York: McGraw-Hill.

Gillespie, N., & Dietz, G. (2009). Trust repair after an organization-level failure. Academy of Management Review, 34(1), 127-145.

Kern, P. J., & Bolton, C. M. (2004, August 3). Memorandum of agreement between the Assistant Secretary of the Army for Acquisition, Logistics and Technology and the Commander, U.S. Army Materiel Command. Subject: Life-Cycle Management (LCM) Initiative. Department of the Army. Retrieved August 22, 2011, from https://acc.dau.mil/CommunityBrowser.aspx?id=32646

Kirkpatrick, D. L. (2001). Managing change effectively: Approaches, methods, and case examples. Boston: Butterworth-Heinemann.

Kotter, J. P. (1995). Leading change: Why transformation efforts fail. Harvard Business Review, 73(2), 59-67.

Kotter, J. P., & Cohen, D. S. (2002). The heart of change: Real-life stories of how people change their organizations. Boston, MA: Harvard Business Review Press.

Labovitz, G., & Rosansky, V. (1997). The power of alignment: How great companies stay centered and accomplish extraordinary things. New York: John Wiley & Sons.

Leppitt, N. (2006). Challenging the code of change: Part 1. Praxis does not make perfect. Journal of Change Management, 6(2), 121-142.

Light, P. C. (2005). The four pillars of high performance: How robust organizations achieve extraordinary results. New York: McGraw-Hill.

Mento, A. J., Jones, R. M., & Dirndorfer, W. (2002). A change management process: Grounded in both theory and practice. Journal of Change Management, 3(1), 45-69.

Msanjila, S., & Afsarmanesh, H. (2008). Trust analysis and assessment in virtual organization breeding environments. International Journal of Production Research, 46(5), 1253-1295.

Obama, B. (2012, January 5). President Obama speaks on the Defense Strategic Review. Retrieved February 20, 2012, from The White House: http://www.whitehouse.gov/photos-and-video/video/2012/01/05/president-obama-speaks-defense-strategic-review#transcript

Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics. (2003). Manager's guide to technology transition in an evolutionary acquisition environment, V1.0. Washington, DC: Author.

Palmer, I., Dunford, R., & Akin, G. (2009). Managing organizational change: A multiple perspectives approach (2nd ed.). New York: McGraw-Hill/Irwin.

Pendlebury, J., Grouard, B., & Meston, F. (1998). The ten keys to successful change management. Chichester, England: John Wiley & Sons.

Pettigrew, A., & Whipp, R. (1991). Managing change for competitive success. Oxford: Blackwell.

Quinn, R. E. (1996). Deep change: Discovering the leader within. San Francisco: Jossey-Bass.

Senge, P., Kleiner, A., Ross, R., Roth, G., & Smith, B. (1999). The dance of change. New York: Doubleday.

Stavros, J. M. (2013). The generative nature of SOAR: Applications, results and the new SOAR profile. AI Practitioner, 15(3), 7-30.

Stavros, J. M., & Hinrichs, G. (2009). The thin book of SOAR: Building strengths-based strategy. Bend, OR: Thin Book Publishing.

Van de Ven, A., & Sun, K. (2011). Breakdowns in implementing models of organizational change. Academy of Management Perspectives, August, 58-74.

Waterman, R., Peters, T., & Phillips, J. (1980). Structure is not organization. Business Horizons, June, 14-26.