
System Development Planning Using Readiness Levels in a Cost of Development Minimization Model

Romulo B. Magnaye, Brian J. Sauser,* and Jose E. Ramirez-Marquez

School of Systems and Enterprises, Systems Development & Maturity Laboratory, Stevens Institute of Technology, Hoboken, NJ 07030

Received 10 December 2008; Revised 7 March 2009; Accepted 4 June 2009, after one or more revisions
Published online 27 August 2009 in Wiley Online Library (wileyonlinelibrary.com)
DOI 10.1002/sys.20151

ABSTRACT

The purpose of this paper is to describe a methodology that enables systems engineers or program managers to formulate a system-wide optimal development plan in order to facilitate monitoring and evaluation of the development process in terms of the actual readiness of the system. It proposes the use of the System Readiness Level (SRL) scale to measure the maturity of a system composed of technological elements which are being developed based on cost-driven strategies (as opposed to time-to-market-driven strategies). A constrained optimization model is used to identify which critical technology elements and integration links can be matured to which levels at a particular time such that the development costs are minimized while a targeted SRL value is attained on schedule. While the algorithm does not necessarily reduce the total cost of development, it is able to find the minimal amount of expenditures which will achieve the desired maturity level during the earlier part of the development process. Essentially, it shifts the rest of the expenditures towards the latter part of the development, during which model-based integration and testing procedures which have been suggested by others can be applied to reduce costs and time further. This can be a useful option to have when there is substantial uncertainty with the system due to its high novelty and technological content. As a value-added approach, the model gives the systems engineer the information needed to understand what a delay may mean to development, so a more informed decision can be made. The paper concludes by discussing the possible use of the optimal development plan to monitor and control the progress of systems under development. © 2009 Wiley Periodicals, Inc. Syst Eng 13: 311–323, 2010

Key words: integration readiness level; probabilistic solution discovery algorithm; system readiness level; technology readiness level

1. INTRODUCTION

In recent years, systems engineering has witnessed a steady flow of guidelines as well as applied methodologies (e.g., DOD 5000 [DoD, 2005], ISO 15288 [ISO/IEC, 2008], and the INCOSE SE Handbook [INCOSE, 2008]) which describe interrelated and iterative processes applied to the various functions or phases of system development. This proliferation has also revealed that the literature of systems engineering has shown some convergence in what constitutes systems engineering [Bahill and Briggs, 2001; Bahill and Gissing, 1998; Daniels, Werner, and Bahill, 2001; Friedman and Sage, 2004; Shenhar and Sauser, 2008; Shenhar, 1999]. Even with an understanding of the processes and activities of systems engineering, there is still an issue with "what gets measured is what gets done." Therefore, the planning, monitoring, and evaluation of the system's development process and lifecycle become important to assessing the technical maturity of a system and its major subsystems, monitoring progress of the actual system development, clarifying progress against plans, providing insight into process activities, and verifying that requirements have been correctly met in realizing the system of interest.

*Author to whom all correspondence should be addressed (e-mail: [email protected]; [email protected]; [email protected]).

Contract grant sponsors: Naval Postgraduate School (Contract # N00244-08-000), Northrop Grumman Integrated Systems, and the U.S. Army Armament Research Development and Engineering Center

Systems Engineering Vol 13, No. 4, 2010
© 2009 Wiley Periodicals, Inc.

Regular Paper

Thus, as an assurance of monitoring system development against planning, there have been equally as many methodologies developed. For example, Quality Function Deployment [Hauser and Clausing, 1988], which originated in the Kobe shipyards of Mitsubishi and was refined by Toyota, is prescribed for managing customer requirements, and Pugh's Concept Selection Process [1991] is applicable to the investigation of alternatives. Both have been modified and extended through the use of Fuzzy Set Theory in order to deal with imprecise requirements [Verma, Smith, and Fabrycky, 1999]. Checklists and Taxonomies (CAT) and Input/Output Matrices (IOM) have been employed to translate needs into customer requirements, in both qualitative and quantitative terms. Kasser [2004] suggested the First Requirements Elucidator Demonstration (FRED) tool to prevent the production of poorly written requirements and to minimize the impact of deficient requirements definitions on systems testing. The Integrated Design Model [Vollerthun, 2002] links conceptual design, cost, and market considerations during the definition and modeling of a proposed system. Shell [2003] applied an extended form of the Subsystem Tradeoff Functional Equation to the design of complex systems architecture. Design Rules, Design for Assembly, and Design for Producibility, which together serve as the foundation for Design for Manufacturability [Whitney, 1988], are relevant to several phases, such as the investigation of alternatives, modeling of the system, and integration. The Design-Build-Test Cycle [Clark and Fujimoto, 1991] and Periodic Prototyping [Wheelwright and Clark, 1992] are applicable to system modeling and integration. Cost as an Independent Variable (CAIV) is useful in establishing an aggressive but realistic manufacturing cost target through its application to system definition, requirements management, and system optimization [Brady, 2001]. To organize the development work as efficiently as possible while safeguarding the program quality goals, Oppenheim [2004] proposed the use of the Lean Product Development Flow (LPDF), which applies to systems development the lean principles that were developed by Toyota and documented by Womack, Roos, and Jones [1990].

These examples of methodologies and tools represent the many techniques to manage the development of systems. However, one area where the proliferation of methodologies has been minimal is in the planning, monitoring, and evaluation of the overall development process. Unfortunately, most of the methodologies and tools in this area have focused on the dimensions of cost and schedule (e.g., Gantt charts, PERT/CPM, Technical Performance Measures, Earned Value Management [Barr, 1996; Brandon, 1998]), with minimal attention to developing methods and tools for monitoring and evaluating a system's developmental maturity against a prescribed development plan. Monitoring progress in terms of costs and schedules may not effectively determine if the development strategy for the system is being met. That is, the systems engineers need more information, since they are usually focused not just on the cost and schedule but, more significantly, on other development strategies such as maximizing the level of system developmental maturity given a budgetary constraint, minimizing the cost of development while achieving a targeted level of system maturity, or maximizing demonstrable system performance for a specific development period.

To put together a development plan that can meet these other development strategies, the systems engineer has to understand which components of the system should be advanced to which levels of maturity and when. The optimal development path (which is most likely different from the project's critical path) must be identified such that the development strategy can be implemented. Achieving this capability will enable the systems engineer to allocate the available resources more effectively throughout the entire development lifecycle. It can also facilitate a more structured approach to monitor and control the development process, since there is now a plan against which performance can be measured and monitored. However, before this can be done, development metrics which can measure maturity or readiness must first be established.

This paper proposes the use of a system maturity scale, the System Readiness Level or SRL [Sauser et al., 2008b, 2008c], to formulate an optimal system development plan where measures of developmental maturity drive the identification of the optimal path. We then present a constrained optimization model (i.e., SCODmin) to identify which technology elements and integrations can be matured within a system such that the development costs are minimized while a targeted SRL value is attained within a specified timeframe. While the algorithm for this model does not necessarily reduce the total cost of development, it is able to find the minimal amount of expenditures that will achieve the desired maturity level at certain points in time throughout the development process. We conclude by discussing the possible use of the optimal development plan to monitor and control the progress of a system under development. Such a plan can facilitate monitoring and evaluation of the development process in terms of the actual readiness of the system. The ability to identify an optimal development plan is a significant step towards establishing a program or system monitoring and evaluation tool that captures the status of the system in terms of its readiness or maturity.

2. SYSTEM DEVELOPMENT METRICS

2.1. Readiness of Technology Elements

For the last 20 years the National Aeronautics and Space Administration (NASA) has used some variation of the Technology Readiness Level (TRL) scale [Mankins, 1995, 2002; Sadin, Povinelli, and Rosen, 1989] (see Table I) to assess the maturity of critical technology elements. More recently, the US Department of Defense (DOD), Department of Energy (DOE), and other government agencies, as well as their civilian contractors, have embraced and found new ways of utilizing TRL. Unfortunately, TRL has begun to develop into a metric which is being taken out of its original context. NASA first started to use TRL to determine which technologies to invest in during the post-Apollo era, when budgets for space exploration were being cut. Nowadays, TRL is being used by some organizations to make decisions beyond its original design. TRL does not provide a total representation of what difficulty may occur while integrating technologies or subsystems into an operational system [Dowling and Pardoe, 2005; Mankins, 2002; Meystel et al., 2003; Smith, 2005; Valerdi and Kohl, 2004]. It provides no guidance as to the uncertainty that would be expected in moving through the maturation of TRL [Cundiff, 2003; Dowling and Pardoe, 2005; Mankins, 2002; Moorehouse, 2001; Shishko, Ebbeler, and Fox, 2003; Smith, 2005], and it does not incorporate any comparative analysis techniques for alternative TRLs [Cundiff, 2003; Dowling and Pardoe, 2005; Mankins, 2002; Smith, 2005; Valerdi and Kohl, 2004]. Most important to understanding the applications and limitations of TRL is that it has only been applied to the evaluation of technology and was never intended to measure the maturity of integration [Mandelbaum, 2008; Masterson and Wimberly, 2007; Moorehouse, 2001; Valerdi and Kohl, 2004].

Table I. Technology Readiness Levels


2.2. Readiness of Integration Elements

Recognizing these shortcomings is important, considering that in systems development the proper maturation of the integrations can be as relevant as the development of the technologies themselves [Clark and Fujimoto, 1991; Henderson and Clark, 1990]. Integration elements tend to be multidimensional and can be more complicated than the technologies. Jain et al. [2008] observe that the increasing complexity of the systems integration process has been due in large part to the exponential development of technology and the increase in user demands. Currently, most efforts towards system integration are focused on the product and not on the process. This leads to inadequate analyses when decisions regarding integration are made. Specifically, interoperability must be emphasized early in the process because it contributes greatly to the difficulty of integrating the components into a system.

To address the issue of integration, there have been some notable efforts to develop metrics which can be used to measure the progress of the integration elements. The Integration Technology Analysis Methodology was proposed by Mankins [2002] and was used to formulate an Integration Technology Index (ITI). The Ministry of Defence in the United Kingdom also developed the Integration Maturity Level (IML) as part of an overall Technology Insertion Metric [Dowling and Pardoe, 2005].

However, in a study of complex systems such as the Mars Climate Orbiter, Ariane 5, and the Hubble Space Telescope, it was demonstrated that these approaches to integration measurement would not have adequately identified the root causes of development risks and thus would not have prevented the failure of these systems [Gove, 2007]. To improve on these integration measures, an Integration Readiness Level (IRL) scale was introduced by Sauser, Verma, and Ramirez-Marquez [2006] and later refined and expanded by Gove [2007]. A summary is presented in Table II. This revised scale was applied to the complex systems mentioned above and was found to be better at identifying inadequate levels of integration maturity, as well as specific areas of development which, when properly addressed, could have prevented failure of these systems.

As a practical matter, evaluating integration links using an IRL reinforces a systemic view which can moderate the tendency to overemphasize efforts to develop technologies at the expense of the integrations. This is particularly relevant during the earlier stages of development, before a technology reaches a TRL of 6, when system considerations may not yet be addressed fully.

2.3. System Readiness

To evaluate the maturity of the whole system under development, the TRL and IRL were used by Sauser et al. [2008b, 2008c] to formulate a System Readiness Level (SRL) scale. The assumption here is that the subjectivity of TRL and IRL has been mitigated through the use of guidelines [Sauser et al., 2009], standardized assessment procedures, and/or statistical tools such as the one suggested by Tan, Ramirez-Marquez, and Sauser [2009], which uses the evaluations from as many stakeholders as feasible in order to generate more statistically robust TRL and IRL values. Thus, the ordinal labels of TRL and IRL have been transformed by the technology managers and the systems engineer into numerical values, and the SRL scale can only be as meaningful as the objectivity and accuracy of the TRL and IRL numbers.

The SRL scale is calculated by using a normalized matrix of pairwise comparisons of TRLs and IRLs that reflects the actual architecture of the system. Briefly stated, the IRL matrix is obtained as a symmetric square matrix (of size n × n) of all possible integrations between any two technologies in the system. For the integration of a technology to itself, perfect integration is assumed (IRL = 9), while an IRL of zero is used when there is no integration between two elements. On the other hand, the vector TRL defines the readiness level of each of the technologies in the system. The calculation of the SRL has also gone through a series of refinements, and the most recent thorough discussion has been presented by Sauser et al. [2008a]. In its current form, the SRL is calculated as

$$
[\mathrm{SRL}] =
\begin{bmatrix}
\mathrm{SRL}_1 \\ \mathrm{SRL}_2 \\ \vdots \\ \mathrm{SRL}_n
\end{bmatrix}
=
\begin{bmatrix}
\mathrm{IRL}_{11}\,\mathrm{TRL}_1 + \mathrm{IRL}_{12}\,\mathrm{TRL}_2 + \cdots + \mathrm{IRL}_{1n}\,\mathrm{TRL}_n \\
\mathrm{IRL}_{21}\,\mathrm{TRL}_1 + \mathrm{IRL}_{22}\,\mathrm{TRL}_2 + \cdots + \mathrm{IRL}_{2n}\,\mathrm{TRL}_n \\
\vdots \\
\mathrm{IRL}_{n1}\,\mathrm{TRL}_1 + \mathrm{IRL}_{n2}\,\mathrm{TRL}_2 + \cdots + \mathrm{IRL}_{nn}\,\mathrm{TRL}_n
\end{bmatrix},
\quad (1)
$$

where $\mathrm{IRL}_{ij} = \mathrm{IRL}_{ji}$, and

$$
\mathrm{SRL} = \frac{\dfrac{\mathrm{SRL}_1}{n_1} + \dfrac{\mathrm{SRL}_2}{n_2} + \cdots + \dfrac{\mathrm{SRL}_n}{n_n}}{n}, \quad (2)
$$

where $n_i$ is the number of integrations with technology $i$ plus its integration to itself.
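As a minimal numerical sketch of Eqs. (1) and (2), assuming the normalized form used later in the paper (TRLs and IRLs divided by 9); the two-technology system and its TRL/IRL values below are illustrative, not the paper's six-technology example:

```python
import numpy as np

def system_readiness_level(trl, irl):
    """Composite SRL per Eqs. (1)-(2): the normalized IRL matrix times the
    normalized TRL vector gives per-technology SRL_i; each SRL_i is divided
    by its number of integrations n_i (including self, since IRL_ii = 9),
    then averaged over the n technologies."""
    trl = np.asarray(trl, dtype=float) / 9.0    # normalize TRLs
    irl = np.asarray(irl, dtype=float) / 9.0    # normalize IRLs (symmetric)
    srl_i = irl @ trl                           # Eq. (1), one value per technology
    n_i = np.count_nonzero(irl, axis=1)         # integrations incl. self; IRL = 0 means none
    return float(np.mean(srl_i / n_i))          # Eq. (2)

# Hypothetical two-technology system: TRLs 6 and 7, one integration at IRL 4.
trl = [6, 7]
irl = [[9, 4],
       [4, 9]]
print(round(system_readiness_level(trl, irl), 3))  # → 0.522
```

Note the diagonal entries of 9 encoding a technology's perfect integration with itself, as the text above assumes.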

The resulting SRL metric can be used to determine the maturity of a system and its status within the developmental lifecycle. Table III, for example, is a representation of how the SRL scale correlates to a systems engineering lifecycle. The values of the SRL scale shown in Table III are meant to be generic organizational examples of how the calculated SRL values can be set as a guide by a systems engineer or program manager. That is, in practice the systems engineer or program manager at the outset must determine what values of the SRL correlate to when one phase begins and ends. A calibration of these relevant ranges for each phase of system development will have to be program-specific or, at best, pertinent only to a particular class of systems which share a large degree of similarity. Therefore, the SRL value of a system can only be compared to that of the same system or a similar system.

The approach to estimating SRL is similar to the one used in Failure Mode Effects and Criticality Analysis (FMECA) [Becker and Flick, 1996; Deb et al., 1998], where an ordinal datum, Severity (S), is transformed into a numerical value and combined with the probabilities of Occurrence (O) and Detection (D) to generate a Risk Priority Number (RPN). FMECA is used successfully in a wide range of applications such as the semiconductor, healthcare, and motor vehicle industries.
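The FMECA analogy can be made concrete; a minimal sketch in which the conventional 1-10 scales and the example scores are illustrative assumptions, not values from this paper:

```python
def risk_priority_number(severity, occurrence, detection):
    """RPN = S x O x D: three ordinal ratings, each mapped to an
    integer on a 1-10 scale, multiplied into a single risk number."""
    for score in (severity, occurrence, detection):
        if not 1 <= score <= 10:
            raise ValueError("FMECA scores are conventionally rated 1-10")
    return severity * occurrence * detection

# Illustrative failure mode: severe (8), infrequent (3), moderately detectable (5).
print(risk_priority_number(8, 3, 5))  # → 120
```

As with the SRL, the point is that an ordinal judgment becomes usable in arithmetic only after an explicit numeric mapping.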


Table II. Integration Readiness Levels [Gove, 2007]

Table III. System Readiness Levels

NOTE: These ranges have been derived conceptually and are undergoing field verification and validation under Naval Postgraduate School Contract # N00244-08-0005.


Transforming ordinal data into numbers is also done in the Analytic Hierarchy Process [Saaty, 1988], which allows the use of subjective human judgment to determine the relative importance of variables used in pairwise comparisons to find an optimal solution to a problem. Sample applications can be found in Bahurmoz [2003] and Tavana [2003].

The use of ordinal data in the SRL can also be explained by a widely accepted practice in academic administration: the use of the Grade Point Average (GPA) to determine the readiness of students to proceed to the next levels of development. The performance of a student is measured using a description of their accomplishments, to which letter grades are awarded. For example, distinguished work is assigned an "A," superior work a "B," satisfactory or average work a "C," and so on. To calculate overall performance, numbers are assigned to each letter grade and paired with the corresponding academic credits earned to estimate a weighted GPA. However, when comparing students relative to each other, such as in a group of job applicants, the GPA is only valid as an absolute criterion when the students have the same academic major and come from comparably similar schools.

3. SYSTEM COST OF DEVELOPMENT MINIMIZATION MODEL

By utilizing the TRL and IRL, the SRL is designed to measure the overall readiness of the system under development. As such, systems engineers or program managers can now set development goals in accordance with the system development strategy that they have formulated. This paper focuses on the cost-driven strategy [Laugen et al., 2005]. For example, a standard development strategy is to optimize the allocation of resources while attaining a certain level of system maturity or readiness within a specified time. In order to execute the development required to reach an SRL value by a certain time, it is necessary to know how to reach this level at a minimum cost. Such a development strategy for complex systems is probably becoming more common in the government sector as the political pressure to contain costs increases. To address these concerns, we are proposing Model SCODmin as an optimization model whose objective is to minimize development cost (a function of TRL and IRL development) under constraints associated with the required SRL value and schedule. This model recognizes that technologies compete for resources and that benefits can result in an improved SRL via the optimal allocation of such resources. The general mathematical form of Model SCODmin follows:

Minimize:
$$
\mathrm{SCOD}(\mathbf{TRL}, \mathbf{IRL}) = \mathrm{SCOD}_{\mathrm{fixed}} + \mathrm{SCOD}_{\mathrm{variable}}(\mathbf{TRL}, \mathbf{IRL}), \quad (3)
$$

Subject to:
$$
\mathrm{SRL}(\mathbf{TRL}, \mathbf{IRL}) \ge \lambda, \quad (4)
$$
$$
R_1(\mathbf{TRL}, \mathbf{IRL}) \le r_1, \quad (5)
$$
$$
\vdots
$$
$$
R_H(\mathbf{TRL}, \mathbf{IRL}) \le r_H. \quad (6)
$$

In addition to the SRL and time or schedule constraints, other possible constraints could be technical performance parameters, such as equivalent mass for space systems, peak load capacities for transportation, and so on.

The matrices IRL and TRL in Model SCODmin contain the decision variables. Each of these variables is integer-valued and bounded by (IRLi, 9) and (TRLi, 9), respectively. That is, the TRL/IRL for the ith component cannot be below its current level or above perfect technology or integration development (IRL or TRL = 9).

To completely characterize the decision variables in Model SCODmin, it is necessary to introduce the following transformation:

$$
y_{ik} = \begin{cases} 1 & \text{if } \mathrm{TRL}_i = k, \\ 0 & \text{otherwise}, \end{cases} \quad (7)
$$

and

$$
x_{ijk} = \begin{cases} 1 & \text{if } \mathrm{IRL}_{ij} = k, \\ 0 & \text{otherwise}, \end{cases} \qquad \text{for } k = 1, \ldots, 9. \quad (8)
$$

Notice that, based on these binary variables, each of the possible normalized TRL and IRL values in the system can be obtained as

$$
\mathrm{TRL}_i = \frac{\sum_{k=1}^{9} k\,y_{ik}}{9}
\quad \text{and} \quad
\mathrm{IRL}_{ij} = \frac{\sum_{k=1}^{9} k\,x_{ijk}}{9}.
$$

Based on these binary variables, $\mathrm{SRL}_i$ is transformed to

$$
\mathrm{SRL}_i
= \frac{\left(\sum_{k=1}^{9} k\,x_{i1k}\right)\left(\sum_{k=1}^{9} k\,y_{1k}\right)}{81}
+ \frac{\left(\sum_{k=1}^{9} k\,x_{i2k}\right)\left(\sum_{k=1}^{9} k\,y_{2k}\right)}{81}
+ \cdots
+ \frac{\left(\sum_{k=1}^{9} k\,x_{ijk}\right)\left(\sum_{k=1}^{9} k\,y_{jk}\right)}{81}
+ \cdots
+ \frac{\left(\sum_{k=1}^{9} k\,x_{ink}\right)\left(\sum_{k=1}^{9} k\,y_{nk}\right)}{81}
= \sum_{j=1}^{n} \frac{\left(\sum_{k=1}^{9} k\,x_{ijk}\right)\left(\sum_{k=1}^{9} k\,y_{jk}\right)}{81}. \quad (9)
$$
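The binary transformation of Eqs. (7)-(9) can be sketched directly; the two-technology levels below are illustrative assumptions used only to check the decode against the direct product of normalized levels:

```python
import numpy as np

def one_hot_level(level):
    """Eqs. (7)-(8): a length-9 binary vector that is 1 at the current
    readiness level k (1..9) and 0 elsewhere."""
    y = np.zeros(9, dtype=int)
    y[level - 1] = 1
    return y

def srl_component(x_i, y):
    """Eq. (9): SRL_i as the sum over technologies j of the decoded
    IRL_ij level times the decoded TRL_j level, normalized by 81.
    x_i is the list of one-hot IRL rows for technology i; y is the
    list of one-hot TRL vectors, one per technology."""
    k = np.arange(1, 10)                     # level weights 1..9
    return sum((k @ x_ij) * (k @ y_j) for x_ij, y_j in zip(x_i, y)) / 81.0

# Hypothetical check: TRL_1 = 6, TRL_2 = 7, IRL_11 = 9, IRL_12 = 4.
y = [one_hot_level(6), one_hot_level(7)]
x_1 = [one_hot_level(9), one_hot_level(4)]
print(round(srl_component(x_1, y), 4))       # (9*6 + 4*7)/81 → 1.0123
```

The dot product `k @ one_hot` simply recovers the level from its binary encoding, which is why Eq. (9) reduces to the matrix form of Eq. (1) after normalization.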

Based on the computation of the SRL with these decision variables, Model SCODmin belongs to the class of binary, integer-valued, nonlinear problems. For a system with n technologies containing m [m ≤ (n – 1)n/2] distinct integrations, and assuming all technologies and integrations are at their lowest levels, there are $9^{n+m}$ potential solutions to Model SCODmin. Evaluating each possible solution is prohibitive, so to generate a more timely optimal solution, a meta-heuristic approach developed by Ramirez-Marquez and Rocco [2008] was applied to the system under development that is described below. This approach, called the Probabilistic Solution Discovery Algorithm (PSDA), has the capability of producing quasioptimal solutions in a relatively short period of time. However, it must be mentioned that the results cannot be proven to be the optimal solution. This is because, by taking a probabilistic approach, the algorithm can only select subsets of the entire feasible set from which to find a solution. Every time the algorithm is run, a different subset is selected. Nevertheless, prior tests have indicated that PSDA results tend to be better than results from alternative meta-heuristic approaches [Ramirez-Marquez and Rocco, 2007].

4. DESCRIPTION OF THE ALGORITHM

As used in the solution of the minimization problem, the algorithm follows three interrelated steps:

• Strategy Development: a Monte Carlo simulation is used to identify to what potential TRL or IRL levels the technologies and links can be advanced or matured.

• Analysis: each potential solution is analyzed by calculating its associated cost, schedule, and SRL.

• Selection: through an evolutionary optimization technique, a new optimal set of technologies and integration links (with their corresponding TRLs and IRLs) is chosen, based on the cost, schedule, and SRL values.

During Strategy Development, based on the probabilities defined by the vectors γ_iu and γ_iju, the simulation is used to generate a specified number (defined by V) of potential designs, TRL_u^v and IRL_u^v (v = 1, …, V). For each technology i, γ_iuk (the kth element of vector γ_iu) defines the probability that, at cycle u, the TRL of such a technology will increase its current readiness to level k [i.e., γ_iuk = P(y_i^k = 1)]. Similarly, γ_ijuk defines the probability that, at cycle u, the IRL between the ith and jth technologies will increase its actual readiness to level k [i.e., γ_ijuk = P(x_ij^k = 1)]. This step also contains the stopping rules of the algorithm. In essence, the first rule, which is used in this paper, dictates that the algorithm be stopped once both vectors γ_iu and γ_iju can no longer be updated (i.e., all initial "appearance" probabilities are either 0 or 1). The second rule allows the user to set a specific number of cycles. In the context of this algorithm, a cycle is understood as every time the value u is updated.

The second step, Solution Analysis, implements the approach discussed in Sauser et al. [2008a] and summarized in Section 2 above to obtain the SRL, the resource (man-hours) consumption, and the cost of development associated with each of the potential system designs, TRL_u^v and IRL_u^v.

The final step in the algorithm penalizes the development cost of the potential designs generated in cycle u whenever they violate the SRL and schedule requirements. The solutions are then ranked in increasing order of magnitude with respect to the penalized cost. Then, the best of these solutions is stored in set K and, finally, a subset of size S of the ranked feasible solutions is used to update the probabilities defined by the vectors γ_iu and γ_iju. These new vectors are re-evaluated in Step 1 to check for termination or for solution discovery. Finally, when either of the stopping rules has been met, the best solution in set K is chosen as the optimal system design. The pseudocode of the PSDA optimization is as follows:

Probabilistic Solution Discovery Algorithm (PSDA)

Initialize: V, S, U, α, γ_i^u, γ_ij^u, u = 1, K = ∅

Step 1 (Design Development):
For v = 1, …, V {
    Generate a potential system design (TRL_v^u, IRL_v^u) via Monte Carlo simulation as dictated by γ_i^u and γ_ij^u. That is, for each element of γ_i^u, where γ_ik^u = P(y_ik = 1) and Σ_k γ_ik^u = 1, generate a random b_i ∈ (0, 1) such that:
        if γ_i1^u ≥ b_i, then y_i1 = 1;
        else, if Σ_{t=1}^{k−1} γ_it^u < b_i ≤ Σ_{t=1}^{k} γ_it^u, then y_ik = 1,    i = 1, …, n; k = 2, …, 9
    (same procedure for γ_ijk^u)
    v = v + 1 }
If (γ_ik^u ∧ γ_ijk^u = 1 ∨ γ_ik^u ∧ γ_ijk^u = 0, ∀ i, j) ∨ u = U, then
    Cost* = argmin_{(TRL, IRL) ∈ K} { Cost̄(TRL, IRL) } and Stop.
Else, go to Step 2.

Step 2 (Design Analysis):
For v = 1, …, V, obtain Cost(TRL_v^u, IRL_v^u), SRL(TRL_v^u, IRL_v^u), R_1(TRL_v^u, IRL_v^u), …, R_H(TRL_v^u, IRL_v^u).
Go to Step 3.

Step 3 (Solution Discovery):
For v = 1, …, V {
    Δ_h(TRL_v^u, IRL_v^u) = 1 + a[R_h(TRL_v^u, IRL_v^u) − r_h]/r_h  if R_h(TRL_v^u, IRL_v^u) > r_h; 1 otherwise, where a > 0
    Δ_SRL(TRL_v^u, IRL_v^u) = 1 + b[λ − SRL(TRL_v^u, IRL_v^u)]/λ  if SRL(TRL_v^u, IRL_v^u) < λ; 1 otherwise, where b > 0
    Cost̄(TRL_v^u, IRL_v^u) = Cost(TRL_v^u, IRL_v^u) × Δ_SRL(TRL_v^u, IRL_v^u) × Π_{h=1}^{H} Δ_h(TRL_v^u, IRL_v^u) }
List Cost̄(TRL_v^u, IRL_v^u) in increasing order of magnitude:
    Cost̄(TRL_(1)^u, IRL_(1)^u) ≤ Cost̄(TRL_(2)^u, IRL_(2)^u) ≤ … ≤ Cost̄(TRL_(V)^u, IRL_(V)^u)
K = K ∪ {Cost̄(TRL_(1)^u, IRL_(1)^u)}; u → u + 1
For all i, j, k, update the vectors γ_i^u and γ_ij^u as follows:
    γ_ik^u = (1/S) Σ_{s=1}^{S} (y_ik | y_ik ∈ TRL^(s)),    S << V
(same procedure for γ_ijk^u, but use IRL). Go to Step 1.
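The sample-penalize-rank-update cycle of the PSDA can be sketched in executable form. The Python sketch below is illustrative only: it collapses TRL and IRL into a single vector of element levels, uses a caller-supplied cost function, replaces the R_h resource constraints with a single SRL-style floor λ, and applies one multiplicative penalty in the spirit of Δ_SRL. The function name, parameters, and penalty constant are assumptions for illustration, not the authors' implementation.

```python
import random

def psda(n, levels, cost_fn, srl_fn, lam, b=100.0, V=50, S=10, U=200, seed=0):
    """Toy Probabilistic Solution Discovery sketch (illustrative only):
    n elements, each matured to one of `levels`, subject to srl_fn >= lam."""
    rng = random.Random(seed)
    # gamma[i][k]: probability that element i is matured to levels[k]
    gamma = [[1.0 / len(levels)] * len(levels) for _ in range(n)]
    best = None
    for u in range(U):
        # Step 1: Monte Carlo design generation from the gamma vectors
        designs = [tuple(rng.choices(levels, weights=gamma[i])[0]
                         for i in range(n)) for _ in range(V)]
        scored = []
        for d in designs:
            # Steps 2-3: analyze each design, then penalize violations
            c, m = cost_fn(d), srl_fn(d)
            if m < lam:
                c *= 1.0 + b * (lam - m) / lam  # Delta_SRL-style penalty
            scored.append((c, d))
        scored.sort()                 # rank by (penalized) cost
        c0, d0 = scored[0]
        if srl_fn(d0) >= lam and (best is None or c0 < best[0]):
            best = (c0, d0)          # feasible incumbent, unpenalized cost
        elite = [d for _, d in scored[:S]]
        for i in range(n):           # update gamma from the top S designs
            for k, lev in enumerate(levels):
                gamma[i][k] = sum(d[i] == lev for d in elite) / S
        if all(max(g) == 1.0 for g in gamma):
            break                    # distributions have converged to 0/1
    return best
```

With a toy linear cost (e.g., `cost_fn = lambda d: sum(10 * k for k in d)`) and the mean level as a stand-in maturity measure, `psda(6, list(range(1, 10)), cost_fn, srl_fn, lam=7)` converges on a cheap plan whose mean maturity meets the floor, mirroring how the penalty steers the search toward feasible, low-cost designs.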

5. NOTATIONAL EXAMPLE AND RESULTS

The following example uses a simple system of six technologies and seven integrations (see Fig. 1) to demonstrate the steps involved in calculating the SRL value and minimizing the cost subject to constraints on system maturity and schedule. By evaluating the SRL of this system, an estimate of its actual readiness can be obtained before it is deployed. Reviewing the SRL for this system in its current state, the calculations yielded an SRL of 0.48. Referring to Table III, this value indicates that the system should be in the Technology Development phase, with the technologies close to maturity (the lowest TRL is 6) while the integration elements lag behind, one as low as level 2. For the system used in this example, Tables IV and V present the incremental budgetary and time requirements to mature each technology and integration element from its current level to the next. For example, to mature Technology 1 from its current TRL of 8 to 9 will require another $900,000 and 349 labor-hours. To fully mature all the technologies and integration elements, an additional $26.574 million and 19,122 labor-hours will be required.
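The SRL calculation behind a figure like 0.48 can be illustrated with a small numeric sketch. The code below follows the matrix formulation proposed by Sauser et al. [2008a] — normalize the TRL vector and IRL matrix from their 1–9 scales, multiply, and average — but the matrix values and the exact normalization here are illustrative assumptions, not the data of the six-technology example.

```python
import numpy as np

def composite_srl(trl, irl):
    """Subsystem and composite SRL, in the spirit of Sauser et al. [2008a].

    trl: length-n vector of TRL values (1-9).
    irl: n x n symmetric matrix of IRL values (1-9), 0 where no
         integration exists; the diagonal pairs a technology with itself.
    """
    trl_n = np.asarray(trl, float) / 9.0       # normalize to (0, 1]
    irl_n = np.asarray(irl, float) / 9.0
    raw = irl_n @ trl_n                         # integration-weighted maturity
    links = (np.asarray(irl) > 0).sum(axis=1)   # integrations per technology
    srl_i = raw / links                         # subsystem SRL_i
    return srl_i, float(srl_i.mean())           # composite SRL

# Fully mature toy system: every TRL and existing IRL at 9 yields SRL = 1.0
irl = np.array([[9, 9, 0],
                [9, 9, 9],
                [0, 9, 9]])
_, srl = composite_srl([9, 9, 9], irl)
```

Lowering any single TRL or IRL value pulls the composite SRL below 1.0, which is the property the optimization exploits: each maturation decision moves the system measurably toward the target.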

To further explain SCODmin, we describe a situation where, for example, management wants to increase maturity from the current value of 0.48 to a level which realizes 80% of the remaining maturity (0.48 + 0.52 × 0.80) for a total SRL of 0.896, using a maximum of 80% of the remaining time (15,298 labor-hours). For this case, the PSDA cost minimization model calculated a minimum additional development cost of $16.888 million, requiring 11,309 labor-hours. Table VI summarizes the solution convergence for the last five cycles of the algorithm.
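The target arithmetic is easy to reproduce from the figures quoted above (current SRL of 0.48; 19,122 remaining labor-hours):

```python
current_srl = 0.48
remaining_hours = 19_122

# Realizing a fraction p of the remaining maturity (1 - 0.48 = 0.52)
for p in (0.2, 0.4, 0.6, 0.8):
    target = current_srl + (1.0 - current_srl) * p
    print(f"{p:.0%} improvement -> target SRL {target:.3f}")

# The 80% scenario: target SRL 0.896, time budget 80% of remaining hours
assert round(current_srl + 0.52 * 0.80, 3) == 0.896
assert round(0.80 * remaining_hours) == 15_298
```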

In addition, the development plan which can achieve this desired SRL value of 0.896 with the least cost will be attained if the subsystems which are based on each technology element reach the maturity levels listed in Table VII. It shows that, of the six subsystems, three are ahead (SRL1, SRL2, SRL3), two are slightly behind (SRL4, SRL5), and one is close to the same level (SRL6) as the whole system. This insight can become useful when the maturity levels are associated with systems engineering activities; hence, the spectrum of SRLi's can indicate levels of variation in the systems engineering activities which are needed to mature the entire system.

Figure 1. System concept diagram: Tech 1—Remote Manipulator System (RMS); Tech 2—Special Purpose Dexterous Manipulator (SPDM); Tech 3—Electronic Control Unit (ECU); Tech 4—Autonomous Grappling (AG); Tech 5—Autonomous Proximity Operations (APO); and Tech 6—Laser Image Detection and Radar (LIDAR).

Table IV. Estimated Incremental Cost (× 1000) and Time for Each Technology Effort

Table VIII summarizes the additional results when the desired improvements in SRL values are equivalent to 20%, 40%, 60%, and 80%. Finally, Table IX indicates the development plan for each improvement scenario.

6. LIMITATIONS AND ASSUMPTIONS

It must be noted that the algorithm can only work if the management objectives are inherently feasible to begin with. If a prescribed objective is impossible to achieve, such as when minimal time or labor-hours are available, the algorithm will not produce a solution.

It must also be stressed that the development plan which the algorithm produced is only applicable to realizing the specific development strategy which has been prescribed—to minimize costs subject to reaching a prespecified overall system maturity level within a prescribed period of time. This plan may change when a different development strategy is adopted. Furthermore, the costs which are minimized are the costs to be incurred during a particular period (a year in our example) in order to reach the goal for that point in time. The total or final cost of developing the system remains the same as the original values estimated by the systems engineers. Essentially, the plan shifts as much of the remaining budget as possible towards the latter part of the development process. This can be a useful option when there is substantial uncertainty about the system due to its high novelty and technological content. By delaying expenditures to later years when more information becomes available, the development team can formulate more precise budget estimates and have a better chance of finding potential savings. For example, with more understanding of the system components, the integration and testing methodologies proposed by Tretmans [2007] can be applied. In a development regime characterized by pressures to reduce costs, this is an important capability for today's program managers.

Table V. Estimated Incremental Cost (× 1000) and Time for Each Integration Effort

Table VI. Solution Convergence for 80% Improvement

Table VII. Subsystem and Composite SRLs

The use of TRL in the calculation of a system's readiness means that the limitations associated with the subjectivity of TRL are carried forward to the SRL. To have a more precise SRL calculation, efforts must first be applied to minimize the subjectivity of TRL as well as IRL. This can be achieved by making sure that the description of the performance characteristics of a critical technology element or integration link, upon which its readiness level is based, is supported by clear evidence of actual technical performance that flows from the execution of engineering plans for that component. That is, the project managers who are in charge of developing their respective technology and integration elements must show unequivocally that the technical performance criteria which satisfy the description of a specific readiness value have been met. They must also be able to trace such accomplishments back to the actual completion of the planned engineering tasks. However, it must be recognized that there will be instances where precise evaluations cannot be performed, such as when the relevant environment cannot be modeled with adequate levels of fidelity. In such cases, and only then, should the project team resort to a subjective approach such as using TRL or IRL guides and calculators. To reduce subjectivity to a minimum, the use of these calculators must be coupled with the application of well-established parameter estimation techniques.

It must be emphasized that the SRL itself is not determined by looking at the performance characteristics of the system as a whole and directly estimating its system readiness. Instead, it results from the actual readiness of its critical technology and integration elements. Therefore, the technical justifications for awarding a certain TRL or IRL value must be very clearly understood and demonstrated. This is most important during the evaluation of the critical elements, which by themselves may have reached their desired readiness but may, in fact, turn out to be less mature when exposed to the dynamics of their actual integration into the subsystem or system to which they belong. Such a regression in the maturity of the critical elements should be expected during integration, since most complex system components tend to be multidimensional and not entirely predictable.

7. CONCLUSIONS AND FUTURE WORK

In addition to providing program managers and systems engineers with the possibility of avoiding unnecessary expenditures until the technologies and integration links are better understood, the methodologies proposed in this paper can also lead to more effective and efficient development of systems. The identification of a system-wide optimal development plan makes it possible to avoid performing tasks that do not necessarily contribute to the progress of a system, thus avoiding delays. It also allows the program manager to monitor and evaluate the progress of the system under development in terms of its readiness or maturity. Along with the data on its status relative to cost and schedule performance, the systems engineer or program manager can now have a more comprehensive view of the system's development. The monitoring and evaluation process may be done at various levels of abstraction using appropriate tools. For example, the use of existing project management tools coupled with technology and integration readiness assessment can allow for tracking of the progress of the individual technologies and integrations. The individual project reports can then be synthesized into a comprehensive monitoring and evaluation system which can track the achieved SRL, cost, and schedule against the planned values. Such an approach, which we refer to as System Earned Readiness Management (SERM), as well as the potential inclusion of other well-defined metrics of interest, particularly Manufacturing Readiness Level (MRL), are part of the future research activities of the authors. As far as the latter is concerned, under the current SRL formulation it is quite possible to expand its relevance by incorporating other metrics that are of interest to systems engineers and program managers, such as MRL, or scales that measure readiness to transition into deployment or facility upgrade.

Table IX. Development Plan

Table VIII. Best Solutions for Desired SRL Values

320 MAGNAYE, SAUSER, AND RAMIREZ-MARQUEZ

Systems Engineering DOI 10.1002/sys

Page 11: System development planning using readiness levels in …personal.stevens.edu/~bsauser/SysDML/Evolution_Lifecylce_Managem… · System Development Planning Using Readiness Levels

With regard to making the process and the outcome of system development better, SCODmin becomes a value-added approach which gives the systems engineer the information needed to understand what a delay may mean to development, so a more informed decision can be made. This approach can be used by systems engineers or program managers as a methodology for identifying an optimal development plan in order to better monitor and control a system under development. However, the concepts and methodologies developed here can also have implications for the ability of systems engineers to conduct other systems engineering activities, as summarized in Table X.

These can, therefore, facilitate the development of better systems or products. As indicated, the use of SRL in the optimization algorithm enables the program manager to examine different development scenarios. For example, the system under development can be improved technically and economically if alternative technologies and integration elements which are more mature but less capable are used to replace those which are more capable but still experimental. Alternative system architectures can also be evaluated with greater clarity, since SRL requires the examination of the critical technology elements, how they relate to each other, and how they are linked together as a system.

Before the proposed extensions of the research can be pursued, the IRL and SRL concepts must first be verified and validated using real-life systems with actual data on the resources needed to go from one readiness level to the next. The verification and validation efforts can also be used to study the degree of difficulty associated with advancing through the readiness level scales. It is anticipated that this process of developing components is far from linear and that the marginal productivities of labor and capital are not constant throughout the component and system development lifecycles. They may also vary from one class of system to the next. These research activities are currently underway and, if they are successful, the optimization model can then be applied to real-life systems.

ACKNOWLEDGMENTS

The authors acknowledge the Naval Postgraduate School (Contract # N00244-08-000), Northrop Grumman Integrated Systems, and the U.S. Army Armament Research Development and Engineering Center for support of this research.

REFERENCES

A.T. Bahill and C. Briggs, The systems engineering started in the middle process: A consensus of systems engineers and project managers, Syst Eng 4(2) (2001), 156–167.

Table X. Relevance of Readiness Scales and Optimization to the Systems Engineering Process

A.T. Bahill and B. Gissing, Re-evaluating systems engineering concepts using systems thinking, IEEE Trans Syst Man Cybernet Part C Appl Rev 28(4) (1998), 516–527.

M. Bahurmoz, The analytical hierarchy process at Dar Al-Hekma, Saudi Arabia, Interfaces 33(4) (2003), 70–78.

Z. Barr, Earned value analysis: A case study, Project Management Network 27(4) (1996), 31–36.

J.C. Becker and G. Flick, Practical approach to Failure Mode, Effects and Criticality Analysis (FMECA) for computing systems, Proc High-Assurance Syst Eng Workshop, 1996, pp. 228–236.

J. Brady, Systems engineering cost as an independent variable, Syst Eng 4(4) (2001), 233–241.

D.M. Brandon, Implementing earned value easily and effectively, Project Management J 29(2) (1998), 11–19.

K.B. Clark and T. Fujimoto, Product development performance, Harvard Business School Press, Boston, 1991.

D. Cundiff, Manufacturing Readiness Levels (MRL), unpublished white paper, 2003.

J. Daniels, P.W. Werner, and A.T. Bahill, Quantitative methods for tradeoff analyses, Syst Eng 4(3) (2001), 190–212.

S. Deb, S. Ghoshal, S. Mathur, R. Shrestha, and R. Pattipati, Multisignal modeling for diagnosis, FMECA and reliability, Proc IEEE Int Conf Syst Man Cybernet 3 (1998), 3026–3031.

DoD, DoD Directive 5000.2, U.S. Department of Defense, Washington, DC, 2005.

T. Dowling and T. Pardoe, TIMPA—Technology Insertion Metrics, Vol. 1, Ministry of Defence, London, UK, 2005.

G. Friedman and A.P. Sage, Case studies of systems engineering and management in systems acquisition, Syst Eng 7(1) (2004), 84–97.

R. Gove, Development of an integration ontology for systems operational effectiveness, Master of Science Thesis, Stevens Institute of Technology, Hoboken, NJ, 2007.

J.R. Hauser and D. Clausing, The house of quality, Harvard Bus Rev 3 (1988), 63–73.

R.M. Henderson and K. Clark, Architectural innovation: The reconfiguration of existing product technologies and the failure of established firms, Admin Sci Quart 35(1) (1990), 9–30.

INCOSE, Systems engineering handbook, International Council on Systems Engineering, Seattle, WA, 2008.

ISO/IEC, ISO/IEC Std 15288:2008, Systems and software engineering—System life cycle processes, International Organization for Standardization/IEC/IEEE, Geneva, 2008.

R. Jain, A. Chandrasekaran, G. Elias, and R. Cloutier, Exploring the impact of systems architecture and systems requirements on systems integration complexity, IEEE Syst J 2(2) (2008), 209–223.

J.E. Kasser, The first requirements elucidator demonstration (FRED) tool, Syst Eng 7(3) (2004), 243–256.

B.T. Laugen, N. Acur, H. Boer, and J. Frick, Best manufacturing practices—What do the best-performing companies do?, Int J Oper Prod Management 25(2) (2005), 131–150.

J. Mandelbaum, Technology readiness assessments for systems of systems, Technology Maturity Conference, Virginia Beach, VA, 2008.

J.C. Mankins, Technology readiness levels, NASA, Houston, TX, 1995.

J.C. Mankins, Approaches to strategic research and technology (R&T) analysis and road mapping, Acta Astronaut 51(1–9) (2002), 3–21.

E. Masterson and G. Wimberly, Air Force Technology Readiness Assessment (TRA) process for major defense acquisition programs, Technology Maturity Conference, Virginia Beach, VA, 2007.

A. Meystel, J. Albus, E. Messina, and D. Leedom, Performance measures for intelligent systems: Measures of technology readiness, PERMIS '03 White Paper, 2003.

D.J. Moorehouse, Detailed definitions and guidance for application of technology readiness levels, J Aircraft 39(1) (2001), 190–192.

B. Oppenheim, Lean product development flow, Syst Eng 7(4) (2004), 352–376.

S. Pugh, Total design: Integrated methods for successful product engineering, Addison-Wesley, Reading, MA, 1991.

J.E. Ramirez-Marquez and C. Rocco, Probabilistic knowledge discovery algorithm in network reliability optimization, ESREL Annu Conf, Stavanger, Norway, 2007.

J.E. Ramirez-Marquez and C. Rocco, All-terminal network reliability optimization via probabilistic solution discovery, Reliability Engineering & System Safety, 2008, in press.

T. Saaty, The analytic hierarchy process, McGraw-Hill, New York, 1988.

S.R. Sadin, F.P. Povinelli, and R. Rosen, The NASA technology push towards future space mission systems, Acta Astronaut 20 (1989), 73–77.

B. Sauser, D. Verma, and J. Ramirez-Marquez, From TRL to SRL: The concept of systems readiness levels, Conf Syst Eng Res, Los Angeles, CA, 2006.

B. Sauser, J. Ramirez-Marquez, D. Henry, and D. DiMarzio, A system maturity index for the systems engineering life cycle, Int J Indust Syst Eng 3(6) (2008a), 673–691.

B. Sauser, J. Ramirez-Marquez, R. Magnaye, and W. Tan, System maturity indices for decision support in the defense acquisition process, Defense Acquisition Res Symp, NPS, 2008b, pp. 127–140.

B. Sauser, J. Ramirez-Marquez, R. Magnaye, and W. Tan, A systems approach to expanding the technology readiness level within defense acquisition, Int J Defense Acquisition Management 1 (2008c), 39–58.

B. Sauser, E. Forbes, M. Long, and S. McGrory, Defining an integration readiness level for defense acquisition, Int Conf Int Council Syst Eng, INCOSE, Singapore, July 2009.

T. Shell, The synthesis of optimal systems design solutions, Syst Eng 6(2) (2003), 92–105.

A.J. Shenhar and B.J. Sauser, "Systems engineering management: The multidisciplinary discipline," Handbook of systems engineering and management, 2nd edition, A.P. Sage and W.B. Rouse (Editors), Wiley, Hoboken, NJ, 2009.

A.J. Shenhar, "Systems engineering management: The multidisciplinary discipline," Handbook of systems engineering and management, A.P. Sage and W.B. Rouse (Editors), Wiley, New York, 1999, pp. 113–136.

R. Shishko, D.H. Ebbeler, and G. Fox, NASA technology assessment using real options valuation, Syst Eng 7(1) (2003), 1–12.

J.D. Smith, An alternative to technology readiness levels for Non-Developmental Item (NDI) software, 38th Hawaii Int Conf Syst Sci, 2005.

W. Tan, J.E. Ramirez-Marquez, and B. Sauser, A probabilistic approach to reliable system readiness level assessment, J Eng Technol Management (2009), submitted.

M. Tavana, CROSS: A multicriteria group-decision-making model for evaluating and prioritizing advanced-technology projects at NASA, Interfaces 33(3) (2003), 40–56.

J. Tretmans, Tangram: Model-based integration and testing of complex high-tech systems, Embedded Systems Institute, Eindhoven, The Netherlands, 2007.

R. Valerdi and R.J. Kohl, An approach to technology risk management, Eng Syst Div Symp, MIT, Cambridge, MA, 2004.

D. Verma, C. Smith, and W. Fabrycky, Fuzzy set based multi-attribute conceptual design evaluation, Syst Eng 2(4) (1999), 187–197.

A. Vollerthun, Design-to-market: Integrating conceptual design and marketing, Syst Eng 5(4) (2002), 315–326.

S.C. Wheelwright and K.B. Clark, Revolutionizing product development, The Free Press, New York, 1992.

D.E. Whitney, Manufacturing by design, Harvard Bus Rev 88(4) (1988), 83–91.

J.P. Womack, D.T. Jones, and D. Roos, The machine that changed the world, Rawson, New York, 1990.

Romulo B. Magnaye is a registered mining engineer. He received his B.S. in Mining Engineering and his M.B.A. from the University of the Philippines, and a postgraduate research Diploma in Mineral Technology from the Camborne School of Mines in England. He was the Assistant Chief of the International Relations Division in the Office of the Philippine Secretary of Natural Resources and later became the Chief Project Evaluation Officer in the National Development Company, the investment arm of the Philippine government. He worked at Rensselaer Polytechnic Institute, Skidmore College, and the Pennsylvania State University, teaching International Business and Manufacturing Strategy. In 2007, he joined the School of Systems and Enterprises of Stevens Institute of Technology to pursue his Ph.D. in Engineering Management as well as work as an Adjunct Professor of Global Business and Project Management. He was awarded fellowships by the British Council and the Danish Summer Research Institute (Copenhagen Business School). He is also the recipient of a Robert Crooks Stanley Fellowship from Stevens. He was elected to Beta Gamma Sigma, the honor society for management.

Brian J. Sauser is an Assistant Professor in the School of Systems & Enterprises at Stevens Institute of Technology. He holds a B.S. from Texas A&M University in Agricultural Development with an emphasis in Horticulture Technology, an M.S. from Rutgers, The State University of New Jersey, in Bioresource Engineering, and a Ph.D. from Stevens Institute of Technology in Project Management. Before joining Stevens, he spent more than 12 years working in government, industry, and academia, both as a researcher/engineer and as a director of programs. His research interests are in theories, tools, and methods for bridging the gap between systems engineering and project management for managing complex systems. This includes the advancement of systems theory in the pursuit of a biology of systems, system and enterprise maturity assessment for system and enterprise management, and systems engineering capability assessment. His work on system maturity assessment has been adopted as a decision support tool by organizations within NASA, the U.S. Army, the U.S. Navy, Northrop Grumman, and Lockheed Martin (http://www.SystemReadinessLevel.com).

Jose Emmanuel Ramirez-Marquez is an Assistant Professor in the School of Systems & Enterprises at Stevens Institute of Technology. A former Fulbright Scholar, he holds degrees from Rutgers University in Industrial Engineering (Ph.D. and M.S.) and Statistics (M.S.) and from the Universidad Nacional Autonoma de Mexico in Actuarial Science. His research efforts are currently focused on the reliability analysis and optimization of complex systems, the development of mathematical models for sensor network operational effectiveness, and the development of evolutionary optimization algorithms. In these areas, Dr. Ramirez-Marquez has conducted funded research for both private industry and government. He has published more than 50 refereed manuscripts related to these areas in technical journals, book chapters, conference proceedings, and industry reports. Dr. Ramirez-Marquez has presented his research findings both nationally and internationally at conferences such as INFORMS, IERC, ARSym, and ESREL. He is an Associate Editor for the International Journal of Performability Engineering, currently serves as director of the QCRE division board of the IIE, and is a member of the Technical Committee on System Reliability for ESRA.
