JOURNAL OF MULTI-CRITERIA DECISION ANALYSIS
J. Multi-Crit. Decis. Anal. 16: 179–185 (2009)
Published online in Wiley Online Library (wileyonlinelibrary.com). DOI: 10.1002/mcda.446

Prioritization of Capability Gaps for Joint Small Arms Program Using Multi-criteria Decision Analysis

IGOR LINKOV a,*, F. KYLE SATTERSTROM b and GEORGE P. FENTON c

a US Army Engineer Research and Development Center, Vicksburg, Mississippi, USA
b Harvard University School of Engineering and Applied Sciences, Cambridge, Massachusetts, USA
c Formerly of American Systems Corporation, Dumfries, Virginia, USA

ABSTRACT

Capability gap prioritization is an important task in many research and acquisition programs in military and industrial settings. We used multi-criteria decision analysis to prioritize capability gaps for small arms identified in a small-arms-specific Department of Defense (DOD) Joint Capabilities Integration & Development System study, as approved by the DOD Joint Service Small Arms Program in 2005. A criteria hierarchy was created for use in judging the importance of each gap, taking into account time frames, tasks, conditions, and standards established by military subject matter experts. Military respondents representing the US Army, Marine Corps, Navy, Air Force, Coast Guard, and SOCOM then completed an online preference survey using pairwise comparisons of each criterion, and the analytic hierarchy process was used to produce a gap prioritization for each respondent and service, as well as one overall ranking. The priority rankings have helped inform program and funding allocation decisions.

KEY WORDS: MCDA; military; capability gaps prioritization

INTRODUCTION

The Joint Service Small Arms Program (JSSAP), a group in the United States Department of Defense that coordinates hand-held weaponry across military branches, uses the Joint Capabilities Integration and Development System (JCIDS) procedural tool to identify small arms capabilities needed by the armed forces. As part of the Functional Needs Analysis step of this process, JSSAP had identified the current and future weapons capabilities it desired, and it also determined which of those capabilities it possessed and which it did not. The desired capabilities it did not yet possess were termed 'capability gaps'. JSSAP sought a transparent and rigorous method for prioritizing these gaps for presentation in the Functional Solution Analysis step, which addresses ideas for both materiel and non-materiel approaches to gap mitigation.

Many different approaches are available for conducting a capability gap prioritization. A subjective (i.e. gut feeling) prioritization would be easy to carry out, but it would not be rigorous, transparent, or reliable. An ad hoc weighting could be conducted, but it would require arbitrary weighting of multiple criteria, and it would be difficult to modify or adjust for specific service branches. Multi-criteria decision analysis (MCDA), on the other hand, is a transparent, state-of-the-art tool that can be tailored to fit specific decision constraints and allows visualization of the differences among different services' or individuals' opinions. It is more effort-intensive than the other options but was chosen for gap prioritization in this study because of its rigour and structure.

Seventy-two unclassified capability gaps were identified in the JCIDS study across three different time frames: Near Term (the next 8 years), Mid Term (8–15 years in the future), and Far Term (greater than 15 years in the future). The capability gaps each related to one of seven specific tasks that JSSAP desires its soldiers have the ability to accomplish: the ability to communicate by receiving and transmitting data; to neutralize enemies using lethal or non-lethal force; to suppress enemies at given distances; to breach walls by creating holes in them; to defend themselves at short range; to avoid detection; and to track and surveil targets by tagging and marking them. Prioritization of the gaps was necessary to facilitate funding allocation decisions, as creating a more efficient defense spending process is an important national security issue (Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics, 2009).

*Correspondence to: US Army Engineer Research and Development Center, 3909 Halls Ferry Road, Vicksburg, Mississippi 39180, USA. E-mail: Igor.Linkov@usace.army.mil

Received 21 April 2010
Accepted 10 May 2010

METHODS

General approach
MCDA was used to conduct the prioritization because it is a transparent and flexible way to impart structure to the decision-making process. MCDA encompasses a variety of techniques, which generally consist of four steps: (1) creating a hierarchy of criteria relevant to the decision at hand, for use in evaluating the decision alternatives; (2) weighting the relative importance of the criteria; (3) scoring how well each alternative performs on each criterion; and (4) combining scores across criteria to produce an aggregate score for each alternative. Although the goal of the process is often to select the single alternative with the highest score, these scores can also be used to put the alternatives in order of priority. Most MCDA methodologies share similar steps 1 and 3 but diverge in their procedures for steps 2 and 4 (Yoe, 2002). A detailed analysis of the theoretical foundations of different MCDA methods and their comparative strengths and weaknesses is presented in Belton and Stewart (2002).
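To make steps (2)-(4) concrete, the following is a minimal, purely illustrative sketch in Python with hypothetical criteria, weights, and scores (none of these values come from the study, and the study itself used AHP, described below, rather than this simple additive model):

```python
# Step (2): hypothetical importance weights for three criteria (summing to 1).
weights = {"lethality": 0.5, "effective_range": 0.3, "carry_weight": 0.2}

# Step (3): hypothetical 0-1 performance scores for two alternatives on each criterion.
scores = {
    "alternative_A": {"lethality": 0.9, "effective_range": 0.4, "carry_weight": 0.7},
    "alternative_B": {"lethality": 0.6, "effective_range": 0.8, "carry_weight": 0.5},
}

# Step (4): combine into one aggregate score per alternative (weighted sum).
aggregate = {name: sum(weights[c] * s[c] for c in weights) for name, s in scores.items()}

# The aggregate scores can be used to rank the alternatives in priority order.
priority_order = sorted(aggregate, key=aggregate.get, reverse=True)
print(aggregate)        # alternative_A ~0.71, alternative_B ~0.64
print(priority_order)   # ['alternative_A', 'alternative_B']
```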

The analytic hierarchy process (AHP) (Saaty, 1994) was implemented in this study. AHP is capable of handling problems with many decision-makers, alternatives, and criteria. Similar to other MCDA methods, AHP uses an optimization algorithm to describe the merit of each option on a single numerical scale. Scores are calculated from the performance of alternatives with respect to individual criteria and then aggregated into an overall score. AHP prioritizes the decision alternatives based on their objective function values, from highest to lowest. These scores are often highly dependent on the weightings of the criteria and sub-criteria, and AHP determines weightings for these criteria in a distinctive way.

In AHP, all individual criteria underneath the same tree node are compared against each other in pairwise fashion and given relative preference scores. These quantitative scores are compiled in matrix form. For example, in examining the tasks associated with small arms, AHP would require the decision-maker to answer questions such as, 'With respect to small arms, which task capability is more important, neutralizing a target or avoiding detection?' A numerical scale is used to compare the choices, and AHP moves systematically through all pairwise comparisons of criteria and alternatives. AHP thus relies on the supposition that humans are more capable of making relative judgments than absolute judgments.

To prioritize the small arms capability gaps, our implementation of the AHP involved the following steps:

- A criteria hierarchy was created.
- Military respondents completed an online preference survey.
- Survey responses were used to weight criteria for each respondent.
- Consistency ratios were calculated.
- Gaps were prioritized for each respondent, based on the weighted criteria.
- The geometric mean of survey rankings was used to produce one ranking for each service branch of the armed forces.
- The geometric mean of service rankings was used to produce one overall ranking.

Criteria hierarchy
The hierarchy was based on military goals and guidance. The overall goal of the project was to prioritize small arms capability gaps, and each of the four service branches of the armed forces (together with Special Operations Command and the US Coast Guard) contributed equally to this process. Beneath the level of the service branches, the hierarchy consisted of different time frames (Near Term, Mid Term, and Far Term), seven small arms tasks within each time frame (Receive & Transmit Data, Neutralize, Suppress, Breach, Personal Defense, Avoid Detection, and Tag & Mark), and specific measures relating to each task. Military criteria specific to each measure had been mapped against current and programmed capabilities to identify specific capability gaps. Where applicable, battlespace ranges (i.e. spatial distances) were also taken into account. A basic outline of the hierarchy structure is given below (Figure 1). The full hierarchy is given in the Supplementary Information.1

Online preference survey
After constructing the hierarchy, the next step was to give importance weightings to the items on each of its levels. We created a preference survey and posted it online. Survey respondents from each branch of the military then gave numerical value judgments for the pairwise comparisons of all comparable time frames, tasks, measures, and criteria in the hierarchy. Possible numerical judgments ranged from 1 (equal importance) to 9 (indicating that the item in the comparison which received the 9 was extremely more important than the other). Only complete surveys were used in the final analysis (although subsequent analysis showed that inclusion of the usable parts of incomplete surveys would have caused negligible change in the overall results). Definitions for the items being compared were available by holding one's mouse pointer over the terms in question.

Analytic hierarchy process
After receipt of all surveys, weightings were produced for each respondent using the AHP. As described above, in AHP the decision-maker does not give importance weightings directly; rather, the category weightings are derived from a series of relative judgments. In the example below, the decision-maker has input three relative judgments in the form of weighting ratios. He has, for example, weighted capability gaps in the Near Term as extremely more important than those in the Far Term (Table I).

From input judgments such as those above, AHP derives normalized weightings for the three criteria. It does this by filling out the matrix as shown below (Table II), finding the largest eigenvalue of the matrix, finding the corresponding eigenvector, normalizing it, and using the values it contains as weights for the items being compared (Table III). We implemented this process in Microsoft Excel.
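As an illustrative sketch of that eigenvector calculation (the authors used Microsoft Excel; NumPy is used here only to show the arithmetic), the weights of Table III can be reproduced from the filled-in matrix of Table II:

```python
import numpy as np

# Pairwise comparison matrix from Table II; the lower triangle holds reciprocals.
A = np.array([
    [1.0, 8.0, 9.0],     # Near Term vs. Near, Mid, Far
    [1/8, 1.0, 6.0],     # Mid Term
    [1/9, 1/6, 1.0],     # Far Term
])

# Find the largest eigenvalue and its eigenvector.
eigenvalues, eigenvectors = np.linalg.eig(A)
principal = np.argmax(eigenvalues.real)
weights = eigenvectors[:, principal].real

# Normalize the eigenvector so the weights sum to 1.
weights = weights / weights.sum()
print(np.round(weights, 3))  # approximately [0.780, 0.170, 0.050], as in Table III
```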

The importance weighting process was repeated for every group of tasks, measures, and criteria in the hierarchy. Because gaps correspond to specific unmet criteria, the gaps were prioritized based on the importance weightings of their corresponding criteria. These importance values depended on the weights assigned to the relevant time frame, task, measure, and criterion. All of these weights were multiplied together to obtain one final value: the value of the objective function. Where gaps involved specific battlespace ranges, a prioritization of battlespace ranges for that task was also used as a factor in the weighting process.
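As a minimal sketch of that multiplication with hypothetical weights (the real values sit in the full hierarchy given in the Supplementary Information), a gap's objective function value is simply the product of the weights along its path through the hierarchy:

```python
# Hypothetical weights along one gap's path through the hierarchy.
time_frame_weight = 0.780   # e.g. Near Term
task_weight       = 0.25    # e.g. Neutralize, among the seven tasks
measure_weight    = 0.40    # one measure under that task
criterion_weight  = 0.60    # the unmet criterion corresponding to the gap
range_weight      = 0.30    # battlespace-range factor, used only where applicable

# Objective function value for the gap; gaps are ranked from highest to lowest value.
gap_value = (time_frame_weight * task_weight * measure_weight
             * criterion_weight * range_weight)
print(round(gap_value, 4))  # 0.014
```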

Gaps were prioritized in three different ways:

(1) within each time term (a ranking of the 19 Near-Term gaps, for example);
(2) across all time terms (a full ranking of all 72 gaps); and
(3) collected into categories (a ranking which aggregated the importance of similar gaps across different time terms).

Figure 1. Outline of decision hierarchy. Levels, from top to bottom: Goal (Prioritization), Service Branches, Time Frames, Tasks, Measures, Criteria. Gaps correspond to specific criteria within a specific Time Frame.

Table I. Relative importance weightings, in the ratio form of row element/column element

Time terms    Near Term    Mid Term    Far Term
Near Term     --           8.0         9.0
Mid Term      --           --          6.0
Far Term      --           --          --

Table II. Relative importance weightings with all row and column elements filled in for AHP analysis

Time terms    Near Term    Mid Term    Far Term
Near Term     1            8           9
Mid Term      1/8          1           6
Far Term      1/9          1/6         1

Table III. Importance weightings for time terms based on example survey responses

Time term     Weighting
Near Term     0.780
Mid Term      0.170
Far Term      0.050

1 Supporting information may be found in the online version of this article.


Prioritizations were produced for each survey respondent. The Army provided 13 complete survey responses, whereas the Marine Corps provided four. The Air Force, Coast Guard, Navy, and Special Ops Command each provided one. For services with multiple respondents, a gap's rank in the service's prioritization was based on the geometric mean of its rank in each of that service's contributing rankings (Saaty, 1994). Similarly, a gap's overall rank was determined by the geometric mean of its rank in each of the services' prioritizations.
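A sketch of this aggregation step, using hypothetical ranks for three gaps from four respondents of one service (the same geometric-mean formula explains the ties in Table IV, e.g. the Near-Term gaps 39 and 52 both averaging to 2.09273):

```python
import numpy as np

# Hypothetical ranks of three gaps (columns) from four respondents (rows) of one service.
ranks = np.array([
    [1, 3, 2],
    [2, 1, 3],
    [1, 2, 3],
    [3, 1, 2],
], dtype=float)

# Geometric mean of each gap's rank across the service's respondents.
geo_mean = np.exp(np.log(ranks).mean(axis=0))

# The service's prioritization orders gaps by ascending geometric mean rank;
# the same procedure is then applied across the six service rankings.
order = np.argsort(geo_mean)  # gap indices, best (lowest geometric mean) first
print(np.round(geo_mean, 3), order)
```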

Ad hoc prioritization
For the sake of comparison with the decision analysis results, an ad hoc gap prioritization was also conducted. As the final part of the online preference survey, each respondent was asked to give a gut-feeling importance value from 1 (least important) to 9 (most important) to each gap. Rankings were then produced for each service based on the average of respondents' importance values, and an overall ranking was produced by averaging each of the services' importance values. A ranking by category was not conducted using this method.
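By way of contrast, a sketch of the ad hoc aggregation with hypothetical gut-feeling values (again, not data from the study):

```python
import numpy as np

# Hypothetical 1-9 gut-feeling importance values for three gaps (columns)
# from two respondents (rows) of one service.
values = np.array([
    [9, 4, 7],
    [8, 5, 6],
], dtype=float)

# Service-level importance is the plain average across that service's respondents.
service_avg = values.mean(axis=0)

# Higher average importance means higher priority in the ad hoc ranking.
order = np.argsort(-service_avg)  # gap indices from most to least important
print(service_avg, order)
```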

RESULTS AND DISCUSSION

Within individual services
The Army was the only service with a large (greater than four) number of survey respondents. The two figures below compare the ranking a gap received from different Army individuals with the ranking that gap received in the Army consensus. Although there is clear variation between individual respondents, the graphs show that there is also a definite trend in the gap rankings, implying that the consensus ranking for the Army is likely a robust one. These graphs also indicate that a larger number of survey respondents from the other service branches would have led to a more service-representative consensus (Figures 2 and 3).

Overall ranking
The top-ranked gap overall (Table IV), as calculated by the AHP, is the Near-Term gap 39; it was ranked number 1 by the Army and USAF and number 2 by the USMC and SOCOM. The other top-ranked gaps are shown below.

Figure 2. Near Term Gaps: comparison of Army individuals' gap rankings with the consensus Army gap ranking (x-axis: Consensus Army Ranking; y-axis: Individual Army Ranking).

Figure 3. All Gaps: comparison of Army individuals' gap rankings with the consensus Army gap ranking (x-axis: Consensus Army Ranking; y-axis: Individual Army Ranking).

Across multiple services
The two graphs below compare Army and Marine Corps gap rankings with the overall consensus ranking (see the Supplementary Information for comparisons for the other services as well). The Army is the service branch whose final ranking was closest to the overall rankings; the relatively large number of Army respondents likely helped the service reach this consensus. Interestingly, the Marine Corps, the only other service with more than one survey respondent, produced the rankings least similar to the overall consensus. In other words, the Marine Corps seems to disagree with the prioritizations of the other services. An inspection of the rankings reveals that this effect is due to the Marines' high prioritization of gaps in the Mid and Far Terms, compared with the other service branches' high prioritization of gaps in the Near Term. Looking within each time term, the Marines' preferences are in fact quite similar to the consensus prioritization. It is only when ranking gaps in different time terms relative to one another that the Marines differ noticeably from the other service branches (Figures 4 and 5).

Table IV. Capability gap prioritization results, based on decision analysis, in the Near Term, Mid Term, Far Term, and overall

Rank   Gap   Term   Army   Navy   SOCOM   USAF   USCG   USMC   Geo Mean

Near-Term gaps
1      39    Near   1      3      2       1      7      2      2.09273
1      52    Near   2      2      3       7      1      1      2.09273
3      55    Near   3      8      1       2      2      4      2.69601
4      27    Near   8      1      4       6      12     8      5.13959
5      42    Near   4      4      9       3      8      9      5.60793

Mid-Term gaps
1      40    Mid    1      1      2       1      5      2      1.64755
2      53    Mid    2      4      5       5      1      1      2.41827
3      56    Mid    5      5      3       2      2      6      3.48775
4      62    Mid    3      2      4       7      4      3      3.55425
5      61    Mid    9      3      1       6      8      7      4.56685

Far-Term gaps
1      41    Far    1      1      2       1      5      1      1.4678
2      63    Far    3      2      3       4      1      3      2.44949
3      54    Far    2      4      4       9      2      2      3.23774
4      57    Far    6      10     1       5      3      6      4.18857
5      33    Far    7      5      5       2      11     4      4.98793

Overall
1      39    Near   1      3      2       2      7      10     3.07171
2      52    Near   2      2      3       17     1      5      3.17273
3      55    Near   3      11     1       6      2      20     4.46465
4      41    Far    7      13     13      1      20     3      6.43462
5      40    Mid    5      4      5       22     23     6      8.19817

Gap categories
Rank   Army   Navy   SOCOM   USAF   USCG   USMC   Geo Mean
1      1      1      2       1      6      2      1.69838
2      2      2      3       8      1      1      2.13983
3      5      9      1       3      2      9      3.66664
4      10     10     7       2      11     4      6.28439
5      3      4      11      5      6      21     6.60672

Numbers represent the ranking for each service.

Time horizons
Within the decision analysis results, the different valuation of time terms produces an interesting effect. Looking at the 'All Gaps' rankings (Supplementary Information), five of the six service branches had rankings of their own that were highly correlated with the final results. This means that, for the most part, there was widespread agreement among five of the six services as to the overall prioritization. As discussed above, the divergence of the Marine Corps was primarily due to its relatively larger emphasis on the Mid and Far Terms and relatively smaller emphasis on the Near Term. Within each time term, results for all six services are quite similar.

The decision analysis results are also similar when comparing one time horizon to the next. The time frames might have afforded respondents the means to predict large changes in small arms capability needs in the next 15+ years, but this did not come out in the results. Gaps that are a high priority in one time frame are, for the most part, also a high priority in the next, and likewise for mid- and low-priority gaps. For this reason, it is also not surprising that the gap category rankings are similar to the rankings produced for each time term (a complete breakdown of which of the 72 gaps fall into which category is given in the Supplementary Information).

Figure 4. Army Rankings vs. Overall Rankings: comparison of Army gap rankings with the overall consensus ranking (x-axis: Overall Ranking; y-axis: Army Ranking).

Figure 5. USMC Rankings vs. Overall Rankings: comparison of Marine Corps gap rankings with the overall consensus ranking (x-axis: Overall Ranking; y-axis: USMC Ranking).

Ad hoc prioritization
The ad hoc results are quite different from the decision analysis results. Although the top few gaps are ranked highly by both methods, other rankings differ significantly (e.g. Figure 6). One explanation for this is a relative over-valuing of gaps in the Transmit and Receive task by the ad hoc rankings, as the decision analysis hierarchy takes into account the fact that each of these gaps represents a much smaller portion of an overall task than do gaps beneath narrower tasks such as Suppression. Another likely factor is the ad hoc under-valuing of non-lethal gaps: when asked for a gut feeling, survey respondents often gave lower importance values to non-lethal weapons than were calculated based on their decision analysis responses.

Figure 6. Comparison of decision analysis vs. ad hoc rankings for the Marine Corps.

Uncertainties/deficiencies
It should be noted that the decision analysis procedure implemented in this project did have uncertainties and deficiencies. For example, the hierarchy was not sufficiently resolved to distinguish between every single gap, allowing ties. It also had one additional level of resolution under the Transmit and Receive task, possibly splitting up importance weights by an additional fraction and causing these particular gaps to be seen as relatively less important than other gaps. This particular issue was noticed at the outset of the project, and it was decided not to change the hierarchy because (a) the hierarchy was based on current military doctrine and (b) perhaps these gaps warranted smaller weights, given that the extra hierarchy level reflects that each gap is a smaller portion of the overall task than are gaps in other tasks (compared with, say, Breach, where the single gap in each time frame basically amounts to an inability to perform the entire task).

We also did not track the consistency of individuals' pairwise comparison survey responses. It is quite possible for AHP survey respondents to say that A is twice as important as B, B is twice as important as C, and C is twice as important as A. This deviation from transitivity is often tracked in AHP using a measure called 'inconsistency'. We did not calculate this value, and consequently we did not identify individual survey respondents who may have been inconsistent and hence would have benefited from further clarification on survey mechanics. Additionally, AHP itself is often criticized for not being as rigorous in its weighting procedures as other decision analysis methods, but, as mentioned in the Introduction, it was easy to implement and allowed incorporation of many military opinions, and we felt that this trade-off was acceptable.
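For reference, the consistency check described above is usually computed as Saaty's consistency ratio; a sketch of how it could have been applied to each respondent's comparison matrices (it was not computed in this study) is shown below, using the example matrix of Table II:

```python
import numpy as np

def consistency_ratio(A: np.ndarray) -> float:
    """Saaty's consistency ratio for an n x n positive reciprocal comparison matrix."""
    n = A.shape[0]
    lambda_max = np.max(np.linalg.eigvals(A).real)   # principal eigenvalue
    ci = (lambda_max - n) / (n - 1)                  # consistency index
    # Saaty's random consistency index values for n = 1..9.
    ri = [0.0, 0.0, 0.58, 0.90, 1.12, 1.24, 1.32, 1.41, 1.45][n - 1]
    return ci / ri if ri > 0 else 0.0

# Example matrix from Table II; a ratio above roughly 0.1 is conventionally taken
# as a sign that the respondent's judgments may need revisiting.
A = np.array([[1, 8, 9], [1/8, 1, 6], [1/9, 1/6, 1]], dtype=float)
print(round(consistency_ratio(A), 2))
```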

CONCLUSIONS

The MCDA, implemented as AHP, was used to prioritize the capability gaps for the JSSAP. Although the decision analysis results differ from the ad hoc rankings, this was to be expected: too many considerations need to be held in mind at once for a respondent to give importance values off the top of his or her head for each gap. Instead, AHP was used to break down a complicated decision problem into manageable chunks, and from there to build back up to a final prioritization. The results reflect the importance of different time horizons: the prioritizations within each time horizon are consistent across services, and they are also thematically similar to prioritizations in other time horizons.

AHP was selected for the prioritization after an assessment of available resources, time constraints, and the specifics of the gap prioritization task. AHP was chosen because (i) the limited length of the survey/interview required to elicit decision-makers' values allowed many military decision-makers to input their opinions; (ii) AHP's relatively low time and software resource requirements allowed the gap prioritization to be conducted quickly; and (iii) AHP mathematically and transparently structured the decision process, including in this case a hierarchy of decision criteria corresponding to JSSAP guidance.

ACKNOWLEDGEMENTS

We would like to thank Dr. Edmund Crouch of Cambridge Environmental for review and stimulating discussions. This study was funded by the Joint Service Small Arms Program via American Systems Corporation. The views and opinions expressed in this paper are those of the individual authors and not those of the US Army or other sponsor organizations.

REFERENCES

Belton V, Stewart T. 2002. Multiple Criteria Decision Analysis: An Integrated Approach. Kluwer Academic Publishers: Boston, MA.

Office of the Under Secretary of Defense for Acquisition, Technology, and Logistics. 2009. Creating a DOD Strategic Acquisition Platform. Washington, DC 20301-3140. Available at: http://www.dtic.mil/cgi-bin/GetTRDoc?AD=ADA499566&Location=U2&doc=GetTRDoc.pdf.

Saaty TL. 1994. Fundamentals of Decision Making and Priority Theory with the Analytic Hierarchy Process. The Analytic Hierarchy Process Series, Vol. VI. RWS: Pittsburgh, PA.

Yoe C. 2002. Trade-Off Analysis Planning and Procedures Guidebook. Prepared for the Institute for Water Resources, U.S. Army Corps of Engineers.
