
AFHRL-TR-79-23

AIR FORCE HUMAN RESOURCES LABORATORY

MAINTENANCE TRAINING SIMULATORS DESIGN AND ACQUISITION:
SUMMARY OF CURRENT PROCEDURES

By
George R. Purifoy, Jr.
Eugene W. Benson

Applied Science Associates, Inc.
Box 158
Valencia, Pennsylvania 16059

TECHNICAL TRAINING DIVISION
Lowry Air Force Base, Colorado 80230

November 1979
Interim Report for Period July 1978 - February 1979

Approved for public release; distribution unlimited.

AIR FORCE SYSTEMS COMMAND
BROOKS AIR FORCE BASE, TEXAS 78235

NOTICE

When U.S. Government drawings, specifications, or other data are used for any purpose other than a definitely related Government procurement operation, the Government thereby incurs no responsibility nor any obligation whatsoever, and the fact that the Government may have formulated, furnished, or in any way supplied the said drawings, specifications, or other data is not to be regarded by implication or otherwise, as in any manner licensing the holder or any other person or corporation, or conveying any rights or permission to manufacture, use, or sell any patented invention that may in any way be related thereto.

This interim report was submitted by Applied Science Associates, Inc., Box 158, Valencia, Pennsylvania 16059, under contract F33615-78-C-0019, project 2361, with Technical Training Division, Air Force Human Resources Laboratory (AFSC), Lowry Air Force Base, Colorado 80230. Dr. Edgar A. Smith (TT) was the Contract Monitor for the Laboratory.

This report has been reviewed by the Information Office (OI) and is releasable to the National Technical Information Service (NTIS). At NTIS, it will be available to the general public, including foreign nations.

This technical report has been reviewed and is approved for publication.

MARTY R. ROCKWAY, Technical Director
Technical Training Division

RONALD W. TERRY, Colonel, USAF
Commander

Unclassified

REPORT DOCUMENTATION PAGE

Report Number: AFHRL-TR-79-23
Title: Maintenance Training Simulators Design and Acquisition: Summary of Current Procedures
Type of Report and Period Covered: Interim Report, July 1978 - February 1979
Authors: George R. Purifoy, Jr.; Eugene W. Benson
Contract or Grant Number: F33615-78-C-0019
Performing Organization: Applied Science Associates, Inc., Valencia, Pennsylvania 16059
Controlling Office: HQ Air Force Human Resources Laboratory (AFSC), Brooks Air Force Base, Texas 78235
Report Date: November 1979
Monitoring Agency: Technical Training Division, Lowry Air Force Base, Colorado 80230
Security Classification: Unclassified
Distribution Statement: Approved for public release; distribution unlimited.
Key Words: acquisition; simulation; computer-assisted instruction; technical training; maintenance simulators; training; maintenance training; training devices

Abstract: This technical report is the first in a series that will explore the problems of maintenance training simulation design and acquisition. It is focused on the existing procedures followed by Air Force personnel in performing Instructional Systems Development (ISD) analyses to define maintenance training equipment requirements, and by System Program Office (SPO) Training Equipment Acquisition Managers in accomplishing training equipment procurement. Later reports in this series will structure appropriate functional specifications for the acquisition of maintenance training simulators, will present handbooks to guide ISD analysts in selecting appropriate types of maintenance training equipment and in designing and documenting required maintenance training simulator characteristics and features, and will guide SPO Acquisition Managers in preparing Prime Item Specifications. In this report both the ISD and SPO procedures are described as they are currently accomplished. Relevant documentation is cited and a comprehensive bibliography is included. For each of the two sets of procedures, a general decision model is presented as a reference, and general problem areas which appear to be degrading the ultimate cost-effectiveness of maintenance simulators are discussed.

PREFACE

This report was prepared by Applied Science Associates, Inc.

(ASA), Valencia, Pennsylvania, as an Interim Technical Report under Air

Force Contract No. F33615-78-C-0019. Mr. George R. Purifoy, Jr. is the

Principal Investigator and Project Director. The Air Force Human

Resources Laboratory (AFHRL), Technical Training Division, Lowry Air

Force Base, Colorado, is the sponsor. Dr. Edgar A. Smith is the

Project Engineer.

This study is one of a series of related studies under the

Technical Training Division's Project 2361, Simulation for Maintenance

Training. Project 2361 is an advanced development program to develop,

demonstrate, test, and evaluate selected applications of computer-based

simulation for Air Force maintenance training. The objective of this

program is to build baseline knowledges about techniques, procedures, and principles necessary for broad applications of simulation in

maintenance training. Simulator training devices are being fabricated

and demonstrated in an operational training environment in order to

establish cost, reliability, and training effectiveness information.

These data will contribute to a determination of training value factors

for eventual Air Force use. Demonstration of the training/cost-effec-

tiveness of simulation techniques, coupled with analyses of effective

simulation management tools, will provide the necessary empirical data

to develop model specifications, design user handbooks, and to prepare

life cycle management guides for the effective utilization of simu-

lation in maintenance training.

Summarized in this report is the process currently in use by the

Air Force to achieve design and acquisition of maintenance training

simulators. Responsibilities for all portions of the process are

defined, specific procedures followed by the Instructional Systems

Development (ISD) team analysts in deriving training equipment design

requirements, and by the System Project Office (SPO) Training Equipment

Acquisition Manager in procurement are detailed, and existing problems

related to training equipment acquisition are discussed.

The authors also wish to acknowledge the assistance and cooperation of the many individuals who contributed information and critiqued ideas. From ASA: Dr. John Folley, William Pieper, and Thomas Elliott. From the Air Force:

Dr. Edgar A. Smith, AFHRL/TT, Lowry AFB
Major Dennis Downing
Mr. Gary Miller
Dr. Joe Yasutake
Dr. Marty Rockway
Col. Richard Shelton
Dr. Gary A. Klein, AFHRL/ASR, WPAFB
Mr. Gerald C. Carroll, ASD/ENETT
Dr. Richard J. Schiffler, ASD/ENECH
Mr. Larry J. Ivey, ASD/ENECH
Mr. Bob Swab, ASD/SD24
Capt. Richard Jeffries, ASD/SD24
Mr. Marty Bainbridge, ASD/SD24
Lt. Col. John Rutledge, ASD/F-15 SPO
Major Ed Danber, ASD/F-16 SPO
Capt. Steve Johnson, ASD/ATC Resident Office and EF-111A SPO
Mr. Hilton D. Goldman, ATC/XPT, Randolph AFB
Capt. Gary Patterson, 3700 TCHTW, Sheppard AFB
Lt. Col. Peter W. Stoughton, 3306 TES/ATF, Edwards AFB
MSgt. Dennis Kox
SSgt. Al Behm
MSgt. Ed MacKay
TSgt. Bob Harrington
Mr. Forrest Wolferd, SAMSO/MNTP, Norton AFB
Major Ned Gow, 4315 CCTS/CMC, Vandenberg AFB
Capt. Bernard Garfinkel
Capt. Bill Oakley
Maj. David G. Hamlin, 3901 SMES/MBT
MSgt. Dale Lord, 4200 TES, Castle AFB
TSgt. Steve McKinney
Major Daryle D. Cook, 4017 CCTS
Capt. Bill Cole, FTD-533, Hill AFB
TSgt. Doug Farmer
TSgt. Sam Fulton

TABLE OF CONTENTS

SECTION I. INTRODUCTION
    The General Problem
    Definitions

SECTION II. MAINTENANCE TRAINING EQUIPMENT PROCUREMENT PROCESS SUMMARY
    Phase I. Identification of Requirements
    Phase II. Development of Specifications
    Phase III. Procurement

SECTION III. ISD PROCEDURES FOR TRAINING EQUIPMENT DESIGN
    The ISD Training Equipment Design Process
    A Training Equipment Design Process Model
    Major Problem Areas

SECTION IV. THE SPO TRAINING EQUIPMENT ACQUISITION PROCESS
    Processes and Procedures
    A Training Equipment Acquisition Process Model
    Major Problem Areas

BIBLIOGRAPHY

LIST OF FIGURES

1. Classes of Training Devices
2. Major ISD Models and Their Relationships to the Procedures Used at the 3306th T&ES
3. Sample of Contractor-Provided Task Data Base (Computer Printout)
4. Data Sheet D: Maintenance and Operator Task Analysis, Page 1
5. Data Sheet D: Maintenance and Operator Task Analysis, Page 2
6. 3306 T&ES FORM 1 (TEST) Jan 79
7. 3306 T&ES FORM 2 (TEST) Jan 79
8. 3306 T&ES FORM 2b (TEST) Jan 79, Rationale Checklist
9. Media Decision Table
10. 3306 T&ES FORM 3 (TEST) Jan 79
11. Teaching Methods Selection Grid
12. Decision Sequence for Training Equipment Selection and Design
13. General Summary - Headings from Task Description Worksheet
14. Flow Chart Depicting Engineering Contributions to System Acquisition
15. Systems Requirement Analysis (SRA) Documentation and the Processes They Support
16. Maintenance Training Equipment Acquisition Procedural/Decision Sequence

LIST OF TABLES

1. Organizational Responsibility for Training Equipment Requirements
2. Training Equipment Procurement Responsibility Summary

SECTION I

INTRODUCTION

This technical report is the first in a series that will explore the problems of maintenance training simulation design and acquisition. It is focused on the existing procedures followed by Air Force personnel in performing Instructional Systems Development (ISD) analyses to define maintenance training equipment requirements, and by System Program Office (SPO) Training Equipment Acquisition Managers in accomplishing training equipment procurement. Later reports in this series will structure appropriate functional specifications for the acquisition of maintenance training simulators, will present handbooks to guide ISD analysts in selecting appropriate types of maintenance training equipment and in designing and documenting required maintenance training simulator characteristics and features, and will guide SPO Acquisition Managers in preparing Prime Item Specifications.

In this report both the ISD and SPO procedures are described as they are currently accomplished. Relevant documentation is cited and a comprehensive bibliography is appended. For each of the two sets of procedures a general decision model is presented as a reference, and general problem areas which appear to be degrading the ultimate cost-effectiveness of maintenance simulators are discussed.

The General Problem

Required maintenance capabilities for Air Force weapon systems appear to be increasing as the sophistication of weapon systems increases. At the same time, training budgets are shrinking. These factors, and the relatively short post-training careers of a high percentage of Air Force maintenance personnel, make an increase in the cost-effectiveness of maintenance training essential. The use of simulators, as a major approach to maintenance training, is assuming growing importance as one thrust toward improvement. Simulation, long an established training technique for system operators, has a number of potential benefits when applied to the teaching of system maintenance. These benefits include reduced cost, increased training equipment reliability, instructionally effective device characteristics, student and instructor safety when practicing operationally hazardous activities, and the capability for tailored hands-on practice opportunities through malfunction insertion and the creation of operationally critical and seldom encountered conditions. However, the realization of these advantages has, to date, not been spectacular. There are no formalized procedures for maintenance simulator design. This has resulted in high variability in the cost-effectiveness of current maintenance simulators.

Definitions

Classes of Training Devices: This report adopted the training device classification developed by Kinkade and Wheaton,1 in which all "arrangements of equipment components, apparatus or materials which provide conditions that help trainees learn a task" are DEVICES. Devices are then subdivided, as indicated in Figure 1, into TRAINING AIDS, which are used by instructors to present subject matter, and TRAINING EQUIPMENT, on which trainees practice job/task-related activities.

Simulator: A trainer which provides hands-on practice for aspects of the operational job which have been selected on the basis of their criticality and learning difficulty, with events/indications reproduced to the necessary degree of fidelity, generally under computer control.

Simulation: The reproduction, in a training setting, of appropriate subsets (part task/whole task/integrated task) of performance opportunities which require the same combination of mental and physical skills, and the application of the same group of knowledges, as those required on the job.

Note that most, if not all, items of training equipment simulate; but only a unique subset of them are defined as simulators.

1 Kinkade, R. G., & Wheaton, G. R. Training device design. In H. P. Van Cott & R. G. Kinkade (Eds.), Human engineering guide to equipment design. Washington, D.C.: American Institutes for Research, 1972.


Figure 1. Classes of Training Devices. (The figure divides training devices into training aids and training equipment, and characterizes training equipment by job reproduction: part task, whole task, and integrated task trainers; by equipment configuration: synthetic trainers, simulators, and actual equipment trainers; and by behavioral focus: skills, procedures, and knowledges.)
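To make the classification easier to scan, the sketch below restates the Figure 1 taxonomy as a small Python data structure. The field names and the example device are illustrative assumptions added here; only the class labels themselves come from the Kinkade and Wheaton scheme.

```python
from dataclasses import dataclass
from enum import Enum, auto
from typing import Optional

class DeviceClass(Enum):
    TRAINING_AID = auto()        # used by instructors to present subject matter
    TRAINING_EQUIPMENT = auto()  # used by trainees for hands-on practice

class JobReproduction(Enum):
    PART_TASK = auto()
    WHOLE_TASK = auto()
    INTEGRATED_TASK = auto()

class Configuration(Enum):
    SYNTHETIC_TRAINER = auto()
    SIMULATOR = auto()
    ACTUAL_EQUIPMENT_TRAINER = auto()

class BehavioralFocus(Enum):
    SKILLS = auto()
    PROCEDURES = auto()
    KNOWLEDGES = auto()

@dataclass
class TrainingDevice:
    name: str
    device_class: DeviceClass
    # The three descriptors below apply only to training equipment.
    job_reproduction: Optional[JobReproduction] = None
    configuration: Optional[Configuration] = None
    behavioral_focus: Optional[BehavioralFocus] = None

# Hypothetical example: a part-task maintenance simulator focused on procedures.
example = TrainingDevice(
    name="avionics fault isolation trainer",
    device_class=DeviceClass.TRAINING_EQUIPMENT,
    job_reproduction=JobReproduction.PART_TASK,
    configuration=Configuration.SIMULATOR,
    behavioral_focus=BehavioralFocus.PROCEDURES,
)
```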

SECTION II

MAINTENANCE TRAINING EQUIPMENT PROCUREMENT PROCESS SUMMARY

This section is an overview of the process currently followed in acquiring maintenance training equipment. It provides a framework from which to discuss the specifics of the two major components of the acquisition process: the Instructional Systems Development (ISD) team activities, which are detailed in Section III, and those of the System Program Office (SPO) Training Equipment Acquisition Manager, which are described in Section IV.

AFHRL-TR-78-28, Description of the Air Force Maintenance Training Device Acquisition and Utilization Process, provides a relatively detailed and complete description of the total life cycle for maintenance training equipment. The life cycle can be divided into five phases:

Phase I - Identification of Requirements.

Phase II - Development of Specifications.

Phase III - Procurement.

Phase IV - Utilization and Support.

Phase V - Retirement.

The focus of the current study is on the first three of these phases. Phase I outlines the responsibilities of the ISD team. Phases II and III describe the responsibilities of the SPO Training Equipment Acquisition Manager.

Unfortunately, there is not a single and uniform procedure for designing and acquiring training equipment in the Air Force. By and large, each procurement, through various mixes of organizations, follows different procedures and results in different intermediate products (requirements, specifications, etc.). The following description only illustrates the major processes.

Phase I. Identification of Requirements

In Phase I, requirements for maintenance training equipment are developed by different organizations, depending primarily upon:

1. The acquisition mode or status of the system(s) to be supported:

a. A new system (currently being procured through an Air Force Systems Command SPO).

b. A system out of acquisition (a current system for which Air Force Logistics Command has taken over program management responsibility).

c. Several systems (requiring common or generalizable training support).

2. The locus of use for the training equipment:

a. Mobile Training Sets (MTS) for Field Training Detachments (FTD).

b. Resident Training Equipment (RTE) for Technical Training Centers (TTC).

c. Training equipment for Strategic Air Command (SAC) training activities. (SAC is noted as a separate locus of use because it maintains distinctive procedures, including its own unique documentation.)

Table 1 summarizes the organizations having primary responsibility for establishing maintenance training equipment requirements for each acquisition mode and locus of use.

Table 1. Organizational Responsibility for Training Equipment Requirements. (The table cross-tabulates acquisition mode, with rows for new systems, systems out of acquisition, and several systems, against locus of use, with columns for MTS for major weapon systems, RTE and other MTS, and SAC training equipment, identifying the organization with primary responsibility in each case.)

AFR 50-11, Training: Management and Utilization of Training Devices, and AFR 50-8, Instructional Systems Development, specify that all training equipment requirements must be developed according to ISD procedures, and imply that such requirements evolve directly from the ISD process. Each of the three major requirement-setting organizations listed in Table 1 does produce requirements for maintenance trainers, and yet each accomplishes them in a unique way. The ISD process does not provide a standardized procedure.

For new systems, requirement setting is initiated very early in the System Acquisition Life Cycle (SALC). General requirements for training equipment, based on the maintenance and training concepts established for the weapon system and on past experience, are initiated for the Statement of Operational Need (SON) long before the SPO and the ISD teams are formed. While these early estimates are essential for long-range planning purposes, they are not based on definitive training analyses. This report recognizes that such training equipment requirement estimates are necessary from the initial concept stage of weapon system development forward through all of the developmental phases. However, until the ISD process is initiated, the actual training requirements which should be supported by training equipment must remain speculative. While many benefits could be gained by improvements in methods for deriving early estimates of training and training equipment requirements, this study examines the acquisition process from the point where the ISD team is established and training requirements based on formal analyses emerge.

2 The 3306th T&ES supports test (AFR 80-14) programs and performs ISD as directed by HQ Air Training Command, including major modifications to weapon systems which are out of acquisition.

Each of the three primary organizations responsible for conducting training and training equipment requirement analyses for new systems makes use of somewhat different data bases, information sources, technical resources, and analytical procedures. Each has different procedures and requirements for training analysis documentation and review. However, each organization identifies and lists maintenance tasks, and derives from them maintenance training and maintenance training equipment requirements.

The procedures followed to determine training needs for systems out of acquisition and for several systems are likewise different, depending on which of the two organizations performs this function. There is no requirement that either of these procedures (unlike those governing new systems) follow formalized ISD or other systematic training equipment requirement analysis. Generally, new training equipment requirements for these systems come from the primary Technical Training Center and are identified by instructor personnel. Typically they are based upon a need to replace or upgrade existing training equipment which is damaged or obsolete, or they are based upon instructors' intuition about the types of training equipment which would enable them to be more effective. Many of these types of training equipment requirements are developed in coordination with a system contractor and are implemented through a contractor-initiated engineering change proposal (ECP). Such an ECP for a maintenance trainer is forwarded to various organizations for approval, including the 3901st SMES/MBT's maintenance training section (the primary training evaluation organization for SAC, located at Vandenberg Air Force Base, California), where the recommendations are verified and the implications of the new maintenance trainer, in terms of its impact on organization, manning, and logistics, are reviewed.

Phase II. Development of Specifications

The output of Phase I, in whatever form it is prepared, is a statement of requirements for maintenance training equipment. Phase II translates these requirements into specifications appropriate for contractor design and fabrication. Procedures which structure the development of such specifications for maintenance simulators are currently uncertain, and are in a state of change within the Air Force. Only a few maintenance training simulators have been procured. Organizational responsibilities for the procurement process are only now emerging. The Simulator System Program Office (SIMSPO), ASD/SD24, Wright-Patterson Air Force Base, Ohio, will assume responsibility for any new maintenance simulators beyond those currently in procurement.

At the 3306th Test and Evaluation Squadron, Edwards Air Force Base, California, the ISD team provides the following training equipment requirements at the end of Phase I:

1. 3306 T&ES FORM 1 (TEST) Jan 79: A compilation of behavioral requirements for each appropriate trainer and/or training aid.

2. A proposed Course Chart.

3. A proposed Training Standard.

4. A draft Functional Specification modeled after the Prime Item Development Specification.

Training equipment requirements from the primary Technical Training Centers (TTCs) are provided on Form 601b, a requisition form used to justify the need for each item of required training equipment in terms of the Specialty Training Standard elements each will support, the associated number of students, and the number of hours of use anticipated.

Requirements from the 3901st SMES/MBT for SAC systems consist of ECPs prepared by the system contractor. These ECPs may be backed by a training requirements analysis, although to date formal ISD procedures have not been used. The ECPs specify desired changes for the missile system launch control complex which impact the simulator and/or other trainers.

After requirement approval by the using command, ATC, and the SPO, a Prime Item Development Specification is generated. Source selections, when necessary, are usually made in conjunction with ISD personnel and/or the using command originating the requirements. Often, the primary weapon system contractor will also have the responsibility for complex trainers. In these instances, development and fabrication may be subcontracted.

During equipment design, a Preliminary Design Review (PDR) and a Critical Design Review (CDR), as well as formal and informal interaction among the contractor, the SPO, and the ultimate users, shape and approve the equipment configuration. Following CDR, contractual arrangements are made for equipment fabrication and, usually, evaluation.

Phase III. Procurement

Procurement procedures for maintenance training equipment are complex, intricate, and unstandardized. Table 2 summarizes various procurement responsibilities for different kinds of maintenance training equipment.

Table 2. Training Equipment Procurement Responsibility Summary.

SECTION III

ISD PROCEDURES FOR TRAINING EQUIPMENT DESIGN

In this Section, ISD as it relates to maintenance training equipment design requirements is discussed under three major headings:

1. The process and procedures currently used by the ISD Team at the 3306th T&ES.

2. A general ISD decision model leading to the identification and design of maintenance simulators.

3. Problem areas which appear to degrade the effectiveness of currently produced training equipment requirements.

The ISD Training Equipment Design Process

AFR 50-2, Instructional Systems Development, prescribes that the ISD process is to be applied to all training planning, including that done for new weapon systems. Techniques for general application are described in AFP 50-58, Handbook for Designers of Instructional Systems, Volumes I-V, and are taught by ATC principally in the Instructional System Designers course given by the 3700th Technical Training Wing, Sheppard AFB, Texas. In addition, Interservice Procedures for ISD have been developed, as well as numerous other versions of ISD tailored by specific civilian and military organizations to their individual needs and preferences.

In general, all of these ISD adaptations have in common a flow of analytical and developmental procedures which start with an analysis of the activities for which training is required and proceed to:

1. Determine training requirements.

2. Establish training objectives and their sequence.

3. Develop performance measurement techniques.

4. Select appropriate methods and media.

5. Develop instructional materials.

6. Conduct and evaluate the instructional program.

While the specific nature of the tasks themselves determines the particular characteristics of the developed curriculum and of the training equipment needed to provide necessary hands-on practice of the relevant tasks, the ISD approach used is basically the same for all applications, for both new and existing systems, as well as for I-Level and O-Level tasks.

The 3306th T&ES is currently applying Air Force ISD procedures to the development of maintenance training and training equipment for new systems. This organization has a core of highly experienced ISD team personnel and has evolved an adaptation of the general ISD model that has been singularly successful in meeting Air Training Command/Air Force Systems Command (ATC/AFSC) requirements for new system maintenance training. Their ISD expertise is unique in the Air Force and their process is well documented.3 For these reasons this Section reviews the procedures of the 3306th T&ES as they relate to the prescription of maintenance training and the design of maintenance training equipment.

Figure 2 illustrates the general relationships between the 3306th's ISD process and the procedural steps or phases outlined in both AFP 50-58 and the Interservice Procedures for Instructional Systems Development. The arrows in this figure connect each step or phase of the two cited ISD procedures to the most similar general step or specific procedure of the 14-step 3306th process. Arrows crossing other arrows indicate activities which are done in different sequences in the two processes.

Interviews, in addition to those held with the 3306th T&ES on new systems, were held with ISD training development groups at a number of locations where production and upgrading of training for existing systems are done.4 At none of these other locations was any "by the book" ISD analysis used in formulating requirements and/or designs for training equipment. Interviewees reported that formal ISD is generally done only for new systems. Factors influencing the use of less analytical methods for existing systems include:

3 3306th Test and Evaluation Squadron, Mission Handbook, ATC, June 1979.

4 See listing in the PREFACE.

Figure 2. Major ISD Models and Their Relationships to the Procedures Used at the 3306th T&ES.

1. Instructors' lack of ISD experience and training, and biases against formal analyses.

2. Insufficient time available for conducting formal analyses.

3. Lack of a comprehensive task data base.

4. Lack of formal ISD procedural guidance for developing training equipment.

Manning and Training of ISD Teams

The 3306th has a small cadre of experienced ISD analysts. Their assignments typically are new systems which are being developed in the "fly before buy" mode. Their involvement with these systems usually begins on or about the time a prototype is assigned to Edwards AFB for test and evaluation. At times, the 3306th becomes the location of ISD teams for systems not yet fully developed to the prototype stage and/or not assigned to Edwards AFB for testing. In all instances, personnel are assigned to the ISD team primarily on the basis of their selection as future instructors for the initial operational units; a small percentage is retained as squadron cadre. The operating philosophy is that it is more effective to select and train individuals to be ISD analysts who are already experienced instructors and Subject Matter Specialists (SMS) than it is to select experienced analysts and attempt to make of them systems specialists and competent instructors. The advantages of this approach include:

1. Experienced instructors generally make good analysts, since they bring to the ISD process first-hand experience with training and the knowledge of the types of training techniques which have worked for them most effectively in the past.

2. As SMSs, they can quickly learn the specifics of a new system and can better assure that all system-unique training requirements are included in the developmental process.

3. Experienced instructors are able to quickly learn appropriate ISD procedures. Without previous ISD training and experience, analysts do not come with entrenched ISD notions which differ from those found effective by the 3306th.

4. The 3306th T&ES has been able to tailor ISD training for incoming instructors to specific procedures that have been found to work best. By imposing strict documentation requirements within the ISD procedure, the squadron achieves standardization in the developmental process and maximizes the traceability of training development decisions.

and maxiizte,, tile traceahit tv o-f training developl-me ut dect1s tmis.

5. Instructors are able to take detailed knowledge of the system and of their own training materials with them when they go to their operational site. Consequently, they have the best possible preparation for fitting the course materials to the specific training situation they encounter and for making continuing updates as the system evolves.

There are several disadvantages to these procedures:

1. A large portion of the trained and experienced ISD analysts leave the squadron when they assume their duties as the initial instructor personnel for a new operational system. This requires the squadron to replace and retrain new people for each new system.

2. While the squadron provides a tailored training regimen for incoming SMSs, practice in the skills necessary for effective ISD occurs only on "their system." They are, thus, deprived of much useful feedback which would enable the training equipment development decisions to be more effective during subsequent applications of the ISD process. It should be noted, however, that some senior squadron personnel have had extensive ISD experience with a number of previously developed systems.

3. SMSs new to the squadron tend to continue to structure their individual training courses and the training equipment designs which they produce in ways that have been successful for them in the past. Thus, state-of-the-art training and training equipment techniques are often not employed because many instructors lack experience with, or even knowledge of, them.

ISD Procedures

The 3306th Mission Handbook previously referenced describes in detail all of the procedural steps of their ISD process. The procedures define in greater detail the steps of the general model shown in Figure 2. In the remainder of this report, the application of these procedures is described in terms of the five phases of the general model.

In practice, the general model phases are seldom implemented in strict sequential order. Often, analytic activities overlap. This is particularly true in the analysis of system requirements, where additions and modifications to the general data base are often determined after the total ISD process is well along.

Step I. Analyze System Requirements. The basic task data base for many new systems is generally provided by the system contractors. Systems designed prior to about mid-1977 were documented by a Task and Skill Analysis Report, illustrated by Figure 3. Typically, these data are in the form of a computer printout which essentially provides task and step names, and some very general information concerning task conditions, criteria for performance, and criticality. Later systems are documented by the contractor following Logistics Support Analysis (LSA) data formats. However, the computer printouts of LSA data are more difficult to read and to work with, and they currently cannot be selectively formatted by subsystem. As a result, SMSs prefer to work from the handwritten copies of the data input sheets. Three data formats are most often used:

1. Data Sheet C: Task Analysis Summary.

2. Data Sheet D: Maintenance and Operator Task Analysis.

3. Data Sheet E: Support and Test Equipment or Training Material Description and Justification.

Figures 4 and 5 illustrate the first and subsequent pages from Data Sheet D.

Seldom are any of these data bases complete. The SMS analysts must work with engineering drawings, contractor design personnel, Test Force personnel, and all existing technical data, as appropriate, to identify the specific tasks which make up the on-the-job performance requirements from which training will be derived. Often subsystems are in a state of evolution, making the identification and/or detailed description of both O- and I-Level tasks impractical. Additional tasks that are identified, and additional information describing task performance, are recorded on the 3306 T&ES FORM 1 (TEST) as illustrated in Figure 6.

Step II. Define Training Requirements. The using command for the new system should provide to the ISD team information about the Air Force Specialty Codes (AFSCs) of those who will become the new system trainees. Their previous weapon system experience is also a needed input. This information is often not available when needed in the ISD process. As a result, the analysts must proceed initially on assumptions made by the SMS until they are verified or modified by both ATC and the using command. Once the specific experience of proposed trainees is known (or assumed), training standards and SMS experience (or interviews with individuals experienced in the appropriate areas) provide the SMS with overall impressions of a profile of entry capabilities to which training must be geared. With this profile in mind, the SMS examines each step of each task and identifies those in which:

1. There is a knowledge or a skill new to the trainee.

2. Practice will be necessary to meet on-the-job performance requirements.

For each task or step meeting either or both of these criteria, there is assumed to be at least one training requirement. The 3306 T&ES FORM 2 (TEST) is used to document each task and all training requirements in terms of behaviors, teaching steps (groups of knowledges required for the learning of the primary steps), the conditions and criteria which delimit appropriate behavior, and an initial estimate of training time. Figure 7 illustrates the way in which FORM 2 is prepared. FORM 2b, Rationale Checklist (Figure 8), is used as a guide to define training requirements and to structure additional information about each requirement.
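The screening rule just described is simple enough to state mechanically. The sketch below is a minimal illustration, assuming hypothetical field names; the real FORM 2 captures considerably more detail (teaching steps, conditions, criteria, and time estimates) than this fragment does.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class TaskStep:
    task_number: str
    description: str
    new_knowledge_or_skill: bool  # criterion 1: something the trainee has not yet mastered
    practice_required: bool       # criterion 2: practice needed to meet the on-the-job standard

@dataclass
class TrainingRequirement:
    task_number: str
    behavior: str
    conditions: str = ""            # conditions that delimit appropriate behavior
    criteria: str = ""              # performance criteria
    est_training_minutes: int = 0   # initial training time estimate

def screen_steps(steps: List[TaskStep]) -> List[TrainingRequirement]:
    """Either criterion implies at least one training requirement for the step."""
    requirements = []
    for step in steps:
        if step.new_knowledge_or_skill or step.practice_required:
            requirements.append(
                TrainingRequirement(task_number=step.task_number, behavior=step.description)
            )
    return requirements
```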

Step III. Develop Objectives and Determine Media. In the process of filling out FORM 2, the SMS must make judgments concerning situations which require stimuli in addition to the instructor and Technical Orders. The situations are, in general, selected on the basis of instructor preference and insight, with the aid of a media analogram (shown in Figure 9 in the form of a decision table which represents the same set of decisions as does the logic flowchart-type analogram) and with the aid of the Rationale Checklist (Figure 8). Specific media within any of the defined classes are selected from instructor preference, from a more detailed listing of media in AFP 50-58, Volume IV, and/or from information about available training resources for the course under preparation.

Once all media have been identified in this way, the 3306 T&ES FORM 3 (TEST) is used to compile all behavioral requirements to be satisfied by each type of media which cannot be locally manufactured. The types of media for which a FORM 3 is prepared include:

1. Transparencies.

2. Slides.

3. Charts, diagrams, illustrations, drawings.

4. Models/cutaways.

5. Videotapes.

6. Trainers/simulators.

7. Actual equipment.

8. Audio.

Figure 3. Sample of Contractor-Provided Task Data Base (Computer Printout).

Figure 4. Data Sheet D: Maintenance and Operator Task Analysis, Page 1.

Figure 5. Data Sheet D: Maintenance and Operator Task Analysis, Page 2.

Figure 6. 3306 T&ES FORM 1 (TEST) Jan 79.

Figure 7. 3306 T&ES FORM 2 (TEST) Jan 79.

Figure 8. 3306 T&ES FORM 2b (TEST) Jan 79, Rationale Checklist. (The checklist is annotated with a check mark for each behavioral requirement on the 3306 T&ES Form 2 (TEST) that will be satisfied by academic instruction, to provide rationale for the training requirement. Training requirement items: 1. New knowledge; 2. New skill; 3. Practice required; 4. Cope activity; 5. Condition/Criteria; 6. Unique manipulative skills; 7. New SE; 8. New special tools; 9. Is technical data available (a. Instructions clear and easily understood; b. Operational steps in logical sequence; c. Schematics adequate for detailed troubleshooting; d. System unit locations identified). Technical training materials (TTM) items: 10. Is hands-on practice required; 11. Is air vehicle practical; 12. Is SE required; 13. Is actual equipment required; 14. Will audio only suffice; 15. Will static visual alone suffice; 16. Is static visual and audio sufficient.)


Figure 9. Media Decision Table. (The table contains ten decision rules. The conditions tested are: hands-on practice required; O-level task; aircraft is practical; actual equipment is required; audio only is sufficient; static visual alone is sufficient; 3D required; static visual and audio is sufficient. The media selected by the rules are: aircraft; trainer; actual equipment; audio recording; model/cutaway plus audio recording; sound-slide; video tapes, film, animated panel; graphics-print; model/cutaway.)
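One way to read a decision table of this kind is as a set of rules, each pairing answers to the conditions above with a selected medium, where the first rule whose tested conditions all match the task wins. The sketch below illustrates that mechanism only; the handful of rules shown are simplified assumptions for the example and are not the actual ten rules of Figure 9.

```python
# Each rule tests some of the Figure 9 conditions and names a medium.
# Conditions a rule does not test are omitted and treated as "don't care".
RULES = [
    # Illustrative rules only -- not the actual Figure 9 entries.
    {"if": {"hands_on_practice_required": True, "o_level_task": True,
            "aircraft_is_practical": True},
     "then": "aircraft"},
    {"if": {"hands_on_practice_required": True, "actual_equipment_is_required": True},
     "then": "actual equipment"},
    {"if": {"hands_on_practice_required": True},
     "then": "trainer"},
    {"if": {"hands_on_practice_required": False, "audio_only_is_sufficient": True},
     "then": "audio recording"},
]

def select_medium(task_answers: dict) -> str:
    """Return the medium named by the first rule whose tested conditions all match."""
    for rule in RULES:
        if all(task_answers.get(cond) == value for cond, value in rule["if"].items()):
            return rule["then"]
    return "instructor judgment required"  # fall back when no rule matches

# Example use with hypothetical answers for one task:
answers = {"hands_on_practice_required": True, "o_level_task": True,
           "aircraft_is_practical": True}
print(select_medium(answers))  # -> aircraft
```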

A medium item which can be manufactured locally, and which will not require a monitored procurement, is not documented on the FORM 3. Figure 10 illustrates a typical FORM 3.

Each FORM 3 (one for each medium) must also include a description of the medium itself, its physical characteristics, its content, and its function. These descriptions are usually straightforward for all media, with the exception of major trainers. When the medium is a simulator or a major piece of complex training equipment, the description on the FORM 3 is further defined and amplified by the preparation of a preliminary Functional Specification. In addition, identified media are often recorded on the SMS log (optional), with notations indicating those appropriate for local manufacture.

Figure 10. 3306 T&ES FORM 3 (TEST) Jan 79.

Step IV. Plan, Develop and Validate Instruction. Once training requirements and recommended training equipment have been established, the method of instruction is selected which is best suited to teaching the specific tasks. Tasks which generate training requirements are examined to identify categories of task-level training requirements, including:

1. Facts and definitions.

2. Concepts.

3. Principles.

4. Procedures.

5. Mental skills.

6. Psychomotor skills.

7. Attitudes.

Once task-level requirements are identified, a Teaching Methods Selection Grid (Figure 11) is used to select the most appropriate method for providing instruction on each task. Course control documents are prepared which include:

1. Course Charts, which summarize the anticipated course in terms of major items of training equipment required and the segments of training content to be included (with associated training time estimates and any other information relevant to the course instructional design).

2. Course Training Standards, which list tasks, knowledges, and proficiency codes in the preferred teaching order.

3. A Plan of Instruction (POI), which provides a relatively detailed description of the complete course by units of instruction, criterion objectives, required support materials and guidance, instructional unit duration, and appropriate Course Training Standard references. This POI becomes the lesson plan structure for the training course, and is typically personalized by each instructor using specific annotations to cue appropriate in-class instructor and/or student activity.

Figure 11. Teaching Methods Selection Grid.

Once a FORM 3 is prepared for each medium, a package of training equipment descriptive information is assembled for the Training Requirements Recommendation Review Meeting (TRRRM). The information package assembled includes:

1. 3306 T&ES FORM 3 (TEST).

2. Proposed Course Chart.

3. Proposed Course Training Standards.

4. Additional information as appropriate to substantiate the recommended design (FORM 2s, etc.).

When the package involves a major trainer, such as a maintenance simulator, it will also include a preliminary Functional Specification, as previously described. The TRRRM reviews all training and training equipment recommendations from all of the SMSs on the ISD team and consolidates recommendations and equipment requirements to derive an optimum media package. When the package is approved by ATC, it is forwarded to the SPO for procurement.

Instructional materials are prepared following the prescriptions of the course control documents and utilizing criterion objectives derived from the analyses documented on 3306 T&ES FORM 2s. Performance and/or written tests are prepared for criterion objectives, as outlined in the POI. Instructional materials, consisting of Technical Orders, programmed texts, study guides, workbooks, handouts, etc., are developed and integrated following the POI structure to form a cohesive and instructionally effective presentation. Materials are validated as they are prepared and refined prior to final course conduct.

Step V. Conduct and Evaluate Instruction. Although this step is listed as a part of the total ISD process, it typically is accomplished in a formal training environment at an operational site. Course conduct is provided by the SMSs who served as the ISD team and is done under the responsibility of the Field Training Group, Technical Training Center, or other management agency. Course materials are revised and updated as appropriate to improve their effectiveness and to reflect relevant system changes.

A Training Equipment Design Process Model

The preceding material has described the current Air Force ISD process for determining the need for, and the characteristics of, maintenance training equipment. In this section the underlying decision logic which should structure the process of proceeding from task information to simulator design characteristics, and of identifying the general classes of information which are required to support each decision set, is explored. This approach of putting into sequence major sets of decisions provides a general training equipment design process model. The model is a general one in that decision sets are described at a level which generalizes across most of the training development situations in which the ISD process could result in maintenance training simulators.

The purpose of this design process model is to lay the groundwork for forthcoming hierarchical and associative relationships between information about tasks and appropriate training equipment characteristics and training applications, by comparing it to currently used procedures. The model is an extension of existing ISD procedures and is intended to lead toward:

1. A determination of appropriate ISD procedure modifications which will cost-effectively support simulator development.

2. An identification of the procedural steps which can be effectively aided by having a reference manual (handbook) for procedural guidance and data.

3. The specification of the characteristics and specific content of appropriate ISD documentation of maintenance simulator training requirements.

A Decision Sequence

Given the procedural content and the intent of existing ISD guidance material, the following general decision areas begin at the point of determining what must be accomplished on the job and end by prescribing the best method of documenting the design, as follows:

1. Determine the required job-relevant skills and knowledges.

2. Specify those skills and knowledges which must be learned by trainees (as contrasted to those which the trainee has already mastered from previous training and/or experience).

3. Identify those skills and knowledges which can be most effectively learned, at least in part, through practice on an item of training equipment.

4. Group skills and knowledges by class or type of training equipment.

5. Specify for each training equipment type how well the associated skills and knowledges must be learned.

6. Determine the order in which specific classes of training equipment should be employed to facilitate learning.

7. Prepare descriptions of training equipment characteristics that seem to most effectively support specific learning requirements.

8. Develop a preliminary Plan of Instruction (POI) which integrates training equipment and all other appropriate media into an effective training scenario.

9. Revise and finalize the equipment design and detail all relevant functional characteristics to be utilized in the training scenario.

10. Document all equipment-related training requirements and associated training equipment functional characteristics as the principal input to the SPO acquisition process.

Figure 12 depicts this decision process and summarizes the principal informational inputs necessary to it.

Critical Features

The model represented in Figure 12 highlights a number of critical activities (decision sets) which influence the effectiveness of training equipment as it is ultimately employed in a training regimen. The model also suggests that the criticality of these activities increases with training equipment complexity, since the potential is increased for inappropriate design to significantly affect training quality and to have greater cost implications. Critical requirements for this decision model have been classified into four categories:

Figure 12. Decision Sequence for Training Equipment Selection and Design.

1. Task Analysis

a. The complete decision process represented by the model has as its foundation skills and knowledges derived from on-the-job performance requirements. These skills and knowledges, ideally, should be derived by those analysts most knowledgeable about the job performance requirement implications of various hardware designs: personnel subsystem specialists from the equipment contractor's workforce who prepare LSA data prior to the ISD effort. Even with this type of data input, opportunity must be provided in the ISD development schedule for Air Force SMSs to perform additional skill and knowledge derivation. Analogous skills and knowledges must also be available, or derivable, for the anticipated trainee population.

b. Definitive task descriptive data are required to support downstream decision-making. Categories which need to be added to the standard LSA data base (or to be more descriptively documented within existing data categories) include:

(1) Performance standards.

(2) The identification of tasks requiring major psychomotor skills.

(3) Tasks which must be done in conjunction with other tasks.

(4) Detail in the procedures for the selection of possible task alternatives (e.g., procedures to be followed in contingency situations or troubleshooting strategies appropriate to various symptom patterns).

c. Standardized task descriptive verbs are needed to increase the communication reliability of task and step descriptions, and to serve as the basis from which associative selections are made relating, for example, training equipment characteristics to types of maintenance tasks.

d. Criteria are needed to provide guidance for the realistic assessment of practice requirement implications related to various types of tasks and steps.

e. Criteria and guidance are necessary to promote accuracy and consistency in the classifications of skills and knowledges which form the basis for subsequent decisions relating to the selection and design of training equipment.

2. Training Development

a. ISD-compatible procedures are needed to encourage and structure the simultaneous design of training and trainers.

b. Useful and unambiguous criteria are needed to structure the selection of skills and knowledges which are appropriate for learning on training equipment, especially criteria which pinpoint those for which simulation is not only appropriate but essential.

c. A procedure is needed which guides the identification of the scope of training requirements which should be incorporated within any particular item of training equipment, based upon an identification of the optimum order for meeting training objectives.

3. Training Equipment Characteristics

a. Bases and principles are needed to guide the making of training equipment tradeoffs (that is, the process of recognizing that various subsets of training objectives can be effectively realized by following more than one medium approach); a toy tradeoff sketch follows at the end of this subsection. Major classes of alternatives include:

(1) Selecting equipment which permits practice of fewer tasks or of part-tasks rather than incorporating whole tasks or integrated task practice.

(2) Changes in level of fidelity.

(3) Selection of a smaller subset of practice situations which are representative (generalizable) of those needed across any set of equipment-related training requirements.


b. Guidance is needed which describes types and techniques of simulation as they relate to the need for various types of practice situations.

c. Procedures and guidance are needed to effectively relate specific simulator training objectives and their associated skill and knowledge requirements to designs of both the operational characteristics to be simulated and the instructional features appropriate to optimum learning.

d. Guidance is needed in determining the appropriate computer-generated/controlled aspects of maintenance simulation and the programming requirements which will yield appropriate degrees of flexibility in simulator employment and in determining in-house maintenance and updating of training exercises.
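To make the tradeoff idea in item 3.a concrete, the following minimal Python sketch compares hypothetical trainer configurations with a toy figure of merit; the option names, coverage counts, fidelity values, and costs are invented for illustration and do not represent any Air Force analysis or procedure.

```python
from dataclasses import dataclass
from typing import List


@dataclass
class TrainerOption:
    """A hypothetical candidate configuration for a subset of training objectives."""
    name: str
    objectives_covered: int   # objectives in the subset the option supports
    fidelity_level: float     # 0.0 (mock-up) .. 1.0 (full operational fidelity)
    est_cost_k: float         # rough acquisition cost, thousands of dollars


def coverage_per_cost(option: TrainerOption) -> float:
    """Toy figure of merit: objectives covered per unit cost.

    A real tradeoff would also weigh fidelity needs, instructional features,
    student flow, and available resources, as the text argues.
    """
    return option.objectives_covered / option.est_cost_k


candidates: List[TrainerOption] = [
    TrainerOption("Full-scope simulator", 40, 0.9, 2500),
    TrainerOption("Two part-task trainers", 32, 0.6, 900),
    TrainerOption("Procedures trainer plus audiovisual support", 25, 0.3, 300),
]

for option in sorted(candidates, key=coverage_per_cost, reverse=True):
    print(f"{option.name}: {coverage_per_cost(option):.3f} objectives per $K")
```

Ranking by a single ratio is only a starting point; the point of the sketch is that such comparisons can be made explicitly rather than by preference alone.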

4. Functional Documentation

a. ISD team documentation specifications are needed for describing training equipment (especially maintenance simulators) to initiate the SPO procurement process.

Major Problem Areas

The preceding material of this section has summarized the ISD process currently in use in the Air Force. Overlaid on this process was a general decision model structuring the design and documentation of maintenance training equipment. Contrasting this general model with information about current Air Force ISD practices has highlighted six problem areas:

1. Lack of procedural documentation.

2. ISD not fully applied in this area.

3. A priori simulator selection.

4. Insufficient ISD team training.


5. Required information not available.

6. Incomplete analyses.

These problem areas overlap and interact. For example, ISD analysts are not trained in certain areas simply because no formal procedures currently exist on which to base training. Each area is discussed here, however, to assist in conceptualizing solutions which have potential for improving the efficiency and the cost-effectiveness of the ISD process as it produces training equipment recommendations, particularly for maintenance simulation.

This classification of problem areas associated with the application of ISD procedures appears, at first blush, to be a heavy indictment. However, in reviewing the specific problems within each of these areas, it is important to maintain a realistic perspective. The ISD concept is relatively new, uniquely demanding, and not widely applied. Even so, its users, particularly the 3306th T&ES, have amassed an impressive record of effective training development and implementation. This classification of existing problems needs to be taken for what it is: an attempt to identify ways in which an already successful process can be further improved in the cost-effectiveness of its products.

Lack of Procedural Documentation

An extensive review has been made of the information requirements specified for ISD analyses in AFP 50-58, Volume II, Task Analysis, dated 15 July 1978. The descriptions in that volume constitute the primary procedural resource for ISD in the Air Force. The types, categories, and overall nature of the information requirements specified in AFP 50-58 for the ISD task analysis are both comprehensive, in terms of the description it provides of tasks/activities, and sufficient to provide the structure for all appropriate training and training equipment development decisions. Details of the information requirements and the task analytic procedures will not be repeated in this report. The headings from the task description worksheet used to collate and summarize data for the task analysis are presented in Figure 13 as a general summary.

[Figure 13. Task description worksheet headings (not legible in the source scan).]

Three major shortcomings characterize ISD documentation currently in use, including AFP 50-58 and the other ISD references previously cited:

1. They describe what is often an idealistic data availability situation. Much of the time the depth, accuracy, and reliability of task data available to the ISD analyst do not permit clean abstraction into the specified categories of the analysis procedure. Few suggestions are provided to assist the ISD team member in these situations.

2. The ISD guides and handbooks available are "principle" oriented and provide minimum guidance on the procedural or mechanistic application of these principles in making training and training device design decisions.

3. The nature of the decision-making required for ISD necessitates the learning, retention, and integration of a large number of complex concepts/constructs and associated knowledges. The procedures, in total, constitute a set of skill requirements which necessitates extensive practice for mastery. ISD, in its intended sense, cannot be conducted by individuals, no matter how well motivated and operationally knowledgeable, who have not had an opportunity for extensive practice and insightful feedback. The ISD process is neither mystical nor extremely difficult. However, it requires an ability to conceptually manipulate and dissect behavioral information. This is foreign to many teaching/training situations. It demands a degree of meticulousness and exhaustiveness with the minutiae of tasks and activities in order to make the same kinds of training decisions that are typically made with far less rigor.

ISD Not Fully Applied

The formal ISD process is not generally used when developing training equipment for systems out of acquisition or for common training requirements across several systems. Interviews with ISD groups working in these areas revealed several instances where the ATC mandate for the application of ISD to all training development was causing training development groups to prepare formalized training objectives for training courses already being taught. While this exercise was useful in helping instructors to tighten their instructional regimens, it could have little effect on making the revised training more job-relevant. Since little task data in a formal sense exists for most of the systems out of acquisition, the preparation of training objectives can be based only on the training course as it exists, rather than on any formal set of job performance requirements. Similarly, the associated training equipment for existing courses was configured on the basis of instructor preference and tradition, rather than on the basis of any formal analytical derivation of simulation and instructional capabilities.


The task-analytical derivation of training requirements from job performance requirements is the major key to the effectiveness of the ISD process. With older systems which have no extensive and systematic task data compilations, it remains impractical, in most instances, to devote the time and manpower required to amass such task data.

A Priori Simulator Selection

There is a growing emphasis within the maintenance training development community of the Air Force to consider the use of simulators as the primary training medium. While this emphasis forms a very strong vote of confidence for the maintenance simulation movement, it appears to reduce the already small inclination on the part of some ISD team members to examine alternative means for achieving maintenance training requirements and for making cost-effective training and training equipment design decisions.

Unfortunately, the ISD process as it currently exists provides little systematic guidance for the selection of specific types of training equipment. This gap in the ISD procedure increases the probability that training equipment selection will be based on preference rather than on formalized analyses aimed at maximizing cost-effectiveness. Under these conditions the uniqueness of a "simulator" may be easily justified for meeting training requirements which could as effectively be met by less costly approaches. To the credit of the 3306th T&ES, there have been a number of instances on recent new system ISD programs where simulators were not recommended. However, the emphasis remains.

Insufficient ISD Team Training

Selection of Training Equipment. AFP 50-58 provides little guidance in the selection of specific types of training equipment to support the achievement of specific training objectives. Numerous "considerations" are suggested, but little formal structure is available to make tradeoffs among all of the considerations, leaving the selection of trainers (from cardboard mockups to full-blown simulators) to the SME's preference, all of this within the broad limits imposed by the procedures for using the Media Decision Flow Chart or Decision Table.

A great deal of research over the years has shown that for any set of tasks to be trained, there are a number of alternative combinations of training media which can be employed to successfully achieve required learning. However, ISD team members are not trained to make media tradeoffs to achieve specific combinations of learning capabilities and to maximize the efficiency of a training regimen.


Training equipment permits student practice of selected aspects of operational jobs, but seldom is the provision for practice sufficient by itself. It becomes effective only when integrated into a carefully orchestrated sequence of learning opportunities. It is this orchestration that is the key to effective training. Selection of specific training equipment must evolve from the design of the training regimen. Current training for ISD team members does not promote this process.

Design of Training Equipment Characteristics. Once the decision has been made to utilize a specific type of training equipment, a whole new set of training/learning implications becomes critical. These involve the design of the trainer itself. There are two general classes of decisions:

1. What aspects of the operational situation should be simulated, and in what ways?

2. What instructional features or capabilities, in addition to its simulation capabilities, should be built into the trainer?

Few SMSs, prior to assignment to the 3306th T&ES, have had the opportunity, especially in the maintenance training area, to participate in the design of a major trainer, and/or have had the formal training in selectively employing the numerous state-of-the-art approaches to accomplish particular training strategies. For example, in the first category (concerning what should be simulated, and how) there are a large number of considerations dealing with level of fidelity, such as:

1. Environmental conditions.

2. Stimuli.

3. Response situations.

4. Control-display relationships.

In the second category (concerning what instructional features) there are decisions concerning when and how to employ:

1. Enhanced cueing/feedback.

2. Time distortion (freeze, accelerate, repeat).

3. Performance monitoring.


4. Student station.

5. Instructor station.
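As a way of seeing how the two classes of decisions might be recorded together, the following Python sketch ties a training objective to fidelity choices along the four dimensions above and to a set of instructional features; the objective, field names, and values are hypothetical and are offered only as an illustration, not as an Air Force form or procedure.

```python
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class SimulatorDesignDecision:
    """Hypothetical record tying a training objective to design choices.

    The fidelity keys follow the first category in the text (environment,
    stimuli, response situations, control-display relationships); the
    instructional features follow the second category.
    """
    objective_id: str
    fidelity: Dict[str, str] = field(default_factory=dict)
    instructional_features: List[str] = field(default_factory=list)


decision = SimulatorDesignDecision(
    objective_id="OBJ-117 (isolate fault in hydraulic subsystem)",
    fidelity={
        "environmental conditions": "classroom ambient; flight-line noise not simulated",
        "stimuli": "panel indications reproduced; pressure cues synthetic",
        "response situations": "full procedure, including contingency branches",
        "control-display relationships": "replicated for the affected panel only",
    },
    instructional_features=[
        "enhanced cueing/feedback",
        "time distortion (freeze, repeat)",
        "performance monitoring at the instructor station",
    ],
)
print(decision.objective_id, "->", ", ".join(decision.instructional_features))
```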

Currently, the ISD team members are not trained to deal with these issues in a formal way. Decisions initially reflect SMS experience and preference; later, when the equipment is in the procurement process, the contractor's human factors personnel may persuade the SMS to agree to some enhanced set of equipment characteristics. Despite the human factors input, the recommended characteristics are seldom derived from an integrated approach to meeting the total set of training requirements.

Required Information Not Available

Traditionally, systematic training and training equipment development efforts, both pre-ISD and ISD, suffer from differences in time phasing between system development and training system development. This continues to be true even with the current ISD efforts on new systems being conducted in a "fly before buy" environment, where the formal ISD effort is not initiated until the prototype hardware is undergoing test and evaluation. Three major classes of information currently appear to impede ISD progress early in the analytical phase:

1. Maintenance and training concepts. Even though the SON should identify both maintenance and training concepts which will be implemented on a new system, confirmation of and commitment to those concepts are, at times, difficult for the SMS to obtain. As a result, the ISD team must initially make assumptions about the ultimate maintenance organization and the hierarchy of training experiences which will prepare people for field maintenance assignments.

2. Trainee (target) population, including levels and AFSCs. There are often difficulties in obtaining using command commitments for target populations relevant to various system maintenance requirements. One consequence of this inability to identify the specific target population early in the ISD process is that assumptions are made about the specific AFSCs and levels to be assigned. More importantly, assumptions are made about the recent weapon system experience which the target population will have as they enter training. The training requirements themselves are based upon the SMS's judgment of new skill and knowledge requirements. Thus an erroneous assumption about trainee previous experience can produce a mismatch between the job performance requirements for the new system and the total set of skills and knowledges which the training program is designed to generate.

3. Comprehensive task data. The task data base for new systems is generally produced by the system contractor. However, the system in its prototype phase typically does not have a complete and validated data base. Often subsystems are in a state of evolution, making the identification and description of both O- and I-Level tasks difficult.

There are no ideal solutions to any of these three major information area problems. Critical implications of them are: (1) the ISD team as a whole needs to very carefully coordinate their needs in order to encourage and promote timely decision-making at the SPO, the using command, and ATC, and (2) assumptions need to be made early in the ISD process, documented, and approved/modified as information becomes available. For existing systems, ISD teams need to "bite the bullet" and quickly generate relevant task data prior to making major training and training equipment decisions.
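The second implication, documenting early assumptions and revising them as information arrives, could be supported by something as simple as the following Python sketch of an assumptions register; the statuses, fields, and example entry are invented for illustration and do not reflect any actual ISD worksheet.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional


@dataclass
class Assumption:
    """One documented planning assumption awaiting confirmation."""
    topic: str                 # e.g., maintenance concept, target population, task data
    statement: str
    made_on: date
    status: str = "open"       # open / approved / modified / withdrawn
    resolution: Optional[str] = None


@dataclass
class AssumptionRegister:
    entries: List[Assumption] = field(default_factory=list)

    def open_items(self) -> List[Assumption]:
        """Assumptions still needing confirmation from the SPO, using command, or ATC."""
        return [a for a in self.entries if a.status == "open"]


register = AssumptionRegister()
register.entries.append(Assumption(
    topic="target population",
    statement="Incoming trainees hold 5-level AFSCs with prior flight-line experience",
    made_on=date(1979, 2, 1),
))
print(len(register.open_items()), "assumption(s) awaiting confirmation")
```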

Incomplete Analyses

The key to job-relevant and training-effective ISD is comprehensive task analysis information. It consists of the specific skills and knowledges which must be a part of the job incumbent's repertoire for successful on-the-job performance. To obtain skills and knowledges, the task descriptions (the names of specific tasks and steps, a description of the conditions affecting performance, and a specification of relevant performance standards) are analyzed to identify the specific behaviors needed (skills and the application of sets of knowledges). The behaviors judged new to the trainee are recorded as training requirements and are further analyzed to generate training course and training equipment implications. AFP 50-58 describes this process in varying detail. In actual use, however, these procedures suffer at three major points:

1. Task and Skill Analysis. Figures 3 and 6 in the preceding subsection of this report illustrate the typical contractor computerized data base (LSA) and a completed FORM 1 which records relevant tasks not found in the LSA data base (on some systems there are no contractor-produced data and all task information is recorded on the FORM 1). The two forms, together, constitute the typical working data base for the ISD team on a new system. The problem is that this data base consists of task descriptions, but not detailed task analysis. Required skills and specific knowledges are not identified. The procedure, however, is to use the SMS's knowledge of similar systems and to do the analysis "in head." For any step or any task in which a new skill or knowledge is identified, the SMS prepares an entry on the FORM 2. The Rationale Checklist is used to document each of these decisions; however, it does not identify the specific skills and/or knowledges involved.

The 3306th T&ES procedures specify that behavioral requirements, in the form of skills and knowledges for training requirements, be identified on the FORM 2. However, the FORM 2 often uses the same task descriptive wording as the FORM 1 for each step where training requirements are judged to exist (the more experienced the SMS, the more likely that the critical behaviors will be comprehensively identified). If the step is to remove Part A, then the "behavioral requirement" on the FORM 2 will many times be "remove Part A." If the removal is associated with risk of equipment damage, the "condition" column of the FORM 1 might caution "avoid damage." On the FORM 2, the identical "avoid damage" wording will also be recorded, without specifying the particular activities which should be followed to avoid damage. For SMSs who are completely familiar with a particular step of a specific task (a condition which might be unlikely in a new system), the "avoid damage" caution might be sufficient to induce appropriate recall, so that the specific skills and knowledges implied are given appropriate consideration in the remaining portions of the analysis. It is possible, however, that in complex subsystems some important skills and knowledges may be overlooked.
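The "remove Part A / avoid damage" example can be restated as a small Python sketch of what an explicit decomposition might look like; the lookup table, function, and example skills and knowledges are entirely hypothetical and are not part of the FORM 1/FORM 2 procedure.

```python
from typing import Dict, List

# Hypothetical decomposition of a generic caution into the specific skills and
# knowledges it implies; in practice this is the SMS's analytical judgment,
# which the report notes is often left implicit on the FORM 2.
CAUTION_DECOMPOSITION: Dict[str, List[str]] = {
    "avoid damage": [
        "knowledge: torque limits for the Part A retaining fasteners",
        "knowledge: adjacent lines and tubing to be protected before removal",
        "skill: supporting Part A while the final fastener is removed",
    ],
}


def form2_entries(step: str, condition: str) -> List[str]:
    """Expand a FORM 1 step and condition into explicit behavioral entries."""
    entries = [f"perform step: {step}"]
    entries += CAUTION_DECOMPOSITION.get(
        condition, [f"(condition not yet analyzed: {condition})"]
    )
    return entries


for entry in form2_entries("remove Part A", "avoid damage"):
    print("-", entry)
```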

2. Training Equipment Selection. There are several problem areas associated with the selection of training equipment to be utilized in any of the subsystem maintenance training programs for new systems. First, media selection is generally derived from the Media Checklist and from personal preference, and is prescribed to meet specific training objectives. For training equipment this procedure often results in all identified training requirements judged appropriate for practice being assigned to a major simulator, with various audiovisual media in support. The procedure does not guide decisions concerning the use of other types of practice devices, including part-task trainers, procedures trainers, etc. Similarly, the procedure does not facilitate decisions about what set of total practice requirements should be incorporated in the simulator; two or more simulators having limited application might prove to be more cost-effective in a given training situation than one major all-encompassing simulator.

Second, media selections are made and documented prior to the generation of a POI which defines the sequence and appropriate strategies for achieving specific training requirements. Thus, often, the POI is built around the selected media rather than the media selected to most cost-effectively support an optimum instructional regimen.

3. Maintenance Simulator Design. The operating characteristics of a maintenance simulator are currently selected, primarily, on the basis of the operating characteristics of the equipment itself. To paraphrase one SMS, "I'm not interested in exotic features like providing knowledge of results. I only need the simulator to behave like the airplane, and I can teach with it." Consequently, most simulator design recommendations which come from an ISD team to the SPO (through the TRRRM and ATC review process) essentially duplicate actual equipment operation. This is not necessarily bad, but it significantly reduces the probability that maximum training usefulness can be derived from the device.

While it is easy to criticize the current simulator design process, there are not as yet ISD procedures which can systematically produce training devices with improved cost-effectiveness.


SECTION IV

THE SPO TRAINING EQUIPMENT ACQUISITION PROCESS

The System Program Office (SPO) involvement with the training equipment acquisition process is described in this section under three main topics:

1. Processes and procedures currently employed.

2. A general model that sequences acquisition decision sets.

3. Major problem areas.

Processes and Procedures

The current processes for procuring maintenance training equipment do not fit into a single pattern. Each weapon system SPO is differently organized to suit its functional needs. This results in training equipment management being assigned to the Special Projects Office of one SPO, to Logistics in another SPO, to Development and Operations in a third, etc.

Broad categories of activities of Training Equipment Acquisition Managers, however, are common to all SPOs. The activities for which the manager is responsible include:

1. Validation of training device requirements as presented by ATC, resulting from the ISD process.

2. Validation of the weapon system contractor's engineering data.

3. Preparation of procurement documentation which translates ISD-derived training equipment design requirements into equipment functional specifications. The training device acquisition goal is to provide no more and no less than the projected system requirements.

[The remainder of this list, the discussion under the heading "Validation of Training Equipment Requirements," and several accompanying figure pages are not legible in the source scan.]

... specializing in simulation research with AFHRL are also available. Engineering psychologists of the Human Engineering Laboratory of Aerospace Medical Division are likewise available to provide guidance and research support directly to the SPO. Personnel located at the contractor's fabrication facility are assigned to the Air Force Plant Representative Office (AFPRO) or to the Defense Contract Administrative Services (DCAS). They may be called upon by the Acquisition Manager to monitor progress directly, especially in suspected problem areas, so as to assure equipment quality and to meet required delivery time schedules.

3. Air Force Documentation. The maintenance training equipment acquisition team headed by the SPO Acquisition Manager has documentary resources to assist in the translation of training equipment requirements into hardware specifications. Military Standard 490, "Specification Practices," contains 15 appendixes outlining appropriate types of specifications. AFHRL-TR-78-28, see Bibliography, pages 30 and 31 (Hannaman, Freeble, & Miller, 1978), gives the details of a two-part specification process normally employed in the acquisition of training equipment. The first part is the "Prime Item Development Specification" and the second part is the "Prime Item Product Fabrication Specification."

4. Contracting Procedures. AFHRL-TR-78-28 accurately states that the SPO responsibility for development of the specifications is usually accomplished through a contractor. The Prime Item Development Specification is usually prepared by the prime weapon system contractor from the documented training equipment requirements in conjunction with ISD personnel, and with direction, advice, and approval of the SPO Training Equipment Acquisition Manager. Thus, at least for acquiring training equipment for new systems, the requirement in AFR 50-11, Training, Management and Utilization of Training Devices, that ISD personnel be included in source selections is superseded at an earlier point in time by the weapon system source selection.

Acquisition Management

The weapon system contractor's responsibilities in preparation of the Prime Item Development Specification for maintenance training equipment are usually established at system source selection. This decision has customarily been made because the contractor has ready access to the necessary engineering data and expertise.

In most SPO maintenance training equipment acquisition programs, a departure from the process prescribed by directives often occurs at the point when the Prime Item Development Specification for the weapon system is approved. Most often, as previously described, the prime weapon system contractor is awarded training equipment delivery responsibility.

The problem causing the change in the acquisition process, as stated by regulations, occurs because the prime contractor usually subcontracts the training equipment development and production effort. Armed Services procurement policy discourages technical inputs to the subcontracted process by Air Force monitors. Direction of this type may give the prime contractor reasons to demand additional funding or a modification of critical delivery schedules. Additional conflicts occur since regulations require the acquisition manager and team to maintain visibility and control throughout the development and production programs.

Usually, the Intercontinental Ballistic Missile (ICBM) SPO of the Space and Missile Systems Organization (SAMSO) contracts with the weapon system contractor for training equipment and simulators. This contractor does not subcontract. Therefore, the ICBM SPO does not encounter the problem of maintaining program control and visibility.

In other instances, when the ICBM SPO contracts directly for training equipment from other than the weapon system contractor, the management problems noted above are lessened by contracting for an engineering source data package called the System Requirements Analysis (SRA) from the prime contractor.

The SRA package supplied by the prime weapon system contractor is defined in SAMSO-STD-77-6 as "A sequential and iterative engineering process designed to establish the functional requirements for each element of a weapon system. The process provides a logical sequence and a clear record of the development of system requirements to manage the system engineering effort throughout all phases of system acquisition." This analysis systematically establishes requirements for equipment, personnel, procedures, and facilities. Figure 15 indicates SRA inputs to personnel and training development processes.

Engineering data and support from weapon system contractors for aircraft and electronic system SPOs are often variable in quality. Frequently they are not available in time to meet training definition requirements.

[Figure 15. SRA inputs to personnel and training development processes (diagram not legible in the source scan).]

The apparent critical differences between the SRA approach and that followed for aircraft weapon system contractors are:

1. The SRA data package has its own detailed and explicit specification, rather than being an associated output from the weapon system prime item specification.

2. Data production milestones and quality are closely monitored during production.

3. Data outputs are phased with weapon system development so that updated and expanded packages are produced as the system(s) are designed. In this way, preliminary data are released early, with more detail evolving as development progresses.

Other maintenance training equipment management differences among SPOs are caused by varying degrees of participation by engineering specialists in the decision-making process that supports the SPO Acquisition Manager. Engineer availability, personnel expertise level, and the SPO's willingness to consult the engineers are additional factors which affect decisions.

A Training Equipment Acquisition Process Model

The managerial decisions involved in the SPO training equipment acquisition process are identified and sequenced in the model that follows. In an attempt to accommodate the wide range of existing contractual, organizational, and managerial differences noted above, the model is necessarily very general. It is, however, applicable to most acquisition procedures. The decisions required in this process are indicated irrespective of the agency who may make them, whether it be the SPO Manager, a contractor, Air Force engineers, or an acquisition team.

The training equipment acquisition process model begins with the training equipment design documents produced by the ISD team. Application of the recommended ISD procedures and the employment of the training equipment design process model proposed earlier structure the SPO inputs. Even an efficiently managed acquisition effort may be wasted if the procured equipment has been based on an incomplete design process.


The major parts of the SPO acquisition process involve the development of an effective procurement specification, the management of the development and fabrication process, and the evaluation of the delivered trainer.

A Procedural/Decision Sequence

Figure 16 depicts the procedural/decision process and summarizes the supporting inputs necessary for its accomplishment. The major steps of the sequence involve:

1. Validate the training equipment function and design characteristics documented as a result of the ISD process.

2. Determine the feasibility of the validated equipment requirements in terms of available monetary resources and delivery time requirements.

3. [This step is not legible in the source scan.]

4. Prepare Statement of Work (SOW) and Request for Proposal (RFP) documentation detailing the management approach applicable to contractor activities.

5. Select the contracting source by competitively assessing proposals on the basis of documented technical approach, understanding of requirements, innovations toward satisfying goals, timely product delivery, experience, facilities, and personnel resources.

6. Review and finalize details of the development specification. Issue the contract with every specific requirement, emphasizing to the contractor that the rigorous test, acceptance, and checkout procedures cited in the specification will be strictly enforced.

7. Monitor, within contractually legal bounds, the development and production process to assure equipment quality and timeliness of delivery.

[Figure 16. The SPO procedural/decision sequence and supporting inputs (diagram not legible in the source scan).]

8. Supervise and participate in the specified test, acceptance, and checkout activities. Coordinate using command and expert engineering support to assure that the maintenance training equipment meets contracted requirements.
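The eight-step sequence above can be pictured as an ordered checklist. The short Python sketch below shows one hypothetical way to track such a sequence; the step wording is paraphrased from the list above (the illegible step is omitted) and the classes and fields are illustrative only, not part of any SPO procedure.

```python
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class AcquisitionStep:
    number: int
    description: str
    complete: bool = False


@dataclass
class AcquisitionSequence:
    """Ordered checklist mirroring the procedural/decision sequence above."""
    steps: List[AcquisitionStep] = field(default_factory=list)

    def next_step(self) -> Optional[AcquisitionStep]:
        """The real sequence is iterative; this sketch simply enforces order."""
        return next((step for step in self.steps if not step.complete), None)


sequence = AcquisitionSequence(steps=[
    AcquisitionStep(1, "Validate ISD-derived equipment functions and design characteristics"),
    AcquisitionStep(2, "Determine feasibility against funds and delivery schedule"),
    AcquisitionStep(4, "Prepare SOW and RFP documentation"),
    AcquisitionStep(5, "Select the contracting source competitively"),
    AcquisitionStep(6, "Finalize the development specification and issue the contract"),
    AcquisitionStep(7, "Monitor development and production"),
    AcquisitionStep(8, "Supervise test, acceptance, and checkout"),
])

sequence.steps[0].complete = True
print("Next:", sequence.next_step().description)
```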

Major Problem Areas

As a result of the review of documentation, the observations of SPO procedures, and a study of the training equipment acquisition decisions required, three problem areas have been identified:

1. Variable Management Practices.

2. Lack of Procedural Guidance.

3. Late Acquisition.

Variable Management Practices

The variability of contractual, organizational, and managerial characteristics of the acquisition process often reflects the relatively low priority that maintenance training equipment has in comparison with other areas of the system development process. Lack of consistent organization among SPOs has resulted in varying degrees of program support to maintenance training equipment management and, consequently, in variable training equipment quality.

The single point procurement management approach employed by SIMSPO has recently been implemented for aeronautical systems, and SIMSPO has been designated to assume this responsibility. This organization has extensive experience in managing the acquisition of flight crew simulators. This experience should effectively generalize to maintenance training simulators.

The major advantages of having all maintenance training simulators acquired through SIMSPO include:

1. Standardization of contracting procedures that establish a set of working arrangements with contracting officers and between team members.

2. Utilization of its own cadre of maintenance simulator experts to deal with decisions requiring human factors and engineering expertise.

3. Coordinated channels with Air Force laboratory and engineering experts who, at times, may be needed to establish specifications and/or evaluation designs.

Lack of Procedural Guidance

A primary directive in the acquisition of maintenance training equipment is MIL-T-81821, Military Specification: Trainers, Maintenance, Equipment and Services, General Specifications For. This specification, in the opinion of the authors, appears to place requirements for realism and functional fidelity which can adversely affect the ultimate cost-effectiveness of maintenance trainers, especially simulators. The directive also necessitates extensive justification effort if the requirements are deviated from to achieve enhanced instructional value.

Under conditions where maintenance simulators are being procured through the weapon system SPO, this directive is particularly cumbersome. Many Training Equipment Acquisition Managers are new to their jobs and are relatively unfamiliar with maintenance training and with equipment procurement. For them, MIL-T-81821 can be especially misleading in guiding the preparation of procurement specifications.

With the shift of acquisition responsibility for maintenance simulators to SIMSPO, the detrimental effects of MIL-T-81821 will be reduced. Highly experienced specification writers will be able to selectively apply the requirements to achieve the training effectiveness identified and recommended by the ISD analysts. On-going studies of maintenance training equipment design and acquisition should produce the principles and procedures to permit future updating of MIL-T-81821.

Late Acquisition

The most important problem area in the acquisition process is that the definition of training equipment requirements and the subsequent issuance of a procurement specification are not accomplished in time to permit trainer delivery in support of initial operational training. There are several related factors involved. Late receipt and lack of completeness of engineering and task data provided to the ISD teams prolong the equipment design period.

1. A staff study conducted and reported by Aeronautical Systems Division recommends centralized procurement of maintenance simulators for aircraft systems in ASD's Simulator SPO (SIMSPO).

A paragraph of the study states: "...the major funding problem was the need for using training equipment funds to cover overruns in the air vehicle system" (Maintenance Training Simulator Procurement, page 8). Earlier training equipment requirement definition could have provided contractual utilization of budgeted funds before they were used for other purposes.

2. The late definition of requirements sometimes results in a contract which combines research/development and fabrication phases. These procurement practices occur in an effort to meet required equipment delivery dates by attempting to make up time lost in defining requirements. This type of contract often results in funding difficulty due to an escalation of device complexity and price.

In addition, there are times when insufficient manpower availability among engineering advisors, whose time is shared among the acquisition programs of several systems, delays completion of the specification. Finally, high turnover rates among SPO Acquisition Managers during the procurement process, due to military transfers, further slow the procurement process and detract from the availability and cost-effectiveness of the final training system.


BIBLIOGRAPHY

Ammerman, H. L., & Melching, W. H. The derivation, analysis, and classifi-cation of instructional objectives. Alexandria, VA: Human ResourcesResearch Organization, George Washington University, May 1966. HumRROTechnical Report TR-66-4, AD-633 474.

Applied Science Associates, Inc. Handbook for development of advanced job performance aids (JPA) in accordance with MIL-J-83302 (USAF). Final Draft under Contract No. F33657-71-C-0279-PZOO01. Valencia, PA: Author, Aeronautical Systems Division, Wright-Patterson AFB, OH, 15 January 1971. AD716820.

Ayoub, M. A., Smillie, R. J., Edsall, J. C. Assessment and job performanceaids: a simulation approach. Final Report, Part I, Executive Summary,North Carolina State University, Raleigh, NC: Naval Air Systems Command,April 1974. Contract No. N68335-75-1129.

Ayoub, M. A., Cole, J. L., Sakala, M. K., Smillie, R. J. Job performance-aids: assessment of needs. Final Report, North Carolina State University,Raleigh, NC: Naval Air Systems Command, October 1974. Contract No.N00600-74-C-0276.

Benson, Gene. Working definitions of device characteristics/requirements. (Internal working paper).

Booher, H. R. JPA systems technology selection algorithm: development andapplication. Preliminary Draft. San Diego, CA: NPRDC TN 78-.

Branson, R. K., et al. Interservice procedures for instructional systems development. Executive summary and model. Orlando, FL: Naval Training Device Center, Army Combat Arms Training Board, 1 August 1975. AD-A019 486.

Chambers, A. N. Affective and acceptance factors in selection and utiliza-tion of training aids and devices. Port Washington, NY: U.S. NavalTraining Device Center, November 1958. Technical Report NTDC 9-11-1,AD-214 729.

Chapanis, A. On the allocation of functions between men and machines.Reprint from Occupational Psychology, January 1965, Vol. 39, No. 1, 1-11.

- Report No 8 under Contract Nonr-4010(03) between the Office of NavalResearch and the Johns Hopkins University.


Chenzoff, A. P. & Folley, J. D., Jr. Guidelines for training situationanalysis (TSA). Port Washington, NY: U.S. Naval Training Device Center,July 1965. Technical Report NAVTRADEVCEN 1218-4.

Coff, J., Schlesinger, R., & Parlog, J. Project PIMO final report: PIMO test summary. Serendipity, Inc.

Colson, K. R., Forbes, S. F., Mathews, L. P. & Stettler, J. A. Developmentof an informational taxonomy of visual displays for Army tactical data

systems. Research Memorandum 74-4. U.S. Army Research Institute for theBehavioral Sciences, February 1974.

Condon, C. F. M., Ames, L. L., Hennessy, J. R., Shriver, E. L. Flightsimulator maintenance training: potential use of state-of-the-artsimulation techniques. Lowry AFB, CO: Technical Training Division,June 1979. AFHRL-TR-79-19.

Cotterman, T. E. Task classification: An approach to partially orderinginformation on human learning. Wright-Patterson Air Force Base, OH:Wright Air Development Center, January 1959. WADC Technical Note 58-374.ASTIA Document No. AD 210716.

Cox, J. A., Wood, R. 0., Boren, L. M. & Thorne, H. W. Functional andappearance fidelity of training devices for fixed procedures. Alexandria,VA: Human Resources Research Organization, June 1965. HumRRO TechnicalReport 65-4, AD 617 767.

Cream, B. W., Eggemeier, F. T., & Klein, G. A. A strategy for the developmentof training devices. AFHRL-TR-78-37, AD-A061584. Wright-Patterson Air ForceBase, OH: Advanced Systems Division, Air Force Human Resources Laboratory,August 1978.

Cream, B. W., & Lambertson, D. C. Functional integrated systems trainer: Technical design and operation. AFHRL-TR-75-6(II), AD-A015-835. Wright-Patterson Air Force Base, OH: Advanced Systems Division, Air Force Human Resources Laboratory, June 1975.

Crowder, N. A. A part-task trainer for troubleshooting. Lackland AirForce Base, TX: Air Personnel and Training Research Center, June 1957.ASTIA Document No. 131423. AFPTRC-TN-57-71.

Defense Documentation Center. Simulation and trainers (U). A ReportBibliography. Search Control No. 011833. Alexandria, VA: Author,Defense Supply Agency, Cameron Station, January 1974.


Demaree, R. D. Development of training equipment planning information.Wright-Patterson Air Force Base, OH: Behavioral Sciences Laboratory,October 1961. ASD Technical Report 61-533.

Department of the Air Force. Acquisition management A guide for programmanagement. Andrews AFB, DC: Author, Headquarters Air Force SystemsCommand, 9 April 1976. AFSC Pamphlet 800-3.

Department of the Air Force. Acquisition management Handbook for managersof small programs. Wright-Patterson AFB, OH: Aeronautical Systems Divi-sion, 1 October 1975. ASDP 800-14.

Department of the Air Force. Course training standard. Minuteman modern-ized command data buffer initial qualification training. Vandenberg AirForce Base, CA: 4315th Combat Crew Training Squadron (SAC), 13 March 1978.CTS 182100K.

Department of the Air Force. Development of human factors engineering forsystem/equipment programs. Wright-Patterson Air Force Base, OH:Aeronautical Systems Division, 10 August 1977. ASD Pamphlet 800-2.

Department of the Air Force. Human factors engineering. AFSC Design Handbook, DH 1-3. Wright-Patterson Air Force Base, OH: Author, Headquarters Aeronautical Systems Division (AFSC), 1 January 1977.

Department of the Air Force. Human factors engineering for the intercontinental ballistic missile systems. Proposed SAMSO Standard 77-1. ICBM Program Office, MNTP. Project Number EO-29. 28 April 1978 draft.

Department of the Air Force. Instructional system development.Washington, DC: Author, 7 January 1977. Air Force Regulation 50-8.

Department of the Air Force. Instructional systems development. RandolphAir Force Base, TX: Air Training Command, 4 October 1977. ATC Supplement1 to AFR 50-8.

Department of the Air Force. Instructional system development.Washington, DC: Author, 31 July 1975. AF Manual 50-2.

Department of the Air Force. Intercontinental ballistic missile systemstraining equipment and management. Proposed SAMSO Standard 77-12. ICBMProgram Office, MNTP. Project Number EO-49. 10 July 1978 draft.

Department of the Air Force. Internal Air Force Working Paper. Newsletter to Chief Engineers. MIL-PRIME-P-1670 - Parachute Systems. MIL-PRIME-8421 - Air Transportability Requirements. MIL-PRIME-P-1670-1 - Parachute Systems.


Department of the Air Force. Management of training equipment. RandolphAir Force Base, TX: Air Training Command, 6 December 1976. ATCRegulation 50-30.

Department of the Air Force. Plan of instruction. Minuteman modernized/command data buffer initial qualification training. Vandenberg Air Force Base, CA: 4315th Combat Crew Training Squadron (SAC), 1 July 1976. POI 182100K.

Department of the Air Force. Research and Development. Air Force

reliability and maintainability program. Washington, DC: HeadquartersU.S. Air Force, 9 August 1978. AF Regulation 80-5.

Department of the Air Force. System requirements analysis program for theMX weapon system. SAMSO STANDARD 77-6, 10 Nov 1977. Norton Air ForceBase, CA: Space and Missile Systems Organization. AMSDL 33002.

Department of the Air Force. Technical Order Data Requirements for TrainingEquipment, Mobile Training Sets (MTSs), and Maintenance Trainers. AFAD71-531-(11), June 1976.

Department of the Air Force. Test and evaluation. Washington, DC:Author, 19 July 1976. AFR 80-14.

Department of the Air Force. Test and evaluation. Randolph Air Force Base,TX: Air Training Command, 29 July 1976. ATC Regulation 80-14.

Department of the Air Force. Training. Handbook for designers ofinstructional systems. Volumes I-V. Washington, DC: Author,Headquarters U.S. Air Force, 15 July 1978. AF Pamphlet 50-58.

Department of the Army. Technical Manual. Operation, installation and reference data for turret elevating and traversing systems, cupola, gun/launcher, and mount used on tank, combat, full-tracked: 152-MM gun/launcher M60A2 (2350-930-3590). Department of the Army Headquarters, January 1976. TM 9-2350-232-24-1.

Department of the Army. Technical Manual. Scheduled maintenance for turretelevating and traversing systems, cupola, gun/launcher, and mount used ontank, combat, full-tracked: 152-MM gun/launcher M60A2 (2350-930-3590).Department of the Army Headquarters, January 1976. TM 9-2350-232-24-2.

Department of the Army. Technical Manual. Troubleshooting for turret elevating and traversing systems, cupola, gun/launcher, and mount used on tank, combat, full-tracked: 152-MM gun/launcher M60A2 (2350-930-3590). Department of the Army Headquarters, January 1976. TM 9-2350-232-24-3.


Department of the Army. Technical Manual. Maintenance for turret elevating and traversing systems, cupola, gun/launcher, and mount used on tank, combat, full-tracked: 152-MM gun/launcher M60A2 (2350-930-3590). Department of the Army Headquarters, January 1976. TM 9-2350-232-24-4.

Department of Defense. Draft Military Specification. Improved TechnicalDocumentation and Training, Part II, Training Materials. DRAFTMIL-M-632XX(TM) PART II, 31 December 1975.

Department of Defense. Executive Correspondence. General System FunctionalSpecification for EF-IA Tactical Jamming System (TJS) Automatic TestEquipment.

Department of Defense. Military Specification. Air TransportabilityRequirements, General Specification for. MIL-A-8421F, 25 October 1974.

Department of Defense. Military Specification. Connector, Plug,Electrical (Power, Three-Wire, Polarized, Spring-Loaded, Pivoted GroundingBlade) Type UPI31M. MIL-C-3767/12D(EL), 27 February 1976.

Department of Defense. Military Specification. Interchangeability andReplaceability of Component Parts for Aerospace Vehicles. MIL-T-8500C,5 November 1971; Amendment 1, 3 May 1972.

Department of Defense. Military Specification. Maintenance ProceduresSimulator Specification. MIL-PRIME.

Department of Defense. Military Specification. Manuals, Technical:Illustrated Parts Breakdown, Preparation of. MIL-M-38807(USAF),28 November 1972.

Department of Defense. Military Specification. Manuals, Technical:Illustrated Parts Breakdown, Preparation of. MIL-M-38907(USAF) Amend-ment 4, 1 September 1977.

Department of Defense. Military Specification. Manuals, Technical:Commercial Equipment. MIL-M-7298C, 1 November 1970.

Department of Defense. Military Specification. Prime Items Specificationfor the Displays Test Station Simulator DTS. Specification No.CP76301A328A332, Rough Draft, Preliminary. WBS No. 2120.02.

Department of Defense. Military Specification. Quality Program Requirements. MIL-Q-9858A, 16 December 1963.

Department of Defense. Military Specification. Test Outline, Engineering, for the Inspection of Training Equipment, Requirements for the Preparation of. MIL-T-27615(USAF), 17 July 19--.

Department of Defense. Military Specification. Training Devices, Military;General Specification for. MIL-T-23991E, 20 February 1974. Changedthrough 1 December 1977.

Department of Defense. Military Specification. Trainers, Maintenance,Equipment and Services. MIL-T-81821, 8 February 1974.

Department of Defense. Military Specification. Welding, Metal Arc andGas, Steels, and Corrosion and Heat Resistant Alloys; Process for.MIL-W-8611A, 24 July 1957.

Department of Defense. Military Specification. Welding of AluminumAlloys: Process for. MIL-W-8604(A er), 5 June 1953.

Department of Defense. Military Standard. Configuration ManagementPractices for Systems, Equipment Munitions, and Computer Programs.MIL-STD-483 (USAF), 31 December 1970. Change Notice, 1 June 1971.

Department of Defense. Military Standard. Corrective Action andDisposition System for Nonconforming Material. MIL-STD-1520A(USAF), 21March 1975.

Department of Defense. Military Standard. Corrosion Prevention andDeterioration Control in Electronic Components and Assemblies.MIL-STD-1250(MI), 31 March 1967.

Department of Defense. Military Standard. Digital Computation Systems for Real-Time Training Simulators. MIL-STD-876A(USAF), 8 July 1971.

Department of Defense. Military Standard. Electromagnetic InterferenceCharacteristics Requirements for Equipment. MIL-STD-461A, 1 August 1968,Change Notices through 3 July 1963.

Department of Defense. Military Standard. Engineering Drawing Practices.MIL-STD-IOOB, 15 October 1975. Change Notices through 15 April 1976.

Department of Defense. Military Standard. Finishes, Protective, andCodes, for Finishing Schemes for Ground and Group Support Equipment.MIL-STD-808 (USAF), 22 December 1960.

Department of Defense. Military Standard. Human Engineering DesignCriteria for Military Systems, Equipment and Facilities. MIL-STD-1472B,31 December 1974. Change Notices through 10 May 1978.

Department of Defense. Military Standard. Identification Marking of U.S.Military Property. MIL-STD-130L, 5 August 1977.


Department of Defense. Military Standard. Maintainability ProgramRequirements (for Systems and Equipments). MIL-STD-470, 21 March 1966.

Department of Defense. Military Standard. Maintainability, Verification/Demonstration/Evaluation. MIL-STD-471A, 27 March 1973, Change Noticesthrough 10 January 1975.

Department of Defense. Military Standard. Marking for Shipment andStorage. MIL-STD-129H, 3 January 1978.

Department of Defense. Military Standard. Materials and Processes for

Corrosion Prevention and Control in Aerospace Weapons Systems.MIL-STD-1568 (USAF), 18 November 1975.

Department of Defense. Military Standard. Provisioning Procedures,Uniform DOD. MIL-STD-1561, 11 November 1974.

Department of Defense. Military Standard. Reliability Design Qualification and Production Acceptance Tests: Exponential Distribution. MIL-STD-781C, 21 October 1977.

Department of Defense. Military Standard. Test Provisions for ElectronicSystems and Associated Equipment, Design Criteria for. MIL-STD-415D,1 October 1969. Change Notice, 8 October 1971.

Department of Defense. Military Standard. Technical Reviews and Audits forSystems, Equipments, and Computer Programs. MIL-STD-1521A(USAF), 1 June1976, as amended by Change Notice 1 dated 29 Sept. 1978.

Department of Defense. Military Standard. Test Reports, Preparation of.MIL-STD-831, 23 August 1963.

Department of Defense. Military Standard. Parts Control Program.MIL-STD- 965, 15 April 1977.

Department of Defense. Military Standard. Reliability Program for Systemsand Equipment Development and Production. MIL-STD-785A, 28 March 1969.

Department of Defense. Military Standard. Specification Practices.MIL-STD-490, 30 October 1968. Change Notices through 18 May 1972.

Department of Defense. Military Standard. Supplier Quality AssuranceProgram Requirements. MIL-STD-1535A(USAF), 1 February 1974.

Department of Defense. Military Standard. System Safety ProgramRequirements. MIL-STD-882A, 28 June 1977.


Folley, J. D., Jr. A preliminary procedure for systematically designing performance aids. Wright-Patterson Air Force Base, OH: Behavioral Sciences Laboratory, Aerospace Medical Laboratory, Aeronautical Systems Division, Air Force Systems Command, U.S. Air Force, October 1961. ASD Technical Report 61-550. AD No. 270868.

Folley, J. D., Jr. Development of an improved method of task analysis, and beginnings of a theory of training. USNTDC. TR NAVTRADEVCEN 1218-1. June 1964.

Folley, J. D., Jr. Guidelines for task analysis. USNTDC. TR NAVTRADEVCEN 1218-2. June 1964.

Folley, J. D., Jr. Job performance aids research summary and recommendations. Wright-Patterson AFB, OH: Air Force Human Resources Laboratory, Air Force Systems Command, April 1969.

Folley, J. D., Jr. Research problems in the design of performance aids. Wright-Patterson AFB, OH: Behavioral Sciences Laboratory, Aerospace Medical Laboratory, Aeronautical Systems Division, Air Force Systems Command, October 1961. ASD Technical Report 61-548.

French, R. S. The K-system MAC-I troubleshooting trainer: I. Development, design and use. Development Report. San Antonio, TX: Lackland Air Force Base, Air Force Personnel & Training Research Center, October 1956. AFPTRC-TN-56-119. ASTIA Document No. 098893.

Fryer, D. H., Fernburg, M. R., & Tomlinson, R. M. A guide for determining training aid and device requirements. New York, NY: Richardson, Bellows, Henry and Company, Inc., May 1952. SPECDEVCEN 383-04-1, AD 641 912.

Gagne, R. Simulators. In R. Glaser (Ed.), Training research and education. Office of Naval Research, Psychological Services Division, Personnel and Training Branch, 1961.

Goclowski, J. C., King, G. F., Ronco, P. G., & Askren, W. B. Integration and application of human resource technologies in weapon system design: Coordination of five human resource technologies. AFHRL-TR-78-6(II), AD A053 680. Wright-Patterson Air Force Base, OH: Advanced Systems Division, Air Force Human Resources Laboratory, March 1978.

Goclowski, J. C., King, G. F., Ronco, P. G., & Askren, W. B. Integration and application of human resource technologies in weapon system design: Processes for the coordinated application of five human resource technologies. AFHRL-TR-78-28, AD A053 781. Wright-Patterson Air Force Base, OH: Advanced Systems Division, Air Force Human Resources Laboratory, March 1978.

Hannaman, D. L., Freebie, L. A., & Miller, G. G. Description of the Air Force maintenance training device acquisition and utilization processes - Review of current processes. AFHRL-TR-78-28, AD A059 743. Lowry Air Force Base, CO: Technical Training Division, Air Force Human Resources Laboratory, 1 June 1978.

Hannaman, D. L., & Freebie, L. A. Description of the Air Force maintenance training device acquisition and utilization processes - Technology gaps and recommendations. Lowry Air Force Base, CO: Air Force Human Resources Laboratory. Draft Technical Report. Contract F33615-77-C-0052.

Hawley, J. K., Mullens, C. J., & Weeks, J. Jet engine mechanic--AFSC 426X2: Experimental job performance test. Brooks AFB, TX: AFHRL, December 1977. AFHRL-TR-77-73.

Herrick, R. M., Wright, J. B., & Bromberger, R. A. Simulators in aviation maintenance training: A Delphi study. Warminster, PA: Naval Air Development Center, 10 December 1977. NADC-78015-60.

Jorgensen, C. C., & Hoffer, P. Prediction of training programs for use in cost/training effectiveness analysis. El Paso, TX: U.S. Army Research Institute for the Behavioral and Social Sciences, January 1978.

Joyce, R. P., & Elliott, T. K. Informational job performance aids: A bibliography. Valencia, PA: Applied Science Associates, Inc., May 1967. Supplemented 1971.

King, W. J., & Duva, J. S. (Eds.) New concepts in maintenance trainers and performance aids. Orlando, FL: Human Factors Laboratory, Naval Training Equipment Center, October 1975. Technical Report: NAVTRAEQUIPCEN IH-255.

King, W. J., & Tennyson, M. E. Maintenance training and aiding in the 1980's: A prediction. From Proceedings - Volume IV. Future of simulators in skills training. First International Learning Technology Congress & Exposition on Applied Learning Technology, Society for Applied Learning Technology, July 21-23, 1976. P. 40.

Kinkade, R. G., Kidd, J. S., & Ranc. A study of tactical decision making behavior. Aircraft Armaments, Inc., Cockeysville, MD: Decision Sciences Laboratory, Electronic Systems Division, Air Force Systems Command, United States Air Force, L. G. Hanscom Field, Bedford, MA, November 1965. Contract No. AF 19(628)-4792.

Maintenance Training Simulator Procurement. Staff Study prepared by Deputy for Development Planning, Aeronautical Systems Division. Wright-Patterson Air Force Base, OH: 3 July 1978.

Mallory, W. J. Development of guidelines for specifying functional characteristics of maintenance training simulators. Valencia, PA: Applied Science Associates, Inc., February 1978. Task Documentation Report, Data Item A003. NADC Contract N62269-77-C-0304.

Mallory, W. J., & Elliott, T. K. Measuring troubleshooting skills using hardware-free simulation. Final Report for Period 1 March 1977 - July 1978. Lowry AFB, CO: Technical Training Division, Wright-Patterson AFB, OH, December 1978. AFHRL-TR-78-42.

McGuirk, F. D., Pieper, W. J., & Miller, G. G. Operational tryout of a general purpose simulator. AFHRL-TR-75-13, AD-A014 794. Lowry Air Force Base, CO: Technical Training Division, Air Force Human Resources Laboratory, May 1975.

Meister, D. Assessment of a prototype human resources data handbook for systems engineering. Final Report for Period April 1975 - December 1976. Westlake Village, CA: Advanced Systems Division, Wright-Patterson AFB, OH, December 1976. AFHRL-TR-76-1.

Meister, D., & Mills, R. H. Development of a human performance reliability data system. A paper based on the final report for Contract F33615-76-C-1118, Human Engineering Division, Aerospace Medical Research Laboratories.

Meister, D. Human factors in operational systems testing: A manual of procedures. San Diego, CA: NPRDC SR 78-8, April 1978.

Miller, E. E. Instructional strategies using low-cost simulation for electronic maintenance. Arlington, VA: U.S. Army Research Institute for the Behavioral and Social Sciences, July 1975. ARI TR-75-A. AD A025 941.

Miller, G. G. The utilization of simulators for Air Force technical training. From Proceedings - Volume IV. Future of simulators in skills training. First International Learning Technology Congress & Exposition on Applied Learning Technology, Society for Applied Learning Technology, July 21-23, 1976.

Miller, G. G., & Gardner, E. M. Advanced simulator performance specification for an F-111 test station. Lowry Air Force Base, CO: Technical Training Division, Air Force Human Resources Laboratory, November 1975.

Miller, G. G. Some considerations in the design and utilization of simulators for technical training. AFHRL-TR-74-65, AD-A001 630. Lowry AFB, CO: Technical Training Division, Air Force Human Resources Laboratory.

..., K. J., Erickson, J. M., Klein, G. A., & Holt, K. R. ... design guide: Checklist of training ... Wright-Patterson Air Force Base, OH.

Miller, R. B. Psychological considerations in the design of training equipment. Wright-Patterson Air Force Base, OH: Wright Air Development Center, December 1954. WADC Technical Report 54-563.

Miller, R. B. Task and part-task trainers and training. Wright-Patterson Air Force Base, OH: Wright Air Development Division, June 1960. Contract No. AF33(616)-2080. WADD Technical Report 60-469. AD No. 245 652.

Naval Air Development Center. Guidelines for designing a simulator for organizational level electronic maintenance training. Wayne, PA: Applied Psychological Services, Inc., 8 August 1977.

Naval Air Development Center. Notes and elaborations on "Guidelines for designing a simulator for organizational level electronics maintenance training." Wayne, PA: Applied Psychological Services, Inc., 31 August 1977.

Naval Training Equipment Center. Specification for trainer, A-6E TRAM aircraft, detecting and ranging set maintenance. Orlando, FL: Author, 18 January 1978. N83-729; Task 8034.

Parker, E. L. Generalized training devices for avionic systems maintenance. Orlando, FL: Naval Training Equipment Center, April 1975. Technical Report: NAVTRAEQUIPCEN 73-C-0091-1.

Pepinsky, P. N., Pepinsky, H. B., & Pavlik, W. B. Motivational factors in individual and group productivity, II. The effects of task complexity and time pressure upon team productivity. The Ohio State University Research Foundation, Columbus, OH: Office of Naval Research, N6ori-17, T.O. III (NR 171-123).

Pieper, W. J., & Benson, P. G. Simulation design manual for the EC-II simulator. Lowry Air Force Base, CO: Air Force Human Resources Laboratory, May 1975. AFHRL-TR-75-14.

Pieper, W. J., Guard, N. R., Michael, W., & Kordek, R. Training developers decision dialogue for optimizing performance based training in machine ascendant MOS. Valencia, PA: Applied Science Associates, Inc., 1 March 1978.

Powers, T. E. Generic cognitive behaviors in technical job tasks. A presentation for Management Review on Maintenance Training and Performance Aids, David W. Taylor Ship R&D Center, Bethesda, MD: February 1977.

Prime Item Development Specification for a Trainer, Flight Simulator, Type No. A/F-37A-T55, Representative of the A-10 Aircraft. Specification No. SSPO-07878-4000A, Simulator System Program Office, Aeronautical Systems Division, Wright-Patterson Air Force Base, OH, 3 January 1977.

Prime Item Development Specification for Simulated Aircraft Maintenance Trainer (SAMT) System 2185. Specification No. 16PS028, Code Identification No. 81755, General Dynamics, June 1977.

Randolph AFB. Functional/performance requirements for maintenance procedure simulator for the EF-111A Tactical Jamming System (TJS). Functional/Performance Requirements TT-PR-78929, 29 September 1978.

RCA Service Company. Maintainability engineering. Cherry Hill, NJ: Rome Air Development Center, Research and Technology Division, Air Force Systems Command, Griffiss Air Force Base, NY, 9 February 1963. Contract AF30(602)-2057.

Reilly, R. E. Analysis of part-task trainers for U.S. Army helicopter maintenance. NTEC Contract No. N61339-76-C-0098. Alexandria, VA: Allen Corporation of America, December 1976 (draft).

Retterer, B. L., Griswold, G. H., & McLaughlin, R. L. The validation of a maintainability prediction technique for an airborne electronic system. Wright-Patterson AFB, OH: Behavioral Sciences Laboratory, Aerospace Medical Research Laboratories, Aerospace Medical Division, Air Force Systems Command, May 1965. AMRL-TR-65-42.

Rifkin, K. I., Pieper, W. J., Folley, J. D., Jr., & Valverde, H. H. Learner-centered instruction (LCI). Volume IV: The simulated maintenance task environment (SMTE): A job specific simulator. AFHRL-TR-68-14, AD-855 142. Wright-Patterson Air Force Base, OH: Air Force Human Resources Laboratory, Air Force Systems Command, May 1969.

Sakala, M. K. Effects of format structure, sex, and task familiarity on the comprehension of procedural instructions. Department of Industrial Engineering, North Carolina State University, Raleigh, NC: Naval Air Systems Command, June 1976. N68335-75-C-1129.

Schumacher, S. P., & Glasgow, Z. Handbook for designers of instructional systems. Valencia, PA: Applied Science Associates, Inc., 1 March 1974.

Schumacher, S. P., Swezey, R. W., Pearlstein, R. B., & Valverde, H. H. Guidelines for abstracting technical literature on instructional system development. AFHRL-TR-74-13, AD-777 757. Wright-Patterson Air Force Base, OH: Advanced Systems Division, Air Force Human Resources Laboratory, February 1974.

Serendipity Associates. PIMO status report 1. Section IV & V & Appendix A,May 1956.

Shriver, E. L., Fink, C. D., & Trexler, R. C. Forecast systems analysis and training methods for electronics maintenance training. Alexandria, VA: George Washington University, Human Resources Research Office, Training Methods Division, May 1964. HumRRO Research Report 13.

Shriver, E. L., & Hart, F. L. Study and proposal for the improvement of military technical information transfer methods. Final Report on Contract No. DAAD05-74-C-0783. Alexandria, VA: U.S. Army Human Engineering Laboratory, Aberdeen Proving Ground, MD, December 1975. AD-A023 409.

Siegel, A. I., & Federman, P. J. Considerations in the systematic development of simulators for organizational level maintenance training. Wayne, PA: Applied Psychological Services, Inc., 16 June 1977.

Siegel, A. I., & Federman, P. J. Notes and elaborations on "Considerations in the systematic development of simulators for organizational level maintenance training." Wayne, PA: Applied Psychological Services, Inc., 11 July 1977.

Smillie, R. J., Edsall, J. C., Ayoub, M. A., & Muller, W. G. Assessment and evaluation of job performance aids: A simulation approach. Contract N68335-75-C-1129, Naval Air Systems Command, Washington, DC: Main Report, Part II, April 1976.

Smillie, R. J. The assessment and evaluation of job performance aid formats using computer simulation. A thesis, North Carolina State University, Raleigh, NC: 1979.

Smith, B. J. Task analysis methods compared for application to training equipment development. Valencia, PA: Applied Science Associates, Inc., September 1965. Contract N61339-1218-2, Project 7568. AD 475 879.

Smith, E. A. Four systems for controlling multi-screen or team training presentations. AFHRL-TR-77-83, AD-A055 093. Lowry Air Force Base, CO: Technical Training Division, Air Force Human Resources Laboratory, December 1977.

Spangenberg, R. W. Tryout of a general purpose simulator in an Air National Guard training environment. AFHRL-TR-74-92, AD-A009 993. Lowry Air Force Base, CO: Technical Training Division, Air Force Human Resources Laboratory, December 1974.

Spangenberg, R. W. Selection of simulation as an instructional medium. From Proceedings - Volume IV. Future of simulators in skills training. First International Learning Technology Congress & Exposition on Applied Learning Technology, Society for Applied Learning Technology, July 21-23, 1976. P. 65.

Sugarman, R. C., Johnson, S. L., & Ring, W. F., II. B-1 systems approach to training. Final Report. Contract No. F33657-75-C-0021. Buffalo, NY: Calspan Corporation, July 1975. Calspan Report No. FE-5558-N-1. AD B007208.

Swanson, R. A. The relative effectiveness of training aids designed for use in mobile training detachments. Lackland Air Force Base, TX: Air Force Personnel & Training Research Center, AD No. 30963.

The American Institutes for Research. Functional requirements for driver training devices, Volume I. Final Report for Contract No. FH-11-7322. Pittsburgh, PA: prepared for U.S. Department of Transportation, National Highway Traffic Safety Administration, November 1974.

The American Institutes for Research. Functional requirements for driver training devices, Volume II--Appendix: Training events and functional requirements. Final Report for Contract No. FH-11-7322. Pittsburgh, PA: U.S. Department of Transportation, National Highway Traffic Safety Administration, November 1974.

Valverde, H. H. Maintenance training media--An annotated bibliography. Wright-Patterson Air Force Base, OH: Aerospace Medical Research Laboratories, Aerospace Medical Division, Air Force Systems Command, May 1968. AMRL-TR-67-151.

Van Cott, H. P., & Kinkade, R. G. (Eds.) Human engineering guide to equipment design (Revised Edition). Sponsored by Joint Army-Navy-Air Force Steering Committee. Washington, DC: American Institutes for Research, 1972.

Weingarten, J. L. The primary specification. Deputy for Engineering, Aeronautical Systems Division.

Wheaton, G. R., Fingerman, P. W., Rose, A. M., & Leonard, R. L., Jr. Evaluation of the effectiveness of training devices: Elaboration and application of the predictive model. Research Memorandum 76-16. U.S. Army Research Institute for the Behavioral and Social Sciences, July 1976. Contract No. DAHC 19-73-C-0049.

Wheaton, G. R., Rose, A. M., Fingerman, P. W., Korotkin, A. L., & Holding, D. H. Evaluation of the effectiveness of training devices: Literature review and preliminary model. Research Memorandum 76-6. U.S. Army Research Institute for the Behavioral and Social Sciences, April 1976.

Willis, M. P., & Peterson, R. O. Deriving training device implications from learning theory principles. Volume I: Guidelines for training device design, development and use. Port Washington, Long Island, NY: U.S. Naval Training Device Center, July 1961. Technical Report: NAVTRADEVCEN 784-1. AD 264 364.

Wilmot, H. L., Chubb, G. P., & Tabachnik, B. J. Project PIMO final report: PIMO technical data preparation guidelines. Serendipity, Inc., Space and Missile Systems Organization, Air Force Systems Command, Norton Air Force Base, CA.

Worthey, R. E. (Study Director) Air Force aircrew training devices. Master Plan. Final Report. Wright-Patterson Air Force Base, OH: Headquarters U.S. Air Force by Deputy for Development Planning, Aeronautical Systems Division, March 1978. AD A056940.

Wright Air Development Division. Uses of task analysis in deriving training and training equipment requirements. Wright-Patterson Air Force Base, OH: Author, December 1960. WADD Technical Report 60-593.
