
DOCUMENT RESUME

ED 094 752 IR 000 951

AUTHOR Seidel, Robert J.; Hunter, Harold G.
TITLE The Application of Theoretical Factors in Teaching Problem Solving by Programed Instruction. HumRRO-TR-68-4.
INSTITUTION Human Resources Research Organization, Alexandria, Va.
SPONS AGENCY Office of the Chief of Research and Development (Army), Washington, D.C.
REPORT NO HumRRO-TR-68-4
PUB DATE Apr 68
NOTE 75p.
EDRS PRICE MF-$0.75 HC-$4.20 PLUS POSTAGE
DESCRIPTORS Educational Research; *Instructional Technology; Military Training; Mnemonics; *Problem Solving; *Programed Instruction; Training
IDENTIFIERS Army

ABSTRACT

In continuing research into the technology of training, a study was undertaken to devise guidelines for applying programed instruction to training courses that involve the learning of principles and rules for use in problem solving. As a research vehicle, a portion of the material in the Army's Programing Specialist Course was programed to explore several different factors in using automated instruction to teach computer programing. Experimental versions of the course were administered to over 900 subjects in various experimental groups. Criterion and retention tests based on actual job problems were used to measure subjects' performance, along with in-training measures. Results in a series of prompting/confirmation variations indicated that giving subjects extensive stimulus support during training helps motivate them and improves scores during training, but hampers them in using what they have learned. Requiring subjects to fully write out rules during training hindered them in developing problem-solving skills applying these rules; however, using mnemonics during training aided subjects in retaining what they had learned. (Author)



Technical Report 68-4

The Application of Theoretical Factors in Teaching Problem Solving by Programed Instruction

by

Robert J. Seidel and Harold G. Hunter

HumRRO Division No. 1 (System Operations)

HumRRO
The George Washington University

HUMAN RESOURCES RESEARCH OFFICE


April 1968

Prepared for:

Office, Chief of Research and Development

Department of the Army

Contract DA 44-188-ARO-2

This document has been approved for public release and sale; its distribution is unlimited.


The Application of Theoretical Factors in Teaching Problem Solving by Programed Instruction

by

Robert J. Seidel and Harold G. Hunter


This document has been approved for public release and sale; its distribution is unlimited.


Prepared for:

Office, Chief of Research and Development
Department of the Army

Contract DA 44-188-ARO-2 (DA Proj 2J024701A712 01)

HumRRO Division No. 1 (System Operations)
Alexandria, Virginia

The George Washington University
HUMAN RESOURCES RESEARCH OFFICE


April 1968

Technical Report 68-4
Work Unit METHOD

Sub-Unit II


The Human Resources Research Office is a nongovernmental agency of The George Washington University. The research reported in this Technical Report was conducted under contract with the Department of the Army (DA 44-188-ARO-2). HumRRO's mission for the Department of the Army is to conduct research in the fields of training, motivation, and leadership.

The findings in this report are not to be construed as an official Department of the Army position, unless so designated by other authorized documents.

Published
April 1968
by
The George Washington University
HUMAN RESOURCES RESEARCH OFFICE
300 North Washington Street
Alexandria, Virginia 22314

Distributed under the authority of the Chief of Research and Development
Department of the Army
Washington, D.C. 20310


FOREWORD

The research described in this report was performed by the Human Resources Research Office under Work Unit METHOD, Research for Programed Instruction in Military Training. The objectives of Sub-Unit II of that Work Unit were to develop guidelines for applying programed instruction to military tasks requiring the learning of principles and rules, especially those applying to computer programming.

The vehicle for the research was the development of an automated instructional package for teaching basic computer programming instructions in the FIELDATA symbolic coding language. The final version of the course is entitled Basic Computer Programming: A Self-Instructional Course (with Answer Booklet), June 1967.

The research was conducted at HumRRO Division No. 1 (System Operations). The present Director of Research of the Division is Dr. J. Daniel Lyons; Dr. Arthur J. Hoehn was Director of Research when METHOD II was initiated; Dr. Robert J. Seidel was Work Unit Leader.

HumRRO staff members who contributed to the research include Dr. Iris C. Rotberg, in the early stages; Dr. Eugene P. MacCaslin, task analysis; Dr. Donald Reynolds, Dr. Harold Wagner, and Dr. Richard D. Behringer, data collection; Dr. Harold G. Hunter, course preparation and report preparation; and SP4 Wayne S. Carpenter, course preparation.

The superintendents of the school systems in the metropolitan Washington, D.C., area who provided facilities and students for the administration of the experimental programed instructional course were Dr. J.C. Albohrn, Alexandria, Virginia; Dr. H. Wilson, Associate Superintendent for Instruction, Arlington County, Virginia; Dr. C.P. Hansen, Washington, D.C.; Mr. E.C. Funderburk, Fairfax County, Virginia; Dr. H. Elseroad, Montgomery County, Maryland; and Mr. W.S. Schmidt, Prince Georges County, Maryland.

HumRRO research for the Department of the Army is conducted under Contract DA 44-188-ARO-2 and Army Project 2J024701A712 01, Training, Motivation, Leadership Research.

Meredith P. Crawford
Director

Human Resources Research Office



SUMMARY AND CONCLUSIONS

Military Problem

The military problem to which this study was addressed is primarily that of the need for research in the technology of training, with development of a potentially useful course serving both as the research vehicle and as a secondary objective.

In recognition of the need for more efficient training of military personnel to meet the increasing demands for more complex knowledge and skills, research and development on automated instruction is continuing on a widespread scale. While solution of training problems for a particular course represents an important activity, contributions that have broader application can be made by establishing general guidelines for developing automated instruction for a class of instructional content. Toward this end, the principal objective of METHOD II research has been to develop general guidelines for the application of programed instruction to tasks requiring the use of principles and rules in problem solving.

Increasing complexities in military systems have created a need for faster and more efficient methods of information processing and a consequent increase in the importance of training for computer programmers, whose work involves the application of principles and rules. The activity chosen for the research, therefore, was the development of an automated instructional package for a portion of the Automatic Data Processing Specialist (ADPS) programming course (MOS 745.1). The instructional content was concerned with the basic computer programming instructions and FIELDATA symbolic coding language, pertinent to the MOBIDIC computer. Emergence of a usable military course of instruction would be a desirable by-product of the research.

Research Problem

Although programed instruction is beginning to find application in full-length operational courses, much of the research on the technique has been restricted to short-course laboratory studies, using material requiring simple memorization. Most operational training, on the other hand, requires learning complex material, "connecting" a set of stimuli (concepts) and symbolic "mediating" responses, in contrast to "connecting" a single stimulus and a single overt response. Further, for training, a distinction must be made between learning per se (i.e., developing a base of knowledge) and the ability to use what was learned to solve problems.

To help bridge the gap between laboratory research and training, the METHOD II research was planned as a large-scale experimental study to explore a variety of factors in using automated instruction to teach fundamentals of computer programming. The study provided "real" training content, an environment more like a school than a laboratory, and complex material to be learned.

Variables studied experimentally in presenting programed instruction were (a) degrees of prompting (providing correct answers to the student before he responds), (b) degrees of confirmation (providing correct answers for the student after he responds), (c) effects of verbalization (requiring students to write or name rules), and (d) effects of variety of practice.

Since prompting and confirmation affect the smallest unit of a program, the individual instruction frame, study of these variables deals with the fundamental aspects of preparing programed instruction. Verbalization, which has been used in other research to indicate whether the student understands the material, was used in this study to explore the relationship between the student's ability to verbalize rules and principles and his ability to actually use them in problem-solving performance on realistic materials.



Approach

The research involved four major types of activities:

(1) Literature review. The literature on programed instruction and learning principles was reviewed to select concepts and techniques for experimental study.

(2) Program development. After attending military and industrial computer programmer classes, researchers analyzed the programmer's work, with the aid of operational military personnel. Portions of the ADPS programming course were selected for experimentation, and training content for an experimental course was developed jointly by military instructors and research personnel.

(3) Experimentation. Materials for the experimental course were divided into five parts of increasing difficulty. Training content was translated into programed instruction format and then prepared to reflect varying degrees of the experimental variables. Training experiments, factorial in design, were then conducted: a relatively small experiment using the first two parts of the course, and a larger-scale effort using all five parts of the course.

(4) Course revision. The principal research findings were applied in developing a revised version of the course in programed instruction format.

The experimentation phase of the research may be summarized as follows:

Experiment 1A. The prompting/confirmation and the verbalization variables were studied during a three-day period of training and testing, using 60 high school students. The first two portions of the five-part course were the subject matter. As the prompting/confirmation training condition, one-half the subjects wrote answers after being given the needed information, and the other half wrote answers before receiving the information. Alternative versions of verbalization were used; during training, one group of subjects was required to write the entire rule applicable to the computer programming problems they were solving, a second group wrote the appropriate mnemonic (name) for the rule, and a third group neither wrote nor named the rule during their instructional work.
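For illustration only, the sketch below (in Python, not drawn from the report) makes the implied 2 x 3 factorial structure concrete: two prompting/confirmation conditions crossed with three verbalization groups. The group labels come from the report; the balanced random assignment procedure, the function name, and the seed are assumptions.

    import itertools
    import random

    PROMPT_CONF = ["prompting", "confirmation"]             # answer given before vs. after the response
    VERBALIZATION = ["write_rule", "name_rule", "neither"]  # the Rules, Naming, and CP-Only groups

    def assign_subjects(subject_ids, seed=0):
        """Shuffle the subjects, then deal them into the six treatment cells in turn."""
        cells = list(itertools.product(PROMPT_CONF, VERBALIZATION))  # 2 x 3 = 6 cells
        ids = list(subject_ids)
        random.Random(seed).shuffle(ids)
        return {sid: cells[i % len(cells)] for i, sid in enumerate(ids)}

    # Example: spread 60 students across the six cells (10 per cell).
    groups = assign_subjects(range(1, 61))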

Experiment 1B. This experiment was the first portion of the large-scale study. It initially involved 805 volunteer subjects from local high schools, about half of whom remained in the program at the close of the 1B portion of the training. It covered Parts I and II of the course, and the materials used were identical to those in Experiment 1A. The same prompting/confirmation conditions also applied; study of the use of mnemonics was continued with groups that named the rule or did not name the rule in training, but the treatment requiring writing of the entire rule was omitted. This experiment and Experiment 2, together, covered 10 weeks, with instruction given once a week.

Experiment 2. This experiment was the second portion of the large-scale study, following Experiment 1B in time and covering the more complex content of the course (Parts III, IV, and V). The students who had completed Experiment 1B continued into Experiment 2; a total of 345 completed the whole five-part program. The treatments included expanded variations of the prompting/confirmation conditions. They also included a variety-of-practice variable; during learning one group of subjects practiced solving problems in a single context while a second group was given three different variations of problems.

For all of the experimentation, the performance tests used as final criterion measures (and also as retention tests given at periods of one day to four weeks after training) represented the kinds of computer programming problems typically encountered on the job by programmers with a comparable amount and kind of training. Work on similar practice problems within the course provided error rate data on learning during training.

Following the experimental administrations, the course was revised to incorporate findings from the research and apply experience gained. This revision included simplifying the content and improving the format. The course was then administered to seven college freshmen and six high school seniors, all of whom completed it.

Results

Experimental Effects

(1) The results of Experiment 1A showed that:
(a) Writing out rules of computer programming during training hindered students on the performance tests, despite the fact that they learned to write the rules better than the other students.

(b) Prompting was associated with significantly better learning than was confirmation, as measured by error rates during training. However, a reverse tendency appeared on the application of learning to problem solving in the criterion performance test.

(c) Practice in naming the rules of computer programming (rather than writing them out) appeared to aid criterion performance.

(2) Experiment 1B added to the reliability and generalizability of Experiment 1A results by duplicating them under different administrative procedures.

(3) The results of Experiment 2 showed that:
(a) The more information (either prompting or confirmation) the students received during training, the more likely they were to complete the course.
(b) For those who completed the course, more prompting or confirmation reduced error during learning, but increased error on the criterion tests.
(c) Students who dropped out had performed much more poorly than they had thought they would. Students who finished the course had expected to perform only slightly better than they did.
(d) Variety in practice problems led to better criterion scores than did no variety.
(e) Results on the final retention test four weeks later yielded the same pattern as the criterion tests.

Course Effectiveness

(1) Taking into consideration the differences in administration conditions between the Army school situation and an experimental setting aimed primarily toward comparing techniques rather than maximizing scores, the course seemed to be effective in certain of the experimental conditions. Specifically, the average scores on criterion tests in the first two parts of the course were approximately 80-85% correct. Under the optimal experimental conditions in the most complex portion of the course, the scores dropped to approximately 50% correct. Average time for completion of the full five-part course (Experiments 1B and 2) was about 27 hours.

(2) In the pilot administration of the final revision of the course, four of the seven college freshmen given the course had scores of 90% or better on the final criterion test and the lowest score was in the high 70s; median completion time was 26 hours. The high school students' scores averaged slightly lower, in the 80s, with a median completion time of 31 hours. The two high school students who obtained the highest scores were in the bottom third of their senior class.

Conclusions

(1) Results suggest the following guidelines for preparing programed instruction for learning rules and principles:
(a) A large amount of prompting and confirmation is important as a motivational device to keep students interested in continuing training. However, a smaller amount of prompting and confirmation, while producing more errors during learning and lowering motivation, appears to foster better ability to use the results of training. The advantage of less prompting/confirmation for ability to use learning seems consistent and long-lasting.

(b) Requiring students to write out rules during their training hinders the development of problem-solving skills using these rules.

(c) Variety in practice problems facilitates the learning of problem-solving skills.
(d) Early training using mnemonics (naming the rules) aids later learning and retention, particularly as material becomes more complex.
(2) Because of the nature of the course content, it would appear to be useful for either officer or enlisted personnel who need some introduction to computer programming, whether for administrative or supervisory activities, or for some allied operational activity.



CONTENTS

Chapter
1  Rationale
     Military Problem
     Scientific Background for the Research
       Programed Instruction: Value as Theory and Technique
       Effects of Errors on Learning
       Range of Instructional Content
       Knowing and Using: Levels of Learning
     Research Objectives
       Background for the Experimental Hypotheses
       Experimental Treatments
2  Research Procedure
     Approach
     Overall Description of the Programed Course
     Experiment 1A
       Design
       Subjects
       Training Materials
       Procedures
     Experiments 1B and 2
       Design
       Subjects
       Training Materials
       Procedures
3  Results
     Experiment 1A
     Experiment 1B
     Experiment 2
     Relationships Between Experiments 1B and 2
     Factors Related to Dropout
     Correlations Between Experiments 1B and 2
Discussion
     Comparison of Experiments 1A and 1B
       Evidence of Learning
       Treatment Effects
       Summary
     Experiment 2
       Stimulus Support
       Variety
       Mnemonics Training
       Transfer Effects of General vs. Task-Specific Factors
     Practical Potential of the Programed Course
       Relationship of Content to the Parent Army Course
       Administration, Motivation, and Population Factors
       Instructional Effectiveness
     Final Revision of the Course
       Description
       Evaluation
       Suggested Utilization

Literature Cited

Appendices
A  Samples of Course Content and Problems From Experiment 2
B  Level-of-Aspiration Scale
C  Samples of Revised Course Format

Figures
1   Approach Used in the METHOD II Experiments
2   Design for Experiment 1A
3   Design for Experiment 2
4   Relations Between Learning Error and Criterion Test Error Under Prompting and Confirmation Conditions: Part III
5   Effects of Prompting and Confirmation on Criterion Test Performance: Part III
6   Relation Between Learning Error and Criterion Test Error Under Variety-of-Practice Conditions: Part III
7   Effects of Degrees of Confirmation on Criterion Test Performance: Experiment 2
8   Effects of Degrees of Prompting on Criterion Test Performance: Experiment 2
9   Effects of Similarity of Stimulus Support From Experiment 1B (0 or 100) to Experiment 2 (0 or 100) on Criterion Test Performance: Experiment 2
10  Relation Between Stimulus Support and Proportion of Dropouts: Part III

Tables
1   Mean Criterion and Retention Test Scores by Treatments: Experiment 1A
2   Mean Learning Times for Verbalization Variable: Experiment 1A
3   Mean Time-Weighted Criterion and Retention Test Scores for Prompting/Confirmation Variable: Experiment 1A
4   Mean Time-Weighted Criterion and Retention Test Scores for Verbalization Variable: Experiment 1A
5   Mean Criterion and Retention Test Scores of Naming/No-Naming Groups: Experiment 1B
6   Mean Criterion and Retention Test Scores of Prompting/Confirmation Groups: Experiment 1B
7   Adjusted Mean Scores of Naming/No-Naming and Prompting/Confirmation Groups on Criterion Test: Experiment 1B
8   Group Means and Medians on Final Retention Test: Experiment 2
9   Analysis of Variance of the Effects of Confirmation (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
10  Duncan Range Groupings of the Effects of Confirmation (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V
11  Analysis of Variance of the Effects of Prompting (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
12  Duncan Range Groupings of the Effects of Prompting (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V
13  Analysis of Variance of the Effects of Prompting (Experiment 2) by Prompting/Confirmation (Experiment 1B) on Criterion Tests for Parts III, IV, and V
14  Analysis of Variance of the Effects of Variety/No-Variety (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
15  Duncan Range Groupings of the Effects of Variety/No-Variety (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V
16  Correlations by Intelligence Level Between Learning and Criterion Scores of Finishers and Nonfinishers: Experiment 1B
17  Correlations Between Expected Success, Intelligence, and Criterion Performance of Finishers, as a Function of Course Complexity: Experiment 2
18  Retention Ratios for Prompting Treatments: Experiment 2


The Application of Theoretical Factors in Teaching Problem Solving by Programed Instruction


Chapter 1

RATIONALE

MILITARY PROBLEM

As present and future defense systems continue to increase in size and complexity, greater demands for information and skills are being placed on military personnel. Besides the requirements of learning new weapons systems, changes in current equipment and tactics impose broad, continuing requirements for adaptation of skills already learned. Taken together, these conditions demand the continuing development of methods for constructing shorter, more effective training programs.

Much necessary training research has been directed toward solving problems in specific courses of instruction. Less research has been aimed at establishing a body of knowledge, sets of principles, and effective guidelines for training development in general, or for specific categories of training content. Besides the need for more effective individual training programs, then, there is a need to develop a body of training technology. Specific findings need to be brought together in a systematic way, so that they may be more readily applied to future Army training programs.

A category of training content that has both great importance and broad application is the learning of principles and rules for use in problem solving. It was therefore decided that the METHOD II research would deal with guidelines for applying programed instruction (PI) to Army training courses that involve the learning of principles and rules.

The use of principles and rules is typified in the task of programming for automatic data processing systems. The programmer's job is problem solution; that is, he analyzes information given and prepares a solution to the problem in a form acceptable to a computer. For the Army, training computer programmers is a matter of considerable urgency because the increasing complexity of weapons and operations systems requires highly effective information processing possible only with computers. Therefore, the vehicle chosen for METHOD II research was an automated instructional package for selected portions of the Automatic Data Processing Specialist (ADPS) (FIELDATA) programming course.

From work with such materials, it was hoped that the experimental findings of METHOD II would provide a beginning toward the establishment of guidelines for programing training content that consists of rules and principles. A secondary research goal would be development, as a research by-product, of course material that could be used for military instruction.

SCIENTIFIC BACKGROUND FOR THE RESEARCH

Several facets of theory and exploration which led to the current set of experiments are summarized in this section:

(1) The area of programed instruction is examined from theoretical aspects, as well as for its practical value as a controlled environmental setting in which to do training research.

(2) The role of errors in training and training research is discussed.



(3) The range, difficulty, and variety of instructional content are considered in terms of concept formation processes, as contrasted with simple learning such as rote memorization.

(4) The distinction between learning concepts and using them in solving problems is given emphasis.

These factors are considered against a backdrop of the various kinds of content that have been used in the laboratory setting. Traditional psychological research (generally using isolated bits of verbal material) is contrasted with the kinds of material that must be dealt with in the everyday teaching and training situation, that is, connected discourse, hierarchically ordered and related conceptual material. Research pertinent to the specific studies undertaken in Work Unit METHOD is discussed in order to bring into focus the particular experimental hypotheses to be investigated.

Programed Instruction: Value as Theory and Technique

Because of its potential for applying Skinnerian experimental techniques to the area of education, programed instruction has been viewed as a panacea for problems of teaching; it is not that. On the other hand, programed instruction has also been termed merely another teaching aid; it is not that either. It is a medium of instruction that can readily provide for the application of psychological principles to teaching. It can also provide a testing ground for uncovering laws of learning, using as a laboratory the highly controlled educational environment that PI makes possible.

Ironically, the advances in programed instruction to date have come not from applications of theoretical developments but rather from a technique incidental to programed instruction per se, that of task analysis, which has existed in the field of military and industrial psychology for many years. With this tool a curriculum developer who is planning instruction is directed to a focus on the behavioral end-products of training (or education). There is no doubt that this analytic tool has been helpful in improving curricula through the development of programed instruction; however, it provides marked improvement in conventional teaching as well. Curriculum improvement deriving from this technique is best considered an engineering improvement, to distinguish it from advances that might be due to the development of learning theory.

The advent of programed instruction has also resulted in a focus upon the individual student. This represents an administrative breakthrough rather than a development from the psychology of individual differences. For years teachers have been aware of the importance of a student's individual characteristics in instruction, but crowded classrooms and the need to serve the "average" student have interfered with teachers' attempts to take advantage of individual capacities. The teaching machine and the programed booklet have provided a distinct step forward in this regard.

Educational research literature is rife with studies showing no differences between the so-called "conventional" and the programed instructional technique (cf. Schramm, 1). Usually, the data indicate that one can teach the same amount in less time using programed instruction than using conventional instruction, or else the development of proper behavioral objectives for a course results in improvement for both programed instruction and conventional teaching.

Existing courses in programed instructional format represent constellations of learning factors such as frequency of reinforcement, amount of information per presentation unit, and activity of the student. No PI product truly represents a particular learning theory. The intent in the present research has been to try to examine some of the training factors (e.g., amount of information given prior to requiring a response) without attempting to label the program according to whether it is related to one or another or some combination of PI viewpoints.

To date, simple devices (the programed booklet and small teaching machines) have been the primary vehicles for PI. Such devices, in turn, have been quite adaptable to the techniques of operant conditioning, learning a response by being rewarded as a "consequence" of the response. These techniques (developed mostly with rats and pigeons) for increasing the likelihood of a response have become more or less the modus operandi of the curriculum developer in the field of programed instruction. They include the training procedure of successive approximation (with minimal, if any, errors), particular reinforcement schedules, immediate feedback, and active participation by the student.

Cook and associates (e.g., 2) and Angell and Lumsdaine (3), and others as well, have found that prompting (giving the student specific information before asking him to respond) is significantly more effective than confirmation (giving him correct information after he responds) for serial learning. Gagné and Brown (4), in a study dealing with conceptual material, compared several combinations of prompting and confirmation (although they did not use those terms) in the learning of principles for generating number series. Their guided discovery (confirmation) group, medium difficulty condition, appeared to be most effective, while the rule and example (prompting) condition, low difficulty level, was least effective.

Thus, the results obtained by Angell and Lumsdaine suggest that the subjects should be given substantial information before being required to respond, while the findings of Gagné and Brown suggest that subjects should be required to discover the answers themselves. The precise factor that caused these experimental differences cannot be determined because of the differences in the research paradigms used and because of many other factors that were not comparable across the experiments or between conditions within an experiment.

Effects of Errors on Learning

The use of operant conditioning techniques, together with an insistence on a "low error rate" during learning, have combined to produce PI materials characterized by low difficulty. A low error rate requires easy-to-understand material, and the operant maxim of high student participation encourages the breaking of the material into small (hence, low error rate) segments, with a student response after each segment.

Whether a low rate of error is desirable during learning may be questioned on the basis of data developed in studies dealing with level of aspiration. It has been found, for instance, that an individual performs to his maximum when the learning task is neither too easy (low error rate) nor too hard (high error rate). If the task is too easy the student refuses to set a level of aspiration, or personal goal, and will not perform the task, through lack of interest (inattentiveness) or lack of motivation. On the other hand, if the task is too difficult the student will refuse to try, because of frustration (Lewin, 5).

The emphasis on low error rates is based, for the most part, on studies of lower organisms (rats and pigeons) and, thereby, of simple types of learning. Human learning, particularly in education, tends to be much more complex, primarily in terms of concept formation and problem solving.

The focus of the present study with respect to error rate during learning has been to manipulate certain stimulus conditions that might produce greater or smaller error rate in a conceptual, hierarchically ordered setting. It was expected that, unlike the effects obtained in rather simple discrimination learning, those conditions leading to greater difficulty during training (e.g., few hints) might well result in better overall criterion performance and retention.

Range of Instructional Content

In simple operant learning, the organism is taught to discriminate, or respond differently to different stimuli; for example, to perform in one way in the presence of A and a different way in the presence of B. In most types of human learning, however, the student not only must discriminate A from B, but must also learn that A and B each are to be treated as belonging to classes of As and Bs. For example, a child learns that the fork in his hand is but an example of forks in general, all of which have common properties; human beings tend to make such inductive leaps even if not told to.

Even in supposed rote learning, subjects structure the stimuli into groups, as shown by results of a number of studies in paired-associate learning (Battig, 6). Battig has proposed an interference-facilitation theory of learning: that high intra-task interference promotes inter-task facilitation. This view is consistent with the motivational ideas expressed above. That is, the dual assumption that error rate should be minimal during the teaching-learning process and that human learning is analogous to a simple discrimination problem is open to question, on the grounds of lowered motivation and of oversimplification.¹

¹ In fact, research with connected discourse has already produced data which raise a question as to the universal applicability of minimal error rate (Basic Systems, Inc., 7; Seidel and Rotberg, 8).

In general terms, most human learning requires that the student develop transformational behavior with respect to given stimuli. These transformations can be called mediating responses or hypotheses, which exist, as it were, in the student's "head" and allow him to apply his learning from one situation to another.

The wholesale adoption of operant conditioning principles can be viewed in another way. Skinner (9) has criticized the use in textbooks of materials he termed irrelevant, materials such as analogies and examples, which might confuse the student. Such materials are often included to challenge the student's thinking or to make the topic more interesting; similarly, an instructor, in translating a book and other course materials into analogies and examples, tries to make the student "understand" rather than memorize concepts. What is being accomplished in the teaching-learning process is the establishment of connections between a delimited set of stimulus elements and a delimited set of response elements. Instead of covering only narrowly defined "relevant" material, the instructor or textbook writer who makes use of materials such as analogies may teach a much more complex set of relations.

Three classes of stimulus elements can be considered: irrelevant, correct, and omitted. The same three classes exist for the response, but there is also an additional class, the incorrect response. The outcome of training experience may be viewed as a set of incorrect stimulus-response (S-R) tendencies competing with correct ones.
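As one concrete rendering of this taxonomy (a sketch with assumed names, not drawn from the report), the outcome of training can be tallied as a set of S-R connections, with incorrect response tendencies counted as competitors of the correct ones:

    from collections import Counter

    STIMULUS_CLASSES = {"irrelevant", "correct", "omitted"}
    RESPONSE_CLASSES = {"irrelevant", "correct", "omitted", "incorrect"}

    def competition_summary(connections):
        """connections: (stimulus_class, response_class) pairs established during training."""
        tally = Counter(connections)
        correct = tally[("correct", "correct")]
        competing = sum(n for (_, r), n in tally.items() if r == "incorrect")
        return {"correct": correct, "competing_incorrect": competing}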

One prevalent approach to programed instruction limits content to only those stimuli that are relevant, and tries to limit responses, as much as is feasible, only to those that are correct. Although this means that no competing response tendencies are established, the range of S-R coverage is considerably narrowed. In other words, a limited sample of appropriate elements becomes connected, but the coverage is incomplete in terms of the total stimulus and response populations. The problem of defining what is "relevant" in instruction, to develop the appropriate concepts and to allow the person to discriminate "correct" from "incorrect," is a crucial theoretical and practical problem for programed instruction, especially for that dealing with problem-solving behavior.

Once one considers a broad range of instructional content, and departs from operant conditioning concepts, new research questions unfold. One such question derives from Harlow's error factor theory (10). In learning set research, Harlow (11) inferred that the increased conceptual proficiency shown by subjects stemmed from eliminating erroneous hypotheses during training. His conclusion, and the essence of his theory, suggests that encouraging certain kinds of errors (i.e., incorrect S-Rs) during learning could result in more effective elimination of erroneous hypotheses.

Thus, while concepts are being formed (as is the case in most education), variety of content probably facilitates the elimination of erroneous hypotheses during training. It also allows the individual student to abstract the appropriate mediating skills and responses more effectively than he could in a homogeneous context. The individual can then use these mediating skills and responses to generalize to other stimulus contexts after training.

Knowing and Using: Levels of Learning

The distinction between rote and problem-solving learning is made as follows. Rote learning involves connecting specific stimuli to specific responses during training, and the test of success in training is simply whether the connections have been made. Problem-solving learning, on the other hand, involves training the student to abstract a common (identity) response to a class of stimuli; while a test of success in such training consists of knowing (being able to name, state, or describe a concept), the main test rests in being able to use the concept effectively in solving problems (cf. Kendler, 12, and Gagné, 13).

A more detailed analysis of problem-solving learning heightens the difference between it and rote learning. For problem-solving learning, a distinction is usefully made between (a) responding during training, (b) abstracting concepts from responses, (c) knowing a concept, and (d) being able to use the concept.

Making this distinction is not new. For years, Gestalt psychologists have delivered polemics (and included some demonstrations of principle learning) on the distinction between rote and conceptual learning. The thrust of Katona's work on learning number series is another example. However, only recently has a study by Smith, Jones, and Thomas (14) clearly separated the two types of responding. Their experiment directly compared rote and conceptual learning where the number of stimuli per response was varied from one to four. (Rote learning involved random grouping of stimuli, whereas the conceptual condition included a meaningful link among stimuli within each grouping.) Increasing the number of stimuli per response improved concept learning but not rote learning, thus showing two distinct learning processes.

Added to the identity response in most human learning are problem-solving responses; that is, the identity response is used in developing a strategy to put concepts together in order to perform some task. Will the techniques for teaching concept formation per se (identity responding) be the same as those for teaching problem solving? Gagné (13, p. 312) suggests that they may indeed be different forms (or at least levels) of learning. Concept learning consists of "establishment of mediating response to stimuli which differ from each other physically ('classifying')." Problem solving, he feels, is "the establishment of a process which 'combines' two or more previously learned rules in a 'higher-order rule'." A study by Gagné and Smith (15) suggests that the verbalization of principles takes on importance as the problem becomes more complex.



The distinction between concepts and problem solving can be illustrated in electronics troubleshooting. The student must understand the concept of a schematic diagram which represents the wires, connectors, and other parts. He must know the color coding system for resistors and the system of naming pins on electronic vacuum tubes, as well as how one reads a multimeter. Superimposed on all of these is another requirement: the student must determine the likelihood that following a given path of potential defects will be successful in troubleshooting the overall system. The effects of each factor may be taught. So might the mechanical responses or a total task in isolation. Given the learning and isolated practice, the student may be asked to actually troubleshoot a malfunctioning piece of equipment; this activity comprises applying the complex of concepts to solve a problem.

Viewing the troubleshooting illustration from a theoretical point of view provides a more technical analysis. The student might learn (from making certain kinds of controlled errors during training) to recognize the effects of various kinds of errors in the system, and to abstract from his training experience an understanding (conceptual set of responses) for the task. Having learned the proper identity responses (concepts), the student must put these together into a strategy to perform the task of troubleshooting successfully (the problem-solving aspect of the process). Here, it seems, is a meaningful distinction between understanding (concepts) of a class of stimuli and their use (problem solving) in a series of acts required to turn out a useful solution.¹

RESEARCH OBJECTIVES

Background for the Experimental Hypotheses

Unfortunately, much of the research that has been done concerning the effects of learning factors in programed instruction has been conducted with paired associates or serial learning. Little has been done where experimental treatments have been manipulated for hierarchically organized conceptual discourse.² Yet this type of discourse, meaningful course content, is precisely the medium for most, if not all, programed instructional applications. Paired-associate experiments generally require the subject to make a series of rote responses, and his learning is measured by his ability to give back these rote responses. Conceptual learning, however, requires the subject to abstract mediating responses, and learning is measured by his ability to apply these mediating skills to a variety of different situations.

As indicated earlier, the present study was undertaken to examine some of the psychological factors in programed instruction. The review of previous research led to selection of the following factors for exploration:

First, the value of a rule-giving or naming type of verbalization response was investigated. This factor deals with the distinction between rote learning and concept formation, and was intended to explore the efficacy of repeating or naming concepts in training, for the purpose of applying learning to problem-solving situations. Many current programed instruction programs are parallel to paired-associate learning situations in that they seem to emphasize reproducing material presented. Yet the program developers seem to be interested in teaching concepts (mediating skills). Does verbalizing a rule aid or hinder learning to use the rule in problem solving?

¹ This view of such a teaching problem has recently been implicitly expressed by Cronbach (16) in an excellent review of discovery experiments.

² A recent study by Hickey and Newton (17) is a welcomed attempt. Also, for an excellent discussion of problems in doing PI research on conceptual material, see Traub (18).



Second, the principle of minimal error rate as applied to learning complex, meaningful materials was questioned. Level of aspiration studies have shown that a certain amount of difficulty (e.g., error) during learning will lead to maximum motivation (and, implicitly, high scores on a criterion test).

Third, the importance of varieties of examples in such human learning was examined to see whether this principle generalized from simpler concept (learning set) studies, in which variety of examples has been found to foster concept formation. It would be expected to aid subjects to generalize from training content to operational application.

Fourth, attention was given to another pair of factors of interest in programed instruction research, "prompting" (providing the student specific information prior to asking him to respond) and "confirmation" (giving him correct information after he responds). For a meaningful comparison of prompting and confirmation, the sole element manipulated should be the amount of information given to the subject prior to requiring him to respond; whatever is not given to him then should be given after he responds, to make the total information each subject receives comparable for the confirmation and prompting conditions.

Experimental Treatments

In summary, the background for the METHOD II research rested on the need to study, using meaningful course content, those factors that seem most relevant to learning how to use rules and principles. The first experiment (1A) examined the influence of prompting vs. confirmation, and of alternate ways of verbalizing rules. The results of this experiment were taken into consideration in further exploration of the same variables in the second experiment (1B). In the third experiment (2) other aspects of these variables were explored, and, through work with a variety of practice problems, the development of connections between conceptual stimuli and mediating responses was considered.

With the training content (writing a computer program) put in the form of programed instruction, experiments were conducted with the following factors:

(1) Amount of information to be given to trainees prior to requiring responses (degrees of prompting vs. degrees of confirmation) (Experiments 1A, 1B, and 2). This factor was judged especially valuable to study since its influence is felt in the smallest unit of a program, namely, the frame itself; a sketch of this frame-level manipulation follows this list.

(2) Value of written verbalization of rules during training (Experiments 1A and 1B). Verbalization of rules has been generally used in construction of programs to indicate understanding by the student of the material in the program. In this study, interest lay in the relationship between the student's ability to verbalize rules and principles and his use of them in problem-solving performance.

(3) Value of variety of context for practice problems during the training on problem solving (Experiment 2). The complexity in most learning demanded consideration of broad connections between conceptual stimuli and mediating responses. Variety of verbal contexts for the practical problems given during training provides the setting for establishing broad connections. Minimal variation of problem context should encourage only simple connections.

(4) The effects on a problem-solving criterion of different amounts oferror during training, examined throughout the series of experiments.
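A minimal sketch of the frame-level manipulation in factor (1) follows, assuming a simple text frame and a 0-100 "stimulus support" percentage; the function and field names are invented for illustration and are not taken from the experimental materials. The one constraint carried over from the text is that whatever part of the correct answer is withheld before the response is supplied afterward, so total information is comparable across conditions.

    def present_frame(stem, answer, support_before=100, respond=input):
        """Present one frame; support_before is the percent of the answer shown before the response."""
        cut = len(answer) * support_before // 100
        prompt_part, confirm_part = answer[:cut], answer[cut:]

        print(stem)
        if prompt_part:
            print("Hint:", prompt_part)       # prompting: information given before the response
        student_answer = respond("Your answer: ")
        if confirm_part:
            print("Correct answer:", answer)  # confirmation: withheld information given afterward
        return student_answer.strip().upper() == answer.upper()

    # Confirmation-only condition for an invented frame:
    # present_frame("Write the command that adds the value in PAY to the accumulator.",
    #               "ADD PAY", support_before=0)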

As the final step in the research effort, an attempt was made to incorporate major findings from the experimental work into a revised self-instructional course in basic computer programming. The desired by-product was a usable military course of instruction.



Chapter 2

RESEARCH PROCEDURE

APPROACH

The major research activities (a literature review, program development, experimentation, and course revision) may be summarized as follows:

(1) The scientific literature on learning principles and programed instruction was reviewed as described in the preceding chapter, to provide a basis for selecting concepts and techniques for experimental study. Selection was based on importance for learning in a military training context, feasibility of manipulation for study, and potential for the advancement of training technology.

(2) As the first step in developing the instructional program, the researchers attended computer programmer classes at the U.S. Army Signal School, Fort Monmouth, New Jersey, and at IBM. They then interviewed supervisory and computer instructor personnel, systems analysts, and computer programmers at Fort Huachuca, Arizona. The nature of the tasks performed by the programmer was analyzed with the assistance of operational programmers. Portions of the Automatic Data Processing Specialist (MOS 745.1)¹ programming course were selected for experimentation. Final content was developed jointly by instructors from the Signal School and research personnel.

(3) The materials selected were divided into five parts of increasing complexity. Training content was translated into programed instruction format and then prepared to reflect varying degrees of the experimental variables. Three experiments were conducted; their major characteristics are summarized below and in Figure 1. In all three experiments, the final criterion and retention measures were performance tests representing the types of problems typically encountered on the job by computer programmers with a comparable amount and kind of training. The experiments were:

Experiment 1A. The prompting/confirmation and verbalization variables were first studied during a three-day training and test period in 1962, using 60 students from a local high school as subjects. Parts I and II of the five-part course were used as subject matter.

Experiment 1B. This experiment was the first part of a larger study conducted between September 1964 and January 1965, and initially involved 805 student volunteers from local high schools. The variables were the same prompting/confirmation conditions and part of the verbalization condition used in Experiment 1A, and the materials were also the same. Instruction was given once a week during the first part of a 10-week time frame which included Experiment 2.

Experiment 2. This portion of the larger study followed Experiment 1B in time and was applied in weekly instruction to the more complex course content of Parts III, IV, and V. Students who had completed the instruction in Experiment 1B were the subjects. The prompting/confirmation variable was expanded and a variety-of-practice variable was introduced.

¹ Under the current MOS structure, the ADPS Programing Specialist is MOS 741.



Figure 1. Approach Used in the METHOD II Experiments

Experiment 1A (N = 60; instructional materials: Parts I and II; daily sessions over 3 days)
  Factors studied:
    Verbalization of rules and principles: (a) writing the rule (Rules Group); (b) naming the rule (Naming Group); (c) neither (CP-Only Group)
    Prompting/Confirmation: (a) answer before response (Prompting Group); (b) answer after response (Confirmation Group)
  Criteria (Part I; same for Part II): criterion performance test at end of instruction; criterion tests on writing and naming rules at end of instruction; retention performance test the next day; error rates during training

Experiment 1B (N = 805; instructional materials: Parts I and II; weekly sessions)
  Factors studied:
    Verbalization: (a) naming the rule (Naming Group); (b) not naming the rule (No-Naming Group)
    Prompting/Confirmation: (a) answer before response; (b) answer after response
  Criteria: same as in Experiment 1A, except that the retention test for each Part was given one week after instruction

Experiment 2 (N = 345 at end of course; instructional materials: Parts III, IV, and V; weekly sessions, with Experiments 1B and 2 together covering 10 weeks)
  Factors studied:
    Prompting/Confirmation: exploration of various degrees of prompting and confirmation
    Variety of practice: (a) similar practice (No-Variety Group); (b) varied practice (Variety Group)
  Criteria: in addition, a criterion performance test at the end of instruction for each part; a final retention test given four weeks after the close of instruction

(4) A final version of the course in programed instruction format was developed by applying the principal research findings, and a pilot test was conducted, using 13 students as subjects.

OVERALL DESCRIPTION OF THE PROGRAMED COURSE

The experimental programed course was directed toward the level of high school seniors and first-year college students. This level was chosen because the content material was originally drawn from portions of the Army training course for the ADPS Programing Specialist, at the U.S. Army Signal School, in which the background of the Army trainees corresponds to a high school education.

The aim of the experimental course was to provide understanding of fundamental computer programming concepts and, more important, to produce proficiency in writing elementary computer programs. From the very beginning of the course, its entire context was oriented toward the writing of computer programs. That is, a concept was introduced, then a problem incorporating the concept was presented, and the student programmed the problem. The practice problems and the problems used in the criterion and retention tests were chosen from actual ADP job situations.


The repertoire of primary programming commands and the symbolic language in which the computer programs were written were chosen from material which had been used by the Army to teach trainees to program the MOBIDIC computer. The language is the FIELDATA Symbolic Language which, although representing a set of symbols interpretable by the MOBIDIC, is sufficiently general in form that it applies largely to almost any other computer and to the English language (the addresses).

As nearly as possible, the goals of all subjects were made comparable by orienting them with an overview of the concepts of computer programming in an Introduction to the course. Specific attention was paid to the concepts of command, address, the accumulator, memory locations, input, and output. Also included were a brief description of how a computer functions, and some sample mnemonics as they are used in computer commands. The coverage of these topics was interspersed with periodic review and with questions which students answered to themselves. The student was instructed to continue reviewing the material preceding each of the sets of questions until he could answer the material to his own satisfaction. As the last item in the introductory material, the student worked through a short sample program.

The five parts of the experimental course proceeded from the simple to the complex. Parts I and II dealt with primary programming; the student was introduced to storage locations, the concept of a central work area, the movement of numbers from place to place, and simple arithmetic operations, along with an introduction to elements of a symbolic data language. Part III covered basic looping concepts, including general transfer commands, counting the loops, leaving the loops, and program preparation. Parts IV and V dealt with data processing, covering address modification and address arithmetic (effective addressing), sorting and counting data into various categories, and multiple address modification.

The programed instruction materials were all prepared in the form of linear programs, so that a student moved step-by-step straight through the course. It would not be correct to characterize the steps as frames, however, because the steps, rather than single sentences or two sentences at a time, constituted conceptual or functional units. Thus, there could have been as many as two or three paragraphs given to the student prior to questioning him on the content.

Details on the course materials and the variations for the experimental treatments are presented in the descriptions of the individual experiments. Content and presentation of the course are illustrated in samples in Appendix A.

The criterion tests at the end of each part of the course consisted of actual computer programming problems submitted by a programmer group from Fort Huachuca and by programming instructors from the Signal School. Where necessary, problems were simplified by removing aspects that would require techniques beyond the scope of the course.

EXPERIMENT 1A

Design

A 2x3 factorial design was used in Experiment 1A (illustrated schematically in Figure 2). Six independent groups of subjects in a PI environment learned to write increasingly complex computer programs (CPs). One-third of the subjects periodically wrote out the content of the rules used to guide the writing of computer programs (Rules Group), one-third periodically wrote down the names of the rules (Naming Group), and one-third wrote CPs without any verbalization of the rules (CP-Only Group).


Design for Experiment 1A

Training variations (columns): writes computer program and rules; writes computer program and names of rules; writes only computer program.
Prompting (answer given before student response): Computer Program + Rules / Computer Program + Names of Rules / Computer Program only.
Confirmation (answer given after student response): Computer Program + Rules / Computer Program + Names of Rules / Computer Program only.

Figure 2

Also during learning, one-half of the subjects were required to write their answers after being given explicit information needed in the response (prompting), and one-half were required to write their answers prior to receiving the information (confirmation). During learning, error rates were obtained on CP-writing for all groups and on rules and naming for the appropriate groups.

Treatments composed of various combinations of the training conditions were administered during the PI training, and their effects were evaluated on criterion and retention tests given at the end of each of the two parts of the instruction.

Subjects

The subjects were 60 volunteer high school students (juniors and seniors), who had been informed that they were to be paid between $1.50 and $2.00 per hour, depending upon the proficiency they exhibited in learning. They were assigned at random to one of the six training conditions. A combination of scores from the verbal and clerical tests of the Army Classification Battery (ACB) was used to measure intelligence level for the detailed evaluation of results. (The intelligence measure was the same as that applied by the Army for entrance into the ADPS programming course.)

Training Materials

The training materials used in this experiment were Parts I and II, covering an introduction to symbolic language for computer programming and primary programming commands of a business-type digital computer. The learning task was writing computer programs of increasing complexity, with the additional requirements imposed by the experimental treatments. The treatments were applied as each new concept was introduced along with the attendant requirement of writing a computer program.

A different programed instructional booklet was used by each of the six groups of subjects (Figure 2), in accordance with the training conditions assigned to that group. The booklets used by the Rules and Naming (or mnemonics) Groups consisted of 90 pages, including the criterion tests; booklets used by the CP-Only Groups had half that number of pages. The difference in treatment between the prompting and confirmation conditions did not affect the size of the booklets.

For all groups, the first step was the presentation of a concept. The next step varied according to the experimental treatment. The following sample rule can be used to illustrate the various treatments:

The CLA SOCSEC instruction clears any material already in the accumulator and copies the number from location SOCSEC into it.

After the concept had been presented, the underlined portion of the rule (the name of the instruction) was given to the subjects in the Rules Groups, who were to fill in the rest; the portion not underlined was given to the subjects in the Naming Groups, and they filled in the part underlined. Both groups were then given a problem illustrating the concept and they wrote a computer program on the problem. The CP-Only Group went directly from the presentation of the concept to the statement of the problem.
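To make the accumulator model behind this sample rule concrete for present-day readers, the following sketch (in Python, not in the FIELDATA language used in the course) mimics what the rule describes. Only CLA comes from the rule quoted above; the memory contents and the ADD mnemonic are hypothetical additions made solely so the example is self-contained.

    # A toy model of the machine the course describes: named memory locations
    # and a single accumulator that instructions operate on.
    memory = {"SOCSEC": 123456789, "BONUS": 500}   # hypothetical symbolic addresses
    accumulator = 0

    def cla(address):
        """CLA (from the sample rule): clear the accumulator, then copy the
        number from the named location into it."""
        global accumulator
        accumulator = memory[address]

    def add(address):
        """ADD (hypothetical companion mnemonic): add the number at the named
        location to whatever is already in the accumulator."""
        global accumulator
        accumulator += memory[address]

    cla("SOCSEC")    # accumulator now holds the contents of SOCSEC
    add("BONUS")     # accumulator now holds SOCSEC + BONUS
    print(accumulator)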

The prompting/confirmation variable was added to the above treatments by having subjects work on only one page at a time. The Prompting Groups saw the entire sentence first; the subjects in those groups turned the page and completed the blank before working on the computer programming problem. The subjects in the Confirmation Groups completed the blank, then turned the page and viewed the entire sentence after working on the problem.

The booklets included criterion tests at the end of each part of the course. As a secondary part of the criterion tests, the ability of all subjects to write the rules and the names of the rules used in writing the CPs was measured.

Procedures

On the first day students were given Part I, the section concerned with primary programming commands, and a criterion test for that material. On the second day they were first given a retention test (for Part I) consisting of four problems to be programmed, which all subjects finished within 30 minutes. Immediately following the retention test, they were given Part II of the course, on more complex computer programs, and the criterion test for that part. On the third day, they were given Retention Test II, consisting of four programming problems covering all the course material, followed by the ACB intelligence measure.

The students were allowed a maximum of four hours on each of the two days of training, but only one or two students required that much time on either day. Each student's booklets were checked at the end of his working day to ensure that he was actually working with the material. All students appeared highly motivated to perform; many stayed after hours to ask further questions about the course work. (No substantive questions were answered, in order not to contaminate the experimental treatments.)

EXPERIMENTS 1B AND 2

Design

Taken together, Experiments 1B and 2 covered all five parts into which the course was divided. The first two parts (used in Experiment 1B) were identical in content to that of Experiment 1A; they provided subjects with the background necessary for the final three parts (Experiment 2), which dealt with looping and data processing.

Each of the five parts had its own criterion test.


All variables in the two experiments were administered factorially during training. Experiment 1B involved the same conditions as Experiment 1A, except that the Rules Group was omitted. That is, in Experiment 1B comparisons were made between (a) prompting and confirmation, and (b) writing the names of rules along with computer programs vs. simply writing the computer programs.

After completing Part II of Experiment 1B, students were reassigned randomly to treatment groups for Experiment 2, which included three variables:

(1) 0, 50, or 100% confirmation.
(2) 0, 33, 67, 100%, or progressive (67, 33, 0) prompting.
(3) No-variety vs. variety conditions on practice problems.

The No-Variety Group worked on practice problems having a verbal context of warehouse inventory for electronic vacuum tubes. The Variety Group, however, had three types of verbal contexts: the one on vacuum tubes just noted, one on military pay, and one on uniform selection.

The design of Experiment 2 is an incomplete three-factor study, composed of 29 cells (see Figure 3). The practice problems were given in sets of three instead of singly. The confirmation variable was administered on a more molar level than that of the prompting variable, being applied across sets of problems; that is, either the entire set of three problems received confirmation or it did not, depending upon the confirmation applied. Both the prompting conditions and the variety vs. no-variety treatments were applied within the sets of three problems. Furthermore, the progressive prompting condition was administered as a diminishing prompt within each set of three problems; thus, the first problem received two-thirds prompting, the second one-third, and the third no prompting. A sample application of treatments is shown in Appendix A.
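The way prompting and confirmation operate at different levels of the set can be made concrete with a small sketch. The schedule below is only illustrative: the function name and data layout are hypothetical, and the realization of 50% confirmation as alternating sets is an assumption, not a detail taken from the report.

    def schedule(prompting, confirmation_pct, n_sets=4):
        """Illustrative treatment schedule for n_sets sets of three practice problems."""
        if prompting == "progressive":
            prompt_fractions = [2/3, 1/3, 0.0]           # diminishing prompt within each set
        else:
            prompt_fractions = [prompting / 100.0] * 3   # same prompt level for all three problems
        sets = []
        for i in range(n_sets):
            # Confirmation is all-or-none for a whole set; under 50% confirmation,
            # alternate sets receive it (one simple way to realize the percentage).
            if confirmation_pct == 100:
                confirmed = True
            elif confirmation_pct == 50:
                confirmed = (i % 2 == 0)
            else:
                confirmed = False
            sets.append({"prompts": prompt_fractions, "set_confirmed": confirmed})
        return sets

    for s in schedule("progressive", 50):
        print(s)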

In addition, an attempt was made to obtain data on the relationship between the stage of training at which mnemonics are introduced and the improvement in criterion performance. This was done by continuing the use of mnemonics through the course for three additional groups of students.

Design for Experiment 2

(The figure lays out the incomplete factorial design: percent of prompting (0, 33, progressive, 67, 100) crossed with percent of confirmation and with the variety conditions, plus three added verbalization (Naming/No-Naming) treatments in the Same-Variety condition. Cell detail is not legible in the source copy.)

Figure 3

The schedule of experiments did not provide true replicates between any of the parts or experiments. What is lost by not having replicates, however, is more than compensated for by the gain in generalizability of whatever findings are stable across materials differing in complexity.

The four cells which would have combined 100% or 50% confirmation with 100% prompting (either Variety or No-Variety) were considered somewhat redundant and were to be omitted from the design.

Three cells were added to the design to extend the Naming/No-Naming and confirmation/prompting comparisons to more complex materials. The cells were: 100% Confirmation plus Naming, 100% Prompting plus Naming, and 100% Prompting with the Naming content provided without any response requirement. A clerical error during preparation of the booklets resulted in the 0% Confirmation - 100% Prompting (Variety and No-Variety) cells being omitted instead of the 100% Confirmation - 100% Prompting cells. Thus, Naming comparisons in Experiment 2 were quite limited.


Groups of 29 subjects, or multiples thereof, worked in a single experimental room, with each student of a set of 29 in a different experimental treatment from those of his classmates. This design effectively eliminated experimenter differences and socioeconomic variability between school systems, and provided generally equivalent levels across cells. As a check on ability levels, the CL portion of the Army Classification Battery was administered to the students. (No ACB test data were available for roughly one-fifth of the subjects.)

Subjects

The original plan was to use Army subjects at the U.S. Army Signal School for Experiments 1B and 2. Because of difficulties in obtaining an adequate sample of Army subjects there, however, it was necessary to use a civilian population for the experimentation. Subjects were male volunteers recruited, without compensation, from the junior and senior grades of high schools in the metropolitan Washington, D.C., area. The schools participating represented all local area school systems, thus insuring a sizable range in socioeconomic status of the pupils attending. No systematic attempt was made, however, to take stratified samples either within or across schools on the basis of socioeconomic status. The school systems, and the number of participating schools per system, were: Alexandria, 1; Arlington, 1; District of Columbia, 3; Fairfax County, 2; Falls Church, 1; Montgomery County, 4; and Prince Georges County, 2.

The number of volunteer subjects beginning the course was 805; the dropout rate was high (57%), with the number of subjects who completed the entire course being 345. Approximately 75% of the attrition occurred prior to criterion testing for Part III, 21% prior to testing for Part IV, and 4% during Part V. The analyses of the experimental effects, reported in the next chapter, were made on the basis of the subjects who completed the entire course.
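As a quick check on what these percentages imply in absolute numbers, the short calculation below simply restates the figures given above; the rounding to whole students is the only thing added.

    # Attrition figures restated from the text above.
    started, finished = 805, 345
    dropouts = started - finished                      # 460 students did not finish
    by_phase = {"before Part III": 0.75, "before Part IV": 0.21, "during Part V": 0.04}
    for phase, share in by_phase.items():
        print(phase, round(share * dropouts))          # roughly 345, 97, and 18 dropouts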

Training Materials

Parts I and II were contained in one booklet, the content of which was identical to that used in Experiment 1A. Parts III, IV, and V were also contained in a single booklet (186 pages). Part III introduced the technique of looping, Part IV dealt with address modification, and Part V had to do with address arithmetic (a more advanced technique of addressing).

The performance criterion tests were administered after the students completed each part of the program. A similar test was administered four weeks after the course was completed, as a measure of retention. Upon completion of each criterion test, the students were given a Level-of-Aspiration (LOA) Scale, divided into intervals of 5% (see Appendix B). They were asked to make two judgments: (a) how well they thought they had performed on the criterion test they had just taken, and (b) how well they thought they would perform on the next criterion test. These scales were used in computing discrepancy scores, obtained by subtracting the student's actual performance on the next test from his LOA or expected score. Such scores are traditionally used to obtain an indication of the degree of realism with which a student progresses through a course; for example, a small positive discrepancy score (expectation slightly above performance) exemplifies the realistic, positively adjusted student.
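A minimal sketch of the discrepancy score just described follows (expected score minus actual score on the next test); the function name and the numbers are hypothetical.

    def discrepancy_score(expected_next, actual_next):
        """LOA discrepancy: expectation minus subsequent actual performance.

        A small positive value (expectation slightly above performance) is the
        pattern the report describes as realistic and positively adjusted.
        """
        return expected_next - actual_next

    print(discrepancy_score(expected_next=85, actual_next=80))   # +5: slight overestimate
    print(discrepancy_score(expected_next=90, actual_next=60))   # +30: large overestimate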

Procedures

Students were trained in a classroom at their own or a nearby high school after school hours (afternoon or evening sessions) or on Saturday. They attended one three-hour session a week. It should be noted that so long a time between sessions was not intended when the experimental course was constructed, because the delay allows more time for forgetting. Daily sessions would be the most desirable schedule for the self-instructional course.

Halfway through the three-hour session the students were given a 10- to 15-minute break. Seats were spaced as far apart as room size would allow. The ratio of proctors to students was approximately 1:15. The proctors insured that the students worked independently and tried to prevent review of materials.

Proctors were instructed to assist the students only with procedural or "structural" questions. (A "structural" question would be one about a suspected misprint or, in rare cases, a request to define an unfamiliar Army term such as RP.) Proctors did not answer substantive questions, those pertaining to understanding the content being taught. In response to those questions, the proctors simply told the students to do the best they could.

Before the course began, the students were given a brief outline along with a cautionary statement regarding the need for independent work and regular attendance. Students were informed that they would be given a final test, covering the complete experimental course, approximately four weeks after they completed the course, and that they would receive certifications of completion after the final test.


Chapter 3

RESULTS

Taken together, the three experiments generated a staggering amount of data. To provide a framework for viewing the results, without presenting an overwhelming amount of detail, it may be well to review the basic questions that were being asked. These questions took the form, "What is the effect of X on Y?" For example, "What is the effect of naming rules during training on ability to solve criterion problems?" The "Xs" and "Ys" are listed below, with an indication in parentheses of the experiment in which they were used:

Effects of . . .
    Level of prompting (all)
    Level of confirmation (all)
    Verbalization in training:
        Writing rules (1A)
        Naming rules (1A and 1B)
    Variety of practice (2)
    Intelligence (all)
    Combinations of the above

As Measured by . . .
    Errors during training (all)
    Errors on criterion tests:
        Problem solving (all)
        Writing rules (1A)
        Naming rules (1A and 1B)
    Errors on retention tests (all)
    Time to complete (1A)
    Level of aspiration (1B and 2)
    Combinations of the above

Results are presented separately for each experiment. After the evidence of learning has been presented, findings relative to the various treatments (the first list above) and the effects of each treatment on the various measures (the second list above) are discussed.

EXPERIMENT 1A

Treatments studied were prompting/confirmation and verbalization. Instruction was divided into two parts (I and II). After instruction, proficiency measures were taken in the form of criterion tests (I and II) and retention tests (I and II). During instruction, measures were taken in each phase in the form of errors during training and time to complete.

Evidence for Learning. On the criterion test after Part I, nearly two-thirds of the students scored 80% or more correct, and more than a third scored 90% or higher. Results on Criterion Test II were comparable. Mean criterion and retention test scores are given in Table 1.

Time to Complete. For both Parts I and II, there were no significant differences between the prompting and confirmation treatments as to the length of time subjects took to complete the course. The three verbalization groups did differ significantly, however, on both parts. In Part I, the Rules Group was slower than the other two verbalization groups. In Part II, the Naming Group was faster than the other two groups.

Unless otherwise indicated, the statistical test used was the analysis of variance.


Table 1
Mean Criterion and Retention Test Scores by Treatments: Experiment 1A (N = 60)

(Means, percent correct, and standard deviations are tabled for the Rules+CP, Naming+CP, and CP-Only groups under the Prompting and Confirmation conditions on Criterion Tests I and II and Retention Tests I and II; most cell entries are not legible in the source copy. Footnotes to the table state that the groups differed significantly on Criterion Tests I and II by Bartlett's test, that raw scores of the groups did not differ significantly on Retention Tests I and II, and that the maximum possible scores were 52 on Criterion Test I and 89 on Criterion Test II.)

The mean learning times for the verbalization conditions are shown in Table 2. As can be noted in Tables 1 and 2, achievement was positively related to speed of learning. To provide an indication of the efficiency of each treatment, a time-weighted score for each individual was computed by dividing the student's criterion (or retention) test score by his time to complete the course. These scores, which reflected the amount learned per unit of time, were used in the subsequent analyses of data for the various treatments.

Table 2
Mean Learning Times for Verbalization Variable: Experiment 1A

(Mean learning times in minutes are tabled for the Rules+CP, Naming+CP, and CP-Only groups on Parts I and II; most cell entries are not legible in the source copy. Footnotes report the significance of the group differences for each part.)
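A minimal sketch of this efficiency index follows; the footnotes to Tables 3 and 4 give the scaling as the score divided by time and multiplied by 1,000, and the numbers below are hypothetical.

    def time_weighted_score(test_score, minutes_to_complete):
        """Efficiency score: amount learned per unit of training time,
        scaled by 1,000 as in the table footnotes."""
        return test_score / minutes_to_complete * 1000

    # Hypothetical student: 44 points on a criterion test after 120 minutes of training.
    print(time_weighted_score(44, 120))   # about 367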

Prompting and Confirmation Effects. During learning, as measured by error rates in computer programming, the Prompting Group scored significantly higher than the Confirmation Group. However, the groups did not differ significantly on the criterion tests (including the tests on writing and naming rules) or the retention tests. If anything, the Confirmation Group was superior, as shown in Table 3.

Verbalization Effects. Students required to write out the rules of computer programming during training did not do as well on the criterion tests as students in the other verbalization groups. The Rules Group also had the highest spread of scores on the criterion tests, as shown in Table 1. (Later supplementary analyses indicated that this effect was particularly evident in students of lower intelligence.)


Table 3
Mean Time-Weighted Criterion and Retention Test Scores for Prompting/Confirmation Variable: Experiment 1A

(Time-weighted scores, computed as the achievement score divided by time and multiplied by 1,000, are tabled for CP-writing, naming the rules, and writing the rules under the Prompting and Confirmation conditions on Criterion Tests I and II and Retention Tests I and II; most cell entries are not legible in the source copy.)

The mean score of the Rules Group on tests of computer programming (Criterion and Retention Tests I and II) was consistently lower than the mean of the Naming Group or the CP-Only Group (Table 4). Two-group comparisons indicated that this difference was statistically significant in all four cases.

Table 4
Mean Time-Weighted Criterion and Retention Test Scores for Verbalization Variable: Experiment 1A

(Time-weighted scores, computed as the achievement score divided by time and multiplied by 1,000, are tabled for CP-writing, naming the rules, and writing the rules in the Rules+CP, Naming+CP, and CP-Only groups on Criterion Tests I and II and Retention Tests I and II; most cell entries are not legible in the source copy. A footnote notes that all three-group comparisons except Writing the Rules on Criterion Test I were significant at p < .05, and that the Naming the Rules comparison on Criterion Test II was significant at p < .01.)

On the criterion and retention tests following Part I, the CP-Only Group made the highest scores, while on the tests after Part II the Naming Group seemed to be superior. However, this tendency was not statistically significant.

The inferior scores made by the subjects in the Rules Group in solving criterion problems did not seem to be due to poorer learning per se, since they scored higher on those criterion items specifically related to their training, namely, the writing out of rules. In Part I this difference in performance was not significant, but in Part II it was (p < .05). On the other hand, the Naming Group answered the naming questions on the Part II criterion test significantly better than the other two groups (p < .05), and also performed well on tests of computer programming. Thus, it appeared that both groups learned the additional materials presented during training, but that the subjects in the Rules Group were unable to apply their learning to problem solving.

Intelligence. Intelligence, as measured by scores on parts of the Army Classification Battery, was significantly and positively related to criterion scores in all cases. No interactions appeared between intelligence and criterion scores related to the experimental variables.

Criterion measures of problem solving were highly reliable by Hoyt's test (20); coefficients ranged from .79 to .89 (p < .01).

EXPERIMENT 1B

The second experiment was basically a repeat of the first. Its primary functions were to confirm the earlier findings and provide the introductory instruction necessary for the more complex materials presented in Experiment 2.

Variables studied in Experiment 1B were the same as those for 1A except that the Rules Group was deleted from the verbalization variable. It was intended that the same measures would also be taken, with the addition of a Level-of-Aspiration measure. However, difficulties attending the administration of a large study over several months, using volunteers from many different school systems, resulted in the loss of the time scores and a portion (20%) of the intelligence scores. The high dropout rate (57% for Experiments 1B and 2) also affected measurements.

Evidence for Learning. Mean performance on the Part I criterion dealing with problem solving averaged about 87% for students who completed all instruction (including Experiment 2) and 81% for all who took the test. On the Part II problem-solving criterion, the average was about 83% for finishers and 72% for all students. Thus, the program taught about as effectively as it had in Experiment 1A.

Since the mean performance scores suggested that the finishers were more homogeneous, the analyses of all experimental findings reported below are confined to them, in order to provide uniformity in the data base throughout the study. Mean criterion and retention test scores are shown in Tables 5 and 6.

Prompting and Confirmation Effects. The Prompting Group scored significantly higher than the Confirmation Group (p < .001) during training on both Part I (96.5% vs. 88.5%) and Part II (91% vs. 87%).

Table 5
Mean Criterion and Retention Test Scores of Naming/No-Naming Groups: Experiment 1B

(Means, significance levels, percent correct, and standard deviations are tabled for the Naming and No-Naming groups on CP-writing, naming the rules, and writing the rules for Criterion Tests I and II, and on Retention Tests I and II; most cell entries are not legible in the source copy. Footnotes give the maximum possible score for each measure.)


Table 6
Mean Criterion and Retention Test Scores of Prompting/Confirmation Groups: Experiment 1B

(Means, significance levels, percent correct, and standard deviations are tabled for the Prompting and Confirmation groups on CP-writing for Criterion Tests I and II and Retention Tests I and II; most cell entries are not legible in the source copy. Footnotes give the maximum possible score for each measure.)

The groups did not differ significantly on the two criterion tests; however, when scores were adjusted for learning error the Confirmation Group was superior (p < .005 for Criterion Test I; p < .10 for Test II). This adjustment was made by means of an analysis of covariance, which provided an estimate of the scores that would have been observed if performance during learning had been the same across conditions. The adjusted means for the Confirmation and Prompting Groups on both sets of criterion tests are shown in Table 7.
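For readers unfamiliar with this kind of adjustment, the sketch below shows the textbook one-covariate form: each group mean is shifted by the pooled within-group slope times the group's deviation from the grand covariate mean. The data and variable names are hypothetical, and the report's own computation may have differed in detail.

    import numpy as np

    # Hypothetical data: criterion scores (y) and learning errors (x, the covariate)
    # for two groups, standing in for the Prompting and Confirmation groups.
    groups = {
        "prompting":    {"x": np.array([3., 2., 4., 3.]),    "y": np.array([80., 82., 78., 81.])},
        "confirmation": {"x": np.array([10., 12., 9., 11.]), "y": np.array([84., 83., 86., 85.])},
    }

    # Pooled within-group regression slope of criterion score on learning error.
    sxy = sum(((g["x"] - g["x"].mean()) * (g["y"] - g["y"].mean())).sum() for g in groups.values())
    sxx = sum(((g["x"] - g["x"].mean()) ** 2).sum() for g in groups.values())
    b_within = sxy / sxx

    grand_x = np.concatenate([g["x"] for g in groups.values()]).mean()

    # Adjusted mean: the group mean it would have shown had its covariate mean
    # equaled the grand covariate mean.
    for name, g in groups.items():
        adjusted = g["y"].mean() - b_within * (g["x"].mean() - grand_x)
        print(name, round(float(adjusted), 1))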

Table 7
Adjusted Mean Scores of Naming/No-Naming and Prompting/Confirmation Groups on Criterion Tests: Experiment 1B

(Means adjusted for learning errors by means of an analysis of covariance are tabled for the Prompting vs. Confirmation and Naming vs. No-Naming comparisons on the Part I and Part II criterion tests, with the associated significance levels; most cell entries are not legible in the source copy.)

Verbalization Effects. Since the Rules Group was dropped from Experiments 1B and 2, the remaining groups were called the Naming (used mnemonics) and No-Naming Groups, depending upon whether they named rules and worked on practice problems, or simply worked problems during training.

The two groups did not differ on the Part I criterion items dealing with problem solving. However, on criterion items requiring students to name and write out rules, the Naming Group scored significantly higher. (The Naming Group had been given practice in naming rules but not in writing them out.) An analysis that took learning errors into account (analysis of covariance) showed that the superiority could be attributed in part to better performance during training (see Table 6). Retention tests reflected the same findings.

Level of aspiration and intelligence data are presented in the final section of this chapter.


EXPERIMENT 2

Experiments 1B and 2 were conducted in succession, Experiment 2 dealing with the new and more difficult materials of Parts III, IV, and V. The number of treatment combinations was quite high (29). Since attrition was also quite high (about half the students dropped out before the end of Part III), many of the groups in Experiment 2 were rather sparsely populated. Although this situation dampened the hopes of detecting statistically significant effects, those results which did prove reliable can be considered all the more powerful.

Evidence for Learning. Overall mean performance on the criterion test for Part III was about 70% correct; the median was about 78%. Thus, there was evidence of learning at a point three to five weeks after training had begun under Experiment 1B. Overall proficiency dropped to a median of 64% on Part IV and 35% on Part V. This drop was not unexpected, since the purpose of the experiment was to obtain comparative data on teaching combinations; many of the less effective combinations were quite ineffectual at the more complex levels. One condition, zero confirmation, however, was effective and yielded medians of 81%, 73%, and 48% on Parts III, IV, and V.

A three-way factorial analysis was performed on Part III criterion data by randomly eliminating subjects to equate cell sizes. No interactions were revealed. Consequently, all subsequent comparisons considered the variables as independent and utilized all the data. The first set of analyses below treats Parts III, IV, and V separately. A second series was performed treating the three parts as replications.
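Equating cell sizes by random elimination, as described above, amounts to trimming every cell down to the size of the smallest one. The sketch below shows one way to do it; the cell labels and counts are hypothetical.

    import random

    # Hypothetical roster: (subject_id, treatment_cell).
    roster = [(i, random.choice(["A1", "A2", "B1", "B2"])) for i in range(100)]

    by_cell = {}
    for sid, cell in roster:
        by_cell.setdefault(cell, []).append(sid)

    smallest = min(len(ids) for ids in by_cell.values())
    # Randomly drop subjects so every cell has the same n as the smallest cell.
    balanced = {cell: random.sample(ids, smallest) for cell, ids in by_cell.items()}

    for cell, ids in sorted(balanced.items()):
        print(cell, len(ids))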

Prompting and Confirmation Effects. Prompting and confirmation are similar in that they both present information pertaining to the correct answer. The amount of information presented will be called the degree of stimulus support, whether it comes from prompting or from confirmation. It had been hoped that errors during training could be manipulated by varying the percentage of stimulus support. An analysis of data from Part III indicated that this was accomplished. As shown in Figure 4, errors ranged from 10% for zero stimulus support to 2% for complete support (total prompting). Performance during training was directly related to degree of stimulus support; the greater the support, the fewer the errors. However, test performance was inversely related to stimulus support (p < .01) when scores were adjusted for learning errors. Test scores for the adjusted data are shown in Figure 5.

Figure 4
Relations Between Learning Error and Criterion Test Error Under Prompting and Confirmation Conditions: Part III
(Percent of learning error and of test error is plotted against percent of prompting (0, 33, progressive, 67, 100) and percent of confirmation (0, 50, 100); the curves are not legible in the source copy.)

Figure 5
Effects of Prompting and Confirmation on Criterion Test Performance: Part III
(Percent correct on the criterion test, adjusted for learning errors, is plotted against the prompting and confirmation conditions; the curves are not legible in the source copy.)

The data showed marked variability in Part IV. The overall median dropped to 64% correct; the mean fell to 55%. In Part V the test scores were even lower, 38% correct as the mean and 35% as the median. Therefore, modified nonparametric runs tests were employed using deciles instead of individual scores. Results supported the findings in Part III. Where confirmation and prompting reflected comparable stimulus support (all of the Zero Prompting Groups vs. all of the Zero Confirmation Groups, and all of the 100% Prompting Groups vs. all of the 100% Confirmation Groups), the confirmation treatment proved more effective for criterion performance. A decile analysis of the criterion data relevant to these comparisons from Parts III and IV yielded significant differences (p < .01). In Part V the Zero Confirmation-Prompting comparison gave the same result, but the 100% groups did not differ.

On the final retention test, given four weeks after completion of all instruction, the direction of the effects was the same: the lower the stimulus support during training, the better the scores. However, significant differences were obtained only among the confirmation levels (p < .001). A decile comparison among the prompting conditions revealed that both the Zero and 33% Prompting Groups scored significantly higher than the 100% Prompting Group (both with p < .01). The other intermediate prompting groups did not differ from the Zero Prompting Group, but did score significantly higher than the 100% group. In short, the long-term retention test further indicated that the greater the error during learning, the better the performance on criterion problem solving. The retention test means and medians are presented in Table 8. These data are not adjusted for learning errors.

Table 8
Group Means and Medians on Final Retention Test: Experiment 2
(Mean and median percent correct are tabled for the Variety and No-Variety groups and for each confirmation and prompting level; most cell entries are not legible in the source copy.)

Variety-of-Practice Effects. There was little difference in learning between the Variety and No-Variety Groups during training. However, students who had practiced under the Variety condition scored significantly higher on the criterion test for Part III (p < .01; Figure 6). Analyses of criterion test data from Parts IV and V did not reveal any significant differences. However, the Variety students scored higher on the Part IV test on a deciles comparison (p < .01). On the Part V criterion test and on the final retention test the effect disappeared completely, although even in retention the direction of the difference still held (Variety mean, 44%; No-Variety, 38%).

Figure 6
Relation Between Learning Error and Criterion Test Error Under Variety-of-Practice Conditions: Part III
(Percent of learning error and of criterion error is plotted for the Variety and No-Variety conditions; the curves are not legible in the source copy.)

Since the Variety students practiced with three different verbal contexts during training (vacuum tubes, salaries, and inventories of uniforms) and there were four different contexts used in the five criterion problems, it could be argued that their superiority derived from a greater similarity between training and testing than was the case with the No-Variety Group, which practiced on vacuum tube problems only.

This possibility was tested by dichotomizing the criterion problems into those that were stated in a novel context for both groups (4 problems) and those using a setting new only to the No-Variety Group (2 problems). The Variety Group was superior on both types of problems. Thus, it can be inferred that variety of context resulted in better learning of the general, mediating skills necessary to write computer programs, and not simply better learning of the overt associations practiced during training.
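The logic of this check can be sketched as follows; the per-problem scores below are hypothetical and serve only to show the comparison being made.

    # Hypothetical percent-correct scores per criterion problem, keyed by whether
    # the problem's verbal context was novel to both groups or novel only to the
    # No-Variety Group.
    problems = [
        {"context": "novel_to_both",       "variety": 72, "no_variety": 60},
        {"context": "novel_to_both",       "variety": 68, "no_variety": 55},
        {"context": "novel_to_both",       "variety": 75, "no_variety": 63},
        {"context": "novel_to_both",       "variety": 70, "no_variety": 58},
        {"context": "novel_to_no_variety", "variety": 74, "no_variety": 61},
        {"context": "novel_to_no_variety", "variety": 71, "no_variety": 59},
    ]

    for context in ("novel_to_both", "novel_to_no_variety"):
        subset = [p for p in problems if p["context"] == context]
        variety_mean    = sum(p["variety"] for p in subset) / len(subset)
        no_variety_mean = sum(p["no_variety"] for p in subset) / len(subset)
        print(context, round(variety_mean, 1), round(no_variety_mean, 1))

    # If the Variety Group leads in both subsets, its advantage cannot be explained
    # solely by greater similarity between its practice contexts and the test contexts.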

Verbalization Effects. Due to a clerical error in assembling groupings of course materials, and to attenuation of some groups because of high dropout, no meaningful data were generated from the treatments added to get supplementary information on the effects of verbalization.

RELATIONSHIPS BETWEEN EXPERIMENTS 1B AND 2

Since Experiments 1B and 2 were conducted in succession, using the same students and teaching a continuity of subject matter, it became important to determine the relationships between the two studies. Results presented below deal with the question of how various learning conditions in Experiment 1B combined with learning conditions in Experiment 2 to affect the criterion test performance in Parts III, IV, and V. The two variables of Experiment 1B and the three of Experiment 2 yielded six combinations of interest:

    Experiment 1B                        Experiment 2
    Prompting/confirmation       with    Level of confirmation
    Naming/no-naming             with    Level of confirmation
    Prompting/confirmation       with    Level of prompting
    Naming/no-naming             with    Level of prompting
    Prompting/confirmation       with    Variety/no-variety
    Naming/no-naming             with    Variety/no-variety

Since there were no significant interactions among the variables of Experiment 2, they were treated as independent for purposes of analysis. Data from Experiments 1B and 2 were cast into three-way analysis of variance tables to note possible interactions. Parts III, IV, and V were treated as replications. Graphs of the main effects of confirmation levels and prompting levels are shown in Figures 7 and 8, respectively (see also Tables 9 and 11, Analyses of Variance).

Figure 7
Effects of Degrees of Confirmation on Criterion Test Performance: Experiment 2
(Percent correct on the Part III, IV, and V criterion tests is plotted for the 0%, 50%, and 100% confirmation conditions; the curves are not legible in the source copy.)

Figure 8
Effects of Degrees of Prompting on Criterion Test Performance: Experiment 2
(Percent correct on the Part III, IV, and V criterion tests is plotted for the 0%, 33%, progressive, 67%, and 100% prompting conditions; the curves are not legible in the source copy.)

Naming/No-Naming (Exp. 1B) With Confirmation Levels (Exp. 2). The analysis of the naming and no-naming conditions of Experiment 1B combined with the three confirmation levels of Experiment 2 is summarized in Table 9. Significant interactions were indicated. The means for the various treatment combinations are shown in Table 10, grouped according to a Duncan Range analysis, which enables a comparison of individual treatment means to be made.


Table 9
Analysis of Variance of the Effects of Confirmation (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
(Degrees of freedom, mean squares, F ratios, and p values are tabled for the Naming/No-Naming (A) and Confirmation (B) main effects, Replications (C), the interactions, and the within-cells term; most entries are not legible in the source copy.)


Table 10
Duncan Range Groupings of the Effects of Confirmation (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V (N = 296)
(Mean percent correct for each confirmation-by-naming combination is grouped by a Duncan Range analysis for Parts III, IV, and V; means clustered together do not differ significantly from one another. Most cell entries are not legible in the source copy.)

Table 10 reaffirms the stimulus support effect noted previously, that the lower the stimulus support during training (here, the lower the confirmation), the better the criterion scores. This effect appeared to be heightened when combined with prior use of mnemonics (naming the rules); practice in naming rules in Experiment 1B plus low confirmation in Experiment 2 seemed to promote the best criterion performance. The effect is clearest in Part III, where learning began to deteriorate with increased difficulty in the instructional materials. Here the No-Naming/Zero Confirmation Group came in first, the Naming/Zero Confirmation Group second, and so on, with the No-Naming/100% Confirmation combination representing the worst learning conditions.

Prompting/Confirmation (Exp. 1B) With Confirmation Levels (Exp. 2). The analysis of the prompting/confirmation conditions of Experiment 1B in combination with the various confirmation levels of Experiment 2 revealed no significant interactions. Examination of the individual groups did, however, suggest a type of interaction effect related to similarity of stimulus support. It appeared that the more similar the stimulus support conditions, the better the performance (see Figure 9). Specifically, zero confirmation (the Prompting Group of Experiment 1B) followed by zero confirmation (in Experiment 2) resulted in better scores on criterion tests for Parts III, IV, and V than total confirmation followed by zero confirmation. This relationship also held for the zero-to-total vs. total-to-total confirmation sequences except for Part IV, where there was a reversal.

Naming/No-Naming (Exp. 1B) With Prompting Levels (Exp. 2). Analyses of the data (Table 11) reaffirmed the importance of low stimulus support (here, low prompting levels) and early practice with mnemonics (naming the rules) to criterion performance on Parts III, IV, and V. Mean scores resulting from the various sequences of treatments, Experiment 1B to Experiment 2, are shown in Table 12, grouped according to the results of a Duncan Range analysis. In general, naming followed by low prompting promoted the best performance; no-naming followed by high prompting was the worst combination.


Table 11
Analysis of Variance of the Effects of Prompting (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
(Degrees of freedom, mean squares, F ratios, and p values are tabled for the Naming/No-Naming (A) and Prompting (B) main effects, Replications (C), the interactions, and the within-cells term; most entries are not legible in the source copy.)

Table 12
Duncan Range Groupings of the Effects of Prompting (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V (N = 296)
(Mean percent correct for each prompting-by-naming combination is grouped by a Duncan Range analysis for Parts III, IV, and V; most cell entries are not legible in the source copy.)

Prompting/Confirmation (Exp. 1B) With Prompting Levels (Exp. 2). The interaction of early prompting/confirmation treatments (Experiment 1B) with later prompting levels (Experiment 2) was highly pronounced (Table 13). The dominant factor appeared to be similarity of stimulus support, as it was in the previous discussion of prompting/confirmation followed by confirmation. Thus, as seen in Figure 9, students given total prompting in both studies performed better on Parts III, IV, and V than students given zero prompting followed by total prompting. Similarly, zero prompting in both experiments promoted better performance on all three criterion tests than total prompting followed by zero prompting.


Table 13
Analysis of Variance of the Effects of Prompting (Experiment 2) by Prompting/Confirmation (Experiment 1B) on Criterion Tests for Parts III, IV, and V
(Degrees of freedom, mean squares, F ratios, and p values are tabled for the Prompting/Confirmation (A) and Prompting (B) main effects, Replications (C), the interactions, and the within-cells term; most entries are not legible in the source copy.)

Figure 9
Effects of Similarity of Stimulus Support From Experiment 1B (0 or 100%) to Experiment 2 (0 or 100%) on Criterion Test Performance: Experiment 2
(Two panels plot percent correct on the Part III, IV, and V criterion tests for the 0-0, 100-100, 100-0, and 0-100 sequences of prompting and of confirmation; the curves are not legible in the source copy.)

Naming/No-Naming (Exp. 1B) With Variety Level (Exp. 2). Analyses showed that criterion performance in Parts III, IV, and V was clearly affected by the various combinations of early naming/no-naming with later variety/no-variety (Table 14). Although the data were not entirely consistent, it appeared that performance improved as the combination approached the mix of early naming with later variety. Further, the importance of naming seemed to increase as materials became more complex (and, concomitantly, as performance dropped off), while the importance of variety seemed to decrease. Means of the various combinations are shown in Table 15, arranged according to the results of a Duncan Range analysis.

Table 14
Analysis of Variance of the Effects of Variety/No-Variety (Experiment 2) by Naming/No-Naming (Experiment 1B) on Criterion Tests for Parts III, IV, and V
(Degrees of freedom, mean squares, F ratios, and p values are tabled for the Naming/No-Naming (A) and Variety/No-Variety (B) main effects, Replications (C), the interactions, and the within-cells term; most entries are not legible in the source copy.)

Prompting/Confirmation (Exp. 1B) With Variety Level (Exp. 2). No significant interactions were found.

Table 15
Duncan Range Groupings of the Effects of Variety/No-Variety (Experiment 2) - Naming/No-Naming (Experiment 1B) Combinations on Criterion Tests for Parts III, IV, and V (N = 296)
(Mean percent correct for each variety-by-naming combination is grouped by a Duncan Range analysis for Parts III, IV, and V; most cell entries are not legible in the source copy.)

FACTORS RELATED TO DROPOUT

As noted earlier, about 57% of the students did not complete instruction. A series of analyses were conducted to determine the relationships between dropout and stimulus support, level of aspiration (LOA), intelligence, and performance.

Stimulus Support. The more stimulus support in training (that is, the greater the degree of prompting or confirmation), the greater was the likelihood a student would finish the course (chi-square trend test of proportions, p < .05). Figure 10, based on Part III data, shows that the prompting and confirmation factors operated alike in this regard.

Level of Aspiration. At the end of Parts I and II, students were asked to estimate how well they had performed on the previous part and how well they expected to perform on the instruction to follow. The correlations between performance estimates on completed instruction and actual performance were quite high (.65). However, students who later dropped out expected to perform much higher on subsequent criterion tests than they did, relative to students who finished (p < .01). That is, the discrepancy between level of aspiration and performance was much higher for dropouts than for finishers. In addition, the expectations of the dropouts were significantly lower than the expectations of the finishers after Part I, and the performance of the dropouts was significantly lower.

In short, relative to the finishers, the dropouts (a) expected to perform at a lower level, (b) greatly overestimated their capability, and (c) did perform significantly worse.

Figure 10
Relation Between Stimulus Support and Proportion of Dropouts: Part III
(The proportion of dropouts is plotted against degree of prompting (0, 33, progressive, 67, 100%) and of confirmation; the curves are not legible in the source copy.)

Intelligence. Dropouts were characterized by significantly lower intelligence (verbal ACB scores) than finishers.

CORRELATIONS BETWEEN EXPERIMENTS 1B AND 2

Experiment 1B. The overall correlation between intelligence and criterion performance was high (Part I, r = .69; Part II, r = .70). The relationships were even higher for finishers (Part I, r = .73; Part II, r = .73).

The overall correlations between learning errors and scores on the related criterion tests were not so marked, but increased from Part I to Part II (.46 to .68). For those subjects on whom intelligence measures were not obtained, the relationship between learning and testing was even higher (.48, N = 141; .82, N = 140). Since learning errors fell within a narrow range (2%-8%), these correlations may be spuriously low.

On dividing students by intelligence into low, medium, and high, it was found that the learning-testing relationship was significant for the average and bright students who dropped out, but not for the average and bright students who finished. For low aptitude students the correlations between learning errors and criterion scores were about the same, positive but low (Table 16).

The correlation between the criterion tests for Parts I and II was high, about .7. Thus, performance on the first criterion test was a good predictor of success on the second.

Experiment 2. The partial correlations, for finishers only, between LOA (expectations), intelligence, and criterion performance are given in Table 17. The relationship between LOA and subsequent criterion performance, with intelligence scores partialled out, increased (except between Parts IV and V) as students progressed through the course. On the other hand, overall relationships between intelligence and criterion scores decreased through the instruction. The relationship between LOA (expectations) and intelligence decreased even more markedly, shrinking to almost zero (+.05) on the last criterion test.
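For reference, a first-order partial correlation of the kind reported here (the LOA-performance relation with intelligence partialled out) can be computed with the standard formula shown below; the arrays and values are hypothetical.

    import numpy as np

    def partial_corr(x, y, z):
        """First-order partial correlation between x and y with z partialled out."""
        rxy = np.corrcoef(x, y)[0, 1]
        rxz = np.corrcoef(x, z)[0, 1]
        ryz = np.corrcoef(y, z)[0, 1]
        return (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))

    # Hypothetical scores for a handful of students.
    loa       = np.array([70., 80., 60., 90., 75., 85.])    # expected success (LOA)
    criterion = np.array([65., 82., 55., 88., 70., 80.])    # later criterion performance
    intel     = np.array([100., 110., 95., 120., 105., 115.])

    print(round(partial_corr(loa, criterion, intel), 2))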


Table 16

Correlations by Intelligence Level Between Learning and Criterion Scores of Finishers and Nonfinishers:

Experiment 1B

Intelligence                 Finishers             Nonfinishers
Level           Part         r        p            r        p

Low             I            .31      .01          .10      .01
                II           .12      .01          .31      .01

Medium          I            .00      NS           .27      .05
                II           .11      NS           .33      .01

High            I            .11      NS           -19      .01
                II           .1       NS           .-1:1    .01

Table 17

Correlations Between Expected Success, Intelligence, and Criterion Performance of Finishers, as a Function of Course Complexity: Experiment 2

Factors Correlated                    Part II       Part III      Part IV       Part V

Expected Success and                  .37           .50           .59           .15
  Criterion Performanceᵃ              (N = 280)     (N = 204)     (N = 200)     (N = 150)

Intelligence and                      .74           .53           .50           .48
  Criterion Performance               (N = 295)     (N = 295)     (N = 291)     (N = 290)

Intelligence and                      .16           .19           .09           .05
  Expected Success                    (N = 280)     (N = 204)     (N = 200)     (N = 154)

Successive Pairs of                   .61           .63           .69           .84
  Criterion Tests                     (N = 337)     (N = 302)     (N = 301)     (N = 96)

ᵃFor these four correlations, intelligence was partialled out.

In short, as the finishers progressed through the course, the relation between the instruction-based factors, LOA and performance, increased; however, the relationships between these factors and intelligence, a measure taken before instruction began, decreased.


Chapter 4

DISCUSSION

The organization of the ensuing discussion will be from specific to general, and from experimental hypotheses tested in Experiments 1A and 1B to those tested in Experiment 2. Following material pertinent to the immediate findings, the implications of this type of study for future research and development in PI will be discussed. Finally, the potential utility of the secondary product of the METHOD II research, the programed course in fundamentals of computer programming, will be indicated.

COMPARISON OF EXPERIMENTS 1A AND 1B

Evidence for Learning

The means for percent correct on the criterion tests on both Parts I and II were almost identical across studies: 87% on Part I and approximately 83% on Part II. The retention test scores for Experiment 1A are somewhat higher than for 1B; however, the test for Experiment 1A was given on the day following criterion testing, whereas in Experiment 1B a week separated criterion and retention testing.

The intelligence indicator was significantly and positively correlated with criterion scores in both studies, and no interactions appeared between intelligence and treatment conditions on the principal measure, writing computer programs.

Treatment Effects

Learning Time

Unfortunately, administrative problems prevented obtaining adequate time measures in Experiment 1B. In Experiment 1A, length of learning time was negatively related to score on the criterion. Doubtless, this was due to the time spent by the Rules Group in writing out complex programming rules, a requirement which appeared to dull their terminal performance. The learning time/criterion score relationship may not generalize to other PI environments.

Criterion Scores

(1) Prompting/Confirmation Results. The tendency in Experiment 1A for the prompting/confirmation effect during learning to be reversed on testing was found even more strongly in the covariance analysis under Experiment 1B. Learning errors in the two studies were highly comparable: 3% and 3.5% for prompting, 10% and 11.5% for confirmation.

(2) Verbalization Results. With respect to the requirement of writing the names of the rules during training (mnemonics) as opposed to just writing computer programs, the suggestion in Experiment 1A of superiority of the Naming Group with the more complex materials in Part II was heightened in


Experiment 1B, which involved five times as many students. Moreover, the superiority of the Naming Group (mnemonics) in the larger study was in the percent correct per se on the criterion in Part II and did not involve time-weighting the scores. That is, use of mnemonics resulted in an absolute and not just relative superiority, which carried over a two-week period, adding to the practical value of this particular finding.

Summary

Summing up the effects of the treatment conditions across both experiments, the use of mnemonics in learning the concepts in the computer programming course aided students in tests of computer programming. On the other hand, practice in writing the entire rule in the context given by the booklet hindered ability to use the rules effectively.

The confirmation and prompting comparisons are not so clear. Taken together, the two studies reveal that while confirmation led to more errors during learning, it did not lead to differences on criterion testing. If, however, one takes into account the covariate of errors during learning, then confirmation was superior to prompting (indicating that the Confirmation Group benefited from the errors made during training). Nevertheless, one cannot say that in the context of these two experiments error rate during learning was an important factor in criterion test performance.

EXPERIMENT 2

Stimulus Support

It seems clear within the context of the present research that a low degree of stimulus support (in the form of problem information), with an attendant relatively high degree of error during training, is desirable for achieving a high degree of proficiency on a criterion test requiring the synthesis of the materials presented during training.¹ Variations in degree of prompting or confirmation were inversely related to criterion performance.

At the outset it was thought likely that a certain amount of error during training would result in the best criterion performance. A legitimate question is: Why did this optimization relationship not turn up?

Although the reasoning is post hoc, a good starting point is to note that the key to good instruction is transmission of information. Since the error rate in the worst experimental variation of the course was still low, approximately 13%, it is proposed that all groups were either at or beyond the hypothesized optimal point relating information transmission, difficulty level, and test performance. To explore this assumption in future research, it would be worthwhile to degrade the information transmitted by decreasing the chunks of conceptual content presented at each step of training and relate these variations to end-of-course performance (see Seidel, 21, on the problem of units of meaning).

These findings support the proposition that complex human learning such as problem solving is of a different nature than the rote learning of isolated paired associates or serialized items.

¹Although the temptation may be great to treat these findings as instances of effects of knowledge of results (KR), it would be inappropriate to do so. The study does not concern KR vs. no-KR as traditionally understood, since concepts were nested in each succeeding one. True, some KR was achieved indirectly, and the subject had some self-confirmation as well. However, prompting and confirmation acted alike, and clearly a prompt, which appeared before the student attempted to answer, could not be looked upon as KR. Thus, the term "knowledge of results" seems inappropriate when generalized to include foreknowledge. More applicable here is the term "stimulus support," which treats degrees of prompting or confirmation alike.


The latter involves training to an internal achievement criterion, as opposed to the problem-solving criterion which is external and of such a nature as to require synthesis of the concepts and the development of a strategy in a transfer test. Perhaps a good point of departure for the research reported here is Kendler's recent statement (12, p. 231):

    If researchers in the field of programed learning believe that the major response being learned is the one that the student writes in the frame, they are badly mistaken. What is being learned are concepts; what are modified are large segments of the students' verbal repertoire.

It should be noted that, when prompting or confirmation were given in the present set of experiments, they were applied not to simple stimulus-response associations (as in rote learning) but to strategies or portions thereof.

In order to be appropriately applied in the current context, Kendler's statement should be amended to include that not only are concepts being learned but strategies enabling the student to use these concepts are being learned as well. Thus, he reaches a higher level of learning called problem solving. This is in agreement with Gagne's recent distinction (13) between concept learning and problem solving.²

Variety

The overall approach of the present research can be meaningfully compared to a similar study by Gagne and Bassler (23), who used a programed instructional environment to study teaching and retention of elementary non-metric geometry with sixth graders. Despite certain procedural differences, conceptually the approaches are sufficiently similar so that a detailed comparison of results should indicate some possible generalizations as well as being quite instructive for future research.

To begin with, Gagne and Bassler found a moderate correlation (of the order of +.62) between achievement scores immediately following learning and those of retention. In the present study, a comparable relationship (of the order of +.64) was found. In the study by Gagne and Bassler, they were able to compare the subordinate learning sets immediately following learning and in retention and found a correlation of +.46. Correlations in the present study for the early or subordinate criterion tasks were on the order of 1.00. Whereas Gagne and Bassler found that the correlations between the retention and the original achievement scores for the subordinate learning sets were lower than for the higher order learning sets, the data in the present study indicated no such difference.

These findings are reflected in another way by looking at retention ratios (retention score divided by original score) for the various problems. They are all just about 1.0 or fairly close to it. Differences which did appear were confined to the effects of different degrees of prompting in Parts III, IV, and V; the mean ratios for the latter are given in Table 18.

²In a larger framework, the current study suggests that Kendler's admonition be extended to a theoretical level: If learning or training theorists believe that the basic unit for complex learning is the classical S-R association, they may also be badly mistaken. Scandura (22) arrived at the same skepticism after completing a number of studies involving the teaching of various mathematical algorithms. He proposed as an alternative to S-R associationistic theory a "set function language" (SFL) which seems to describe more precisely the conceptual and problem-solving levels of behavior. Previously, Shaw and Seidel, in a paper on informational context as a determinant of what can be learned (in preparation), showed that even with as simple an organism as the rat, associationistic principles are insufficient to account for higher order conceptual learning when such is permitted to occur. Certainly, a new alternative to associationistic theory such as Scandura's SFL is an interesting one and worthy of further development.


Table 18

Retention Ratiosᵃ for Prompting Treatments: Experiment 2

                          Degree of Prompting
Problem       0%        33%       Progressive    67%       100%

1             1.00      .99       .96            .98       1.00
2             1.06      1.02      .96            .98       1.09
3             1.01      .98       1.02           1.0       1.00
4             1.08      .80       .99            1.02      .61
5             .93       4:8       .80            .98       .69
Total         1.01      .92       .97            1.02      .93

ᵃRetention Score / Original Score

The differences occurring on Problem 4 in that table and on the overall retention problems among the various prompting conditions are significant. These effects generally reflect the original effects of the stimulus support variables, showing the least stimulus support (zero prompting) yielding the maximum percent correct and the 100% stimulus support showing the least percent correct (F = 2.96, df = 4/174, p < .05 for Problem 4; F = 2.73, df = 4/174, p < .05 for the overall retention comparison).
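The Table 18 entries are simple retention-score/original-score ratios averaged within each prompting condition. A minimal sketch of that computation, with made-up records rather than the study's scores:

    from collections import defaultdict

    # Each record: (prompting condition, original criterion score, retention score) -- hypothetical.
    records = [
        ("0P", 80, 82), ("0P", 70, 68), ("0P", 65, 66),
        ("100P", 60, 40), ("100P", 55, 35), ("100P", 50, 32),
    ]

    totals = defaultdict(lambda: [0.0, 0])
    for condition, original, retention in records:
        totals[condition][0] += retention / original      # one subject's retention ratio
        totals[condition][1] += 1

    for condition, (ratio_sum, count) in totals.items():
        print(condition, round(ratio_sum / count, 2))     # mean retention ratio per condition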

Gagne and Bassler found that ". . . the present evidence suggests that their [the subordinate learning sets] importance diminishes once the final task has been achieved" (23, p. 127). However, in the present study the retention results indicated this was not the case, that the subordinate concepts were retained and had the same effect as they did on the initial achievement.

Contrary to the findings of Gagne and Bassler, while the results of the present study showed the effect of Variety on immediate achievement, the effects diminished to nonsignificance on the retention test four weeks later. Their data yielded just the reverse effect, but it should be noted that in their study achievement was measured after 11 successive days of training, whereas in the current study final achievement was measured after approximately 10 weeks of study, one session per week. Retention in their investigation was measured nine weeks after the immediate achievement test, whereas in the present instance retention measures were taken four weeks following the last achievement test. Thus, it is possible that differences in the results may be due in part to the differences in the time frames for instruction and retention testing.

Mnemonics Training

Another aspect to the results of the current research relates to the value of written verbalization of rules which an individual is required to use in developing his strategies for problem solution. The data of both experiments taken together clearly indicate that "what is in a name" is important. If an individual is given mnemonics practice early in training and is required to learn these mnemonics as well as to use them in forming his strategy (writing computer programs), he is able to formulate the rules for later use with greater facility than if he is required to write the rules in the verbal context given by the instructor.

This makes good sense if one thinks about how we go about encoding our environment from day to day. If an individual is asked to recall an event, it is much easier for him to provide a sentence or two in his own words than to report


in great detail or verbatim (if the event is a poem or story). In like manner, when requiring a student to learn the conceptual rules for the use of computer commands, if he is provided with a mnemonic key with which to tag the rule as he formulates it himself, he will be able to use his own verbal repertoire to good advantage. On the other hand, if the student is required to use the instructor's words, which may not match his own, in formulating the rule, he may indeed end up just memorizing the words without developing any understanding in the use of these rules. At least, that seems to be the inference to be made from our present study.

It is this last feature which seems to take exception to one of Gagne's positions (13, pp. 307-308) in which he emphasized the value of recallability of "subordinate capabilities." The present study clearly supports a distinction between recallability and availability of these subordinate abilities, the latter being the more important of the two. The Rules Group in Experiment 1A recalled the rules much better than the other groups, but they did so at the expense of being able to use these lower-order skills. On the other hand, the Naming Group was not only able to recall the rules well, but was able to have them available for use in working the computer programs.

This distinction between recallability and availability receives further support from Torrey's recent work (24) on the teaching of Russian grammar. She found that students given contextual drill during learning were better in dealing with new material on a generalization test than were the controls. The latter students learned all the training material by a vocabulary method. It is important to note that the control subjects were periodically tested to show their ability to generate correct sentences according to the Russian grammar rules. Thus, during training they were able to generate the rule, but on the test they were not able to apply it.

The value of mnemonics as used in the present study (Naming Group) warrants further discussion. As already noted, naming practice was helpful in solving computer problems related to the course material directly involved (Part II criterion). Of perhaps greater significance were the more lasting advantages which appeared in later and more complex portions of the course. When overall proficiency dropped to a low level (Part V and final retention test), a significant interaction appeared between early naming practice (Parts I and II) and the later treatments (Parts III, IV, and V). This took the form of good early training in Experiment 1B (Naming) compensating for poorer training conditions in Experiment 2 (see Tables 10, 12, and 15).

For example, the No-Variety students who had early training with mnemonics were able to maintain as high a level of criterion proficiency in Parts IV and V as the Variety Group. In other words, the rate of loss of proficiency was not as great for the No-Variety Naming Group as it was for the Variety No-Naming Group. The compensating effect was also superimposed on the stimulus support treatment effects as seen in Tables 10 and 12.

A reasonable question which follows from these results is: What is the relationship between the stage of training in which mnemonics are introduced and enhancement of criterion performance? An attempt was made in Experiment 2 to obtain some relevant data by continuing the use of mnemonics through the more complex portions of the course for three additional groups of subjects. Unfortunately, because of a clerical error in the preparation of the booklets, and group attenuation because of drop-out, meaningful data were not generated. The question of timing and relative value of mnemonics, of course, remains an open and important one for future research.


Transfer Effects of General vs. Task-Specific Factors

Further discussion of the transfer effects of general vs. task-specific factors is worthwhile because the subject has special relevance to the design of instructional programs. As seen in Table 17, the initial intelligence measure is of decreasing value as a predictor of succeeding criterion performance. Conversely, the successive criterion test data and level-of-aspiration measures increased in predictive value. These findings suggest (as does Bunderson, 25) that a general intelligence (or reasoning) factor may be of greatest importance only for early stages of instruction. Thus, in constructing instructional models for optimizing a student's path through a course, the implication is that weighting for general and task-specific factors should be differentially assigned and continually checked (to decrease the former and increase the latter) at appropriate choice-points during instruction.

PRACTICAL POTENTIAL OF THE PROGRAMED COURSE

Relationship of Content to the Parent Army Course

In order to evaluate most appropriately the practical potential of the programed course, it would be desirable to be able to state the number and percentage of hours covered relative to that required by the total ADPS programming course presented at the Signal School. However, because the material selected was chosen to represent the basic fundamentals of computer programming, and because the authors' frame of reference for organizing did not necessarily coincide with that used by the instructors in the ongoing course, it would be extremely difficult to try to represent the finished programed instructional product in this manner. As an alternative, it may be helpful to examine the relationship between the conceptual material presented in the programed course and in the total ADPS course at the Signal School.

Concepts covered in the programed course deal with internal data handling of the central processing unit (CPU). Also included is material on desk-checking and flow-charting. No input or output instructional materials were included. The number of commands included (11) represents roughly 27% of those in the parent course dealing with internal data manipulations. Viewed in terms of the proportion of such conceptual categories represented, the programed course includes approximately 50% of those given in the ADPS course. Finally, if the basic logic of internal data handling is considered (i.e., within the CPU), the programed course includes virtually the entire content. Thus encompassed are the simple movement of data from place to place in storage (although not without moving to an intermediate central location), arithmetic operations, comparing and sorting (general use of the loop concept), and some sophisticated programming techniques (e.g., use of index registers). All except the last category were involved in the experiments conducted with the programed course.

Administration, Motivation, and Population Factors

Certainly the large percentage of dropouts among the volunteer subjects in Experiments 1B and 2 (57%) prevents a firm conclusion about the practical potential of the programed course. That motivational factors as well as course difficulty might have been the cause for the number of dropouts was supported by the level-of-aspiration data (the differences in realistic expectation between those who dropped out and those who remained in the course) and the importance of the stimulus support (prompting or confirmation) factors as determinants of the tendency to drop out.


A further complication to determining the course's practical potential was the fact that the administrative conditions under which the course was given were not those which obtain in the real world, whether it be in an Army school or a civilian school setting. Specifically, the students were volunteers and received no incentive in the larger study (Experiments 1B and 2) except a certificate at the end of the 14-week course indicating proficiency in the HumRRO computer programming fundamentals course. In Experiment 1A, on the other hand, the students met for three successive afternoons for a maximum of four hours per afternoon and were paid for their participation. Interestingly enough, these students performed better than did the other high school students in Experiments 1B and 2 in the comparable portion of the course.

Finally, the students in the large-scale study may not have been as intelligent as the target population of Army students (the mean ACB CL score of the high school students was 96, whereas the Army requirement for acceptance in the ADPS Course is 110).

After the large-scale experimentation, it was decided to investigate further the possible contribution of motivation, administrative procedures, and course difficulty to outcome by administering the course to a sample of enlisted personnel at Fort Gordon, Georgia (N = 25). Such men presumably would be more representative of the target population for whom the course was originally constructed. The men chosen for the experiment had just finished a teletype operator's course (MOS 723)³ and were waiting to be shipped to their duty assignments. They were given six hours of instruction per day for five successive days. Their incentives were ball-point pens and cigarettes, allocated according to degree of proficiency. As in the large study, roughly half the students failed to complete the course, although again the general attitude on the part of the students toward the course materials was favorable. They felt that they would like this kind of course presented with the aid of an instructor. As in the larger study, the ACB CL scores of the dropouts were virtually identical to the scores of those who finished the course.

Instructional Effectiveness

When attempting to evaluate the instructional effectiveness of the course, one must keep clearly in view the fact that the primary purpose of the overall study was to investigate a wide range of training techniques. Thus, it would be expected that only a few of the combinations would yield adequate proficiency, particularly as the training materials and criterion requirements became more complex. Thus, although the Duncan Range analysis (Table 10) appears to show very poor performance on the most complex portion of the course (Part V), the combination of lowest degree of stimulus support (especially 0% confirmation) still yielded a criterion mean close to 50% correct. The medians of the zero confirmation conditions (N = 79) showed 81% correct on Part III, 73% on Part IV, and 48% on Part V. A further breakdown into variety or no-variety and degree of prompting, as well as interactions with the treatments of Experiment 1B, is not valuable since the number of subjects within each of these smaller groups would be much too small to yield meaningful data. Nevertheless, one can make a reasonable inference that at least some of these combinations would lower the median score for the entire 0% Confirmation Group averaged across all of these conditions.

³Under the current MOS structure, the teletype operator's title is Communications Center Specialist, and his MOS is 720.


The tentative inference, then, is that the instruction was effective, but because of the overriding experimental goals of the study, coupled with the other factors noted herein, no exact statement is possible. Although no hard data are available on attitudes, it has been the impression of the course administrators and proctors that general reactions toward the programed instructional nature of the materials were favorable.

While it seemed apparent that revision was required to simplify the course material, it was not completely clear that motivational factors could be ruled out as contributory to the resulting performance of the experimental students. One final point with regard to the contribution of administrative procedures to performance should be made. Because experimental control over student reading of the materials was of primary interest, all students were required to read one page at a time and not to turn back to previous sections whenever they were in difficulty. Obviously, in a practical setting, the opportunity to review material already covered would be desirable. If in the experimental setting it had been possible simply to measure the number of times a student returned to a previous page or, for that matter, turned ahead to other pages, it would have been perfectly acceptable to allow it. Because of the need for experimental control, however, and the impossibility of measuring the review the student might do, it could not be permitted.

FINAL REVISION OF THE COURSE

As a result of information obtained from the experimental situations, the materials were simplified and certain format features were added. The final version also incorporated implications from the experimental findings, and the course was then completed by including a section on the use of index registers as the most complex portion of the instruction.

Description

In the revised course, the entire context remains oriented toward the writing of computer programs. That is, a concept is introduced and then a problem incorporating this concept is presented for the student to program. These problems were taken from actual computer programming problems encountered on the job. It is this problem-oriented approach in the course that makes it rather different from most of the other programed booklets available.

Before each concept is presented, the student is given a preview of things to expect. Then, as he is introduced to the concept, he is given short completion questions with the answers in light type on pages opposite the questions (see Appendix C for sample pages). The problems to be programmed appear in a box-like format and the student is expected to answer the problem, whenever possible, without looking up the answer first. At the end of each section, the student is given a review and then led on to the next set of concepts.

As was noted previously, because a secondary objective of the research was to produce a programed course available for use by the Army, the repertoire of commands and the symbolic language in which the computer programs are written have been chosen from material previously used by the Army to teach students how to program the MOBIDIC computer. The language, the FIELDATA Symbolic Language, not only represents a set of symbols interpretable by the MOBIDIC, but is sufficiently general that it largely applies to almost any other computer and to the English language (the addresses).

The Introduction to the course provides an overview of the concepts of computer programming and covers concepts of command and sample mnemonics,


address, the accumulator, memory locations, input and output, and computer functioning. There are periodic review materials and questions for the student and, finally, a short sample program.

The course proper is separated into four levels:

(1) Basic Operations (five sections) includes an introduction to storage locations, the concept of a central work area, the movement of numbers from place to place, and simple arithmetic operations.

(2) Basic Looping concepts are covered next (five sections), including general transfer commands, counting the loops, leaving the loops, and program preparation.

(3) Data Processing (four sections) deals with address modification and address arithmetic (effective addressing), sorting and counting data into various categories, and multiple address modification.

(4) Advanced Techniques for transferring data are given (five sections), including indexing.

In addition to writing many computer programs following the completion of each level, the student practices on problems within each section of the course.

Evaluation

The final revision of the course was administered successfully to seven college freshmen and six high school seniors. Of the college freshmen, four had scores of 90% or better on the final criterion test, two were in the high 80s, and one scored 77%. Median completion time was 26 hours, with daily sessions self-paced and lasting from 3 to 5 hours. The high school students scored slightly lower, 78% to 88%, with the median course duration being 31 hours. It should be noted that the two highest scores (88%) in the high school sample were obtained by students in the bottom third of their senior class standing. Again, the general attitude of the students toward the course was favorable and they felt they had learned something that would be useful to them.

Apparently the revised version of the course was highly successful. It can be anticipated that the instruction would be successful in a military setting by virtue of the combination of experimentally effective training methods, simplified content, and administration procedures normally employed in military courses. However, no large-scale validation of the revised course to establish proficiency and failure rates in a military setting was undertaken, since this was not the principal purpose of the research.⁴

Suggested Utilization

The ultimate question is how the revised programed course could best be utilized. The nature of the course content is such that it would be quite useful for any officer or enlisted man who needs to have some familiarity with computer programming, whether for administrative or supervisory duties or some allied operational activity.

The adequacy of the self-contained aspects of the programed instructional course has not yet been evaluated on a large scale. However, since the structure of the course is geared to actually writing computer programs, that is, to learning by doing, it should serve both as a useful introduction to a full-scale ADPS programming course and as supplementary instruction for students who may be having difficulties with those portions of the course covered in the programed instructional version.

⁴One potentially useful finding related to predictor validity was the fact that the Verbal score on the ACB correlated much higher with criterion performance than did the Clerical score (r = .25). The Army now uses a combined CL score for course acceptance.


LITERATURE CITED

AND

APPENDICES


LITERATURE CITED

1. Schramm, Wilbur. The Research on Programed Instruction: An Annotated Bibliography, Bulletin 1964, No. 35 (OE-34034), Office of Education, U.S. Department of Health, Education, and Welfare, Washington, 1964.

2. Cook, John Oliver, and Spitzer, Morton Edward. "Prompting Versus Confirmation in Paired-Associate Learning," J. Exp. Psych., vol. 59, no. 4, April 1960.

3. Angell, David, and Lumsdaine, Arthur A. The Effects of Prompting Trials and Partial-Correction Procedures on Learning by Anticipation, Research Report AIR-C14-9/61-S115 (AFOSR-1343), American Institute for Research, San Mateo, Calif., September 1961.

4. Gagne, Robert M., and Brown, Larry T. "Some Factors in the Programing of Conceptual Learning," J. Exp. Psych., vol. 62, no. 4, October 1961.

5. Lewin, K., Dembo, T., Festinger, L., and Sears, P. "Level of Aspiration," in Personality and the Behavior Disorders, a Handbook Based on Experimental and Clinical Research, Joseph McVicker Hunt (ed.), Ronald Press, New York, 1944.

6. Battig, William V. "Evidence for Coding Processes in 'Rote' Paired-Associate Learning," J. Verb. Learn. Verb. Behav., vol. 5, no. 2, April 1966.

7. Basic Systems, Inc. IrATS: Development and Field Testing: Report of the Development and Installation Test Data From a Programed Self-Instructional Training Course, Technical Report, Basic Systems, Inc., New York, 1963.

8. Seidel, Robert J., and Rotberg, Iris C. "Effects of Written Verbalization and Timing of Information on Problem Solving in Programed Learning," J. Ed. Psych., vol. 57, no. 3, June 1966; issued as HumRRO Professional Paper 6-66, November 1966.

9. Skinner, B.F. "Teaching Machines," Science, vol. 128, October 1958.

10. Harlow, Harry F. "Learning Set and Error Factor Theory," in Psychology: A Study of a Science, vol. 2, Sigmund Koch (ed.), McGraw-Hill Book Company, Inc., 1959.

11. Harlow, Harry F. "The Formation of Learning Sets," Psych. Review, vol. 56, no. 1, January 1949.

12. Kendler, H.H. "The Concept of the Concept," in Categories of Human Learning, Arthur Weever Melton (ed.), Academic Press, New York, 1964.

13. Gagne, Robert M. "Problem Solving," in Categories of Human Learning, Arthur Weever Melton (ed.), Academic Press, New York, 1964.

14. Smith, Timothy A., Jones, Lyle V., and Thomas, Stuart. "Learning of Stimulus Similarity, Number of Stimuli per Response, and Concept Formation," J. Verb. Learn. Verb. Behav., vol. 1, no. 6, May 1963.

15. Gagne, Robert M., and Smith, Ernest C., Jr. "A Study of the Effects of Verbalization on Problem Solving," J. Exp. Psych., vol. 63, no. 1, January 1962.

16. Cronbach, Lee J. "The Logic of Experiments on Discovery," paper for Conference of Committee on Learning and the Educational Process, Social Science Research Council, New York, January 1965.

17. Hickey, Albert E., and Newton, John M. The Logical Basis of Teaching: I. The Effect of Subconcept on Learning, Entelek, Inc. (prepared under ONR Contract Nonr-4215(00)), January 1964.

18. Traub, Ross E. The Importance of Problem Heterogeneity to Programmed Learning, Research Bulletin 64-26, Educational Testing Service, Princeton, N.J., 1964.

19. Scheffe, Henry. The Analysis of Variance, John Wiley & Sons, Inc., New York, 1959.

20. Johnson, Palmer Oliver. Statistical Methods in Research, Prentice-Hall, Inc., Englewood Cliffs, N.J., 1949.

21. Seidel, Robert J. "Programmed Learning: Prologue to Instruction," Psych. Reports, vol. 20, no. 1, February 1967.

22. Scandura, Joseph M. "The Basic Unit in Meaningful Learning-Association or Principle," The School Review, in preparation.

23. Gagne, Robert M., and Bassler, Otto C. "Study of Retention of Some Topics of Elementary Nonmetric Geometry," J. Ed. Psych., vol. 54, no. 3, June 1963.

24. Torrey, J. "The Learning of Grammar: An Experimental Study of Two Methods," paper for the Eastern Psychological Association conventions, Atlantic City, 1965, and New York, 1966.

25. Bunderson, C. Victor. Transfer of Mental Abilities at Different Stages of Practice in the Solution of Concept Problems, Educational Testing Service Research Bulletin RB-67-20 (ONR Technical Report, Contract Nonr-1858(15)), May 1967.


Appendix A

SAMPLES OF COURSE CONTENT AND PROBLEMS FROM EXPERIMENT 2

In these pages of sample course materials from Experiment 2, the material below and on the next two pages illustrates presentation of course content; the next three pages present three problems typical of the practice and test problems given the students; and the final three pages show the answer sheets for the three problems. These materials are from the Variety/Confirmation condition.

27 [in course]

Data Processing: MULTIPLE CATEGORIZATION

Either TRZ or TRN can be used to sort addresses into two categories. But what if you want to sort into more than two piles? For example, suppose that each address in a series has a 1, a 2, or a 3 in it, and you want to know how many numbers there are of each type? The process is called multiple categorization. Here is a sample set of instructions for the sorting portion of data processing. Notice that it is the same type of sorting used in the subtract-sort-and-count problems, but that it includes just one additional instruction -- the use of both TRZ and TRN.

        CLA NUMBER
        SUB TWO
(1)     TRN TYPE1      If negative, the number was originally a 1
(2)     TRZ TYPE2      If zero after subtraction, the number was a 2
(3)     CLA TRIO       If the number in the Accumulator is neither a -1
        ADD ONE        nor a 0, it must have been a 3 before subtraction,
        STR TRIO       and can be counted immediately following sorting.

In other words, clear and add the number (copy it into the Accumulator) and subtract a constant which will make it possible to branch on TRZ or TRN (2 in this example).

1. If the number was originally a 1, subtracting 2 results in a -1 in the Accumulator, which is a negative number. The TRN (TRansfer if Negative) command then transfers the program down to Symbolic Location TYPE1, where the information coded by 1's is processed.

2. If the number was originally a 2, subtracting 2 results in a 0 in the Accumulator, a zero. The TRZ command then transfers the program to TYPE2.

3. If the number was originally a 3, the remainder in the Accumulator will be a +1, and instead of branching, the computer will perform the instructions next in order.
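For comparison only (the course itself works in the FIELDATA symbolic language), the same three-way sort can be sketched in a few lines of Python: subtract the constant, then branch on a negative, zero, or positive remainder.

    def categorize(numbers):
        """Count how many 1s, 2s, and 3s occur, mirroring the SUB/TRN/TRZ sort above."""
        type1 = type2 = type3 = 0
        for n in numbers:
            remainder = n - 2        # SUB TWO
            if remainder < 0:        # TRN TYPE1: a 1 leaves -1
                type1 += 1
            elif remainder == 0:     # TRZ TYPE2: a 2 leaves 0
                type2 += 1
            else:                    # fall through: a 3 leaves +1
                type3 += 1
        return type1, type2, type3

    print(categorize([1, 3, 2, 2, 1, 3, 3]))   # prints (2, 2, 3)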


(28) [in course]

A supply depot maintains a two-word record on each vacuum tube in stock, stored in sequence relative to TUBE. The words are cost and location. The first word is coded with a 2 if the tube cost less than $5, with a 3 if it cost between $5 and $10, or with a 4 if it cost more than $10. Write a program to determine the number of tubes in each category. Store the answers in LOCOST, AVCOST, and HICOST. The total number of tubes in stock is in TOTUBE, which can be saved for the loop counter by storing it temporarily in COUNT.

Notice that data processing requires (1) subtraction of a constant, (2) sorting with both TRZ and TRN, and (3) TRU instructions to the test for completion after both types of addresses are counted at the end.

Preliminary Description

        Prepare COUNT
        Zero LOCOST, AVCOST, and HICOST

        Subtract and sort:
REPEAT  CLA TUBE
        SUB THREE
        TRN LOW
        TRZ MEDIUM

        Count +1's:
        add 1 to HICOST and store

LAST    test COUNT for completion
        modify TUBE (add 2 to REPEAT)
        TRU REPEAT

        Count 0's:
MEDIUM  add 1 to AVCOST and store
        TRU LAST

        Count -1's:
LOW     add 1 to LOCOST and store
        TRU LAST

STOP    HLT


[Flowchart for the preliminary description: counter preparation (0 items? yes/no) -> clean out garbage -> CLA address / SUB constant -> sort: is remainder negative? -> sort: is remainder zero? -> count positive numbers -> completion test: all items counted? -> address modification and TRU loop; count zeroes -> TRU completion.]


        CLA TOTUBE
        TRZ STOP
        STR COUNT

        CLA ZRO
        STR LOCOST
        STR AVCOST
        STR HICOST

REPEAT  CLA TUBE
        SUB THREE
        TRN LOW
        TRZ MEDIUM

        CLA HICOST
        ADD ONE
        STR HICOST

LAST    CLA COUNT
        SUB ONE
        STR COUNT
        TRZ STOP

        CLA REPEAT
        ADD TWO
        STR REPEAT

        TRU REPEAT

LOW     CLA LOCOST
        ADD ONE
        STR LOCOST

        TRU LAST

MEDIUM  CLA AVCOST
        ADD ONE
        STR AVCOST

        TRU LAST

STOP    HLT

28 [in course]

First we prepare the loop counter to be used later in the test for completion by checking TOTUBE to guarantee at least one tube and storing it in COUNT.

Since no other number needs to be saved for computation, we next "clean out garbage" from all locations to be used for storing answers by replacing any numbers that might be there with zeroes.

Since the word indicating the tube's location contains a 2, 3, or 4, we subtract 3 to make the numbers -1, 0, and +1. After the subtraction, a -1 remainder indicates a LOCOST tube, which the TRN command will detect and transfer the program down to Symbolic Location LOW, where these tubes are counted. Similarly, a 0 indicates an AVCOST tube, which the TRZ command will pick up.

If the number was neither a -1 nor a 0, the TRN and TRZ commands will not transfer the program. The remainder must have been a +1, indicating a HICOST tube.

TOTUBE contains the total number of tubes to be processed, but since we wanted to save that number, it was copied into COUNT, which we now use in the test for completion.

Since each tube has a two-word record, we must add 2 in address modification to get to the address containing information on cost for the next tube.

Having reached the bottom of the loop, we transfer back up to begin processing of the next tube.

The TRN command detected a -1, which means that the number was originally a 2 (before subtracting 3), and a 2 was the code number for a low cost tube.

Since the program branched out of normal order, we must transfer back up to finish out the loop.

A 3 in the word indicated an average cost tube. Subtracting 3 made it a 0, which the TRZ command detected. Having picked up an AVCOST tube, we count it here.

All tubes must be accounted for in the test for completion, whether they are counted right after the subtract-and-sort or down here at the end.

It would be a good idea to go over these two pages a few more times.


29 [in course]

Problem 1

The Base Electronics Warehouse wants a program that will count the number of tubes used in January of last year and the number used in February. The results are to be stored in JAN and FEB.

The words for each tube are in memory starting at location VACUUM. The words are: part number, date received, shipment number, manufacturer, date tube was used, and place tube was used. TUBES contains the number of vacuum tubes in the warehouse. The month in which the tube was used is coded by number: 1 for January, 2 for February, 3 for March, and so on.

Locations CON1 and CON2 contain constants of 1 and 2, respectively. No other constants are available. Use HOLD as temporary storage for TUBES.

Use both the TRZ and TRN commands in writing this program. Write your preliminary description at the left first, paying special attention to the data processing portions.

Write the complete program below:

COMPUT

LAST

MONTH1

MONTH2

STOP

Turn to Page 32-1 for the correct answer.


29A [in course]

Problem 2

Post Personnel wants a program that will count the number of men who have been in Alaska and the number who have been in Asia. The results are to be stored in ALASKA and ASIA.

The words for each man are in memory starting at location TROOPS. The words are: serial number, rank, dependent status, assigned unit, last overseas area, and MOS. POST contains the number of personnel on Post. The overseas area is coded by number: 1 for Alaska, 2 for Asia, 3 for Europe, and so on.

Locations K1 and K2 contain constants of 1 and 2, respectively. No other constants are available. Use SAVE as temporary storage for POST. Use both the TRZ and TRN commands in writing this program.

Write your preliminary description at the left first, paying special attention to the data processing portions.

Write the complete program below:

MODIFY

DONE

NORTH

SOUTH

STOP

Turn to Page 30-2 for the correct answer.


29B [in course]

Problem 3

A business firm wants a program that will count the number of men who scored between 91 and 100 on a screening test and the number who scored between 81 and 90. The results are to be stored in EXCEL and GOOD.

The words for each man are in memory starting at location TEST. The words are: title, salary, dependent status, branch, test score, and health. MEN contains the number of men in the firm. Test scores are coded by number: 1 for 91-100, 2 for 81-90, 3 for 71-80, and so on.

Locations ONE and TWO contain constants of 1 and 2, respectively. No other constants are available. Use KEEP as temporary storage for MEN. Write your preliminary description at the left first, paying special attention to the data processing portions.

Write the complete program below:

CHANGE

OVER

FIRST

SECOND

STOP


Turn to Page 33-1 for the correct answer.


30-2 [in course]

Answer to Problem 2, Page 29A

Post Personnel wants a program that will count the number of men who have been in Alaska and the number who have been in Asia. The results are to be stored in ALASKA and ASIA.

The words for each man are in memory starting at location TROOPS. The words are: serial number, rank, dependent status, assigned unit, last overseas area, and MOS. POST contains the number of personnel on Post. The overseas area is coded by number: 1 for Alaska, 2 for Asia, 3 for Europe, and so on.

Locations K1 and K2 contain constants of 1 and 2, respectively. No other constants are available. Use SAVE as temporary storage for POST. Use both the TRZ and TRN commands in writing this program. Write your preliminary description at the left first, paying special attention to the data processing portions.

This is the program:

        CLA POST
        TRZ STOP
        STR SAVE
        CLA ZRO
        STR ALASKA
        STR ASIA
MODIFY  CLA TROOPS+4
        SUB K2
        TRN NORTH
        TRZ SOUTH
DONE    CLA SAVE
        SUB K1
        STR SAVE
        TRZ STOP
        CLA MODIFY
        ADD K2
        ADD K2
        ADD K2
        STR MODIFY
        TRU MODIFY
NORTH   CLA ALASKA
        ADD K1
        STR ALASKA
        TRU DONE
SOUTH   CLA ASIA
        ADD K1
        STR ASIA
        TRU DONE
STOP    HLT

Now continue your study with Page 29B.


32-1 [in course]

Answer to Problem 1, Page 29

The Base Electronics Warehouse wants a program that will count the number of tubes used in January of last year, and the number used in February. The results are to be stored in JAN and FEB.

The words for each tube are in memory starting at location VACUUM. The words are: part number, date received, shipment number, manufacturer, date tube was used, and place tube was used. TUBES contains the number of vacuum tubes in the warehouse. The month in which the tube was used is coded by number: 1 for January, 2 for February, 3 for March, and so on.

Locations CON1 and CON2 contain constants of 1 and 2, respectively. No other constants are available. Use HOLD as temporary storage for TUBES. Use both the TRZ and TRN commands in writing this program.

Write your preliminary description at the left first, paying special attention to the data processing portions.

This is the program:

        CLA TUBES
        TRZ STOP
        STR HOLD
        CLA ZRO
        STR JAN
        STR FEB
COMPUT  CLA VACUUM+4
        SUB CON2
        TRN MONTH1
        TRZ MONTH2
LAST    CLA HOLD
        SUB CON1
        STR HOLD
        TRZ STOP
        CLA COMPUT
        ADD CON2
        ADD CON2
        ADD CON2
        STR COMPUT
        TRU COMPUT
MONTH1  CLA JAN
        ADD CON1
        STR JAN
        TRU LAST
MONTH2  CLA FEB
        ADD CON1
        STR FEB
        TRU LAST
STOP    HLT

Now continue your study with Page 29A.


33-1 [in course]

Answer to Problem 3, Page 29B

A business firm wants a program that will count the number of men who scored between 91 and 100 on a screening test, and the number who scored between 81 and 90. The results are to be stored in EXCEL and GOOD.

The words for each man are in memory starting at location TEST. The words are: title, salary, dependent status, branch, test score, and health. MEN contains the number of men in the firm. Test scores are coded by number: 1 for 91-100, 2 for 81-90, 3 for 71-80, and so on.

Locations ONE and TWO contain constants of 1 and 2, respectively. No other constants are available. Use KEEP as temporary storage for MEN. Write your preliminary description at the left first, paying special attention to the data processing portions.

This is the program:

        CLA MEN
        TRZ STOP
        STR KEEP
        CLA ZRO
        STR EXCEL
        STR GOOD
CHANGE  CLA TEST+4
        SUB TWO
        TRN FIRST
        TRZ SECOND
OVER    CLA KEEP
        SUB ONE
        STR KEEP
        TRZ STOP
        CLA CHANGE
        ADD TWO
        ADD TWO
        ADD TWO
        STR CHANGE
        TRU CHANGE
FIRST   CLA EXCEL
        ADD ONE
        STR EXCEL
        TRU OVER
SECOND  CLA GOOD
        ADD ONE
        STR GOOD
        TRU OVER
STOP    HLT

Now continue your study after Page 29B.


Appendix B

LEVEL-OF-ASPIRATION SCALE

Name

Before you continue your instruction, tell us what percentage you think you scored on the test you just completed. Make an X on the line for your estimate. The numbers are only guides. For example, if you think you scored 99%, your X would be close to the end of the line.

[Rating line marked from 0% to 100%, with guide labels at 50%, 75%, and 90%.]

Now tell us what score you think you will make on the test in the next section. Remember, make an X wherever you think you will score.

[Rating line marked from 0% to 100%, with guide labels at 50%, 75%, and 90%.]

DON'T FORGET YOUR NAME AT THE TOP OF THE PAGE


Appendix C

SAMPLES OF REVISED COURSE FORMAT

The samples in this appendix illustrate the content and format used in the final revised version of the course. The first three pages are excerpts (non-continuous) from the course, showing presentation of instructional content with six practice problems. The following two pages are excerpts from the separate answer booklet, showing the answers to the six problems.

Problem 1.1. There are two kinds of vacuum tubes in an inventory, 6SN7 tubes and 6AQ6 tubes. Write a program to compute the total value of both kinds, placing the answer in TOTAL. The number of 6SN7 tubes is in STOCK1. The number of 6AQ6 tubes is in STOCK2. Temporary storage will be TEMP and TEMP+1. The cost of a 6SN7 is in VALUE and the cost of a 6AQ6 is in VALUE+1. Store the total worth of 6SN7 tubes in VALSTK and the total worth of 6AQ6 tubes in VALSTK+1. Assume at least one tube of each type. There is a 1 in KON. If you have to, use COMPUT as symbolic location for the first looping program, NEXT for the second, and SUM to get the total of both. Use asterisks wherever possible.

Problem 1.2. This problem is like the preceding one, except that the number of each type of tube is not known. Instead, each tube has information on its type stored relative to TUBE. Each location in the series has either a 1 or a 0--a 1 to indicate a 6SN7 tube, and a 0 to mean a 6AQ6 tube. The total number of all tubes is in STOCK, which can be saved in TEMP. Assume at least one tube. Use NEXT for sorting, LAST for the test for completion, and COMPUT for computation outside the loop. Use asterisks wherever possible. VALSTK and VALSTK+1 will not be needed. You can add VALUE into TOTAL in one subroutine and VALUE+1 into TOTAL in another.

Symbolic Locations with Address Arithmetic

Tracing out a program that uses asterisks becomes a little difficult, especially for someone else, when the skips are much greater than 9 instructions. In these situations, you are better off using a symbolic location with address arithmetic.

TRU DONE+2 means, "Transfer to the instruction two steps after symbolic location DONE."

What instruction would be performed after TRU OK-1 below?

    CLA THIS
OK  STR THERE
    TRU OK-1


Use as few symbolic locations as possible in the following problems by (1) using the * for transfers of 9 steps or less, and (2) address arithmetic relative to a symbolic location for transfers of more steps.

Problem 1.4. There is information about the ages and ranks of enlisted men and officers starting at location INFO, in the following three-word format: enlisted man or officer, age, and rank. The first word is coded with a + to indicate an officer, or a - to indicate an enlisted man. Count the number of enlisted personnel in location TOTEM. The total number of personnel is in location MEN. A constant of 1 is in K1 and a constant of 3 is in K3. Use EM as symbolic location for counting. Use DATA to modify the record to be processed. (The + and - signs can be considered the same as positive and negative numbers.)

Problem 1.5. Headquarters wants to know the number of Second Lieutenants eligible for promotion to First Lieutenant at the end of this month. Eighteen months active duty in grade are required for promotion from Second Lieutenant to First Lieutenant. The data for officers are stored in 5-word records as follows: rank, serial number, months in grade on active duty, MOS, and assigned unit. Ranks for officers are coded by 1 for Second Lieutenant, 2 for First Lieutenant, and so on. The records are kept relative to RATING. PERSON contains the number of personnel. This number is not to be destroyed. Store it temporarily in HOLD. Assume at least one officer. An 18 is in TIME, a 1 is in KON, and a 5 is in K5. Store the number of eligible officers in UP. Use RANK to modify addresses referring to rank, LAST for the completion test, and SECOND for instructions referring to time in grade if the officer is a Second Lieutenant.
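As an editor's illustration (not part of the original course), the Python sketch below shows the three-word record layout and the count that Problem 1.4 asks for. The sample records are invented, and stepping the record index by K3 stands in for the course's address modification.

    # Problem 1.4 in modern notation: records are three words long -- sign-coded
    # type, age, rank -- laid out one after another starting at INFO, so the
    # type word of record i sits at offset 3*i.
    INFO = [
        +1, 34, 3,    # officer (+), age, rank
        -1, 22, 4,    # enlisted man (-), age, rank
        -1, 27, 5,    # enlisted man (-), age, rank
    ]
    MEN = len(INFO) // 3      # total number of personnel
    K3 = 3                    # record length, as in the problem

    TOTEM = 0
    for record in range(MEN):
        if INFO[record * K3] < 0:   # a minus sign marks an enlisted man
            TOTEM += 1              # the counting step the problem labels EM

    print(TOTEM)   # 2 enlisted men in the sample data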



In effect, ADD DATA,IR2 says to copy DATA into a special temporary location (IR2) and there add to it the contents of that location (IR2). As IR2 contains the number 1, DATA+1 results. Thus, the complete instruction says the same thing as: ADD DATA+1.
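The short Python sketch below is an editor's illustration, not part of the original course. It models the point just made: the effective address is the written address plus the contents of the named index register. The register values and the sample address are invented.

    # Index registers modify the address field: effective address = written
    # address + register contents.
    index_registers = {"IR1": 2, "IR2": 1, "IR3": 1}

    def effective_address(base, register=None):
        """ADD DATA,IR2 with IR2 = 1 behaves like ADD DATA+1."""
        offset = index_registers[register] if register else 0
        return base + offset

    DATA = 100                              # say DATA assembles to address 100
    print(effective_address(DATA, "IR2"))   # 101, i.e. DATA+1
    print(effective_address(DATA))          # 100, plain ADD DATA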

Problem 2.1. Write an instruction that says the same thing as ADD COST+1. Index Register 3 contains the number 1.

Problem 2.2. Write an instruction that says the same thing as STR BOOK+2. Register 1 contains the number 2.

The Comma

Why the comma?

It is needed to separate two address fields, the one that we have been using all along and a new one that is used in indexing.

Up to now you have been working with symbolic location fields, operational codes (the commands), and a single address field. For example:

Symbolic Location    Op. Code    Address Field
REPEAT               ADD         COST

The address field you have been using is called the First Address Field. All addresses that we have used previously appeared in the First Address Field.

Indexing uses another address field, known as the Second Address Field. And, thus, the comma. Anytime more than one address field is used, a comma is required to keep them separated. Hence, the comma:

        ADD DATA,IR2

What is the address field for DATA in the instruction above? IR2?
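The Python sketch below is an editor's illustration, not part of the original course. It splits an instruction into its fields the way the text describes -- symbolic location, op code, then the comma-separated First and Second Address Fields. The parsing rules and the fixed list of op codes are simplifying assumptions, not the course's actual card format.

    # Splitting an instruction into its fields; the comma separates the
    # First and Second Address Fields.
    def parse(line, op_codes=("CLA", "ADD", "SUB", "STR", "TRU", "TRZ", "HLT")):
        parts = line.split()
        location = None
        if parts[0] not in op_codes:          # first token is a symbolic location
            location = parts.pop(0)
        op_code = parts.pop(0)
        first, _, second = (parts[0] if parts else "").partition(",")
        return {"location": location, "op code": op_code,
                "first address field": first or None,
                "second address field": second or None}

    print(parse("REPEAT ADD COST"))   # one address field, so no comma needed
    print(parse("ADD DATA,IR2"))      # DATA is the first field, IR2 the second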



ANSWERS TO PROBLEMS OF PHASE III: DATA PROCESSING

Problem 1.1.

        CLA ZRO
        STR REPEAT

(Note: the answer booklet's fragment for this problem reads:)

        CLA REPEAT
        ADD ONE
        STR REPEAT

Problem 1.2.

        CLA HUBERT
        ADD DIGIT
        STR HUBERT

Problem 1.4.

        CLA ZRO
        STR FINAL
REPEAT  CLA VALUE
        ADD FINAL
        STR FINAL
        CLA REPEAT
        ADD K               (Change VALUE to VALUE+1 so that on the
        STR REPEAT           next loop VALUE+1 will be added into FINAL.)
        TRU REPEAT

Problem 1.5.

        CLA ZRO
        STR EVERY
AGAIN   CLA COST
        ADD EVERY
        STR EVERY
        CLA AGAIN
        ADD TWO             (COST is modified to COST+2 for the
        STR AGAIN            next loop.)
        TRU AGAIN
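The answers to Problems 1.4 and 1.5 step through a series of locations by adding a constant to the address written in one of their own instructions. The Python sketch below is an editor's illustration, not from the answer booklet: it expresses the same summing loop with an explicit index in place of self-modifying code. The data values are invented.

    # What the Problem 1.4 answer does: each pass fetches the next location --
    # VALUE, VALUE+1, VALUE+2, ... -- and adds it into FINAL. The machine gets
    # the moving address by adding K to its own CLA instruction at REPEAT.
    VALUE = [3, 5, 2, 8]     # contents of VALUE, VALUE+1, VALUE+2, VALUE+3
    K = 1                    # the constant added to the address each pass

    FINAL = 0                # CLA ZRO / STR FINAL
    address = 0              # the address written in the instruction at REPEAT
    for _ in VALUE:          # the excerpt omits the completion test
        FINAL += VALUE[address]   # REPEAT CLA VALUE(+n) / ADD FINAL / STR FINAL
        address += K              # CLA REPEAT / ADD K / STR REPEAT
    print(FINAL)             # 18, the sum of the series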


Problem 2.1.

        CLA ZRO
        STR BONUS
CHECK   CLA PUSH            Copy in a number.
        TRZ TEST            If it's a zero, skip to the test for completion.
        CLA BONUS           If it's not zero, count the salesman here;
        ADD UNIT            he must have made at least one sale.
        STR BONUS
TEST    CLA MEN             Test for completion.
        SUB UNIT
        STR MEN
        TRZ STOP
        CLA CHECK           Address modification.
        ADD UNIT
        STR CHECK
        TRU CHECK           Looping.
STOP    HLT

Problem 2.2.

        CLA MEDICS
        STR TEMP
        CLA ZRO
        STR CALLS
AGAIN   CLA DOCTOR
        TRZ OUT             (if zero)
        CLA CALLS
        ADD ONE             (if not zero)
        STR CALLS
OUT     CLA TEMP
        SUB ONE
        STR TEMP
        TRZ STOP
        CLA AGAIN
        ADD ONE
        STR AGAIN
        TRU AGAIN
STOP    HLT
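Both answers above count how many entries in a series are non-zero (salesmen with at least one sale, doctors with at least one call), stepping the address after each record. The Python sketch below is an editor's illustration, not from the answer booklet; the sample data and the function name are invented.

    # The shared logic of the two answers: walk a series of per-person numbers
    # and count the non-zero ones. The machine versions step through the series
    # by modifying the address in the instruction at CHECK / AGAIN.
    def count_nonzero(records):
        counted = 0
        remaining = len(records)        # kept in MEN / TEMP on the machine
        address = 0
        while remaining > 0:
            if records[address] != 0:   # TRZ skips the count when the value is zero
                counted += 1            # ADD UNIT / ADD ONE
            address += 1                # address modification
            remaining -= 1              # completion test
        return counted

    PUSH = [3, 0, 5, 1, 0]              # sales per salesman (example data)
    print(count_nonzero(PUSH))          # 3 salesmen made at least one sale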


Unclassified
Security Classification

DOCUMENT CONTROL DATA - R & D
(Security classification of title, body of abstract and indexing annotation must be entered when the overall report is classified)

1. ORIGINATING ACTIVITY (Corporate author):
   Human Resources Research Office
   The George Washington University
   Alexandria, Virginia 22314

2a. REPORT SECURITY CLASSIFICATION: Unclassified

2b. GROUP:

3. REPORT TITLE:
   THE APPLICATION OF THEORETICAL FACTORS IN TEACHING PROBLEM SOLVING BY PROGRAMED INSTRUCTION

4. DESCRIPTIVE NOTES (Type of report and inclusive dates): Technical Report

5. AUTHOR(S) (First name, middle initial, last name): Robert J. Seidel and Harold G. Hunter

6. REPORT DATE: April 1968

7a. TOTAL NO. OF PAGES: 68

7b. NO. OF REFS: 25

8a. CONTRACT OR GRANT NO.: DA 44-188-ARO-2

8b. PROJECT NO.: 2J024701A712 01

9a. ORIGINATOR'S REPORT NUMBER(S): Technical Report 68-4

9b. OTHER REPORT NO(S) (Any other numbers that may be assigned this report):

10. DISTRIBUTION STATEMENT: This document has been approved for public release and sale; its distribution is unlimited.

11. SUPPLEMENTARY NOTES: Research for Programed Instruction in Military Training

12. SPONSORING MILITARY ACTIVITY:
    Office, Chief of Research and Development
    Department of the Army
    Washington, D.C. 20310

13. ABSTRACT:

In continuing research into the technology of training, a study was undertaken to devise guidelines for applying programed instruction to training courses that involve the learning of principles and rules for use in problem solving. As the research vehicle, a portion of the material in the Army's ADPS Programing Specialist Course was programed to explore several different factors in using automated instruction to teach computer programming. Experimental versions of the course were administered to over 900 subjects in various experimental groupings. Criterion and retention tests based on actual job problems were used to measure subjects' performance, along with in-training measures. Results in a series of prompting/confirmation variations indicated that giving subjects extensive stimulus support during training helps motivate them and improves scores during training, but hampers them in using what they have learned. Requiring subjects to fully write out rules during training hindered them in developing problem-solving skills applying these rules; however, using mnemonics (writing only the names of the rules) during training aided subjects in retaining what they had learned, particularly for more complex material. Working with a variety of practice problems facilitated the learning of problem-solving skills.

DD FORM 1473                                                        Unclassified
                                                              Security Classification


Unclassified
Security Classification

KEY WORDS

Computer Programming

Confirmation

Error Rate

Learning

Mnemonics

Problem Solving

Programed Instruction

Prompting

Stimulus Support

Training Technology

Variety of Practice

Verbalization

Unclassified
Security Classification


DISTRIBUTION LIST

[Several pages of addressees -- military commands, government agencies, universities, libraries, and research organizations -- follow in the original; they are illegible in this copy.]

