Youth Employment and Training Programs: A Review




Cornell University ILR School
DigitalCommons@ILR

Articles and Chapters ILR Collection

10-1-1987

Youth Employment and Training Programs: A Review
Vernon M. Briggs Jr
Cornell University ILR School, [email protected]



Youth Employment and Training Programs: A Review

Abstract
[Excerpt] The Youth Employment and Demonstration Projects Act of 1977 (YEDPA) manifested a quantum leap in efforts both to meet the needs and to understand the employment problems of youths in the labor force. Over its brief life (from mid-1977 to early 1981), YEDPA served both as a massive delivery system for new programs and as an extensive laboratory for social experimentation. As such, an assessment of its activities and accomplishments must inevitably become intertwined with the suspicions that exist between those primarily interested in meeting needs and those largely concerned with evaluating the effectiveness of these ventures. These two groups have been cast into the same arena as the result of the congressional tendency to link public funding for social experiments with the requirement that they be evaluated to see if promises are consistent with performance.

Keywords
employment, labor force, poverty, youth employment, public policy, labor demand

Disciplines
Education | Labor Relations | Public Policy | Training and Development

Comments
Suggested Citation
Briggs, V. M., Jr. (1987). Youth Employment and Training Programs: A Review [Electronic version]. Industrial and Labor Relations Review, 41(1), 137-140. http://digitalcommons.ilr.cornell.edu/articles/152/

Required Publisher Statement
Copyright by Industrial and Labor Relations Review.

This article is available at DigitalCommons@ILR: http://digitalcommons.ilr.cornell.edu/articles/152


YOUTH EMPLOYMENT AND TRAINING PROGRAMS:*

A REVIEW AND A REPLY

Review by Vernon M. Briggs, Jr.

The Youth Employment and Demonstration Projects Act of 1977 (YEDPA) manifested a quantum leap in efforts both to meet the needs and to understand the employment problems of youths in the labor force. Over its brief life (from mid-1977 to early 1981), YEDPA served both as a massive delivery system for new programs and as an extensive laboratory for social experimentation. As such, an assessment of its activities and accomplishments must inevitably become intertwined with the suspicions that exist between those primarily interested in meeting needs and those largely concerned with evaluating the effectiveness of these ventures. These two groups have been cast into the same arena as the result of the congressional tendency to link public funding for social experiments with the requirement that they be evaluated to see if promises are consistent with performance.

If it were simply a matter of implementing programs and then attempting to assess their results, there would be little room for disagreement. But, increasingly, the credo is developing that the design of the program must be such that it facilitates the evaluation process. In a phrase, the tail is attempting to wag the dog. This report strongly implies that this inversion of priorities is good. Unlike all other major industrial nations, which have been content to initiate labor market interventions and be satisfied with the intuitive belief that what seems logical to do must be so, the United States has taken the opposite tack. Policy interventions must prove themselves before they can be deemed worthy. Public policy makers have been mesmerized by the claims of many social scientists that they actually know how to assess the effectiveness of policy interventions if only given the opportunity (i.e., a sufficient number of research dollars and adherence to their professional standards in the design of program activities). Congress has bitten at the bait and a political corollary has evolved that any hesitancy in endorsing efforts to evaluate a program implies somehow that someone has something to fear.

* Committee on Youth Employment Programs of the Commission on Behavioral and Social Sciences and Education of the National Research Council, Youth Employment and Training Programs: The YEDPA Years. Washington, D.C.: National Academy Press, 1985. 495 pp. $24.95.

It is doubtful that any other employment and training program (or perhaps even any other human resource development undertaking with employment consequences) has been subjected to such intensive evaluation activities as were those sponsored by YEDPA. It is in this context that this report by the National Research Council (NRC) was commissioned by the U.S. Department of Labor. The report was prepared by a prominent panel of twelve academics, two researchers from private research organizations, and one program administrator. It was chaired by Professor Robin Hollister and was assisted by NRC staff. In addition to its own work, the panel commissioned seven excellent background papers that are included as appendices.

Industrial and Labor Relations Review, Vol. 41, No. 1 (October 1987). © by Cornell University. 0019-7939/87/4101 $01.00



To understand the scale of the task, it is necessary to review briefly the Herculean research effort that YEDPA itself embodied. Throughout the evolutionary era of federally supported employment training programs of the 1960s and 1970s, there was mounting concern that youth were being poorly served. Despite recognition that youth unemployment rates were persistently and inordinately high (frighteningly so for minority youth), youths were gradually being squeezed out of participation in these endeavors by the tendency to target activities to adult groups (e.g., to welfare recipients, veterans, household heads, and displaced homemakers). Accordingly, YEDPA attempted to assure that youths would be served by creating a new program exclusively for them while at the same time mandating that youth participation levels in the other programs be held constant. In the fiscal year prior to passage of YEDPA, federal expenditures for youth employment and training totaled $955 million. Over the next four fiscal years, youth expenditures exceeded $8 billion, with about 1.5 million youth being served each year.

But in addition to serving youth, YEDPA contained a legislative mandate "to explore methods of dealing with the structural unemployment problems of the Nation's youth" and "to test the relative efficacy of dealing with these problems in different local contexts." Over $500 million of YEDPA funds were earmarked for this massive research undertaking. As the background paper by Richard Elmore aptly states, "It [YEDPA] was one of the largest short term investments in social research and development ever undertaken by the federal government. Its scale and complexity dwarfed any research and development effort undertaken by the Department of Labor before or since" (p. 283). Referred to as "knowledge development," it is these studies—not the program's accomplishments per se—that were the concern of the panel's report. The report is essentially an effort to evaluate the evaluators.

The knowledge development program was launched under the leadership of Robert Taggart, the director of the YEDPA-created Office of Youth Programs (OYP) in the U.S. Department of Labor. Taggart, who is one of the most knowledgeable and practical-minded scholars in the entire employment and training field, had been called into government service by the Secretary of Labor, Ray Marshall. As Taggart had a small research staff, the work had to be done primarily by nongovernmental persons and private research organizations. In some cases, private "intermediaries" were actually created by OYP to run programs in order to study the results. The overall output of this immense research effort was 428 studies.

Confronted with such a staggering number of reports, the NRC review panel had to decide how to proceed. The panel's initial selection criteria were twofold: the report had to be on a youth program that was actually implemented during the YEDPA era, and the report had to contain quantitative data on the effectiveness of that program. About 200 reports met these standards. These studies, in turn, were subjected to further screening criteria: there had to be both pre-program and post-program measurement of program objectives; comparable comparison group data had to be presented; and the initial sample size and response rates for both participant and control groups had to be sufficient to allow standards of statistical significance to be applied.
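To make the panel's two-stage screen concrete, here is a minimal sketch in Python. The report states the criteria only in prose, so the record structure and field names below are hypothetical, not the panel's.

```python
# Hypothetical sketch of the NRC panel's two-stage screen (invented field names).
from dataclasses import dataclass

@dataclass
class Report:
    title: str
    implemented_under_yedpa: bool     # youth program actually run in the YEDPA era
    quantitative_effectiveness: bool  # contains quantitative effectiveness data
    pre_and_post_measures: bool       # pre- and post-program measurement of objectives
    comparison_group_data: bool       # comparable comparison-group data presented
    adequate_samples: bool            # sample sizes/response rates permit significance tests

def passes_initial_screen(r: Report) -> bool:
    # First cut: roughly 200 of the 428 reports met these two standards.
    return r.implemented_under_yedpa and r.quantitative_effectiveness

def passes_full_screen(r: Report) -> bool:
    # Second cut: only 28 studies survived all five criteria.
    return (passes_initial_screen(r) and r.pre_and_post_measures
            and r.comparison_group_data and r.adequate_samples)

reports = [
    Report("Study A", True, True, True, True, True),      # survives both screens
    Report("Study B", True, True, False, True, True),     # lacks pre-program measures
    Report("Study C", True, False, False, False, False),  # process study, no outcome data
]
print([r.title for r in reports if passes_full_screen(r)])  # ['Study A']
```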

Ultimately, the winnowing process led to the selection of 28 studies that became the actual basis for the effectiveness review of YEDPA. The panel concluded that these studies were chosen on the basis of their "scientific merit" (p. 100). Thus, despite the title of its report, the review is not of the YEDPA research findings as such but, rather, only of a sample of the studies systematically selected to meet the panel's standards. In a real sense, the report is less about the accomplishments of YEDPA and more a commentary on the efficacy and utility of certain evaluation methodologies.

Although the panel feels that its screening procedures were "reasonable," the result is an uneven presentation of YEDPA activities. Only a few studies are used to provide a basis for the broad conclusions that the report draws on how completely various program objectives were achieved. The report does warn the reader "not to confuse a conclusion about the failure of research to provide adequate evidence with a conclusion that a particular program itself was ineffective or failed in some manner" (p. 3). Although such a subtle difference may be obvious to academics, it is certain to be missed by most others who might read the report. Indeed, media accounts that accompanied the release of the NRC report concluded that it was a clear indictment of YEDPA programs.1

1 E.g., see Kenneth B. Noble, "Impact of Youth Job Projects Is Declared Limited by Panel," New York Times, November 13, 1985, p. A-20.

The report contains an excellent discussion chapter on the nature and dimensions of the youth unemployment problem. It notes that the official unemployment rate is a "particularly ambiguous" indicator of youth employment problems because it becomes entangled with school attendance. About half of the youth reported as being unemployed have "a full-time though unpaid occupation, attending school." Thus, the importance and the complexity of the problem require more than a mere recitation of unemployment figures. Rather, the significance of this issue is to be found in a careful study of the employment experiences of youths of different races, in school and out of school; of inactivity rates (i.e., the percentage of a population that is not employed, or in the military, or in school) by race and gender; and of the adverse effects of labor market entry barriers. Thus, the panel seeks to place the youth issue in proper perspective while trying not to diminish the critical importance of the subject.
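Formalized as a worked equation (the set notation is mine, not the report's): with youth population P and possibly overlapping subsets E (employed), M (in the military), and S (in school), the inactivity rate counts those in none of the three:

```latex
\[
  \text{inactivity rate} = \frac{\lvert P \setminus (E \cup M \cup S) \rvert}{\lvert P \rvert}
\]
```

For example, in a hypothetical cohort of 1,000 youths of whom 700 are in at least one of E, M, or S (each person counted once), the inactivity rate is 300/1,000 = 30 percent.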

For the majority of youth, there is no serious problem; but for certain youth subgroups—especially for blacks, increasingly for Hispanics, and generally for teenagers from poverty families—the problem of youth unemployment is chronic. It was primarily to serve and to study these subgroups that YEDPA was enacted. They are the youths whom the educational system has often failed to serve and with whom the employment and training programs of the nation have had to deal. The powerful theme of this discussion chapter is that it is the actions of our contemporary society that generate youth employment problems. It is the broader institutional structure of society that is failing, not the youth employment programs that are groping to find a means to reach and to rescue these forgotten youth.

Yet when the report turns to the findings of the studies, it concludes that there is relatively little evidence to prove that most YEDPA interventions achieved their goals. The findings, however, are carefully couched in a language that does not say that the programs failed but only that there is limited evidence of accomplishment. Although YEDPA did provide a significant number of temporary employment opportunities for youth, it was hard to find lasting differences between those who participated in the programs and those who did not. For example, the report states that there was no reliable evidence found to support a view that temporary employment affected long-run post-program employment or earnings patterns; or that temporary jobs for youth were an effective means of increasing school retention; or that temporary jobs could lure former dropouts back to school.

Thus, for a reader (and for a reviewer) there is a predicament. The report does not say that YEDPA activities failed but only that it is not possible to say whether they succeeded. The NRC panel placed very tight constraints on what it would consider as evidence. Hence, everything it says is either carefully hedged or based on a review of such a small number of the available studies that one does not really know what to deduce.

Some conclusions do seem open to different interpretations. For instance, the largest of all the YEDPA undertakings was a job entitlement program for eligible low-income youths. It guaranteed a part-time job to all dropouts who returned to school and to potential dropouts who remained in school if they lived in any of a selected number of designated geographical areas. The report draws the negative conclusion that the increases in immediate post-program earnings of the participants were essentially due to the pooling of urban and rural program results. If the rural data for the Mississippi Delta region were removed, the benefits disappeared. Only by reverse implication does one realize that this large-scale undertaking did work in rural areas.

By using quality of data and of research designs as the sole criteria for what it would study, the report will no doubt win kudos from the professional evaluation community. But for policy makers and those scholars and persons in the public who are genuinely looking for lessons to be applied in the future, reading much of the report is largely an exercise in frustration. The report is timid and cautious because it chose to be so. Unfortunately, it is likely that the widest general appeal of this report will be to the opponents of direct public policy interventions, who are sure to interpret it to be a warning against future actions.

The irony of this report is that by its own admission it is an uneven presentation of YEDPA program activities. As a result, it is hard to take seriously the instructional message that it tries to convey. It argues that YEDPA programs were trying to do things that the educational system should have been doing but that these programs lacked the "institutionalization of the sort that has given the educational system its accepted place in the mainstream of American life" (p. 33). Hence, its plea that more attention be paid to the institutions that deliver services to youth sounds very hollow. There were reports written for YEDPA that did discuss these institutional issues, but because of the panel's selection criteria those reports were ignored.

Likewise, the extensive discussions in this report of the credentialing effect of education, the prevalence of discrimination in the labor market, and the issue of "stigmatizing" participants in employment training programs are not likely to receive the research priority they deserve. The bulk of the report is focused on methodological purism and the alleged merits of random assignment and long-term follow-up studies. Having identified the institutional labor market issues as being paramount concerns, the panel essentially dismisses them from its assessment of the YEDPA experience. Despite its acknowledgment of "the need for a direct study of the roles and relationship of the education and the employment and training system" (p. 33), the report essentially extols reliance on methodologies that ignore all of the key institutional questions that are inherent in this symbiosis.

No institution impinges more significantly on youth than does the education system. Yet the study of labor economics has for the last two decades tried to refute the notion that institutions can significantly influence labor market outcomes. This report could have sought to reverse the tide of irrelevance that has often been the product of such research biases by stating an effective case for a return to balance between institutional and econometric research. YEDPA produced not only a vast number but also a large variety of research studies. But the panel chose to ignore most of the work in favor of a concentration on those few studies that met (or approached) certain methodological ideals. These chosen studies were by far the most expensive conducted by YEDPA. That the sample of these studies proved so little may be less a comment on the limitations of the programs than an indication of how myopic are the concerns inherent in the methodology that the panel lauds as being "scientific." Given the wealth of research that YEDPA produced, this report missed a golden opportunity to say more about the importance of diversity in research methodologies in a field that its discussion proves is still far more social than it is science.

VERNON M. BRIGGS, JR.

Professor
New York State School of Industrial and Labor Relations
Cornell University


Reply by Robinson G. Hollister, Jr.

The other men were easy to talk to, but they didn't know anything. If one stopped to think about it, it was depressing how little most men learned in their lifetimes. Pea Eye was a prime example. Though loyal and able and brave, Pea had never displayed the slightest ability to learn from his experience, though his experience was considerable. Time and again he would walk up on the wrong side of a horse that was known to kick, and then look surprised when he got kicked.

L. McMurtry, Lonesome Dove

A central concern of our committee, or at least a concern of mine, was that, for the most part, like Pea Eye, we in the United States had, in the past, not taken advantage of our experience with governmental employment and training programs in order to learn, in a systematic way, about what programs work for various groups in the population, including the youth population. With the massive federal initiative on youth employment embodied in YEDPA, had we once again failed to learn from experience and been surprised at the resultant "kick," or was it different this time?

There was some reason to hope, at the outset of our work, that YEDPA would prove the exception to the past habit of learning little from experience. The legislation had explicitly set as a major purpose "establishment of pilot, demonstration and experimental programs to test the efficacy of different ways of dealing with the problem of youth unemployment" and created authority and money for the Secretary of Labor to conduct research, demonstration, and evaluation activities concerning youth employment problems. Further, pursuant to that authority, the Department of Labor's Office of Youth Programs took the unprecedented step of trying to lay out specifically the research and evaluation questions they hoped would be answered and titled this a "Knowledge Development Plan."

Thus, it seemed sensible for a National Academy of Science Committee to undertake to respond to the charge put to it: (1) to review what is known about the effectiveness of the principal types of YEDPA programs; (2) to assess existing knowledge regarding the implementation of Youth Employment Programs; (3) to evaluate the YEDPA research strategy; and (4) to summarize the lessons learned from YEDPA for future policy development and program implementation.

The results of the committee's work in response to that charge are summarized in the volume that is under review. We apologize for the length of the volume; we decided that if we were going to present summary judgments it was best to follow our high school teachers' admonishments to "show all your work," or at least enough so that readers could see the foundations upon which those summary judgments were built.1

We are grateful to Professor Briggs for a careful reading of our report, particularly in light of its considerable length. On behalf of the committee, I would like to thank Professor Briggs for his several favorable comments on the report. At the same time, I would naturally like to correct what I consider misreadings and oversights.

1 In his review (p. 7), Professor Briggs takes issue with the way we present results of the job entitlement program for low-income youths. It is useful, I think, to note that he is able to do so because we were so explicit as to what the results were and how we came to our conclusions about them: findings of negative urban effects of the program were balanced against positive rural effects. One can differ over the presentation of the findings—and there were such differences within the committee—but the important point, not to be missed, is that we provided readers with the means to reevaluate our conclusions on their own.


Professor Briggs argues that there are those primarily interested in "meeting needs" and those "largely concerned with evaluating the effectiveness of these ventures."2 This division seems to us false and misleading—there are many people with both concerns—but, setting this dispute aside, there is also at base the issue of how one is "meeting needs" if it turns out that a given program is shown to be largely ineffective in changing the life chances of the program participants.

2 It is interesting that a little further on in his review Professor Briggs, in discussing the research and demonstration efforts made in the YEDPA legislation, states: "Over $500 million were earmarked for this massive research undertaking." Focusing on that figure illustrates again a confusion about conflicts between evaluation research and "meeting needs." As we note in our report (p. 78), 85% of the $500 million designated for demonstration and research went for the delivery of services, which fits, we presume, the "meeting needs" category; just 15% of the resources went directly for research costs. The presumption that doing evaluation research on program effectiveness means that "needs" of the target population will not be met because resources are being sucked up by researchers is simply not correct.

Professor Briggs argues that "the credois developing that the design of theprogram must be such that it facilitates theevaluation process." We argued, instead,that the design of some programs canfacilitate the evaluation process so that wecan learn from experience. Briggs arguesthat the report suggests that "the tailshould wag the dog," but he would behard pressed to identify language of thereport that makes such a suggestion. Ourlament is that programs and evaluationsseemed to have been run in such a waythat we can make out neither a dog nor atail but, for the most part, only anindecipherable array of body parts. In-deed, the report argues that attempting todo less evaluation research, in the sense oftrying to evaluate a smaller number ofprogram types, but doing the evaluationsin a sound fashion would have contributedmore knowledge than did the broad,ambitious sweep of the YEDPA demonstra-tion and research efforts.

In his polemic against evaluation of programs, Briggs notes that most industrialized nations have "been content to initiate labor market interventions and be satisfied in the intuitive belief that what seems logical to do must be so," implying that this is the best way to proceed in governmental labor market interventions. This description is certainly a fair representation of what European nations have done: these countries generate virtually no serious evaluations of employment and training program effectiveness.

In the American experience we have found that Professor Briggs's suggestion that "what seems logical to do must be so" is not always a sensible prescription. Consider, for example, government education policies. For many years "school men" have been telling us that the best way to improve educational performance is to increase expenditures per pupil, reduce the size of classes, and pay more to teachers who attain higher degrees. It was a case of "what seems logical to do must be so." But the analysis begun in the 1960s has shown that these simple logical relationships do not hold up, and that effective government intervention to improve educational performance is far more difficult and complex than had been supposed by the simple prescriptions of the "school men."3

Similarly, it seemed sensible to help family farmers by providing price supports for the commodities they sell, but after decades of such supports, careful analysis showed that the benefits from these policies flowed not to the small family farmer but to the large corporate farming sector.4

In both of these cases the prescriptions seemed logical and people believed these programs were "meeting needs" in the society, but careful analysis told a different story.

3 See E. Hanushek, "The Economics of Schooling," Journal of Economic Literature, Sept. 1986, for a review of many of the studies yielding these findings.

4 See J. D. Johnson and S. D. Short, "Commodity Programs: Who Has Received the Benefits?" American Journal of Agricultural Economics, Dec. 1983, for a review of studies of the distributional impact of farm support programs.


Despite Professor Briggs's implication that the thrust for evaluation in this report serves to undercut youth programs by encouraging "the opponents of direct public policy interventions, who are sure to interpret it [this report] to be a warning against future actions," it can be argued that continued strong evaluations have played an important role in sustaining a significant employment program. Ever since its inception in the 1960s, the Job Corps has continuously been under attack as a very expensive training program for disadvantaged youth. ("We could send a kid to Harvard for that amount.") Each Congress has had to deal with attempts of various parties to terminate this program, but these efforts have been regularly turned back in part because supporters of the Job Corps were able to point to well-substantiated findings from evaluation efforts that indicated that the social benefits from the program considerably outweighed the costs.

The focus of our report on effectiveness findings and on research derives clearly from points 1 and 3 of the charge to our committee (quoted at the outset of this reply). We focused on effectiveness because that was the principal charge to this committee. The criteria for selection of the reports seemed to us to reflect reasonable standards to apply if one were going to come to conclusions about program effectiveness. The fact that applying these reasonable standards reduced the number of usable studies from over 400 to just under 30 was as shocking to our committee as it would be to any reasonable observer. It should be emphasized that we were not arbitrarily posing questions about the effectiveness of YEDPA using standards that were sharply at variance with those enunciated by the program administrators themselves. In their "youth knowledge development plan" the National Office of Youth Programs explicitly proposed to answer a series of major questions about the effectiveness of the youth programs; thus, in focusing on the effectiveness aspects of the report on YEDPA, the committee was largely following the path set by the program administrators themselves. We did have to set standards for what constituted reasonable evidence bearing on those questions, but we find it hard to believe that those standards would be judged unreasonable by the social science community.

Briggs notes that the report warns readers "not to confuse the conclusion about the failure of research to provide adequate evidence with the conclusion that a particular program itself was ineffective or failed in some manner." Indeed, this warning was put at the very beginning of the report and underlined and repeated later in the report. He argues that this point is too subtle for most readers and refers to the New York Times article on the program as evidence that such a warning is not sufficient. Of course it is always risky to try to get across a somewhat complex message, but I really wonder what alternative path Professor Briggs would have had us take. Three possibilities come to mind: don't put such a warning into the text; don't report that there was little evidence on program effectiveness; make up some plausible stories to suggest that specific programs were successful, or were failures, even though there was little reliable evidence bearing upon either success or failure. It is hard to believe that Professor Briggs would endorse any of these alternatives (and, of course, our committee never seriously considered any of them), yet that is what he appears to do by implication.

In the concluding section of his review Professor Briggs emphasizes the importance of institutional issues and argues that our committee dismissed them. The institutional factors operate at two levels: first, there are those that operate generally in the labor market and educational system, and, second, there are those that can affect the implementation and effectiveness of employment and training programs per se. The committee struggled with both of these sets of institutional factors in its discussions and, in the process, became keenly aware of its own inability to generate satisfactory commentaries on the state of knowledge regarding such factors. We sought to remedy our self-perceived deficiencies in this regard by commissioning papers by others whom we hoped might better address these types of concerns. Aspects of these papers were incorporated both directly into the text and, in some cases, into appendix papers published with the report.

Professor Briggs laments that "there were reports written for YEDPA that did discuss these institutional issues, but because of the panel's selection criteria those reports were ignored. Likewise, the extensive discussions in this book of the credentialing effect of education, the prevalence of discrimination in the labor market, and the issue of 'stigmatizing' participants in employment training programs are not likely to receive the research priority they deserve."

I respond to this contention in two parts. First, the claim that our selection criteria led us to ignore the reports discussing the institutional issues is simply not correct. The selection criteria had to do with the analysis for effectiveness (the content of Chapters 4-8). Beyond the analysis for effectiveness findings, all of the reports were also screened to pick out the discussions of implementation, the institutional issues. A paper was commissioned in which we asked the author to use these reports and other sources as the raw material to try to draw together what could be learned about the problems of implementation and the strengths and weaknesses of various methods of dealing with those problems. Further, two other authors were commissioned to write papers on implementation issues. One of these authors, who had continuing experience at the local level in the operation of employment and training programs, was asked to try to present "the lessons from experience" with the YEDPA and similar programs. The second, Richard Elmore, was asked to review in detail the background to the development of YEDPA and the decision-making processes at the federal level that shaped the program. That paper is reproduced in its entirety as a commissioned paper in the report.

We distilled the major elements of these four sources—the reports themselves and the three commissioned papers dealing with implementation—and presented the result as Chapter 3 of the report, "Implementation of the Youth Employment and Demonstration Projects Act." It is curious that in his emphasis on the importance of institutional issues, Briggs fails to mention a major chapter that was explicitly directed to the problems of implementation of YEDPA programs.5

5 Professor Briggs could not have known from reading the text about all this detail concerning commissioned papers, but that does not excuse his omitting mention of the chapter on implementation and of other indications that we took the institutional context of employment and training programs very seriously.

With respect to the second part of Professor Briggs's lament—about discrimination, stigmatization, and so on—I would note that we point to the possibilities of discrimination as a factor in youth employment problems (pp. 55-56, 63); we comment on the potential importance of the social context (pp. 64, 65) and include in the report a commissioned paper by Elijah Anderson on this issue (pp. 348-66); we emphasize, as the concluding major point of our "Summary, Conclusions and Recommendations," the dilemma created by the fact that making employment and training more "target efficient" by focusing them on the disadvantaged population may cause these programs to be both "stigmatized" themselves (a "program for losers") and a cause of "stigma" for the participants (pp. 24, 33). Finally, in one of our major recommendations, we state: "The role of the school system and the relation between the schools and the youth employment and training system are critical in resolving this problem [of targeting without stigmatizing]. The committee therefore recommends a direct study of the appropriate role of the youth employment and training system and its relation to the educational system." This major recommendation is surely a call for more institutional research, but Professor Briggs simply dismisses it as "hollow." What are we to do?

Professor Briggs censures us for being "timid and cautious." I would argue that we were not cautious but, rather, truthful. We reported the state of knowledge about program effectiveness as we found it. Professor Briggs may not like what we found, but he does not mention any study or finding that we missed or ignored that was at variance with our findings or conclusions. We thought we could detect some of the reasons why more is not known about "what works for whom" and, better yet, we laid out some relatively simple methodological guidelines (based not on hypotheses but on examples of actually executed evaluations)6 so that in the future more will be learned from program experience and research efforts.

6 These guidelines (outlined on pp. 30-32) are: randomly assign subjects to participation in the program and to a control group; have a reasonably large sample of participants and controls; and take vigorous steps to maintain contact with both participants and controls over a long enough period following the program—2-3 years—to determine whether the effects of the program become evident only with time and whether they endure or fade out. If these steps are taken, elaborate econometric techniques are not needed to estimate the impact of the program; quite the contrary, following successful implementation of these procedures the simplest comparison of the experience of participants and controls yields reliable estimates of the effects of the program. These guidelines are not only straightforward, but they have in fact been successfully followed in several major studies of employment and training programs. They are not econometric esoterica, as implied in Professor Briggs's review, but sensible procedures for evaluating program effects.
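The following Python fragment sketches the design those guidelines describe: random assignment, follow-up, then the simplest comparison of mean outcomes for participants and controls. It is illustrative only; the function names and the toy outcome measure are assumptions, not anything taken from the report.

```python
# Minimal sketch of the guidelines in footnote 6 (names and outcome are assumed):
# randomize assignment, follow up, then take a simple difference in means.
import random
import statistics
from typing import Callable, Sequence

def estimate_program_effect(
    candidates: Sequence[str],
    outcome_after_followup: Callable[[str, bool], float],
    seed: int = 0,
) -> float:
    rng = random.Random(seed)
    pool = list(candidates)
    rng.shuffle(pool)  # random assignment to participation or control
    half = len(pool) // 2
    participants, controls = pool[:half], pool[half:]
    # outcome_after_followup(person, treated) stands in for measuring, say,
    # earnings 2-3 years after the program, for both groups.
    treated = [outcome_after_followup(p, True) for p in participants]
    untreated = [outcome_after_followup(c, False) for c in controls]
    # With randomization, the simple difference in means estimates the
    # program effect; no elaborate econometric techniques are needed.
    return statistics.mean(treated) - statistics.mean(untreated)

# Toy usage with a fabricated outcome function (illustration only):
effect = estimate_program_effect(
    [f"youth_{i}" for i in range(1000)],
    lambda person, treated: 100.0 + (15.0 if treated else 0.0),
)
print(round(effect, 1))  # -> 15.0
```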

For twenty years employment and training policy formulation has been guided largely by the impressions and intuitions of well-meaning people (including many of us on the committee) about the character of employment problems of the disadvantaged and what would work to solve them. But good intentions are not enough. I argue that we have plenty of evidence that impressions and intuitions can go wrong, that the "needs" of the disadvantaged are hardly "served" by the continuation of ineffective programs, and that we can learn from experience in order to redirect those resources in ways that will better serve this population in the future. I hope that our report, and Professor Briggs's provocative review of it, stimulate those concerned with youth employment problems and programs to consider seriously this argument.

ROBINSON G. HOLLISTER, JR.
Professor of Economics and Chair
Department of Economics
Swarthmore College
