Tales of Refusal, Adoption, and Maintenance: Evidence-Based Substance Abuse Prevention Via School-Extension Collaborations

TENA L. ST. PIERRE AND D. LYNNE KALTREIDER
Available online 29 September 2004

ABSTRACT

Despite availability of empirically supported school-based substance abuse prevention programs, adoption and implementation fidelity of such programs appear to be low. A replicated case study was conducted to investigate school adoption and implementation processes of the EXSELS model (Project ALERT delivered by program leaders through Cooperative Extension). Interviews with school personnel revealed: (1) schools were not aware of evidence-based programs until Extension approached them; (2) schools dared not eliminate DARE; (3) teachers are unlikely to implement with fidelity; (4) implementation of theory-based prevention is not consistent with school views of curriculum delivery; and (5) schools believed Project ALERT via EXSELS was an advantage over teacher delivery, but only three of the eight schools sustained the model. Discussed are potential for Extension as a national implementation system, the value of qualitative inquiry to study processes of adoption, and issues related to the selection and implementation of evidence-based programs.

INTRODUCTION

The recent movement in many disciplines, such as medicine, social work, and education, to transfer research evidence into practice also has occurred in the field of substance abuse prevention. However, despite considerable efforts, instrumental use of evaluation findings, which would lead to changes in practice, has had limited success (Nutley, Walter, & Davies, 2002). Processes involved in disseminating research evidence to use in real-world settings are complex, dynamic, and non-linear (Weiss, 1998). According to Diffusion of Innovation Theory, groups adopt new practices, or innovations, at different rates, and they implement them with varying degrees of fidelity depending on many organizational and contextual factors (Rogers, 1995).

Tena L. St. Pierre • Department of Agriculture and Extension Education, 325 Agricultural Administration Building, The Pennsylvania State University, University Park, PA 16802, USA; Tel.: (1) 814-865-0399; Email: [email protected] (T.L. St. Pierre)

American Journal of Evaluation, Vol. 25, No. 4, 2004, pp. 479–491. All rights of reproduction in any form reserved. ISSN: 1098-2140 © 2004 American Evaluation Association. Published by Elsevier Inc. All rights reserved.

The use of qualitative case studies is one way to gain understanding of the processes that impede or contribute to the adoption, faithful implementation, and maintenance of evidence-based programs and to identify strategies to enhance their dissemination. Over the last decade, evaluators increasingly have recognized the value of case studies for providing “detailed understanding of what is going on and solid grounds for making improvements” (Patton, 1997, p. 288). Moreover, such investigations also may reveal important context-specific information to help evaluators understand the more fundamental issues of where and under what conditions programs will be effective.

In this paper, we illustrate the use of a replicated case study approach to evaluate the processes of adoption, implementation, and maintenance of the Project ALERT substance abuse prevention program delivered in schools by program leaders from the Cooperative Extension Service (CES).1 The case study was conducted within the context of a larger outcome study of this delivery model called EXSELS (Extension and Schools Enhancing Life Skills).

We first provide background information on evidence-based substance abuse prevention, the larger outcome study, and CES. We then discuss (1) reasons schools refused to adopt the EXSELS model, (2) challenges for translating research to practice as identified through case study interviews with personnel at the eight Pennsylvania middle schools that adopted the model, (3) maintenance one year later, (4) CES’ potential to disseminate evidence-based prevention programs, and (5) implications for evaluation practice.

BACKGROUND

Evidence-based Substance Abuse Prevention

Two decades of prevention research have resulted in the development and identification of empirically supported school-based substance abuse prevention programs for youth. Within the last few years, several federal agencies have been promoting and requiring the use of these programs to qualify for funding. For example, since 1998, the Office of Safe and Drug Free Schools, which supplies the largest single source of federal substance abuse prevention funding for schools, has required districts to adhere to “Principles of Effectiveness” by choosing from its list of approved research-based programs (U.S. Department of Education [DoE], 1998).

Despite these efforts, only a minority of school districts in the United States have adopted proven programs (Hallfors, Pankratz, & Sporer, 2001). From their survey of 1,905 U.S. public and private middle schools, Ringwalt et al. (2002) reported that just one third of public schools and one eighth of private schools are using effective substance abuse prevention curricula.

Furthermore, studies have shown that even when schools adopt evidence-based programs, implementation fidelity varies widely. A survey of 81 Safe and Drug Free School district coordinators across 11 states found that 59% had chosen evidence-based curricula, but only 19% of schools were implementing them with fidelity (Hallfors & Godette, 2002).

As suggested by at least one group of investigators (Rohrbach, Graham, & Hansen, 1993), it may be more practical and cost-effective to train outside providers who are enthusiastic about the program and skilled at interactive teaching strategies. Overworked teachers may appreciate having someone else take responsibility for teaching substance abuse prevention (Dusenbury & Falco, 1995), and school administrators may be more enthusiastic about adopting new programs using this approach because staff time is a key barrier to implementation (Kramer et al., 2000). Moreover, trained outside providers may be more likely to implement programs faithfully since teaching the curriculum is their sole responsibility in the school. However, an empirical question remains as to the effectiveness of a proven prevention program when implemented by providers different from those in the original clinical trial (Domitrovich & Greenberg, 2000).

With this rationale in mind, the authors were funded by the National Institute on Drug Abuse to test the effectiveness of Project ALERT implemented through the EXSELS model. Program implementation is completed, and data analysis is in progress. Below is a description of our larger outcome study to give readers the context for the replicated case study we report here.

The EXSELS Outcome Study as Context for the Case Study

Eight middle schools adopted and implemented the EXSELS model with two cohorts of students who took part in Project ALERT during 7th and 8th grades. The outcome study employed an experimental design. Each of the eight schools randomly assigned two 7th-grade classrooms to each of three conditions: (1) adult-led Project ALERT, (2) adult-led, teen-assisted Project ALERT, and (3) control. Students were pretested before and posttested after program implementation in 7th and 8th grades; a follow-up posttest was given at the end of 9th grade.
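
To make the within-school randomization concrete, the following is a minimal sketch of one way such an assignment could be carried out. It is an illustration only, under the assumption that each school contributed a pool of six 7th-grade classrooms (two per condition); the school and classroom identifiers and the seed are hypothetical, not taken from the study.

```python
import random

# Illustrative randomization of classrooms to study conditions within one school.
# The six-classroom pool and the identifiers below are assumptions for the example;
# the paper states only that two classrooms went to each of the three conditions.
CONDITIONS = ["adult-led ALERT", "adult-led, teen-assisted ALERT", "control"]

def assign_classrooms(classrooms, rng):
    """Randomly assign two classrooms to each of the three conditions."""
    if len(classrooms) != 2 * len(CONDITIONS):
        raise ValueError("expected two classrooms per condition")
    shuffled = classrooms[:]
    rng.shuffle(shuffled)
    # Slice the shuffled list into consecutive pairs, one pair per condition.
    return {cond: shuffled[2 * i:2 * i + 2] for i, cond in enumerate(CONDITIONS)}

if __name__ == "__main__":
    rng = random.Random(2004)  # fixed seed so the illustration is reproducible
    for school in ["School A", "School B"]:  # hypothetical school identifiers
        rooms = [f"{school}, classroom {n}" for n in range(1, 7)]
        print(assign_classrooms(rooms, rng))
```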

Cooperative Extension (CE) Educators from Penn State’s Cooperative Extension Service hired as program leaders qualified adults in the community who had experience with youth programs and who related well to young people. A college degree and teacher certification were not necessary qualifications given that EXSELS was based on CES’ model of training lay people from the community to deliver programs. Over two years, 12 adult program leaders taught the program. Most had work or volunteer experience with young people; nine had a bachelor’s degree, four of them in education; one was working on a bachelor’s degree, and two had no college degree. Three had public school teaching experience. Program leaders were given the same training that classroom teachers normally receive from Project ALERT’s professional trainers. The evidence-based program delivered at the 8 middle schools was the revised edition of Project ALERT (Ellickson, Miller, Robyn, Wildflower, & Zellman, 2000), which includes 11 lessons delivered in 7th grade and 3 booster lessons in 8th grade. Grant funds covered all program costs.

Why Cooperative Extension?

The Cooperative Extension Service is part of the land-grant university located in each state within the United States. Established in 1914 through the Smith-Lever Act, CES is a partnership among federal, state, and local governments for extending research and new knowledge from land-grant universities to solve problems and enrich lives (ECOP, 2001). CE Educators, frequently referred to as county agents, have specializations in agriculture, family living, 4-H youth development, or community development. With established relationships in their communities, county agents work at the grass-roots level in 3,150 counties across the United States. The land-grant university’s outreach model has been described as the most influential innovation in higher education because of its ability to apply and disseminate research-based knowledge on a large scale (Miller, 1988). It has been characterized as the oldest and most successful diffusion system in the United States (Rogers, 1995).

As such, the CE model may provide an effective and widely replicable dissemination and implementation system for evidence-based prevention programs. Implementation of curricula through 4-H school enrichment programs managed by local CE Educators may provide an alternative to teacher-led and school-managed substance abuse prevention programs. A key component of these 4-H school enrichment programs is the “train-the-trainer” concept whereby 4-H CE Educators manage research-based programs and train local citizens to deliver them.

With technical assistance from the university, CE Educators may be able to disseminate user-friendly information to local schools and identify, train, and supervise skilled program leaders from the community to deliver substance abuse prevention programs with fidelity. The strategy also would strengthen the collaborative relationship between CE and local schools, thereby broadening the community effort to prevent substance abuse.

TALES OF REFUSAL

An important first step in this initiative was to describe the EXSELS model and gain support from Penn State’s Director of Cooperative Extension and the eight Regional Directors (RDs) who oversee the eight CES regions in Pennsylvania. RDs recommended 19 CE Educators who worked with youth programming and who they believed had some type of relationship with their county schools. Fourteen of the 19 CE Educators recommended by RDs were not interested when we contacted them, leaving 5 who wanted to take part. Thirteen other CE Educators volunteered to participate after hearing about the project from colleagues. These 18 CE Educators invited 29 schools to adopt the EXSELS model to implement Project ALERT. Of the 29 schools, 20 declined while 9 adopted the model and took part in the study. (Eight schools originally took part, but one dropped out after a year and was replaced.) Among the 20 schools that declined, the most frequently given reasons were: (1) simply not interested, (2) could not fit the program into their schedule, and (3) research issues including not wanting a control group and/or student surveys.

Scheduling (cited by five schools) and research issues (cited by four schools) are previously documented barriers (e.g., Smith et al., 1993). However, the top reason, “not interested,” given by eight schools makes one ponder other possible impediments.

Because Project ALERT’s designation as an evidence-based model program was based on an efficacy trial (Ellickson & Bell, 1990), evaluators and others may wonder if schools’ reluctance to participate was due to skepticism about the curriculum’s real-world effectiveness. This is a reasonable question given that few of the model substance abuse prevention programs found to work in efficacy studies have been independently replicated in the real world (personal correspondence with P. Brounstein, April 19, 2004) or with more diverse populations (Greenberg, 2004). Moreover, among the few independently replicated programs, weaker (e.g., Harrington, Giles, Hoyle, Feeney, & Yungbluth, 2001) or even negative effects (Hallfors, 2004) have been reported in replications. Furthermore, the somewhat different, and sometimes confusing, criteria that have been applied for the designation of “model,” “exemplary,” or “effective” programs across federal agencies (Substance Abuse and Mental Health Administration, 2003; U.S. Department of Education, 2001; U.S. Department of Health and Human Services, 1999, 2003) raise questions about what constitutes credible evidence.

Readers also may ask whether use of Extension in the EXSELS delivery model was a barrier to school participation. Is it possible that schools did not want to work with CES for some reason, or that they did not react favorably to an outside program leader coming into the school?

Our interpretation is that the eight “not interested” schools did not refuse because of skepticism regarding criteria for standards and evidence of replicated program effects. We believe that, like the majority of U.S. schools and most of our adopting schools, they were not aware of evidence-based substance abuse prevention programs, let alone the issues around consistent standards and replications.

We also propose that the eight “not interested” schools did not even assess the program and CES delivery model before refusing. Most likely, the refusals arose largely because these schools had no internal advocate for the program. As reported by Scheirer (1990) in her study of schools that adopted the Fluoride Mouth Rinse Program, an internal advocate or “champion” was key to adoption, while non-adopting districts did not go through a decision process at all. In contrast to the eight “not interested” schools in our study, our adopting schools had internal champions from the outset who guided program adoption through the formal channels. It appears that our eight “not interested” schools, like the non-adopting schools in the Scheirer study, probably did not undertake a formal, considered decision process about whether or not to implement.

TALES OF ADOPTION: EXIT INTERVIEWS

After program delivery with adopting schools, we conducted a replicated case study to qualitatively investigate school personnel views on the adoption and implementation of Project ALERT and the EXSELS model of delivery. Across the 8 sites, the authors together conducted individual interviews with all 26 school personnel who had been closest to adoption and implementation. These individuals were familiar with the program and the study because they were involved in the decision to participate and/or were teachers whose classrooms were used for program delivery. School administrators were contacted first to request permission for the interviews, followed by individual teachers and other personnel. All individuals contacted consented to be interviewed during the school day.

Twelve administrators, thirteen teachers, and one counselor were interviewed across the eight schools. Schools were located in urban, suburban, and rural areas and ranged in size from 330 to 1,368 students. Students were predominantly Caucasian, with one school approaching 50% minority enrollment. SES, academic, and violence-related risk levels varied widely across schools. Half offered health in 7th grade; two offered DARE in 7th grade, and five others offered DARE in 5th or 6th grade. Project ALERT was delivered in 7th grade health in four schools, study hall in two, a special topics class in one, and a life sciences class in another.

Interviewers read to respondents the single-item questions designed to provide descriptive information regarding the adoption and implementation of prevention programs and of Project ALERT specifically. Most questions were open-ended; others asked for responses within a 5-point range (1 = none/not at all; 2 = a little; 3 = some; 4 = a lot; 5 = very much) shown on a card. Questions were followed by probes. As described below, questions were based on previous prevention research, Diffusion of Innovation Theory (Rogers, 1995), and our specific interests regarding Project ALERT and the EXSELS delivery model.

For example, since support by the school principal has been found to be an important factor for successful prevention programs (e.g., Kam, Greenberg, & Walls, 2003), we asked building principals how much they supported school substance abuse prevention in general, and whether they agreed with the U.S. Department of Education’s requirement that schools deliver evidence-based prevention programs to qualify for funding. We also asked teachers how much they supported substance abuse prevention because they are more directly affected by a program than are administrators and, therefore, could have different concerns.

Drawing on Rogers’ (1995) work showing that relative advantage of an innovation will affect adoption, we asked administrators what their current prevention program was and whether they thought Project ALERT would be more effective. We asked administrators and teachers whether offering Project ALERT through the EXSELS model was an advantage over teachers teaching it. Recognizing that schools later could elect to have their teachers deliver Project ALERT (vs. outside program leaders), we asked administrators and teachers if implementation would be different with teachers. If so, we asked why and how they would change it. We also asked administrators whether they might be interested in continuing Project ALERT through the EXSELS model or with their teachers delivering it.

Following the guidelines for data reduction and analysis suggested by Miles and Huberman (1984), we compared field notes on responses to questions after each set of school interviews to ensure accuracy. Later, using the original evaluation questions as our framework, we organized the data for analysis by summarizing the most relevant findings under each question. Data were labeled to identify the important themes and examples discussed below.
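
As a rough illustration of this style of data reduction (not the authors' actual coding scheme), the sketch below groups hypothetical field-note excerpts under the evaluation question they address and tallies analyst-assigned theme labels; every question key, excerpt, and theme name here is invented for the example.

```python
from collections import Counter, defaultdict

# Hypothetical field-note records: (evaluation question, excerpt, theme labels).
field_notes = [
    ("awareness of evidence-based programs",
     "Principal had not heard of the USDoE exemplary list.", ["unaware"]),
    ("current prevention program",
     "DARE kept because of community ties to the DARE officer.", ["DARE", "community politics"]),
    ("fidelity if teachers delivered",
     "Health teacher would condense lessons into a one-week unit.", ["adaptation", "scheduling"]),
]

def reduce_notes(notes):
    """Group excerpts under each evaluation question and count theme labels."""
    by_question = defaultdict(list)
    theme_counts = Counter()
    for question, excerpt, themes in notes:
        by_question[question].append(excerpt)
        theme_counts.update(themes)
    return by_question, theme_counts

if __name__ == "__main__":
    summaries, counts = reduce_notes(field_notes)
    for question, excerpts in summaries.items():
        print(f"{question}: {len(excerpts)} excerpt(s)")
    print("theme frequencies:", dict(counts))
```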

CHALLENGES FOR THE RESEARCH TO PRACTICE MOVEMENT

Schools were not aware of evidence-based programs until Cooperative Extension approached them. Despite every respondent’s high level of support for school-based substance abuse prevention, none of the eight schools had sought out Project ALERT or any other evidence-based substance abuse prevention program before Cooperative Extension provided information about the curriculum and the EXSELS delivery model. Of the 12 administrators interviewed, only 2 had a vague awareness that a U.S. Department of Education list of exemplary evidence-based programs existed, and teachers had virtually no knowledge of the evidence-based concept. In keeping with Ringwalt et al.’s (2002) observations, these schools previously had not considered adopting an evidence-based program because they were unaware of specific curricula, and they were uncertain about how to choose and obtain the curricula.

Given the schools’ lack of awareness of proven programs, Cooperative Extension played a key role in the adoption process with these eight schools by (1) making schools knowledgeable about Project ALERT as an evidence-based program recommended by the USDoE and endorsed by the National Middle School Association, (2) providing a clear, user-friendly description of the program and assessing its fit together with the school, and (3) providing access to and delivery of the curriculum. In addition, implementing Project ALERT through Cooperative Extension allowed schools to try the new curriculum at no cost and without making major changes in school curricula or schedules, a feature previously shown to increase adoption (Rogers, 1995).

Schools dared not eliminate DARE. DARE was the most often used prevention program despite administrators’ acknowledgment of its ineffectiveness. Several principals were questioning DARE but were unable to scrap the program. Adopting Project ALERT was viewed as a way to ease in an evidence-based program without creating conflict with the community and teachers who supported DARE and had good relationships with DARE officers. Because all but one of the schools had DARE in 5th and/or 6th grade, adding another prevention program in 7th grade did not disrupt DARE. Principals generally indicated that it is easier to sustain a new program after teachers become accustomed to it and that eventual evaluation results showing Project ALERT’s effectiveness could justify keeping it and eliminating DARE. Administrators walked a fine line to create parent, community, and even teacher support for prevention programs.

Community politics appeared to play an important role in keeping DARE, which was well integrated into the schools and intertwined in school-community relationships. Examples include having the police chief on the school board, the DARE officer serving on the school task force on drugs, and the DARE officer being housed at the school to maintain a watchful presence. Openly trying to eliminate DARE would be disruptive to school-community relationships. Moreover, parents perceived DARE as visible evidence that schools were addressing the drug problem. According to one principal, parents related well to DARE, and the program’s graduation brought more community people to the school than any other school event. In his words, “Parents don’t ask the hard questions that academics and schools are now asking about DARE’s credibility, and I need something that parents can relate to.”

Teachers are unlikely to implement with fidelity. A consistent theme across administrators and teachers was that Project ALERT would not be implemented with fidelity if delivered by teachers. One principal summed up the views of many school personnel interviewed when he said, “Every subject gets changed to meet the needs of students, teachers’ perceptions of how the curriculum should be taught, and time and schedule constraints.” The majority of teachers said they would make additions and deletions in content, reduce the number of classroom periods allocated, change the sequence of lessons to integrate them into their health curriculum, and/or condense the time frame (e.g., from once/week to 5–10 days in a row) to constitute a unit or module within their existing curriculum.

Implementation of theory-based substance abuse prevention is not consistent with school views of curriculum delivery. The increasing time and schedule constraints facing schools make it difficult to implement prevention curricula fully. However, many of the changes that teachers said they would make in the Project ALERT curriculum seemed to reflect deeply ingrained perceptions that prevention programs should provide primarily drug information, including effects on the body, similar to many drug education units taught in health classes. As such, teachers and administrators frequently referred to program implementation as “providing the information” or “covering the material.” Schools were not aware that information-only approaches in classroom substance abuse prevention curricula are not effective.

Administrators and teachers alike had no knowledge of the theories underlying evidence-based prevention programs. For example, some health teachers believed that Project ALERT did not cover drug facts as completely as their health unit and that inadequate information was presented on new drugs. By and large, teachers believed the curriculum was too long, suggesting they were unaware of research indicating that longer (evidence-based) prevention programs are better and that programs with booster sessions produce more lasting effects than those without boosters (e.g., Gottfredson, 1997). Several teachers believed that it was overload to have prevention (i.e., DARE) in 6th grade followed by Project ALERT in 7th and 8th grades. One teacher said, “. . . This makes the kids shut down because they have heard the information over and over.” Some teachers viewed a good substance abuse prevention curriculum as one that the students like, which meant providing facts about new drugs that students are curious about.

Furthermore, school personnel had no concept that Project ALERT and other evidence-based curricula include specific components that are based on theory of what is known about preventing youth substance use. Hence, theoretically grounded components of Project ALERT, such as increasing motivation not to use drugs, developing skills to resist internal and external pressures to use, and activities to dispel erroneous beliefs about peer drug use norms, were perceived by many teachers as unnecessary and overly time consuming.

Although schools believed that implementing Project ALERT through EXSELS had advantages over teachers delivering it, administrators at just three schools were open to sustaining the EXSELS model. School personnel expressed that delivering Project ALERT through the outside program leaders had multiple advantages. They believed that students (1) see drug prevention as important, (2) view the outside person as having special expertise, and (3) may be more open and receptive because classroom teachers give grades and can have students expelled. Despite these advantages, most schools said they would not sustain the EXSELS model.

For the three schools where decision-makers were open to sustaining EXSELS, having a previous positive relationship with Cooperative Extension appeared to play a key role. In contrast to other schools that knew less about CES before the study, these schools had developed trust and respect for CES through the delivery of previous CES programs in the school, having the County Extension Director serve on the school board, or having their students involved with Cooperative Extension through 4-H activities. Bringing in an outside leader from CES to teach Project ALERT was viewed as similar to other CES programs coming into the school. Although having the “right person” teach the program (e.g., one who can relate well to students, is knowledgeable about the program material, and shows up as scheduled) was important to them, these schools seemed to believe CES is capable of providing that person. Funding the outside leader, however, remained an uncertainty given the tight budgets all schools were facing.

Administrators at three other schools were considering having their teachers deliver Project ALERT so that it would become more integrated into the school curriculum, but some expressed scheduling concerns. The two schools not interested in sustaining 7th-grade Project ALERT at all seemed to be influenced by health teachers protecting class time for their own curriculum.

TALES OF MAINTENANCE

During the school year after our exit interviews, two of the three schools previously open to sustaining EXSELS continued the model, working through CES. One school that previously said a teacher might deliver the program hired the original CES program leader to teach but did not collaborate with CES. None of the five other schools continued Project ALERT at all.

Four factors were common to the three schools that sustained EXSELS in some form. (1) Between the exit interviews and the new school year, only these three schools maintained the administrators who had championed the program. (2) All had a slot in the 7th-grade curriculum (e.g., health, study hall, or special topics class) with supportive teachers, which meant the program did not disrupt other subjects. (3) They all had outside funding to pay for the program. No doubt this was a critical factor given budget constraints schools face. The two schools that had a prior relationship with CES received the program free through tobacco settlement funds received by CES working with local coalitions. The school without a prior CES relationship paid the program leader with funds received from being in the research project the previous year.

CES’ POTENTIAL TO DISSEMINATE EVIDENCE-BASED PREVENTION

Admittedly, the high number of non-adopting schools and the low proportion of sustaining schools lead us to be less than optimistic about CES’ current ability to broadly disseminate evidence-based substance abuse prevention programs in school settings. We are more optimistic about CES’ ability to implement prevention programs with fidelity. Observations of the 654 lessons taught across 7th and 8th grade indicated that program leaders completed all of each activity in 87% of the lessons and some of each activity in 11% of the lessons. This level of fidelity is similar to that reported in the Project ALERT efficacy study where the curriculum was implemented by trained health educators instead of by teachers. In that study, observations of 950 of the 2,300 lessons taught showed that all activities were covered in 92% of the lessons observed (Ellickson & Bell, 1990).
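
For readers who want to see how such completion rates are tabulated from lesson-level observations, here is a minimal sketch with invented records (not the study's data); the category names and counts are assumptions for the example.

```python
from collections import Counter

# Invented observation records: for each observed lesson, whether the program
# leader completed "all", "some", or "none" of the planned activities.
observations = ["all"] * 57 + ["some"] * 7 + ["none"] * 1

def fidelity_summary(records):
    """Return the percentage of observed lessons in each completion category."""
    total = len(records)
    return {category: round(100 * count / total, 1)
            for category, count in Counter(records).items()}

if __name__ == "__main__":
    # For these invented records: {'all': 87.7, 'some': 10.8, 'none': 1.5}
    print(fidelity_summary(observations))
```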

Our findings suggest that a great deal of coordinated effort at multiple levels will be needed for CES to disseminate prevention programs through the EXSELS model. At a minimum, well-trained university prevention specialists will need to provide continuing technical assistance to help local CE Educators (1) establish themselves as credible sources of prevention programs, (2) develop partnerships with schools and community agencies, (3) make schools and communities aware of evidence-based practices and their theoretical underpinnings, (4) provide trained program leaders to implement with fidelity, (5) establish prevention champions within local schools, and (6) secure external funding to offer programs. Despite the challenges, we continue to believe in the potential for CES to disseminate prevention programs, as illustrated by one county’s CE Educator and program leader who had worked with an original study school. Last year they disseminated the EXSELS model into 10 schools and several youth shelters and after-school programs in the county. (Interestingly, that original study school did not sustain the model because the health teacher resisted and the principal was new.) As with two of the continuing EXSELS schools, funding came from state tobacco settlement money, illustrating the advantages CES can provide through its involvement with community coalitions.

IMPLICATIONS FOR EVALUATION PRACTICE

In this study, we qualitatively examined the processes of adoption, implementation, and maintenance of the EXSELS model (the Project ALERT substance abuse prevention program delivered in schools by outside program leaders from the Cooperative Extension Service). We believe that the case study approach has been beneficial for providing detailed organizational and contextual information to help us better understand these processes and offer suggestions to address the challenges. We also believe our findings have implications for evaluation practice.

We recognize that the implications we discuss may have limited generalizability to broader numbers of school settings and to evaluation practice more generally. The original 29 schools approached by CE Educators to take part in the project were purposely selected based on their anticipated receptiveness to the model and, therefore, are not representative of average middle schools. Also, the 8 schools interviewed were even less representative in that they were a small subsample of the 29 schools approached. Although our primary interest was understanding school factors that may influence adoption, implementation, and maintenance of the EXSELS model as a unique delivery mechanism (Cronbach, 1982), we acknowledge that the small sample size limits our ability to identify differential factors across schools. We also wanted to learn as much as possible about the contexts of these schools to help us understand potential explanations for eventual outcome findings from our larger study of EXSELS. However, many of our findings are congruent with the broader evaluation literature on dissemination and use of evidence-based practices. Therefore, despite the limitations, we believe interview findings from the eight schools may contribute to the broader knowledge base on issues of adoption and implementation.

In-depth school information revealed the common characteristics that appeared to contribute to adoption and maintenance of Project ALERT delivered through the EXSELS model. In particular, findings suggested the importance of having an internal champion or program advocate for adoption as well as for program maintenance. Referred to as the personal factor, the critical role of this type of individual is echoed in the literature on use of evaluation findings. According to Patton (1997), the personal factor is the most important explanatory variable in evaluation use and “represents the leadership, interest, enthusiasm, determination, commitment, assertiveness, and caring of specific, individual people” (p. 44). The personal factor seemed important in the adoption and maintenance of Project ALERT as all 8 adopting schools had such a person, usually an administrator, who promoted and advocated the program model through formal channels. Only the three schools that did not lose their internal champions maintained the model (in some form), echoing again the utilization-focused evaluation literature, which asserts that turnover is its greatest vulnerability (Patton, 1997).

Our findings have provided greater understanding of why teachers tend not to implement prevention programs with fidelity and lead us to suggest that neither teacher delivery nor EXSELS delivery will be totally faithful to program design on a permanent basis in the real world. Despite achieving high implementation fidelity in our study, we realize that even well-trained outside program leaders may not be able to sustain this level on a permanent basis given the time constraints schools face. These realities raise the ongoing fidelity/adaptation debate and the controversial discussions regarding identification of core components to shorten programs and allow for local adaptations. Advocates of strict fidelity assert the importance of delivering an intervention as tested and found effective, while advocates of adaptation believe in the importance of factoring in local contextual characteristics in the process (Mayer & Davidson, 2000). There are no clear solutions to this debate at this time. Moreover, no studies of prevention programs have yet been conducted to identify core components and to test their enhancement of fidelity and their effectiveness (Pentz, 2004).

Our findings suggest that a compromise needs to be struck in the fidelity/adaptation debate if program adoption by schools is to be improved. The issue was raised by a perceptive reviewer who asked what the relative value is of insisting that evidence-based programs be implemented with fidelity if a side effect is to drastically reduce the odds of adoption. Furthermore, as suggested by Rogers (1995), the more an innovation is viewed as consistent with past practices and current values, the more readily it will be adopted. According to our interviews and what we know about college teacher-training programs (Bosworth, 2004), teachers are heavily oriented toward teaching flexibly and creatively to meet student needs, practices incongruent with optimal implementation fidelity. Therefore, to increase adoption, it appears that the fidelity/adaptation debate must be resolved in some fashion that results in scientifically tested programs which allow local adaptations around a core of critical program elements. It also would be essential to provide teacher training on the guidelines for allowable adaptations and strategies for maintaining fidelity of critical ingredients. Based on our interviews, we also suggest including a thorough explanation of a program’s underlying theory and logic model to help teachers understand the rationale for implementing program components faithfully and with the recommended dosage. According to Rogers (1995), this type of information, referred to as principles knowledge, minimizes negative adaptations and discontinuation of an innovation.

Furthermore, we believe that the systematic study of adoption processes may be helpful for better understanding the broader issue of contextual impacts on eventual program outcomes for recipients. It is possible that features of school settings may impose limitations that produce eventual negative effects, suggesting that adoption may be undesirable in some settings. For example, it is conceivable that study schools whose students received greater saturation of DARE before having Project ALERT might show less positive outcomes than schools with less DARE or none at all, given that DARE has been shown to be ineffective and even detrimental in previous studies (e.g., Rosenbaum & Hanson, 1998). Examination of relative effect sizes in heavily saturated DARE schools versus less saturated ones may reveal differences, providing a rationale for schools to discontinue using DARE with Project ALERT.

Finally, we believe our study has raised the more fundamental implication that the research evidence supporting many evidence-based practices being disseminated needs to be strengthened. Substance abuse prevention programs provide just one example with their lack of effective independent replications and inconsistent and confusing criteria for designation as evidence-based. Strength of evidence is discussed more broadly in a recent article in Science (Mervis, 2004). According to Mervis, studies done by the National Research Council (NRC) and the Building Engineering and Science Talent (BEST) consortium showed that none of the evaluations of science and math programs in U.S. schools were good enough to draw conclusions about their effectiveness. Much of the problem, according to the NRC report, stems from lack of agreement among evaluators regarding what constitutes rigorous evaluation. According to Judith Ramaley, head of the education directorate at the National Science Foundation (Mervis, 2004), “the entire discipline of rigorous evaluation is just emerging” (p. 1583). The field of evaluation has much work ahead to respond to these challenges.

CONCLUSIONS

This qualitative inquiry has helped us identify, understand, and offer suggestions to address some of the challenges related to the adoption, implementation, and maintenance of the EXSELS model, as well as to prevention programs more generally. Although considerable effort will be required, we believe that the EXSELS delivery mechanism has potential for enhancing adoption and faithful implementation of evidence-based substance abuse prevention programs.

This study has made us keenly aware that multiple issues need to be addressed in the difficult work of improving dissemination of evidence-based programs. Programs are not portable packages that are readily adopted and expected to be effective in all settings. Findings have highlighted the fact that all school settings are different in many ways, suggesting the almost infinite number of reasons new programs may or may not be adopted. A quote from 1976 might be very applicable to today’s challenges of dissemination: “Factors found to be important for innovation in one study are found to be considerably less important, not important at all, or even inversely important in another study. This phenomenon occurs with relentless regularity” (Rogers & Shoemaker, 1971, in Downs & Mohr, 1976, p. 700). Given the difficulty of the task, the challenges associated with dissemination through the EXSELS model may be no more daunting than those presented by other dissemination strategies.

NOTE

1. Trained health educators delivered Project ALERT in the original efficacy study (Ellickson & Bell, 1990). In a more recent study, regular classroom teachers implemented the revised curriculum (Ellickson et al., 2003) used in the EXSELS delivery model.

ACKNOWLEDGMENT

Work on this paper was supported by a grant from the National Institute on Drug Abuse (R01 DA12011).

REFERENCES

Bosworth, K. (2004, May). A seven-year history of implementing a prevention curriculum. Paper presented at the 12th Annual Society for Prevention Research Meeting, Quebec City, Quebec, Canada.

Cronbach, L. (1982). Designing evaluations of educational and social programs. San Francisco: Jossey-Bass.

Domitrovich, C. E., & Greenberg, M. T. (2000). The study of implementation: Current findings from effective programs that prevent mental disorders in school-aged children. Journal of Educational and Psychological Consultation, 11, 193–221.

Downs, G. W., Jr., & Mohr, L. B. (1976). Conceptual issues in the study of innovation. Administrative Science Quarterly, 21, 700–714.

Dusenbury, L., & Falco, M. (1995). Eleven components of effective drug abuse prevention curricula. Journal of School Health, 65, 420–425.

ECOP (Extension Committee on Organization and Policy). (2001). Strategic directions of the cooperative extension system. Retrieved from http://www.reeusda.gov/part/gpra/direct.htm.

Ellickson, P. L., & Bell, R. M. (1990). Drug prevention in junior high: A multi-site longitudinal test. Science, 247, 1299–1305.

Ellickson, P. L., McCaffrey, D. F., Ghosh-Dastidar, B., & Longshore, D. L. (2003). New inroads in preventing adolescent drug use: Results from a large-scale trial of Project ALERT in middle schools. American Journal of Public Health, 93, 1830–1836.

Ellickson, P. L., Miller, L., Robyn, A., Wildflower, L. Z., & Zellman, G. L. (2000). Project ALERT. Los Angeles, CA: The BEST Foundation for a Drug-Free Tomorrow.

Gottfredson, D. C. (1997). School-based crime prevention. In L. W. Sherman, D. C. Gottfredson, D. MacKenzie, J. Eck, P. Reuter, & S. Bushway (Eds.), Preventing crime: What works, what doesn’t, what’s promising: A report to the United States Congress. Washington, DC: U.S. Department of Justice, Office of Justice Programs.

Greenberg, M. T. (2004). Current and future challenges in school-based prevention: The researcher perspective. Prevention Research, 5, 1–13.

Hallfors, D. (2004, May). Findings from a randomized control effectiveness trial of “Reconnecting Youth”. Paper presented at the 12th Annual Society for Prevention Research Meeting, Quebec City, Quebec, Canada.

Hallfors, D., & Godette, D. (2002). Will the “Principles of Effectiveness” improve prevention practice? Early findings from a diffusion study. Health Education Research, 17, 461–470.

Hallfors, D., Pankratz, M., & Sporer, A. (2001). Drug free schools survey II: Report of results. Chapel Hill, NC: Department of Maternal and Child Health, School of Public Health, University of North Carolina.

Harrington, N. G., Giles, S. M., Hoyle, R. H., Feeney, G. J., & Yungbluth, S. C. (2001). Evaluation of the All Stars character education and problem behavior prevention program: Pretest–posttest effects on mediator and outcome variables for middle school students. Health Education and Behavior, 28, 533–546.

Kam, C., Greenberg, M. T., & Walls, C. T. (2003). Examining the role of implementation quality in school-based prevention using the PATHS curriculum. Prevention Science, 4, 55–63.

Kramer, L., Laumann, G., & Brunson, L. (2000). Implementation and diffusion of the Rainbows program in rural communities: Implications for school-based prevention programming. Journal of Educational and Psychological Consultation, 11, 37–64.

Mayer, J. P., & Davidson, W. S., II. (2000). Dissemination of innovation as social change. In J. Rappaport & E. Seidman (Eds.), Handbook of community psychology (pp. 421–438). New York: Kluwer Academic/Plenum Publishers.

Mervis, J. (2004, June 11). Meager evaluations make it hard to find out what works. Science, 304, 1583.

Miles, M. B., & Huberman, A. M. (1984). Analyzing qualitative data: A source book for new methods. Newbury Park, CA: Sage.

Miller, P. A. (1988). To our colleagues in the land-grant universities: A statement on the crisis of youth in America. Unpublished manuscript.

Nutley, S., Walter, I., & Davies, H. (2002). Discussion paper 1: From knowing to doing: A framework for understanding the evidence-into-practice agenda. Unpublished manuscript, University of St. Andrews, Research Unit for Research Utilization, St. Andrews.

Patton, M. Q. (1997). Utilization-focused evaluation: The new century text. Thousand Oaks, CA: Sage.

Pentz, M. A. (2004). Form follows function: Designs for prevention effectiveness and diffusion research. Prevention Research, 5, 23–29.

Ringwalt, C. L., Ennett, S., Vincus, A., Thorne, J., Rohrbach, L. A., & Simons-Rudolph, A. (2002). The prevalence of effective substance use prevention curricula in U.S. middle schools. Prevention Science, 3, 257–265.

Rogers, E. M. (1995). Diffusion of innovations (4th ed.). New York: Free Press.

Rohrbach, L. A., Graham, J. W., & Hansen, W. B. (1993). Diffusion of a school-based substance abuse prevention program: Predictors of program implementation. Preventive Medicine, 22, 237–260.

Rosenbaum, D. P., & Hanson, G. S. (1998). Assessing the effects of school-based drug education: A six-year multilevel analysis of Project D.A.R.E. Journal of Research in Crime and Delinquency, 35, 381–412.

Scheirer, M. A. (1990). The life cycle of an innovation: Adoption versus discontinuation of the fluoride mouth rinse program in schools. Journal of Health and Social Behavior, 31, 203–215.

Smith, D. W., McCormick, L. K., Steckler, A. B., & McLeroy, K. R. (1993). Teachers’ use of health curricula: Implementation of Growing Healthy, Project SMART, and the Teenage Health Teaching Modules. Journal of School Health, 63, 349–354.

Substance Abuse and Mental Health Administration. (2003). National Registry of Effective Programs. Retrieved July 26, 2004 from http://www.modelprograms.samhsa.gov/template.cfm?page=nrepbutton.

U.S. Department of Education, Safe Drug-Free Schools Program. (1998). Notice of final principles of effectiveness. Federal Register, 63(104), 29901–29906.

U.S. Department of Education. (2001). Exemplary and promising safe, disciplined and drug free schools program 2001. Retrieved July 26, 2004 from http://www.ed.gov/admins/lead/safety/exemplary01/report pg3.html.

U.S. Department of Health and Human Services. (1999). Mental health: A report of the Surgeon General. Rockville, MD: U.S. Department of Health and Human Services, Substance Abuse and Mental Health Services Administration, Center for Mental Health Services, National Institutes of Health, National Institute of Mental Health.

U.S. Department of Health and Human Services. (2003). Preventing drug use among children and adolescents: A research-based guide (2nd ed.). National Institute on Drug Abuse.

Weiss, C. (1998). Have we learned anything new about the use of evaluation? American Journal of Evaluation, 19, 21–33.