




Teal et al. Implementation Science 2012, 7:41
http://www.implementationscience.com/content/7/1/41

RESEARCH Open Access

Implementing community-based provider participation in research: an empirical study

Randall Teal1*, Dawn M Bergmire1, Matthew Johnston1 and Bryan J Weiner2

Abstract

Background: Since 2003, the United States National Institutes of Health (NIH) has sought to restructure the clinical research enterprise in the United States by promoting collaborative research partnerships between academically-based investigators and community-based physicians. By increasing community-based provider participation in research (CBPPR), the NIH seeks to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice. Although CBPPR is seen as a promising strategy for promoting the use of evidence-based clinical services in community practice settings, few empirical studies have examined the organizational factors that facilitate or hinder the implementation of CBPPR. The purpose of this study is to explore the organizational start-up and early implementation of CBPPR in community-based practice.

Methods: We used longitudinal, case study research methods and an organizational model of innovation implementation to theoretically guide our study. Our sample consisted of three community practice settings that recently joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States. Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011.

Results: The organizational model for innovation implementation was useful in identifying and investigating the organizational factors influencing start-up and early implementation of CBPPR in CCOP organizations. In general, the three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization. As a result of weak implementation policies and practices (IPPs), all three CCOPs created a weak implementation climate. Patient accrual became concentrated over time among those groups of physicians for whom CBPPR exhibited a strong innovation-values fit. Several external factors influenced innovation use, complicating and enriching our intra-organizational model of innovation implementation.

Conclusion: Our results contribute to the limited body of research on the implementation of CBPPR. They inform policy discussions about increasing and sustaining community clinician involvement in clinical research and expand on theory about organizational determinants of implementation effectiveness.

Keywords: Implementation, Academic-community research partnerships, Cancer clinical trials

* Correspondence: [email protected]
1 Cecil G. Sheps Center for Health Services Research, University of North Carolina, Chapel Hill, NC, USA
Full list of author information is available at the end of the article

© 2012 Teal et al.; licensee BioMed Central Ltd. This is an Open Access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/2.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.



Introduction

In 2003, the United States National Institutes of Health (NIH) embarked on a fundamental restructuring of the national clinical research enterprise [1]. The centerpiece of its Roadmap for Medical Research has been the intent to create collaborative partnerships between academically-based investigators and community-based physicians to conduct clinical research on a sustained basis [2]. By increasing community-based provider participation in research (CBPPR) through Clinical and Translational Science Awards, federally funded provider-based research networks, and other mechanisms, the NIH has sought to advance the science of discovery by conducting research in clinical settings where most people get their care, and accelerate the translation of research results into everyday clinical practice [1-6]. The Roadmap has spurred discussion of the potential benefits of CBPPR [6-8], infrastructure and workforce training needs for CBPPR [4,9], common barriers to increasing CBPPR [7,10,11], and strategies for overcoming those barriers [7,10].

Missing from this discussion, though, is an empirical investigation of the organizational factors that facilitate or hinder the implementation of CBPPR. For community practice settings, CBPPR is a complex innovation whose implementation requires systemic organizational changes in staffing, workflow, information systems, and reward structures. Even with substantial external assistance, community practice settings may find it challenging to create a supportive organizational context and culture for CBPPR. This task may be especially challenging when community practice settings first become involved in clinical research. Effective implementation of CBPPR during start-up and the early implementation phase, however, may provide a critical pathway for sustained community physician engagement in clinical research.

In this study, we examine the organizational start-up and early implementation of CBPPR in community practice settings. Using case study research methods and implementation theory, we identify the organizational factors associated with the effective implementation of CBPPR in three community practice settings that joined the National Cancer Institute’s (NCI) Community Clinical Oncology Program (CCOP) in the United States, a federally funded provider-based research network with a 28-year history of translating research into practice [12]. Our results contribute to theory about organizational determinants of implementation effectiveness and inform policy discussions about increasing and sustaining community clinician involvement in clinical research.

[Figure 1: Organizational model of innovation implementation. Constructs shown: Organizational Readiness for Change; Management Support; Resource Availability; Implementation Policies and Practices; Implementation Climate; Innovation-Values Fit; Implementation Effectiveness.]

Methods

Conceptual framework

We regarded CBPPR as an innovation and employed an organizational model of innovation implementation to guide our study (Figure 1). Briefly, the model posits that consistent, high-quality innovation use (implementation effectiveness) is a function of the organization’s readiness for change, the level of management support and resources available, the implementation policies and practices (IPPs) that the organization puts into place, the climate for implementation that results from these policies and practices, and the extent to which intended users of the innovation perceive that innovation use fosters the fulfillment of their values.

Organizational readiness for change (ORC) is a pre-implementation phase construct that refers to organizational members’ collective confidence to implement change and collective commitment to pursue courses of action that will lead to successful change [13]. Organizational readiness sets the stage for implementation and affects innovation use through other constructs in the model.

Management support and resource availability are two implementation-phase constructs that shape the organizational context of implementation [14-16]. Management support is critical because managers set organizational priorities and control resources needed in implementation. If management support exists and resources are available, organizational members can put into place more, and higher-quality, policies and practices to promote innovation use [15].

IPPs refer to the plans, practices, structures, and strategies that organizations employ to promote innovation use [15,17]. When IPPs allow organizational members to incorporate the innovation with a significant level of operational, cultural, and strategic fit, there is an increased chance for effective implementation of the innovation. Effective implementation can be achieved through different combinations of policies and practices [16,17]. The collective effect of these policies and practices shapes the climate for consistent, high-quality innovation use.

Implementation climate refers to organizational members’ shared perception that innovation use is expected, supported, and rewarded [17,18]. Implementation climate emerges from organizational members’ shared experiences with, observations of, and discussions about the organization’s IPPs. Organizations create a positive climate for implementation by employing a variety of mutually reinforcing policies and practices to enhance organizational members’ means, motives, and opportunity for innovation use.

Innovation-values fit (IVF) refers to the extent to which organizational members perceive that innovation use will foster the fulfillment of their values [17]. Although individuals may vary in their values, emphasis here is given to values shared by groups (e.g., oncologists in group practice). IVF is held to moderate the relationship of implementation climate and implementation effectiveness. Even in a strong implementation climate, innovation use can range from non-use to compliant use to committed use depending on IVF.
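To make the model’s hypothesized relationships concrete, the sketch below encodes them in Python. This is our illustrative formalization, not the authors’: the 0-1 scores, the min() bottleneck, and the multiplicative moderation are assumptions chosen only to express that readiness sets the stage, that IPPs depend on management support and resources, and that IVF moderates the climate-effectiveness relationship.

```python
from dataclasses import dataclass

@dataclass
class OrgState:
    """Hypothetical 0-1 scores for the antecedent constructs in Figure 1."""
    readiness: float     # organizational readiness for change (pre-implementation)
    mgmt_support: float  # management support
    resources: float     # resource availability
    ipp_quality: float   # breadth/quality of implementation policies and practices

def implementation_climate(org: OrgState) -> float:
    """Climate emerges from IPPs, which management support and resources enable;
    readiness 'sets the stage' and acts through the other constructs."""
    enabled_ipps = org.ipp_quality * min(org.mgmt_support, org.resources)
    return org.readiness * enabled_ipps

def implementation_effectiveness(climate: float, ivf: float) -> float:
    """Innovation-values fit (IVF) moderates the climate-effectiveness link:
    even a strong climate produces little use when IVF is low."""
    return climate * ivf

# A CCOP-like scenario: modest support and resources yield a weak climate,
# so innovation use is driven mainly by subgroups with strong values fit.
org = OrgState(readiness=0.7, mgmt_support=0.3, resources=0.4, ipp_quality=0.5)
climate = implementation_climate(org)                   # -> 0.105 (weak)
print(implementation_effectiveness(climate, ivf=0.9))   # 'true believers' still enroll
```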

Study setting

Established in 1983, the CCOP is a three-way partnership involving the NCI’s Division of Cancer Prevention (NCI/DCP), selected cancer centers and clinical cooperative groups (CCOP research bases), and community-based networks of hospitals and physician practices (CCOP organizations) (Figure 2) [12]. NCI/DCP provides overall direction and funding for community hospitals and physician practices to participate in clinical trials; CCOP research bases design clinical trials; and CCOP organizations assist with patient accruals, data collection, and dissemination of study findings. As of December 2010, 47 CCOP organizations located in 28 states, the District of Columbia, and Puerto Rico participated in NCI-sponsored clinical trials. The CCOP includes 400 hospitals and more than 3,520 community physicians.

In FY 2010, the CCOP budget totaled $93.6 million. The median CCOP organization award was $850,000. NCI funds CCOP organizations through a cooperative agreement whereby participating organizations are expected to share the costs with NCI. Continued funding depends on the performance of the CCOP organization in meeting clinical trial enrollment goals (i.e., accrual) set by NCI. CCOP organizations are required to accrue patients in both cancer treatment and cancer prevention and control (CP/C) clinical trials. Cancer treatment clinical trials refer to the evaluation of cancer therapies that generally fall into one of three categories: medical, surgical, or radiation oncology. CP/C clinical trials refer to the evaluation of new methods of detecting cancer risk and preventing primary and secondary cancers. They also include the evaluation of symptom management, rehabilitation, and continuing care interventions designed to minimize the burden of cancer and improve quality of life. The NCI establishes accrual goals for cancer treatment and CP/C clinical trials based on the CCOP’s past accrual performance and expected accrual performance given the available menu of NCI-sponsored clinical trials.

CCOP organizations are led by a physician Principal Investigator (PI) who provides local program leadership [12]. CCOP staff members typically include an associate PI, a program administrator, research nurses or clinical research associates, data managers, and regulatory specialists. These staff members coordinate the review and selection of new clinical trial protocols for CCOP participation, disseminate protocol updates to the participating physicians, and collect and submit study data. CCOP-affiliated physicians accrue or refer participants to clinical trials, and typically include medical, surgical and radiation oncologists, general surgeons, urologists, gastroenterologists, and primary care physicians. CCOP-affiliated physicians, through their membership in CCOP research bases, also participate in the development of clinical trials by proposing study ideas, providing input on study design, and occasionally, serving in the role of PI or co-PI for a clinical trial.

Study design

The study followed a hypothetico-deductive approach to qualitative research and used a longitudinal, multiple case study design with the CCOP organization as the unit of analysis. Case study methods are well-suited for studying implementation processes, which tend to be fluid, non-linear, and context-sensitive [19-22]. In addition to permitting in-depth analysis of individual cases, case study methods offer analytic strategies for systematically comparing patterns observed across cases [21,22]. Our sample consisted of three CCOP organizations located in the Midwest that received initial CCOP funding between 2002 and 2005. Table 1 shows the characteristics of each of the three CCOP organizations upon inception of the program. An in-depth study of these three CCOPs allowed us to explore the organizational factors that facilitate or hinder the start-up and early implementation of CBPPR in community practice settings. The Institutional Review Board (IRB) at the University of North Carolina at Chapel Hill approved this study.

[Figure 2: The National Cancer Institute’s Community Clinical Oncology Program. Adapted from: Kaluzny AD, Morrissey JP, McKinney MM. Emerging organizational networks: the case of the Community Clinical Oncology Program. In SS Mick and Associates, Innovations in Health Care Delivery: Insights for Organization Theory. San Francisco: Jossey-Bass, 1990.]

Data collection procedures

Data were gathered through site visits, telephone interviews, and archival documents from January 2008 to May 2011. A two-person research team visited each CCOP organization between January 2008 and April 2008. During the site visit, the team conducted 47 individual and group interviews with CCOP leaders, CCOP physicians and staff, and hospital managers. In subsequent years, the team interviewed the CCOP PI and CCOP administrator from each CCOP organization separately by telephone to gather data about implementation processes, facilitators, barriers, challenges, and opportunities. Table 2 shows the breakdown of interview participants by CCOP organization over time. Each year, the team developed semi-structured interview guides to conduct the interviews, audio-recorded the interviews, and transcribed them verbatim.

Table 1 Characteristics of the CCOP Organizations in Their First Year of Operation

CCOP Site | First Year of CCOP Funding | Number of Performance Sites | Number of Accruing Physicians (a) | Number of CCOP-Funded Research Staff FTEs (b) | Number of NCI Research Base Affiliates | Number of Newly Diagnosed Cancer Patients
A | 2002 | 3 | 11 | 3.7 | 4 | 1474
B | 2005 | 7 | 10 | 7.2 | 4 | 2470
C | 2002 | 2 | 29 | 9.3 | 4 | 4115

Source: CCOP organizations’ first submitted CCOP progress report.
Notes: a. Includes physicians who enrolled at least 1 patient on a treatment and/or CP/C trial. b. Does not include the PI, co-PI, or CCOP Administrator.

In addition, the research team obtained data from CCOP annual progress reports and grant applications. The NCI requires that CCOP organizations file annual progress reports and periodic re-applications for funding. These documents included detailed data on the CCOP organization’s structure and operations, as well as accrual figures for each treatment and CP/C clinical trial, sorted by CCOP research base, accruing hospital, and accruing physician.

Analysis

Data analysis involved three phases: data coding, within-case analysis, and between-case analysis. In the first phase, we used qualitative data analysis software, ATLAS.ti 5.0 (and later 6.0), to code the study data. The conceptual framework provided a starting list of codes, which we supplemented with emergent codes as analysis proceeded. Using a common codebook, two investigators (BW, DB) conducted a pilot test by independently coding four transcripts. They then fine-tuned the coding manual’s definitions, decision rules, and examples. Two other research team members (MJ, RT) coded the remaining documents and an investigator (DB) reviewed the coding for accuracy and consistency.



Table 2 Number and Type of Interview Participants per CCOP Organization, 2008 to 2011

CCOP A
  Number: 17 (2008, Yr 1); 5 (2009, Yr 2); 2 (2010, Yr 3); 2 (2011, Yr 4)
  Yr 1: CCOP PI; CCOP Admin; 2 hospital admin; 6 physicians; 7 CRAs/support staff*
  Yr 2: CCOP PI; CCOP Admin; 1 hospital admin; 2 physicians
  Yr 3: CCOP PI; CCOP Admin
  Yr 4: CCOP PI; CCOP Admin

CCOP B
  Number: 11 (Yr 1); 6 (Yr 2); 2 (Yr 3); 2 (Yr 4)
  Yr 1: CCOP PI; CCOP Admin; 1 hospital admin; 4 physicians; 4 CRAs/support staff*
  Yr 2: CCOP PI; CCOP Admin; 2 hospital admin; 1 physician; 1 research nurse
  Yr 3: CCOP PI; CCOP Admin
  Yr 4: CCOP PI; CCOP Admin

CCOP C
  Number: 19 (Yr 1); 6 (Yr 2); 2 (Yr 3); 2 (Yr 4)
  Yr 1: CCOP PI; CCOP Admin; 3 hospital admin; 2 physicians; 12 CRAs/support staff*
  Yr 2: CCOP PI; CCOP Admin; 1 hospital admin; 3 physicians
  Yr 3: CCOP PI; CCOP Admin
  Yr 4: CCOP PI; CCOP Admin

* Each site visit included a group interview with clinical research associates (CRAs) and CCOP support staff managing IRB and regulatory issues.


The research team coded over 1,000 pages of interview transcripts compiled from the three CCOP organizations.

Due to wide variation in the formatting of progress reports and grant applications across the three CCOP organizations, we could not assign these documents to ATLAS.ti. Instead, we extracted numerical data from these documents (e.g., patient accrual figures for individual physicians) and used reported information to triangulate the results of this study. Over 500 pages of progress reports and grant applications were reviewed.

In the second phase, we conducted a within-case analysis of each CCOP organization. We generated summary reports of each code for each CCOP site in Year 1 through Year 3 of the study. We assessed the degree to which the construct emerged in the data (its ‘salience’) and the degree to which relationships among constructs were consistent with the hypothesized model. Salience in this study refers to the frequency with which constructs (or corresponding codes) appeared in the data and does not necessarily indicate the importance of the construct within the conceptual model [23,24]. For example, the implementation effectiveness construct appeared as a code 39 times in CCOP A interview transcripts, 49 times in CCOP B, and 46 times in CCOP C (Table 3).

In the third phase, we applied the same criteria across the cases to determine if cross-case variation in implementation was consistent with the hypothesized relationships in the model. We generated 12 meta-reports that summarized each code across all three sites from Year 1 through Year 3 of the study. Over 120 summary reports were generated from coded text segments.
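For concreteness, the sketch below shows one way a salience tabulation like Table 3 could be reproduced. It assumes a flat CSV export of coded segments with ‘site’ and ‘code’ columns; the file name and column names are hypothetical, since ATLAS.ti’s actual export format is not described here.

```python
import csv
from collections import Counter

def salience_by_site(path: str) -> Counter:
    """Count coded text units per (site, code) pair.

    Expects a CSV with 'site' and 'code' columns, one row per coded
    text segment (a hypothetical export; column names are assumptions).
    """
    counts: Counter = Counter()
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[(row["site"], row["code"])] += 1
    return counts

counts = salience_by_site("coded_segments.csv")
# e.g., counts[("CCOP A", "Implementation Effectiveness")] -> 39 (cf. Table 3)
```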

Results

The organizational model for innovation implementation depicted in Figure 1 proved useful for identifying and investigating the organizational factors that facilitated and hindered start-up and early implementation of CBPPR in CCOP organizations. Generally speaking, the model’s constructs were salient in the data (Table 3) and the observed relationships among constructs fit the model. Some constructs exhibited less salience than others, as measured by the number of text units coded for that construct, because interview questions pertinent to that construct were asked only in specific years (e.g., organizational readiness for change), because interview participants had less to say about the issue (e.g., management support), or because the construct proved difficult to identify in the natural language responses of interview participants (e.g., IVF). Each year, we asked many questions and heard a great deal about the operational aspects of CBPPR implementation. Not surprisingly, IPP was the most salient construct in our data.

Briefly, results can be summarized as follows.


Table 3 Occurrence of each coded text unit from interviews conducted between 2008 and 2011

Code | CCOP A | CCOP B | CCOP C | Total text units across sites | Illustrative quote
Implementation Effectiveness | 39 | 49 | 46 | 134 | ‘So far during this current grant period they’ve only put 11 patients on and they were pretty good about putting about one-third of the patients for the program overall on. So they were good for 30 or 35 [patients].’
Organizational Readiness for Change* | 44 | 36 | 32 | 112 | ‘I had the feeling that they were fairly confident, not overwhelmingly confident but fairly confident that they’d be able to do it [make the minimum requirements to become a CCOP].’
Management Support | 79 | 54 | 55 | 188 | ‘Well, we were doing the research and I don’t know if they [hospital management] still know exactly what the CCOP is. They know it’s a grant for our research program. I don’t think there’s a lot of understanding at exactly how much that the research staff does.’
Resources Available | 97 | 95 | 68 | 260 | ‘We’re having problems with the trials. We are not—and across the board, the physicians from different areas—the Rad Onc physicians are saying we just don’t have the RTOG trials that we need to be able to work with you all.’
Implementation Policies & Practices | 221 | 228 | 237 | 686 | ‘...so we’ll see a patient, say we have somebody that comes in adjuvant colon cancer, and then they potentially could be eligible for N0147. Then we’ll put the information in the chart for the physician with our card and the consent...then we’ll put the guts of the protocol, like the schema and the treatment plan, eligibility in the calendar, and then I have a—what we call pink sheet. It’s a communication sheet that we’ll write out for the physician, and we’ll tell him this is what it looks like this patient might be eligible for, and these are the tests needed, and there’s the consent.’
Implementation Climate | 96 | 65 | 83 | 244 | ‘I don’t feel the Rad Oncs are as committed to it, for whatever reason, and it may be because of the way they’re set up and the way they do their work, and it may be set up because of the way their administration is, because we’re a private group. I don’t know for whatever reason why. There’s no financial incentive to do it, but I just don’t feel that they’re into it as much.’
Innovation-Values Fit | 51 | 45 | 43 | 139 | ‘I think they [physicians] all want to do research. I think they all want to get their patients the highest level of care. I think that in order to participate with the studies that that’s what they’re doing. So they’re looking for the best way to serve their patients, as well as helping with the hospital.’
External Factors | 121 | 137 | 120 | 378 | ‘It [economic downturn] has had more of an impact than I would have thought. I was probably naïve about that but patients are much more concerned about their co-pays and about what additional charges they might be facing for things that most of us would consider mundane visits like the infusion room charge when they’re here getting a free drug on a research study. Those things sometimes have been knockout punches for us.’
Total text units over study period | 748 | 709 | 684 | 2141 |

*This is a pre-implementation construct only coded during Yr 1 interviews.


The three CCOP organizations varied in the extent to which they achieved consistency in CBPPR over time and across physicians. All three CCOP organizations demonstrated mixed levels of organizational readiness for change, with strong collective confidence to meet accrual goals but low physician and hospital commitment to CBPPR. Hospital management support and resource availability were limited across CCOP organizations early on, although they improved in one CCOP organization. Through IPPs, all three CCOP organizations provided high levels of research support to some CCOP physicians, but not others. However, none created formal expectations or accountability for physicians to help the CCOP organization meet its accrual goals. Further, all three organizations used relatively weak IPPs to recognize and reward physicians for enrolling patients in clinical trials. As a result, all three created a weak implementation climate. Patient accrual became concentrated over time among those groups of physicians for whom CBPPR exhibited a strong IVF. Several external (environmental) factors also influenced innovation use, complicating and enriching our intra-organizational model of innovation implementation.

Below we describe our results in detail, beginning with implementation effectiveness.

Implementation effectiveness

As an organization-level construct, implementation effectiveness refers to the consistency and quality of aggregate innovation use among intended innovation users. We operationally defined innovation use as patient accrual, or the enrollment of new patients in NCI-sponsored clinical trials. Aggregate innovation use is measured in terms of CCOP-level patient accrual.

[Figure 3: Total patient accrual by CCOP Organizations. Source: The CCOP, MBCCOP, and Research Base Management System website: https://ccop.nci.nih.gov/ Note: Total patient accrual includes all patients enrolled in NCI-sponsored cancer prevention and control trials and NCI-sponsored cancer treatment trials. CCOPs report patient accrual to the NCI on a 12-month basis from 1 June to 31 May.]

Figure 3 shows that CCOP A exhibited the greatest consistency over time in total patient accrual to NCI-sponsored clinical trials (i.e., treatment and CP/C accrual combined). CCOP C exhibited a substantial jump in total patient accrual in the first three years of CCOP funding, followed by an even larger decline. CCOP B exhibited a similar, but less pronounced pattern. All three CCOP organizations struggled with patient accrual in 2010 and 2011 due, in part, to external factors (described below). Viewed somewhat differently, CCOP A met the NCI’s accrual goals in eight of the nine observed years for CP/C trials and six of the nine observed years for treatment trials. CCOP C met the NCI’s accrual goals in eight of the nine observed years for CP/C trials, but only two of the nine years for treatment trials. CCOP B met the NCI’s accrual goals in two of the six observed years for CP/C trials and none of the six years for treatment trials. In sum, the consistency of aggregate innovation use was highest in CCOP A, then CCOP C, then CCOP B.

The quality of aggregate innovation use did not vary meaningfully across the three cases. There were minor audit issues reported around eligibility criteria and protocol selection in all three cases, but the data quality and research procedures were generally robust.

The three CCOP organizations also varied in the consistency with which physicians engaged in CBPPR. To examine how and why this is so, we turn to our organizational model of innovation implementation, beginning with an analysis of organizational readiness for change.
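The consistency comparison above reduces to counting goal-met years. A minimal sketch, with placeholder numbers rather than the study data:

```python
def years_goal_met(accrual: dict[int, int], goals: dict[int, int]) -> int:
    """Count observed years in which a CCOP's accrual reached the NCI goal."""
    return sum(accrual[yr] >= goals[yr] for yr in goals)

# Hypothetical treatment-trial figures for one CCOP (not the study data):
goals   = {2002: 50, 2003: 50, 2004: 55}
accrual = {2002: 62, 2003: 48, 2004: 57}
print(years_goal_met(accrual, goals))  # -> 2 of 3 observed years
```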

Organizational readiness for change

Success in the start-up phase of implementation depends in large part on ORC. As noted earlier, ORC concerns organizational members’ collective confidence to implement change and collective commitment to pursue courses of action that will lead to successful change. In all three CCOP organizations, collective confidence was high. Prior to becoming a CCOP, all three CCOP organizations had participated in NCI-sponsored clinical trials as research affiliates of cooperative groups.


As research affiliates, they gained experience enrolling and managing patients on clinical trials and used the per-case reimbursement that they received for enrolling patients to hire research staff to support participating physicians.

Interview participants said that, when they applied for CCOP funding, they knew they would have to significantly ramp up their clinical trials research operations to meet the NCI’s accrual goals. Moreover, they acknowledged some trepidation about having to enroll patients in CP/C trials, a type of clinical research with which they had less experience. Nonetheless, they felt confident that they could quickly increase the scope and scale of their research operations:

‘We were pretty confident we’d be able to meet the treatment trials [accrual goal], if the right trials were open... The prevention [trials accrual goal] has always been kind of an iffy in the very beginning, just because there weren’t [large prevention] trials available... There wasn’t a SELECT [the Selenium and Vitamin E Cancer Prevention Trial], there wasn’t a STAR [Study of Tamoxifen and Raloxifene]. So we thought we’d be able to meet the treatment. We never knew about the prevention.’—CCOP B, CRA group interview

Although collective confidence was high, collective commitment was more variable in all three CCOP organizations. Some physicians were committed and enthusiastic about becoming a CCOP organization; other physicians were ‘supportive’ or merely ‘interested.’ In retrospect, it is not clear whether physicians understood that, as a CCOP, they collectively would be ‘on the hook’ for a much higher level of accrual than they were used to generating as a research affiliate. To meet the NCI’s expectations, physicians would have to step up their personal participation in the clinical research, enrolling patients routinely rather than occasionally in clinical trials. The CCOP PIs of all three CCOP organizations succeeded in building physician support to become a CCOP organization, but the broad support they garnered did not translate into broad (or deep) commitment to enrolling more patients in clinical trials after the CCOP grant was awarded:

‘I think that they [physicians] were tolerant of it but I don’t know that they would have independently thought of it and I don’t know that they perceived that to become a CCOP was going to move us to a different level or establish a different regard for our practice in the community. But, you know, I do think that they heard me out and I think they were supportive of it when they heard the reasons I felt it was a good idea.’—CCOP B, PI

In addition, it is not clear that the CCOP PIs made the strategic or competitive case to hospital managers for becoming a CCOP organization (or, if they did make the case, that hospital managers understood it). In none of the cases did the decision to apply for CCOP funding come from hospital managers. Instead, the PIs came up with the idea to apply and wrote the CCOP proposal. In one case, the PI waited until a ‘friendly’ but interim hospital manager was in place before asking the hospital to sign off on the CCOP proposal. While the PI received the sign-off, the stage was set for low levels of management support once the program was funded.

In all three cases, the minimal involvement of hospital management in the decision to apply for CCOP funding set the stage for minimal management support once the grant was awarded. In all three cases, managers viewed the CCOP as a ‘doctor’s program,’ with potential benefits or advantages for the hospital either misunderstood or unrealized:

‘For many years here, they weren’t supportive. Then they became supportive when they thought that it was [good] for the hospital’s existence. So, hospital administrators will not support programs that are for research, and maybe that was my wrong approach because I told them this was a research project. In retrospect, what I should have done is said this is not a research program, this is a quality of care program.’—CCOP C, PI

In sum, the three CCOP PIs overestimated collective confidence and underestimated collective commitment. Although all three CCOP organizations met NCI’s accrual goals right away, the mixed organizational readiness for change exhibited in the pre-implementation phase foreshadowed the problems that all three cases encountered later in implementation.

Management support and resource availability

Management support matters because implementation is resource intensive. Managers can support CBPPR by allocating needed financial, human, and material resources. In addition, they can shape the implementation context by communicating that CBPPR is an organizational priority, encouraging organizational members to implement policies and practices to support CBPPR. Finally, managers can support implementation by overcoming resistance to change and resolving disputes over resource allocations, organizational routines, and lines of authority.

Given the minimal involvement of hospital managers in the decision to become a CCOP organization, management support during implementation was largely symbolic.


The managers we interviewed expressed verbal support for the CCOP organization, but did not see CBPPR as important in advancing the hospital’s strategic goals or improving its competitive position:

‘When I attract physicians to the organization, there’s eight or nine things that come to the top and this [the CCOP] would probably be one of I’d say five or six, as opposed to the first five. So, I would tell you it’s not in my top three. I would say that when I’m talking about attracting doctors to the organization, I’m generally talking more at the global level.’—CCOP B, hospital manager

In two cases, hospital managers made no financial commitment to CBPPR over the course of our study, despite the fact that NCI funds CCOP organizations with expectations that institutional cost-sharing will occur. Instead, managers expected the CCOP organization to ‘live on its CCOP grant’ and any supplemental funding that physicians generated from participating in pharmaceutical industry trials:

‘Basically, it amounts to the CCOP grant, we get a little bit of money from the cooperative groups and we get the money from industry trials and the hospital helps in the sense that they own the foundation that gives us some philanthropic support that helps us fill the space between what we can generate and what our true costs are...they’re [the hospital] kicking in zero dollars for every dollar that we generate.’—CCOP A, PI

In the third case, hospital managers realized that CBPPR had strategic value in light of the hospital’s plans to become a major teaching affiliate of a newly opened medical school. Three years into implementation, hospital managers began making substantial financial commitments to the CCOP organization, starting at one million dollars and, later, as the Great Recession continued, reducing the amount to $800,000:

‘I think it’s evolved through the years, but I think research has taken a greater importance...as that has happened people have seen, well, you have the CCOP, it’s been the gem, it’s been sitting there all these years and it’s just kind of like rediscovered every year. But now, like I said, with the medical school, the importance of research is clearer.’—CCOP C, physician

Although management support increased over time in one case, the modest management support that existed in all three CCOP organizations during start-up and early implementation was, in interview participants’ minds, both a blessing and a curse. On the one hand, the CCOP organization could ‘fly under the radar’ with little management interference. Moreover, it was largely shielded from the staffing reductions and budget cuts that the hospital imposed elsewhere as the ‘Great Recession’ persisted. On the other hand, the CCOP organization could not expand its operations beyond what the CCOP grant could sustain. Further, the CCOP organization could not use hospital managers’ authority and resources to implement policies and practices that would make physicians accountable for helping the CCOP organization meet its accrual goals.

Implementation policies and practices

Organizations can make use of a wide variety of IPPs to enhance organizational members’ means, motives, and opportunities for innovation use. IPPs are cumulative, compensatory, and equifinal [15,17]. That is to say, all other things being equal, the more IPPs an organization employs to support innovation use, the better. However, high-quality IPPs can sometimes compensate for missing or poor-quality IPPs. For example, excellent in-person training could substitute for mediocre program manuals. Finally, different combinations of IPPs can produce the same results.

Our analysis identified three IPP themes. First, all three CCOP organizations provided a high level of staffing and operational support for some, but not all, of the physicians who wanted to enroll patients in NCI-sponsored clinical trials. Although the specific IPPs used by the three CCOP organizations differed somewhat (e.g., some used a mix of registered nurses and clinical research associates while others used only registered nurses to staff the clinical trials program), all three provided support by screening charts to identify potentially eligible patients, flagging charts to prompt physicians to introduce trials to eligible patients, educating and consenting interested patients, following patients once enrolled in a study, and managing IRB and other regulatory requirements. However, they could only provide such high levels of support to those physician practices located ‘on-site’ or in close proximity to the hospitals where research staff worked. Support for ‘off-site’ physician practices was harder to provide or sustain due to travel time and logistical difficulties. The limited support CCOP organizations could offer these physicians contributed to uneven distribution of patient accrual.

Second, all three CCOP organizations avoided using IPPs to create formal expectations or accountability for physicians to help the CCOP organization meet its accrual goals. The CCOP PIs and CCOP administrators routinely cajoled, sometimes exhorted, and occasionally begged physicians to enroll more patients in clinical trials. They regularly provided physicians with feedback on the CCOP organization’s accrual performance relative to NCI’s accrual goals and usually provided physician-level patient accrual figures.


Yet, none were willing to put into place a policy specifying the minimum number of patients per year that a physician had to enroll in order to maintain membership in the CCOP organization and, thereby, enjoy the reputational benefit of having an ‘NCI affiliation’:

‘This institution will not tell the referring doctors that they can’t be members of the CCOP for some criteria. You’ve got to put, let’s say, ten patients on a year. In fact, the institution is going to the opposite direction.’—CCOP C, PI

Moreover, none were willing to confront physicians who maintained their membership in the CCOP organization, yet consistently contributed no patient accrual. Interview participants mentioned several reasons why they were unwilling or unable to implement such policies and practices, including not wanting to shut anyone out who might at some point contribute accrual; not wanting to exclude off-site physicians who received little or no support from the CCOP organization; and not wanting to confront colleagues who, in some cases, were business partners:

‘No, the group has to provide support for the clinical research program but there’s no individual expectation that every physician will accrue ten patients per year to clinical trials. We keep trying to bang that drum and we provide feedback to our doctors on a regular basis about how they are doing, but I can’t tell that they have the requisite shame that I would hope that they would have with some of the numbers.’—CCOP B, PI

CCOP organizations sought to compensate for these missing policies and practices by providing more research support, but with little success.

Finally, all three CCOP organizations used relatively weak IPPs to recognize and reward physicians for enrolling patients in clinical trials. Financial incentives or rewards were out of the question due to ethical, legal, and regulatory concerns. So, CCOP organizations fell back on social recognition and non-monetary rewards, neither of which remained effective over time. Social recognition included verbal praise in meetings, certificates of appreciation, complimentary write-ups in newsletters, and token awards given to the highest-accruing physician. Non-monetary rewards included paid travel to conferences, batches of cookies, and other small gifts. The motivating effects of these rewards diminished over time:

‘I think that we’ve tried all the rewards that probably many other research organizations have tried. You know, we’ve given, boxes of candy and we’ve given gift certificates and cookies...we found our doctors are not going to enter patients on clinical trials for the benefit of winning the chocolate chip cookies or winning a gift certificate.’—CCOP B, PI

In sum, all CCOP organizations employed IPPs unevenly. Although some IPPs can substitute for other IPPs, the challenges that the CCOP organizations experienced providing support to off-site physicians, and the minimal or low-power IPPs that they used to create expectations, accountability, recognition, and rewards, produced a weak implementation climate.

Implementation climate

IPPs influence innovation use through implementation climate. Organizational members ascribe meaning to the IPPs that they experience directly or vicariously. Through experience, observation, and discussion, they develop a shared sense of whether innovation use is expected, supported, and rewarded. This shared sense is important when the organizational benefits of innovation use depend on consistent, high-quality use by many organizational members. Such is the case with CBPPR in CCOP organizations. The NCI evaluates and funds CCOP organizations based on aggregate (i.e., CCOP-level) accrual.

Given the uneven IPPs that CCOP organizations put into place, implementation climate was weak in all three cases. None of the interview participants reported that they were expected either by the CCOP organization or by hospital management to enroll any particular number of patients in clinical trials each year to maintain membership in the CCOP organization. Likewise, none reported that they were recognized or rewarded in a meaningful way for helping the CCOP organization meet its accrual goals. Physicians practicing in or near the hospital felt supported by the CCOP to enroll patients in clinical trials; off-site physicians did not.

Although implementation climate was weak in all three CCOP organizations, all three nonetheless met or exceeded NCI’s accrual goals. To understand how the CCOP organizations achieved this feat, we must examine the pattern of IVF among physician groups.

Innovation-values fit

As noted above, IVF refers to the extent to which organizational members perceive that innovation use will foster the fulfillment of their values. Values are concepts or beliefs that pertain to desirable end states or behaviors. Although individuals can hold different values, so too can groups. Different physician groups, for example, may hold different beliefs about clinical trials participation. Individuals and groups can also differ in the amount of feeling they attach to a concept or a belief.


Clinical trials participation might be valued strongly and considered a priority, or it might be valued weakly and considered desirable, but not necessary.

When implementation climate is weak, innovation use depends primarily on IVF. Individuals and groups that strongly value CBPPR will enroll patients in clinical trials even if they perceive little expectation or extrinsic reward for doing so, provided adequate support exists. If inadequate support exists, ‘true believers’ will find it difficult to enroll patients on a consistent basis, leading perhaps to frustration and disappointment. By contrast, individuals and groups that do not strongly value CBPPR will enroll few patients, if any, because they see little reason to do so when enrolling patients is neither expected, nor supported, nor rewarded.

We observed individual differences in IVF in all three cases. Not surprisingly, the CCOP PIs highly valued CBPPR and enrolled many patients. In one CCOP organization, a physician enrolled many patients in pharmaceutical industry trials but relatively few in NCI-sponsored clinical trials. Industry trials exhibited a greater fit with his high-intensity values because, in his view, they gave him access to newer, more exciting drugs. In another CCOP organization, a physician stated that he did not like the NCI’s treatment trials because they were not, in his view, always addressing important scientific questions; however, he was willing to help the CCOP organization achieve its accrual goals by enrolling patients in cancer control trials.

We also observed group-level differences in IVF in all three cases. Medical oncologists as a group more strongly valued CBPPR, and therefore enrolled more patients in NCI-sponsored clinical trials, than did radiation oncologists or other physician groups. Interview participants in two CCOP organizations noted that radiation oncologists as a group were ‘supportive,’ but not ‘committed’ to enrolling patients in NCI-sponsored clinical trials. The radiation oncologists would follow the clinical trial protocol for those patients that the medical oncologists enrolled, but they would not enroll patients in radiation-therapy clinical trials in sufficient numbers to help the CCOP organization meet its accrual goals:

‘The total again is 8% of the patients. So that means they put eight, nine, ten patients on this whole year...I’ve been here for 20 years and this is pretty typical for the way the radiation therapists have performed over the last 20 years. This is nothing different. They just don’t do a lot of randomized, you know, Phase III trials, which is what we do as a CCOP. They do a lot of Phase II trials.’—CCOP C, PI

Even when provided with additional research staff support, radiation oncologists as a group would not contribute much on a consistent basis. CBPPR was simply not a high-intensity value for them, interview participants said.

In a third case, radiation oncologists as a group strongly valued CBPPR, as evidenced by their active in-house research program. However, they preferred to enroll patients in in-house studies rather than NCI-sponsored clinical trials because such studies were more instrumental in fulfilling their values (i.e., they felt such studies were more interesting and easier to do).

Given weak implementation climate and variable IVF, all three CCOP organizations exhibited inconsistent, uneven distribution of patient accrual among individuals and groups of physicians. Of the three cases, CCOP A exhibited the strongest IVF among medical oncologists as a group. This CCOP organization recruited and hired physicians who valued CBPPR, and it reinforced CBPPR as a group value in a variety of formal and informal ways. Reflecting a more distributed (or even) pattern of patient accrual, the PI of CCOP A enrolled on average only 28% of the patients per year that CCOP A accrued to NCI-sponsored clinical trials from 2002 to 2010. By comparison, accrual was more concentrated at CCOP C and CCOP B. In the former case, the average annual accrual contribution of the PI was 34%, while in the latter case it was 44%. At start-up, CCOP C exhibited more evenly distributed accrual among physicians, with the CCOP PI accounting for only 18% of patient accrual in the first year. With off-site physicians receiving little support, and radiation oncologists focused on in-house studies, initial enthusiasm faded and accrual became increasingly concentrated among the few ‘true believers.’ By 2008, six years after start-up, the CCOP PI’s contribution accounted for over 42% of total patient accrual. In CCOP B, the concentration of patient accrual was high at start-up and increased steadily over time. In the first year, the PI’s contribution accounted for 39% of total patient accrual. By 2009, it amounted to 59%, despite personal appeals from the highly regarded PI and increased research support for low- or non-accruing physicians. An interview participant described the situation:

‘And our accruals here are actually up but once again, we’re really concerned that [the CCOP PI] has [put] 40 some odd patients on treatment trials. Everyone else in the CCOP is at single digits. And we don’t know, we’re getting ready to have a discussion with the other physicians and [the PI] and I are mulling things over on how to present this but this can’t go on. It just can’t go on. If you want us to maintain ourselves as a viable CCOP, everybody’s got to pull their weight.’—CCOP B, CCOP Administrator
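The concentration figures reported in this section are simple shares: the PI’s enrollment divided by the CCOP’s total accrual for the year. A minimal sketch, with made-up numbers rather than the study’s accrual data:

```python
def pi_share(pi_accrual: int, total_accrual: int) -> float:
    """PI's share of the CCOP's total patient accrual for one year."""
    return pi_accrual / total_accrual

# Hypothetical year with a CCOP B-like pattern: the PI enrolls 40 of 68 patients.
print(f"{pi_share(40, 68):.0%}")  # -> 59%, the kind of concentration described above
```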

External factors


Although the organizational model of innovation implementation used in this study focused our attention on intra-organizational facilitators and barriers, we observed that CCOP organizations are heavily dependent on two external resources needed for CBPPR—trials and patients. In varying degrees, all three CCOP organizations experienced difficulties obtaining these resources, which, in turn, exacerbated the challenges posed by a weak implementation climate and variable IVF.

Interview participants noted that the menu of NCI-sponsored clinical trials available to CCOP organizations has shrunk in the past two years and contains significant gaps in common disease areas, such as colon cancer. Interview participants attributed the trial availability problem to the consolidation of the cooperative group system that is currently taking place in response to an influential, yet critical report by the Institute of Medicine [25], a not-for-profit, non-governmental organization that advises the federal government on issues related to biomedical science, medicine, and health. With fewer treatment trials available, CCOP organizations are struggling to meet their treatment trial accrual goals. In addition, with no large cancer prevention trials available, CCOP organizations are trying to meet their CP/C accrual goals through symptom management trials. These smaller trials often close quickly to new patient accrual, disadvantaging those CCOP organizations (like CCOP C) that work with slow IRBs, which must approve studies locally before CCOP organizations can enroll patients in them.

Interview participants further noted that NCI-sponsored clinical trials are increasingly testing targeted therapies applicable only to small subgroups of cancer patients. Not long ago, breast cancer treatment trials would open to almost any patient with, say, advanced disease and no co-morbidities. Today, trials are testing different therapies for node-positive and node-negative breast cancer, which can be further divided into pre- and post-menopausal categories, which can be further sub-divided into other categories, creating approximately 16 different groups of breast cancer patients (for example, four successive binary subdivisions of this kind yield 2 x 2 x 2 x 2 = 16 subgroups). Similar trends are occurring in treatment trials for other cancers. Interview participants referred to this as a ‘needle in a haystack’ problem:

‘What’s happening is [that] what used to be gigantic baskets of patients are [now] being divided up by biology. It’s better medicine to divide them... Each one of these breast cancers is a bit different, [and] each one of them has a different [treatment] protocol. So now...I just cut breast cancer into 16 different groups... And I think that accounts for a large degree why we don’t really have any rainmaker studies.’—CCOP A, PI

To meet their accrual goals, some interview participants observed, they cannot afford to miss any opportunity to enroll an eligible patient. Hence, they feel frustrated when low- or non-accruing physicians let an eligible patient go un-approached.

Finally, interview participants commented that meeting the NCI’s expectations for patient accrual became increasingly difficult in the face of persistent economic recession and increased market competition. Midwestern states, where the three CCOP organizations operate, have been hit especially hard. Patient volume has declined across the board as patients put off medical care to manage other living expenses. Interview participants noted that patients have voiced more concern about enrolling in a clinical trial because they are worried about their health insurance coverage and out-of-pocket costs. Given job insecurity, patients are more reluctant to take time off work to receive multiple rounds of cancer treatment and return for follow-up clinical visits. At the same time, local market competition has grown. The PI of CCOP A noted, for example, that he and his four partners were the only medical oncologists in town in the first year of the CCOP grant. Now, he has eight partners and there are two competing oncology practices not affiliated with the CCOP organization that have seven more medical oncologists. The number of cancer patients has not increased threefold over time.

Discussion
CBPPR is a promising strategy for translating research results into clinical practice. By engaging providers in the research process, researchers can gain insight into the clinical issues in community practice settings, obtain provider input on study design and implementation concerns, and discover the tacit practice-based knowledge that exists in community practice settings. By doing research in their own practice settings, providers may feel a greater sense of ownership and trust in clinical research results, which, in turn, may increase their commitment to acting on research findings. Yet, CBPPR does not occur spontaneously or effortlessly. To participate in research, community-based providers often must implement systemic changes in organizational staffing, office workflow, information systems, and reward structures. In addition, they must develop an organizational culture (shared values) that supports CBPPR and evidence-based practice.

Our study findings contribute to the limited body of research on the implementation of CBPPR. Three findings in particular have policy and practice implications that merit discussion. First, CBPPR is difficult to create and sustain in the absence of management support from local, affiliated hospitals and health systems. Managers not only control critical institutional resources, but also play (or could play) an important role in legitimizing the value of CBPPR, authorizing policies and practices to support CBPPR implementation, and addressing turf issues and other political obstacles to CBPPR.


From a policy standpoint, the NIH and other federal agencies seeking to increase CBPPR should require hospital and health system executives to make tangible institutional commitments in order to participate in federally funded provider-based research networks or other programs. In its National Community Cancer Centers Program, the NCI went a step further by engaging the chief executive officers of participating hospitals and health systems as a stakeholder group, inviting them to national program meetings, recognizing their participation publicly, and creating opportunities for them to meet privately as a group with the NCI Director [26]. From a practical standpoint, community-based providers need to make the strategic or business case for CBPPR by showing hospital and health system executives how participating in clinical research aligns with and advances organizational priorities. It might be useful to frame CBPPR as 'quality enhancement' rather than 'research,' as the CCOP PI did in the one case we observed where hospital managers made a significant financial commitment to CBPPR. Newly available tools for assessing the business case could also prove useful [27].

Second, community-based providers face demanding schedules and heavy workloads and, thus, need substantial staff support to participate in clinical research. CCOP organizations struggled to provide adequate staff support to physicians who practiced in locations more than a few miles from the hospital or physician office buildings where research staff were based. Even high-accruing physicians found it difficult to enroll patients in clinical trials when they traveled to outlying rural clinics. The policy implication is that the NIH and other federal agencies seeking to promote CBPPR should be prepared to provide adequate, consistent funding to create and maintain the research infrastructure necessary to support CBPPR. To its credit, the NCI uses peer-reviewed cooperative agreements to award multi-year grant funding directly to CCOP organizations, rather than passing the money through academic medical centers or other intermediaries [12]. Direct funding puts more resources in the hands of community-based providers who want to engage in clinical research. Multi-year funding offers predictability and stability for longer-term planning, organizing, and staffing. Cooperative agreements permit local flexibility in resource allocation, while ensuring regular performance monitoring and financial reporting. From a practical standpoint, community-based providers could stretch the resources they have for CBPPR by experimenting with decentralized research staffing models, having a research staff member accompany traveling physicians, training office staff members to assist with research tasks, using information and communications technologies to support providers' participation in research, and triaging research staff to support those 'off-site' providers with the greatest potential for and likelihood of increased research participation.

Finally, research infrastructure is necessary, but not sufficient, to gain providers' commitment to CBPPR. Although CCOP organizations were able to provide high levels of staff support to at least some CCOP physicians, they did not succeed in creating a shared sense among CCOP physicians that enrolling patients in clinical trials was expected, supported, and rewarded. Given a weak implementation climate and variable degrees of IVF, all three CCOP organizations exhibited an inconsistent, uneven distribution of patient accrual among individuals and groups of physicians. Implementation climate is more amenable to policy and practice intervention than IVF [16,17]. At the policy level, the NIH, other federal agencies, and professional associations could stimulate demand for CBPPR by launching a national public service advertising campaign to encourage patients to participate in clinical research [25]. Increased patient demand could increase clinicians' sense that CBPPR is expected. Further, these parties could do more to provide public recognition of clinicians who engage in CBPPR. On a practical note, community-based providers should consider establishing formal expectations and accountability procedures to strengthen the implementation climate for CBPPR [25]. The Delaware Christiana CCOP organization did so in 2008 [28] and later reported that CCOP accrual increased, despite one physician's resignation from the CCOP organization [29]. In 2011, two of the three CCOP organizations we studied followed Delaware Christiana's example. It is too soon to tell what results they achieved.

The organizational model of innovation implementation employed in this study proved useful for investigating the organizational factors that facilitated and hindered start-up and early implementation of CBPPR in CCOP organizations. However, study findings suggest two ways in which the model could be refined. First, the model's focus on intra-organizational factors obscures the extent to which extra-organizational factors influence innovation implementation within organizations. Community-based providers are heavily dependent on external resources and subject to many legal, regulatory, and professional constraints. While these external factors may influence innovation implementation through the intra-organizational factors identified in the theory (e.g., by shaping management support, resource availability, and IPPs), they could also directly influence innovation use. CCOP organizations, for example, cannot modify the protocols of NCI-sponsored clinical trials once the trials become open for patient accrual. CCOP organizations can decide not to make use of trials that do not fit local patient populations, practice patterns, or organizational resources. Doing so, however, limits CBPPR implementation and affects CCOP accrual.


External factors may also change the conceptual meaning of intra-organizational factors. For CCOP organizations, for example, resource availability includes both organizational slack—that is, unused and potentially available financial resources from hospitals [15,30-32]—and CCOP grant funding.

Second, we described IPPs as cumulative, compensatory, and equifinal. Specifically, following Klein et al. [15,17], we argued that the presence of some high-quality IPPs may compensate for the absence or low quality of other IPPs. Although we found some support for this idea, our results suggest there are limits to IPPs' substitutability. Providing more research support to low- or non-accruing physicians, for example, did not compensate for missing or weak IPPs concerning expectations, accountability, recognition, or rewards. Organizational members must have not only the means to achieve consistent, high-quality innovation use, but also the motives and the opportunity to do so. It may be the case that IPPs can compensate or substitute for one another within the categories of means, motives, and opportunities, but perhaps not across these categories.

Study limitations
All studies have limitations. Ours is no exception. Case study research emphasizes depth over breadth, insight over generality [20-22]. Three cases, no matter how closely studied, do not provide a sufficient basis for statistical generalization. To help others assess the reasonableness of applying our findings to other practice-based research networks (PBRNs), we offer a 'proximal similarity model' [33] that suggests two important gradients along which PBRNs vary: the type of clinical research conducted by the PBRN and the strength of the science upon which it is based; and the level of federal support provided to the PBRN. The more proximal a PBRN is to the CCOP on these gradients, the greater the reasonableness of applying study findings beyond the specific people, places, settings, and times from which they were generated. With this proximal similarity model in mind, two factors increase our confidence in the transferability of our study findings. First, the CCOP has already served as a model for other federally supported PBRNs [34], and the NIH continues to look to the CCOP for guidance as it implements the Roadmap. Second, a recent survey found that many clinical research networks funded by the NIH share with the CCOP key organizational features with respect to funding, design, and operation [35]. This means that many NIH-funded clinical research networks are proximally similar to the CCOP.

As is true of all research, case study research involves an irreducible element of expert judgment. We employed several case study research tactics to increase the dependability and credibility of our study results, including the use of theory to conceptually and operationally define constructs; use of multiple sources of data; use of experienced interviewers; use of a common interview guide and codebook; audio-recording and transcription of interviews; use of software to code transcripts and document coding decisions; and duplicate coding and systematic review of coded transcripts, summary reports, and meta-reports by multiple research team members. We feel confident in our use of these time-honored case study research tactics, but we cannot fully discount the possibility that investigator bias in interpretation influenced our results.

Conclusion
A small but growing body of research indicates that community-based providers who engage in clinical research adopt evidence-based clinical services more rapidly than those who do not engage in clinical research [36-41]. These studies suggest that CBPPR holds promise as a strategy for accelerating the translation of research results into clinical practice. Our results contribute to the limited body of research on the implementation of community-based provider participation in research. They inform policy discussions about increasing and sustaining community clinician involvement in clinical research and expand on theory about organizational determinants of implementation effectiveness.

Competing interests
The authors declare that they have no competing interests.

Author contributions
BJW conceived the idea for the manuscript. RT took the lead in drafting it. BJW, RT, DB, and MJ conducted the research that informed the manuscript, participated in data analysis and interpretation, and made editorial and substantive changes to manuscript drafts. All authors contributed equally to this work, and all authors read and approved the final manuscript.

Acknowledgements
This work was supported by funding from the National Cancer Institute (1 R01 CA124402). The authors would like to thank Charles Michael Belden for his assistance with this research.

Author details
1Cecil G. Sheps Center for Health Services Research, University of North Carolina, Chapel Hill, NC, USA. 2Department of Health Policy and Management, Gillings School of Global Public Health, University of North Carolina, Chapel Hill, NC, USA.

Received: 14 October 2011 Accepted: 17 April 2012 Published: 8 May 2012

References
1. Zerhouni E: Medicine. The NIH Roadmap. Science 2003, 302:63–72.
2. NIH Roadmap for Medical Research. [http://nihroadmap.nih.gov/clinicalresearch/overview-translational.asp]
3. Zerhouni EA: US biomedical research: basic, translational, and clinical sciences. JAMA 2005, 294:1352–1358.
4. Sung NS, Crowley WF Jr, Genel M, Salber P, Sandy L, Sherwood LM, Johnson SB, Catanese V, Tilson H, Getz K, et al: Central challenges facing the national clinical research enterprise. JAMA 2003, 289:1278–1287.


5. NCRR Fact Sheet: Clinical and Translational Science Awards. [http://www.ncrr.nih.gov/publications/pdf/ctsa_factsheet.pdf]
6. Westfall JM, Mold J, Fagnan L: Practice-based research–'Blue Highways' on the NIH roadmap. JAMA 2007, 297:403–406.
7. Kahn K, Ryan G, Beckett M, Taylor S, Berrebi C, Cho M, Quiter E, Fremont A, Pincus H: Bridging the gap between basic science and clinical practice: a role for community clinicians. Implement Sci 2011, 6:34.
8. Kon AA: The Clinical and Translational Science Award (CTSA) consortium and the translational research model. Am J Bioeth 2008, 8:58–60.
9. Fagnan LJ, Davis M, Deyo RA, Werner JJ, Stange KC: Linking practice-based research networks and Clinical and Translational Science Awards: new opportunities for community engagement by academic health centers. Acad Med 2010, 85:476–483.
10. Beckett M, Quiter E, Ryan G, Berrebi C, Taylor S, Cho M, Pincus H, Kahn K: Bridging the gap between basic science and clinical practice: the role of organizations in addressing clinician barriers. Implement Sci 2011, 6:35.
11. McMillen JC, Lenze SL, Hawley KM, Osborne VA: Revisiting practice-based research networks as a platform for mental health services research. Adm Policy Ment Health 2009, 36:308–321.
12. Minasian LM, Carpenter WR, Weiner BJ, Anderson DE, McCaskill-Stevens W, Nelson S, Whitman C, Kelaghan J, O'Mara AM, Kaluzny AD: Translating research into evidence-based practice: the National Cancer Institute Community Clinical Oncology Program. Cancer 2010, 116:4440–4449.
13. Weiner BJ: A theory of organizational readiness for change. Implement Sci 2009, 4:67.
14. Damschroder LJ, Aron DC, Keith RE, Kirsh SR, Alexander JA, Lowery JC: Fostering implementation of health services research findings into practice: a consolidated framework for advancing implementation science. Implement Sci 2009, 4:50.
15. Klein KJ, Conn AB, Sorra JS: Implementing computerized technology: an organizational analysis. J Appl Psychol 2001, 86:811–824.
16. Weiner BJ, Lewis MA, Linnan LA: Using organization theory to understand the determinants of effective implementation of worksite health promotion programs. Health Educ Res 2009, 24:292–305.
17. Klein KJ, Sorra JS: The challenge of innovation implementation. Acad Manage Rev 1996, 21:1055–1080.
18. Weiner BJ, Belden CM, Bergmire DM, Johnston M: The meaning and measurement of implementation climate. Implement Sci 2011, 6:78.
19. Ferlie E, Fitzgerald L, Wood M, Hawkins C: The nonspread of innovations: the mediating role of professionals. Acad Manage J 2005, 48:117–134.
20. Miles MB, Huberman AM: Qualitative Data Analysis: An Expanded Sourcebook. 2nd edition. Thousand Oaks, CA: Sage Publications; 1994.
21. Yin RK: Case Study Research. 4th edition. Thousand Oaks, CA: Sage Publications; 2009.
22. Gerring J: Case Study Research: Principles and Practices. Cambridge: Cambridge University Press; 2007.
23. Namey E, Guest G, Thairu L, Johnson L: Data reduction techniques for large qualitative data sets. [http://www.stanford.edu/~thairu/07_184.Guest.1sts.pdf]
24. Buetow S: Thematic analysis and its reconceptualization as 'saliency analysis'. J Health Serv Res Policy 2010, 15:123–125.
25. Institute of Medicine, Committee on Cancer Clinical Trials: A National Cancer Clinical Trials System for the 21st Century: Reinvigorating the NCI Cooperative Group Program. Washington, DC: National Academies Press; 2010.
26. Clauser SB, Johnson MR, O'Brien DM, Beveridge JM, Fennell ML, Kaluzny AD: Improving clinical research and cancer care delivery in community settings: evaluating the NCI community cancer centers program. Implement Sci 2009, 4:63.
27. Reiter KL, Song PH, Minasian L, Good M, Weiner BJ, McAlearney AS: A method for analyzing the business case for provider participation in federally funded provider-based research networks. (under review).
28. Petrelli NJ, Grubbs S, Price K: Clinical trial investigator status: you need to earn it. J Clin Oncol 2008, 26:2440–2441.
29. Petrelli N: Update from Nicholas Petrelli about article on pilot project to define criteria for CCOP clinical trials investigators [Letter]. Oncology Times 2009, 31:3.
30. Helfrich CD, Weiner BJ, McKinney MM, Minasian L: Determinants of implementation effectiveness: adapting a framework for complex innovations. Med Care Res Rev 2007, 64:279–303.
31. Bourgeois LJ: On the measurement of organizational slack. Acad Manage Rev 1981, 6:29–39.
32. Nord WR, Tucker S: Implementing Routine and Radical Innovations. Lexington, MA: Lexington Books; 1987.
33. Trochim WMK: The Research Methods Knowledge Base. Cincinnati, OH: Atomic Dog Publishing; 2001.
34. Lamb S, Greenlick MR, McCarty D (Eds): Bridging the Gap Between Practice and Research. Washington, DC: National Academy Press; 1998.
35. Inventory and Evaluation of Clinical Research Networks. [https://www.ctnbestpractices.org/collaborations/networks-for-clinical-research-ncr/resources/national-leadership-forum-proceedings]
36. Carpenter WR, Reeder-Hayes K, Bainbridge J, Meyer AM, Amos KD, Weiner BJ, Godley PA: The role of organizational affiliations and research networks in the diffusion of breast cancer treatment innovation. Med Care 2011, 49:172–179.
37. Ducharme LJ, Knudsen HK, Roman PM, Johnson JA: Innovation adoption in substance abuse treatment: exposure, trialability, and the Clinical Trials Network. J Subst Abuse Treat 2007, 32:321–329.
38. Laliberte L, Fennell ML, Papandonatos G: The relationship of membership in research networks to compliance with treatment guidelines for early-stage breast cancer. Med Care 2005, 43:471–479.
39. Johnson TP, Warnecke RB, Aitken MJ: Changing practice patterns. In Managing a Health Care Alliance: Improving Community Cancer Care. Edited by Kaluzny AD, Warnecke RB. Frederick, MD: Beard Books; 2000:105–128.
40. Abraham AJ, Knudsen HK, Rothrauff TC, Roman PM: The adoption of alcohol pharmacotherapies in the Clinical Trials Network: the influence of research network participation. J Subst Abuse Treat 2010, 38:275–283.
41. Knudsen HK, Abraham AJ, Johnson JA, Roman PM: Buprenorphine adoption in the National Drug Abuse Treatment Clinical Trials Network. J Subst Abuse Treat 2009, 37:307–312.

doi:10.1186/1748-5908-7-41
Cite this article as: Teal et al.: Implementing community-based provider participation in research: an empirical study. Implementation Science 2012, 7:41.
