
Collaborative Community Research Consortium: A Model for HIV Prevention

Katherine Haynes Sanstad, MBA
Ron Stall, PhD, MPH
Ellen Goldstein
Wendy Everett, PhD
Ruth Brousseau, PhD

In 1991, the Center for AIDS Prevention Studies (CAPS) at the University of California, San Francisco, set out to develop a model of community collaborative research that would bring the skills of science to the service of HIV prevention and the knowledge of service providers into the domain of research. Essential elements of the model were training for community-based organizations (CBOs) in research protocol writing, partnership between CBOs and CAPS researchers, program research funding, support to implement studies and analyze results, and a program manager to oversee the effort and foster the relationships between CBOs and researchers. In this article, the authors describe the CAPS model of consortium-based community collaborative research. They also introduce a set of papers, written by researchers and service providers, that describes collaborative research projects conducted by research institutions and CBOs and illustrates how collaboration can change both HIV prevention research and service.

Katherine Haynes Sanstad was director of the Technology and Information Exchange (TIE) Core of CAPS from 1990 to 1998. Ron Stall is an associate professor of epidemiology and biostatistics at the University of California, San Francisco, and on the faculty of the TIE Core. Ellen Goldstein is the manager of community programs at CAPS and managed both the collaborative consortia presented here. Wendy Everett was the consultant to the Northern California Grantmakers (NCG) AIDS Task Force from 1994 to 1996 and is now director of the Institute for the Future. Ruth Brousseau is cochair of the NCG AIDS Task Force and senior program officer at the California Wellness Foundation.

Address reprint requests to Katherine Haynes Sanstad, MBA, California Health Care Foundation, 476 9th Street, Oakland, CA 94607; phone: (510) 238-1040; fax: (510) 238-1388; e-mail: [email protected]

The collaborative work represented here was made possible by the Northern California Grantmakers AIDS Task Force; the University of California, San Francisco, Center for AIDS Prevention Studies, with support from the National Institute of Mental Health grant #NIMH/MH 42459; and the California Department of Health Services, Office of AIDS. It could not have been completed without the dedication of the staff of the community-based organizations that participated.

Health Education & Behavior, Vol. 26(2): 171-184 (April 1999). © 1999 by SOPHE

INTRODUCTION

Take a moment to imagine a perfect world of HIV prevention research and service. In this perfect world, data on HIV behavioral risk taking and prevention interventions would be effectively understood and used by the community-based organizations (CBOs) that provide “on-the-ground” prevention services. In addition, behavioral scientists and epidemiologists would maintain ongoing and respectful relationships with frontline prevention workers. These relationships would help researchers integrate CBO staff’s insights about new trends in behavioral risk prevention research. Rapid, continuous communication about innovative intervention programs developed by CBOs would facilitate measuring the effects of these interventions. Cooperative analysis of such data by CBOs and research organization staff could be used to identify those aspects of prevention programs that are effective and those that need improvement. Finally, the strong relationship maintained by researchers and CBO staff could create the knowledge base and foster the political will needed to promote social strategies to avert prevention policy malpractice (e.g., failure to pass drug paraphernalia laws that support disease prevention).1,2 In short, the research and service arms of HIV prevention would work cooperatively, with rapid and effective flow of information and insights between the research and service worlds, so that prevention science and services would be timely, innovative, and effective.

This perfect world of AIDS prevention work does not exist yet. Instead, researchers and frontline prevention workers often move in parallel worlds, sometimes characterized more by mutual distrust and open competition than by cooperation. One telling measure of the disconnection between the prevention service and research worlds is a recent study showing that less than 10% of CBOs surveyed use scientific publications as a source of information in their work.3 Some of this alienation of service from science can be traced to the different reward systems under which researchers and CBO staff operate. Researchers are rewarded for publishing in professional journals; providers are rewarded for the number of clients they serve. Research values objectivity; service values empathy.4,5

Standards of practice also vary. In seeking objective answers, research values uniformity in programs, whereas service values responsiveness. The primary objective of traditional academic research is furthering knowledge; the primary objective of service is meeting client needs. Training and vocabulary also often differ between researchers and service providers and hinder communication.6 These disciplinary differences suggest that conducting research in a collaborative manner requires a model that provides rewards under the systems of both research and service. In addition, a model for increasing cooperation between the research and service worlds should emphasize effective communication and relationship building.

The costs of the current lack of communication and cooperation between the research and service arms of AIDS prevention work are enormous. As the epidemic moves into poor and disenfranchised groups that are poorly understood by researchers, we risk being unable to predict how the epidemic will next manifest and how best to arrest it. By ignoring the understanding that frontline prevention workers develop about risk-taking behaviors as they serve high-risk populations, useful information needed to improve prevention programs and science is lost. The schism between the research and prevention service worlds also means that CBOs may be unnecessarily slow in adopting scientifically proven prevention strategies, either because such interventions are unknown to them or because they are impractical to implement. In addition, researchers often remain unaware of groundbreaking interventions invented by CBOs, missing the opportunity to evaluate whether such prevention programs are meeting their goals. The lack of communication between researchers and frontline prevention workers also means that neither group is as effective at fighting for sound HIV prevention policy alone as it might be if it worked with the other. Thus, both science and service are more vulnerable to unfortunate policy decisions. The gap between the service and research arms of HIV prevention has needlessly compromised the effectiveness of both groups' work and adds to the continuing tragedies of the HIV epidemic.

In this special issue, we describe a model for improving communication and cooperation between the two worlds of HIV frontline service and research. A set of papers describing research projects conducted by staff from academic research institutions and CBOs is presented to illustrate how collaboration can change both HIV prevention research and service. The goal of this set of projects was not only to conduct HIV prevention research within a CBO setting but also to experiment with building CBO evaluation skills and improving communication and cooperation between the researchers and service providers in the context of a consortium of peers. We hope that readers will consider the outcomes of our experiment in collaboration to determine whether this model holds promise for improving prevention research and service within and outside of HIV prevention.

THE CENTER FOR AIDS PREVENTION STUDIES MODEL OF COMMUNITY-BASED COLLABORATIVE RESEARCH

In 1991, the Center for AIDS Prevention Studies (CAPS) at the University of California, San Francisco, set out to develop a model of community collaborative research that would bring the skills of science to the service of HIV prevention and the knowledge of service providers into the domain of research. Essential elements of the model were training for CBOs in research protocol writing, partnership between community-based service providers and CAPS researchers, research funding, technical support to implement studies and analyze results, and a program manager to facilitate the relationships between CBOs and researchers.

The underlying assumption of the CAPS model is that research that carries the promise of both informing science and offering practical knowledge needed to solve community problems provides a meaningful endeavor on which researchers and service providers can collaborate.7 It is predicated on the belief that by putting researchers face-to-face with frontline service providers and, in turn, with prevention clients, collaboration can focus research questions and stimulate more incisive examinations of prevention programs in real-world settings. By putting service providers in contact with researchers, collaboration can create both access to scientific information and the ability to use it.

Research collaboration is not merely CBOs and researchers cooperating with one another; it is more. In the model presented here, collaborative research is that in which service providers pose questions about their programs and clients and work closely with researchers to develop those questions into feasible research questions and to design and implement studies to answer them. It is research in which scientists and service providers work in tandem to implement the study, analyze the data, and interpret results. By scientists or researchers, we mean those academically trained to conduct social science research. By service providers or, alternatively, community investigators, we mean those who offer HIV prevention services in a variety of settings. These settings include HIV service organizations, community clinics, churches, drug treatment facilities, departments of public health, and multipurpose agencies such as the YWCA.

The model also values peer review and relies on an inclusive definition of peers that gives rise to the emphasis on consortium. By definition, a consortium is a cooperative association of institutions engaging in a joint venture that none could accomplish individually. Peer review, one of the engines of the academic world driving research funding, publication, and what is accepted as scientific knowledge, assumes that through collective review and refinement, the ultimate scientific product will be better than it would be if any one person or team had produced the product alone. In the CAPS model, the consortium members are the peers who collectively review, refine, and interpret the research of consortium members. Among the peers are academic researchers, CAPS program administrators, funders, and CBO staff, the majority of whom are program implementation staff rather than evaluators, grant writers, or administrators. Together, these peers strive to improve the performance of each individual study team in an effort to learn more about prevention and community collaborative research than they would have learned otherwise.

The CAPS model of community collaborative research is based on the idea that, when brought together in partnerships to pursue mutual goals, community and academic researchers form study teams that can open communication, connect science with service, share skills and information, and provoke new insights into HIV prevention. The model draws heavily on empowerment evaluation and action research, on bidirectional interpretations of traditional technology transfer, and, most directly, on the CAPS program of collaborative research in developing countries.8-10 It seeks to facilitate a process in which CBOs identify the questions of interest and use them in fielding prevention programs. It strives to build collaborative relationships between researchers and community providers. It endeavors to provide data, both process and outcome, that can help us understand how these programs work and change for the better. Above all, the model is designed to ensure that researchers, service providers, and funders learn from one another: that CBO staff gain new skills they can apply elsewhere, that researchers gather new understandings of behavioral risk and of intervening on risk behavior, and that funders discover how prevention programs work and how they can and perhaps should be evaluated.

APPLYING THE CAPS MODEL: THE HIV PREVENTION EVALUATION INITIATIVE AND STATEWIDE COMMUNITY HIV EVALUATION PROJECT

CAPS has implemented its model of community collaborative research in two different programs: the HIV Prevention Evaluation Initiative and the Statewide Community HIV Evaluation Project (SCHEP). The HIV Prevention Evaluation Initiative was a San Francisco Bay Area program, managed by CAPS, that brought together a consortium of Bay Area CBOs and trained HIV prevention researchers from CAPS, CAPS program administrators, and Bay Area philanthropists to implement HIV prevention intervention research. SCHEP included researcher-CBO teams from throughout California in a program, run by two CAPS administrators, that was designed to implement formative and outcome research. In each case, at least two levels of collaboration occurred: at the study team level and at the level of the consortium. In addition, each collaborative effort also sought to foster a sense of community across research projects in the consortium.

In 1993, CAPS and the Northern California Grantmakers (NCG) AIDS Task Force, an association of private and corporate foundations, came together to launch a program called the HIV Prevention Evaluation Initiative. The goals of the HIV Prevention Evaluation Initiative were (1) to determine whether the funded programs had an effect on client behavior, (2) to build evaluation skills among participants, (3) to bring university researchers new insights on populations at risk for HIV, and (4) to evaluate the overall collaboration among research institutions, CBOs, and funders. The initiative featured (1) a partnership with HIV prevention services funders; (2) a 4-day research protocol writing workshop; (3) up to $50,000 in program funding and $10,000 in evaluation funding per year; (4) a consortium of CBOs, funders, and HIV researchers from CAPS; (5) ongoing technical assistance, including 10% of a scientist's time, 10% of a statistician's time, data entry, and statistical analysis; (6) a dedicated program administrator to support the researcher-CBO pairs, facilitate training and assistance, and remove obstacles to project implementation; and (7) an overall evaluation of the initiative conducted by an outside consultant.

Community projects were selected through a competitive process. After submitting a letter of intent, CBOs that showed promise participated in a 4-day workshop that sought to coach them as they developed their questions about their programs into feasible research protocols. Although large group sessions were held, the lion's share of protocol development happened in small groups in which CBOs and researchers critiqued each other's work and offered recommendations for improvement. After completing the workshop, CBOs applied for funding. Fifty-nine letters of intent were submitted, 17 agencies participated in the workshop, and of the 14 that submitted funding applications, 11 were awarded grants.11

The initiative then focused on the partnerships between researchers and community investigators. We selected CAPS researchers on the basis of their experience with or interest in conducting community-based research. We also chose researchers with expertise on different populations and intervention approaches. All researchers had experience in HIV prevention research, ranging from 1 to 7 years. Researchers and CBO investigators were matched by CAPS program managers. Matches were based on the expertise of the researchers, the population or intervention approach of the prevention program, and the chemistry between researchers and CBO investigators observed during the 4-day workshop. Thus, neither CBO investigators nor CAPS researchers had a free choice of partner; they did have the right of refusal, which none exercised.

There were two components of ongoing technical assistance and support. The first was the consortium forged among all participants in the initiative. Monthly meetings, attended by all HIV Prevention Evaluation Initiative participants, reinforced the collaboration across projects that grew out of the small group work in the workshop. CAPS researchers, CBO staff, CAPS program managers, and one or two representatives from the NCG attended the meeting each month. Under the leadership of the CAPS program manager, monthly meetings provided training, opportunities to discuss projects, and project review through every phase of study implementation. CAPS also drew on its resources and called on those of participating CBOs to offer training on a wide range of topics. For example, CAPS investigators presented a session on tracking study participants for postintervention follow-up, CBO staff presented a session on peer education, and CBO and CAPS personnel copresented special training to prepare outreach workers to administer questionnaires.

The second aspect of ongoing support was provided by the CAPS program manager and NCG project manager. When conflict arose within partnerships or when projects stalled, CAPS and NCG managers intervened with researchers, CBO investigators, or CBO management as necessary. For example, a school district that was one of the study sites changed policy and required that the investigators obtain positive parental consent for students to participate in the research project, rather than assume parental consent if a parent did not explicitly bar a child from participating. As a result, study staff put much more effort into securing positive consent and processed a much higher volume of permission slips than planned. Project costs soared. Working together, the CAPS and NCG managers found resources to cover the additional costs. In another study, when it became apparent that a new intervention was not attracting the number of participants needed to test behavioral outcomes, the CAPS manager was there to help the team modify the protocol, and the NCG manager was there to understand the problems and facilitate changes in the funded program. Both the CAPS researcher and CBO investigator had assistance in resolving problems if needed.

The HIV Prevention Evaluation Initiative was initially designed as a 2-year project but lasted 3 years. The complexity of operating within the constraints of university bureaucracy and the realities of conducting research in the real world delayed progress. Service providers, funders, and researchers agreed that to stop at the end of the second year would be to end the project before it came to fruition. Since the NCG and CAPS were both participants and funders, we were able to extend the initiative to 3 years rather than the 2 years originally planned.

The HIV Prevention Evaluation Initiative was evaluated by an independent consultant, Harder + Company, funded by a 2-year, $105,000 grant from the Ford Foundation.12,13 The evaluation relied primarily on periodic interviews with HIV Prevention Evaluation Initiative participants at different stages of research implementation. In addition, some closed-ended surveys were fielded. The goal of the evaluation was to assess participants' satisfaction with the process and to ascertain what benefits they experienced and what costs they incurred by participating.

SCHEP adapted the CAPS model of consortium-based collaborative research in a statewide program. The California Department of Health Services Office of AIDS funded CAPS to design and administer the statewide program. Researcher-CBO teams applied for funding, and CAPS served as both funder and program administrator with state sponsorship. The manager who oversaw the HIV Prevention Evaluation Initiative directed SCHEP, with the assistance of a field manager who maintained contact with the four grantee teams.

While the HIV Prevention Evaluation Initiative and SCHEP shared the goals of fostering collaboration, building CBO capacity, and testing community-driven interventions, SCHEP differed from the HIV Prevention Evaluation Initiative in key characteristics. In SCHEP, CAPS solicited proposals from researcher-service provider pairs throughout the state of California, in contrast to the HIV Prevention Evaluation Initiative researchers, who were all from one institution. SCHEP researchers and CBOs chose each other; HIV Prevention Evaluation Initiative researchers were assigned. In addition, two of the four SCHEP researchers were new to HIV prevention research when they applied for funding, whereas all CAPS researchers were trained and/or experienced in HIV prevention research and were supported by an institution devoted to HIV prevention research. All data entry and statistical support were provided by SCHEP researchers themselves or by personnel in their research institutions. Although all funds were from the State Office of AIDS, the state, unlike the NCG, had little hands-on involvement once funds were awarded to CBO research teams. While the HIV Prevention Evaluation Initiative grantees came from five counties in the San Francisco Bay Area, SCHEP grantees came from Central, Southern, and Northern California. As a result, SCHEP grantees met three times over 21 months rather than monthly for 36 months as did the HIV Prevention Evaluation Initiative grantees. Thus, the SCHEP consortium was less cohesive than the HIV Prevention Evaluation Initiative group.

The distance between SCHEP grantees and between the grantees and CAPS required mechanisms other than face-to-face contact to foster a consortium among the study teams. CAPS SCHEP program staff produced a monthly newsletter during the first year, publishing project updates and useful information in an effort to keep consortium members informed of one another's progress. In addition, CAPS attempted to set up a computer network among the participants through which they could share information. While the network was a promising vehicle for communication, the widely varying states of computer readiness at participating agencies made its potential difficult to realize.

LESSONS IN COMMUNITY COLLABORATIVE RESEARCH

Over the past several years, through the numerous research projects in which we have participated, we have learned a few keys to maximizing the potential of community-based collaborative research: (1) invest time in building and maintaining collaborative relationships; (2) invest money in supporting collaborative research; (3) ensure that all participants are committed to the services being offered, the evaluation project, and the consortium; (4) set appropriate expectations for research efforts and fund accordingly; (5) if possible, actively involve funders; (6) be flexible in designing and fielding collaborative studies; and (7) learn from difficulties encountered in implementing research.

Invest Time in Building and Maintaining Collaborative Relationships

When an academic researcher and CBO staff come together to conduct research, each must give up some control. The result is uncertainty, an uncertainty that must be overcome by trust. Collaborative intervention research may focus on interventions developed by researchers or on those jointly developed by researchers and service providers. However, when the intervention is developed by service providers, the researcher may have little or no control over the intervention to be studied. He or she must rely on the CBO partners to make the program run and thus give up some power to determine whether the research endeavor can succeed. As in all community research, the researcher cannot control external events, but in this case, neither can he or she fully control how the program accommodates those events. While service providers do articulate the questions they want to answer about their programs, they often must rely on the judgment and experience of their researcher partner to use the tools of science to answer those questions fairly. And the answers might not be what they expect. They may learn that their programs have no effect on clients. Thus, not only do they lose privacy, but their assumptions about how their program works may also be challenged. Giving up control takes trust, and trust can only be built through an accumulation of positive experiences over time.

Building research skills among service providers was an important part of fostering good collaborative relationships. The training and technical assistance in research methods not only helped provide a language researchers and service providers could share but also enabled service providers to collaborate on the research project rather than merely cooperate with it. Research training also enabled them to advise other CBO-researcher teams in their consortium. In SCHEP and the HIV Prevention Evaluation Initiative, the workshops familiarized participants with the steps in the research process and with research vocabulary. In the HIV Prevention Evaluation Initiative, monthly meetings and special trainings reinforced that familiarity, transforming it into knowledge as each service provider was asked not only to pursue his or her own work but to assist others. The time spent building skills that enabled true collaboration on the research endeavor paid off. It built confidence among service providers, and in the HIV Prevention Evaluation Initiative, it increased CBOs' overall capacity to evaluate their programs.13

CAPS worked hard to foster and support good collaborative relationships in both the HIV Prevention Evaluation Initiative and SCHEP. Dedicated CAPS program managers checked with researchers and service providers to see how relationships were going. They helped solve problems in research design, staffing, and budgeting, and they intervened with CBO and, in SCHEP, university management when it seemed to be obstructing the project. In addition, the program managers convened the consortium of CBOs, funders, and researchers and called on each member to use his or her expertise for the benefit of consortium members. From the initial protocol writing workshop to the consortium meetings, SCHEP and the HIV Prevention Evaluation Initiative were designed to expose CBOs and researchers to the expertise each brought to the collaboration and to teach them to rely on one another. Each dedicated the time it takes to collaborate. The Harder + Company evaluation of the HIV Prevention Evaluation Initiative found that researchers and CBOs spent as much as twice the time budgeted on the collaboration.12 Grant makers committed funds for an additional year to complete the project.

Independent groups come together to form true consortia only out of a shared purpose. The HIV Prevention Evaluation Initiative and SCHEP had very different experiences in creating a consortium of CBOs, researchers, and funders. While there were specific occasions on which SCHEP grantees helped each other (one project helped another assess dose response in a multicomponent intervention), the group of four researcher-CBO teams never truly formed a consortium.14 In contrast, the HIV Prevention Evaluation Initiative CBOs spent considerable time working with one another across projects, sharing experience in intervention approaches and in fielding research. Several factors contributed to creating a consortium in the HIV Prevention Evaluation Initiative. The group was large enough to have subgroups whose projects focused on specific populations or intervention approaches (e.g., youth or peer education). The group met at least monthly for 3 years and saw research projects grow from the rough questions posed during the HIV Prevention Evaluation Initiative workshop to preliminary analyses and final reports. Monthly meetings gave people opportunities to think about one another's projects and share expertise. When consortia do hang together, they enrich participants' work by bringing new insights into the programs, research questions, and study results, insights that a study team immersed in its own project sometimes lacks.

Invest Money in Supporting Collaborative Research

Money pays not only to deliver prevention services and evaluate them but also to foster strong collaborative relationships. Together, CAPS and the NCG spent $2.5 million to solicit proposals, train participants, maintain the consortium, and field 10 studies (one of the original 11 was terminated before completion). Ninety percent of the direct costs of the HIV Prevention Evaluation Initiative supported services and evaluation; the remaining 10% of the budget funded project staff. An additional $105,000 covered the cost of evaluating the overall collaboration among the CBOs, CAPS, and the NCG.12,13 It cost SCHEP $688,000 to conduct four projects and maintain a consortium of four CBO-researcher teams. More than 90% of direct costs for SCHEP went to fund services and evaluation, with less than 10% allocated to overall program management.14 It is important to note that these costs include both program administration (the technical assistance and ongoing support provided by program managers, as well as the consortium meetings and emergency assistance for things such as data cleaning) and the provision of prevention services. While administrative costs are not disproportionate to program or evaluation expenses, they are not zero. One cannot run a collaborative research consortium without paying to support that consortium.


Ensure That All Participants Are Committed to Services, Evaluation, and the Consortium

If community-based research is to succeed, the service being studied must be implemented well, and the program staff and researchers must work together well to field the study. In the collaborative consortia described here, projects worked best when researchers, service providers, and funders were committed to the program being studied, to the research under way, and to attending and participating in the consortium. We believe that this kind of commitment comes from involving each participant in the program itself and in the research design and implementation.

Researchers must be committed to the successful implementation of the prevention program they are studying in order to conduct good science. Our experience showed that researchers who visited the interventions more frequently had a better understanding of the program being evaluated. Those who spent time with program staff understood not only how staff implemented the program but also how staff conceptualized it. Time spent on-site is a function of the interest of the researcher and of the resources dedicated to supporting that researcher.

Service providers must actively participate as research partners to make sure that the right question is being asked. Their knowledge of how the program runs and their beliefs about how each component contributes to the whole program are needed to frame research questions, develop the study design, and prepare questionnaires or interview guides. Often, program staff are the ones who administer questionnaires or interview clients, so their support and understanding of research goals are essential. In the data analysis phase of the project, service providers should discuss what questions they would like to answer and make sure that the resulting analyses are completed. All the skills built in the workshop, meetings, and special training sessions help equip CBOs to be active participants in the analysis phase. Sometimes that means asking questions until the answers make sense.

Set Appropriate Expectations for Research Efforts and Fund Accordingly

It is not unusual for one intervention study funded by the National Institutes of Health to cost from $1 million to $3 million for 3 to 5 years of research. The average cost of the intervention studies fielded through SCHEP and the HIV Prevention Evaluation Initiative was tiny by comparison. The projects undertaken in the HIV Prevention Evaluation Initiative were as complex as many National Institutes of Health studies. Many sought to complete outcome evaluations for programs that had not been offered before. While 3 of the 10 completed HIV Prevention Evaluation Initiative studies were able to assess the effects of the interventions studied, funding was largely not sufficient to execute these studies as desired.

When setting out to field collaborative research projects, one should set reasonable goals and fund accordingly. It may be unreasonable to conduct intervention outcome research on an annual budget of $60,000, particularly if one seeks to assess the effects of a new intervention. None of the studies of new interventions fielded in the HIV Prevention Evaluation Initiative was able to determine whether the program had an effect on behavior. This was largely due to problems fielding the programs. All of the projects fielded in SCHEP and the HIV Prevention Evaluation Initiative generated valuable data describing populations at risk and documenting intervention procedures. Most fell short of definitive data on intervention efficacy due to the lack of statistical power driven by poor recruitment into the interventions or low follow-up rates. SCHEP, following the HIV Prevention Evaluation Initiative, set more reasonable goals, fielding formative research projects for two of the four studies.

Actively Involve Funders

Funders can be the fulcrum of the collaborative process between researchers and CBOs. They are interested in whether the programs they fund are sound, and they want to ensure that valuable services are delivered to the greatest number of people possible. Including funders in community collaborative research can help strike a balance between exceptional scientific rigor and complete program flexibility. The NCG, one of the HIV Prevention Evaluation Initiative funders, was a private agency with the ability to set its own rules and to change them. As such, the NCG was able to facilitate midcourse corrections or major changes in research and/or program design. This type of flexibility is critical to successfully fielding community collaborative research.

Funder participation in this consortium model of collaborative research was key. As funders, both CAPS and the NCG had intimate knowledge of the problems that CBOs faced in implementing programs and that the researcher-CBO teams had in fielding studies. They worked hand in hand with grantees to find solutions. By participating in the HIV Prevention Evaluation Initiative collaborative consortium rather than awaiting progress reports, the NCG began to understand how much money and time it takes to implement programs and evaluate them.13 The NCG was able to work with CAPS and CBOs when their original study or program designs needed to change or when the service proposed could not be provided as planned. As funders, they were flexible enough to allow program modifications and study design changes that helped deliver services clients could use and make studies work.

Be Flexible in Designing and Fielding Collaborative Studies

The researcher has a scientific toolbox with which to work and must be dexterous with its tools in community collaborative research. In the toolbox are study designs, sampling schemes, specific measures, and methods of statistical analysis. The researcher's job in the collaboration is to take the suggestions of field staff, incorporate their working theories of how the program works, and help the CBO researchers adapt and apply the most appropriate scientific tool. At the same time, he or she must respect the integrity of the program. For example, Gómez, Hernández, and Faigeles15 chose not to randomly assign women to control and intervention groups. They sacrificed a degree of scientific certainty about the effects of the intervention, but they preserved the nature of the program they sought to assess. Still, the researcher needs to hold the line when it is crucial to answering the agreed-on research question; thus, Gómez, Hernández, and Faigeles rigorously documented exactly what contact research participants had with the multipart program being investigated, and how much.

Research design compromises are inevitable. Minor design revisions were necessary in many of the 14 projects fielded in the HIV Prevention Evaluation Initiative and SCHEP; some projects required major design changes. One project sought to answer the question of whether having a parent participate in the theater-based intervention in which their high school children were going to participate would foster greater parent-teen communication. Of the 3,000 parents eligible to participate in the intervention, only 39 did. The study team could not draw any conclusions on the effects of intervening with parents and teens versus intervening with teens alone. But they had more than 1,000 completed questionnaires from students participating in the study, many containing open-ended responses to a question about parental involvement. The team analyzed these data and learned much from what, on its face, was a scientific failure.16 Researchers and CBO investigators need to be able to capitalize on the opportunities presented by outright disasters in research implementation. They need to constantly review design options, collect program-relevant process data, and systematically collect qualitative data.

Learn From Difficulties Encountered in Implementing Research

In research, there is a bias toward publishing positive results. Studies with inconclusive or negative findings seldom appear in the literature. As a result, we forgo opportunities not only to learn from research “failures” but also to capitalize on lessons learned while implementing research. During the course of the collaborative studies fielded through the HIV Prevention Evaluation Initiative, we saw that parents were the missing partners in three out of three studies that sought to involve them. Their absence from HIV prevention for youth had major implications for the CBOs' program plans (e.g., trying to attract parents of younger children). Another study also found that few of the people targeted actually attended the intervention being offered. As you will read in the Practice Notes, this agency invited gay male social groups to host interventions, and most said no, thank you. That CBO did not continue to invest time and money in that intervention. When regularly scheduled sessions failed to attract enough adolescents in one clinic-based youth program, the intervention was immediately revised to offer drop-in sessions. Many of these lessons arose from circumstances that prevented the CBO-researcher teams from answering their original research questions. Yet, they were essential to making program decisions and to pursuing different, more relevant research questions.

WHAT IS IN THIS ISSUE

In this special issue, we present articles on collaboration in the context of six separate research projects. Articles by Gómez, Hernández, and Faigeles;15 Grinstead, Zack, and Faigeles;17 Harper and Carver;18 and Laub et al.19 come from the HIV Prevention Evaluation Initiative. Articles by Klein, Williams, and Witbrodt20 and Weiker, Edgington, and Kipke21 hail from the SCHEP project. All of these articles address collaboration in the context of their particular HIV prevention program. They discuss collaboration's influence on research design, levels of collaboration, collaborative application of theory, and how collaboration manifests itself at each stage of the research process.

Gómez, Hernández, and Faigeles15 write about the process of collaboration on a study of the efficacy of a multifaceted empowerment intervention that included HIV prevention for Latina immigrant women. They discuss the need for flexibility in research design both to maintain the integrity of the program and to react to unavoidable external events influencing the lives of their clients. They also address the positive results of the intervention they assessed and the promise of meaningful HIV risk behavior change that such a multifaceted empowerment intervention can hold for immigrant women.

While the primary focus of much commentary on collaborative research is the researcher-CBO or researcher-community dyad, collaboration in community-based research takes place in a larger context. Community-based collaborative research occurs in the context of the support, neglect, or outright opposition of surrounding social and political structures. Grinstead, Zack, and Faigeles17 discuss this theme and give the history of the development of a portfolio of truly collaborative projects focusing on HIV prevention for incarcerated men and their female partners at a large state prison. In addition to the complexities of the CBO-researcher relationship, these projects were conducted within a prison, with the support of prison staff and administration.

Harper and Carver18 also address multiple levels of collaboration. In their intervention targeting chronically truant, suburban youth, Harper and Carver not only forged a collaboration between CAPS and the Tri-City Health Center but also included the adolescents who were their research participants. In soliciting and incorporating adolescents' advice in research and program design, Harper and Carver directly engaged the clients in the research process.

Laub et al.19 present an example of the meeting of implicit, program-based theory and academic theory in the context of a school-based HIV prevention intervention for middle school students. The collaboration between CAPS and the YWCA provided a venue for making explicit the theories of behavior change implicit in the evolution of the YWCA's intervention and for linking them with formal, academically developed theory. Laub et al. discuss the process of (1) determining the limitations of the underlying theories of their intervention, (2) developing a new intervention component that sought to overcome these limitations, (3) making explicit a new theoretical framework for that new intervention component, and (4) finding a match between their own implicit theory of behavior change and a body of scientific theory they encountered through their interaction with a researcher.

Klein, Williams, and Witbrodt20 write of the role collaboration played at each stage of studying and implementing a program for urban, minority women in a Native American clinic. The authors discuss the give-and-take that occurred within the researcher-CBO study team at different stages of the process, highlighting how the values inherent in the prevention program guided the research collaboration. They address the real-world constraints of doing good science and service while encountering organizational resistance and working within the limited resources and time available to complete the research project.

Weiker, Edgington, and Kipke21 discuss collaboration on the evaluation of a needle exchange and creative arts prevention program for young injection drug users (IDUs). Based on the theory of harm reduction, the intervention sought to make contact with and serve young IDUs. The authors write of how the harm reduction philosophy came to be understood by the research team over the course of the collaboration and of how harm reduction shaped the study design and its implementation.

This volume closes with a commentary by Schensul22 that places this body of collaborative work in the context of a long history of community participatory research. The commentary compares and contrasts the CAPS model with participatory research conducted in a variety of fields.

We hope that, taken together, this group of articles articulates practical lessons in community-research collaborations, speaks to the strengths researchers and service providers bring to the work, and helps those who wish to conduct such research to anticipate and avoid some of the difficulties inherent in collaboration. We believe that, in small ways, these individual research partnerships and the larger consortia did forge cooperation and open communication among researchers, service providers, and funders. We hope that the CAPS model of community collaborative research offers one model of how to move away from the real division between research and service and toward the ideal partnership between these two worlds.


References

1. Atwood K, Colditz GA, Kawachi I: From public health science to prevention policy: Placing science in its social and political contexts. Am J Public Health 87:1603-1606, 1997.

2. Lurie P, Drucker E: An opportunity lost: Estimating the number of HIV infections associated with the lack of a national needle exchange program in the United States. Lancet 349:604-608, 1997.

3. Goldstein E, Wruble J, Faigeles B, DeCarlo P: Sources of information for HIV prevention program managers: A national survey. AIDS Educ Prev 10:63-74, 1998.

4. Gómez C, Goldstein E: The HIV prevention evaluation initiative: A model for collaborative and empowerment evaluations, in Fetterman D, Kaftarian SJ, Wandersman A (eds.): Empowerment Evaluation: Knowledge and Tools for Self-Assessment and Accountability. Thousand Oaks, CA, Sage, 1996, pp. 100-122.

5. Auerbach JD, Wypijewska C, Brodie HKH (eds.): AIDS and Behavior: An Integrated Approach. Washington, DC, National Academy Press, 1994.

6. Binson D, Harper G, Grinstead O, Haynes Sanstad K: The Center for AIDS Prevention Studies' Collaboration Program: An alliance of AIDS scientists and community-based organizations, in Nyden P, Figert A, Shibley M, Burrows D (eds.): Building Community: Social Science in Action. Thousand Oaks, CA, Pine Forge Press, 1997, pp. 177-189.

7. Fawcett S: Some values guiding community research and action. Journal of Applied Behavior Analysis 24:621-636, 1991.

8. Fawcett S, Paine-Andrews A, Francisco V, et al: Using empowerment theory in collaborative partnerships for community health and development. American Journal of Community Psychology 23:677-697, 1995.

9. Orlandi MA, Landers C, Weston R, Haley N: Diffusion of health promotion innovations, in Glanz K, Lewis FM, Rimer BK (eds.): Health Behavior and Health Education. San Francisco, Jossey-Bass, 1990.

10. Hearst N, Mandel J, Coates T: Collaborative AIDS prevention research in the developing world: The CAPS experience. AIDS 9(Suppl.):S1-S5, 1995.

11. Everett W, Harder P, Brousseau R, Eldred J: Early evaluation results of a collaborative partnership. Health Affairs 15:210-212, 1996.

12. Harder + Company: Evaluation of the NCG/CAPS/CBO collaboration for HIV prevention: Final report. Funded by a grant from the Ford Foundation. Available from Northern California Grantmakers AIDS Task Force, 116 New Montgomery Street, Ste. 742, San Francisco, CA 94105.

13. Human Interaction Research Institute: San Francisco Bay Area HIV/AIDS prevention and evaluation initiative: Lessons for funders, nonprofits and evaluators. A report of the Northern California Grantmakers AIDS Task Force. Available from NCG AIDS Task Force, 116 New Montgomery Street, Ste. 742, San Francisco, CA 94105, 1998.

14. Quirk K, Goldstein E: Final report: Statewide community HIV evaluation project (SCHEP), Contract #84-21130. Available from the Center for AIDS Prevention Studies, University of California, San Francisco, http://www.caps.ucsf.edu, 1998.

15. Gómez CA, Hernández M, Faigeles B: Sex in the new world: An empowerment model for HIV prevention in Latina women. Health Education & Behavior 26(2):200-212, 1999.

16. Heft L, Faigeles B, Hall TL: Mom? Dad? Can we talk? Parent-adolescent communication about HIV. Lisa Heft, independent consultant, email: [email protected], 1998.

17. Grinstead OA, Zack B, Faigeles B: Collaborative research to prevent HIV among male prison inmates and their female partners. Health Education & Behavior 26(2):225-238, 1999.

18. Harper GW, Carver LJ: Out-of-the-mainstream youth as partners in collaborative research: Exploring the benefits and challenges. Health Education & Behavior 26(2):250-265, 1999.

19. Laub C, Somera DM, Gowen LK, Díaz RM: Targeting “risky” gender ideologies: Constructing a community-driven, theory-based HIV prevention intervention for youth. Health Education & Behavior 26(2):185-199, 1999.


20. Klein D, Williams D, Witbrodt J: The collaboration process in HIV prevention and evaluation in an urban American Indian clinic for women. Health Education & Behavior 26(2):239-249, 1999.

21. Weiker RL, Edgington R, Kipke MD: A collaborative evaluation of a needle exchange program for youth. Health Education & Behavior 26(2):213-224, 1999.

22. Schensul JJ: Organizing community research partnerships in the struggle against AIDS. Health Education & Behavior 26(2):266-283, 1999.
