
3IE-LIDC SEMINAR SERIES: 'WHAT WORKS IN INTERNATIONAL DEVELOPMENT'
WHAT IS A PROCESS EVALUATION AND HOW TO DESIGN ONE? EXAMPLE OF THE ‘SINOVUYO TEEN’ PARENT SUPPORT PROGRAM IN SOUTH AFRICA
Yulia Shenderovich, ys416@cam.ac.uk / y.shenderovich@gmail.com
Institute of Criminology, Cambridge (supervisor: Professor Manuel Eisner)
Department of Social Policy and Intervention, Oxford (supervisor: Dr Lucie Cluver)

OUTLINE
• Definition
• Why is a process evaluation necessary?
• Guidelines for process evaluations
• Examples of process evaluations
• Development of the process evaluation for Sinovuyo Teen

DEFINITION
Process evaluation – a study aiming to understand the functioning of an intervention by examining:

Implementation: structures, resources and processes of delivery (fidelity, uptake, adaptations, dose, reach)

Mechanisms of impact: how intervention activities, and participants’ interactions with them, trigger change

Context: external factors influencing the delivery and functioning of interventions (culture, economic context, infrastructure, etc.)

Research processes (randomization, spill-over, etc.)

STAGES OF PROGRAM DEVELOPMENT Process evaluation is useful and has different applications during: Feasibility and piloting Effectiveness evaluation Post-evaluation implementation Pragmatic policy trials and natural experiments

Process evaluation is essential for high-quality impact (outcome) evaluations: How, why and under what conditions does the program function?

KEY FUNCTIONS OF PROCESS EVALUATION (MOORE 2014)

KEY TERMS

• Dose – How much of the intervention is delivered?
• Uptake – How much is actually received by participants?
• Reach – How much of the target audience comes into contact with the intervention?
• Fidelity (adherence) and quality of implementation – To what extent is the program implemented as intended?
• Fidelity-adaptation debate: manualisation (Mihalic 2004); form (action) v function (Hawe et al. 2004)
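As a toy illustration of how these indicators can be computed from routine records, the sketch below derives dose, reach, and uptake from hypothetical attendance data; all names and numbers are invented for illustration, not drawn from the trial.

```python
# Toy illustration of the key implementation indicators defined above.
# All names and numbers are invented, not data from Sinovuyo Teen.

sessions_planned = 12     # sessions prescribed by the programme manual
sessions_delivered = 11   # sessions the facilitators actually ran
target_families = 600     # hypothetical eligible families in the study area
enrolled_families = 500   # hypothetical families who enrolled
attendance = {            # sessions attended per enrolled family (sample)
    "family_01": 10, "family_02": 4, "family_03": 12,
}

dose = sessions_delivered / sessions_planned   # share of intervention delivered
reach = enrolled_families / target_families    # share of target audience contacted
uptake = sum(attendance.values()) / (len(attendance) * sessions_delivered)

print(f"Dose:   {dose:.0%}")    # 92% of planned sessions delivered
print(f"Reach:  {reach:.0%}")   # 83% of target families enrolled
print(f"Uptake: {uptake:.0%}")  # mean share of delivered sessions attended
```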

WHY IS PROCESS EVALUATION NECESSARY?

For researchers and policy makers:
• Explaining success (Will outcomes be similar in other contexts? How can the effects be replicated?)
• Explaining failure (Is it due to the intervention or to poor implementation?)
• Does the intervention have different effects on subgroups?

For systematic reviewers:
• Understanding the nature of intervention and implementation heterogeneity (CONSORT-SPI, TIDieR)

RELATED CONCEPTS
• Theory-based evaluation (Weiss 1997)
• Theory-driven evaluation (Chen & Rossi 1983)
• Realistic evaluation (Pawson and Tilley 1997)
• Realist trials (Bonell et al. 2012)
• Implementation assessment (J-PAL)
• Implementation research (Peters, Tran & Adam 2013)
• Causal map (Montibeller & Belton 2006)
• Logic model (Rogers 2004)
• Performance framework (Montague 1998)
• And many others!

“The terminology is not important, it is about buying into the critical thinking” (Helene Clark, ActKnowledge)

RELEVANT TOOLS
• Review of program records and documentation
• Performance data
• Surveys and administrative data
• Interviews
• Focus groups
• Other consultative designs
• Observational studies
• Ethnography
• Case studies

PROGRAM THEORY/LOGIC MODEL (thanks to Radhika Menon, Birte Snilstveit, Philip Davies)

[Figure: Theories of Change/logic model diagram mapping the data required at each stage, including surveys, statistics and demographic data; qualitative data; costs/benefits data; systematic review data; documentary analysis; performance data; historical data; diversity data; effectiveness data; stakeholder data; public opinion data; administrative data; and counterfactual data.]

METHODS OF COLLECTING IMPLEMENTATION DATA (Breitenstein et al., 2010; summary by Felix van Urk)

PROCESS EVALUATION EXAMPLE 1: EXPLAINING PARTIAL SUCCESS
A Stop Smoking in Schools Trial in the UK (ASSIST trial)
• Found reductions in smoking amongst occasional and experimental smokers, but not regular smokers (quantitative)
• In-depth (qualitative) evaluation in 4 schools: in protecting themselves from potential hostility, peer supporters concentrated their attention on peers who they felt could be persuaded

Audrey, S., Cordall, K., Moore, L., Cohen, D. and Campbell, R. (2004). The development and implementation of a peer-led intervention to prevent smoking among secondary school students using their established social networks. Health Education Journal, 63, 266-284.

PROCESS EVALUATION EXAMPLE 2: EXPLAINING LACK OF EFFECT
• "Continuous and Comprehensive Evaluation" (CCE) scheme in Mahendragarh and Kurukshetra districts of Haryana, India
• CCE schools did not perform significantly better than control schools on either oral or written tests
• Assumptions not met: teacher training is adequate; teachers have time and resources to implement
• Type III error (Basch et al. 1985): evaluating a program that has not been adequately implemented

Duflo, E., Berry, J., Mukerji, S. and Shotland, M. (2014). A Wide Angle View of Learning: evaluation of the CCE and LEP Programmes in Haryana. 3ie Impact Evaluation Report.

SINOVUYO TEEN
• Parent management training program designed by researchers to provide a low-cost intervention to reduce child abuse and teen externalizing behaviour
• Implemented by local NACCW (youth and child care) workers
• Tested in a pre-post pilot with 115 families, incl. some process evaluation; RCT upcoming
• Part of the Parenting for Lifelong Health collaboration

STUDY LOCATION – KING WILLIAM’S TOWN, EASTERN CAPE

PROGRAM DEVELOPMENT & EVALUATION TIMELINE
• 2012 – Qualitative research
• 2013 – Pre-post test + quals, N=60 (Keiskamma Trust), followed by adaptation
• 2014 – Pre-post test + quals, N=200 (NACCW & UNICEF), followed by adaptation
• 2015-16 – RCT + process evaluation

THANK YOU TO: research staff and participants, NACCW, UNICEF South Africa, European Research Council, Cambridge International Scholarship Scheme, and other partners:
• Universities: Cambridge, Oxford, Cape Town & Bangor
• National Action Committee (NACCA) SA

RESEARCH COLLABORATORS

Professor Lucie Cluver, Professor Catherine Ward, Professor Frances Gardner, Dr. Franziska Meinck, Dr. Jenny Doubt, Dr. Mark Boyes, Jamie McLaren Lachman, Sibongile Tsoanyane, Sussie Mjwara, Tshiamo Petersen, Rocio Herrero Romero, Sachin De Stone, Netesah Sunapta, Alice Redfern, Janina Steinert, Melissa Pancoast, David Carel, Daphee Blanc, Meryn Lechowicz, Vira Ameli and many others!

DEVELOPMENT ADVICE
UNICEF HQ’s Theresa Kilbane, Patricia Lim Ah Ken and the Child Protection team; the International Rescue Committee’s Laura Boone; Amanda Sim; PEPFAR-USAID’s Gretchen Bachman, Dr Nicole Behnam; Dr Janet Shriberg; Clowns Without Borders South Africa’s Jamie Lachman, Hannah Mangenda; Sibongile Tsoanyane; UNICEF South Africa’s Heidi Loening, George Laryea-Adjei, Patrizia Benvenuti, Seamus Mac Roibin, Andries Viviers and the Child Protection and Social Protection teams; UNICEF ESARO’s Denise Stuckenbruck and Maud Droogleever Fortuyn; UNICEF Innocenti’s Jasmina Byrne; the National Association of Child and Youth Care Workers (NACCW)’s Zeni Thumbadoo, Donald Nghonyama and the Isibindi team; the WHO Violence Prevention Unit’s Dr Chris Mikton, Dr Alex Burchardt; Professor Mark Tomlinson; Professor Judy Hutchings; Professor Frances Gardner; Professor Geri Donenberg; Professor Mary Jane Rotheram-Borus; Dr Danuta Kaspryzyk; Dr Daniel Montano; Professor Theresa Betancourt; Professor Asher Ben Arieh; Professor Larry Aber; Professor Lorraine Sherr; Dr Ashraf Grimwood; Professor Howard Dubowitz; Dr Diane de Panfilis; Professor Manuel Eisner, Dr Karen Devries, Dr Daniel Michelson, Professor Joe Murray; the South African National Department of Social Development’s Deputy Director-General Conny Nxumalo, Thabani Buthelezi, Dr Malega Kganakga and the Children’s and HIV/AIDS directorates; the National Department of Basic Education’s Gugu Ndebele, Likho Bottoman and the Social Inclusion Unit; the Eastern Cape Buffalo City Metro District Department of Social Development’s Mr Mtutuzeli Njungwini; Professor Claude Mellins; Professor Arvin Bhana and Professor Inge Petersen; Stuart Kean; REPSSI’s Noreen Huni and Lynette Mukedunye; the Keiskamma Trust; Professor Lorraine Sherr; Dr Tamsen Rochat; Professor Rachel Jewkes and Dr Anik Gevers; Dr Franziska Meinck, Dr Lucy Steinitz and Lucy Hillier, Professor Lynne Murray, Professor Peter Cooper. Special thanks to Jamie McLaren Lachman, Tshiamo Petersen, Dr Mark Boyes, Dr Franziska Meinck, Dr Lauren Kaplan and Dr Jenny Doubt.

WHY PARENTING?

LMICs have high levels of both child maltreatment and youth violence (Krug, Mercy, Dahlberg, & Zwi, 2002).

A lot of evidence in HICs, including reduction of child maltreatment by caregivers (Barlow, Johnston, Kendrick, Polnay, & Stewart-Brown, 2006; Mikton & Butchart, 2009) and improvement in child behaviour (Kaminski, Valle, Filene, & Boyle, 2008)

• Limited evidence in LMICs (Knerr, Gardner, & Cluver, 2013)
• Evidence primarily for younger children
• High costs of established programmes

Principles based on social learning theory: Modelling behaviour, positive parenting skills before discipline, positive reinforcement to promote good behaviour, positive instruction giving, ignoring negative attention seeking behaviour, nonviolent limit-setting

Examples of facilitator skills: active listening, acceptance, praise, positive body language, reflecting on program principles, reframing.

SINOVUYO TEEN SESSIONS
1. Defining goals and values
2. Life dreams
3. Praising each other
4. Discussing family goals
5. Talking about emotions
6. Dealing with stress and anger
7. Problem solving
8. Methods of saving
9. Rules, routines, responsibilities
10. Keeping safe in the community
11. Responding to crisis
12. Widening circles of support
13. Motivation to save
14. Information on how to save
15. Making a family savings plan

Inputs
• Based on evidence from HICs, adapted to South Africa
• Partnership with lay workers and an implementing NGO

Activities
• Identifying at-risk families with teens with behaviour problems
• Facilitator training
• Weekly facilitator supervision
• 12 weekly sessions and home visits delivered as planned
• Main caregiver and teen participate

Outputs
• Improve parent-adolescent communication
• Improve non-violent discipline
• Reduce inconsistent discipline
• Improve parent and adolescent social support
• Reduce parent and adolescent stress

Outcomes (proximal)
• Increase positive parenting
• Increase parental supervision
• Increase parent involvement

Impacts (distal)
• Reduce violent discipline and abusive parenting
• Reduce adolescent problem behaviour

PRE-POST PILOT
• Focus group with Isibindi facilitators
• Interviews with Isibindi facilitators
• Interviews with CWBSA (Clowns Without Borders South Africa)
• Focus groups with participants
• Workshop observation

LESSONS FROM THE PILOT: EXAMPLES OF PROGRAM REVISIONS BASED ON FOCUS GROUP
• High attendance rates (>75%), high satisfaction
• Khaya Catchups (visiting participants at home) are essential to successful delivery
• Caregivers said financial insecurity was a major issue related to their stress, so a micro-savings component was included in the RCT
• Role-plays essential for delivery of sensitive content
• Sino Buddy system provides some ongoing support
• Participants set up their own Sinovuyo groups

[Figure: Teen (left) and parent (right) reports of changes in their homes after participating.]

LESSONS FROM THE PILOT: CONSTRAINTS
• Limited resources for data collection
• Limited time and literacy among facilitators and programme participants
• Hawthorne effect (evaluation as intervention)
• Requires very broad buy-in

LESSONS FROM THE PILOT: WHAT IS WRONG WITH THIS PICTURE?

[Image: attendance sheet with Name and Signature columns and Location/Date fields.]

SINOVUYO TEEN CLUSTER RCT PLAN
[Figure: trial flow – randomization of villages into intervention (20 villages) and comparison (20 villages), post-test, and follow-up at 12 months.]
• 12-13 families per village, 500 families total
• MDES = 0.37 (Cohen's d) – see the back-of-envelope check below
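As a rough plausibility check on the quoted MDES, the sketch below applies the standard design-effect formula for a two-arm cluster trial. The 80% power, 5% two-sided alpha, and an intracluster correlation of 0.10 are our assumptions, not figures stated in the slides.

```python
# Back-of-envelope MDES check for a two-arm cluster RCT.
# Assumptions (ours, not from the slides): 80% power, 5% two-sided alpha,
# intracluster correlation (ICC) of 0.10.
import math

z_alpha = 1.96              # critical value, 5% two-sided
z_power = 0.84              # critical value, 80% power
n_total = 500               # families across both arms
clusters = 40               # 20 intervention + 20 comparison villages
m = n_total / clusters      # ~12.5 families per village, matching "12-13"
icc = 0.10                  # assumed intracluster correlation

deff = 1 + (m - 1) * icc    # design effect from clustering
mdes = (z_alpha + z_power) * math.sqrt(deff * 4 / n_total)
print(f"MDES ~ {mdes:.2f} (Cohen's d)")  # ~0.37 under these assumptions
```

Under these assumptions the formula reproduces the quoted MDES of roughly 0.37; a different ICC would shift the figure.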

PROCESS EVALUATION PLANNED FOR THE RCT

How is the Sinovuyo Teen Programme (RCT) implemented?

What are the predictors of enrolment, attendance, and quality of participation in the Sinovuyo Teen Programme?

How do attendance and quality of participation affect treatment response to the Sinovuyo Teen Programme at post-test?

What are experiences of caregivers, teens and facilitators participating in the Sinovuyo Teen Programme?
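One common way to address the third question above (how attendance affects treatment response) is a complier average causal effect (CACE) analysis. The sketch below illustrates the idea with a simple Wald ratio on simulated data; the variable names are hypothetical, and this is a sketch of the general technique, not the trial's pre-specified analysis.

```python
# Illustrative sketch: linking attendance to treatment response via a
# complier average causal effect (CACE), computed as the Wald ratio of the
# intention-to-treat (ITT) effect to the compliance difference. Simulated
# data and hypothetical names; not the trial's pre-specified analysis.
import numpy as np

rng = np.random.default_rng(1)
n = 500
assigned = rng.binomial(1, 0.5, n)              # randomized to intervention
attended = assigned * rng.binomial(1, 0.8, n)   # attended enough sessions
outcome = 0.4 * attended + rng.normal(0, 1, n)  # e.g. parenting score change

itt = outcome[assigned == 1].mean() - outcome[assigned == 0].mean()
gap = attended[assigned == 1].mean() - attended[assigned == 0].mean()
cace = itt / gap                                # IV / Wald estimate
print(f"ITT = {itt:.2f}, CACE = {cace:.2f}")    # CACE ~ 0.4 by construction
```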

QUANTITATIVE MEASURES OF IMPLEMENTATION
• Enrolment rates
• Attendance (using facilitator report & photos)
• Participation quality (supervisor ratings and independent ratings based on video for a subgroup)
• Predictors of enrolment and participation: demographics, baseline problem levels, perceived benefits and barriers, attitude to the program (perceived norms) – see the sketch below
• Fidelity of delivery by facilitators (supervisor observations and randomly recorded sessions – video coding scheme)
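As a minimal sketch of how predictors of enrolment might be examined, the example below fits a logistic regression on simulated data; the predictors (age, baseline problem score, perceived barriers) and all effect sizes are hypothetical, not the study's actual model.

```python
# Minimal sketch of a predictors-of-enrolment analysis via logistic
# regression. Data are simulated; variable names and effect sizes are
# hypothetical, not the study's model.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
age = rng.normal(40, 10, n)              # caregiver age
problems = rng.normal(0, 1, n)           # standardized baseline problem score
barriers = rng.normal(0, 1, n)           # standardized perceived-barriers score

# Simulate enrolment: more problems -> more likely, more barriers -> less
linpred = 0.5 + 0.3 * problems - 0.6 * barriers
enrolled = rng.binomial(1, 1 / (1 + np.exp(-linpred)))

X = sm.add_constant(np.column_stack([age, problems, barriers]))
fit = sm.Logit(enrolled, X).fit(disp=0)
print(fit.summary(xname=["const", "age", "problems", "barriers"]))
```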

QUALITATIVE DATA COLLECTION
Interviews with:
• 15-30 randomly selected intervention group families
• 20 randomly selected control group families
• families who attended 10 sessions or more
• families who attended fewer than 4 sessions
• local authorities
Focus group and interviews with facilitators

Topics:
• Experience with the research process
• Understanding of programme topics
• Programme delivery
• Perceived mechanisms of change
• Involvement of other family members in the program
• Costs and benefits of participation, existing motivations
• Potential adverse effects ("dark logic")

WHEN PLANNING:
• Clearly describe the intervention and clarify its causal assumptions
• Identify key uncertainties
• Define the relationship between intervention developers, implementers, and outcome and process evaluators
• Identify previous process evaluations of similar interventions
• Agree scientific and policy priority questions
• Select a combination of quantitative and qualitative methods

"There is no single best way to design and carry out a process evaluation" (Grant 2013)

REFERENCES

Moore, G., Audrey, S., Barker, M., Bond, L., Bonell, C., Hardeman, W., Moore, L., O’Cathain, A., Tinati, T., Wight, D., Baird, J. (2014). Process evaluation of complex interventions. UK Medical Research Council (MRC) guidance.

Grant, A., et al. (2013). "Process evaluations for cluster-randomised trials of complex interventions: a proposed framework for design and reporting." Trials 14(1): 15.

Hoffmann, T. C., Glasziou, P. P., Boutron, I., Milne, R., Perera, R., Moher, D., ... & Michie, S. (2014). Better reporting of interventions: template for intervention description and replication (TIDieR) checklist and guide. BMJ: British Medical Journal, 348.

Michie, S. & Prestwich, A. (2010). Are Interventions Theory-Based? Development of a Theory Coding Scheme. Health Psychology 29(1): 1-8.

Montgomery, P., Grant, S., Hopewell, S., Macdonald, G., Moher, D., Michie, S., & Mayo-Wilson, E. (2013). Protocol for CONSORT-SPI: An Extension for Social and Psychological Interventions. Implementation Science, 8, 99. doi:10.1186/1748-5908-8-99

White, H. (2009). "Theory-based impact evaluation: principles and practice." Journal of development effectiveness 1(3): 271-284.

Funnell, S. C. and P. J. Rogers (2011). Purposeful program theory: effective use of theories of change and logic models, John Wiley & Sons.

Breitenstein, S.M., Gross, D., Garvey, C.A., Hill, C., Fogg, L. & Resnick, B. (2010). Implementation Fidelity in Community-Based Interventions. Research in Nursing and Health 33: 164-173

Mihalic, S. (2004). The importance of implementation fidelity. Emotional and Behavioral Disorders in Youth 4(4): 83–105.

Moore, J., Bumbarger, B.K. & Cooper, B.R. (2013). Examining Adaptations of Evidence-Based Programs in Natural Contexts. The Journal of Primary Prevention 34, 1, 147-61

Linnan, L. and A. Steckler (2002). Process evaluation for Public Health Interventions and research: an overview. Process evaluation for public health interventions. L. Linnan and A. Steckler. San Francisco CA, Jossey Bass.

Palinkas, L. A., G. A. Aarons, et al. (2010). "Mixed method designs in implementation research." Administration and Policy in Mental Health 38: 44-53.

Mowbray, C. T., Holter, M. C., Teague, G. B., & Bybee, D. (2003). Fidelity criteria: Development, measurement, and validation. American journal of evaluation, 24(3), 315-340.

Walker, R., Hoggart, L., & Hamilton, G. (2008). Observing the implementation of a social experiment. Evidence & Policy: A Journal of Research, Debate and Practice, 4(3), 183-203.

THANK YOU!