![Page 1: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/1.jpg)
Cornell University
Cornell Office for Research on Evaluation (CORE)
Building Evaluation Capacity:The “Systems Evaluation Protocol” in Practice
Monica Hargraves, PhD
Manager of Evaluation for Extension and Outreach
Cornell University
October 2009
[email protected]
![Page 2: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/2.jpg)
Brief outline
1. What we do (The “Evaluation Partnership” approach)
2. Key steps in the training (with stories from extension program partners)
3. “Swimming against the tide…”
4. Making it feasible and sustainable
![Page 3: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/3.jpg)
Evaluation Partnerships
– CORE provides training and brings evaluation expertise
– Partners bring experience and expertise in their programs, their communities, their “systems”
Planning Phase is a one-year commitment, with intentions and clarity of roles captured in an MOU
![Page 4: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/4.jpg)
What the EP entails, in the “planning year”
Stages:
• Preparation for Partnership (Jan – March)
• Modeling (intensive!) (April – June)
• Evaluation Planning (July – Oct/Nov)

Formats this year:
• Two in-person, full-day training meetings
• Web-conferences
• Listserv, e-mail, phone support
![Page 5: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/5.jpg)
History of the Project within Cornell Cooperative Extension (CCE)
2006: NYC
![Page 6: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/6.jpg)
History of the Project within Cornell Cooperative Extension (CCE)
2007: Chenango, Jefferson, Onondaga, St. Lawrence, Tompkins, Ulster
![Page 7: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/7.jpg)
History of the Project within Cornell Cooperative Extension (CCE)
2009: Chemung, Chenango, Clinton, Cortland, Franklin, Fulton & Montgomery, Genesee, Jefferson, Madison, Monroe, Oneida, Ontario, Oswego, Rensselaer, Saratoga, Seneca, Tioga, Tompkins, Ulster, Wayne
![Page 9: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/9.jpg)
“Systems Evaluation Protocol”: Planning Phase
![Page 10: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/10.jpg)
Stakeholder Analysis
CCE-Jefferson 4-H Dairy Program: Stakeholder Map. Stakeholders identified: Dairy Program Parents; FFA Teachers; Youth; Jefferson County Legislature; 4-H Members; National Dairy Industry; Jefferson County Fair Board; Funders; CCE Staff; Breed Associations; Local Ag Businesses; Surrounding County Youth; NYS 4-H; SUNY Morrisville/Cobleskill; Other Youth Programs; JCADCA; Jefferson County Dairy Producers; Media; Volunteers; CCE Board of Directors; State Fair; NYS Jr. Holstein Association; Local School Districts; Cornell University; Taxpayers.
![Page 11: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/11.jpg)
Developing Stakeholder charts
![Page 12: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/12.jpg)
Stakeholder Analysis … why it matters:
![Page 13: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/13.jpg)
Logic Model Development
![Page 14: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/14.jpg)
Quick “poll” on formal modeling …
Think of programs you are evaluating, or wish to evaluate.
How many of those have a written-down model (Logic Model, or something similar)?
A – all
B – many
C – some
D – few
E – none
![Page 15: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/15.jpg)
Focus on Activities, Outputs, and Outcomes
![Page 16: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/16.jpg)
Make connections (create “links”)
![Page 17: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/17.jpg)
Pathway Model Development
4-H “SET-To-Go” (an after-school science program), CCE-Cortland County
Pathway Model, October 2009
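Structurally, a pathway model like this one is a directed graph: activities link to outputs, which link onward through short-, middle-, and long-term outcomes. A minimal sketch of that idea (all node names are hypothetical illustrations, not taken from the actual SET-To-Go model):

```python
from collections import defaultdict

# Illustrative sketch: a pathway model as a directed graph.
# Node names are invented, not from the real program model.
links = [
    ("run weekly science club", "youth attend sessions"),
    ("youth attend sessions", "youth gain science skills"),
    ("youth gain science skills", "youth pursue science interests"),
]

graph = defaultdict(list)
for source, target in links:
    graph[source].append(target)

def pathways(node, goal, path=None):
    """Enumerate every pathway from a starting node to a goal outcome."""
    path = (path or []) + [node]
    if node == goal:
        return [path]
    return [p for nxt in graph[node] for p in pathways(nxt, goal, path)]

for p in pathways("run weekly science club", "youth pursue science interests"):
    print(" -> ".join(p))
```

Enumerating every route from an activity to a long-term outcome is one way to make the model's implicit causal claims explicit before choosing what to measure.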
![Page 18: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/18.jpg)
“Mining the Model”
![Page 19: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/19.jpg)
Comments from an Evaluation Partner…
Shawn Smith
4-H Issue Area Leader & Evaluation Project Manager
CCE – Cortland County (CCECC)
![Page 20: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/20.jpg)
CCECC Advisory Committee Input
![Page 21: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/21.jpg)
Internal Stakeholder Analyses
![Page 22: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/22.jpg)
CCECC Pathway Model: “final” to date
![Page 23: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/23.jpg)
Note: You can do Pathway model visuals without the Netway!
![Page 24: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/24.jpg)
Program Life Cycle
[Diagram: Impact vs. Time, showing lifecycle stages initiation → growth → maturity → transformation.]
Source: Program Leadership Certification, “Accountability and Evaluation” PowerPoint, Michael Duttweiler
![Page 25: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/25.jpg)
Quick Poll on Program Lifecycles
Think about a program you are evaluating or are going to be evaluating
What lifecycle stage is it in?
A – early development, pilot
B – still revising/tweaking
C – implemented consistently
D – consistent across sites/facilitators and documented
E – well-established, stable, candidate for replication
![Page 26: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/26.jpg)
Program & Evaluation Alignment

[Diagram axes: Program Lifecycle (Initiation → Development → Maturity → Dissemination) aligned with the Evaluation Lifecycle (Process & Response → Change → Comparison & Control → Generalizability), plus Evaluation Special Projects.]

| Phase | Lifecycle question | Appropriate evaluation |
|-------|--------------------|------------------------|
| IA | Is program in initial implementation(s)? | Process assessment and post-only evaluation of participant reactions and satisfaction. |
| IB | Is program in revision or reimplementation? | Post-only assessment of outcomes, implementation assessment, outcome measurement development, and assessment of internal consistency (reliability). |
| IIA | Is program being implemented consistently? | Unmatched pretest and posttest of outcomes, qualitative assessment of change, and assessment of reliability and validity of measurement. |
| IIB | Does program have formal written procedures/protocol? | Matched pretest and posttest of outcomes; verify reliability and validity of change; human subjects review. |
| IIIA | Is program associated with change in outcomes? | Controls and comparisons (control groups, control variables, or statistical controls). |
| IIIB | Does program have evidence of effectiveness? | Controlled experiments or quasi-experiments (randomized experiment; regression-discontinuity) for assessing program effectiveness. |
| IVA | Is effective program being implemented in multiple sites? | Multi-site analysis of integrated large data sets over multiple waves of program implementation. |
| IVB | Is evidence-based program being widely distributed? | Formal assessment across multiple program implementations that enables general assertions about this program in a wide variety of contexts (e.g., meta-analysis). |
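One way to read the lifecycle gating questions above is as a sequential checklist: each “yes” advances the program to the next phase, and the first “no” stops it. A minimal sketch of that reading (the strictly sequential logic is an assumption for illustration, not something the slides state):

```python
# Hypothetical sketch: walk the lifecycle gating questions in order;
# the last question answered "yes" determines the program's phase.
GATES = [
    ("IA", "in initial implementation(s)"),
    ("IB", "in revision or reimplementation"),
    ("IIA", "implemented consistently"),
    ("IIB", "has formal written procedures/protocol"),
    ("IIIA", "associated with change in outcomes"),
    ("IIIB", "has evidence of effectiveness"),
    ("IVA", "implemented in multiple sites"),
    ("IVB", "widely distributed"),
]

def lifecycle_phase(answers):
    """answers maps a gate label to True/False; returns the furthest phase reached."""
    phase = None
    for label, _question in GATES:
        if answers.get(label):
            phase = label
        else:
            break
    return phase

print(lifecycle_phase({"IA": True, "IB": True, "IIA": True}))  # -> IIA
```

The point of the alignment is that the phase reached, not ambition or external pressure, should pick the evaluation design from the table.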
![Page 27: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/27.jpg)
Determining Evaluation Scope
It’s all about making GOOD CHOICES…
• What kind of evaluation is appropriate for the program lifecycle stage?
• What are the key outcomes this program should be attaining?
• What do important stakeholders care most about?
• What will “work best” in this kind of program?
• What kind of evaluation is feasible for this year? What should wait until a future year?
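To illustrate how stakeholder priorities and internal priorities might be combined when narrowing scope, here is a hypothetical scoring sketch (outcome names, rankings, and the summing rule are all invented for illustration; the slides do not prescribe a formula):

```python
# Hypothetical illustration: shortlist outcomes for this year's evaluation
# scope by combining stakeholder and internal priority rankings.
# Lower rank = higher priority; all names and numbers are invented.
candidates = {
    "youth gain science skills":      {"stakeholder": 1, "internal": 2},
    "youth pursue science interests": {"stakeholder": 2, "internal": 1},
    "parents engage with program":    {"stakeholder": 3, "internal": 3},
}

scope = sorted(
    candidates,
    key=lambda o: candidates[o]["stakeholder"] + candidates[o]["internal"],
)

# The top of the sorted list is the candidate scope for this year.
print(scope[:2])
```

Whatever the mechanics, the decision remains a judgment call: feasibility and lifecycle fit can overrule any ranking.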
![Page 28: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/28.jpg)
Determining Evaluation Scope
[Diagram: a generic pathway model (Activities → Outputs → Short-Term, Middle-Term, and Long-Term Outcomes) annotated with Key Outcomes, Components, Key Links, Key Pathway, Scope, and numbered (1–3) Stakeholder and Internal Priority rankings.]
![Page 29: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/29.jpg)
Comments from another Evaluation Partner…
Linda Schoffel
Rural Youth Services Program Coordinator
CCE – Tompkins County (CCETC)
![Page 30: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/30.jpg)
Using the Pathway Model for making evaluation choices – RYS Rocketry Program
![Page 31: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/31.jpg)
Evaluation should “fit” the program …(lifecycle, stakeholders, context, etc.)
RYS Rocketry Program, CCE-Tompkins County
Pathway Model, October 2009
Do youth who participate in RYS Rocketry feel like they are part of the group? (belonging)
![Page 32: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/32.jpg)
“Swimming against the Tide”
The most frequently cited challenge to program evaluation is lack of time.
The systems approach involves spending a lot of time before you even get to the point of choosing measures…
Programs often face significant pressure for more evaluation, and for evidence of “impact” …
The systems approach argues, essentially, that “less is more” if the evaluation truly “fits” the program
![Page 33: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/33.jpg)
Making it feasible for the long-term
Key ingredients that help:
• focusing on the most valuable elements (choosing well)
• identifying interim benefits of the process
• integrating with other needs
• building on others’ progress
• sharing resources
![Page 34: Building Evaluation Capacity: The “Systems Evaluation Protocol” in Practice](https://reader033.vdocuments.mx/reader033/viewer/2022051316/56813a38550346895da22308/html5/thumbnails/34.jpg)
Wrapping Up …
Thank you!
Any questions for any of us, before returning to Bill…?
For follow-up questions later, Monica Hargraves: [email protected]
Shawn Smith: [email protected]
Linda Schoffel: [email protected]
Also see our website at http://core.human.cornell.edu/