
Page 1: Matt Keene (PPT)

Integrating Evaluation into the Design of Your

Innovative Program

Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
US Environmental Protection Agency

Innovation Symposium, Chapel Hill, NC
Thursday, January 10, 2008

Page 2: Matt Keene (PPT)


Workshop Outline

1. Introductions

2. Activity – Evaluation in Our Lives

3. Evaluation and its Evolution at EPA

4. Case Study – Product Stewardship in MN

5. Exercise – Integrating Evaluation in MN

6. Opportunities to Integrate Evaluation

Page 3: Matt Keene (PPT)


Introductions

This will be an interactive workshop… so let’s interact!

• Get to know someone at your table
• Tell us:
  • Who they are,
  • Who they work with, and
  • Their New Year's resolution

Page 4: Matt Keene (PPT)


Purpose of the Workshop

Through discussion and a practical, real-world example, provide participants with the structure and conceptual understanding necessary to integrate evaluation and performance management into the design of environmental programs.

Page 5: Matt Keene (PPT)


Evaluation In Our Lives

Activity

• Name something in your life that you or someone else decided was worth measuring and evaluating.

• What was the context?

• Was there a target or goal…what was it?

• Who was the audience?

• How did you measure progress or success?

• How did you use what you learned?

Page 6: Matt Keene (PPT)


Evaluation In Our Programs

What can we take from evaluation in our lives and apply to addressing environmental challenges?

• Measure what matters

• Evaluate for others and for ourselves

Integrating evaluation into program design

• Equal parts art and skill

• Performance management and quality evaluation are inseparable

Page 7: Matt Keene (PPT)


Evaluation In The EPA

Evaluation Support Division

ESD’s Mission

• Evaluate innovations

• Build EPA’s capacity to evaluate

Performance Management

• An approach to accomplishing EPA goals and ESD’s mission

Page 8: Matt Keene (PPT)


Performance Management

Performance management includes activities to ensure that goals are consistently being met in an effective and efficient manner. Performance management tools include logic models, performance measurement, and program evaluation.

Logic Model
• A tool/framework that helps identify the program/project resources, activities, outputs, customers, and outcomes.

Performance Measurement
• Helps you understand what level of performance is achieved by the program/project.

Program Evaluation
• Helps you understand and explain why you're seeing the program/project results.
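As a purely illustrative aside (not part of the original deck), a logic model can be sketched as a simple data structure that measures and evaluation questions later attach to. A minimal sketch; all names and values below are hypothetical:

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """One program's logic model; each field holds that element's items."""
    resources: list[str] = field(default_factory=list)   # inputs consumed
    activities: list[str] = field(default_factory=list)  # work performed
    outputs: list[str] = field(default_factory=list)     # products/services produced
    customers: list[str] = field(default_factory=list)   # target population reached
    outcomes: list[str] = field(default_factory=list)    # short-term through long-term

# Hypothetical line of logic, loosely modeled on a paint-stewardship program
model = LogicModel(
    resources=["program funds", "2 FTE"],
    activities=["run paint collection events"],
    outputs=["# of collection sites opened"],
    customers=["households with leftover paint"],
    outcomes=["% increase in paint recycled"],
)
```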

Page 9: Matt Keene (PPT)


Steps to Completing an Evaluation

I. Select a Program for Evaluation

II. Identify Team/Develop Evaluation Plan

III. Describe the Program

IV. Develop Evaluation Questions

V. Identify/Develop Measures

VI. Design the Evaluation

VII. Collect Information

VIII. Analyze and Interpret Information

IX. Develop the Report


Page 11: Matt Keene (PPT)

Logic Model

[Diagram: Resources/Inputs → Activities → Outputs → Customers → Short-term outcome → Intermediate outcome → Longer-term outcome (STRATEGIC AIM). Read left to right for HOW, right to left for WHY; results flow from the program, and external conditions influence performance (+/-).

Example line of logic: Me → Snodgrass Juggling Regimen → Training → Commitment → Victory.]

Page 12: Matt Keene (PPT)


Performance Measurement

Definition

• The ongoing monitoring and reporting of program progress and accomplishments, using pre-selected performance measures

Measures are designed to check the assumptions illustrated in the logic model
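To make the definition concrete, here is a minimal, hypothetical sketch of ongoing monitoring: reported results for one pre-selected measure are checked against its target each period. The measure, target, and numbers are invented for illustration:

```python
# Pre-selected measure and target (hypothetical values)
measure = "% of collected paint reused or recycled"
target = 25.0

# Ongoing monitoring: results reported each quarter
results = {"2008-Q1": 12.0, "2008-Q2": 17.5, "2008-Q3": 21.0}

for period, value in results.items():
    status = "meets target" if value >= target else "below target"
    print(f"{period}: {measure} = {value:.1f}% ({status}, target {target:.1f}%)")
```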

Page 13: Matt Keene (PPT)


Measures Across the Logic Model Spectrum (Element: Definition; Example Measures)

Resources/Inputs: measure of resources consumed by the organization. Examples: amount of funds, # of FTE, materials, equipment, supplies, etc.

Activities: measure of work performed that directly produces the core products and services. Examples: # of training classes offered as designed; hours of technical assistance training for staff.

Outputs: measure of products and services provided as a direct result of program activities. Examples: # of technical assistance requests responded to; # of compliance workbooks developed/delivered.

Customers Reached: measure of the target population receiving outputs. Examples: % of target population trained; # of target population receiving technical assistance.

Customer Satisfaction: measure of satisfaction with outputs. Examples: % of customers dissatisfied with training; % of customers "very satisfied" with assistance received.

Outcomes: accomplishment of program goals and objectives (short-term and intermediate outcomes; long-term outcomes/impacts). Examples: % increase in industry's understanding of the regulatory recycling exclusion; # of sectors that adopt the regulatory recycling exclusion; % increase in materials recycled.

Page 14: Matt Keene (PPT)


Program Evaluation

Definition

• A systematic study that uses measurement and analysis to answer specific questions about how well a program is working to achieve its outcomes and why.

Orientation/Approaches to Evaluation

• Accountability: external audience
• Learning & program improvement: internal/external audiences

Page 15: Matt Keene (PPT)


Types of Evaluation

[Diagram: Design, Process, Outcome, and Impact evaluation mapped across the logic model, from Resources/Inputs through Activities, Outputs, and Customers to the short-term, intermediate, and longer-term outcomes (STRATEGIC AIM), with HOW/WHY read along the chain.]

Page 16: Matt Keene (PPT)


Questions, Comments and Clarifications

Are there any questions or comments about what we have covered so far?

Page 17: Matt Keene (PPT)


Environmental Evaluation: Evolving Theory and Practice

ESD is witnessing the shift from awareness to action

We are adapting to the increasing sophistication of our clients and demands from stakeholders

• Capacity Building

• Evaluations

Managing performance requires integrating evaluation into program design

Page 18: Matt Keene (PPT)


Our Case Study

Our case study is representative of a trend toward more sophisticated evaluations of environmental programs

ESD is applying learning and adding to it as we take on more sophisticated projects

From here on, you are receiving information necessary to complete the exercises

• You are responsible for integrating evaluation into the program

• Ask questions and take notes!

Page 19: Matt Keene (PPT)


Case Study: Paint Product Stewardship Initiative

Background on…
• Current status and goals of PPSI
• Minnesota Demonstration Program

Page 20: Matt Keene (PPT)


Evaluating the Demonstration Program

What Will We Evaluate?

• Paint
• Management systems
• Education
• Markets
• Cooperation?
• Financing system?

Page 21: Matt Keene (PPT)


[Diagram: draft regional infrastructure.]

Why Are We Evaluating?

• Leadership

• Legislation

• Learning

• Transfer

Page 22: Matt Keene (PPT)


Evaluating the Demonstration Program

What will we evaluate?

• Paint, Management Systems, Education, Markets

Why are we evaluating the program?

• Leadership, Legislation, Learning, Transfer

Can we integrate evaluation into this project?

• We need a framework to follow…and we are building it as we go

• Initially, integrating evaluation into your program is a design and planning activity

Page 23: Matt Keene (PPT)

Integrating Evaluation into Program Design

[Framework diagram, four stages:]

Program
1. Team
2. Mission
3. Goals & Objectives
4. Logic Model

Questions
1. Context
2. Audience
3. Communication
4. Use

Measures
1. Data Sources
2. Collection Methods & Strategy
3. Analysis Tools
4. Data Collection
5. Data Management

Documentation
1. Performance Management Policy
2. Evaluation Methodology
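One way to keep the four stages and their elements in view during design is to write the framework down as an ordered checklist. This sketch is illustrative only; the stage and item names come from the diagram above, the code structure is an assumption:

```python
# The four-stage framework as an ordered checklist (names from the slide).
FRAMEWORK = {
    "Program": ["Team", "Mission", "Goals & Objectives", "Logic Model"],
    "Questions": ["Context", "Audience", "Communication", "Use"],
    "Measures": [
        "Data Sources",
        "Collection Methods & Strategy",
        "Analysis Tools",
        "Data Collection",
        "Data Management",
    ],
    "Documentation": ["Performance Management Policy", "Evaluation Methodology"],
}

for stage, items in FRAMEWORK.items():
    print(stage)
    for i, item in enumerate(items, start=1):
        print(f"  {i}. {item}")
```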

Page 24: Matt Keene (PPT)


Questions, Comments and Clarifications

Take a few minutes to familiarize yourself with the mission, goals and objectives of the MN demonstration program

Page 25: Matt Keene (PPT)


Exercise: Integrating Evaluation

Minnesota Demonstration Project and Performance Management

• We will introduce a process for integrating evaluation into the MN program

• We will use the process to, step-by-step, integrate evaluation into the design of the MN program

Logistics

• Your table is your group for the rest of the workshop

• After brief instruction, each team will complete each step of the process and report the results

Page 26: Matt Keene (PPT)

[Framework diagram repeated: Integrating Evaluation into Program Design; see Page 23.]

Page 27: Matt Keene (PPT)

Select and Describe the Program

The MN demonstration program is our program. Your table is the team that will build evaluation into the MN program.

Describing the MN program:
• Mission
• Goals and objectives
• Logic model: we are going to make one!

[Framework diagram: Program stage (Team, Mission, Goals & Objectives, Logic Model) highlighted.]

Page 28: Matt Keene (PPT)

Describe the Program: Logic Model

[Example line of logic: Me → Snodgrass Juggling Regimen → Training → Commitment → Victory.]

Instructions: Each table will craft a line of logic based on one goal (long-term outcome) of the MN project. For each component of the model (e.g., activity, output, outcome), brainstorm with your group to decide on 2-3 items to complete your line of logic.

[Template: Resources → Activities → Outputs → Customers → Short-Term, Intermediate, and Long-Term Outcomes.]

Page 29: Matt Keene (PPT)

Evaluation Questions

What are the critical questions to understanding the success of the MN program?

Use an outcome from your logic model to create your evaluation question.

[Framework diagram: Questions stage highlighted.]

Page 30: Matt Keene (PPT)

Evaluation Questions

What contextual factors may influence the answers to each question?

Who are the audiences for each question?
• What's the best way to communicate with each audience?
• How might each audience use the answer to each question?

[Framework diagram: Questions stage (Context, Audience, Communication, Use) highlighted.]

Page 31: Matt Keene (PPT)

Evaluation Questions

What are the critical questions to understanding the success of the MN program?
• Use an outcome from your logic model to create your evaluation question.

What contextual factors may influence the answers to each question?

Who are the audiences for each question?
• What's the best way to communicate with each audience?
• How might each audience use the answer to each question?

[Framework diagram: Questions stage highlighted.]

Page 32: Matt Keene (PPT)

Performance Measures

What can we measure to answer each question?

Where can we find the information for each measure?

How can we collect the information?

Given our questions and information to be collected, what will be an effective collection strategy?

[Framework diagram: Measures stage (Data Sources, Collection Methods & Strategy, Analysis Tools, Data Collection, Data Management) highlighted.]

Page 33: Matt Keene (PPT)

Performance Measures

What analytical tools will give us the most useful information?

How will we implement the collection strategy?

How will we manage the data?

[Framework diagram: Measures stage highlighted.]

Page 34: Matt Keene (PPT)

Performance Measures

What can we measure to answer each question?

What methods are best suited for each measure?

What analytical tools will give us the most useful information?

Given our questions and information to be collected, what will be our collection strategy?
• How will we implement the collection strategy?
• How will we manage the data?

[Framework diagram: Measures stage highlighted.]
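The prompts above amount to filling in one row of a measurement plan per question. A minimal sketch, assuming a simple row-per-question plan; none of these entries come from the actual MN workplan:

```python
from dataclasses import dataclass

@dataclass
class MeasurePlanRow:
    """Links one evaluation question to a measure and its collection details."""
    question: str       # evaluation question the measure helps answer
    measure: str        # what we measure
    data_source: str    # where the information lives
    method: str         # how we collect it
    analysis: str       # analytical tool or technique
    management: str     # how the data will be stored and managed

plan = [
    MeasurePlanRow(
        question="Is leftover paint being diverted from disposal?",
        measure="gallons of paint collected per quarter",
        data_source="collection-site intake logs",
        method="quarterly reports from site operators",
        analysis="trend comparison against a baseline year",
        management="central spreadsheet maintained by the program team",
    ),
]

for row in plan:
    print(f"{row.question} -> {row.measure} (from {row.data_source})")
```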

Page 35: Matt Keene (PPT)

Documentation: Methodology & Policy

Evaluation Methodology
• The process of integrating evaluation generates a framework for a methodology and an evaluability assessment

Performance Management Policy
• Applies across office programs and projects
• Guides strategy and planning

[Framework diagram: Documentation stage (Evaluation Methodology, Performance Management Policy) highlighted.]

Page 36: Matt Keene (PPT)


Check the Logic

Revisit the process and the decisions made

Look for the flow in the process and identify potential breaks

Identify potential obstacles to our approach to managing the performance of the MN demonstration program

The 1st cycle is integration; the next cycle begins implementation

Page 37: Matt Keene (PPT)


What is happening today with the PPSI?

MOU

Workgroups/committees

Minnesota demonstration project planning

Integrating evaluation into project design

Page 38: Matt Keene (PPT)


Recap and Next Steps

Practice : Theory

• An inconsistent ratio

Movement in the environmental community toward:

• Evidence

• Effectiveness

• Evaluation

Opportunities to merge theory and practice

• Policy

• Leadership

• New programs

• Capacity building efforts like this one

Page 39: Matt Keene (PPT)


Thank You!

Evaluation Support Division
National Center for Environmental Innovation
Office of Policy, Economics and Innovation
U.S. Environmental Protection Agency

Matt Keene

(202) 566-2240

[email protected]

www.epa.gov/evaluate


Page 43: Matt Keene (PPT)


Adaptive Management Cycle

Page 44: Matt Keene (PPT)


Evaluation…In the Life of a Program

When to do it?

What are the obstacles?

Are there solutions?

Are there opportunities to improve evaluations in your shop?


Page 46: Matt Keene (PPT)


Evaluation Questions

What are the critical questions to understanding the success of the MN program?

Link your questions to a component in your line of the logic model

What contextual factors may influence the answers to each question?

Who are the audiences for each question?

• What’s the best way to communicate with each audience?

• How might each audience use the answer to each question?

Page 47: Matt Keene (PPT)


Document Evaluation Policy and Methodology

Evaluation Policy

Evaluation Methodology

Page 48: Matt Keene (PPT)


Performance Measures

What can we measure to answer each question?

What methods are best suited for each measure?

What analytical techniques could we use to maximize the rigor of our analysis?

Given the level of rigor desired, what will be our collection strategy?

• How will we implement the collection strategy?
• How will we manage the data?

Page 49: Matt Keene (PPT)


Materials

Presentation

Flip charts

Markers

Projector

Laptop

Tape for flipchart paper

Post-it notes

Page 50: Matt Keene (PPT)


Supporting documents from PPSI, etc.

MN MOU

MN Goals and Objectives and Tasks

Workplan

Logic Model

Page 51: Matt Keene (PPT)


[Diagram: Performance Management Cycle, centered on the Program Mission. The cycle runs through Planning, the Logic Model (conceptual framework), Performance Measurement (helps you understand what), Program Evaluation (helps you understand and explain why), Aggregate/Analysis, and Adapt/Learn/Transfer.

Speaker's note: the Performance Management Cycle needs adaptive management components like "implement".]

Page 52: Matt Keene (PPT)


Steps to Integrating Evaluation into Program Design

[Diagram: the steps and their elements. Identify a Team; Select a Program; Describe the Program (Needs, Mission, Goals & Objectives, Logic Model, Context); Develop Questions (Audiences, Use, Communication); Identify Measures (Methods, Collection Strategy, Collection, Analysis, Data Management); Document (Policy, Methodology).]

Page 53: Matt Keene (PPT)

[Framework diagram repeated: Integrating Evaluation into Program Design; see Page 23.]

Page 54: Matt Keene (PPT)


Program Management Cycle

Page 55: Matt Keene (PPT)


Needs, Mission, and Goals and Objectives

What drives the need for performance management?

• Mission
• Goals and Objectives

Page 56: Matt Keene (PPT)


Logic Model

Each table gets a logic model template.

Goals from the MN project represent long-term outcomes.

Each table fills in the other components of the logic model.

We'll put the lines of logic together to form a complete-ish model.

Page 57: Matt Keene (PPT)

[Framework diagram repeated: Integrating Evaluation into Program Design; see Page 23.]

Page 58: Matt Keene (PPT)

[Framework diagram repeated: Program, Questions, Measures, and Documentation with their elements; see Page 23.]

Page 59: Matt Keene (PPT)

[Framework diagram repeated; see Page 23.]

Page 60: Matt Keene (PPT)

[Framework diagram repeated; see Page 23.]
