Program Evaluation and the Logic Model

Research Methods for Public Administrators

Dr. Gail Johnson, www.researchdemystified.org


What to Evaluate?

Projects: a single intervention in one location or a single project implemented in several locations.

Programs: an intervention comprising various activities or projects that are intended to contribute to a common goal.

Organizations: multiple intervention programs delivered by an organization.


When to Evaluate?

Before the program starts: to improve design.

During implementation: to improve implementation, identify barriers to be removed, and capture lessons learned about implementation.


When to Evaluate?

Mid-term evaluation: relevance, effectiveness, efficiency. Lessons learned serve as a management tool.

Impact evaluation: either at the end of the project or a few years after the program has been operating (assessing a mature program). Can also look at effectiveness, efficiency, early signs of impact, and sustainability. Lessons learned inform future projects.


Why Is Evaluation Useful?

Feedback
Accountability
Learning
Improvement
Results
Testing underlying assumptions or theory
Funding decisions


Fear of Evaluation

If evaluation is so useful, why do some people fear evaluation?


Evaluation Questions

Compliance/Accountability Questions: Did the promised activities actually take place as they were planned?

“How” Questions: What sequence or processes led to successful (or unsuccessful) outcomes?

Impact Questions: Did the program achieve the desired results?


Types of Evaluations

Auditing: accounting for money. Is the money being spent according to plan? Also considers efficiency and effectiveness.

Monitoring: measuring implementation and results. Is the intervention producing the intended results?

Process: measuring operations and service delivery. Are there problems in service delivery?


Types of Program Evaluations

Feasibility evaluations: conducted before the program begins; intended to improve program design.

Evaluability assessments: assess the potential usefulness of the evaluation and are used to test out different strategies for conducting an evaluation. What is doable given the situation?


Evaluability Assessment

Helps to define the actual objectives, implementation, and management of a program. The actual objectives may differ from those initially planned.

Determines the coherence of the program: are goals, activities, and program infrastructure linked?


Evaluability Assessment

Key steps in the process:
Interview key program staff to learn the actual program mission, goals, objectives, and activities.
Site visits to observe and get a sense of what is going on; may include interviews with key stakeholders.
Observe program delivery.


Evaluability Assessment

Reach agreement as to:
Whether to conduct the evaluation.
The scope and objectives of the evaluation.

The decision could be not to conduct the evaluation.


Evaluability Assessment: Challenges

Key components of the program may not be well defined:
Lack of agreement on program objectives.
Lack of clear, measurable indicators of performance and/or impact.
The target group may not be clearly defined.
The delivery system is poorly articulated.


Types of Program Evaluations

Formative evaluations
During implementation
Feedback about operations and processes
Used to make mid-course corrections


Definition: Performance Monitoring

Performance monitoring: the continuous process of collecting and analyzing data to compare how well a project, program, or policy is being implemented against expected results.

Traditional: focus on inputs, activities and outputs.


Types of Evaluation: Monitoring

On-going review:
On-time
On-budget
On-target

Linked with on-going management.
Measured against established baselines.
Indicators of progress toward targets.
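As a concrete illustration of measuring progress against established baselines and targets, here is a minimal Python sketch. The indicator names, the numbers, and the 50% "behind" cutoff are all hypothetical, not from the slides.

```python
# Each indicator maps to (baseline, target, current value). All values
# here are hypothetical illustrations.
indicators = {
    "participants_enrolled": (0, 200, 150),
    "sessions_delivered": (0, 40, 22),
    "budget_spent_pct": (0.0, 100.0, 60.0),
}

for name, (baseline, target, current) in indicators.items():
    # Progress = share of the distance from baseline to target covered so far.
    progress = (current - baseline) / (target - baseline)
    status = "on track" if progress >= 0.5 else "behind"  # illustrative cutoff
    print(f"{name}: {progress:.0%} toward target ({status})")
```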


Types of Program Evaluations

Summative Evaluations
At the end of the program, or after the program has been running long enough to achieve its goals (with “mature” programs).
Identify lessons learned.
Other issues: unintended outcomes, program sustainability, program efficiency, costs and benefits.
Sometimes called impact evaluations and ex-post evaluations.


Program Evaluation

Summative Evaluation Question: Do Public Programs Work?

Implied cause-effect relationship: Did the program cause a desired outcome?

Performance-based: focus on outcomes, results, impacts, goal achievement.


Differences

Formative Evaluations
Project monitoring; early years of implementation.
Key Question: Are we doing things right?
– Have we hired the right people with the right skills?
– Have we marketed the program effectively?
– Have we met our strategic objectives?
– Have we spent our money according to our plan?


Differences

Summative Evaluations
Measuring results or impacts; a longer time must pass before results or impacts are visible.
Key Question: Are we doing the right thing?
This gets back to the theory or underlying assumptions of the program:
– We can do an excellent job of training people, but if the real problem is larger structural economic issues, a training program, no matter how well implemented, may show little result.


Working with Models

Visualize a program in context: a systems approach, within an environment.
Identify the relationships between various components.
Identify cause and effect.
Identify key assumptions.


Models: Cause and Effect

Did the program cause something to happen?

Education → Employment


Hierarchy of Objectives

Unemployed → Job Training → Increased Income → Improved Quality of Life → Reduced Poverty


Logic Models

The focus is on results or impacts rather than inputs and activities.
We are not training people just for the sake of training people.
We believe that if we train the chronically unemployed, then their quality of life will improve and poverty will decrease.
Our goal is to reduce poverty.
Also called the Program Outcome Model or Measuring for Results.


Logic Model

Inputs → Activities → Outputs → Outcomes → Impact


Elements of the Logic Model

Inputs: what resources are used.
University inputs: budget, number of faculty, number of staff, number of buildings, number of classrooms.

Activities: what the program does.
University activities: teaching, research, and service.


Elements of the Logic Model

Outputs: the services or products produced.
University outputs: number of students who graduate, number of articles and books published by faculty.

Outcomes: what happened; the immediate results.
Graduates are sought after, get good jobs, and become active alumni who donate big bucks.
Faculty are well known, obtain big grants, and enhance the rating of the university.


Elements of the Logic Model

Impacts: the “so what.” Larger, long-term results, usually tied to program goals.
A more informed and engaged citizenry that preserves democratic institutions and produces future leaders.
Faculty research contributes to knowledge.


Logic Model

Logical Connections:
Inputs are used to do activities.
Activities lead to outputs.
Outputs lead to one or more outcomes.
Outcomes lead to impacts.
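As an illustration only (not from the original slides), these five elements and their connections can be written down as a small data structure. This is a minimal Python sketch; the class name `LogicModel` and its `chain` method are hypothetical choices.

```python
from dataclasses import dataclass, field

@dataclass
class LogicModel:
    """A program described by the five logic-model elements."""
    inputs: list[str] = field(default_factory=list)      # resources used
    activities: list[str] = field(default_factory=list)  # what the program does
    outputs: list[str] = field(default_factory=list)     # services or products produced
    outcomes: list[str] = field(default_factory=list)    # immediate results
    impacts: list[str] = field(default_factory=list)     # long-term results tied to goals

    def chain(self) -> str:
        """Render the logical connections from inputs through impacts."""
        stages = [
            ("Inputs", self.inputs),
            ("Activities", self.activities),
            ("Outputs", self.outputs),
            ("Outcomes", self.outcomes),
            ("Impacts", self.impacts),
        ]
        return "\n".join(f"{name}: {', '.join(items)}" for name, items in stages)
```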


Logic Model: Training Program

Inputs (resources):
•money
•staff
•supplies
•mentors

Activities:
•training programs
•dress-for-success coaching
•interview coaching
•resume assistance

Outputs (products):
•number of graduates per training session
•% graduation rate

Outcomes (benefits/changes):
•increased skills
•% obtain jobs
•% obtain high-paying, quality jobs
•increased self-esteem

Impacts (goals):
•increased income
•self-sufficiency
•family stability
•reduction in poverty
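Using the hypothetical `LogicModel` sketch from the previous slide, the training-program logic model above could be filled in like this (purely illustrative):

```python
training_program = LogicModel(
    inputs=["money", "staff", "supplies", "mentors"],
    activities=["training programs", "dress-for-success coaching",
                "interview coaching", "resume assistance"],
    outputs=["number of graduates per training session", "% graduation rate"],
    outcomes=["increased skills", "% obtain jobs",
              "% obtain high-paying, quality jobs", "increased self-esteem"],
    impacts=["increased income", "self-sufficiency",
             "family stability", "reduction in poverty"],
)
print(training_program.chain())
```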


Takeaway Lesson

Program evaluation takes many forms, but all follow the same research planning process.

Evaluation of programs has shifted from reporting inputs and activities to attempting to measure results: the difference the program actually made.


Takeaway Lesson

Measuring results is harder than it appears.
Program goals and objectives may be fuzzy.
Sites may vary in how they have implemented the program.
It takes money to collect and analyze data.
Results may not be observable for many years.
The operating environment may make it hard to see results.


Takeaway Lesson

Measuring results is harder than it appears. But there is much that can be learned from engaging in this process and doing the best job possible.

Remember: do not quickly conclude that a program does not work just because you cannot measure the result. The research tools available may not be up to the job.


Creative Commons

This PowerPoint is meant to be used and shared with attribution.

Please provide feedback. If you make changes, please share freely and send me a copy of the changes: [email protected]

Visit www.creativecommons.org for more information