

Page 1: Monitoring and evaluation (Part 2)

ESTABLISHING A CULTURE OF MONITORING AND EVALUATION: SPREADING THE WORD, (RE)-TURNING THE CULTURE (PART 2)

Vilimaka Foliaki
Monitoring and Evaluation Advisor
Tonga Education Support Program, Phase 2 (TESP2)

In collaboration with
Ponepate Taunisila
Deputy Chief Education Officer
Quality Assurance Division
Tonga Ministry of Education

Professional Development Training
Ministry of Education and Training, TONGA
Wednesday 12 August – Thursday 13 August 2015

Page 2: Monitoring and evaluation (Part 2)

YOUR WORK

- An investment: resources are spent between 'start' and 'end'
- Time-bound
- A project: "a temporary endeavor designed to produce a unique product, service or result with a defined beginning and end … undertaken to meet unique goals and objectives, typically to bring about beneficial change or added value" (Wikipedia)

Page 3: Monitoring and evaluation (Part 2)

YOUR WORK – LIFE CYCLE

- M&E is a critical element of planning
- M&E is usually forgotten or neglected, and often remembered LATE: during the implementation stage, or at the end, to measure and evaluate success
- M&E is part of everything that we do, all the time

Page 4: Monitoring and evaluation (Part 2)

LET’S LEARN FROM THE EXPERIENCE OF THE USA

The Logical Framework Approach (LFA) was invented by USAID in 1969, based on its experiences with development initiatives. There were three problem areas:

1. Planning: objectives were not clearly defined, and there was no clear link between objectives and activities. Planning was too vague.
2. Management: managers were unwilling to accept responsibility for results. Management responsibilities were unclear.
3. Evaluation: evaluators used their own strategies, as there was no common agreement on what the projects were trying to achieve and no common understanding of what 'success' looked like. Evaluation was an adversarial process.

These USAID experiences led to THE LOGICAL FRAMEWORK APPROACH (LFA).

Page 5: Monitoring and evaluation (Part 2)

THE LOGICAL FRAMEWORK APPROACH (LFA)

An approach to project/program design and development. It ensures:
- Alignment
- Relevance
- Awareness of external factors (assumptions and risks)
- Success

An approach to problem solving: projects/programs are designed to solve problems. The LFA outlines the sequence of events in the process of solving a problem.

Page 6: Monitoring and evaluation (Part 2)

YOUR LOGFRAME

A 4x4 table/matrix: it organizes answers to four key management questions:
1. What are we trying to accomplish and why? (Goal)
2. How will we measure success?
3. What other conditions must exist?
4. How do we get there?

It is read through three "directional" logics:
1. Vertical logic
2. Horizontal logic
3. Zigzag logic

The matrix itself:
Columns (1–4): 1. Description of objective level, 2. Success measures (Indicators), 3. Means of verification, 4. Assumptions and risks
Rows (1–4, the objective levels): 1. Goal (Impact), 2. Purpose (Outcomes), 3. Outputs, 4. Activities
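The 4x4 matrix described above can be sketched as a simple data structure. This is a minimal illustration only, not part of the training handout; all field names and example content below are hypothetical:

```python
# A logframe sketched as a list of rows, ordered top (Goal) to bottom
# (Activities). Each row carries the four columns of the matrix.
# All example content below is illustrative placeholder text.

COLUMNS = ("description", "indicators", "means_of_verification",
           "assumptions_and_risks")

logframe = [
    {"level": "Goal (Impact)",
     "description": "Improved quality of education",
     "indicators": ["National assessment results"],
     "means_of_verification": ["Assessment reports"],
     "assumptions_and_risks": ["Policy environment remains stable"]},
    {"level": "Purpose (Outcomes)",
     "description": "Teachers apply the new teaching methods",
     "indicators": ["Share of observed lessons using the methods"],
     "means_of_verification": ["Classroom observation records"],
     "assumptions_and_risks": ["Trained teachers remain in post"]},
    {"level": "Outputs",
     "description": "Teachers trained in the new methods",
     "indicators": ["Number of teachers trained"],
     "means_of_verification": ["Training attendance registers"],
     "assumptions_and_risks": ["Trainers are available"]},
    {"level": "Activities",
     "description": "Deliver PD training workshops",
     "indicators": ["Workshops delivered on schedule"],
     "means_of_verification": ["Workshop reports"],
     "assumptions_and_risks": ["Venues and transport are available"]},
]

def check_logframe(rows):
    """Return a list of problems: any row missing one of the four columns."""
    problems = []
    for row in rows:
        for col in COLUMNS:
            if not row.get(col):
                problems.append(f"{row.get('level', '?')}: missing {col}")
    return problems

print(check_logframe(logframe))  # an empty list means every cell is filled
```

A check like this mirrors the discipline the matrix imposes: no objective level is complete until all four columns have an answer.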

Page 7: Monitoring and evaluation (Part 2)

FILLING IN YOUR LOGFRAME

COLUMN 1
- The most important column
- Set up your hierarchy of objectives (Goal, Outcomes, Outputs, Activities). This provides the structural foundation for the project.
- Test for 'vertical logic' and adjust the framework to overcome logical flaws (unfeasible/unlikely relationships), using 'If-Then' hypotheses. Example:
  1. IF this PD training (activity) is carried out, THEN these …

Page 8: Monitoring and evaluation (Part 2)

STRUCTURE OF THE LOGFRAME – Column 2

Indicators: look at each objective and ask "How can we measure it?"
What does success at each objective level look like?
(Consensus: everyone must agree on the indicators.)

Page 9: Monitoring and evaluation (Part 2)

STRUCTURE OF THE LOGFRAME – Column 3

Means of verification:
- How can we verify these indicators?
- How can we really know?
- Where can we find these indicators?

Page 10: Monitoring and evaluation (Part 2)

EXTERNAL FACTORS THAT AFFECT PROGRESS – ASSUMPTIONS AND RISKS

Assumptions:
Ask this question: what external factors, if present, could influence progress (from one level to the next)? List these assumptions. E.g. "The language used in pupils' books is appropriate."
If an assumption is:
- True – your work benefits
- Wrong – your work suffers

Risks (assumptions and risks have opposite effects):
Factors that will negatively affect progress. List these. E.g. "The boat/car breaks down."
If a risk is:
- True – your work suffers
- Wrong – your work benefits
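The symmetry above (an assumption coming true helps the work, a risk coming true hurts it) can be captured in a few lines. A minimal sketch, not from the training materials; the function name is made up:

```python
# Assumptions and risks are mirror images: an assumption that turns out TRUE
# benefits the work, while a risk that turns out TRUE makes it suffer.

def effect(kind, came_true):
    """Return whether the work 'benefits' or 'suffers'. Illustrative only."""
    if kind == "assumption":
        return "benefits" if came_true else "suffers"
    if kind == "risk":
        return "suffers" if came_true else "benefits"
    raise ValueError(f"unknown kind: {kind}")

print(effect("assumption", True))  # benefits: the pupils' book language was appropriate
print(effect("risk", True))        # suffers: the boat/car did break down
```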

Page 11: Monitoring and evaluation (Part 2)

STRUCTURE OF THE LOGFRAME – Column 4

Assumptions and risks (the external factors which you cannot control):
- How can we verify these?
- How can we really know?
- Where can we find these indicators?

Page 12: Monitoring and evaluation (Part 2)

CARRY OUT A LOGICAL TEST

Focus on Columns 1 and 4:
- Begin with the 'Activities' at the bottom-left corner of the matrix
- Follow the IF–THEN arrows upward
- Overcome logical flaws: when there are 'unlikely/unfeasible' relationships, adjust the Logframe
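The bottom-up IF–THEN walk can also be sketched in code. This is an illustrative aid only, assuming the standard four-level hierarchy from the matrix; it simply generates the chain of hypotheses a reviewer would test:

```python
# Walk the objective hierarchy bottom-up and emit each IF-THEN hypothesis
# of the vertical-logic test. Level names follow the logframe matrix.

def if_then_chain(levels):
    """Yield one IF-THEN step per adjacent pair, from bottom to top."""
    for lower, upper in zip(levels, levels[1:]):
        yield (f"IF {lower} are achieved (and the assumptions hold), "
               f"THEN {upper} should follow")

hierarchy = ["Activities", "Outputs", "Purpose (Outcomes)", "Goal (Impact)"]
for step in if_then_chain(hierarchy):
    print(step)
```

Each printed step is one relationship to judge: if it reads as unlikely or unfeasible, that is the point at which to adjust the Logframe.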

Page 13: Monitoring and evaluation (Part 2)

SAMPLE MATRIX

(Page 7 Logframe Format - handout)

Page 14: Monitoring and evaluation (Part 2)

WHY IS THE LOGFRAME IMPORTANT?

1. It combines important ideas from different areas (the scientific method, management, strategic planning, systems thinking) into project planning and design.
2. It bridges the gap between strategic planning and actionable project designs: it makes the links (between goals, activities, etc.) clear.

Page 15: Monitoring and evaluation (Part 2)

THE LOGFRAME ANSWERS FOUR (4) IMPORTANT MANAGEMENT QUESTIONS

1. What are we trying to accomplish and why?
2. What other conditions must exist?
3. How will we measure success?
4. How do we get there?

Page 16: Monitoring and evaluation (Part 2)

YOUR LOGFRAME

What does it do? It provides a clear, concise and meaningful (logical) description of the 'bigger picture' of your work and of how you and others are working together to achieve a common goal.

Page 17: Monitoring and evaluation (Part 2)

LOGFRAME – A REFERENCE FOR M&E

Column 1 – Description of objective level:
- SMART objectives
- Check the links and logical flow, from activities to goals

Column 2 – Indicators:
- Know what to look for: data and evidence that are objectively verifiable
- Data collection instruments

Column 3 – Means of verification:
- Know where to look for indicators
- Informs you about how to look for indicators, as a basis for data collection

Column 4 – Assumptions and risks:
- Makes us aware of the extent to which our work depends on external factors
- Carefully monitor assumptions and risks
- Adjust activities and inputs to maximize the chances of success