USAID’s Experience and Lessons Learned in Approaches Used in Monitoring and Evaluating Capacity Building Activities
Duane Muller, USAID, November 7, 2008
UNFCCC Meeting on Experiences with Performance Indicators for Monitoring and Evaluation of Capacity Building, Rio de Janeiro, Brazil


TRANSCRIPT

Page 1

USAID’s Experience and Lessons Learned in Approaches Used in Monitoring and Evaluating Capacity Building Activities

Duane Muller, USAID
November 7, 2008

UNFCCC Meeting on Experiences with Performance Indicators for Monitoring and Evaluation of Capacity Building

Rio de Janeiro, Brazil

Page 2

USG COMMITMENT TO CAPACITY BUILDING

• Integral to development programs

• Country-driven approach

• Useful lessons at the project level

Page 3

Managing for Results
USAID’s Experiences

Page 4

PERFORMANCE MANAGEMENT

Page 5

PERFORMANCE MANAGEMENT

The systematic process of:

• Monitoring the results of activities
• Collecting and analyzing performance information
• Evaluating program performance
• Using performance information
• Communicating results

Page 6

STEPS IN DEVELOPING A PERFORMANCE MANAGEMENT PLAN (PMP)

Step 1: Review results statements
Step 2: Develop performance indicators
Step 3: Identify data source and collection method
Step 4: Collect baseline data and verify quality
Step 5: Establish performance targets
Step 6: Plan for other assessing and learning elements
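
A minimal sketch, assuming a Python representation, of how a team might track these six steps as an ordered checklist. The PMPTracker class and its method names are illustrative assumptions, not USAID tooling.

```python
from dataclasses import dataclass, field
from typing import Optional

# The six PMP development steps from this slide, in order.
PMP_STEPS = [
    "Review results statements",
    "Develop performance indicators",
    "Identify data source and collection method",
    "Collect baseline data and verify quality",
    "Establish performance targets",
    "Plan for other assessing and learning elements",
]

@dataclass
class PMPTracker:
    """Hypothetical tracker for PMP preparation progress."""
    completed: set = field(default_factory=set)

    def complete(self, step: int) -> None:
        """Mark a step (1-6) as done."""
        if not 1 <= step <= len(PMP_STEPS):
            raise ValueError(f"step must be between 1 and {len(PMP_STEPS)}")
        self.completed.add(step)

    def next_step(self) -> Optional[str]:
        """Return the first unfinished step, respecting the order above."""
        for i, name in enumerate(PMP_STEPS, start=1):
            if i not in self.completed:
                return f"Step {i}: {name}"
        return None

tracker = PMPTracker()
tracker.complete(1)
print(tracker.next_step())  # Step 2: Develop performance indicators
```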

Page 7

MANAGING FOR RESULTS

• Performance Indicators
– Scale or dimension

• Standard Indicators
– Combination of output and outcome indicators
– Measure direct, intended results

• Custom Indicators
– Meaningful outcome measures
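
A rough illustration of how the indicator categories above might be encoded; the class names and fields are my own assumptions for the sketch.

```python
from dataclasses import dataclass
from enum import Enum

class IndicatorKind(Enum):
    STANDARD = "standard"  # combines output and outcome; direct, intended results
    CUSTOM = "custom"      # meaningful outcome measures

@dataclass
class PerformanceIndicator:
    name: str
    kind: IndicatorKind
    scale: str             # the scale or dimension being measured
    measures_output: bool
    measures_outcome: bool

# Hypothetical example indicator, for illustration only.
trained = PerformanceIndicator(
    name="Officials trained in GHG inventory methods",
    kind=IndicatorKind.STANDARD,
    scale="people (count)",
    measures_output=True,
    measures_outcome=False,
)
print(trained.kind.value)  # standard
```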

Page 8

CHARACTERISTICS OF GOOD INDICATORS

• Direct measures

• Objective

• Plausible attribution

• Practical

• Disaggregated

• Quantitative
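
A minimal sketch, assuming the six characteristics are recorded as yes/no judgments, of how a candidate indicator could be screened against this checklist. The criteria keys and function are illustrative, not USAID guidance.

```python
# The six characteristics from this slide, as checklist keys.
GOOD_INDICATOR_CRITERIA = [
    "direct",         # measures the result itself, not a proxy
    "objective",      # unambiguous about what is measured
    "attributable",   # plausibly linked to the activity
    "practical",      # data obtainable at reasonable cost and effort
    "disaggregated",  # e.g. by sex, region, or beneficiary group
    "quantitative",   # numeric where possible
]

def screen_indicator(judgments: dict) -> list:
    """Return the criteria a proposed indicator fails to meet."""
    return [c for c in GOOD_INDICATOR_CRITERIA if not judgments.get(c, False)]

# Example: judged direct and objective, the rest not yet established.
print(screen_indicator({"direct": True, "objective": True}))
# ['attributable', 'practical', 'disaggregated', 'quantitative']
```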

Page 9

Monitoring & Evaluation

Different but complementary roles at USAID

Page 10

MONITORING AND EVALUATION

MONITORING

• Clarify program objectives

• Link project activities to their resources/objectives

• Translate into measurable indicators/set targets

• Collect data on indicators

• Report on progress

EVALUATION

• Analyzes why and how intended results were/were not achieved

• Assesses contributions of activities to results

• Examines results not easily measured

• Explores unintended results

• Provides lessons learned/recommendations

Page 11

EXPERIENCES WITH MONITORING
USAID’s Lessons Learned

Page 12

8-STEP PROCESS TO COLLECT MONITORING DATA

1) Indicators/Definitions

2) Data source

3) Method of data collection

4) Frequency of data collection

5) Responsibilities for acquiring data

6) Data analysis plans

7) Plans for evaluations

8) Plans for reporting/using performance information
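
One way to picture these eight elements is as a per-indicator record; the field names and example values below are assumptions for illustration, not USAID formats.

```python
from dataclasses import dataclass

@dataclass
class MonitoringPlanEntry:
    """Hypothetical record holding the eight plan elements for one indicator."""
    indicator: str          # 1) the indicator itself
    definition: str         #    and its precise definition
    data_source: str        # 2) where the data come from
    collection_method: str  # 3) how the data are gathered
    frequency: str          # 4) how often data are collected
    responsible_party: str  # 5) who acquires the data
    analysis_plan: str      # 6) how the data will be analyzed
    evaluation_plan: str    # 7) planned evaluations
    reporting_plan: str     # 8) how performance information is reported and used

entry = MonitoringPlanEntry(
    indicator="Officials trained in GHG inventory methods",
    definition="Count of officials completing the full course",
    data_source="Training attendance records",
    collection_method="Partner-submitted rosters",
    frequency="quarterly",
    responsible_party="Implementing partner M&E officer",
    analysis_plan="Trend against annual target, disaggregated by sex",
    evaluation_plan="Mid-term participatory evaluation",
    reporting_plan="Annual performance report",
)
```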

Page 13

EXPERIENCES WITH EVALUATION
USAID’s Lessons Learned

Page 14

EVALUATION = POWERFUL LEARNING TOOL

• Identifies lessons learned

• Improves quality of capacity building efforts

• Critical to understanding performance

• Retrospective

Page 15

ANALYTICAL SIDE OF PROJECT MANAGEMENT

• Analyzes why and how intended results were/were not achieved

• Assesses contributions of activities to results

• Examines results not easily measured

• Explores unintended results

• Provides lessons learned/recommendations

Page 16

TYPES OF EVALUATION USED BY USAID

Page 17

TRADITIONAL EVALUATION

• Donor focus and ownership of evaluation

• Stakeholders often don’t participate

• Focus is on accountability

• Predetermined design

• Formal evaluation methods

• Independent/third party evaluators

Page 18

PARTICIPATORY EVALUATION

• Participant focus and ownership

• Broad range of stakeholders participate

• Design is flexible

• Focus on learning

Page 19

ASSESSMENTS

• Quick and flexible

• Trends and dynamics

• Broader than evaluations

Page 20

METHODOLOGIES FOR EVALUATIONS

• Scope of Work (SOW)
• Interviews
• Documentation reviews
• Field visits
• Key informant interviews
• Focus group interviews
• Community group interviews
• Direct observation
• Mini surveys
• Case studies
• Village imaging

Page 21

SUCCESSFUL EVALUATIONS = LESSONS LEARNED

• Making the decision to evaluate

• Ensuring the Scope of Work is well thought out

• Finding the appropriate team

• Ensuring the results are used

Page 22

PROGRAM ASSESSMENT RATING TOOL (PART)

Page 23

PART: Goals, Procedures and Results

• Reviewing performance of US government programs
– Program purpose and design
– Strategic planning
– Program management
– Results

• Standard questionnaire called PART

• Results in an assessment and plan for improvement

Page 24

PART RATINGS

• Performing
– Effective
– Moderately effective
– Adequate

• Not Performing
– Ineffective
– Results not demonstrated
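
A small sketch of this rating scale in code; the enum and its performing property reflect only what this slide lists and are not an OMB artifact.

```python
from enum import Enum

class PartRating(Enum):
    EFFECTIVE = "Effective"
    MODERATELY_EFFECTIVE = "Moderately effective"
    ADEQUATE = "Adequate"
    INEFFECTIVE = "Ineffective"
    RESULTS_NOT_DEMONSTRATED = "Results not demonstrated"

    @property
    def performing(self) -> bool:
        """The top three ratings count as 'Performing' on this slide."""
        return self in (
            PartRating.EFFECTIVE,
            PartRating.MODERATELY_EFFECTIVE,
            PartRating.ADEQUATE,
        )

print(PartRating.ADEQUATE.performing)                  # True
print(PartRating.RESULTS_NOT_DEMONSTRATED.performing)  # False
```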

Page 25

Conclusions

• Lessons learned/best practices for M&E

• Project-level experiences
– Cost-effective
– Timely
– Ensure data is used

• National experience = PART

• Country-driven approach to capacity building
– Paris Declaration on Aid Effectiveness

Page 26

ADDITIONAL RESOURCES

Development Experience Clearinghouse
• http://dec.usaid.gov

Performance Management
• A Guide to Developing and Implementing Performance Management Plans
http://www.usaid.gov/policy/ads/200/200sbn.doc

Evaluation Documents
• Preparing an Evaluation Scope of Work
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby207.pdf
• Conducting a Participatory Evaluation
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs539.pdf
• Constructing an Evaluation Report
http://pdf.usaid.gov/pdf_docs/PNADI500.pdf
• Conducting Key Informant Interviews
http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf

PART
• http://www.whitehouse.gov/omb/part/
• http://www.whitehouse.gov/omb/expectmore/

Page 27

For further information:

Duane Muller

USAID

EGAT/ESP/GCC

Tel: 1-202-712-5304

Fax: 1-202-216-3174

Email: [email protected]

Website: www.usaid.gov
Keyword: climate change