USAID’s Experience and Lessons Learned:
Approaches Used in Monitoring and Evaluating Capacity Building Activities
Duane Muller, USAID
November 7, 2008
UNFCCC Meeting on Experiences with Performance Indicators for Monitoring and Evaluation of Capacity Building
Rio de Janeiro, Brazil
USG COMMITMENT TO CAPACITY BUILDING
• Integral to development programs
• Country-driven approach
• Useful lessons at the project level
Managing for Results: USAID’s Experiences
PERFORMANCE MANAGEMENT
The systematic process of:
• Monitoring the results of activities
• Collecting and analyzing performance information
• Evaluating program performance
• Using performance information
• Communicating results
STEPS IN DEVELOPING A PERFORMANCE MANAGEMENT PLAN (PMP)
Step 1: Review results statements
Step 2: Develop performance indicators
Step 3: Identify data source and collection method
Step 4: Collect baseline data and verify quality
Step 5: Establish performance targets
Step 6: Plan for other assessing and learning elements
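For illustration only, the six steps can be read as filling in one record per indicator in the PMP. The short Python sketch below is not a USAID tool; every name in it (PMPIndicator, result_statement, targets, and so on) is hypothetical, chosen only to show how the outputs of Steps 1 through 6 fit together.

from dataclasses import dataclass

@dataclass
class PMPIndicator:
    # Hypothetical record; each field maps to one step of the PMP process.
    result_statement: str        # Step 1: the reviewed result statement
    indicator: str               # Step 2: the performance indicator
    data_source: str             # Step 3: where the data come from
    collection_method: str       # Step 3: how the data are collected
    baseline: float              # Step 4: verified baseline value
    targets: dict                # Step 5: year -> performance target
    learning_notes: str = ""     # Step 6: other assessing/learning elements

example = PMPIndicator(
    result_statement="Increased national capacity to prepare GHG inventories",
    indicator="Number of trained inventory compilers, disaggregated by sex",
    data_source="Training attendance records",
    collection_method="Review of implementing-partner reports",
    baseline=0,
    targets={2009: 20, 2010: 40},
)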
MANAGING FOR RESULTS
• Performance Indicators
– Scale or dimension
• Standard Indicators
– Combination of output and outcome indicators
– Measure direct, intended results
• Custom Indicators
– Meaningful outcome measures
CHARACTERISTICS OF GOOD INDICATORS
• Direct measures
• Objective
• Plausible attribution
• Practical
• Disaggregated
• Quantitative
Monitoring & Evaluation
Different but complementary roles at USAID
MONITORING AND EVALUATION
MONITORING
• Clarify program objectives
• Link project activities to their resources/objectives
• Translate into measurable indicators/set targets
• Collect data on indicators
• Report on progress
EVALUATION
• Analyzes why and how intended results were/were not achieved
• Assesses contributions of activities to results
• Examines results not easily measured
• Explores unintended results
• Provides lessons learned/recommendations
EXPERIENCES WITH MONITORING: USAID’s Lessons Learned
Eight-step process for collecting monitoring data (see the sketch after this list):
1) Indicators/Definitions
2) Data source
3) Method: data collection
4) Frequency: data collection
5) Responsibilities: acquiring data
6) Data analysis plans
7) Plans for evaluations
8) Plans for reporting/using performance information
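As a minimal sketch (hypothetical structure, not a USAID system), the eight elements above can double as a completeness checklist for a draft monitoring plan:

# Hypothetical checklist mirroring the eight elements above.
REQUIRED_ELEMENTS = [
    "indicator_definition", "data_source", "collection_method",
    "collection_frequency", "responsible_party", "analysis_plan",
    "evaluation_plan", "reporting_plan",
]

def missing_elements(plan: dict) -> list:
    # Return the elements a draft monitoring plan still lacks.
    return [e for e in REQUIRED_ELEMENTS if not plan.get(e)]

draft = {
    "indicator_definition": "Number of trained inventory compilers",
    "data_source": "Partner training records",
    "collection_method": "Quarterly report review",
    "collection_frequency": "Quarterly",
    "responsible_party": "Mission M&E officer",
}
print(missing_elements(draft))  # ['analysis_plan', 'evaluation_plan', 'reporting_plan']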
EXPERIENCES WITH EVALUATION: USAID’s Lessons Learned
EVALUATION = POWERFUL LEARNING TOOL
• Identifies lessons learned
• Improves quality of capacity building efforts
• Critical to understanding performance
• Retrospective
ANALYTICAL SIDE OF PROJECT MANAGEMENT
• Analyzes why and how intended results were/were not achieved
• Assesses contributions of activities to results
• Examines results not easily measured
• Explores unintended results
• Provides lessons learned/recommendations
TYPES OF EVALUATION USED BY USAID
TRADITIONAL EVALUATION
• Donor focused and ownership of evaluation
• Stakeholders often don’t participate
• Focus is on accountability
• Predetermined design
• Formal evaluation methods
• Independent/third party evaluators
PARTICIPATORY EVALUATION
• Participant focus and ownership
• Broad range of stakeholders participate
• Design is flexible
• Focus on learning
ASSESSMENTS
• Quick and flexible
• Trends and dynamics
• Broader than evaluations
METHODOLOGIES FOR EVALUATIONS
• Scope of Work (SOW)
• Interviews
• Documentation reviews
• Field visits
• Key informant interviews
• Focus group interviews
• Community group interviews
• Direct observation
• Mini surveys
• Case studies
• Village imaging
SUCCESSFUL EVALUATIONS = LESSONS LEARNED
• Making the decision to evaluate
• Ensuring Scope of Work is well thought-out
• Finding the appropriate team
• Ensuring the results are used
PROGRAM ASSESSMENT RATING TOOL (PART)
PART: Goals, Procedures and Results
• Reviewing performance of US government programs
– Program purpose and design
– Strategic planning
– Program management
– Results
• Standard questionnaire called PART
• Results in an assessment and plan for improvement
PART RATINGS
Performing
– Effective
– Moderately effective
– Adequate
Not Performing
– Ineffective
– Results not demonstrated
CONCLUSIONS
• Lessons learned/best practices for M&E
• Project-level experiences
– Cost effective
– Timely
– Ensure data are used
• National experience = PART
• Country-driven approach to capacity building
– Paris Declaration on Aid Effectiveness
ADDITIONAL RESOURCES
Development Experience Clearinghouse
• http://dec.usaid.gov
Performance Management
• A Guide to Developing and Implementing Performance Management Plans: http://www.usaid.gov/policy/ads/200/200sbn.doc
Evaluation Documents
• Preparing an Evaluation Scope of Work: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnaby207.pdf
• Conducting a Participatory Evaluation: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs539.pdf
• Constructing an Evaluation Report: http://pdf.usaid.gov/pdf_docs/PNADI500.pdf
• Conducting Key Informant Interviews: http://www.usaid.gov/pubs/usaid_eval/pdf_docs/pnabs541.pdf
PART
• http://www.whitehouse.gov/omb/part/
• http://www.whitehouse.gov/omb/expectmore/
For further information:
Duane Muller
USAID
EGAT/ESP/GCC
Tel 1-202-712-5304
Fax 1-202-216-3174
Email: [email protected]
Website: www.usaid.gov
Keyword: climate change