
Page 1: Results-based Management.pdf

RBM Page 1

Results-based Management

What is results-based management (RBM)?

RBM is a program/project life-cycle approach to management that integrates strategy, people,

resources, processes and measurements to improve decision-making, transparency, and

accountability. The approach focuses on achieving outcomes, implementing performance

measurement, learning, and adapting, as well as reporting performance.

How is RBM used at CIDA?

RBM has a long history at CIDA and has been used, in one form or another, for over 30 years.

CIDA's first official RBM policy was released in 1996 and a revised and updated policy was approved

in 2008.

RBM is not a tool, but rather a way of working that looks beyond activities and outputs to focus on

actual results: the outcomes of projects and programs.

CIDA uses RBM to effectively and efficiently manage Canada's international development aid.

The Performance Management Division of CIDA's Strategic Policy and Performance Branch supports the

Agency in the sound practice of results-based management.

For further information, please contact Performance Management Division.

RBM - Key Policy Documents: CIDA's 2008 RBM Policy and supporting documents

RBM - Guides: For understanding RBM at CIDA - from theory to practice

RBM - Templates: For working with CIDA's three main RBM tools: the logic model (LM), performance

measurement framework (PMF), and risk register.

RBM - Related Sites

Results-Based Management Tools at CIDA: A How-to Guide

Instructions for the use of CIDA's three main results-based management working tools:

the logic model, performance measurement framework, and risk register

Introduction

Using CIDA's results-based management tools

The logic model

The performance measurement framework

Risk analysis

Introduction

What is results-based management (RBM)?

Results-based management (RBM) is a life-cycle approach to management that integrates strategy,

people, resources, processes, and measurements to improve decision making, transparency, and

accountability. The approach focuses on achieving outcomes, implementing performance

measurement, learning, and adapting, as well as reporting performance.

RBM is:

defining realistic expected results based on appropriate analysis;

clearly identifying program beneficiaries and designing programs to meet their needs;

monitoring progress toward results and resources consumed with the use of

appropriate indicators;

identifying and managing risk while bearing in mind the expected results and necessary

resources;

increasing knowledge by learning lessons and integrating them into decisions; and

reporting on the results achieved and resources involved.

Why results-based management?

Historically, government departments and implementing organizations (IOs) focused their

attention on inputs (what they spent), activities (what they did), and outputs (what they produced).

Although accurate information at this level is important, they discovered it did not tell them whether

or not they were making progress toward solving the problem they had set out to resolve, and that

the problems often remained once projects were completed.

Modern management requires that we look beyond activities and outputs to focus on actual results:

the changes created, and contributed to, by our programming. By establishing clearly defined

expected results, collecting information to assess progress toward them on a regular basis, and

taking timely corrective action, practitioners can manage their projects or investments in order to

maximize achievement of development results: a sustained improvement in the lives of people in

developing countries.

How is results-based management used at CIDA?

RBM has a long history at CIDA: it has been used, in one form or another, for more than thirty

years. CIDA's first official RBM policy was released in 1996, and a revised and updated policy was

approved in June 2008.

CIDA uses RBM to better manage its international development programming from start

(investment or project analysis, planning, design, implementation, monitoring, adjusting, and

reporting) to finish (final evaluations and reports, and integrating lessons learned into future

programming).

CIDA has developed three main RBM working tools to make managing for results throughout the

entire life cycle of an investment or project easier for CIDA staff, partners, and executing agencies:

the logic model (LM), performance measurement framework (PMF), and risk register. As the

name implies, these tools are meant to be flexible working documents that remain evergreen

throughout the life cycle of the investment, and can be adjusted and modified under certain

circumstances. The LM and PMF are usually at least partially completed during the planning and

design stages of an investment, and refined during the development of an implementation plan

(this will vary depending on the type of programming in question). The risk register is completed

during project design, and updated on a regular basis.

Using CIDA's results-based management tools

Participatory approach

Developing the LM, PMF, and risk register in a participatory fashion is an integral part of investment

design and planning. In RBM, investments must be designed, planned, and implemented using a

participatory approach whereby all stakeholders (including beneficiaries) are involved throughout

the investment's life cycle. By using a consensus-building process to define and agree upon the

information in the LM and the PMF, local stakeholders are given a sense of ownership that enhances

subsequent commitment throughout the investment and beyond.

The logic model

What is a logic model?

Sometimes also called a "results chain," an LM is a depiction of the causal or logical relationships

between inputs, activities, outputs, and the outcomes of a given policy, program or investment.

The LM is divided into six levels: inputs, activities, outputs, immediate outcomes,

intermediate outcomes, and ultimate outcome. Each of these represents a distinct step in the

causal logic of a policy, program, or investment. The bottom three levels (inputs, activities, and

outputs) address the how of an investment, whereas the top three levels (the various outcomes)

constitute the actual changes that take place: the development results.

Logic Model with definitions

Standard template - logic model (LM)

CIDA has a standard template for the logic model.

Please note that CIDA's LM template does not include inputs; rather, it starts at the activity level.

To complete a logic model template, you need to write clear and concise result statements.

Drafting or assessing result statements

What is a result?

A result is a describable or measurable change that is derived from a cause-and-effect relationship.

At CIDA, results are the same as outcomes, and are further qualified as immediate (short term),

intermediate (medium term), or ultimate (long term).

What is a result statement?

A result statement outlines what a policy, program, or investment is expected to achieve or

contribute to. It describes the change stemming from CIDA's contribution to a development activity

in cooperation with others.

Outcomes = result statements

Hint # 1: Questions to ask yourself when drafting or assessing a result statement:

Is the statement simply worded, and does it contain only one idea? The logic model is a

snapshot of your investment, and the result statements should be clearly stated and easy to

understand. Would the Canadian public be able to understand this result statement? Does the

statement contain more than one idea? If so, can it be split into separate statements?

Was the result statement drafted in an inclusive, participatory fashion? RBM is a

participatory process. The process and methodology for the selection of outcomes and drafting

of result statements should be as participatory as possible, involving a wide representation of

key stakeholders. It is essential to ensure that all the voices are heard and that your expected

outcomes are shared with all involved. Were key stakeholders, including analysts and specialists

(working in gender equality and environment), partners, and implementers involved? Make sure

that the IO has mechanisms in place to ensure that leaders, decision makers, women and men, minorities, and direct beneficiaries are involved.

A. Stakeholder involvement

Has a stakeholder analysis been done?

Has adequate consultation been performed?

Is there participation of both male and female stakeholders?

Are there mechanisms for participation in the design and decision making throughout the life cycle of the investment?

B. Gender analysis

Are the results truly gender sensitive?

Do they address the concerns, priorities, and needs of women and men, and girls and boys?

C. Environmental analysis

Have you taken environmental implications into consideration?

Will results be sustainable?

Does the result statement include an adjective and tell you:

WHAT? Does the result statement describe the type of change expected using an adjective

that is drawn from a verb and that indicates direction (e.g. increased, improved,

strengthened, reduced, enhanced)?

WHO? Does the result statement specify the target population or beneficiary of the

intervention? Does it specify the unit of change (e.g. individual, organization, group)?

WHERE? Is the location or site where the result will occur specified?

Can the result be measured? Can the result be measured by either quantitative or qualitative

performance indicators? Can performance indicators that will measure the result be easily

found, collected, and analyzed?

Is the result realistic and achievable? Is the result within the scope of the project's control or

sphere of influence? Is there an adequate balance between the time, resources allocated, and

expected reach and depth of change expected? Are the results at the immediate and

intermediate levels achievable within the funding levels and time period for the project? Is the

result (immediate and intermediate outcome levels) achievable during the life cycle of the

investment? In other words, can the expected changes (immediate and intermediate outcome

levels) be realistically achieved by the end of the intervention?

Is the result relevant? Does the result reflect country ownership and needs, and will it support

higher-level developmental change in the strategies or programs it supports? Is the result

aligned to the country partner's national development strategy? Does the result reflect needs

and priorities among the beneficiaries that were identified in a participatory fashion? Does the

result take into account the culture of the local population? Is the result aligned to CIDA program and corporate priorities?

Examples of weak and strong result statements

Not strong: "Increased literacy." Issue: does not identify for whom or where the expected change will occur.

Strong: "Increased literacy among men and women in region X of country Y."

Not strong: "More women can get maternal health-care services." Issue: does not use an adjective drawn from a verb that clearly indicates direction of change, and does not identify where the expected change will occur.

Strong: "Improved access to maternal health-care services for women in country X."

Not strong: "Peace in country X." Issue: does not specify the direction of expected change, nor whom, specifically, it will affect; not achievable.

Strong: "Increased stability in country X."
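The WHAT check described above lends itself to a quick illustration. The helper below is purely hypothetical (not a CIDA tool): it only screens a result statement for a direction word drawn from a verb, leaving the WHO and WHERE checks to human judgment.

```python
# Hypothetical helper, not a CIDA tool: flag result statements that lack
# a change-direction adjective (the WHAT check described above).
DIRECTION_WORDS = {"increased", "improved", "strengthened", "reduced", "enhanced"}

def has_direction(statement: str) -> bool:
    """Return True if the statement contains a recognized direction word."""
    words = statement.lower().replace(",", " ").split()
    return any(word in DIRECTION_WORDS for word in words)

print(has_direction("Increased literacy among men and women in region X"))  # True
print(has_direction("Peace in country X"))  # False
```

A screen like this could support, but never replace, the participatory review of result statements described above.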

Developing a logic model

Here are the steps that need to be taken to create an LM. The order in which they are undertaken

will depend on the status, scope, and size of the investment.

Identify ultimate beneficiaries, intermediaries, and stakeholders.

Hint #2: Useful definitions

Beneficiary: The set of individuals and/or organizations that experience the change of state at

the ultimate outcome level of an LM, although they could also be targeted at the immediate and

intermediate outcome levels. Also referred to as "reach" or "target population." (Treasury Board

Secretariat lexicon)

Intermediary: An individual, group, institution, or government that is not the ultimate

beneficiary of an investment but that is the target of select activities that will lead, via the

associated immediate and intermediate outcomes, to a change in state (ultimate outcome) for

the ultimate beneficiaries.

Stakeholder: An individual, group, institution, or government with an interest or concern,

either economic, societal, or environmental, in a particular measure, proposal, or event.

(Termium Plus)

Partner: The individuals and/or organizations that collaborate to achieve mutually agreed upon

expected results. (OECD-DAC Glossary of Key Terms in Evaluation and Results Based

Management, with slight modification)

Implementing organization: An IO is any organization or agency, whether governmental,

non-governmental, intergovernmental, specialized, multilateral, or in the private sector, that implements an investment (project or program) for which CIDA provides funding.

Ensure that the right people (e.g. development officer; specialists in environment, governance, and

gender equality; executing agency representative; local stakeholders; and beneficiaries) are at the

table. Remember that this is a participatory exercise. This can be done via brainstorming, focus

groups, meetings, consultative emails, etc. Note that the "right people" may vary depending on the

type of programming. For directive programming, ensure that country-partner organizations,

beneficiaries, and stakeholders (including women, men, and children) are at the table during the

design/development of the LM. For responsive programming, ensure that the right CIDA team is at

the table during the review and assessment of the LM. The review team should include the

development officer or project team lead; branch specialists in environment, governance, and

gender equality; other corresponding sector specialists; and performance management advisors.

You should also validate, as part of your due diligence, that the proponent developed the LM

through a participatory approach.

Identify the ultimate outcome. Start by identifying the problem your investment intends to address.

The ultimate outcome of an investment is its raison d'être: the highest level of change we want to

see to solve that problem. Make sure to analyze the context (cultural, socio-political, economic, and

environmental) surrounding the problem.

Example:

Problem: Poor health among inhabitants of region Y of country X due to water-borne illness.

The ultimate outcome is the highest level of change that can be achieved, a change of state for

the target population.

Ultimate outcome: Improved health among people living in region Y of country X.

Identify main activities (for both CIDA and partners). Brainstorm the main or key activities of

the investment, making sure to address contributing contextual factors. If possible, group

activities into broad categories or work packages to avoid duplication.

Identify outputs for each activity package.

Hint #3: Make sure activity statements begin with a verb in the imperative form and that

outputs are written as completed actions. Outputs are usually things you can count.

Example:

To achieve the ultimate outcome of "Improved health among people living in region Y of country

X," stakeholders in country X (e.g. ministry of health, regional health authority, local community

organizations), our IO and CIDA staff have decided to concentrate on three groups of activities:

building wells, offering training in well maintenance, and rehabilitating and staffing regional

health centres.

Activities:

Build wells in region Y

Develop and deliver training on well maintenance to people living in region Y

Rehabilitate and staff regional health centres in region Y

Outputs:

Wells built in region Y

Training on well maintenance developed and delivered to people living in region Y

Regional health centres in region Y rehabilitated and staffed

Identify logical outcomes for the immediate and intermediate levels.

Hint #4: A logic model is like a pyramid: it gets smaller the closer you move toward the highest

level. Three or four changes at the immediate level (changes in access, ability, awareness) may

lead to only two changes at the intermediate level (practice, behaviour). Similarly, two changes

at the intermediate level will lead to only one change at the ultimate level (change in state). The

logic model template is flexible: it will allow you to change the number of boxes at each level to

reflect the logic of your investment. Make sure the number of outcomes decreases as you move

upward toward the ultimate outcome. Try also to have only one outcome per box.

Example:

Immediate level results flow logically from the activities and outputs. They represent the change

brought about by the existence of goods and/or services created through the activities. Thus,

the provision of wells = increased access to clean water. Intermediate-level results represent a

change in behaviour. They are the next logical step from the immediate level and lead logically

to the ultimate outcome.

Immediate outcomes (a change in access, ability, or skills):

Increased access to clean drinking water for people living in region Y.

Increased ability to maintain wells among people living in region Y.

Increased access to health services for people living in region Y.

Intermediate outcomes (a change in behaviour or practice):

Increased use of clean drinking water by people living in region Y.

Increased use of health services by people living in region Y.

Identify linkages. Check back and forth through the levels (from activities to ultimate outcome

and from ultimate outcome to activities) to make sure everything flows in a logical manner.

Make sure there is nothing in your outcomes that you do not have an activity to support.

Similarly, make sure that all your activities contribute to the outcomes listed.

Validate with stakeholders/partners. Share your draft logic model with your colleagues,

specialists, stakeholders, and partners, etc. to ensure that the outcomes meet their needs and

that the investment will actually work the way you have envisioned it.

Where required, write the narrative text to illustrate linkages and explain the causality of the

logic model. The narrative should speak to the arrows in the logic model: the causal relationship

between the levels and how we see the activities proposed leading to the expected changes.

The most compelling narratives are those that are succinct and use brief, concrete, evidence-based examples to support these explanations.

Note: Targets, although necessary for the establishment of a budget, are not displayed in the LM;

rather, they appear in the PMF. These will be discussed in further detail in the PMF section.

Example:

Ultimate outcome:
Improved health among people living in region Y of country X.

Intermediate outcomes:
Increased use of clean drinking water by people living in region Y.
Increased use of health services by people living in region Y.

Immediate outcomes:
Increased access to clean drinking water for people living in region Y.
Increased ability to maintain wells among people living in region Y.
Increased access to health services for people living in region Y.

Outputs:
Wells built in region Y.
Training on well maintenance developed and delivered to people living in region Y.
Regional health centres in region Y rehabilitated and staffed.

Activities:
Build wells in region Y.
Develop and deliver training on well maintenance to people living in region Y.
Rehabilitate and staff regional health centres in region Y.
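The example model can also be sketched as a simple data structure. This is an illustration only (the dictionary layout and the `is_pyramid` check are ours, not a CIDA template): it encodes the example's levels and verifies the pyramid shape described in Hint #4.

```python
# The example logic model as a plain dictionary; keys mirror CIDA's LM levels
# (inputs are omitted, as in CIDA's template). Illustrative only.
logic_model = {
    "ultimate_outcome": [
        "Improved health among people living in region Y of country X.",
    ],
    "intermediate_outcomes": [
        "Increased use of clean drinking water by people living in region Y.",
        "Increased use of health services by people living in region Y.",
    ],
    "immediate_outcomes": [
        "Increased access to clean drinking water for people living in region Y.",
        "Increased ability to maintain wells among people living in region Y.",
        "Increased access to health services for people living in region Y.",
    ],
    "outputs": [
        "Wells built in region Y.",
        "Training on well maintenance developed and delivered to people living in region Y.",
        "Regional health centres in region Y rehabilitated and staffed.",
    ],
    "activities": [
        "Build wells in region Y.",
        "Develop and deliver training on well maintenance to people living in region Y.",
        "Rehabilitate and staff regional health centres in region Y.",
    ],
}

def is_pyramid(lm: dict) -> bool:
    """Hint #4 check: outcome counts must not increase moving upward."""
    counts = [
        len(lm["immediate_outcomes"]),
        len(lm["intermediate_outcomes"]),
        len(lm["ultimate_outcome"]),
    ]
    return all(lower >= upper for lower, upper in zip(counts, counts[1:]))

print(is_pyramid(logic_model))  # True: 3 outcomes narrow to 2, then to 1
```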

The performance measurement framework

What is performance measurement?

Measuring performance is a vital component of the RBM approach. It is important to establish a

structured plan for the collection and analysis of performance information. At CIDA, as in other

departments and agencies of the Government of Canada, the PMF is the RBM tool used for this

purpose. Use of the PMF is not limited to the Government of Canada, however: other organizations

and donors use similar tools to plan the collection and analysis of performance information for their

programming as well.

Why performance measurement?

Performance measurement is undertaken on a continuous basis during the implementation of

investments so as to empower managers and stakeholders with "real time" information (e.g. use of

resources, extent of reach, and progress toward the achievement of outputs and outcomes). This

helps identify strengths, weaknesses, and problems as they occur, and enables project managers to

take timely corrective action during the investment's life cycle. This in turn increases the chance of

achieving the expected outcomes.

What is a performance measurement framework?

A performance measurement framework is a plan to systematically collect relevant data over the

lifetime of an investment to assess and demonstrate progress made in achieving expected results.

It documents the major elements of the monitoring system and ensures that performance

information is collected on a regular basis. It also contains information on baseline, targets, and the

responsibility for data collection. As with the LM, the PMF should be developed and/or assessed in a

participatory fashion with the inclusion of local partners, beneficiaries, stakeholders, etc.

Standard template - performance measurement framework (PMF)

CIDA has a standard performance measurement framework (PMF) template.

The PMF is divided into eight columns: expected results, indicators, baseline data, targets,

data sources, data-collection methods, frequency, and responsibility. To complete a PMF,

you will need to fill in each of the columns accurately.
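To make the column structure concrete, one PMF row can be sketched as a record. The field names below are ours, not an official CIDA schema, and the values are drawn from the running well-building example.

```python
from dataclasses import dataclass

# One PMF row sketched as a record; the eight fields mirror the eight
# columns named above. Field names are illustrative, not an official schema.
@dataclass
class PMFRow:
    expected_result: str
    indicator: str
    baseline: str
    target: str
    data_source: str
    collection_method: str
    frequency: str
    responsibility: str

row = PMFRow(
    expected_result="Increased access to clean drinking water for people living in region Y.",
    indicator="Percentage of households using drinking water drawn from a clean source",
    baseline="5 percent of households (baseline study)",
    target="65 percent of households by end of initiative",
    data_source="Household survey; regional health authority records",
    collection_method="Structured survey of women and men in region Y",
    frequency="Annually",
    responsibility="Implementing organization",
)
print(row.indicator)
```

In practice there would be one such row per expected result, grouped by level (outputs, immediate, intermediate, and ultimate outcomes), as the template's rows suggest.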

Definitions

Expected results column

The expected results column is divided into four rows: outputs, immediate outcomes, intermediate

outcomes, and ultimate outcome. To complete this column, simply copy and paste the result

statements from your LM into the appropriate row.

What are performance indicators?

The performance indicators are what you will use to measure your actual results. A performance

indicator is a quantitative or qualitative unit of measurement that specifies what is to be measured

along a scale or dimension. However, it is neutral: it neither indicates a direction of change nor

embeds a target. It is important that the stakeholders agree beforehand on the indicators that will

be used to measure the performance of the investment.

Quantitative performance indicators are discrete measures such as a number, frequency,

percentage, or ratio (e.g. number of human rights violations, ratio of women to men in

decision-making positions in the government).

Qualitative performance indicators are measures of an individual or group's judgement and/or

perception of congruence with established standards, the presence or absence of specific conditions,

the quality of something, or the opinion about something (e.g. the client's opinion of the timeliness

of service). Qualitative indicators can be expressed concretely when used to report on achievement

of results. They should convey specific information that shows progress towards results, and be

useful for project management and planning.

Example:

Our investment has, as one of its immediate outcomes, "Increased ability to maintain wells among

people living in region Y." Through consultation, it was decided that this would be measured by

tracking "confidence of women and men who took training in their ability to maintain wells." The

pre-training survey of women and men participating in the training showed that 3 percent felt that

they were capable of maintaining wells. A survey conducted directly after training showed that 80

percent of participants felt that they were capable of maintaining the wells. As well, a follow-up

survey at the midpoint of the investment showed that 75 percent of women and men who received

training still felt that they were capable of maintaining the wells in their communities.

Hint #5: The criteria of a strong performance indicator are as follows:

Validity: Does the performance indicator actually measure the result?

Your result statement is: "Increased use of clean drinking water by people living in community X." A

valid performance indicator would be "percentage of households using drinking water drawn from

a clean source." An invalid performance indicator would be something like "number of wells in

community X," because although this performance indicator would measure the availability of clean

water (i.e., wells), it would not actually tell us whether people were using them and whether the

number of people using them had increased. This would not actually measure your result.

Reliability: Is the performance indicator a consistent measure over time?

Your result statement is: "Increased access to health services for people living in region Y." A

reliable performance indicator would be "percentage of population living within a two-hour walk of a

health clinic." An unreliable performance indicator could be "gross mortality rate for region Y of

country X." This performance indicator would not be reliable because it may not change consistently

along with the result: a change in access to health care is not a change in usage, and thus may not

be reflected in a change in mortality rates. Similarly, mortality rates can be affected by external and

unpredictable circumstances (e.g. drought, natural disaster) that can change independently of the

result.

Sensitivity: When the result changes, will the performance indicator be sensitive to those

changes?

The example above for reliability also applies here. The performance indicator "gross mortality rate

of region Y of country X" may not always be sensitive to a change in the availability of health care,

or may be sensitive to other factors that are not directly linked to the result. "Percentage of

population living within a two-hour walk of a health clinic," on the other hand, is sensitive to access

to health services, and will change when it changes.

Simplicity: How easy will it be to collect and analyze the data?

An indicator may provide a good measure of the expected outcome but present too many

challenges (such as complexity, technical expertise, local capacity, and shared understanding) for

easy use. If your result statement is: "Increased ability to maintain wells among people living in

region Y," the indicator "number of women and men in region Y who receive a passing grade on a

practical well-maintenance exam" would provide an accurate measure, but would also require a

complex and time-consuming data-collection process. The indicator "Confidence of women and men

who took training in their ability to maintain wells," on the other hand, could be incorporated as

pre- and post-exercises in the training activities, and be collected through a number of methods

such as a written survey for training participants, a verbal response, or a response using the body

(position in the room or height of a raised hand) to indicate level of confidence.

Utility: Will the information be useful for investment management (decision making, learning, and

adjustment)?

If your result statement is: "Increased use of clean water among people living in region Y," various

performance indicators could be used to measure this result. Some may be more useful for

decision-making purposes than others. A performance indicator such as "Percentage of households

using drinking water drawn from a clean source" would provide information that could be used to take

corrective action, if need be (e.g. to make adjustments to the project during implementation to

ensure that the expected results will be achieved), or for planning subsequent phases of the

investment (e.g. do there need to be more wells to facilitate greater use?). A performance indicator

such as "Number of times well used daily" would indeed measure the result, but wouldn't provide a

lot of useful information such as who was using it or how widespread that use was over the

community.

Affordability: Can the program/investment afford to collect the information?

A household-by-household survey of the inhabitants of region Y to ascertain their opinion of the

new wells and the training they received on how to maintain them may provide excellent

performance data on the investment, but may also be too costly for the executing agency to

conduct. Choose performance indicators that provide the best possible measurement of the results

achieved within the budget available, and wherever possible, use existing sources and data-

collection methods. Look for a balance between rigour and realism.

What is baseline data?

Baseline data are the set of conditions existing at the outset of a program or investment: the

quantitative and qualitative data collected to establish a profile. Baseline data are collected at one

point in time, and are used as a point of reference against which results will be measured or

assessed. A baseline is needed for each performance indicator that will be used to measure results

during the investment.

What are targets?

A target specifies a particular value for a performance indicator to be accomplished by a specific

date in the future. It is what the investment would like to achieve within a certain period of time in

relation to one of its expected results. Targets provide tangible and meaningful points of discussion

with beneficiaries, stakeholders, and partners, and allow us to add further specificity to the

outcomes from the LM.

Targets, however, belong only in the PMF; they should not appear in outcomes themselves, for a

number of reasons. First, when targets are included in a result statement, they limit the ability to

report against the achievement of that result by restricting success to an overly narrow window: the

target itself. Reporting, in this context, becomes an exercise of justifying why the target was not

met or was exceeded (both could be seen as poor planning) instead of comparing expected

outcomes to actual outcomes and discussing the variance. In addition, outcomes need to be

measurable statements that capture, with simplicity and specificity, an expected change. The

inclusion of targets makes this impossible: a statement that includes its own measurement cannot

then be assessed or measured.

Hint # 6: Developing strong targets

Targets must be realistic and reviewed regularly.

Beneficiaries and stakeholders should be involved in establishing targets.

Time lines for targets can vary from short to long term (e.g. monthly, midway, end of project).

A strong target consists of a clear statement of desired performance against an expected outcome, and it is developed using an established baseline.

Example:

Indicator: Percentage of households in region Y living within X distance of a well.

Baseline: At the moment, 5 percent of households in region Y live within X distance of a well.

Target: For the first year of the health initiative for region Y of country Z, the target is to have 25

percent of households living within X distance of a well. The target for the end of the initiative is to

have 65 percent of households living within X distance of a well. This target is realistic because it

takes into account the low percentage established during the baseline study and the fact that some

communities in region Y are very remote and potentially difficult to work in.
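As an illustration only (the class and field names below are invented for this sketch, not CIDA terminology), the indicator-baseline-target relationship from the well example above can be expressed as a small data record:

```python
from dataclasses import dataclass

@dataclass
class IndicatorTarget:
    """One performance indicator with its baseline and target (hypothetical structure)."""
    indicator: str
    baseline: float   # value established by the baseline study
    target: float     # desired value by the target date
    target_date: str

# The well-access example: baseline 5 percent, first-year target 25 percent.
well_access = IndicatorTarget(
    indicator="Percentage of households in region Y within X distance of a well",
    baseline=5.0,
    target=25.0,
    target_date="end of year 1",
)

def gap_closed(current: float, t: IndicatorTarget) -> float:
    """Fraction of the baseline-to-target gap closed so far."""
    return (current - t.baseline) / (t.target - t.baseline)
```

Keeping the baseline and target alongside the indicator makes variance reporting straightforward: for instance, a current value of 15 percent means half of the year-one gap has been closed.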

What is a data source?

Data sources are the individuals, organizations, or documents from which data about your

indicators will be obtained. Performance data on some indicators can be found in existing sources,

such as a land registry, appointment logs, tracking sheets, or the reports and studies carried out

annually by actors in the international development community. Other data can be obtained

through indicators tracked by governments and partner organizations, and reported in annual

reports to donors. Finally, CIDA staff and/or partners may need to identify their own sources of data to

track performance against expected results.

The source of performance data is very important to the credibility of reported results. Try to

incorporate data from a variety of sources to validate findings.

Some examples of data sources:

Beneficiaries

Partner organizations (local and international)

Government documents

Government statistical reports

Human Development Report


What is a data-collection method?

Data-collection methods represent how data about indicators is collected. Choosing a data-

collection method depends on the type of indicator and the purpose of the information being

gathered. It also depends on how often this information will be gathered.

Hint # 7: Selecting appropriate data-collection methods:

Determine which data-collection methods best suit the indicators in question.

Use multiple lines of evidence.

Consider the practicality and costs of each method. Weigh the pros and cons of each data-collection method (accuracy, difficulty, reliability, time).

Some examples of data-collection methods:

Observation

Analysis (of records or documents)

Literature review

Survey

Interview

Focus group

Comparative study

Collection of anecdotal evidence

Questionnaire

Pre- and post-intervention survey

Hint # 8: The identification of data-collection methods and data sources can help with the selection

and validation of realistic indicators. Data sources and collection methods should be established in

collaboration with partners, IOs, stakeholders, and evaluation specialists.

What is ''frequency''?

Frequency looks at the timing of data collection: how often will information about each indicator be

collected and/or validated? Will information about a performance indicator be collected regularly

(quarterly or annually) as part of ongoing performance management and reporting, or periodically,

for baseline, midterm, or final evaluations? It is important to note that data on some indicators will

need to be collected early in the investment to establish the baseline.

What is ''responsibility''?

Responsibility looks at who is responsible for collecting and/or validating the data.

In the case of responsive or core programming, the implementing organization (IO) or multilateral

institution (together with its local partner(s) and potentially beneficiaries) has the primary

responsibility for collecting and validating the data, and for using the data to report to CIDA on

project performance. CIDA will use the information received as one of many potential tools to

assess the performance of the organization's investment against the plan (their proposal; LM; PMF;

risk register; project implementation plan, if relevant; and work plans). CIDA may also monitor and

evaluate ongoing performance through exercises such as site visits and the inclusion of

performance among criteria for evaluation in audit activities. Although the IO is responsible for the

management of the investment, CIDA is responsible for ensuring adequate due diligence in the

expenditure of Canadian funds.

In the case of directive programming, CIDA has designed the project in collaboration with the

partner-country government or other partner organization, and is responsible (in a participatory fashion)


for the LM and the PMF. CIDA retains an IO under contract to implement the project. This IO is

responsible for collecting data in accordance with the PMF and for reporting results to CIDA. CIDA

may also engage in independent data collection through monitoring or evaluation activities to

validate the information being provided by the IO. CIDA is accountable for ensuring that the data is

collected and the reporting is undertaken through the project steering committee and in conjunction

with the recipient government partner. This performance information is used to assess overall

progress, evaluate annual work plans, and take corrective action as required.

Some examples of actors responsible for data collection/validation:

Beneficiaries

Local professionals

Partner organizations

Consultants

External monitoring and evaluation specialists

CIDA staff

Steps to complete a Performance Measurement Framework

The development of a PMF starts at the planning-and-design phase. Remember that some elements

of the PMF may be established after or during project implementation (e.g. collection of baseline

data and setting some targets).

1. Ensure that the information for your PMF is developed in a participatory fashion, including

key local stakeholders, partners, beneficiaries, and the appropriate specialists.

2. Cut and paste the ultimate outcome, intermediate outcomes, immediate outcomes, and

outputs from your LM into the appropriate boxes in the PMF template.

3. Establish performance indicators for your expected outcomes and outputs, and enter the

performance indicators for the ultimate, intermediate, and immediate outcomes and the

outputs. Validate and check the quality of your performance indicators. Do they have:

validity, reliability, sensitivity, utility, and affordability?

4. Establish the data sources and data-collection methods for your chosen performance

indicators. Look to include multiple lines of evidence wherever possible to increase the

reliability of your performance data.

5. Fill in the Frequency and Responsibility columns for each performance indicator. Decide

whether information on each performance indicator needs to be collected on an ongoing

basis as part of performance monitoring or periodically through evaluations.

6. Fill in baseline data where possible. If reliable historical data on your performance indicators

exists (in the form of government data, information from a previous phase of the

investment, or information gathered during a needs analysis), then it should be used;

otherwise, you will have to collect a set of baseline data at the first opportunity. If you will

be gathering the data later, indicate this in your PMF with a statement such as: "Baseline

data to be collected at investment inception" or "Data to be provided by IO after

communities identified." If possible, set the date by when this will be completed. This

should be done within the first year.

7. Establish realistic targets for each indicator in relation to the baseline data you have

identified. This sets the expectations for performance over a fixed period of time. Key

targets, based on gaps and priorities identified during initial analysis, are necessary to

establish budgets and allocate resources, and play an important role in project planning and

design. Others may be established later, once a baseline study has been conducted.

Example: Community X, CIDA, and the IO are working together to create a logic model and a PMF

for their health investment.
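To make the seven steps concrete, here is a minimal sketch of a single PMF row as a data record. The field names are hypothetical, chosen to mirror the PMF columns described above; they are not CIDA's template headings:

```python
# One row of a PMF, mirroring the columns discussed above (hypothetical field names).
pmf_row = {
    "expected_result": "Immediate outcome: increased access to clean water in region Y",
    "indicator": "Percentage of households in region Y within X distance of a well",
    "baseline": None,  # step 6: "Baseline data to be collected at investment inception"
    "target": "25% of households by end of year 1; 65% by end of initiative",
    "data_sources": ["government statistical reports", "partner organization records"],
    "collection_methods": ["survey", "analysis of records"],  # multiple lines of evidence
    "frequency": "annually",
    "responsibility": "implementing organization (IO)",
}

# A simple completeness check flags columns still to be filled in later,
# as steps 6 and 7 allow for baselines and some targets.
to_complete = [column for column, value in pmf_row.items() if value is None]
```

Here `to_complete` would list only `"baseline"`, reflecting a PMF that is complete at design except for baseline data to be gathered at inception.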

Risk analysis

What is a risk register?

A risk register lists the most important risks, the results of their analysis, and a summary of risk-

response strategies. Information on the status of the risk is included over a regular reporting


schedule. The risk register should be continuously updated and reviewed throughout the course of a

project.

Integrated risk management

Integrated risk management is a continuous, proactive, and systematic process to understand,

manage, and communicate risk across the organization. Other government departments, other

donors, and private-sector companies use similar frameworks.

Elements of integrated risk management:

1. Development of a corporate risk profile

2. Establishment of an integrated risk-management framework

3. Practicing integrated risk management at all levels

4. Ensuring continuous risk-management learning

Drivers:

Aid effectiveness - Integrated risk management supports a consistent approach to risk

management both vertically and horizontally. CIDA is recognized as working effectively in a

high-risk environment. By providing a common and consistent platform, we can reduce

uncertainty for staff and managers and allow them to better understand and manage their

risks. As a result, they will be in a position to make informed decisions and take responsible

risks where appropriate.

Good management - The Treasury Board Secretariat (TBS), Management Accountability

Framework (MAF), Office of the Auditor General of Canada, and Organisation for Economic

Co-operation and Development Development Assistance Committee (OECD-DAC) all

highlight the importance of ranking and rating risks; identifying accountabilities for

managing, reporting, and monitoring risks; scanning for new risks; and assessing risk

collaboratively with other donors.

The development of an integrated risk management approach is a TBS expectation under

the MAF.

Key objectives:

Integrated risk management helps make more informed decisions in managing risks that are within

our control and will position us to better respond to risks that are beyond our control. Specific

objectives are:

Develop a systematic approach to risk management

Contribute to a risk-aware culture

Propose simpler, more effective practices

Provide an ongoing scan of key risks.

Corporate risk profile:

Producing a corporate risk profile (CRP) is a key step in developing an integrated risk management

approach. At CIDA, the CRP was validated and approved in June 2008, and it takes into account

CIDA's evolving working environment. The process used was iterative and based on a structure of 4

key risk areas and 12 key risks. The CRP is an evergreen document for risk management, and

therefore must undergo annual reviews.

Basic model (adapted from the World Bank; similar to those of AusAID, DFID, and others).


Conclusions:

CIDA understands and manages risk well.

However, the Agency needs consistency, simplification, accountabilities, risk rating and

ranking, and collaboration with other donors.

A continuous learning culture exists.

There is significant staff demand.

Infrastructure is in place and accountable.

Management of risks and due diligence is embedded in CIDA practices.

Useful risk terminology:

Risk refers to the effect of uncertainty on results (ISO 31000).

Impact is the effect of the risk on the achievement of results

Likelihood is the perceived probability of occurrence of an event or circumstance

Risk level is Impact multiplied by Likelihood

Risk response is the plan to manage a risk (by avoiding, reducing, sharing, transferring or

accepting it)

Risk owner is the person who owns the process of coordinating, responding to and gathering

information about the specific risk as opposed to the person who enacts the controls. Stated

otherwise, it is the person or entity with the accountability and authority to resolve a risk incident

(ISO 31000)

Operational risk is the potential impact on CIDA's ability to operate effectively or efficiently

Financial risk is the potential impact on the ability to properly protect public funds

Development risk is the potential impact on the ability to achieve expected development results

Reputation risk is the potential impact arising from a reduction in CIDA's reputation and in

stakeholder confidence in the Agency's ability to fulfill its mandate
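The definitions above imply a simple computation. As a sketch, with both factors scored on the guide's four-point scale (the function name and range check are illustrative, not part of CIDA's terminology):

```python
def risk_level(impact: int, likelihood: int) -> int:
    """Risk level is Impact multiplied by Likelihood, each scored 1-4."""
    if not (1 <= impact <= 4 and 1 <= likelihood <= 4):
        raise ValueError("impact and likelihood must be on the 1-4 scale")
    return impact * likelihood
```

A risk that is very likely (4) with very high impact (4) scores 16, the maximum; a very unlikely (1), low-impact (2) risk scores 2.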

Standard template - risk register

CIDA has a standardized investment risk register template

Steps to complete a risk register:

Step 1. Under Risk Definition, write down the key risks to the project. There should be at least two

risks each for the categories Operational Risks, Financial Risks, and Development Risks, and at least

one risk in the category of Reputation Risks.

Step 2. For each risk selected, establish the current risk level, i.e. the intensity of the risk. A risk

map or some other tool may be useful for determining the level. Identify the risk on the four-point

scale below, and transfer the colour under Initial rating.


Step 3. Over a regular monitoring schedule, re-rate the risk and add the colour under Date 2, and

so on. Monitoring periods will vary according to the project, but a typical period is three months.

Step 4. Indicate if the risk is the same as one found in the program risk assessment (if one exists).

Step 5. A risk is an uncertainty about a result. Indicate the level of the result as found on your

Logic Model.

Step 6. Give a brief summary of the risk response strategies that will be used to manage the risk or

to prevent a risk event.

Step 7. Indicate the risk owner. If possible, there should only be one person per box. The owner

will vary according to who actually has to deal with a given risk event.

Monitoring. In the real world of development, the risk profile will change constantly during the life

of the project. As risks arise or disappear, change the corresponding risk definitions and risk level.

Also track the use and the effectiveness of the risk response strategies, and change the Risk

Response column as necessary.

Note: Please do not hesitate to rate risks as "Very likely" if that is their real level.
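The register steps above can be sketched as a small record type. The class, field names, and rating history are illustrative assumptions, not CIDA's standard template:

```python
from dataclasses import dataclass, field

# The guide's four-point rating scale.
RATING_LABELS = {1: "Very low", 2: "Low", 3: "High", 4: "Very high"}

@dataclass
class RiskEntry:
    definition: str     # step 1: a key risk to the project
    category: str       # Operational, Financial, Development, or Reputation
    result_level: str   # step 5: level of the LM result the risk affects
    response: str       # step 6: summary of the risk-response strategy
    owner: str          # step 7: a single risk owner
    ratings: list = field(default_factory=list)  # steps 2-3: re-rated each period

    def rate(self, score: int) -> str:
        """Record a rating on the four-point scale, keeping the monitoring history."""
        label = RATING_LABELS[score]
        self.ratings.append(label)
        return label
```

Appending each re-rating rather than overwriting it preserves the monitoring trail, so the register shows how a risk's level evolved over the life of the project.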

Four-point rating scale

Likelihood of occurrence:

Very low (1): Very unlikely

Low (2): Unlikely

High (3): Likely

Very high (4): Very likely

Potential impact on CIDA's ability to meet objectives:

Very low (1): Routine procedures sufficient to deal with consequences

Low (2): Could threaten goals and objectives, and thus may require monitoring

High (3): Would threaten goals and objectives, and thus may require review

Very high (4): Would prevent achievement of goals and objectives

PDF Format


Logic Model with definitions (PDF - 21 KB, 1 page)

RTF


Logic Model Template (RTF - 80 KB, 1 page)

Performance Measurement Framework Template (RTF - 70 KB, 1 page)

Investment Risk Management Template (RTF - 250 KB, 4 pages)