
The Real-Time Evaluation of humanitarian action

Abstract

This paper reviews the current use of real-time evaluation (RTE) in the evaluation of humanitarian action. It begins with an introduction to the background of humanitarian action; considers some of the challenges RTEs face; and reviews practices around team recruitment, methods, and outputs. It makes use of a dataset of 104 RTEs to examine some of the issues discussed. It concludes by examining how the quality of RTEs could be improved.

Humanitarian Action

Every year, hundreds of millions of people are affected by disasters or by conflict. Some 210 million people were affected by natural or technological disasters in 2011 (Zetter et al., 2012, p. 258), and tens of millions of others were affected by conflict[1] or epidemics. Disasters are increasing in number and scale, in particular weather-related disasters (Parker et al., 2007, pp. 2-4). This is because of increasing population, particularly in disaster-vulnerable river basins and coastal areas, environmental fragility, climate change, and urbanisation.

Some 93 million people were targeted for international humanitarian action in 2011 (Buston et al., 2013, p. 5). Many more were assisted at the national level. In 2013, an estimated 44 million people in China and India were affected by natural disasters but were assisted at the national level (Swithern et al., 2014, p. 9). Humanitarian action is action intended to "save lives, alleviate suffering and maintain human dignity during and in the aftermath of man-made crises and natural disasters, as well as to prevent and strengthen preparedness for the occurrence of such situations" (Good Humanitarian Donorship, 2003).

Humanitarian action is funded by official humanitarian assistance from governments and intergovernmental organisations as well as by private sources. Funding for humanitarian action has been an increasingly important part of all Official Development Assistance (ODA) since the end of the Cold War. Humanitarian funding shows a strong upward trend, both in real terms and as a proportion of all ODA, with funding peaks for high-profile emergencies. The reasons for the growth are complex, reflecting the rising incidence of disasters (Parker et al., 2007), the arrival of non-traditional donors such as the newly industrialised countries and the Gulf states, and post-Cold War access to previously inaccessible areas. The advent of global satellite news has also helped to drive up funding, as images of each new crisis can be seen around the world as they happen.[2] Since the end of the Cold War there has also been an increased tendency to use humanitarian funding for political or security agendas. Examples of this include the German focus on the Kosovo crisis because of fears of a flood of asylum seekers (Sahla, 2013, pp. 17-21), and funding driven by the concern that festering humanitarian crises may provide breeding grounds for extremism (Cosgrave, 2004, p. 8).

[1] The Stockholm International Peace Research Institute's 2013 Yearbook gives a total of 37 active state-based conflicts, 38 active non-state conflicts, and 23 active situations of one-sided violence in 2011 (Stockholm International Peace Research Institute, 2013, p. 4).
[2] Studies of the so-called CNN effect suggest that the level of media coverage influences official funding only where there are no vital security issues at stake for donor countries (Olsen et al., 2003, p. 124). However, media coverage has a large effect on private giving: Brown and Minty (2006, pp. 15-16) found that an additional minute of television news on the tsunami increased daily donations to charity by 13.2%, and that each additional 100 words in major newspapers increased donations by 2.6%.

Figure 1: Official humanitarian assistance shows a strong upward trend, both in real terms and as a proportion of all Official Development Assistance (Source: OECD/DAC). Note that the figures do not include private humanitarian funding, which adds another one-third to the official figures (Poole et al., 2012, p. 10).

Evaluating humanitarian action

Feinstein and Beck (2006, p. 546) characterise evaluations of humanitarian action as atheoretical, neither drawing on conceptual thinking in mainstream evaluation nor developing their own theory. This characterisation is not correct, as humanitarian evaluation reflects two broad lines of thinking in mainstream evaluation. In terms of Alkin's theory tree (Alkin and Christie, 2004; Christie and Alkin, 2008), these can be described as the Use and Valuing branches.

The utilisation of evaluation has become a strong theme in humanitarian evaluation, drawing in particular on the work of Patton (2008). Sandison (2007) demonstrates how different forms of utilisation can be seen in humanitarian practice. The recent ALNAP pilot guide on evaluating humanitarian action focuses strongly on utilisation (Buchanan Smith and Cosgrave, 2013).

Valuing is reflected in the focus on participatory evaluation in humanitarian evaluation.

Humanitarian action differs from development interventions in that it often takes place against a background of poor-quality information,[3] even on such basic data as the number of persons affected. Responses are often based on estimates and assumptions that may never be verified. For example, after the 2004 Indian Ocean tsunami, it was three and a half months before there was an accurate estimate of the number of persons who had lost their lives (Telford et al., 2006, pp. 33-35).

Attempts to develop a base of good-quality data in humanitarian response are challenged by:

- the large number of humanitarian actors[4] whose participation in coordination structures is variable;
- the unstable situation (due to continuous displacement, ongoing conflict, or the outbreak of epidemics);
- the frequent absence of baseline data;[5]
- the access difficulties imposed by conflict or damage to infrastructure;
- the complicated (with many parts) and complex (with unpredictable outcomes) nature of humanitarian interventions;[6]
- the loss of key records or the high turnover of knowledgeable persons in crises;[7] and
- the politicisation of information and the polarisation of stakeholders' views in conflict situations.

This information deficit poses problems for decision-making and planning in emergencies (Cosgrave, 1996). Plans frequently have to be changed in response to new information about needs or the actions of other relief actors. The fluidity of plans and the information deficit bring difficulties for assessment, monitoring, and evaluation. Evaluation difficulties are further compounded by the high rate of rotation and turnover of staff in humanitarian crises (Loquercio et al., 2006).

Ethical considerations limit the range of evaluation designs in humanitarian action. Randomised Control Trials (RCTs) are rare, partly because the random selection of aid recipients runs counter to a central tenet of humanitarian action: that assistance should be provided on the basis of need alone (Borton, 1994, p. 8). The random selection of recipients also raises serious ethical issues, similar to those of using RCTs to test the effectiveness of parachutes (Smith and Pell, 2003).

[3] Keen and Ryle (1996, p. 169) state that "In a disaster accurate information, like clean water, is an indisputable good. The kind of information sought by outside agencies is, however, seldom available." Hallam (1998, p. 39) noted that "A product of the characteristics of complex emergencies is that key information on a range of matters of vital significance to evaluators is often unavailable."
[4] Over 160 international NGOs and 430 local NGOs registered with the local authorities in Banda Aceh after the December 2004 Indian Ocean tsunami (Telford et al., 2006). The international response to the January 2010 Haiti earthquake exceeded this by an order of magnitude. The evaluation of the Office for the Coordination of Humanitarian Affairs (OCHA) states that "thousands of humanitarian and faith-based organisations arrived on the scene to provide relief to the affected communities" (Bhattacharjee and Lossio, 2011, p. 21). The Inter-Agency Standing Committee's six-month report notes that OCHA estimated that 400 international agencies were operational by the end of January 2010 (IASC, 2010, p. 8). The first Haiti RTE noted that over 1,000 international organisations had provided humanitarian assistance by May 2010 (Grünewald et al., 2010, p. 7).
[5] The 2003 ALNAP annual review (ALNAP et al., 2003, p. 29) notes that most evaluations of humanitarian action flag up "a lack of baseline data against which to measure intervention progress".
[6] Complicated and complex are used here in the sense given by Rogers (2008, pp. 30-31).
[7] For example, 28% of all government staff were killed by the December 2004 Asian earthquake and tsunamis in Aceh Jaya, and 22% in Banda Aceh (BRR et al., 2005, p. 35).

Although regression-discontinuity designs avoid these ethical issues, they are still very little used in humanitarian evaluation. Not one of the sample of humanitarian evaluations examined used this design.

Addressing challenges through quality initiatives

Quality and accountability problems in the sector have been regularly identified by the key sector-wide evaluations of humanitarian response, such as the Joint Rwanda Evaluation (Borton et al., 1996) and the Joint Tsunami Evaluation (Telford et al., 2006). To address them, the humanitarian community has tried to introduce quality and accountability standards for humanitarian action.

There have been a large number of initiatives in the sector to try to address shortcomings in quality and accountability.[8]

One key initiative that resulted from the Joint Rwanda Evaluation in 1996 was the Active Learning Network for Accountability and Performance in Humanitarian Action (ALNAP). ALNAP has promoted better quality evaluation in the sector through, among other activities:

- the publication of evaluation guidance (Beck, 2006; Cosgrave et al., 2009; Slim and Bonwick, 2005);

- encouraging the use of lessons from evaluation by publishing (with ProVention) summaries of lessons learned from different types of humanitarian crises (Cosgrave, 2008, 2014; Hedlund and Clarke, 2011; O'Donnell et al., 2009);

- publishing broader reviews of the sector (ALNAP et al., 2007; Harvey et al., 2010; Taylor et al., 2012); and

- maintaining a database of evaluations of humanitarian action.[9]

The United Nations has pushed for improving the quality of evaluation through the work of the United Nations Evaluation Group (UNEG). UNEG has published a significant amount of guidance, ranging from standards to checklists (UNEG, 2005a, 2005b, 2008a, 2008b, 2010a, 2010b, 2010c).

One initiative the humanitarian community has taken to address the problem of ensuring good-quality work under information constraints has been the use of Real-Time Evaluation (RTE). By 2010, RTEs had become a central pillar of the new humanitarian evaluation architecture (Polastro, 2011, p. 10), with inter-agency RTEs then the norm in all large new humanitarian crises (IA RTE Steering Group, 2011).

[8] One survey in 2007 (HAP) identified 70 initiatives and a 2012 study (Cragg) identified 71, but only 22 initiatives are common to both lists. Even the 119 initiatives that these two studies identify are not the full picture, as they leave out, for example, many UN initiatives (Cosgrave, 2013). Other lists of quality initiatives, such as those by BOND (Litovsky et al., 2006), the One World Trust (Lloyd and Casas, 2006), or ALNAP (Borton, 2008), also include initiatives not listed by Cragg or HAP.
[9] http://www.alnap.org/resources/


However, since 2012 there has been a divergence between NGO and UN practice on RTEs. The Transformative Agenda is the current name for the UN humanitarian reform begun in 2005 by Jan Egeland (McNamara, 2006, p. 10). Inter-agency real-time evaluations were initially piloted to review the level of implementation and to identify key successes and weaknesses; they were considered an important tool for transformation, and the IASC Principals meeting of December 2011 reaffirmed their importance (IASC, 2012b, p. 3). 2012 was a busy year for IASC RTEs, with IASC RTEs in Haiti and the Horn of Africa (three country RTEs and one regional report). However, by November 2012, the IASC seemed to have shifted away from RTEs as a tool, with a change to Real-Time Operational Reviews (RTORs) instead (IASC, 2012a, p. 5).

The joint DAC-UNEG review of the evaluation function in WFP refers to RTEs having "gained limited traction" in the UN system (OECD-DAC, 2014, p. 42) and clarifies that reviews such as the RTORs are internal and not independent (OECD-DAC, 2014, p. 25). The RTOR appears to have been replaced by the Operational Peer Review (OPR). These are conducted by teams of senior managers from the UN system with one NGO representative; however, none of the team members are evaluators. The Terms of Reference for the OPRs state: "An OPR is not a real-time evaluation, and it is not intended to measure results or the impact of the response." (IASC, 2014a, p. 1; 2014b, p. 1). Further, the reports are internal: the short narrative report and action matrix are marked as confidential and submitted only to the HC/HCT and UN agency emergency directors (IASC, 2014c, p. 3). Short summary reports may be published, but these appear to be relatively bland (IASC, 2014d).

Despite the IASC's apparent abandonment of RTEs, NGOs, the Red Cross, and some individual UN agencies have continued to commission them.

The IASC RTEs highlighted weaknesses in the multilateral humanitarian response, even in recent reports (Grünewald et al., 2010; Polastro et al., 2011a). This has been cited as the reason for the launch of the UN's Transformative Agenda (IASC, 2012b, p. 1). Telford (2009, p. 5), in his review of three of the first inter-agency RTEs, found that few of the recommendations made were implemented. This situation continued, with the same problems being repeatedly identified by RTEs.

What is RTE?

The United Nations High Commissioner for Refugees (UNHCR), which originally piloted RTEs and continues to be a strong advocate of them, defines an RTE as "a timely, rapid and interactive peer review of a fast evolving humanitarian operation (usually an emergency) undertaken at an early phase. Its broad objective is to gauge the effectiveness and impact of a given UNHCR response, and to ensure that its findings are used as an immediate catalyst for organizational and operational change." (Jamal and Crisp, 2002, p. 1). UNHCR uses RTEs as a supportive measure to immediately identify and unblock existing operational bottlenecks.

The ALNAP RTE pilot guide defines an RTE as "an evaluation in which the primary objective is to provide feedback in a participatory way in real time (i.e. during the evaluation fieldwork) to those executing and managing the humanitarian response" (Cosgrave et al., 2009, p. 10). The ALNAP definition does not require that RTEs take place in the early stage of a response. This was because some UN agency and IFRC members of ALNAP use the term for an iterative series of evaluations during the life of a project or programme, with a pre-mid-term evaluation, a mid-term evaluation, and a final evaluation. The evaluations prior to the final evaluation are intended to influence policy directly.

The Inter-Agency Standing Committee (IASC)[10] evaluations of the Haiti response fit into this iterative category (Grünewald et al., 2010; Hidalgo and Théodate, 2012), even though the first of these was conducted at the three-month mark. Other examples of the iterative RTE are the UN World Food Programme (WFP) Southern Africa RTE (Bennett et al., 2003), the UN Darfur RTE (Broughton et al., 2006), and the IFRC tsunami response RTEs (Bhattacharjee et al., 2005; 2006).

The critical aspect of real-time evaluation is that it takes place early enough in the agency's response to influence the current operation. In this paper we focus on RTEs that take place early in an intervention, as these are the most challenging. Evaluations that take place later differ little from other humanitarian evaluations.

History of RTE

The RTE concept first emerged in a 1992 evaluation which recommended that UNHCR undertake operational reviews in the initial phase of a response.[11] Hallam (1998, p. 6), in his guide to the evaluation of humanitarian action, notes that "there is a lot to be said for a reasonably 'quick and dirty' evaluation, to take place while programmes can still be changed".

Sandison prepared a desk review of the experience of RTEs for Unicef (2003), and identified the 1999 Danida[12]-commissioned RTE in Kosovo (ETC UK) as the first humanitarian RTE. Danida did not repeat the experiment, and Sandison notes that this RTE had little impact and was not cited in any of the documents that she reviewed. The term RTE has been in use in the scientific community since at least 1960,[13] but the 1999 Danida evaluation was its first use in a humanitarian context.

RTEs in the humanitarian context began to take off with their use by UNHCR. The first formal UNHCR RTE was of the Eritrea-Sudan emergency in 2000, following the influx of refugees into Kassala, and the final report was published the following year (Jamal, 2001). UNHCR subsequently published a frequently-asked-questions guide to RTEs (Jamal and Crisp, 2002), continued to make use of RTEs, and encouraged others to do so also. RTEs have become an increasingly important part of evaluation in the humanitarian sector. Of ReliefWeb[14] postings that refer to evaluation, the proportion referring to RTEs increased from less than 0.5% in 2002 to over 2.5% in 2012. The number of RTEs published[15] has also grown over the same period.

[10] The Inter-Agency Standing Committee (IASC) is a UN-led inter-agency forum for coordination, policy development, and decision-making involving the key UN and non-UN humanitarian partners.
[11] Jamal and Crisp (2002, p. 4), citing J. Crisp et al., 1992. Review of UNHCR emergency preparedness and response in the Persian Gulf crisis: GULF/EVAL/1912, Rev. 1991. Geneva: UNHCR.
[12] The Danish Government's aid administration.
[13] The earliest use of the term found in a Google Scholar search was a 1960 NASA Technical Note about monitoring the temperature of the Vanguard 2 rocket (Simas et al., 1960).

Situating RTEs in the evaluation landscape

Bob Stake, quoted by Scriven in his Evaluation Thesaurus (1991, p. 169), made the distinction between formative (learning) and summative (accountability) evaluations with the soup analogy: "when the cook tastes the soup, that's formative; when the guest tastes the soup, that's summative". With RTEs, the chef is tasting the soup at an early stage of the cooking, so that there is time to change the recipe as needed and not just the seasoning.

Patton identifies developmental evaluation as a third category, separate from summative and formative evaluation. Patton states that RTEs can be developmental evaluations when they lead to the development of a new or changed approach (Patton, 2011, p. 12). He lists RTEs as one of the approaches that can be used in complex, emergent, and dynamic conditions (Patton, 2008, p. 141). Thus, in Stake's analogy, an RTE that leads the chef to change the recipe is developmental. The intent and essence of RTEs is direct instrumental use, rather than the conceptual utilisation identified by Sandison (2007) or summative use. Of the five types of use identified by Fleischer and Christie (2009), RTEs are primarily intended for instrumental use, with some process use (where there is strong engagement with the field team). Their use may also include some elements of conceptual, enlightenment, or persuasive use.

The key stakeholders and users of RTEs are in the field. Nevertheless, RTEs can also be of use at the regional-office and headquarters levels. Although it is not their primary function, RTEs can also contribute to policy development, as has happened with the Transformative Agenda in the UN. In this vein, the biennial report of the OCHA evaluation section stated that the results of the IASC RTEs were "highly seminal" in the development of the UN's Transformative Agenda (OCHA, 2014, p. 15).

When RTEs are conducted at an early stage of an operation, the work that has already been done and can be evaluated is limited. However, as operational contexts in humanitarian settings change very rapidly, RTEs need to look forward if they are to be of use. When Eleanor Chelimsky was at the US General Accounting Office (GAO), the GAO produced a paper on prospective evaluation (General Accounting Office and Datta, 1990). The specific context there was predicting the likely impact of proposed legislation, based on learning from previous similar enactments. Similarly, RTEs contain prospective elements, in that the evaluators use their knowledge of previous programmes to highlight potential problems implicit in the current implementation.

[14] ReliefWeb (http://www.reliefweb.int) is the website run by the Office for the Coordination of Humanitarian Affairs (OCHA) where humanitarian agencies post information about what they are doing. The postings can vary from short press releases to reports with hundreds of pages. The site has averaged over 37,000 postings per year over the last five years. It provides a good reflection of current trends in humanitarian action.
[15] Many RTEs are not published, even on the ALNAP evaluative reports database, but remain internal documents. For example, although the IFRC conducted an iterative RTE of its tsunami response, only the second-round RTE was published, and then only as a summary (Bhattacharjee et al., 2005).

Figure 2: RTE contains retrospective and prospective elements

It is this prospective dimension that distinguishes RTEs from monitoring. Ordinary monitoring looks at what has been achieved against plans; RTEs look at what has been achieved and then at the implications this has for the future. Of course, for an RTE to be prospective, the evaluators must have deep knowledge of the humanitarian sector as a whole and of the lessons learned from other interventions.

Monitoring of humanitarian interventions is often weak. An RTE can help bridge the gap, as it provides an immediate snapshot that can help managers identify and address the strengths and weaknesses of the response.

The constraints imposed by RTE

RTEs impose a number of constraints on both the evaluation manager and the evaluation team. These include the need to:

- mobilise the evaluation team quickly;
- avoid overloading the staff in the field managing the emergency response;
- use appropriate methods adapted to the contextual reality on the ground as well as to the resources available;
- deliver and validate results before leaving the field;
- be credible to the field team (context, type of crisis, organisational culture);
- use appropriate criteria to formulate the evaluation judgement; and
- provide value that justifies the load that an early-stage evaluation imposes on the field.

These constraints are likely to determine the methods used to collect information, the team size, the method of analysis, and the depth of analysis. The following section looks at the different ways in which actual RTEs have addressed these issues.

The DatasetThe dataset was built up from RTEs already in the authors library, RTEs published on ReliefWeb, and RTE on the Evaluative Reports Database at ALNAP. This last was the source of the bulk of the evaluations.

Year

No RTEs by UN agencies or Donors

RTEs by NGOs or the IFRC

1999

1 One Danida RTE, (ETC UK, 1999). This was the first Humanitarian RTE (Sandison, 2003).

None found

2000

0 None found None found

2001

4 Three UNHCR RTEs, of which one was a short report (Jamal, 2001), and the other two were web bulletins (Stigter and Crisp, 2001; UNHCR, 2001).

One Oxfam RTE on floods in Bangladesh (Corsellis et al., 2001).

2002

2 One UNHCR RTEs bulletin (Jamal and Stigter, 2002).

One International Federation of Red Cross and Red Crescent Societies (IFRC) RTE of food security in Southern Africa (Essack-Kauaria et al., 2002).

2003

1 One WFP RTE (Bennett et al., 2003)

None found

2004

2 One UNHCR RTE for the Chad response (Bartsch and Belgacem, 2004)

One RTE for Darfur (Baker and Chikoto, 2004).

2005

6 Two Tsunami RTEs (Goyder et al., 2005; Grünewald, 2005).

Four tsunami RTEs (Bhattacharjee et al., 2005; Herson, 2005; Ketel et al., 2005; Lee, 2005).

Page 10:   · Web view2021. 8. 10. · The Real-Time Evaluation of humanitarian action . Abstract. This papers reviews the current use of real time evaluation (RTE) in the evaluation of humanitarian

Year

No RTEs by UN agencies or Donors

RTEs by NGOs or the IFRC

2006

11 Eight RTEs of which four focused on Drought in the Horn of Africa (Grünewald et al., 2006a; Grünewald et al., 2006b; Grünewald et al., 2006c; Grünewald et al., 2006d), one on the Niger Crisis (Back et al., 2006), one was the culmination of the multi-visit Darfur RTE (Broughton et al., 2006), one was a UNHCR RTE (Sperl et al., 2006) and the last was the first IASC RTE (IASC, 2006).

Three RTEs by Oxfam, including the first Oxfam RTE (Prideaux-Brune and Scott, 2006) which established the initial Oxfam RTE Benchmarks (timeliness, scale, management, support, and external relations) and two others (Springett and Creti, 2006; Walden et al., 2006).

2007

11 Eight RTEs including five UNHCR RTEs of operations with internally displaced persons (Bourgeois et al., 2007a; Bourgeois et al., 2007b; Diagne et al., 2007; Savage et al., 2007; Wright et al., 2007), the first (Cosgrave et al., 2007) and second (Young et al., 2007) IASC RTEs conducted by external consultants and the conclusion of the FAO tsunami RTE series (FAO, 2007).

Three RTES with two from Oxfam on flood responses in Asia (Agrawal et al., 2007; Loquercio and Mubayiwa, 2007). One IFRC evaluation which although characterised as Mid-term was intended to collect real-time lessons for ongoing and future operations (IFRC, 2007).

2008

5 Two RTEs including IASC RTE of the response to tropical cyclone Nargis (Turner et al., 2008) and an ECHO RTE of HIV support in Zimbabwe (Hoogendoorn and Chisvo, 2008).

Three NGO RTEs on tropical cyclones, one of which, was late for an RTE, but was conducted using the Oxfam RTE benchmarks (Walden, 2008) and two others (Ternstrom et al., 2008; Walden et al., 2008).

2009

5 One Unicef RTE (Bhattacharjee and Varghese, 2009).

Four RTEs by three NGOs (Featherstone et al., 2009; Hagens, 2009; Simpson et al., 2009; Tinnemans et al., 2009).

Page 11:   · Web view2021. 8. 10. · The Real-Time Evaluation of humanitarian action . Abstract. This papers reviews the current use of real time evaluation (RTE) in the evaluation of humanitarian

Year

No RTEs by UN agencies or Donors

RTEs by NGOs or the IFRC

2010

18 Six RTEs including three IASC RTEs (Cosgrave et al., 2010; Grünewald et al., 2010; Polastro et al., 2010), a UNHCR RTE (Crisp et al., 2010) a UNICEF RTE (Steets and Dubai, 2010) and one for a government (Grünewald and Renaudin, 2010)

Twelve RTEs with eight commissioned by four different NGOs (Alainchar et al., 2010; Bernstein et al., 2010; CRS, 2010; Dolphin et al., 2010; Goyder, 2010; Hagens and Ishida, 2010; Slootweg et al., 2010), plus three for the IFRC (Christensen, 2010; Djordjevic et al., 2010; Fisher et al., 2010) and one for the British Red Cross (Fortune and Rasal, 2010).

2011

14 Four RTEs, One by a donor (Grünewald et al., 2011), two by the IASC (Bhattacharjee and Lossio, 2011; Polastro et al., 2011b), and one by UNHCR (Balde et al., 2011).

Ten RTEs: three by Oxfam (Khan et al., 2011; Murphy et al., 2011; Simpson et al., 2011), five by four other NGOs (Albone and Hackman, 2011; Dolphin et al., 2011; Johnson, 2011; Murtaza et al., 2011; Sandison and Khan, 2011), plus two by the IFRC (Burton, 2011; Khogali et al., 2011).

2012

13 Seven RTEs: one by UNICEF (Arqués and Leonardi, 2012) and six by the IASC, of which five relate to the crisis in the Horn of Africa (Bhattacharjee and Marroni, 2012; Darcy et al., 2012b; Duncalf et al., 2012; Sida et al., 2012; Slim, 2012) and one is a follow-up RTE for the Haiti earthquake (Hidalgo and Théodate, 2012).

Six RTEs, of which five relate to the crisis in the Horn of Africa: four from the Disasters Emergency Committee (Clifton, 2012; Darcy, 2012; Darcy et al., 2012a; Darcy et al., 2012c) and one from CAFOD (Tindal, 2012); the sixth is an RTE on Niger (Dolphin, 2012).

2013

8 Three RTEs, including two UNHCR RTEs, on the Syrian crisis (Crisp et al., 2013) and the South Sudan crisis (Ambroso et al., 2013).

Five RTEs: a CRS evaluation in Mali (Abdourhimou et al., 2013), a real-time lessons-learned summary for the Philippines (Grünewald and Boyer, 2013), an RTE of the Haiyan response (Duncan, 2013), and a Mozambique floods RTE (Simpson et al., 2013). Finally, there is a DEC Response Review on the Syria crisis, which is really a form of RTE (Darcy, 2013).


2014

3 None found

Three RTEs, all of the Cyclone Haiyan response in the Philippines (Cosgrave and Ticao, 2014; Greenhalgh et al., 2014; Ricafort, 2014).

Total

104 48 RTEs: 45 by the UN (of which 19 were inter-agency RTEs), one by a bilateral donor, and two by ECHO.

56 RTEs: 45 by NGOs and 11 by the Red Cross Movement

Another six evaluation reports with the title of RTE were excluded from the dataset as they were not considered to be RTEs of traditional humanitarian action relevant for the analysis.16 It should be noted that this dataset is more than twice the size of the dataset of 44 used by Krueger and Sagmeister (2014) and significantly larger than the sample of 56 used by Letch (2013, p. 33). All of the 104 reports in the dataset were indexed using dtSearch17 to facilitate sophisticated searches within the document set, including proximity searches, word-stem searches, and thesaurus-based searches. The evaluations in the dataset fall into three groups: early-stage single-agency RTEs by NGOs; joint inter-agency RTEs by the IASC; and a group of other evaluations.
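As an illustration of the kind of word-stem and proximity searches that such indexing supports, here is a minimal sketch in plain Python. dtSearch itself is a commercial product; the toy documents and the crude suffix-stripping stemmer below are stand-ins for illustration only.

```python
import re

def tokenize(text):
    return re.findall(r"[a-z']+", text.lower())

def stem(word):
    # Crude suffix-stripping stemmer, for illustration only.
    for suffix in ("ations", "ation", "ings", "ing", "es", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def stem_search(docs, term):
    """Return names of documents containing any word sharing the term's stem."""
    target = stem(term)
    return [name for name, text in docs.items()
            if any(stem(w) == target for w in tokenize(text))]

def proximity_search(docs, term_a, term_b, window=5):
    """Return documents where the two stems occur within `window` tokens."""
    a, b = stem(term_a), stem(term_b)
    hits = []
    for name, text in docs.items():
        stems = [stem(w) for w in tokenize(text)]
        pos_a = [i for i, s in enumerate(stems) if s == a]
        pos_b = [i for i, s in enumerate(stems) if s == b]
        if any(abs(i - j) <= window for i in pos_a for j in pos_b):
            hits.append(name)
    return hits

# Toy corpus standing in for the indexed evaluation reports.
docs = {
    "rte_1": "The team used random sampling of households during the evaluation.",
    "rte_2": "Sampling strategies were not discussed in the report.",
    "rte_3": "Random observations were made while driving between sites.",
}
print(stem_search(docs, "samples"))  # → ['rte_1', 'rte_2']
```

A word-stem search for "samples" matches both "sampling" documents, and a proximity search for "random" near "sampling" returns only the first document; this is the kind of query used to examine method references across the dataset.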

Guidance on RTEs

The first guidance on RTEs was the UNHCR Frequently Asked Questions note (Jamal and Crisp, 2002). Brusset prepared an issues and options paper for ALNAP (2007), and ALNAP eventually produced the Pilot Guide in early 2009 (Cosgrave et al., 2009). Although this guide is described as a pilot, there has been no formal follow-up process or further development. Other guidance and advice on RTEs includes the 2010 article in New Directions for Evaluation (Brusset et al., 2010), a short 2011 article in Humanitarian Exchange (Polastro, 2011), the IASC guidelines (IA RTE Steering Group, 2011), and a presentation to the European Evaluation Society that set out five ways in which RTEs could be improved (Krueger and Sagmeister, 2012, 2014). There has also been a Canadian Journal of Programme Evaluation practice note on humanitarian RTEs (Polastro, 2014), and some broader attention to the evaluation of complex interventions in real time (Ling, 2012), although the latter does not focus on humanitarian interventions in particular. There are also some agency-specific guides on RTEs, including for Unicef and WFP.

16 These include two FAO RTEs on the response to avian flu (Perry et al., 2010; Shallon et al., 2007), three RTEs on support to climate change, two from Norad (Lincoln et al., 2013; Tipper et al., 2011) and one from the ADB (Thukral et al., 2014), and the evaluation of the GENCAP project (Binder, 2009). It should be noted that the FAO examples were included by Krueger and Sagmeister (2014) in their analysis.

17 dtSearch is a flexible full-text indexing software application (http://www.dtsearch.com).


The impact of guidance on the quality of RTEs is uncertain. While Letch (2013, p. 69) found improvements in the quality of RTEs after the publication of the 2009 RTE pilot guide, the analysis conducted for this article shows no statistically significant difference between RTEs before and after the guide's publication. The quality of the evaluations in the dataset varies and appears to depend on the depth of the evaluation experience of the teams.
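The kind of pre/post comparison described can be illustrated with a simple two-sample permutation test on the difference in mean quality scores. The scores below are invented placeholders, not the article's data, which are not reproduced here.

```python
import random

# Hypothetical quality scores for RTEs before and after the 2009 guide.
# These numbers are invented placeholders, not the article's data.
pre_guide = [2.1, 3.0, 2.6, 3.4, 2.2, 2.9, 3.1, 2.5]
post_guide = [2.8, 3.2, 2.4, 3.3, 2.7, 3.0, 2.6, 3.5]

def permutation_p_value(a, b, n_iter=10_000, seed=0):
    """Two-sided permutation test on the difference in group means."""
    rng = random.Random(seed)
    observed = abs(sum(a) / len(a) - sum(b) / len(b))
    pooled = a + b
    extreme = 0
    for _ in range(n_iter):
        rng.shuffle(pooled)
        left, right = pooled[:len(a)], pooled[len(a):]
        if abs(sum(left) / len(left) - sum(right) / len(right)) >= observed:
            extreme += 1
    return extreme / n_iter

p = permutation_p_value(pre_guide, post_guide)
print(f"p = {p:.3f}")
# A p-value above 0.05 means the observed difference is not statistically
# significant at the conventional 5% level.
```

A permutation test is well suited to small samples of ordinal quality ratings, since it makes no normality assumption.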

Rapid mobilisation

RTEs mobilise faster than regular humanitarian evaluations. The earlier an RTE mobilises, the greater its theoretical opportunity to influence the direction of operations. However, if management systems are still very much in flux, an early RTE may not be able to influence ongoing policy, but only current day-to-day operations. Most humanitarian RTEs take place from one to six months after the start of the operation. The UNHCR paper on RTEs suggests that ideally an RTE should take place between four and six weeks after the start of the operation (Jamal and Crisp, 2002, pp. 2-3), and the ALNAP RTE guide focuses on RTEs taking place “within a few months of the start of the response” (Cosgrave et al., 2009, p. 6).

Single-agency evaluations tend to be faster than joint evaluations. In his review of three early IA RTEs, Telford (2009, p. 6) noted the need for an automatic trigger mechanism for IA RTEs. The IASC guidelines (IA RTE Steering Group, 2011, p. 16) established an automatic trigger for holding RTEs. The same guidance set a target of completing IA RTEs within 90 days of the onset of the disaster (IA RTE Steering Group, 2011, p. 6). However, the IASC never introduced any preparedness or standby arrangements for the timely launch of IA RTEs.

The need for speed is driven by the short time scale of many humanitarian interventions combined with the high staff turnover in the initial stages of an emergency. The Oxfam Pakistan Floods RTE noted that, even though the RTE took place two months after the floods, many of the flood response projects were already closing (Loquercio and Mubayiwa, 2007, p. 5). In the case of the Mozambique floods, the evaluation team found that the operation was winding down when they started work two months after the floods and that several of the people deployed as surge capacity had already left (Cosgrave et al., 2007, p. 18).

Recruiting an evaluation team takes time, which is why agencies such as Oxfam and UNHCR, which place the greatest emphasis on early RTEs, use their own staff rather than external consultants. Others, such as the IASC, use external consultants, which makes fielding an RTE slower. However, even with external consultants, the IASC has started at least one RTE (Cosgrave et al., 2007) at the two-month mark and finished it within three months.

Maintaining a light footprint

The larger the evaluation team, the greater the stress it places on resources in the field. While a team of two can travel in one vehicle with a driver, a liaison person, and a translator, a team of four may need two vehicles. The IASC guidelines suggest a team of two international and two national consultants (IA RTE Steering Group, 2011, p. 43). Often, the team will be together for the HQ interviews and


then split into pairs for the field work, as was the case with the RTE of the Kenya Drought Response (Murphy et al., 2011, p. 7).

One sixth of the RTEs in the dataset were conducted by solo evaluators, against one fifth of other evaluations in the sector (from a sample of 163 other humanitarian evaluations examined). Both RTEs and other evaluations had teams of three persons or fewer in 61% of cases. Only one in ten RTEs had a team of more than five, against one in six of other evaluations. The average team size in the RTE sample was 3.45, against 3.92 for other evaluations.

Team size is affected by a number of factors, including whether an evaluation is a single agency or a joint evaluation. Most of the solo evaluations cited were single agency evaluations.

This difference arises because evaluations of major humanitarian operations with wide geographic and sector coverage can field big teams. For example, the 2000 Kosovo evaluation (Wiles et al., 2000) had a team of 11, the Disasters Emergency Committee tsunami evaluation had a team of 13 (Vaux et al., 2005), and the follow-up Tsunami Linking Relief to Rehabilitation and Development study (Brusset et al., 2009) had a team of 14. The Tsunami Evaluation Coalition evaluations had 73 different authors (Cosgrave, 2007, pp. 2, 39-40). However, most humanitarian evaluations have team sizes similar to the RTE sample, except that teams of more than five are far less common for RTEs.

The second way in which RTEs avoid overloading the field is through reducing the time spent in the field. The possibilities for doing so depend on the geographical scale and complexity of the response and the levels of access available18. The data set contained examples ranging from five days in the field (Springett and Creti, 2006) to four weeks (Grünewald et al., 2010). However, ten to fourteen days is the most common duration.

In a few cases the RTE team did not visit the field, as in the CARE Darfur RTE (Baker and Chikoto, 2004, p. 1), or had a very constrained field programme, as in the 2007 Pakistan floods RTE, which spent a total of four days (including travel) in the field, of which half was taken up by visits to provincial headquarters (Young et al., 2007). This can be contrasted with the 2010 Pakistan floods RTE, which visited 20 field locations scattered from Karachi to Peshawar along the Indus river over a three-week period (Polastro et al., 2011b).

Designs and Methods

RTEs draw on the usual range of evaluation designs and methods for humanitarian evaluations. However, the majority of humanitarian evaluations focus more on process than on impact. It is difficult to establish impact in the time-frames associated with humanitarian action. Furthermore, agencies' humanitarian responses tend to have very short results chains that end when inputs are delivered, and rarely focus on outputs and outcomes. Even when they examine impact, they tend to fall into the weakest design category (post-test only with no comparison group) of the seven basic impact evaluation frameworks suggested by Bamberger et al. (2011).

18 For example, security considerations may restrict access to field locations and therefore limit the overall length of evaluation missions.


Randomised control trials are very unusual in humanitarian evaluations19 and then occur only in the recovery rather than the acute phase, as in the CRS reconstruction project in Liberia (Fearon et al., 2008) and the WFP cash transfer pilot in Sri Lanka (Sandström and Tchatchua, 2010). Conducting randomised control trials in humanitarian action poses some of the same ethical issues as the double-blind randomised testing of parachutes (Smith and Pell, 2003).

The term counterfactual appears only once in the RTE dataset, and then only to explain that a counterfactual control group was not possible for ethical reasons (Hoogendoorn and Chisvo, 2008, p. 3). Comparisons are occasionally made, e.g. the slow response in Darfur is compared with the scaling-up in other contexts (Broughton et al., 2006, p. 4). However, this reference is a weak echo of the specific comparison in the Terms of Reference, which contrasts Darfur with Kosovo, Iraq, and Afghanistan (Broughton et al., 2006, p. 98).

Comparisons are sometimes drawn with unassisted groups, or the work of other agencies, but these tend to be ad-hoc rather than rigorous comparisons. One interesting use of comparisons is in the Oxfam RTEs. The Zimbabwe cholera RTE states that the standard Oxfam benchmarks will be used to facilitate comparison between different responses and broader organisational learning (Simpson et al., 2009, p. 24).

The methods used by the RTEs vary with the length of the mission. The main method is the semi-structured interview. Group interviews, often misnamed focus group interviews, are another common method. True focus group discussions20 (in which a limited number of participants discuss the topic with each other under the guidance of a facilitator) are rare.

Document review is little used: only ten of the RTEs make reference to it, and these tend to be the larger and more complex RTEs, such as the IASC RTEs. This is not surprising, as there is normally a paucity of programmatic and analytical documents to review in an early-stage RTE.

Observation is referred to by more than half of the RTE sample. This is entirely appropriate, as one would expect observation to be an important method in humanitarian settings, which generally change rapidly in the early stages of a response.

As with all qualitative and mixed-methods research, triangulation is key to ensuring accuracy and reliability. Triangulation is referred to by two fifths of the sample.

Even with limited timeframes and limited use of non-qualitative methods, RTEs can still be systematically designed to generate significant amounts of data. The

19 A search of the ALNAP Evaluative Reports Database (http://www.alnap.org/resources/erd.aspx) returns only 18 documents with the phrase “control group”, whereas searching for the phrase “evaluation” returns 1,134 documents. However, of these 18, only five (Colliard, 2005; de Klerk, 2004; Devereux et al., 2006; Melville and Scarlet, 2003; Styger, 2008) actually used control groups; one was a synthesis of a number of other (largely development) evaluations (Sridhar and Duffield, 2006); and one (Corsellis et al., 2001) made reference to using a comparison village as an ad-hoc control group.

20 E.g. as defined by Krueger and Casey (2009).


team of three on the two-week Pakistan IDP RTE recorded over 1,300 items of evidence totalling over 28,000 words, largely based on semi-structured interviews (Cosgrave et al., 2010). This evidence was then tabulated to form the basis for the report.
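One way such evidence tabulation can work is to group recorded items by theme and count the distinct source types per theme, which also gives a rough measure of triangulation. The sketch below is illustrative only; the field names and example items are invented, not the team's actual recording format.

```python
from collections import defaultdict

# Hypothetical evidence items; the real team's recording format is not
# published, so the field names here are illustrative assumptions.
evidence = [
    {"theme": "coordination", "source": "UN staff interview",
     "text": "Cluster meetings were held in English only."},
    {"theme": "coordination", "source": "NGO staff interview",
     "text": "National NGOs struggled to follow cluster meetings."},
    {"theme": "logistics", "source": "observation",
     "text": "Distribution point queues exceeded four hours."},
]

def tabulate(items):
    """Group evidence items by theme and count distinct source types,
    giving a rough indication of triangulation per theme."""
    by_theme = defaultdict(list)
    for item in items:
        by_theme[item["theme"]].append(item)
    return {
        theme: {
            "items": len(group),
            "distinct_sources": len({i["source"] for i in group}),
        }
        for theme, group in by_theme.items()
    }

table = tabulate(evidence)
print(table["coordination"])  # → {'items': 2, 'distinct_sources': 2}
```

Themes supported by several distinct source types are better triangulated than those resting on a single source, which is the logic behind tabulating the evidence before drafting findings.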

Quantitative methods are little used in the RTE sample. None of the reports reviewed presents statistically significant data.

While nearly two thirds of the dataset referred to surveys, these references were often to monitoring surveys of different kinds (such as nutrition surveys), or to survey results reported in other evaluations. Formal surveys are rarely used, although FAO did so after the tsunami (FAO, 2007), and the IASC Ethiopia RTE made use of household surveys (Sida et al., 2012, p. 34).

Interviews

Key informant interviews are the backbone of humanitarian evaluation generally, and the same is true for RTEs. The RTEs all make use of key-informant interviews, sometimes with over 100 interviews: the IASC cyclone Nargis RTE undertook over 120 key informant interviews (Turner et al., 2008, p. 1), and the IASC RTE of the 2010 Pakistan floods listed 35 individual semi-structured interviews and a further 283 persons interviewed in semi-structured group interviews (Polastro et al., 2011b, p. 110).

Online surveys

Three of the RTEs used online surveys (Burton, 2011; Cosgrave and Ticao, 2014; Khogali et al., 2011) to gather data. Two more used exit surveys of departing staff and volunteers (Fisher et al., 2010; Greenhalgh et al., 2014). One used a mini face-to-face survey (Leonardi and Arqués, 2013). Another 63 reports made reference to the results of other surveys, including nutrition and household surveys, but these had not been conducted specifically for the evaluation. In total, surveys were used as a source of data by two thirds of the RTEs reviewed in this study.

Sampling strategies

Only one RTE specifically discusses the sampling strategy used (Cosgrave and Ticao, 2014). Six RTEs refer to random sampling, but this is in reference to other data sources and not to the evaluation's own sampling. One RTE had an insightful comment on the size of the sample used to estimate the overall pattern of needs (Brown and Jivan, 2010, p. 17).

A few reports talk about “random interviews” (Walden et al., 2006, p. 13) or interviews with “random selected householders” (Ternstrom et al., 2008, p. 8), but the context suggests that these were convenience samples rather than true random samples. Random sampling was explicitly excluded by the Terms of Reference for the IASC Pakistan displacement RTE: “There will not be sufficient time to do a random sampling” (Cosgrave et al., 2010, p. 59).


The most common sampling strategy is purposive. Purposive sampling is appropriate for key informant interviews, given the relatively high level of skills needed for such interviews. None of the documents makes specific reference to snowball sampling, a technique used to increase the quality of a purposive sample by asking each key informant whom else they would regard as a key informant. Suggestions are then added to the interview target list until no new names are suggested21.
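The snowball procedure described can be sketched as a simple breadth-first loop over informants. The informant names and suggestion lists below are invented for illustration; in practice the `ask` function would be an actual interview question.

```python
# Stand-in for asking each informant whom else they regard as a key
# informant; this mapping is invented for illustration only.
suggestions = {
    "cluster coordinator": ["ngo country director", "unhcr protection officer"],
    "ngo country director": ["local partner lead", "cluster coordinator"],
    "unhcr protection officer": ["local partner lead"],
    "local partner lead": [],
}

def snowball(seeds, ask):
    """Grow the interview list until interviews suggest no new names."""
    to_interview = list(seeds)
    interviewed = set()
    while to_interview:
        informant = to_interview.pop(0)
        if informant in interviewed:
            continue
        interviewed.add(informant)
        # Add any newly suggested names to the target list.
        for name in ask(informant):
            if name not in interviewed:
                to_interview.append(name)
    return interviewed

sample = snowball(["cluster coordinator"], lambda p: suggestions.get(p, []))
print(sorted(sample))
```

Starting from a single seed, the loop terminates once no interview yields a name that has not already been interviewed, which is exactly the stopping rule described in the text.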

Delivering and validating results in the field

One key element of RTEs is that they should influence policy and practice during the implementation of the response. That is, they should be developmental evaluations as defined by Patton (2011, p. 12). While headquarters may set broader strategy, managers at the field level decide on tactics and make day-to-day decisions on how broader policy, such as the humanitarian reform and its transformative agenda, should be applied.

The Review of Joint Evaluations and the Future of Inter-Agency Evaluations (Telford, 2009) identified weak follow-up on evaluation recommendations, leading to poor implementation, as a systematic weakness. To tackle this, some reports introduced innovative features, as was the case with the Pakistan floods RTE, where the executive summary takes the form of a short introduction of less than one page followed by a ten-page matrix listing, by thematic area of investigation, the findings, conclusions, and recommendations, together with an indication of each recommendation's priority, the level to which it applies (e.g. global, national, or provincial), who is responsible, and when it should be implemented (Polastro et al., 2011b, pp. 4-15).
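A matrix of this kind, in which each recommendation carries a priority, a level, an owner, and a deadline, might be represented as a simple structured record. The schema and the example row below are assumptions for illustration, not the report's actual format.

```python
from dataclasses import dataclass
from typing import List

# Hypothetical schema for one row of the kind of recommendations matrix
# described above; the field names are assumptions, not the report's own.
@dataclass
class Recommendation:
    theme: str
    finding: str
    conclusion: str
    recommendation: str
    priority: str      # e.g. "high", "medium", "low"
    level: str         # e.g. "global", "national", "provincial"
    responsible: str
    implement_by: str

matrix: List[Recommendation] = [
    Recommendation(
        theme="coordination",
        finding="Cluster meetings were conducted in English only.",
        conclusion="National NGOs were effectively excluded.",
        recommendation="Provide interpretation at provincial cluster meetings.",
        priority="high",
        level="provincial",
        responsible="cluster lead agencies",
        implement_by="within one month",
    ),
]

# Sort high-priority items first, so scarce management attention goes
# to the recommendations that matter most.
order = {"high": 0, "medium": 1, "low": 2}
matrix.sort(key=lambda r: order[r.priority])
print(matrix[0].recommendation)
```

Making priority, owner, and deadline explicit fields is what allows the matrix to be sorted and tracked, addressing the weak follow-up that Telford identified.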

A key part of delivering results in the field is the validation of results by involving stakeholders. It is quite common for RTEs to have either a concluding workshop or a day of reflection to develop the incipient recommendations from the evaluation team. A day of reflection is a common part of Oxfam and CRS RTEs. The CRS Benin, Niger, and Swat Valley RTEs used a matrix adapted from the ALNAP pilot guide to present the findings, conclusions, and recommendations, which were then discussed by the participants (Dolphin et al., 2010; Dolphin et al., 2011, p. 81; Hagens, 2009).

The arrangements for IASC and other RTEs have varied, but a debriefing meeting was commonly held to validate findings. The 2011 RTE for the Pakistan floods had an initial debriefing at the end of the first mission, followed by a second mission after the preparation of the draft report. During this second mission, three provincial workshops and one national workshop were held with key stakeholders involved in the humanitarian response to the floods. Findings, conclusions, and recommendations were initially presented by the team leader during the workshops. Stakeholders then jointly validated, framed, and prioritised the recommendations, and defined who was to implement them and on what timeframe (Polastro et al., 2011b, p. 11). Findings, conclusions, and recommendations were then presented at headquarters level in Geneva and New

21 The lack of reference to snowball sampling may reflect the very limited descriptions of methodology found in many of the sample rather than a lack of knowledge and use of this technique.


York. This comprehensive approach fostered real-time learning within the humanitarian system.

While debriefing workshops are a common feature of humanitarian evaluations, they are normally much more limited than has been the case with RTEs. This is appropriate given the intent that the learning from RTEs should influence the current operation in “real time”.

Bamberger (2009, p. 42) notes that “many potentially useful evaluations have little impact because the findings are not communicated to potential users in a way that is useful or comprehensible”. RTE teams have to prioritise facilitated, reflective, participatory methods, to the point of developing and delivering final recommendations with the main stakeholders in the field. The current trend is towards the increasing use of evaluation briefs or short summaries, translations of the executive summary into the local language, video clips, and other tailored communication tools (Polastro, 2014, p. 126).

Be credible

Unless the findings of the RTE are seen as credible by the field team, they are unlikely to be applied, even if they are not overtly challenged. RTEs use two main approaches to achieving credibility with field teams. The first is the approach sometimes taken by UNHCR, with evaluations led by highly regarded and experienced senior figures such as Jeff Crisp (Balde et al., 2011; Bourgeois et al., 2007b; Crisp et al., 2010; Stigter and Crisp, 2001). The broad experience of many emergency operations, and the seniority, of such persons give the team a natural authority that means their views are listened to. The second approach is based on the understanding that field staff will have the greatest confidence in an evaluation when it is participative and interactive, and when the findings are based in part on their experience and they have had a voice in developing them. Thus methods like key-informant interviews, after-action reviews22, or iterative focus group discussions with staff can help to capture learning from those most involved in the response and increase credibility.

Similarly, reflection days and debriefing workshops engage the field staff in developing the final recommendations, and therefore help to ensure that the recommendations are owned, grounded, and seen as credible and actionable by field staff.

The credibility of RTEs for an external audience is often reduced by the lack of any indication of the strength of the evidence on which the evaluation was based. Sometimes little information is given on the detailed methodology or on the constraints faced. Interview and focus group guides or sampling protocols are often not presented, so it is difficult to know what data the team were gathering and what challenges they faced in doing so.

22 While eight of the RTEs referred to After Action Reviews (AAR), or included them in their cited document lists, only one, the WFP Tsunami RTE, made use of an AAR as part of the evaluation (Goyder et al., 2005, p. 52).


Recommendations made by RTEs

The RTEs in the dataset vary greatly in the level and number of recommendations made. Later-stage RTEs tend to make fewer recommendations, and to make them at a higher level, than early-stage evaluations do. There appear to be a number of reasons for this:

First, early-stage RTEs are very much focused on operational issues, and can make recommendations at a quite detailed level. For example, 13 of the 104 RTEs reviewed make reference to the job descriptions for particular positions.

Second, the limited time for analysis (as the team normally has to present results before leaving the country), combined with an evidence base more limited than that of evaluations that can draw on years of monitoring reports and other documents, encourages attention to surface features rather than to broader strategic issues.

Later and iterative RTEs typically pay more attention to strategy.

Part of the reason for the large number of recommendations in the early-stage evaluations in the dataset is that many of the Oxfam evaluations were led by people whose normal job function is management rather than evaluation. Full-time evaluators are much more aware of the limited capacity of agencies to process large numbers of recommendations. However, the argument for using managers from outside the specific programme is a strong one: evaluators are those who learn most from any evaluation, and using managers from one area to evaluate responses in another helps to ensure that the learning from the evaluation is retained within the organisation.

Large numbers of recommendations, especially if no priority, timescale, or responsibility is assigned, are normally not very useful, because agencies are not able to address more than a few recommendations. However, in an RTE, the high level of engagement with the field team means that many minor recommendations on operational matters can be delivered during the course of the work. Even here, the priority and likely timescale should be highlighted, and the recommendations should be documented in the report.

Perhaps the best argument that RTEs deliver value is their continuing use by agencies such as UNHCR and Oxfam, and their uptake by the key components of the humanitarian system that have a strong operational focus.

Conclusion

The number of people affected by disasters is increasing, as is humanitarian action to meet their needs in the aftermath of disasters. Humanitarian action typically takes place against a background of very poor information compared with development interventions. This information deficit poses problems for the quality of humanitarian action. RTE is one approach that the humanitarian community has used to try to improve the quality of humanitarian response.

The term RTE is used to cover quite a range of evaluations, but those of greatest interest are evaluations in the early stages of humanitarian responses where the


evaluations have the greatest chance of influencing the response. RTEs are different from monitoring in that RTEs are prospective and look at the likely outcomes of current policies, and not just at whether targets are being met. Later-stage RTEs (those occurring at or after the six-month mark) are little different from other humanitarian evaluations.

The usefulness of RTEs can be seen in their growing use in the sector, particularly by agencies with a strong operational focus. RTEs impose a number of constraints, including the need to: mobilise the evaluation team rapidly; avoid overloading the field; use appropriate methods; deliver and validate results during the fieldwork; be credible to the field team; use appropriate criteria; and provide value that justifies the load on the field team.

A general weakness seen in the early-stage RTEs is their focus on process rather than on eventual outcomes or impact. Thus RTEs are not as prospective or forward-looking as they could and should be, and many in the sample are more akin to monitoring than evaluation. This is partly due to RTEs being used to fill the gap caused by weak monitoring in humanitarian action. RTEs need to be more prospective, as discussing the likely outcomes of particular approaches helps to clarify why particular strategic and operational changes are recommended and which changes should be prioritised.

Debriefing meetings and workshops are a vital part of many RTEs. However, even greater use could be made of validation and debriefing workshops to develop prioritised and actionable recommendations for the future from the conclusions derived by RTEs. This would increase stakeholder ownership of the recommendations and improve the chances of utilisation.

This focus on process is linked to the relatively limited use of mixed methods. RTEs rely very heavily on qualitative methods and make little use of even the simplest quantitative methods. Mixed methods offer increased opportunities for triangulation and for ensuring that RTEs draw the correct conclusions.

Another weakness of the RTEs is that in many cases relatively little information is provided about methods. This not only makes it harder to assess how much reliance should be placed on the conclusions, but also makes it harder for others to use the same methods in subsequent evaluations. For reporting, RTEs should follow the advice offered in the ALNAP Quality Proforma (ALNAP, 2005), which effectively sets out what constitutes a good-quality report. It would then be possible for readers to determine the quality of the process on which the evaluation findings, conclusions, and recommendations are based.

However, despite these shortcomings, the RTEs in the dataset seem to be producing useful information for the agencies commissioning them. Most of these agencies appear to consider them good value for money and continue to commission them. At the very least, the RTEs are identifying key constraints and bottlenecks in humanitarian action. Thus RTEs seem to be meeting the overarching objective of helping to improve the quality of humanitarian action.


References

Abdourhimou, A., Coly, S. I. P., & Dolo, I. (2013). Real Time Evaluation: Project NAATA Mali (pp. 19). Baltimore: Catholic Relief Services.

Agrawal, M. K., Scott, I., Farhodi, W. M., & Walden, V. M. (2007). Real Time Evaluation of Oxfam GB’s response to the South Asia floods July-September 2007: Final Report (pp. 24). Oxford: Oxfam.

Alainchar, F., Budge, T., & Haldorsen, A. J. (2010). Plan International: Real Time Evaluation of Earthquake Relief Programme in Haiti (pp. 42). Woking: Plan International.

Albone, S., & Hackman, R. (2011). Real Time Evaluation of CARE’s Response to the Drought and Food Security Emergency, Southern Lao, 2010-11 (pp. 38). Vientiane: Care Lao PDR.

Alkin, M. C., & Christie, C. A. (2004). An evaluation theory tree. In M. C. Alkin (Ed.), Evaluation roots: Tracing theorists' views and influences (pp. 12-65). Thousand Oaks: Sage.

ALNAP, Beck, T., Christoplos, I., Goyder, H., Mitchell, J., & Houghton, R. (2003). ALNAP Annual Review 2003: Humanitarian Action: Improving Monitoring to Enhance Accountability and Learning. London: Active Learning Network on Accountability and Performance in Humanitarian Action.

ALNAP. (2005). The ALNAP Quality Proforma 2005. In N. Behrman (Ed.), ALNAP Review of Humanitarian Action in 2004: Capacity Building (pp. 181-193). London: Active Learning Network on Accountability and Performance in Humanitarian Action.

ALNAP, Mitchell, J., Slim, H., Vaux, T., & Sandison, P. (2007). ALNAP Review of Humanitarian Action: Evaluation Utilisation. London: Active Learning Network on Accountability and Performance in Humanitarian Action.

Ambroso, G., Ja, J., Lee, V., & Salomons, M. (2013). Flooding across the border: A review of UNHCR’s response to the Sudanese refugee emergency in South Sudan (Evaluation Report PDES/2013/08, pp. 36). Geneva: United Nations High Commissioner for Refugees.

Arqués, R. S., & Leonardi, E. (2012). Real-Time Independent Assessment (RTIA) of UNICEF’s Response to the Sahel Food and Nutrition Crisis, 2011–2012 (pp. 89). Dakar: UNICEF West and Central Africa Regional Office.

Back, L., Gonzalez-Aleman, J., & Fabre, D. (2006). Real Time Evaluation of UNICEF's Response to the 2005 Food and Nutrition Crisis in Niger (pp. 113). New York: United Nations Children's Fund.

Baker, J., & Chikoto, G. (2004). Final report CARE international’s humanitarian response to the Darfur crisis: real-time evaluation (RTE) phase #1 (pp. 21). Geneva: CARE International.

Balde, M. D., Crisp, J., Macleod, E., & Tennant, V. (2011). Shelter from the storm: A real-time evaluation of UNHCR’s response to the emergency in Côte d’Ivoire and Liberia (Evaluation Report PDES/2011/07, pp. 59). Geneva: United Nations High Commissioner for Refugees.

Bamberger, M. (2009). Institutionalizing impact evaluation within the framework of a monitoring and evaluation system (pp. 50). Washington: World Bank.

Bamberger, M., Rugh, J., & Mabry, L. (2011). RealWorld evaluation: Working under budget, time, data, and political constraints. Los Angeles: Sage Publications.


Bartsch, D., & Belgacem, N. (2004). Real time evaluation of UNHCR’s response to the emergency in Chad (pp. 32). Geneva: United Nations High Commissioner for Refugees.

Beck, T. (2006). Evaluating humanitarian action using the OECD-DAC criteria. London: ALNAP.

Bennett, J., Weingaertner, L., Herbinger, W., Murphy, J., Jones, A., & Lefevre, J. (2003). Full Report of the Real-Time Evaluation of WFP’s Response to the Southern Africa Crisis, 2002-2003 (EMOP 10200) July 2002 – May 2003 (OEDE/2003/03, pp. 128). Rome: World Food Programme Office of Evaluation.

Bernstein, C., Foss, J., & Robins, G. (2010). A Real- Time Evaluation of Christian Aid’s Response to the Haiti Earthquake (May 16 – June 08) (pp. 21). London: Christian Aid.

Bhattacharjee, A., Rajasingham-Senayake, D., Fernando, U., & Sharma, S. (2005). Real time evaluation of tsunami response in Asia and East Africa, second round: Synthesis Report (pp. 52). Geneva: IFRC.

Bhattacharjee, A., & Varghese, M. (2009). UNICEF’s Response to Georgia Crisis: Real Time Evaluation (pp. 41): UNICEF.

Bhattacharjee, A., & Lossio, R. (2011). Evaluation of OCHA Response to the Haiti Earthquake (pp. 118). Geneva: OCHA.

Bhattacharjee, A., & Marroni, M. (2012). IASC Real Time Evaluation (IASC RTE) of the Humanitarian Response to the Horn of Africa Drought Crisis 2011: Regional Mechanisms and Support during the Response (pp. 34). Geneva: IASC.

Binder, A. (2009). Real Time Evaluation of the GenCap Project: Final report (pp. 52). Berlin: Global Public Policy Institute.

Borton, J., Millwood, D., & Joint Evaluation of Emergency Assistance to Rwanda: Steering Committee. (1996). The international response to conflict and genocide: lessons from the Rwanda experience: Study 3: Humanitarian aid and effects. Copenhagen: Steering Committee of the Joint Evaluation of Emergency Assistance to Rwanda.

Borton, J. (2008). Summary of the Emerging Results from the ALNAP Humanitarian Performance Project: Exploratory Phase (pp. 16). London: ALNAP.

Borton, J. (Ed.). (1994). Code of conduct for the International Red Cross and Red Crescent Movement and NGOs in disaster relief: Network Paper 7. London: Relief and Rehabilitation Network, Overseas Development Institute.

Bourgeois, C., Diagne, K., & Tennant, V. (2007a). Real time evaluation of UNHCR’s IDP operation in the Democratic Republic of Congo (Evaluation Report PDES/2007/02 - 5, pp. 28). Geneva: United Nations High Commissioner for Refugees.

Bourgeois, C., Wright, N., & Crisp, J. (2007b). Real-time evaluation of UNHCR's IDP operation in Uganda (Evaluation Report PDES/2007/02 - RTE 3, pp. 21). Geneva: United Nations High Commissioner for Refugees.

Broughton, B., Maguire, S., Ahmed, H. Y., David-Toweh, K., Tonningen, L. R. v., Frueh, S., Prime, T., & Kjeldsen, S. (2006). Inter-agency Real-Time Evaluation of the Humanitarian Response to the Darfur Crisis A real-time evaluation commissioned by the United Nations Emergency Relief Coordinator and Under-Secretary-General for Humanitarian Affairs (pp. 248). New York: Office for the Coordination of Humanitarian Affairs.

Brown, P., & Minty, J. (2006). Media Coverage and Charitable Giving after the 2004 Tsunami (William Davidson Institute Working Paper 855, pp. 32). Ann Arbor: William Davidson Institute at the University of Michigan.

Brown, S., & Jivan, J. J. (2010). Tearfund Pakistan Floods Response: Real Time Evaluation Report (pp. 60). Teddington: Tear Fund.


BRR, World Bank, Said, S., Clark, J., Stephens, M., Fengler, W., Mangkusubroto, K., Rahmany, A. F., Subekti, A., Wibisana, B. H., Nursani, D., Purwanto, E., Mardhatillah, H. F., Prasetyo, H., Iskandar, Hutabarat, J. S. U., Watson, P., Ibrahim, R., Widjajanto, Nicol, B., Evans, K., Ihsan, A., Sim, A., Bald, A., Klausen, A.-L., Perdana, A., Triasdewi, C., Rahmi, C. D., Janssen, G., Sheppard, J., Arze, J., Lebo, J., Sharp, J., Bell, K., Roesad, K., Adriani, M., Arnold, M., Eza, M., Ivaschenko, O., Subramaniam, R., Barron, P., Cibulskis, R., Roesli, R., Siregar, S., Burgess, S., da Cruz, V., Bottini, V., & Bougas, W. (2005). Rebuilding a Better Aceh and Nias: Stocktaking of the Reconstruction Effort: Brief for the Coordination Forum Aceh and Nias (CFAN) - October 2005 (pp. 164). Jakarta: BRR and World Bank.

Brusset, E. (2007). RTE: Approach to Guidance An Approach to Evaluation Guidance for ALNAP on Real Time Evaluation in Humanitarian Action (pp. 34). Ohain: Channel Research.

Brusset, E., Bhatt, M., Bjornestad, K., Cosgrave, J., Davies, A., Deshmukh, Y., Haleem, J., Hidalgo, S., Immajati, Y., Jayasundere, R., Mattsson, A., Muhaimin, N., Polastro, R., & Wu, T. (2009). A ripple in development?: Long term perspectives on the response to the Indian Ocean tsunami 2004: A joint follow-up evaluation of the links between relief, rehabilitation and development (LRRD) (ISBN 978-91-586-4086-3, pp. 160). Stockholm: Swedish International Development Agency.

Brusset, E., Cosgrave, J., & MacDonald, W. (2010). Real-time evaluation in humanitarian emergencies. New Directions for Evaluation, 2010(126), 9-20.

Buchanan Smith, M., & Cosgrave, J. (2013). Evaluation of Humanitarian Action: Pilot Guide (pp. 209). London: ALNAP.

Burton, C. (2011). Real-time evaluation of IFRC response to 2010 Pakistan floods (pp. 81). Geneva: International Federation of Red Cross and Red Crescent Societies.

Buston, O., Smith, K., Stirk, C., Sparks, D., Malerba, D., & Sweeny, H. (2013). Global humanitarian assistance: Report 2013. Bristol: Development Initiatives.

Christensen, L. (2010). Real-time evaluation of ERU psychosocial support component: Haiti 19-28 February 2010 (pp. 24). Geneva: International Federation of Red Cross and Red Crescent Societies.

Christie, C. A., & Alkin, M. C. (2008). Evaluation theory tree re-examined. Studies in Educational Evaluation, 34(3), 131-135.

Clifton, D. (2012). Gender Equality in the East Africa Crisis Response (pp. 7). London: Disaster Emergency Committee (DEC) and the Humanitarian Coalition.

Colliard, C. (2005). A Psychosocial Programme in Recreational Centres in Bam (Iran): Evaluation (pp. 57). Geneva: Centre for humanitarian psychology.

Corsellis, T., Rahman, S., Nasreen, M., & Ali, H. (2001). Internal evaluation of Emergency Floods Relief and Rehabilitation Programmes 2000 - 2001 (pp. 63). Oxford: Oxfam.

Cosgrave, J. (1996). Decision Making in Emergencies. Disaster Prevention and Management, 5(4), 28-35.

Cosgrave, J. (2004). The impact of the war on terror on aid flows (pp. 35). London: ActionAid.

Cosgrave, J. (2007). Synthesis Report: Expanded Summary: Joint evaluation of the international response to the Indian Ocean tsunami (pp. 41). London: Tsunami Evaluation Coalition.

Cosgrave, J., Gonçalves, C., Martyris, D., Polastro, R., & Sikumba-Dils, M. (2007). Inter-agency real-time evaluation of the response to the February 2007 floods and cyclone in Mozambique (pp. 91). Geneva: Inter-agency Standing Committee.

Cosgrave, J. (2008). Responding to earthquakes 2008: Learning from earthquake relief and recovery operations (pp. 40). London and Geneva: ALNAP and Provention.


Cosgrave, J., Ramalingam, B., & Beck, T. (2009). Real-time evaluations of humanitarian action: An ALNAP Guide: Pilot Version (pp. 97). London: ALNAP.

Cosgrave, J., Polastro, R., & Zafar, F. (2010). Inter-Agency Real Time Evaluation (IA RTE) of The Humanitarian Response to Pakistan’s 2009 Displacement Crisis. Commissioned by the Inter-Agency Standing Committee: Final Report (pp. 93). Madrid: DARA.

Cosgrave, J. (2013). Humanitarian standards - too much of a good thing? (pp. 10). London: Joint Standards Initiative.

Cosgrave, J. (2014). Responding to Flood Disasters: Learning from previous relief and recovery operations (pp. 34). London: ALNAP.

Cosgrave, J., & Ticao, R. (2014). Final evaluation report: Real-time evaluation of Plan International’s response to Typhoon Haiyan, Philippines (pp. 93). Boulder: International Advisory, Products, and Systems.

Cragg, L. (2012). Mapping exercise on Quality & Accountability initiatives in the humanitarian sector: Prepared for the Joint Standards Initiative (pp. 47). London: Joint Standards Initiative.

Crisp, J., Graf, A., & Tennant, V. (2010). Banking on solutions: A real-time evaluation of UNHCR’s shelter grant programme for returning displaced people in northern Sri Lanka (Evaluation Report PDES/2010/04, pp. 64). Geneva: United Nations High Commissioner for Refugees.

Crisp, J., Garras, G., McAvoy, J., Schenkenberg, E., Spiegel, P., & Voon, F. (2013). From slow boil to breaking point: A real-time evaluation of UNHCR’s response to the Syrian refugee emergency (Evaluation Report PDES/2013/10, pp. 25). Geneva: United Nations High Commissioner for Refugees.

CRS. (2010). Real Time Evaluation of CRS’ Flood Response in Pakistan Jacobabad and Kashmore, Sindh (pp. 31). Islamabad: CRS Pakistan.

Darcy, J. (2012). DEC Real Time Evaluation (RTE) – East Africa Crisis Appeal: Synthesis Report (pp. 5). Oxford: Valid International.

Darcy, J., Asmare, E., Banda, T., & Clifton, D. (2012a). Disasters Emergency Committee – East Africa Crisis Appeal: Ethiopia Real-Time Evaluation Report (pp. 41). Oxford: Valid International.

Darcy, J., Bonard, P., & Dini, S. (2012b). IASC Real-Time Evaluation of the Humanitarian Response to the Horn of Africa Drought Crisis: Somalia 2011-2012 (pp. 91). Oxford: Valid International.

Darcy, J., Chazally, C., Clifton, D., Kamungi, P., & Kamau, M. (2012c). Disasters Emergency Committee – East Africa Crisis Appeal: Kenya Real-Time Evaluation Report (pp. 49). Oxford: Valid International.

Darcy, J. (2013). DEC Syria Crisis Appeal 2013: Response Review: Final Report (pp. 44). London: Disasters Emergency Committee (DEC).

de Klerk, T. (2004). Evaluation Report: The Income Generation Program of American Refugee Committee for Liberian Refugees in the Forest region of Guinea (pp. 46). Monrovia: American Refugee Committee.

Devereux, S., Mvula, P., & Solomon, C. (2006). After the FACT: An Evaluation of Concern Worldwide’s Food and Cash Transfers Project in Three Districts of Malawi, 2006 (pp. 87). Dublin: Concern Worldwide.

Diagne, K., Savage, E., & Kiragu, E. (2007). Real-time evaluation of UNHCR's IDP operation in Eastern Chad (Evaluation Report PDES/2007/02 - RTE 1, pp. 12). Geneva: United Nations High Commissioner for Refugees.

Djordjevic, J., Giesen, P., & Parfenov, E. (2010). Real-time Evaluation of the response to the Typhoons Ketsana and Parma in the Philippines and the West Sumatra Earthquake in Indonesia in 2009 (pp. 29). Geneva: International Federation of Red Cross and Red Crescent Societies.

Dolphin, H., Abderahamane, B., & Coly, P. S. (2010). Real Time Evaluation: Project ADVANCE Niger (pp. 62). Accra: CRS West Africa Regional Office.

Dolphin, H., Lokoun, G., Carrena, T., Bessan, R., Ahounde, Q., Quenom, H., Madode, F. H., & Wei, E. (2011). Real Time Evaluation: Project HHELP and SAVE2 Benin (pp. 88). Accra: CRS West Africa Regional Office.

Dolphin, H. (2012). Maximizing the Value of “Cash for Work”: Lessons from a Niger land recuperation project: CRS EARLI (pp. 16). Baltimore: CRS.

Duncalf, J., Greenhalgh, L., Mohammed, H., Marroni, M., & Maina, B. (2012). IASC Real Time Evaluation (IASC RTE) of the Humanitarian Response to the Horn of Africa Drought Crisis - Kenya (pp. 66). New York: Inter-Agency Standing Committee.

Duncan, J. (2013). A real-time evaluation of ACF International's response to Typhoon Haiyan/Yolanda in the Philippines (pp. 49). London: Action Against Hunger.

Essack-Kauaria, R., Torbjornsen, A., & Daudrumez, A. (2002). Southern Africa Food Security Operation - Real Time Evaluation Report (pp. 31). Geneva: IFRC.

ETC UK. (1999). Real Time Evaluation of the Humanitarian Response to the Crisis in Kosova: March to May 1999: Main report (pp. 42). North Shields: ETC UK.

FAO. (2007). Real Time Evaluation of the FAO Emergency and Rehabilitation Operations in Response to the Indian Ocean Earthquake and Tsunami (pp. 112): FAO.

Fearon, J., Humphreys, M., & Weinstein, J. (2008). Community‐Driven Reconstruction in Lofa County: Impact Assessment (pp. 47). New York: International Rescue Committee.

Featherstone, A., Hart, K., Thurstans, S., Khen, S. I., Durnian, T., Lwin, U. K. M., Porteaud, D., Shahi, D., Goby, B., & Chaimontree, A. (2009). Evaluation of the Save the Children Response to Cyclone Nargis (pp. 74). Yangon: Save the Children.

Feinstein, O., & Beck, T. (2006). Evaluation of Development Interventions and Humanitarian Action. In I. F. Shaw, J. C. Greene & M. M. Mark (Eds.), The Sage Handbook of Evaluation: Policies, Programs, and Practices (pp. 236-258). London: Sage.

Fisher, M., Bhattacharjee, A., Saenz, J., & Schimmelpfennig, S. (2010). The Haiti Earthquake Operation Real Time Evaluation for the International Federation of Red Cross and Red Crescent Societies (pp. 69). Geneva: International Federation of Red Cross and Red Crescent Societies.

Fleischer, D. N., & Christie, C. A. (2009). Evaluation Use: Results From a Survey of U.S. American Evaluation Association Members. American Journal of Evaluation, 30(2), 158-175.

Fortune, V., & Rasal, P. (2010). British Red Cross - Mass Sanitation Module - 2010 Haiti Earthquake Response - Post Deployment Learning Evaluation (pp. 52). London: British Red Cross Society.

General Accounting Office, & Datta, L.-E. (1990). Prospective Evaluation Methods: The Prospective Evaluation Synthesis (Methodology Transfer Paper PEMD-10.1.10, pp. 119). Washington: Program Evaluation and Methodology Division of the United States General Accounting Office.

Good Humanitarian Donorship. (2003). Principles and Good Practice of Humanitarian Donorship (Vol. 2006). Stockholm: Germany, Australia, Belgium, Canada, the European Commission, Denmark, the United States, Finland, France, Ireland, Japan, Luxembourg, Norway, the Netherlands, the United Kingdom, Sweden and Switzerland.


Goyder, H., Girerd-Barclay, E., Jones, A., Lefevre, J., & Larmoyer, A. (2005). Full Report of the ‘Real Time’ Evaluation of WFP’s Response to the Indian Ocean Tsunami: A Report from the Office of Evaluation (pp. 143). Rome: World Food Programme.

Goyder, H. (2010). Real Time Evaluation of Tearfund’s Haiti Earthquake Response (pp. 22). Teddington: Tearfund.

Greenhalgh, L., Bamforth, T., Neudorf, G., & Siddiqui, A. (2014). Real‐time evaluation of the Philippines Haiyan response (pp. 70). Geneva: International Federation of Red Cross and Red Crescent Societies.

Grünewald, F. (2005). FAO’s Real Time Evaluation in the Tsunami Affected Countries: First Lessons Learned (pp. 4): FAO.

Grünewald, F., Robins, K., Nicholson, N., & Adico, A. (2006a). Somalia: Real Time Evaluation of the 2006 Emergency Response (mistitled Real Time Evolution) (pp. 47). New York: UNICEF.

Grünewald, F., Robins, K., Odicoh, A., & Nicholson, N. (2006b). Kenya RTE Mission 02/10-13/10/2006 (pp. 47). New York: UNICEF.

Grünewald, F., Robins, K., Odicoh, A., Woldemariam, M., Nicholson, N., & Teshome, A. (2006c). Real Time Evaluation of the Drought Response in the Horn of Africa (13/08/2006 – 20/10/2006: Regional Synthesis (pp. 21). New York: UNICEF.

Grünewald, F., Robins, K., Woldemariam, M., Nicholson, N., & Teshome, A. (2006d). Ethiopia: Real Time Evaluation of the 2006 Emergency Response (31/08/2006 to 13/09/2006) (pp. 51). New York: UNICEF.

Grünewald, F., Binder, A., & Georges, Y. (2010). Inter‐agency real‐time evaluation in Haiti: 3 months after the earthquake (pp. 93). La Fontaine des Marins: Groupe URD.

Grünewald, F., & Renaudin, B. (2010). Real-time evaluation of the response to the Haiti earthquake of 12 January 2010: Mission: 9-23 February 2010: Mission Report (pp. 73). La Fontaine des Marins: Groupe URD.

Grünewald, F., Kauffmann, D., Boyer, B., & Patinet, J. (2011). Real-time evaluation of humanitarian action supported by DG ECHO in Haiti: 2009-2011: November 2010 -April 2011 (pp. 72). Brussels: DG ECHO.

Grünewald, F., & Boyer, B. (2013). Lessons Learned on Typhoons in the Philippines (Metro Manila, Cagayan de Oro and Iligan) (pp. 30). La Fontaine des Marins: Groupe URD.

Hagens, C. (2009). Real-Time Evaluation Report for the CRS Pakistan Response in the Swat Valley (pp. 32). Islamabad: CRS Pakistan.

Hagens, C., & Ishida, L. (2010). Real Time Evaluation of CRS’s Flood Response in Pakistan (KPK and Baluchistan) (pp. 33). Baltimore: Catholic Relief Services.

Hallam, A. (1998). Evaluating Humanitarian Assistance Programmes in Complex Emergencies (RRN Good Practice Review 7, pp. 128). London: Overseas Development Institute.

HAP. (2007). Quality Initiative Reference. Geneva: Humanitarian Accountability Partnership.

Harvey, P., Stoddard, A., Harmer, A., Taylor, G., DiDomenico, V., & Brander, L. (2010). State of the humanitarian system: assessing performance and progress: a pilot study (pp. 76). London: Active Learning Network for Accountability and Performance in Humanitarian Action.

Hedlund, K., & Clarke, P. K. (2011). Humanitarian action in drought-related emergencies (pp. 31). London: ALNAP.

Herson, M. (2005). Asia earthquake and tsunamis: Real Time Evaluation, First Round: Synthesis Report (pp. 57). Geneva: IFRC.

Hidalgo, S., & Théodate, M. P. (2012). Inter-Agency Real-Time Evaluation of the Humanitarian Response to the Earthquake in Haiti (20 months after) (pp. 107). New York: Inter-Agency Standing Committee.


Hoogendoorn, A., & Chisvo, M. (2008). Real-time Evaluation of DG ECHO financed action of CARE International Deutschland in Zimbabwe (pp. 42). Freiburg: Particip.

IA RTE Steering Group. (2011). Inter-Agency Real Time Evaluation (IA RTE) of Emergency Humanitarian Operations: Procedures and Methodologies (pp. 44). New York: Office for the Coordination of Humanitarian Affairs.

IASC. (2006). Real-Time Evaluation: Cluster Approach - Pakistan Earthquake: Application of the IASC Cluster Approach in the South Asia Earthquake: Final Draft (pp. 30). Islamabad: Inter-Agency Standing Committee.

IASC. (2010). Response to the Humanitarian Crisis in Haiti following the 12 January 2010 Earthquake: Achievements, Challenges and Lessons to be Learned (pp. 44). Geneva: IASC.

IASC. (2012a). Inter-Agency Standing Committee Transformative Agenda Reference Document: 5. Responding to Level 3 Emergencies: The Humanitarian Programme Cycle (pp. 6). New York: Inter-Agency Standing Committee.

IASC. (2012b). IASC Transformative Agenda - 2012 (pp. 12). Geneva: Inter-Agency Standing Committee.

IASC. (2014a). Central African Republic: Operational Peer Review – Final Terms of Reference (pp. 2). New York: Inter-Agency Standing Committee.

IASC. (2014b). South Sudan: Operational Peer Review Terms of Reference (pp. 2). New York: Inter-Agency Standing Committee.

IASC. (2014c). Philippines – Response to Typhoon Haiyan: Operational Peer Review - Terms of Reference (pp. 3). New York: Inter-Agency Standing Committee.

IASC. (2014d). Operational Peer Review: Response to Typhoon Haiyan In The Philippines: Summary Of The January 2014 Findings (pp. 2). New York: Inter-Agency Standing Committee.

IFRC. (2007). Mid Term Review – Red Cross and Red Crescent Movement Tsunami Recovery Program - Housing Projects - Matara District (pp. 64). Geneva: International Federation of Red Cross and Red Crescent Societies.

Jamal, A. (2001). The Sudan/Eritrea emergency: May - July 2000: An evaluation of UNHCR’s response (Evaluation Report EPAU/2001/03, pp. 38). Geneva: United Nations High Commissioner for Refugees.

Jamal, A., & Crisp, J. (2002). Real-time humanitarian evaluations: some frequently asked questions (EPAU /2002/05). Geneva: United Nations High Commissioner for Refugees, Evaluation and Policy Unit.

Jamal, A., & Stigter, E. (2002, 31 May). Real-time evaluation of UNHCR's response to the Afghanistan emergency: Bulletin No. 3. Retrieved 24 February, 2013, from http://reliefweb.int/report/afghanistan/real-time-evaluation-unhcrs-response-afghanistan-emergency-bulletin-no-3

Johnson, A. (2011). CRS Haiti Real Time Evaluation of the 2010 Earthquake Response: Findings, Recommendations, and Suggested Follow Up (pp. 21). Port-au-Prince: Catholic Relief Services.

Keen, D., & Ryle, J. (1996). Editorial: The Fate of Information in the Disaster Zone. Disasters, 20(3), 169-172.

Ketel, H., Bhat, M., Fernando, U., Marut, D., & Louwes, K. (2005). Real-Time Evaluation of ACT International Tsunami Disaster Programs Appeal - Asia Earthquake & Tsunamis - ASRE51 (pp. 67). Geneva: ACT International.

Khan, H. A., Young, R., & Walden, V. (2011). Real Time Evaluation: Somalia Drought Response (pp. 34). Oxford: Oxfam.

Khogali, H., Kane, S., & Bang, T. (2011). Real Time Evaluation of the International Federation of Red Cross and Red Crescent Societies’ Response to the MENA Civil Unrest (pp. 48). Geneva: International Federation of Red Cross and Red Crescent Societies.

Krueger, R., & Casey, M. (2009). Focus groups: A practical guide for applied research (4th ed.). Thousand Oaks: Sage.

Krueger, S., & Sagmeister, E. (2012). Real-Time Evaluation of Humanitarian Assistance Revisited: Lessons Learned and the Way Forward. Paper presented at the European Evaluation Society 10th Biennial Conference, Helsinki, 3 October.

Krueger, S., & Sagmeister, E. (2014). Real-Time Evaluation of Humanitarian Assistance Revisited: Lessons Learned and the Way Forward. Journal of Multidisciplinary Evaluation, 10(23), 59-72.

Lee, A. C. (2005). Final Report: Real Time Evaluation of Medair's 'Tsunami Emergency Response' Programme in Sri Lanka: Field visit May 29 - June 9, 2005. (pp. 46). Ecublens: MedAir.

Leonardi, E., & Arqués, R. S. (2013). Real Time Evaluation of UNICEF’s response to the Mali crisis: Final report (pp. 104). New York: UNICEF.

Letch, J. L. (2013). Theory and practice in the real-time evaluation of humanitarian emergency response. The University of Melbourne, Melbourne.

Lincoln, P., Brander, M., Hardcastle, P., & Tipper, R. (2013). Real-Time Evaluation of Norway’s International Climate and Forest Initiative: Contribution to Measurement, Reporting and Verification (Evaluation Report, pp. 136). Oslo: Norwegian Agency for Development Cooperation.

Ling, T. (2012). Evaluating complex and unfolding interventions in real time. Evaluation, 18(1), 79-91.

Litovsky, A., Knight, A., MacCarthy, S., Mancini, A., & Raynard, P. (2006). A BOND Approach to Quality in Non-Governmental Organisations: Putting Beneficiaries First: A report by Keystone and AccountAbility for the British Overseas NGOs for Development (BOND) (pp. 82). London: BOND.

Lloyd, R., & Casas, L. d. l. (2006). NGO self-regulation: enforcing and balancing accountability (pp. 8). London: One World Trust.

Loquercio, D., Hammersley, M., & Emmens, B. (2006). Understanding and addressing staff turnover in humanitarian agencies (Humanitarian Practice Network: Network Paper 7, pp. 36). London: Overseas Development Institute.

Loquercio, D., & Mubayiwa, R. (2007). Real-time Evaluation Pakistan flood response (June - August 2007) (pp. 26). Oxford: Oxfam.

McNamara, D. (2006). Humanitarian reform and new institutional responses. Forced Migration Review: Special Issue: Putting IDPs on the map: achievements and challenges: in commemoration of the work of Roberta Cohen, 9-11.

Melville, A., & Scarlet, F. (2003). Psychosocial Interventions: Evaluation of UNICEF supported projects (1999-2001) (pp. 92). New York: UNICEF.

Murphy, O., Thomas, S., Lamb, J., & Zaman, Z. (2011). Real Time Evaluation of the Kenya Drought Response (pp. 43). Oxford: Oxfam.

Murtaza, N., Akhtar, S., Mohammed, N., Harrison, S., & Bhatti, S. (2011). Pakistan Floods 2010: The DEC Real-Time Evaluation Report (pp. 36). Islamabad: ThinkAhead.

O'Donnell, I., Smart, K., & Ramalingam, B. (2009). Responding to urban disasters: Learning from previous relief and recovery operations (pp. 31). London: Active Learning Network for Accountability and Performance in Humanitarian Action and Provention.

OCHA. (2014). Biennial Evaluation Report: 2011-2012 (pp. 16). New York: Office for the Coordination of Humanitarian Affairs.


OECD-DAC. (2014). Peer Review of the Evaluation Function of the UN World Food Programme (2008-2013) (Vol. I and II, pp. 68 + 63). Paris: Development Assistance Committee of the Organisation for Economic Cooperation and Development.

Olsen, G. R., Carstensen, N., & Høyen, K. (2003). Humanitarian Crises: What Determines the Level of Emergency Assistance? Media Coverage, Donor Interests and the Aid Business. Disasters, 27(2), 109-126.

Parker, R., Little, K., & Heuser, S. (2007). Development Actions and the Rising Incidence of Disasters (Evaluation Brief 4). Washington: World Bank.

Patton, M. Q. (2008). Utilization-focused evaluation (4th ed.). Thousand Oaks: Sage.

Patton, M. Q. (2011). Developmental Evaluation: Applying Complexity Concepts to Enhance Innovation and Use. New York: The Guilford Press.

Perry, B., Husu-Kallio, J., Magnusson, U., Sims, L., Camus, E., Hargreaves, S., Ellis, T., Bruckner, G., Mbugua, H., Grace, D., Kapur, M. S., Isa, K. M., & Sones, K. (2010). Second real time evaluation of FAO’s work on the highly pathogenic avian influenza: Evaluation report (pp. 70). Rome: Food and Agricultural Organisation.

Polastro, R., Roa, B., & Steen, N. (2010). Inter-Agency Real Time Evaluation (IA-RTE) of the Humanitarian Response to Typhoons Ketsana and Parma in the Philippines (pp. 89). Madrid: DARA.

Polastro, R. (2011). Real Time Evaluations: contributing to system-wide learning and accountability. Humanitarian Exchange(52), 10-13.

Polastro, R., Khalif, M. A., Eyben, M. N. v., Posada, S., Salah, A. S. M., Steen, N., & Toft, E. (2011a). IASC Evaluation of the Humanitarian Response in South Central Somalia 2005-2010 (pp. 68). Madrid: DARA.

Polastro, R., Nagrah, A., Steen, N., & Zafar, F. (2011b). Inter‐Agency Real Time Evaluation of the Humanitarian Response to Pakistan’s 2010 Flood Crisis (pp. 140). Madrid: DARA.

Polastro, R. (2014). Evaluating Humanitarian Action in Real Time: Recent Practices, Challenges, and Innovations. Canadian Journal of Program Evaluation, 29(1), 118-134.

Poole, L., Walmsley, L., Malerba, D., Sparks, D., Sweeney, H., Smith, K., Stoianova, V., Delgado, A., Stirk, C., Brereton, G., Randel, J., & Coppard, D. (2012). Global Humanitarian Assistance Report 2012 (pp. 108). Wells: Development Initiatives.

Prideaux-Brune, J., & Scott, I. (2006). Real Time Evaluation of Oxfam GB’s response to the Java Earthquake of 27th May 2006 (pp. 17). Oxford: Oxfam.

Ricafort, R. E. (2014). Typhoon Yolanda: humanitarian response: Real time evaluation+ (pp. 45). Manila: ActionAid International Philippines.

Rogers, P. J. (2008). Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation, 14(1), 29-48.

Sahla, S. (2013). Between fear and Compassion: How refugee concerns shape responses to humanitarian emergencies – The case of Germany and Kosovo (LSE International Development Working Paper Series 2013 13-140, pp. 51): Development Studies Institute of the London School of Economics.

Sandison, P. (2003). Desk review of real-time evaluation experience (Evaluation Working Paper). New York: UNICEF.

Sandison, P. (2007). The utilisation of evaluations. In ALNAP Review of Humanitarian Action: Evaluation Utilisation (pp. 89-144). London: Active Learning Network on Accountability and Performance in Humanitarian Action.

Sandison, R., & Khan, V. (2011). Real Time Evaluation Report: Plan: Pakistan Emergency and Early Recovery Flood Response (pp. 75). Woking: Plan International.


Sandström, S., & Tchatchua, L. (2010). Do cash transfers improve food security in emergencies? Evidence from Sri Lanka. In S. W. Omamo, U. Gentilini & S. Sandström (Eds.), Revolution: From Food Aid to Food Assistance: Innovations in Overcoming Hunger (pp. 426). Rome: World Food Programme.

Savage, E., Wright, N., & Kiragu, E. (2007). Real time evaluation of UNHCR's IDP operation in Somalia (Evaluation Report PDES/2007/02 - RTE 4, pp. 28). Geneva: United Nations High Commissioner for Refugees.

Scriven, M. (1991). Evaluation thesaurus (4th ed.). Newbury Park: Sage.

Shallon, D., Hall, D., Wilsmore, A., Ayoub, J., Badawy, E. S., Pym, R., Valarcher, J.-F., & Wilks, C. (2007). The First Real Time Evaluation of FAO’s Work on Highly Pathogenic Avian Influenza (pp. 87). Rome: Food and Agricultural Organisation.

Sida, L., Gray, B., & Asmare, E. (2012). IASC Real Time Evaluation (IASC RTE) of the Humanitarian Response to the Horn of Africa Drought Crisis - Ethiopia (pp. 57). New York: Inter-Agency Standing Committee.

Simas, V., Martin, J., & Humphrey, E. (1960). Determination of the internal temperature in satellite 1959 Alpha (Vanguard 2) (NASA Technical Note D-357, pp. 21). Greenbelt: Goddard Space Flight Center.

Simpson, R., Legesse, N. B., & Mubayiwa, R. (2009). Real Time Evaluation of the Cholera Response in Zimbabwe (09 February – 19 February 2009) (pp. 27). Oxford: Oxfam.

Simpson, R., Legesse, N. B., Phelps, L., & Modino, C. (2011). Real Time Evaluation: Ethiopia Drought Response: 02 September – 19th September 2011 (pp. 30). Oxford: Oxfam.

Simpson, R., Sutton, T., & Ngwenya, D. (2013). Evaluation of the Gaza Flood Response, Mozambique 2013 (20th - 24th May 2013) (pp. 20). Oxford: Oxfam.

Slim, H., & Bonwick, A. (2005). Protection: An ALNAP guide for humanitarian agencies. London: ALNAP.

Slim, H. (2012). IASC Real-Time Evaluation of the Humanitarian Response to the Horn of Africa Drought Crisis in Somalia, Ethiopia and Kenya: Synthesis Report (pp. 18). New York: Inter-Agency Standing Committee.

Slootweg, A., Gould, C., Konaté, S., & Himeur, Y. (2010). Mid Real-time Evaluation of Oxfam International’s response to the food crisis in Niger (pp. 43). Oxford: Oxfam International.

Smith, G. C. S., & Pell, J. P. (2003). Parachute use to prevent death and major trauma related to gravitational challenge: systematic review of randomised controlled trials. BMJ, 327(7429), 1459-1461.

Sperl, S., Diagne, K., & Snider, D. (2006). Real-time evaluation of UNHCR’s response to the emergency in Lebanon and Syria, July - September 2006 (Evaluation Report PDES/2006/RTE1, pp. 27). Geneva: United Nations High Commissioner for Refugees.

Springett, S., & Creti, P. (2006). Typhoon Durian Humanitarian Response Programme: Real Time Evaluation (15th to 20th January 2006) (pp. 29). Oxford: Oxfam.

Sridhar, D., & Duffield, A. (2006). A review of the impact of cash transfer programmes on child nutritional status and some implications for Save the Children UK programmes (pp. 27). London: Save the Children.

Steets, J., & Dubai, K. (2010). Real-Time Evaluation of UNICEF's Response to the Sa'ada Conflict in Northern Yemen (pp. 87). Berlin and Sana'a: Global Public Policy Institute.

Stigter, E., & Crisp, J. (2001, 7 November). Real-time evaluation of UNHCR's response to the Afghanistan emergency: Bulletin No. 1. Retrieved 24 February, 2013, from http://reliefweb.int/report/afghanistan/real-time-evaluation-unhcrs-response-afghanistan-emergency-bulletin-no-1

Stockholm International Peace Research Institute. (2013). SIPRI Yearbook 2013: Armaments, Disarmament and International Security: Summary. Stockholm: Stockholm International Peace Research Institute.

Styger, E. (2008). Douentza Circle in Crisis: Improving Household Resiliency to Food Security Shocks in Mali: January 2006 – December 2007: Final Project Evaluation Report (pp. 84). Bamako: Catholic Relief Services.

Swithern, S., Latimer, C., Sardiwal, D., Smith, K., Sparks, D., Stirk, C., Okwaroh, K., Rono, K., Beecher, J., Dalrymple, S., Ifan, G., Knox, D., Langdon, C., Osborne, A., Strawson, T., Walton, D., Claydon, J., Coppard, D., & Poole, L. (2014). Global Humanitarian Assistance Report 2014 (pp. 150). Bath: Development Initiatives.

Taylor, G., Stoddard, A., Harmer, A., Haver, K., Harvey, P., Barber, K., Schreter, L., & Wilhelm, C. (2012). The State of the Humanitarian System: 2012 Edition (pp. 104). London: ALNAP.

Telford, J., Cosgrave, J., & Houghton, R. (2006). Joint Evaluation of the international response to the Indian Ocean tsunami: Synthesis Report (pp. 178). London: Tsunami Evaluation Coalition.

Telford, J. (2009). Review of Joint Evaluations and the Future of Inter Agency Evaluations: on behalf of the Inter Agency Standing Committee (IASC) Real Time Evaluation Interest Group (pp. 38). New York: Office for the Coordination of Humanitarian Affairs.

Ternstrom, B., Yamato, M., Myint, S., & Lwin, U. K. M. (2008). Evaluation of CARE Myanmar’s Cyclone Nargis Response (pp. 58). Yangon: CARE.

Thukral, K., Vijayaraghavan, M., Venkata, S. P., Guevara, L. N., & Fortu, M. (2014). Real-Time Evaluation of ADB’s Initiatives to Support Access to Climate Finance (Thematic Evaluation Study, pp. 110). Manila: Asian Development Bank.

Tindal, V. (2012). 2011 Horn & East Africa Drought Response - Real Time Evaluation (pp. 33). London: Catholic Agency for Overseas Development (CAFOD).

Tinnemans, K., Rowley, C., Ansari, A., & Blackwell, H. (2009). Real Time Evaluation: East Asia Region: Typhoon Ketsana/Ondoy and West Sumatra Earthquake (pp. 11). Oxford: Oxfam.

Tipper, R., Berry, N., Camargo, M., Davenport, D., Dutschke, M., Helland, J., Lincoln, P., Low, R., Makundi, W., Pedroni, L., & Strønen, I. Å. (2011). Real-Time Evaluation of Norway’s International Climate and Forest Initiative: Contributions to a Global REDD+ Regime 2007-2010 (Evaluation Report 12/2010, pp. 108). Oslo: Norad.

Turner, R., Baker, J., Oo, Z. M., & Aye, N. S. (2008). Inter-Agency Real Time Evaluation of the Response to Cyclone Nargis: 17 December 2008 (pp. 59). Geneva and New York: Inter-Agency Standing Committee.

UNEG. (2005a). Norms for Evaluation in the UN System (pp. 11). New York: United Nations Evaluation Group.

UNEG. (2005b). Standards for Evaluation in the UN System (pp. 25). New York: United Nations Evaluation Group.

UNEG. (2008a). UNEG Code of Conduct for Evaluation in the UN System (pp. 6). New York: United Nations Evaluation Group.

UNEG. (2008b). UNEG Ethical Guidelines for Evaluation (pp. 14). New York: United Nations Evaluation Group.

UNEG. (2010a). UNEG Quality Checklist for Evaluation Reports (pp. 6). New York: United Nations Evaluation Group.

UNEG. (2010b). UNEG Good Practice Guidelines for Follow up to Evaluations (pp. 13). New York: United Nations Evaluation Group.

UNEG. (2010c). UNEG Quality Checklist for Evaluation Terms of Reference and Inception Reports (pp. 5). New York: United Nations Evaluation Group.

UNHCR. (2001, 6 December). Real-time evaluation of UNHCR's response to the Afghanistan emergency: Bulletin No. 2. Retrieved 24 February, 2013, from http://reliefweb.int/report/afghanistan/real-time-evaluation-unhcrs-response-afghanistan-emergency-bulletin-no-2

Vaux, T., Bhatt, M., Disaster Mitigation Institute, Bhattacharjee, A., Lipner, M., McCluskey, J., Naik, A., Stevenson, F., Muse, I. A., Rawal, V., Routley, S., Silva, K. T., & Wiles, P. (2005). Independent evaluation of the DEC tsunami crisis response: Final Report: November 2005 (Suppressed by the DEC but leaked by BBC Newsnight) (pp. 58). London: Disasters Emergency Committee.

Walden, V. M., Emsley, R., & Beesley, J. (2006). Real Time Evaluation Report – Lebanon Crisis Response (pp. 30). Oxford: Oxfam.

Walden, V. M. (2008). Evaluation of Oxfam’s Response to Hurricane Dean in Three Countries of the ESC (pp. 29). Oxford: Oxfam.

Walden, V. M., Briere, J.-F., & Kumar, P. (2008). Real Time Evaluation of Oxfam International’s response to Cyclone Sidr: November 2007- January 2008: Final Report (pp. 30). Oxford: Oxfam.

Wiles, P., Bradbury, M., Buchanan-Smith, M., Collins, S., Cosgrave, J., Hallam, A., Mece, M., Norman, N., Prodanovic, A., Shackman, J., & Watson, F. (2000). Independent Evaluation of Expenditure of DEC Kosovo Appeal Funds: Phases I and II, April 1999-January 2000, Volumes I, II & III (pp. 192, 151, 108). London: Disasters Emergency Committee.

Wright, N., Savage, E., & Tennant, V. (2007). Real-time evaluation of UNHCR's IDP operation in Liberia (Evaluation Report PDES/2007/02 - RTE 2, pp. 20). Geneva: United Nations High Commissioner for Refugees.

Young, N., Khattak, S. G., Bengali, K., & Elmi, L. (2007). IASC inter-agency real time evaluation of the Pakistan floods/Cyclone Yemyin: Final version (pp. 99). New York: Inter-Agency Standing Committee.

Zetter, R., Ager, A., Davey, E., Ferris, E., Fiddian-Qasmiyeh, E., Hammond, L., Haysom, S., Horst, C., Long, C., Martin, S., Pantuliano, S., Poole, L., Stoddard, A., & Willitts-King, B. (2012). World Disasters Report 2012: Focus on forced migration and displacement (pp. 310). Geneva: International Federation of Red Cross and Red Crescent Societies.