Real-Time Evaluation of Humanitarian Assistance Revisited: Lessons Learned and the Way Forward

Susanna Krueger
Elias Sagmeister

Abstract

Background: The real-time evaluation (RTE) approach has been applied in humanitarian assistance for two decades. Its spread appears to be as much the result of entrepreneurial evaluators seeking to demonstrate expertise to potential clients as a response to actual demand from decision makers for immediate feedback on results.


Purpose: As RTE has recently come under scrutiny, this study demystifies the concept and looks beyond textbook descriptions of its advantages in order to understand its practical application in humanitarian action.


Setting: NA


Intervention: NA


Research Design: NA


Data Collection and Analysis: NA


Findings: The study suggests lessons for improving the application of real-time evaluation and related concepts in practice.



Article Details

How to Cite
Krueger, S., & Sagmeister, E. (2014). Real-Time Evaluation of Humanitarian Assistance Revisited: Lessons Learned and the Way Forward. Journal of MultiDisciplinary Evaluation, 10(23), 59–72. https://doi.org/10.56645/jmde.v10i23.380
Section
Research on Evaluation Articles
