Debate on the Appropriate Methods for Conducting Impact Evaluation of Programs within the Development Context


Enyonam B. Norgbey
https://orcid.org/0000-0002-9115-531X

Abstract

Background: Donors and decision-makers use impact evaluation reports to assess the effectiveness of development programs and identify ways to improve the design and implementation of projects, programs, and policies in developing countries.


Purpose: This paper explores the published impact evaluation literature on development programs and provides an overview of the approaches and methods used to conduct impact evaluations.


Setting: Not applicable.


Intervention: Not applicable.


Research Design: The paper examines published program evaluation literature to shed light on issues related to appropriate methods for impact evaluation of development programs.


Data Collection and Analysis: Literature review.


Findings: The paper concludes by suggesting a list of approaches and methods that can be used to conduct impact evaluations of programs within the development context.


Article Details

How to Cite
Norgbey, E. B. (2016). Debate on the Appropriate Methods for Conducting Impact Evaluation of Programs within the Development Context. Journal of MultiDisciplinary Evaluation, 12(27), 58–66. https://doi.org/10.56645/jmde.v12i27.454
Section
Research on Evaluation Articles
Author Biography

Enyonam B. Norgbey, University of Ottawa

I am currently a doctoral candidate in the Faculty of Education at the University of Ottawa. My area of concentration is Leadership, Evaluation, Curriculum, and Policy. I earned a Master's degree in Public Administration from Western Michigan University and a second Master's degree in Educational Systems Development from Michigan State University. Prior to my admission to the doctoral program, I worked for many years as a public school teacher in Lansing, Michigan, before lecturing at the United States International University in Nairobi, Kenya, where I also served as Head of Institutional Research; for a couple of years I was also a consultant for UNDP. My most recent assignments were with the International Livestock Research Institute in Nairobi, Kenya; the University of Ottawa; and the Association of African Universities in Accra, Ghana, where I managed research grants.
