Learning Lessons for Evaluating Complexity Across the Nexus: A Meta-Evaluation of Environmental Projects

William Robert Sheate
https://orcid.org/0000-0001-8413-7458
Clare Twigger-Ross
https://orcid.org/0000-0001-5266-0770
Liza Papadopoulou
Rolands Sadauskis
Owen White
Paula Orr
Richard Eales
https://orcid.org/0000-0002-6752-9924

Abstract

Background: A major gap in environmental policy making lies in learning lessons from past interventions and integrating the findings of evaluations that have already been undertaken. Institutional memory of such evaluations often resides outside government, with the evaluation practitioner contractors who undertake commissioned evaluations on behalf of government departments.


Purpose: The aims were to learn lessons from past policy evaluations, to understand the barriers and enablers to successful evaluations, to explore the value of the different approaches and methods used for evaluating complexity, and to examine how evaluations were used in practice.


Setting: A meta-evaluation of 23 environmental evaluations previously conducted by Collingwood Environmental Planning Ltd (CEP), London, UK, was undertaken by CEP staff under the auspices of CECAN (the Centre for Evaluation of Complexity Across the Nexus, a UK Research Councils funded centre coordinated by the University of Surrey, UK). The research covered water, environment and climate change nexus issues, including evaluations of flood risk, biodiversity, landscape, land use, climate change, catchment management, community resilience, bioenergy, and European Union (EU) Directives.


Intervention: Not applicable.


Research design: A multiple embedded case study design was adopted, selecting 23 CEP evaluation cases from across a 10-year period (2006-2016). Four overarching research questions were posed for the meta-evaluation and formed the basis for more specific evaluation questions, which were answered using documented project final reports supplemented by interviews with CEP project managers. Thematic analysis was used to draw out common themes across the case categories.


Findings: Policy context invariably framed the complex evaluations; as responsibility for environmental policy has spread beyond government to encompass multiple stakeholders, policy around nexus issues was often found to be in a state of constant flux. Furthermore, an explicit theory of change was often first elaborated only as part of the evaluation process, long after the policy intervention had been initiated. A better understanding of the policy context, its state of flux or stability, and clarity about the policy intervention's objectives (and theory of change) could help significantly in designing policy evaluations that deliver real value for policy makers. Evaluations have other valuable uses beyond immediate instrumental use in revising policy, and they can be tailored to maximise those values where such potential impact is recognised. We suggest a series of questions that practitioners and commissioners could usefully ask themselves when starting out on a new complex policy evaluation.


How to Cite
Sheate, W. R., Twigger-Ross, C., Papadopoulou, L., Sadauskis, R., White, O., Orr, P., & Eales, R. (2020). Learning Lessons for Evaluating Complexity Across the Nexus: A Meta-Evaluation of Environmental Projects. Journal of MultiDisciplinary Evaluation, 16(37), 1–19. https://doi.org/10.56645/jmde.v16i37.641
Section
Research on Evaluation Articles