Moving from Studies to Streams: A More Radical Way to Avoid Floods of Evidence by Channeling Evaluative Efforts in Conditions of Complexity
Abstract
Background: Multiple evaluations of complex problems risk creating ‘floods’ of evidence that are hard for decision makers to make sense of. Using the metaphor of ‘canalizing’ evidence to create more manageable streams of evidence, it is argued that the evaluation community can create more usable knowledge to support decision makers and strengthen accountability. ‘Downstream’ systematic reviews and clearing houses are not enough to address this issue. The problems arise ‘upstream’ from multiple, weakly connected evaluations being conducted in relative isolation, rather than focusing evaluative efforts on the key challenges facing complex policy spaces.
Purpose: It is argued that evaluation policies need to move ‘upstream’ and be consciously designed both to build on existing evaluations and to take the needs of decision makers more fully into account. This cannot be achieved by ‘one-off’ improvements to individual evaluations; it requires a coherent response that would support effective and meaningful retroductive realist syntheses, revealing the patterns found in evaluations of heterogeneous interventions.
Setting: Evaluations are almost never at the heart of policy debates today, and we should at least be curious about why this is the case. It is argued that evaluative efforts are poorly used and unfocused, and require ‘canalization’ to replace floods of evidence with coherent sense-making.
Intervention: Not relevant
Research design: Review and reflection
Data collection: Not relevant
Analysis and findings: Urgent action is required across the community of evaluators, commissioners of studies, and users of evaluative evidence to make better use of the resources dedicated to evaluation. Streams of evidence should be curated to promote learning and improvement and to involve multiple stakeholders, in ways that require the evaluation community to take stock of its competencies and organisational forms. Whether this takes us towards a single answer that identifies ‘best practice’, or towards heterogeneous insights that identify more ephemeral ‘good practices’, is for the evidence to decide. How this evidence is channelled will be critical.
Copyright and Permissions
Authors retain full copyright for articles published in JMDE. JMDE publishes under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Users are allowed to copy, distribute, and transmit the work in any medium or format for noncommercial purposes, provided that the original authors and source are credited accurately and appropriately. Only the original authors may distribute the article for commercial or compensatory purposes. To view a copy of this license, visit creativecommons.org