Communication for Social Change Seldom a Stand Alone, and Rarely Verified
Abstract
Background: Communication for social change is rarely a stand-alone initiative. More often, it is combined with other communication purposes such as networking, organizational visibility, information dissemination, or behavioural change.
Purpose: This article reports on an interdisciplinary, capacity-building experiment that combines communication strategy development with Utilization-Focused Evaluation (UFE).
Setting: The analysis stems from close to a dozen case studies in which we tested a hybrid approach combining UFE with communication strategy development. Our partners were research teams working in a variety of areas, including open education, open and collaborative science, Internet privacy, cyber-security, and open data. The Networked Economies Program of the International Development Research Centre (IDRC, Ottawa) funded each of the research teams. The partners were based in different countries and had a global reach.
Intervention: The authors are members of a research project entitled “Designing Evaluation and Communication for Impact” (DECI) that provides mentoring in evaluation and communication to partners. This article focuses mainly on lessons from DECI-2, the second phase of the project, which was operational from 2012 to 2017. DECI is led by a team in Canada and has engaged regional mentors based in Latin America, Asia, and East Africa, who have provided much of the capacity-building support to partners in their regions. At the end of each mentoring cycle, the DECI team produced a case study summarizing the experience. The collection of these case studies is the basis for this article.
Research Design: This article is a meta-evaluation of the experiences gained from the mentoring. It draws on the findings of this grounded work and seeks theoretical insights from the evaluation and communication literature. Existing family trees in evaluation and communication are reviewed in search of commonalities that underlie the hybrid decision-making framework.
Data Collection and Analysis: The article draws on the findings of the case studies and the hybrid framework. Our analysis builds on the authors' earlier work in communication for social change. In particular, we analyze a common pattern whereby communication strategies tend to encompass several purposes in tandem. We treat the planning steps of utilization-focused evaluation as a structured decision-making process that can help organize communication planning. Finally, we reflect on the benefits of formulating communication objectives that can be tracked or measured.
Findings: The hybrid decision-making framework allows communication planners to add some rigor to their strategies. At the same time, it invites evaluators to introduce evaluation questions about the outcomes of a communication intervention. An external evaluation of the DECI-2 project concluded that the combined decision-making process enabled partners to become better at adaptive management. The process introduced reflection spaces and helped teams adjust their projects as research findings emerged, and as conditions shifted in the policy arenas that they sought to influence.
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Copyright and Permissions
Authors retain full copyright for articles published in JMDE. JMDE publishes under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Users are allowed to copy, distribute, and transmit the work in any medium or format for noncommercial purposes, provided that the original authors and source are credited accurately and appropriately. Only the original authors may distribute the article for commercial or compensatory purposes. To view a copy of this license, visit creativecommons.org.