Re-Structuring Evaluation Findings into Useful Knowledge

Danielle Houston
Bernadette Wright
https://orcid.org/0000-0002-1044-1323
Steven E. Wallis
https://orcid.org/0000-0001-5207-603X

Abstract

Background: A long stream of research has shown that when knowledge is more structured, it is more likely to be effective in practical application. Building on that research, the authors applied Integrative Propositional Analysis to visualize, integrate, and assess the quality and usefulness of knowledge gained from the NMAC (formerly National Minority AIDS Council) Strong Communities evaluation.


Purpose: To demonstrate an innovative method for rigorously integrating and strengthening knowledge gained from evaluation, and to encourage discussion of future directions for developing stronger theories for more effective evaluation and more effective action.


Setting: Birmingham, Alabama


Intervention: A project to identify local strategies through which community-based organizations and community health centers serving African American and Latinx gay and bisexual men and transgender women can collaborate to meet HIV-related community needs.


Research Design: The researchers applied Integrative Propositional Analysis to integrate and map the concepts and causal connections emerging from the evaluation findings. The authors then analyzed the resulting map to identify top-mentioned concepts, better-understood concepts, reinforcing loops, and knowledge gaps.
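To make the mapping step concrete, the sketch below shows one hypothetical way a set of causal propositions ("cause leads to effect") could be represented as a directed graph and summarized. The concepts, edges, and measure definitions are illustrative assumptions only; they do not reproduce the authors' Integrative Propositional Analysis procedure or the study's data.

```python
# Illustrative sketch only: a causal knowledge map as a directed graph.
# The propositions below are hypothetical examples, not the study's data.
from collections import defaultdict

propositions = [
    ("provider collaboration", "access to HIV services"),
    ("community trust", "access to HIV services"),
    ("access to HIV services", "viral suppression"),
]

causes_of = defaultdict(set)   # effect -> set of distinct causes
mentions = defaultdict(int)    # concept -> number of appearances

for cause, effect in propositions:
    causes_of[effect].add(cause)
    mentions[cause] += 1
    mentions[effect] += 1

concepts = set(mentions)

# "Top-mentioned" concepts: those appearing most often across propositions.
top_mentioned = sorted(concepts, key=mentions.get, reverse=True)[:3]

# "Better-understood" concepts (assumed reading): effects with two or more
# distinct causal antecedents on the map.
better_understood = [c for c in concepts if len(causes_of[c]) >= 2]

print(top_mentioned, better_understood)
```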


Data Collection and Analysis: Integrative Propositional Analysis was applied to a literature review and to stakeholder interview transcripts collected for the evaluation.


Findings: Integrating the literature and interview results helped identify several actions through which providers of HIV-related services could increase their impact in combating the HIV epidemic among the communities they serve. The authors also identified a reinforcing loop, which shows an opportunity to improve two desired outcomes by increasing one of them. In addition, the authors identified blank spots on the map, which show where additional research could strengthen the quality and usefulness of the mapped knowledge.
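As a companion to the sketch above, the following hypothetical fragment illustrates how a causal loop (reinforcing when all of its influences are positive) and blank spots (concepts with no mapped causes) could be detected mechanically in such a graph. The edge list is invented for illustration and is not the study's map.

```python
# Illustrative sketch only: finding causal loops and unexplained concepts
# ("blank spots") in a causal map. The edges are hypothetical examples.
from collections import defaultdict

edges = [
    ("provider collaboration", "access to HIV services"),
    ("access to HIV services", "community trust"),
    ("community trust", "provider collaboration"),  # closes a causal loop
    ("stigma reduction", "community trust"),
]

succ = defaultdict(set)   # concept -> concepts it influences
concepts = set()
for cause, effect in edges:
    succ[cause].add(effect)
    concepts.update((cause, effect))

def in_a_loop(start):
    """Depth-first search: can we follow causal links from `start` back to itself?"""
    stack, seen = [start], set()
    while stack:
        node = stack.pop()
        for nxt in succ[node]:
            if nxt == start:
                return True
            if nxt not in seen:
                seen.add(nxt)
                stack.append(nxt)
    return False

loop_concepts = {c for c in concepts if in_a_loop(c)}
has_cause = {effect for _, effect in edges}
blank_spots = concepts - has_cause   # concepts with no mapped causes

print("concepts on a causal loop:", loop_concepts)
print("blank spots (no mapped causes):", blank_spots)
```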


Article Details

How to Cite
Houston, D., Wright, B., & Wallis, S. E. (2017). Re-Structuring Evaluation Findings into Useful Knowledge. Journal of MultiDisciplinary Evaluation, 13(29), 31–41. https://doi.org/10.56645/jmde.v13i29.481
Section
Research Articles

References

Belcher, B. M., Rasmussen, K. E., Kemshaw, M. R., & Zornes, D. A. (2016). Defining and assessing research quality in a transdisciplinary context. Research Evaluation, 25(1), 1-17. https://doi.org/10.1093/reseval/rvv025

Cook, T. D., Scriven, M., Coryn, C. L., & Evergreen, S. D. (2010). Contemporary thinking about causation in evaluation: A dialogue with Tom Cook and Michael Scriven. American Journal of Evaluation, 31(1), 105-117. https://doi.org/10.1177/1098214009354918

Cotae, C. E. (2015). Regional performances in the context of a transition towards the circular economy: Structuring the assessment framework. Ecoforum, 4(1), 140-146.

Curseu, P., Schalk, R., & Schruijer, S. (2010). The use of cognitive mapping in eliciting and evaluating group cognitions. Journal of Applied Social Psychology, 40(5), 1258-1291. https://doi.org/10.1111/j.1559-1816.2010.00618.x

Dent, E. B., & Umpleby, S. A. (1998). Underlying assumptions of several traditions in systems theory and cybernetics. In R. Trappl (Ed.), Cybernetics and Systems '98 (pp. 513-518). Vienna, Austria: Austrian Society for Cybernetic Studies.

Dijkers, M. (2009). When the best is the enemy of the good: The nature of research evidence used in systematic reviews and guidelines. NCDDR Task Force on Systematic Review and Guidelines. Austin, TX: SEDL.

Dijkers, M., Boninger, M., Bushnik, T., Esselman, P., Heinemann, A., Heller, T., et al. (2011). Guidelines for assessing the quality and applicability of systematic reviews. Austin: The National Center for the Dissemination of Rehabilitation Research, SEDL.

Duignan, P. (2008). Easy outcomes workbook (Working Draft). Unpublished manuscript.

EES. (2007). The importance of a methodologically diverse approach to impact evaluation, specifically with respect to development aid and development interventions. Nijkerk, Netherlands: European Evaluation Society (EES).

EPPI-Centre. (2010). EPPI-Centre Methods for Conducting Systematic Reviews. London: Evidence for Policy and Practice Information and Co-ordinating Centre, Social Science Research Unit, Institute of Education, University of London.

Flyvbjerg, B. (2006). Five misunderstandings about case-study research. Qualitative Inquiry, 12(2), 219-245. https://doi.org/10.1177/1077800405284363

Goltz, S. M. (2017). Enhancing Simulation Learning With Team Mental Model Mapping. Management Teaching Review, 2379298117706335. https://doi.org/10.1177/2379298117706335

Gough, D., Oliver, S., & Thomas, J. (2012). Introducing systematic reviews.

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., & Kyriakidou, O. (2004). Diffusion of innovations in service organizations: systematic review and recommendations. The Milbank Quarterly, 82(4), 581-629. https://doi.org/10.1111/j.0887-378X.2004.00325.x

Greenhalgh, T., Robert, G., Macfarlane, F., Bate, P., Kyriakidou, O., & Peacock, R. (2005). Storylines of research in diffusion of innovation: a meta-narrative approach to systematic review. Social Science & Medicine, 61(2), 417-430. https://doi.org/10.1016/j.socscimed.2004.12.001

HRSA. (2011). Breaking Barriers, Getting YMSM of Color Into Care: Accomplishments of the SPNS Initiative. What's Going On @ SPNS, December 2011. Retrieved from https://hab.hrsa.gov/sites/default/files/hab/About/Parts/cyberspns_ymsm_of_color_2011.pdf

Keene, M., & Metzner, C. (2011). Fuzzy Logic Models. American Evaluation Association Coffee Break Demonstration webinar. Retrieved October 3, 2011.

Maxwell, J. A. (2004). Using qualitative methods for causal explanation. Field Methods, 16(3), 243-264. https://doi.org/10.1177/1525822X04266831

Meaningful Evidence, LLC. (2016). Low-Tech and High-Tech Tools for Mapping Your Strategic Plan. Retrieved 2017; available by request from: http://www.meaningfulevidence.com

Moat, K. A., Lavis, J. N., Wilson, M. G., Røttingen, J.-A., & Bärnighausen, T. (2013). Twelve myths about systematic reviews for health system policymaking rebutted. Journal of Health Services Research & Policy, 18(1), 44-50. https://doi.org/10.1258/jhsrp.2012.011175

Ofir, Z., Schwandt, T., Duggan, C., & McLean, R. (2016). Research Quality Plus (RQ+): a holistic approach to evaluating research.

Patton, M. Q. (2013, February 26 - March 1). Day 3 Keynote Speech. Paper presented at the SEA Change Evaluation Conclave 2013, Kathmandu, Nepal.

Pawson, R., & Bellamy, J. L. (2006). Realist synthesis: an explanatory focus for systematic review. Putting effectiveness into context: methodological issues in the synthesis of evidence from diverse study designs. London: Health Development Agency, forthcoming.

Richard, R. (2009). The Logic Model and Systems Thinking: Can They Co-Exist? Paper presented at the American Evaluation Association Conference, Orlando, FL.

Rodgers, M., Sowden, A., Petticrew, M., Arai, L., Roberts, H., Britten, N., et al. (2009). Testing methodological guidance on the conduct of narrative synthesis in systematic reviews: effectiveness of interventions to promote smoke alarm ownership and function. Evaluation, 15(1), 49-73. https://doi.org/10.1177/1356389008097871

Rogers, P. J. (2008). Using programme theory to evaluate complicated and complex aspects of interventions. Evaluation, 14(1), 29-48. https://doi.org/10.1177/1356389007084674

Rostami, A., & Wright, B. (2016). Developing a Successful Strategy. Retrieved 2017; available by request from: http://www.meaningfulevidence.com/

Senge, P. (1990). The Fifth Discipline: The Art and Practice of The Learning Organization. New York: Currency Doubleday.

Shackelford, C. (2014). Propositional Analysis, Policy Creation, and Complex Environments in the United States' 2009 Afghanistan-Pakistan Policy. Unpublished doctoral dissertation, Walden University, Minneapolis, MN.

Smyth, K. F., & Schorr, L. B. (2009). A Lot to Lose: A Call to Rethink What Constitutes "Evidence" in Finding Social Interventions That Work.

Spirtes, P., Glymour, C., & Scheines, R. (1993). Causation, prediction and search (First, online ed.). Cambridge, MA: MIT Press. https://doi.org/10.1007/978-1-4612-2748-9

Stern, E., Stame, N., Mayne, J., Forss, K., Davies, R., & Befani, B. (2012). Broadening the range of designs and methods for impact evaluations. Report of a study commissioned by the Department for International Development, London, UK. https://doi.org/10.22163/fteval.2012.100

Suedfeld, P., & Rank, A. D. (1976). Revolutionary leaders: Long-term success as a function of changes in conceptual complexity. Journal of Personality and Social Psychology, 34(2), 169-178. https://doi.org/10.1037//0022-3514.34.2.169

Trochim, W. M. K. (1989). Concept mapping: Soft science or hard art? Evaluation and Program Planning, 12, 87-112. https://doi.org/10.1016/0149-7189(89)90027-X

University of York, Centre for Reviews & Dissemination (CRD). (2009). Systematic reviews: CRD's guidance for undertaking reviews in health care.

Wallis, S. E. (2010a). The structure of theory and the structure of scientific revolutions: What constitutes an advance in theory? In S. E. Wallis (Ed.), Cybernetics and systems theory in management: Views, tools, and advancements (pp. 151-174). Hershey, PA: IGI Global. https://doi.org/10.4018/978-1-61520-668-1.ch009

Wallis, S. E. (2010b). Towards the development of more robust policy models. Integral Review, 6(1), 153-160.

Wallis, S. E. (2013). How to choose between policy proposals: A simple tool based on systems thinking and complexity theory. E:CO - Emergence: Complexity & Organization, 15(3), 94-120.

Wallis, S. E. (2014a). Evaluating Explanations through their Conceptual Structures. In M. Lissack & A. Graber (Eds.), Modes of Explanation: Affordances for Action and Prediction. New York: Palgrave MacMillan.

Wallis, S. E. (2014b). Existing and emerging methods for integrating theories within and between disciplines. Organisational Transformation and Social Change, 11(1), 3-24. https://doi.org/10.1179/1477963313Z.00000000023

Wallis, S. E. (2016a). The science of conceptual systems: A progress report. Foundations of Science, 21(4), 579-602. https://doi.org/10.1007/s10699-015-9425-z

Wallis, S. E. (2016b). Structures of logic in policy and theory: Identifying sub-systemic bricks for investigating, building, and understanding conceptual systems. Foundations of Science, 20(3), 213-231. https://doi.org/10.1007/s10699-014-9360-4

Wallis, S. E., & Valentinov, V. (2016). What Is Sustainable Theory? A Luhmannian Perspective on the Science of Conceptual Systems. Foundations of Science, in press. https://doi.org/10.1007/s10699-016-9496-5

Wallis, S. E., & Wright, B. (2014). The Science of Conceptual Systems: Its History and Usefulness for Improved Decision-Making and Organizational Success. Available by request from: http://www.meaningfulevidence.com

Wallis, S. E., Wright, B., & Nash, F. D. (2016). Using Integrative Propositional Analysis to Evaluate and Integrate Economic Policies of U.S. Presidential Candidates White Paper, 16. Retrieved from http://meaningfulevidence.com/wp-content/uploads/IPA-of-POTUS-Candidates.pdf

Wong, E. M., Ormiston, M. E., & Tetlock, P. E. (2011). The Effects of Top Management Team Integrative Complexity and Decentralized Decision Making on Corporate Social Performance. Academy of Management Journal, 54(6), 1207-1228. https://doi.org/10.5465/amj.2008.0762

Woolcock, M. (2013). Using case studies to explore the external validity of 'complex' development interventions. Evaluation, 19(3), 229-248. https://doi.org/10.1177/1356389013495210

Wright, B. (2013, May 4). Reuse, recycle: Rethink research. Retrieved December 12, 2013, from http://www.meaningfulevidence.com/uploads/Meaningful_Evidence_LLC_Reuse_Recycle_Rethink_Research.pdf

Wright, B., & Lewis, L. (2016a). Little Is Known? Designing More Useful Reviews of Related Research Evidence. Paper presented at the American Evaluation Association Conference.

Wright, B., & Lewis, L. (2016b). Reviewing Related Research (R3) Workbook. Retrieved 2017, from http://meaningfulevidence.com/news-resources/reviewing-related-research-r3-workbook

Wright, B., & Wallis, S. E. (2015). Using Integrative Propositional Analysis (IPA) for evaluating entrepreneurship theories. SAGE Open, July-September, 1-9. https://doi.org/10.1177/2158244015604190

Wright, B., & Wallis, S. E. (2017). How Good Is Your Evidence? Stanford Social Innovation Review (SSIR). Retrieved from https://ssir.org/articles/entry/how_good_is_your_evidence

Yin, R. K. (1994). Discovering the future of the case study method in evaluation research. Evaluation Practice, 15(3), 283-290. https://doi.org/10.1016/0886-1633(94)90023-X