Assessing Implementation Integrity of a National Nutrition Education Program: A Case Study of Share Our Strength's Operation Frontline
Abstract
Background: Treatment implementation is not a single activity but rather a multifaceted process that includes treatment delivery, treatment receipt, and treatment adherence. As such, local variations in the implementation and service delivery of interventions are inevitable.
Purpose: To assess implementation fidelity of a multi-site experiential nutrition education program.
Setting: Multiple sites throughout the continental United States.
Intervention: An experiential nutrition education program.
Research Design: A concurrent mixed methods design was used to assess implementation fidelity.
Data Collection and Analysis: Multiple methods of data collection and analysis were used including observations, interviews, survey questionnaires, and extant data.
Findings: Although implementation fidelity varied across program sites, overall implementation fidelity was very good; where it did vary, it varied in response to local site needs and context.
Article Details

This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Copyright and Permissions
Authors retain full copyright for articles published in JMDE. JMDE publishes under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY - NC 4.0). Users are allowed to copy, distribute, and transmit the work in any medium or format for noncommercial purposes, provided that the original authors and source are credited accurately and appropriately. Only the original authors may distribute the article for commercial or compensatory purposes. To view a copy of this license, visit creativecommons.org
References
Brandon, P. R. (1998). Stakeholder participation for the purpose of helping ensure evaluation validity: bridging the gap between collaborative and non-collaborative evaluations. American Journal of Evaluation, 19, 325-337. https://doi.org/10.1016/S1098-2140(99)80215-X
Burney, J., & Haughton, B. (2002). EFNEP: A nutrition education program that demonstrates cost-benefit. Journal of the American Dietetic Association, 102, 39-45. https://doi.org/10.1016/S0002-8223(02)90014-3
Colosi, L. A. (2007). Eat Smart New York 2007 statewide evaluation report. Ithaca, NY: Cornell University, Department of Policy Analysis and Management.
Condrasky, M., Graham, K., & Kamp, J. (2006). Cooking with a Chef: an innovative program to improve mealtime practices and eating behaviors of caregivers of preschool children. Journal of Nutrition Education and Behavior, 38, 324-325. https://doi.org/10.1016/j.jneb.2006.04.005
Cordray, D. S., & Pion, G. M. (2006). Treatment strength and integrity: Models and methods. In R. R. Bootzin & P. E. McKnight (Eds.), Strengthening research methodology: Psychological measurement and evaluation (pp. 103-124). Washington, DC: American Psychological Association. https://doi.org/10.1037/11384-006
Coryn, C. L. S. (2007). Evaluation of researchers and their research: toward making the implicit explicit. Ph.D. dissertation, Western Michigan University, Kalamazoo.
Coryn, C. L. S., Gugiu, P. C., Davidson, E. J., & Schröter, D. C. (2008). Assessing needs in hidden populations using respondent-driven sampling. Evaluation Journal of Australasia, 7, 3-11. https://doi.org/10.1177/1035719X0700700202
Coryn, C. L. S., Noakes, L. A., Westine, C. D., & Schröter, D. C. (2011). A systematic review of theory-driven evaluation practice from 1990 to 2009. American Journal of Evaluation, 32(2), 199-226. https://doi.org/10.1177/1098214010389321
Coryn, C. L. S., Schröter, D. C., & Hanssen, C. E. (2009). Adding a time-series design element to the Success Case Method to improve methodological rigor: an application for nonprofit program evaluation. American Journal of Evaluation, 30, 80-92. https://doi.org/10.1177/1098214008326557
Cousins, J. B., & Whitmore, E. (1998). Framing participatory evaluation. New Directions for Evaluation, 80, 5-23. https://doi.org/10.1002/ev.1114
Cullen, A. E. (2009). The politics and consequences of participation in international development evaluation. Ph.D. dissertation, Western Michigan University, Kalamazoo.
Cullen, A. E., Coryn, C. L. S., & Rugh, J. (2011). The politics and consequences of including stakeholders in international development evaluation. American Journal of Evaluation, 32(3), 345-361. https://doi.org/10.1177/1098214010396076
Davey, J. W., Gugiu, P. C., & Coryn, C. L. S. (2010). Quantitative methods for estimating the reliability of qualitative data. Journal of MultiDisciplinary Evaluation, 6(13), 140-162. https://doi.org/10.56645/jmde.v6i13.266
Davidson, E. J. (2005). Evaluation methodology basics: The nuts and bolts of sound evaluation. Thousand Oaks, CA: Sage. https://doi.org/10.4135/9781452230115
Davis, M., Baranowski, T., Resnicow, K., et al. (2000). Gimme 5 fruit and vegetables for fun and health: process evaluation. Health Education & Behavior, 27, 167-176. https://doi.org/10.1177/109019810002700203
Dollahite, J., Kenkel, D., & Scott Thompson, C. (2008). An economic evaluation of the Expanded Food and Nutrition Education Program. Journal of Nutrition Education and Behavior, 40, 134-143. https://doi.org/10.1016/j.jneb.2007.08.011
Patton, M. Q. (2002). Qualitative research and evaluation methods (3rd ed.). Thousand Oaks, CA: Sage.
Reynolds, K. D., Franklin, F. A., Leviton, L. C., et al. (2000). Methods, results, and lessons learned from process evaluation of the High 5 school-based nutrition intervention. Health Education & Behavior, 27, 177-186. https://doi.org/10.1177/109019810002700204
Rohs, F. R., Langone, C. A., & Coleman, R. K. (2001). Response shift bias: a problem in evaluating nutrition training using self-report measures. Journal of Nutrition Education, 33, 264-292. https://doi.org/10.1016/S1499-4046(06)60187-5
Rossi, P. H., Lipsey, M. W., & Freeman, H. E. (2004). Evaluation: A systematic approach (7th ed.). Thousand Oaks, CA: Sage.
Schröter, D. C. (2008). Sustainability evaluation: development and validation of an evaluation checklist. Ph.D. dissertation, Western Michigan University, Kalamazoo.
Scriven, M. (2007). Key evaluation checklist. Kalamazoo, MI: The Evaluation Center.
Shadish, W. R., Cook, T. D., & Campbell, D. T. (2002). Experimental and quasi-experimental designs for generalized causal inference. Boston, MA: Houghton Mifflin.
Share Our Strength's Operation Frontline. (2009). Implementation guidelines. Washington, DC: Share Our Strength.
Swindle, S., Baker, S. S., & Auld, G. W. (2007). Operation Frontline: assessment of longer-term curriculum effectiveness, evaluation strategies, and follow-up methods. Journal of Nutrition Education and Behavior, 39, 205-213. https://doi.org/10.1016/j.jneb.2007.03.003
Taylor, P. J., Russ-Eft, D. F., & Taylor, H. (2009). Gilding the outcome by tarnishing the past: inflationary biases in retrospective pretests. American Journal of Evaluation, 30, 31-43. https://doi.org/10.1177/1098214008328517
Taylor, T., Serrano, E., & Anderson, J. (2001). Management issues related to effectively implementing a nutrition education program using peer educators. Journal of Nutrition Education, 33, 264-292. https://doi.org/10.1016/S1499-4046(06)60293-5
Weiss, C. H. (1998). Evaluation: Methods for studying programs and policies (2nd ed.). Upper Saddle River, NJ: Prentice Hall.