Mixed Methods Research in Designing an Instrument for Consumer-Oriented Evaluation

Dorinda J. Gallant
https://orcid.org/0000-0002-0152-731X
Nicole Luthy

Abstract

Background: The educational product market has been gradually shifting from primarily print to primarily digital content. Educators must make quick decisions when selecting materials that support student learning.


Purpose: The purposes of this study were to describe the application of a two-stage sequential mixed-method, mixed-model design to the development of an instrument for consumer-oriented evaluation, and to discuss the implications of using mixed methods research to develop a rubric for evaluating prekindergarten through Grade 12 digital content.


Setting: The Ohio State University, Columbus, OH.


Intervention: N/A.


Research design: A two-stage sequential mixed-method, mixed-model design.


Data collection & analysis: In Stage 1, a modified electronic Delphi survey technique was implemented with geographically dispersed subject matter experts across the United States. In Stage 2, cross-sectional focus group interviews were conducted with local teachers, administrators, and textbook publishers.


Findings: The final version of the rubric incorporated multiple perspectives from teachers, administrators, textbook publishers, and subject matter experts on the importance, clarity, and appropriateness of criteria for evaluating digital content. Teachers and administrators can use the rubric to evaluate digital content that supports students’ learning in prekindergarten through Grade 12.

Article Details

How to Cite
Gallant, D. J., & Luthy, N. (2020). Mixed Methods Research in Designing an Instrument for Consumer-Oriented Evaluation. Journal of MultiDisciplinary Evaluation, 16(34), 21–43. https://doi.org/10.56645/jmde.v16i34.583
Section
Research on Evaluation Articles
Author Biographies

Dorinda J. Gallant, The Ohio State University

Department of Educational Studies

Associate Professor

Nicole Luthy, The Ohio State University

College of Education and Human Ecology
