Examining Factors Impacting Online Survey Response Rates in Educational Research: Perceptions of Graduate Students
Abstract
Background: In educational research, online surveys have become one of the most popular methods of data collection. Academic researchers, including faculty and students, need good response rates in their research projects to obtain reliable results.
Purpose: In this paper, the authors examine a wide range of factors related to survey response rates in academic research. Examples include email checking habits, survey design, and attitudes toward research.
Setting: An online survey environment
Intervention: Not applicable.
Research Design: A cross-sectional quantitative design was used to analyze the factors that influence participants' email survey response rates. Data were collected at a single point in time, so the authors did not measure changes over time in this study.
Data Collection and Analysis: After receiving Institutional Review Board approval, the researchers distributed the survey to subscribers of the American Educational Research Association (AERA) Graduate Student Discussion List. A sample of 454 responses, a 78.9% response rate, was used in the final analysis. The authors used descriptive statistics (percentages and means) and inferential statistics (chi-square tests and correlations) to analyze the data and report the findings.
Findings: Results indicated that the survey response rate was strongly influenced by participants' interest, survey structure, communication methods, and assurances of privacy and confidentiality. The findings also suggested that male participants were more likely to respond to surveys if they received a reminder, and older participants were more likely to respond if they were promised a reward.
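For readers less familiar with the procedures named above, the sketch below illustrates, with entirely hypothetical data, how a chi-square test of independence and a Pearson correlation of the kind described in the abstract might be run. The variable names, values, and coding are assumptions for illustration only, not the authors' dataset or analysis code.

```python
# Illustrative sketch only: simulated data, not the authors' dataset or code.
# Demonstrates the kinds of analyses described above: a chi-square test of
# independence (e.g., gender vs. responding after a reminder) and a Pearson
# correlation (e.g., age vs. willingness to respond when a reward is promised).
import numpy as np
import pandas as pd
from scipy.stats import chi2_contingency, pearsonr

rng = np.random.default_rng(0)
n = 454  # matches the study's final sample size; the values below are simulated

df = pd.DataFrame({
    "gender": rng.choice(["male", "female"], size=n),
    "responded_after_reminder": rng.choice([0, 1], size=n),
    "age": rng.integers(22, 66, size=n),
    "reward_response_likelihood": rng.integers(1, 6, size=n),  # 1-5 Likert scale
})

# Chi-square test of independence: gender x responding after a reminder
contingency = pd.crosstab(df["gender"], df["responded_after_reminder"])
chi2, p_chi, dof, _ = chi2_contingency(contingency)
print(f"chi-square = {chi2:.2f}, df = {dof}, p = {p_chi:.3f}")

# Pearson correlation: age vs. self-reported likelihood of responding for a reward
r, p_r = pearsonr(df["age"], df["reward_response_likelihood"])
print(f"r = {r:.2f}, p = {p_r:.3f}")
```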
Article Details
![Creative Commons License](http://i.creativecommons.org/l/by-nc/4.0/88x31.png)
This work is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License.
Copyright and Permissions
Authors retain full copyright for articles published in JMDE. JMDE publishes under a Creative Commons Attribution-NonCommercial 4.0 International License (CC BY-NC 4.0). Users are allowed to copy, distribute, and transmit the work in any medium or format for noncommercial purposes, provided that the original authors and source are credited accurately and appropriately. Only the original authors may distribute the article for commercial or compensatory purposes. To view a copy of this license, visit creativecommons.org.
References
American Educational Research Association (AERA) (2017). Who we are. Retrieved from http://www.aera.net/About-AERA/Who-We-Are
Aerny-Perreten, N., Domínguez-Berjón, M. F., Esteban-Vasallo, M. D., & García-Riolobos, C. (2015). Participation and factors associated with late or non-response to an online survey in primary care. Journal of Evaluation in Clinical Practice, 21(4), 688-693. https://doi.org/10.1111/jep.12367
Andrews, D., Nonnecke, B., & Preece, J. (2003). Electronic survey methodology: A case study in reaching hard-to-involve internet users. International Journal of Human-Computer Interaction, 16(2), 185. https://doi.org/10.1207/S15327590IJHC1602_04
Asiu, B. W., Antone, C. M., & Fultz, M. I. (1998). Undergraduate perceptions of survey participation: Improving response rates and validity. A paper presented at the Annual Meeting of the Association for Institutional Research, Minnesota.
Bosnjak, M., Neubarth, W., Couper, M. P., Bandilla, W., & Kaczmirek, L. (2008). Prenotification in web-based access panel surveys: The influence of mobile text messaging versus email on response rates and sample composition. Social Science Computer Review, 26, 213-223. https://doi.org/10.1177/0894439307305895
Bosnjak, M., & Tuten, T. L. (2003). Prepaid and promised incentives in web surveys. Social Science Computer Review, 21, 208-217. https://doi.org/10.1177/0894439303021002006
Converse, P. D., Wolfe, E. W., Huang, X., & Oswald, F. L. (2008). Response rates for mixed-mode surveys using mail and e-mail/web. American Journal of Evaluation, 29(1), 99-107. https://doi.org/10.1177/1098214007313228
Couper, M. P., Conrad, F. G., & Tourangeau, R. (2007). Visual context effects in web surveys. Public Opinion Quarterly, 71, 623-634. https://doi.org/10.1093/poq/nfm044
Couper, M. P., Kapteyn, A., Schonlau, M., & Winter, J. (2007). Non-coverage and non-response in an internet survey. Social Science Research, 36, 131-148. https://doi.org/10.1016/j.ssresearch.2005.10.002
Crawford, S. D., McCabe, S. E., & Pope, D. (2005). Applying web-based survey design standards. Journal of Prevention and Intervention in the Community, 29, 43-66. https://doi.org/10.1300/J005v29n01_04
Dillman, D. A. (2007). Mail and Internet surveys: The tailored design method. New York, NY: Wiley.
Dillman, D. A., & Smyth, J. D. (2007). Design effects in the transition to web-based surveys. American Journal of Preventive Medicine, 32(5S), S90-S96. https://doi.org/10.1016/j.amepre.2007.03.008
Dykema, J., Stevenson, J., Klein, L., Kim, Y., & Day, B. (2012). Effects of e-mailed versus mailed invitations and incentives on response rates, data quality, and costs in a web survey of university faculty. Social Science Computer Review, 31(3), 359-370. https://doi.org/10.1177/0894439312465254
Fan, W., & Yan, Z. (2010). Factors affecting response rates of the web survey: A systematic review. Computers in Human Behavior, 26, 132-139. https://doi.org/10.1016/j.chb.2009.10.015
Fosnacht, K., Sarraf, S., Howe, E., & Peck, L. K. (2017). How important are high response rates for college surveys? Review of Higher Education, 40(2), 245-265. https://doi.org/10.1353/rhe.2017.0003
Fox, G., Schwartz, A., & Hart, K. M. (2006). Work-family balance and academic advancement in medical schools. Academic Psychiatry, 30, 227-234. https://doi.org/10.1176/appi.ap.30.3.227
Gall, M. D., Gall, J. P., & Borg, W. R. (2007). Educational research: An introduction. New York, NY: Pearson.
Göritz, A. S. (2006). Incentives in web studies: Methodological issues and a review. International Journal of Internet Science, 1, 58-70. https://doi.org/10.4324/9780203014394-10
Greer, T. V., Chuchinprakan, N., & Seshadri, S. (2000). Likelihoods of participating in mail survey research: Business respondents' perspectives. Industrial Marketing Management, 29, 97-109. https://doi.org/10.1016/S0019-8501(98)00038-8
Groves, R. M., Cialdini, R. B., & Couper, M. P. (1992). Understanding the decision to participate in a survey. Public Opinion Quarterly, 56, 475-495. https://doi.org/10.1086/269338
Handwerk, P., Carson, C., & Blackwell, K. (2000). Online versus paper-and-pencil surveying of students: A case study. A paper presented at the Annual Association of Institutional Research Conference, Cincinnati, Ohio.
Heerwegh, D., Vanhove, T., Matthijs, K., & Loosveldt, G. (2003). The effect of personalization on response rates and data quality in web surveys. International Public Opinion Quarterly, 40, 374-375.
Joinson, A. N., Woodley, A., & Reips, U.-D. (2007). Personalization, authentication and self-disclosure in self-administered Internet surveys. Computers in Human Behavior, 23, 275-285. https://doi.org/10.1016/j.chb.2004.10.012
Johnson, R. B., & Christensen, L. (2017). Educational research: Quantitative, qualitative, and mixed approaches (6th ed.). Thousand Oaks, CA: Sage.
Kaplowitz, M. D., Hadlock, T. D., & Levine, R. (2004). A comparison of web and mail survey response rates. Public Opinion Quarterly, 68(1), 94-101. https://doi.org/10.1093/poq/nfh006
Koundinya, V., Klink, J., Deming, P., Meyers, A., & Erb, K. (2016). How do mode and timing of follow-up surveys affect evaluation success? Journal of Extension, 54(1). https://doi.org/10.34068/joe.54.01.18
Krejcie, R. V., & Morgan, D. W. (1970). Determining sample size for research activities. Educational and Psychological Measurement, 30, 607-610. https://doi.org/10.1177/001316447003000308
Liu, M., & Wronski, L. (2017). Examining completion rates in web surveys via over 25,000 real-world surveys. Social Science Computer Review. https://doi.org/10.1177/0894439317695581
Magro, M. J., Prybutok, V. R., & Ryan, S. D. (2015). How survey administration can affect response in electronic surveys. Quality & Quantity: International Journal of Methodology, 49(5), 2145-2154. https://doi.org/10.1007/s11135-014-0098-4
McPeake, J., Bateson, M., & O'Neill, A. (2014). Electronic surveys: How to maximise success. Nurse Researcher, 21(3), 24-26. https://doi.org/10.7748/nr2014.01.21.3.24.e1205
Mercer, A., Caporaso, A., Cantor, D., & Townsend, R. (2015). How much gets you how much? Monetary incentives and response rates in household surveys. Public Opinion Quarterly, 79(1), 105-129. https://doi.org/10.1093/poq/nfu059
Muñoz-Leiva, F., Sánchez-Fernández, J., Montoro-Ríos, F., & Ibáñez-Zapata, J. A. (2010). Improving the response rate and quality in web-based surveys through the personalization and frequency of reminder mailings. Quality & Quantity, 44, 1037-1052. https://doi.org/10.1007/s11135-009-9256-5
Pan, B., Woodside, A. G., & Meng, F. (2013). How contextual cues impact responses and conversion rates of online surveys. Journal of Travel Research, 53(1), 58-68. https://doi.org/10.1177/0047287513484195
Perkins, R. A. (2011). Using research-based practices to increase response rates of web-based surveys. Educause Review Online. Available at http://www.educause.edu/ero/article/using-research-based-practices-increase-response-rates-web-based-surveys
Porter, S. R. (2004a). Raising response rates: What works? New Directions for Institutional Research, 121, 5-21. https://doi.org/10.1002/ir.97
Porter, S. R. (2004b). Overcoming survey research problems: New directions for institutional research. London: Jossey-Bass.
Porter, S. R., & Whitcomb, M. E. (2005). Non-response in student surveys: The role of demographics, engagement and personality. Research in Higher Education, 46(2), 127-152. https://doi.org/10.1007/s11162-004-1597-2
Porter, S. R., & Whitcomb, M. E. (2003). The impact of lottery incentives on student survey response rates. Research in Higher Education, 44, 389-407. https://doi.org/10.1023/A:1024263031800
Petrovčič, A., Petrič, G., & Lozar Manfreda, K. (2016). The effect of email invitation elements on response rate in a web survey within an online community. Computers in Human Behavior, 56(C), 320-329. https://doi.org/10.1016/j.chb.2015.11.025
Ravert, R. D., Gomez-Scott, J., & Donnellan, M. B. (2015). Equivalency of paper versus tablet computer survey data. Educational Researcher, 44(5), 308-310. https://doi.org/10.3102/0013189X15592845
Roberts, L. D., & Allen, P. J. (2015). Exploring ethical issues associated with using online surveys in educational research. Educational Research and Evaluation, 21(2), 95-108. https://doi.org/10.1080/13803611.2015.1024421
Schonlau, M., Fricker, R. D., & Elliott, M. N. (2002). Conducting research surveys via mail and the web. Santa Monica, CA: Rand.
Shannon, D. M., & Bradshaw, C. C. (2002). A comparison of response rate, response time, and costs of mail and electronic surveys. Journal of Experimental Education, 70(2), 179-192. https://doi.org/10.1080/00220970209599505
Sheehan, K. B. (2001). E-mail survey response rates: A review. Journal of Computer-Mediated Communication, 6(2). https://doi.org/10.1111/j.1083-6101.2001.tb00117.x
Sheehan, K. B., & McMillan, S. (1999). Response variation in e-mail surveys: An exploration. Journal of Advertising Research, 39(4), 45-54.
Shih, T. H., & Fan, W. X. (2008). Comparing response rates from web and mail surveys: A meta-analysis. Field Methods, 20, 249-271. https://doi.org/10.1177/1525822X08317085
Silva, S. C., & Durante, P. (2014). Suggestions for international research using electronic surveys. The Marketing Review, 14(3), 297-309. https://doi.org/10.1362/146934714X14024779061992
Spruyt, B., & Van Droogenbroeck, F. (2014). Forewarned is forearmed? A survey-experiment concerning the impact of pre-notification letters on response in a postal survey. Irish Journal of Sociology, 22(2), 86-95. https://doi.org/10.7227/IJS.22.2.5
Trespalacios, J. H., & Perkins, R. A. (2016). Effects of personalization and invitation email length on web-based survey response rates. TechTrends, 60(4), 330-335. https://doi.org/10.1007/s11528-016-0058-z
Trouteaud, A. R. (2004). How you ask counts: A test of Internet-related components of response rates to a web-based survey. Social Science Computer Review, 22, 385-392. https://doi.org/10.1177/0894439304265650
Uhlig, C. E., Seitz, B., Eter, N., Promesberger, J., & Busse, H. (2014). Efficacies of Internet-based digital and paper-based scientific surveys and the estimated costs and time for different sized cohorts. PLOS ONE, 9(10), 1-11. https://doi.org/10.1371/journal.pone.0108441
Vance, M. (2011). 7 questions to consider regarding survey response rates. Research and Marketing Strategies Inc. Retrieved from https://rmsresults.com/2011/01/06/7-questions-to-consider-regarding-survey-response-rates-marketing-research-in-syracuse-ny/
Van Mol, C. (2017). Improving web survey efficiency: The impact of an extra reminder and reminder content on web survey response. International Journal of Social Research Methodology, 20(4), 317-327. https://doi.org/10.1080/13645579.2016.1185255
Veen, F. V., Göritz, A. S., & Sattler, S. (2016). Response effects of prenotification, prepaid cash, prepaid vouchers, and postpaid vouchers. Social Science Computer Review, 34(3), 333-346. https://doi.org/10.1177/0894439315585074