Articles/abstracts on web versus paper surveys


Title: Administration of Web versus Paper Surveys: Mode Effects and Response Rates.
Author(s): Matz, C. Michele
Source: Master's thesis, 87 pp.
Publication Date: 1999
Abstract: A survey of academic reference librarians in North Carolina provided data for an examination of differences in survey administration on paper and on the World Wide Web. Research via the Internet is becoming more attractive to many researchers, but the effects of this medium on research outcomes have been little explored. This study examined, in particular, sampling effects, mode effects, and response rates of Web surveys. The study found no sampling bias or mode effects in tests of respondents' demographics and the content of responses. Response rates to Web surveys are not as high as those of traditional survey methods, and while Web responses are gathered more quickly, the paper instrument was not far behind. E-mail notices were more efficient than paper notices for promoting the Web survey. Traditional postal surveys still hold some advantages over Web surveys, and researchers must weigh the advantages in cost and speed to justify use of such instruments. Appendices contain the paper survey, the Web survey, sample cover letters, selected statistical test results, and a survey content summary.

Title: Comparing Responses to Mail and Web-based Surveys.
Author(s): Idleman, Lynda
Publication Date: 2003
Source: Paper presented at the Annual Meeting of the American Educational Research Association (Chicago, IL, April 21-25, 2003)
Abstract: After completing a survey in 2001 for a nonprofit library network, the researcher used the resulting database to study response rates and response consistency across two survey methods. Information on more than 1,400 potential respondents had been collected from the network's database and four other library databases. Roughly half of the librarians (n=699) had provided contact information that included e-mail addresses. A traditional mailing procedure was used to collect information from those who had given only postal contact information (n=730); the others received an e-mail survey with radio buttons and drop-down boxes. The response rates from the two methods were similar, and attitudes toward questions about use of the Internet did not differ between the two groups of respondents. Librarians who responded to the postal survey were more likely to be from smaller institutions. The reliability estimates from each method were well within acceptable ranges, though the postal survey obtained higher values than the Web-based method. However, the amount of missing data was significantly reduced when the Web-based survey was used. An appendix contains the survey instruments.
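The reliability estimates compared above are typically internal-consistency coefficients such as Cronbach's alpha. A minimal sketch of that computation, using hypothetical item responses rather than any data from the study:

```python
# Cronbach's alpha: alpha = k/(k-1) * (1 - sum of item variances / variance
# of total scores). The response matrix below is hypothetical, for
# illustration only; it is not data from the Idleman study.
from statistics import variance  # sample variance (n - 1 denominator)

# rows = respondents, columns = survey items (e.g., 5-point Likert responses)
responses = [
    [4, 5, 4, 4],
    [3, 3, 2, 3],
    [5, 4, 5, 5],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
]

k = len(responses[0])                 # number of items
items = list(zip(*responses))         # transpose: per-item score lists
item_vars = sum(variance(col) for col in items)
total_var = variance([sum(row) for row in responses])
alpha = (k / (k - 1)) * (1 - item_vars / total_var)

print(f"Cronbach's alpha: {alpha:.2f}")
```

Values of roughly 0.70 and above are conventionally treated as acceptable, which is the sense in which both modes' estimates fell "well within acceptable ranges."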

Title: Survey Response Rates and Survey Administration in Counseling and Clinical Psychology: A Meta-Analysis
Author(s): Van Horn, Pamela S.; Green, Kathy E.; Martinussen, Monica
Source: Educational and Psychological Measurement, v69 n3 p389-403 2009.
Abstract: This article reports the results of a meta-analysis of survey response rates in published counseling and clinical psychology research over a 20-year span and describes the survey administration procedures reported in those fields. Results from 308 survey administrations showed a weighted average response rate of 49.6%. Among possible moderators, response rates differed only by population sampled, journal of publication, sampling source and method, and use of follow-up. Researchers whose studies were included in this meta-analysis used follow-up but rarely used incentives, prenotification, or other response-facilitation methods to maximize response rates. Although the future of survey research in general may rely more heavily on Internet data collection, mail surveys dominate in this field.
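A weighted average response rate of the kind reported above is commonly computed by weighting each survey administration's rate by its sample size. A minimal sketch, using hypothetical figures rather than values from the meta-analysis:

```python
# Sample-size-weighted average response rate across survey administrations.
# The figures below are hypothetical illustrations, not data from the
# Van Horn et al. meta-analysis.
administrations = [
    {"sample_size": 500, "response_rate": 0.42},
    {"sample_size": 120, "response_rate": 0.65},
    {"sample_size": 300, "response_rate": 0.51},
]

total_n = sum(a["sample_size"] for a in administrations)
weighted_rate = sum(
    a["sample_size"] * a["response_rate"] for a in administrations
) / total_n

print(f"Weighted average response rate: {weighted_rate:.1%}")
```

Weighting by sample size keeps a small administration with an unusually high or low rate from pulling the summary figure as far as a simple unweighted mean would.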

Title: Using Web Surveys to Reach Community College Students: An Analysis of Response Rates and Response Bias
Author(s): Sax, Linda J.; Gilmartin, Shannon K.; Lee, Jenny J.; Hagedorn, Linda Serra
Source: Community College Journal of Research and Practice, v32 n9 p712-729 Sep 2008. 18 pp.
Abstract: This study was designed to examine response rates and bias among a sample of community college students who received a district-wide survey by standard mail or e-mail. Findings suggest that predictors of response and types of responses are not appreciably different across the paper and online mail-out samples when these samples are "matched" on key demographics. Rates of response, however, differ by mode of survey administration, gender, and race/ethnicity.

Title: Comparing Response Rates in E-Mail and Paper Surveys: A Meta-Analysis
Author(s): Shih, Tse-Hua; Fan, Xitao
Source: Educational Research Review, v4 n1 p26-40 2009. (EJ869695)
Abstract: This meta-analysis examined 35 study results from the last 10 years that directly compared the response rates of e-mail and mail surveys. Individual studies reported inconsistent findings concerning the response rate difference between e-mail and mail surveys, but e-mail surveys generally had a lower response rate (about 20% lower on average) than mail surveys. Two study features (population type and follow-up reminders) could account for some of the variation in the e-mail and mail survey response rate differences across the studies. For studies involving college populations, the response rate difference between e-mail and mail surveys was much smaller, or even negligible, suggesting that e-mail surveys are reasonably comparable with mail surveys for college populations. The finding that follow-up reminders were a statistically significant study feature turned out to be somewhat of an anomaly. Other study features (i.e., article type, random assignment of respondents to e-mail and mail survey modes, and use of incentives) did not prove statistically useful in accounting for the variation in response rate differences between mail and e-mail surveys. These findings suggest that, in this age of Internet technology, mail surveys are still superior to e-mail surveys in terms of obtaining higher response rates.
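The mode comparison at the heart of these studies reduces to computing a response rate per mode and taking the difference. A minimal sketch, with hypothetical counts chosen only to illustrate the roughly 20% gap described above:

```python
# Per-mode response rates and the e-mail minus mail difference.
# Counts are hypothetical illustrations, not data from the meta-analysis.
sent = {"mail": 500, "email": 500}        # surveys distributed per mode
returned = {"mail": 250, "email": 150}    # completed surveys per mode

rates = {mode: returned[mode] / sent[mode] for mode in sent}
difference = rates["email"] - rates["mail"]

for mode, rate in rates.items():
    print(f"{mode}: {rate:.1%}")
print(f"e-mail minus mail: {difference:+.1%}")
```

A meta-analysis then pools such per-study differences (typically weighting by sample size) and tests whether study features such as population type or reminders explain the variation among them.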