Published by SAGE Publications (http://www.sagepublications.com) in Evaluation Review 33(5).
Authors' Note: Please address correspondence to Sharon Brown-Welty, California State University, Fresno, 5005 N. Maple Ave., MS ED 117, Fresno, CA 93740; e-mail: sharonb@csufresno.edu.
Carini et al. 2003; Idleman 2003; Mertler 2003; Sax, Gillmartin, and Bryant
2003; Sun and McClanahan 2003; Kaplowitz, Hadlock, and Levine 2004;
McCabe 2004; Kiernan et al. 2005). The results of these studies, however,
are mixed. With the greater acceptance of an online environment and
increased connectivity, continued research into this mode of survey
administration is warranted.
A single administration mode can provide ample data in some situations but may not be the best approach. Dillman (2000) suggested that mixed-mode administration is a more dynamic approach to surveying, noting that the different approaches "provide an opportunity to compensate for the weaknesses of each method" (p. 218). Providing multiple means of survey completion has been shown to affect response rates in many studies (e.g., McCabe 2004; Kiernan et al. 2005), and because an increased response rate is a prime factor in survey validity, this method deserves serious consideration.
Although there have been mixed results from studies measuring
response rates between paper-based and Web-based surveys in the past,
recent studies have demonstrated an increase in Web-based response rates
as compared to paper-based response rates (Sax, Gillmartin, and Bryant
2003; McCabe 2004; Kiernan et al. 2005). These studies also suggest that
in similar populations where participants are provided multiple options,
they are more often choosing the Web-based survey method. Providing a choice of response method thus has a direct impact on response rates, which in turn may affect the validity of the data analyses and results.
When conducting a survey, response rate is not the only consideration.
Researchers and evaluators must function within the constraints of budgets
and must consider which survey mode will meet the needs of the study
while not depleting too many resources. The costs associated with each survey mode become an important variable, one that must be measured. Whether those costs represent an overall total or are part of an evaluation plan, survey administration costs need to be taken into account.
The costs associated with survey administration are many and varied and depend on the administration mode chosen. The primary costs for most surveys include consumables (i.e., paper and postage) and labor (i.e., stuffing envelopes, inputting data, and creating the Web survey). The consumables required to complete a paper-based survey can expend considerable resources: postage for the prenotification, production of the survey, and the reminder and possible follow-up survey can quickly become costly (Shannon and Bradshaw 2002). Time spent attaching addresses and stuffing and sealing envelopes can also amount to a substantial labor cost.
Purpose
The purpose of this study was twofold: first, it explored whether there was a significant difference in response rates between a Web-based survey administration, a paper-based administration, and a mixed-mode administration; second, it compared the monetary costs associated with each administration mode.
Study Process
Data Collection
The data collected and analyzed were response rates and total monetary
costs for the survey administration. Response rates were calculated using
the method described in the article Standard Definitions by The American
Association of Public Opinion Research (AAPOR). Researchers have noted that there are multiple ways of calculating a response rate; as a result, it is difficult to build on previous research because computations do not always match.
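The AAPOR RR2 rate counts partial responses as responses and includes cases of unknown eligibility in the denominator. The calculation can be sketched as follows; the category counts below are purely illustrative, not this study's actual case dispositions:

```python
def aapor_rr2(completes, partials, refusals, non_contacts, others, unknowns):
    """AAPOR Response Rate 2: (I + P) / ((I + P) + (R + NC + O) + U).

    I = complete interviews, P = partial interviews, R = refusals,
    NC = non-contacts, O = other non-respondents, and U covers cases
    whose eligibility could not be determined.
    """
    numerator = completes + partials
    denominator = numerator + (refusals + non_contacts + others) + unknowns
    return numerator / denominator

# Illustrative counts only.
print(f"RR2 = {aapor_rr2(600, 0, 200, 150, 0, 50):.1%}")  # RR2 = 60.0%
```

Because RR2 fixes both the numerator and the denominator, rates computed this way are comparable across studies in a way that ad hoc "responses over mailings" ratios are not.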
Cost Analysis
Costs for each survey administration were calculated assuming the costs
associated with the creation of the survey were the same across all admin-
istration modes. An average cost-per-hour for the labor related to the survey
production was calculated using the average of the median annual income
of a secretary and the median annual income of an office clerk as was listed
in the United States Department of Labor’s Occupational Outlook Hand-
book (2006). Using this method, the average hourly rate was calculated
to be US$14.44. This hourly rate was multiplied by the total number of
hours spent on survey production.
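That averaging step can be sketched as follows. The salary figures and the 2,080-hour work year are assumptions for illustration only; the paper reports just the resulting US$14.44 rate, not the underlying medians or the conversion used:

```python
HOURS_PER_YEAR = 40 * 52  # assumes a 2,080-hour full-time work year

def average_hourly_rate(median_salary_a, median_salary_b):
    """Average two median annual salaries, then convert to an hourly rate."""
    average_annual = (median_salary_a + median_salary_b) / 2
    return average_annual / HOURS_PER_YEAR

# Hypothetical medians chosen for illustration, not Department of Labor data.
print(f"US${average_hourly_rate(28_080, 32_000):.2f} per hour")  # US$14.44 per hour
```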
The costs for the paper-based mode were calculated using the cost of
production and materials including the actual printing of the survey, post-
age, and the labor required to process the survey and results. Labor calcula-
tions for the paper-based administration included sorting pages, stuffing
envelopes, affixing postage, sealing envelopes, opening and sorting
responses, and entering data into an Excel spreadsheet.
Costs for the Web-based administration included the cost of the sub-
scription to Zoomerang (the Web-based survey creation tool) and time
spent creating and sending the Web-based surveys. The annual subscription
cost for Zoomerang was US$400, which is the price of a nonprofit
subscription.
The costs for the mixed-mode administration included all of the costs
associated with both the paper-based administration and the Web-based
administration. In addition, the mixed-mode administration required
completing the tasks associated with both the previous survey administra-
tion modes.
Survey
Study Design
Results
The results of the current study demonstrated a significant difference in
response rates and costs between the paper-based administration, the
Web-based administration, and the mixed-mode administration. The
mixed-mode survey administration produced the greatest response rate but
at a considerably greater cost. The Web-based administration produced a higher response rate overall than the paper-based administration and was substantially less costly to administer.
Table 1
Response Rate for AEA Employment Survey Administration Groups

Mode    N    # Responses    RR2
Table 2
Response Rate by Response Type

Mode           n        # Responses    RR2
Paper-based    2561a    774            39.1%
Web-based      2561a    1223           61.7%
Total                   1997b
Response Rate
The total response rate for the AEA Employment survey was 51.58%, which is higher than many of the response rates reported in the literature (e.g., Sax, Gillmartin, and Bryant 2003; Converse et al. 2008). Among the administration modes, the mixed-mode administration produced the highest response rate (60.27%), followed by the Web-based administration at 52.46% (see Table 1). Table 2 includes the
breakdown of response rates (RR2) by response type (paper-based and
Web-based) regardless of administration mode. When examining the data
solely by paper-based or Web-based administration (in this analysis, the
mixed-mode was separated into responses received through either the
paper-based survey or the Web-based survey), the response rate for the
Web-based administration was 61.7% as compared to the paper-based
administration with a response rate of 39.1%.
Table 3
Cost Breakdown by Administration Group

Mode    Hours    Total Cost    Cost Per Response
Cost
A cost-per-response figure was calculated for each administration mode
based on the method used by Kaplowitz, Hadlock, and Levine (2004).
The Web-based survey was by far the least costly administration
(US$0.64 per response) followed by the mixed-mode administration
(US$3.61 per response). The most costly survey administration in this study
was the paper-based administration at US$4.78 per response. The cost
breakdown for each administration mode is outlined in Table 3. A graphical
representation of the response rate and costs is provided in Figure 1.
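Following Kaplowitz, Hadlock, and Levine (2004), the per-response figure is simply the total administration cost divided by the number of responses received. A minimal sketch; the total costs and response counts below are hypothetical, since the study reports only the resulting per-response values:

```python
def cost_per_response(total_cost, n_responses):
    """Per-response cost: total administration cost / responses received."""
    if n_responses == 0:
        raise ValueError("no responses received")
    return total_cost / n_responses

# Hypothetical totals for illustration.
for mode, cost, responses in [("Web-based", 640.0, 1000),
                              ("Paper-based", 4780.0, 1000)]:
    print(f"{mode}: US${cost_per_response(cost, responses):.2f} per response")
```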
Discussion
The results of this study show that Web-based survey administration produced higher response rates when administered to an educated population with access to computers. The overall cost per response was notably lower than that of the paper-based administration, and the effort
Figure 1
Response Rates and Response Cost
and their computer skills (perceived or reported) were not a factor in group
placement.
The results of the study provided overwhelming support for the cost-
effectiveness of Web-based survey administrations. In addition, tasks
involved with the paper-based administration were time-consuming
(Archer 2003) and included standing in line for postage, meeting with the
printer, and deciphering the costs associated with bulk mail. With the
Web-based administration, once the subscription is paid and the survey is
created, it can be easily sent to additional respondents without cost.
Furthermore, reminders can be sent and tracked easily within the Web site.
An additional factor that becomes very important in comparing the cost
associated with survey administrations is the time required for data entry.
The paper-based results of the current survey were hand entered into an
Excel spreadsheet for use in data analysis. The data input process required
a total of 39 hours to complete. The Web-based data were already in a
digital format and only needed to be downloaded for analysis. The accuracy
of the data is also a factor that must be noted with Web-based survey
administrations. The process of inputting responses into a spreadsheet can lend itself to error. These errors might be avoided by allowing participants to record their responses directly.
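The scale of that data-entry burden can be reconstructed from the study's own figures (39 hours of hand entry at the US$14.44 average labor rate); the multiplication below is arithmetic on reported numbers, not a figure the paper states directly:

```python
HOURLY_RATE = 14.44      # average labor rate reported in the study (US$)
DATA_ENTRY_HOURS = 39    # hours spent hand-entering paper responses

# Labor cost attributable to data entry alone.
data_entry_cost = DATA_ENTRY_HOURS * HOURLY_RATE
print(f"Data-entry labor: US${data_entry_cost:.2f}")  # Data-entry labor: US$563.16
```

For the Web-based mode this entire line item disappears, which is a large part of the cost gap between the modes.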
Limitations
The sample used in the current study may limit the ability of the results
to be generalized to other populations. The respondents in this study
appeared to have a particular interest in the results of the survey and might
have felt a professional obligation toward completing the survey. Furthermore, these participants may be more computer literate and have greater computer access due to the nature of their work. A word of caution in generalizing these results: Web-based surveys should be considered for working professionals or for individuals who are known to have access to, and to be users of, the Internet. Populations that might be targeted for Web-based surveys include postsecondary education employees, PreK-12 employees, employees in technology or medical fields, and respondents to surveys on issues related to employment.
Bias may also be present in the response rate because the respondents in this study regularly use surveys as tools of their trade. Other professionals (e.g., psychologists, researchers, and marketing
manipulating the saliency variable can be difficult but may lead to promising results. Comparing response rates from a survey directly related to the goals and needs of an association with those of a comparison group of nonmembers might be a logical next step for this research.
Concluding Discussion
References
Archer, Thomas M. 2003. Web-based surveys. Journal of Extension 41:1-5.
Carini, Robert M., John Hayek, George Kuh, John Kennedy, and Judith Ouimet. 2003. College student responses to web and paper surveys: Does mode matter? Research in Higher Education 44:1-19.
Cobanoglu, Cihan, Bill Warde, and Patrick Moreo. 2001. A comparison of mail, fax, and web-based survey methods. International Journal of Market Research 43:441-52.
Converse, Patrick D., Edward Wolfe, Xiaoting Huang, and Fredrick Oswald. 2008. Response rates for mixed-mode surveys using mail and e-mail/web. American Journal of Evaluation 29:99-107.
Cook, Colleen, Fred Heath, and Russel L. Thompson. 2000. A meta-analysis of response rates in web- or internet-based surveys. Educational and Psychological Measurement 60:821-36.
Dillman, Donald A. 2000. Mail and internet surveys: The tailored design method. 2nd ed. New York: John Wiley & Sons.
Groves, Robert M., Eleanor Singer, and Amy Corning. 2000. Leverage-saliency theory of survey participation: Description and an illustration. Public Opinion Quarterly 64:299-308.
Groves, Robert M., Stanley Presser, and Sarah Dipko. 2004. The role of topic interest in survey participation decisions. Public Opinion Quarterly 68:2-31.
Heberlein, Thomas, and Robert Baumgartner. 1978. Factors affecting response rates to mailed questionnaires: A quantitative analysis of the published literature. American Sociological Review 43:447-62.
Idleman, Lynda. 2003. Comparing responses to mail and web-based surveys. Paper presented at the annual meeting of the American Educational Research Association, April 21-25, in Chicago, IL.
Kaplowitz, Michael, Timothy Hadlock, and Ralph Levine. 2004. A comparison of web and mail survey response rates. Public Opinion Quarterly 68:94-101.
Kiernan, Nancy Ellen, Michaela Kiernan, Mary Ann Oyler, and Carolyn Gilles. 2005. Is a web survey as effective as a mail survey? A field experiment among computer users. American Journal of Evaluation 26:245-52.
Krosnick, Jon A. 1999. Survey research. Annual Review of Psychology 50:537-67.
McCabe, Sean Esteban. 2004. Comparison of web and mail surveys in collecting illicit drug use data: A randomized experiment. Journal of Drug Education 34:61-72.
Mertler, Craig. 2003. What . . . another survey? Patterns of response and nonresponse from teachers to traditional and web surveys. Paper presented at the annual meeting of the Mid-Western Educational Research Association, October 15-18, in Columbus, OH.
Occupational Outlook Handbook. 2006. Bureau of Labor Statistics. http://www.bls.gov/oco/home.htm (accessed April 2, 2006).
Paolo, Anthony M., Giulia Bonaminio, Cheryl Gibson, Ty Partridge, and Ken Kallail. 2000. Response rate comparisons of e-mail and mail-distributed student evaluations. Teaching and Learning in Medicine 12:81-4.
Sax, Linda J., Shannon Gillmartin, and Alyssa Bryant. 2003. Assessing response rates and nonresponse bias in web and paper surveys. Research in Higher Education 44:409-32.
Schonlau, Mathias, Ronald Fricker, and Marc Elliott. 2002. Conducting research surveys via e-mail and the web. Santa Monica, CA: Rand.
Shannon, David, and Carol Bradshaw. 2002. A comparison of response rate, response time, and costs of mail and electronic surveys. The Journal of Experimental Education 70:179-92.
Shannon, David, Todd Johnson, Shelby Searcy, and Alan Lott. 2002. Using electronic surveys: Advice from survey professionals. Practical Assessment, Research & Evaluation. http://pareonline.net/htm/v8n/htm (accessed June 19, 2009).
Sheehan, Kim. 2001. E-mail survey response rates: A review. Journal of Computer-Mediated Communication 6:1-20.
Sun, Anji, and Randy McClanahan. 2003. Is newer better: A comparison of web and paper-pencil survey administration modes. Paper presented at the annual meeting of the American Educational Research Association, April 21-25, in Chicago, IL.
Corey Greenlaw, EdD, is the evaluation and assessment coordinator for the Fresno County
Office of Education. His work centers on helping teachers and administrators improve the
impact of instruction on student learning through the use of assessment data. In addition, he
is a part-time faculty member teaching statistical and research methods in the Kremen School
of Education and Human Development at California State University, Fresno.
Sharon Brown-Welty, EdD, is a professor of education administration and the director of the
doctoral program in Educational Leadership at California State University, Fresno. She has
conducted research in the areas of program evaluation, educational leadership, and student
achievement.