

Evaluation Review
Volume 33, Number 5, October 2009, 464-480
© 2009 SAGE Publications
DOI: 10.1177/0193841X09340214
http://erx.sagepub.com, hosted at http://online.sagepub.com

A Comparison of Web-Based and Paper-Based Survey Methods
Testing Assumptions of Survey Mode and Response Cost

Corey Greenlaw
Fresno County Office of Education
Sharon Brown-Welty
California State University, Fresno

Web-based surveys have become more prevalent in areas such as evaluation, research, and marketing. The proliferation of these online surveys raises the question: how do their response rates compare with those of traditional surveys, and at what cost? This research explored response rates and costs for Web-based, paper-based, and mixed-mode surveys. The participants were evaluators from the American Evaluation Association (AEA). Results indicated that the mixed-mode administration, while more expensive, produced higher response rates.

Keywords: Web-based surveys; response rates; online surveys; mixed-mode surveys; survey costs

Surveys are an integral part of an evaluator's toolkit. They can be an effective means of collecting subjects' opinions, demographics, or feedback in a straightforward and potentially low-cost manner. Although the data gathered through surveys can be useful, certain considerations, such as the validity of the data analysis under low response rates and the cost of producing and administering the survey, may affect the choice of data collection method.

Authors' Note: Please address correspondence to Sharon Brown-Welty, California State University, Fresno, 5005 N. Maple Ave., MS ED 117, Fresno, CA 93740; e-mail: sharonb@csufresno.edu.




Another consideration important to evaluators is the validity of the survey results as a function of response rates (Krosnick 1999; Cook, Heath, and Thompson 2000; Paolo et al. 2000; Sheehan 2001; Idleman 2003; Mertler 2003; McCabe 2004; Kiernan et al. 2005). If one aspect of survey methodology consistently produces significantly more responses from participants, there is a greater likelihood that questions can be answered and error can be minimized (Dillman 2000). Response rates, particularly low response rates, were the subject of a study by Cook, Heath, and Thompson (2000), whose meta-analysis of 68 surveys reported in 49 different studies found a mean response rate of 39.6%. When Cook, Heath, and Thompson looked only at surveys with no missing data, the mean response rate was even lower, 34.6%. In contrast, Converse et al. (2008) explored response rates using mixed-mode e-mail/mail and mail/e-mail administrations and obtained a higher response rate. The subject of that survey was salient to participants, as it related to the National Board for Professional Teaching Standards assessment for recognizing accomplished educators.
The survey by Converse et al. (2008) was sent to 1,500 PreK-12 teachers in two states and yielded an overall response rate of 76.3%, higher than is typically reported in the survey literature (Cook, Heath, and Thompson 2000; Sax, Gillmartin, and Bryant 2003). In the Converse et al. study, 61.3% responded to the initial contact and an additional 15.0% responded to the second contact. Disaggregated further, there was an 80.7% response rate for the first contact of the mail-survey mode and a 41.8% response rate for the first contact of the e-mail-mode administration. The authors calculated the costs for the mail/e-mail administration to be higher than for the e-mail/mail administration (US$5.32 for the former as compared to US$4.95 for the latter; p. 105).
Web-based surveys have been purported to be a means of collecting data from large sample groups quickly and with minimal cost (Schonlau, Fricker, and Elliott 2002). The design, dissemination, data storage, and data analysis of Web-based surveys are efficient and are becoming more user-friendly with the introduction of multiple survey Web sites (e.g., Zoomerang.com). Furthermore, using Web-based surveys and a mixed-mode approach can increase the response rate, which may result in a more valid analysis of the data collected. The effectiveness of this mode of survey data collection has been researched using many different populations and in many different settings (Heberlein and Baumgartner 1978; Cobanoglu, Warde, and Moreo 2001; Shannon et al. 2002; Archer 2003; Carini et al. 2003; Idleman 2003; Mertler 2003; Sax, Gillmartin, and Bryant 2003; Sun and McClanahan 2003; Kaplowitz, Hadlock, and Levine 2004; McCabe 2004; Kiernan et al. 2005). The results of these studies, however, are mixed. With the greater acceptance of an online environment and increased connectivity, continued research into this mode of survey administration is warranted.
A single administration mode can provide ample data in some situations but may not always be the best approach. Dillman (2000) suggested that mixed-mode administration is a more dynamic approach to surveying, noting that the different approaches "provide an opportunity to compensate for the weaknesses of each method" (p. 218). The ability to provide multiple means of survey completion has been shown to affect response rates in many studies (e.g., McCabe 2004; Kiernan et al. 2005), and because increased response rate is a prime factor in survey validity, serious consideration should be given to this method.
Although studies comparing response rates between paper-based and Web-based surveys have produced mixed results in the past, recent studies have demonstrated an increase in Web-based response rates relative to paper-based response rates (Sax, Gillmartin, and Bryant 2003; McCabe 2004; Kiernan et al. 2005). These studies also suggest that in similar populations, when participants are provided multiple options, they more often choose the Web-based survey method. Moreover, providing a choice of response method has a direct impact on response rates, which may in turn affect the validity of the data analyses and results.
When conducting a survey, response rate is not the only consideration. Researchers and evaluators must function within the constraints of budgets and must consider which survey mode will meet the needs of the study without depleting too many resources. The costs associated with each survey mode thus become an important variable, one that must be measured. Whether those costs stand alone as a total or form part of an evaluation plan, survey administration costs need to be taken into account.
The costs associated with survey administration are many and varied and depend on the administration mode chosen. The primary costs for most surveys include consumables (i.e., paper and postage) and labor (i.e., stuffing envelopes, inputting data, and creating the Web survey). The consumables required to complete a paper-based survey can expend considerable resources: postage for the prenotification, production of the survey, and the reminder and possible follow-up survey can quickly become costly (Shannon and Bradshaw 2002). Time spent attaching addresses and stuffing and sealing envelopes can also amount to a significant investment. In addition, once the survey is returned, the data are usually entered into a database or spreadsheet, which can be laborious as well as time-consuming (Cobanoglu, Warde, and Moreo 2001). At times, the costs associated with the paper-based administration mode can be minimized by using graduate students or volunteers. For an evaluator or researcher in private practice or within a company, however, these costs must be accounted for and included in the budget.
The costs associated with Web-based survey administration differ from those associated with the paper-based survey and tend to be lower. One cost for Web-based surveys that may be higher than for paper-based surveys is the labor cost: the survey must be entered into a Web site designed for survey administration. Authors have reported that creating the survey online was a significant cost and required specialized knowledge (Shannon and Bradshaw 2002). However, as technology has improved, the user interface for these types of Web sites has made this input process much less time-consuming and, therefore, less costly. It is now possible to create a survey, upload e-mail addresses, and distribute the survey in a few hours, at relatively little expense compared with the mailing expense of paper-based surveys. In addition, when participants reply to the survey, the results are immediately recorded on the Web site for later download. The electronic reply eliminates the data input process, which further reduces the time required as well as the potential for transcription errors.
Comparing costs across administration modes has proven somewhat difficult because of the way in which survey costs are calculated. The survey cost data reported in the literature vary by study: Cobanoglu, Warde, and Moreo (2001) and Shannon and Bradshaw (2002) reported only the total cost of each administration type, in contrast to Archer (2003), who noted each individual cost related to each survey administration. In a completely different fashion, Kaplowitz, Hadlock, and Levine (2004) reported the cost for each survey administration as cost per response. This final method, cost per response, provides a compelling blend of response rate data and the costs required to obtain each response.

Purpose

The purpose of this study was twofold: first, it explored whether there is a significant difference in response rates among a Web-based survey administration, a traditional paper-based survey administration, and a mixed-mode method that used both Web- and paper-based methods. Second, the study examined which survey mode was the most cost-effective mode of survey administration.

Study Process

To study the difference between Web-based and paper-based survey response rates, a collaboration was formed with the American Evaluation Association (AEA) to create a survey that would collect salary and work-related information from its membership. The approximately 4,000 members are generally very well educated and technically literate. The employment survey served as the vehicle for a study assessing the effectiveness of Web-based surveys as well as the associated cost of administration. The AEA was chosen primarily for two reasons: employment information was needed by and useful to the organization, and the evaluative focus of the association would increase the saliency of the data, which could increase response rates (Groves, Singer, and Corning 2000; Groves, Presser, and Dipko 2004). All members were randomly assigned to one of three groups, so variables such as computer use, age, and issue saliency were not factors in the final analysis. Other groups of professionals (e.g., lawyers, educators, etc.) are closely related to the general membership of the AEA and would therefore benefit from the results of this study.
The study drew survey responses from the general membership of the AEA. Only residents of the United States were included, to avoid the complication of monetary exchange rates. Participants also had to have both a listed mailing address and a listed e-mail address to allow for random placement in the treatment and control groups. There were 1,986 total participants in this study, with 96% of the respondents holding a master's or doctoral degree.

Data Collection
The data collected and analyzed were response rates and total monetary costs for the survey administration. Response rates were calculated using the method described in Standard Definitions, published by the American Association for Public Opinion Research (AAPOR). Researchers have noted that there are multiple ways of calculating a response rate; as a result, it is difficult to build on previous research because computations do not always match. To alleviate this obstacle, AAPOR recommends an industry-standard response rate calculation. Using this calculation allows future studies to use the collected data in a valid manner as well as to attempt to replicate this study.

The AAPOR (2006) Response Rate 2 (RR2) calculation best addressed the factors involved in this study. The formula divides the number of completed surveys (completed and partially completed) by the number of completed surveys (completed and partially completed) plus the number of all noncompleted surveys (sent to a potential participant but not completed because of an incorrect e-mail address, an incorrect mailing address, or other reasons). The use of this formula standardizes response rates so that response rate comparisons can be undertaken within the literature.
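
Expressed compactly (a minimal formalization of the description above, writing C for completed, P for partially completed, and N for all noncompleted cases; the full AAPOR standard distinguishes finer disposition categories than this sketch does):

\[ \mathrm{RR2} = \frac{C + P}{(C + P) + N} \]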

Cost Analysis
Costs for each survey administration were calculated assuming that the costs associated with the creation of the survey itself were the same across all administration modes. An average cost per hour for survey-production labor was calculated by averaging the median annual incomes of a secretary and an office clerk as listed in the United States Department of Labor's Occupational Outlook Handbook (2006). Using this method, the average hourly rate was calculated to be US$14.44. This hourly rate was multiplied by the total number of hours spent on survey production.
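
As a minimal sketch of that rate-and-labor calculation in Python, assuming the conventional 2,080-hour work year (the article does not state the hours basis it used) and placeholder annual medians rather than the actual 2006 Handbook figures:

    # Hypothetical conversion of two annual medians to a blended hourly rate.
    # The annual figures below are placeholders, NOT the 2006 Handbook values.
    HOURS_PER_YEAR = 40 * 52  # 2,080-hour work year (assumption)

    secretary_annual = 31_000     # placeholder
    office_clerk_annual = 29_000  # placeholder

    hourly_rate = (secretary_annual + office_clerk_annual) / 2 / HOURS_PER_YEAR
    print(f"US${hourly_rate:.2f}/hour")  # the article's blended figure was US$14.44

    # Labor cost is then the hourly rate times hours of production work, e.g.,
    # the 57 hours Table 3 reports for the paper-based administration:
    print(f"US${14.44 * 57:.2f} of labor")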
The costs for the paper-based mode were calculated using the cost of
production and materials including the actual printing of the survey, post-
age, and the labor required to process the survey and results. Labor calcula-
tions for the paper-based administration included sorting pages, stuffing
envelopes, affixing postage, sealing envelopes, opening and sorting
responses, and entering data into an Excel spreadsheet.
Costs for the Web-based administration included the cost of the sub-
scription to Zoomerang (the Web-based survey creation tool) and time
spent creating and sending the Web-based surveys. The annual subscription
cost for Zoomerang was US$400, which is the price of a nonprofit
subscription.
The costs for the mixed-mode administration included all of the costs associated with both the paper-based and the Web-based administrations. In addition, the mixed-mode administration required completing the tasks associated with both of the previous survey administration modes.

Survey

The survey created was an employment survey that collected information about employment status and compensation as an evaluator. Once the general framework of the survey was completed, and to establish content validity, a draft was sent via e-mail to four AEA members who are professionals in survey research. Each of the professionals was asked to take the survey. A meeting was then held at the AEA national conference in Toronto, to which all four professionals were invited to discuss the survey instrument and their experience while taking it. During the meeting, suggestions were made and the survey was refined to better meet the needs of the AEA. In addition, suggestions were made to increase readability and to change some of the survey format to increase response rates. A final draft of the survey was then sent to the committee members for final feedback.

Study Design

The study used a random assignment design in which participants were randomly assigned to one of three survey administration groups.

Group 1 (Web-based administration) consisted of members who received the survey instrument in Web-based form. These members received the survey invitation and follow-up reminders only through e-mail. They were able to access the survey through a link embedded in the e-mail message; the link appeared in both the primary survey invitation and the reminder message. When respondents clicked on the embedded link, a Web page opened with the introduction to the survey. The e-mailed link created a unique identification for each survey, so once a participant began the survey, he or she could return later to complete it by clicking on the same link. Once the respondent completed and submitted the survey, he or she was no longer able to modify or change answers.
Group 2 (paper-based administration) received notification and the actual survey instrument through the United States Postal Service. All reminders also took place through the United States Postal Service, and reminders and invitations were sent to the address listed in the AEA membership roster.
Group 3 (mixed-mode administration) received both of the above treatment conditions: they received notification through the Internet and through the United States Postal Service, and they had the opportunity to complete the survey online or to complete the paper-and-pencil instrument that was sent to them. Communications were conducted through both mail and e-mail.
A reminder postcard was sent to all members of the paper-based group
2 weeks following the initial administration of the survey. The postcard
reminded the potential respondent to complete the survey they received and
to return it in the envelope provided. The postcard also contained an e-mail
address where the respondent could e-mail the author for an additional copy
of the survey.
A unique number was assigned to each of the members included in the
mixed-mode administration group and was placed on the survey mailed to
that individual. Web-based responses were matched with the individuals in
the mixed-mode group through the use of a query in a database. The com-
pleted paper-based surveys were compared to the resulting database to
identify and eliminate any duplicates.
During the pilot study, the expected amount of time it would take a respondent to complete the survey was established to be below 10 minutes. This respondent burden was stated in the cover letter for both the paper-based and Web-based surveys. The AEA requested that the data be collected because they did not exist in any database, and the study collected two types of usable data: data to explore survey response rates and data about evaluators' salaries and workloads, both of which fall under the concept of managing respondent burden.

Results
The results of the current study demonstrated a significant difference in response rates and costs among the paper-based, Web-based, and mixed-mode administrations. The mixed-mode survey administration produced the greatest response rate, but at a considerably greater cost. The Web-based administration produced a higher response rate overall than did the paper-based administration and was substantially less costly to administer.


Table 1
Response Rate for AEA Employment Survey Administration Groups

Mode          N      # Responses   RR2
Paper-based   1,280  538           42.03%
Web-based     1,281  672           52.46%
Mixed-mode    1,281  772           60.27%
Total         3,842  1,982         51.58%

Note: AEA = American Evaluation Association; RR2 = Response Rate 2.

Table 2
Response Rate by Response Type

Mode          n        # Responses   RR2
Paper-based   2,561a   774           39.1%
Web-based     2,561a   1,223         61.7%
Total                  1,997b

Note: RR2 = Response Rate 2.
a. Total number of potential respondents for each mode of survey completion; mixed-mode group members appear in each mode.
b. Includes 15 duplicate responses that were pulled before the final analysis.

Response Rate
The total response rate for the AEA Employment Survey was 51.58%, higher than many response rates reported in the literature (e.g., Sax, Gillmartin, and Bryant 2003; Converse et al. 2008). Among the administration groups, the mixed-mode administration produced the largest response rate (60.27%), with the Web-based administration following at 52.46% (see Table 1). Table 2 presents the breakdown of response rates (RR2) by response type (paper-based and Web-based) regardless of administration mode. When the data were examined solely by response type (for this analysis, the mixed-mode group was separated into responses received through either the paper-based survey or the Web-based survey), the response rate for Web-based responses was 61.7% as compared to 39.1% for paper-based responses.


Table 3
Cost Breakdown by Administration Group

Mode                          Hours   Total Cost   Cost per Response
Paper-based (RR2 = 42.03%)    57      US$2,573     US$4.78
Web-based (RR2 = 52.46%)      2       US$429       US$0.64
Mixed-mode (RR2 = 60.27%)     44      US$2,785     US$3.61

Note: RR2 = Response Rate 2.

To test whether there was a significant difference among the paper-based, Web-based, and mixed-mode administrations, a one-way analysis of variance was performed. The results indicated a significant difference between methods of survey administration (F(2, 3,840) = 44.799, p < .001). A Tukey's honestly significant difference (HSD) analysis revealed that all pairwise response rate differences were significant at the p < .01 level. Specifically, the mixed-mode administration produced the greatest number of responses. Finally, there were no significant differences in response rates by education level or gender.
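
The article does not publish its analysis code; the following Python sketch (an assumption of this edit, not part of the original study) shows how such a comparison could be reproduced from the Table 1 counts, coding each member with a binary response indicator:

    import numpy as np
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    def indicators(n_total, n_responded):
        # 1 = responded, 0 = did not respond
        return np.r_[np.ones(n_responded), np.zeros(n_total - n_responded)]

    paper = indicators(1280, 538)   # Table 1, paper-based group
    web = indicators(1281, 672)     # Table 1, Web-based group
    mixed = indicators(1281, 772)   # Table 1, mixed-mode group

    # One-way ANOVA across the three administration groups
    f_stat, p_value = f_oneway(paper, web, mixed)
    print(f"F = {f_stat:.3f}, p = {p_value:.3g}")

    # Tukey's HSD pairwise comparisons
    responses = np.concatenate([paper, web, mixed])
    groups = ["paper"] * 1280 + ["web"] * 1281 + ["mixed"] * 1281
    print(pairwise_tukeyhsd(responses, groups))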

Cost
A cost-per-response figure was calculated for each administration mode based on the method used by Kaplowitz, Hadlock, and Levine (2004). The Web-based survey was by far the least costly administration (US$0.64 per response), followed by the mixed-mode administration (US$3.61 per response). The most costly survey administration in this study was the paper-based administration, at US$4.78 per response. The cost breakdown for each administration mode is outlined in Table 3. A graphical representation of the response rates and costs is provided in Figure 1.
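
The per-response figures in Table 3 follow directly from dividing each mode's total cost by its response count in Table 1; a quick Python check (numbers taken from the two tables):

    responses = {"paper-based": 538, "web-based": 672, "mixed-mode": 772}    # Table 1
    total_cost = {"paper-based": 2573, "web-based": 429, "mixed-mode": 2785}  # Table 3, US$

    for mode, n in responses.items():
        print(f"{mode}: US${total_cost[mode] / n:.2f} per response")
    # paper-based: US$4.78, web-based: US$0.64, mixed-mode: US$3.61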

Discussion

The results of this study show that the use of Web-based survey administration produced higher response rates when administered to an educated population with access to computers. The overall cost per response was notably lower than for the paper-based administration, and the effort necessary to produce and distribute the Web-based survey was markedly reduced.

Figure 1
Response Rates and Response Cost
[Figure: response rates plotted against response costs for the three groups. Mixed-mode (N = 772, RR2 = 60.27%, US$3.61/response); Web-based (N = 623, RR2 = 52.46%, US$0.64/response); Paper-based (N = 538, RR2 = 42.03%, US$4.78/response). *Differences significant at p < .01. Adapted from Kaplowitz, Hadlock, and Levine 2004.]
This finding contradicts the findings of Converse et al. (2008). The reason for this difference may lie in the benefit the survey rendered to the respondents. In the case of Converse et al., respondents were completing an evaluation instrument from which they might benefit personally: recognition as an accomplished educator. In the survey administered in this study, the data would be valuable only indirectly to the individual (e.g., Were they making competitive salaries as compared to peers in the field?). Although the actual costs Converse et al. reported for each survey-mode administration differed from the costs calculated for this study, they likewise found that the paper-based survey mode was more costly than the Web-based survey mode.
While the mixed-mode group produced the greatest response rate, more than two thirds of the responses in the mixed-mode group came from the Web-based survey, and more than 60% of the total survey responses came via the Web-based survey. With the advent and greater use of Internet surveys to collect multiple types of data, the current study demonstrates how response rates can be improved with the use of Web-based surveys. Dillman (2000) noted that "Mixed-mode surveys provide an opportunity to compensate for the weaknesses of each method" (p. 218). Survey response rates from this study demonstrate that Dillman's suggestion was accurate. Respondents in the mixed-mode group had the option to choose either the paper-based format or the Web-based format to complete the survey. The significant difference in response rate for the mixed-mode group suggests that people are more likely to respond when they can choose the response format. Some respondents may have chosen not to respond because they were unfamiliar with technology and did not feel they had the technological capabilities to complete an online survey. Although these numbers will continue to diminish as computers become even more prevalent in society, for now it is important to consider how the format of a survey administration may affect response rates.
This study found that the mixed-mode survey administration produced the greatest response rate of the three groups. However, there was also a significant difference in response rates when the three groups were collapsed into paper-based responses and Web-based responses. These response rates confirmed other studies that found significant differences in response rates between Web-based and paper-based administrations. Kiernan et al. (2005), McCabe (2004), and Sax, Gillmartin, and Bryant (2003) had similar results when offering respondents the option to take the survey in either format: people selected the online survey significantly more often. All of the above studies had very homogeneous samples, in that two of the studies included only college students, who tend to be more computer literate, and the other study offered the survey only to individuals who stated that they were computer literate. The results of the current study demonstrate a greater degree of external validity in that respondents were selected solely on their membership in the AEA, and their computer skills (perceived or reported) were not a factor in group placement.
The results of the study provided overwhelming support for the cost-
effectiveness of Web-based survey administrations. In addition, tasks
involved with the paper-based administration were time-consuming
(Archer 2003) and included standing in line for postage, meeting with the
printer, and deciphering the costs associated with bulk mail. With the
Web-based administration, once the subscription is paid and the survey is
created, it can be easily sent to additional respondents without cost.
Furthermore, reminders can be sent and tracked easily within the Web site.
An additional factor that becomes very important in comparing the costs associated with survey administrations is the time required for data entry. The paper-based results of the current survey were hand entered into an Excel spreadsheet for data analysis; the input process required a total of 39 hours to complete. The Web-based data were already in digital format and only needed to be downloaded for analysis. Data accuracy must also be noted with Web-based survey administrations: the process of inputting responses into a spreadsheet can lend itself to error, and these errors can be avoided by allowing participants to record their responses directly.

Limitations
The sample used in the current study may limit the generalizability of the results to other populations. The respondents in this study appeared to have a particular interest in the results of the survey and might have felt a professional obligation to complete it. Furthermore, these participants may be more computer literate and have greater computer access because of the nature of their work. As a word of caution in generalizing these results, Web-based surveys should be considered for working professionals or for individuals who are known to have access to and be users of the Internet. Examples of populations that might be targeted for Web-based surveys include postsecondary education employees, PreK-12 employees, employees in technology fields, and medical field employees; surveys targeted at issues related to employment may also be appropriate for this mode.
Bias may be present in the response rate because the respondents in this study regularly use surveys as tools of their trade. Other professionals (e.g., psychologists, researchers, and marketing professionals) also use surveys and data collection instruments on a regular basis and would therefore be likely to produce results similar to those found in this study. In addition, any group or organization collecting data of high interest (high saliency), like salary information, might also be similar enough to the present sample to generalize the results of this study. However, if the sample receiving the survey is not one of the professional groups noted above, the results of the study may not be generalizable to that sample or population.
Another possible limitation of the current study is the context in which the data were gathered. As noted previously, survey costs vary between studies (e.g., Archer 2003; Kaplowitz, Hadlock, and Levine 2004). The variation in cost might be due less to actual costs and more to context. One example of context is the difference between a small business that needs to hire clerical staff to input survey data and a university that uses graduate assistants or student assistants; the amount of work required could be the same, but the cost might be significantly different. Another example is the improvement and proliferation of online survey creation tools. Previously, these tools were very expensive and required specialized knowledge to use; now anyone can use Web sites like SurveyMonkey.com for free. As competition increases and Web sites improve, those cost differences may become accentuated. As future researchers attempt to replicate the findings from this study, they may find it difficult to match the precise context in which the data were collected.

Recommendations for Further Research

The current study used a sample of potential respondents who were deemed to have an interest in the results; the subject matter of the survey was therefore salient to their profession. The issue of saliency was not tested in any experimental fashion in the current study, but a presumption can be made that it may have had an impact on response. Response rates were considerably higher in the current study than in many other studies (e.g., Sax, Gillmartin, and Bryant 2003, 24%; Kaplowitz, Hadlock, and Levine 2004, highest group 31.5%). In the current study, the overall response rate was approximately 50%, with the highest group achieving a 60% response rate.
Groves, Singer, and Corning (2000) and Groves, Presser, and Dipko (2004) have begun the foray into the issue of whether saliency has an impact on response rates. The response rates in these studies indicated that manipulating the saliency variable can be difficult but may lead to promising results. Comparing response rates from a survey directly related to the goals and needs of an association against a comparison group of nonmembers might be a logical next step for this research.

Concluding Discussion

Overall, the mixed-mode design appeared to be the best approach for generating responses to the AEA Employment Survey. When individuals were given a choice of survey administration mode, they were significantly more likely to respond. Although the difference in response was significant, the cost associated with the additional responses was considerable. In this study, the additional 149 responses obtained in the mixed-mode group as compared to the Web-based group cost a total of US$2,356 in postage, duplication, labor, and data entry, which works out to US$15.81 per additional response.
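
A quick check of that marginal figure, using the totals reported above:

    # Incremental cost per extra response of mixed-mode over Web-only
    extra_cost = 2356       # US$: additional postage, duplication, labor, data entry
    extra_responses = 149   # mixed-mode responses beyond the Web-based group
    print(f"US${extra_cost / extra_responses:.2f} per additional response")  # US$15.81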
Evaluators must balance the need for increased response rates with the real cost associated with data collection. In the larger business community, the monetary expense necessary to produce greater response rates may be justified. In smaller business settings and in contracting economies, increased expenses to raise response rates may be deemed excessive and not an appropriate use of funds. The application of these results must also be viewed from the perspective of the sample chosen for future surveys. If the population from which the sample is drawn does not have access to the Internet, then the above discussion becomes moot. If responses to surveys are important and necessary to the outcome of an evaluation, then surveys must be disseminated in a manner in which they will be easily accessed and responded to, thereby increasing response rates.

References
Archer, Thomas M. 2003. Web-based surveys. Journal of Extension 41:1-5.
Carini, Robert M., John Hayek, George Kuh, John Kennedy, and Judith Ouimet. 2003. College
student responses to web and paper surveys: Does mode matter? Research in Higher Edu-
cation 44:1-19.
Cobanoglu, Cihan, Bill Warde, and Patrick Moreo. 2001. A comparison of mail, fax, and web-
based survey methods. International Journal of Market Research 43:441-52.
Converse, Patrick D., Edward Wolfe, Xiaoting Huang, and Fredrick Oswald. 2008. Response
rates for mixed-mode surveys using mail and e-mail/web. American Journal of Evaluation
29:99-107.


Cook, Colleen, Fred Heath, and Russel L. Thompson. 2000. A meta-analysis of response rates
in web- or internet-based surveys. Educational and Psychological Measurement 60:821-
36.
Dillman, Donald A. 2000. Mail and internet surveys: The Tailored Design method. 2nd ed.
New York: John Wiley & Sons, Inc.
Groves, Robert M., Eleanor Singer, and Amy Corning. 2000. Leverage-saliency theory of sur-
vey participation: Description and an illustration. Public Opinion Quarterly 64:299-308.
Groves, Robert M., Stanley Presser, and Sarah Dipko. 2004. The role of topic interest in sur-
vey participation decisions. Public Opinion Quarterly 68:2-31.
Heberlein, Thomas, and Robert Baumgartner. 1978. Factors affecting response rates to mailed
questionnaires: A quantitative analysis of the published literature. American Sociological
Review 43:447-62.
Idleman, Lynda. 2003. Comparing responses to mail and web-based surveys. Paper presented
at the annual meeting of the American Educational Research Association, April 21–25 in
Chicago, IL.
Kaplowitz, Michael, Timothy Hadlock, and Ralph Levine. 2004. A comparison of web and
mail survey response rates. Public Opinion Quarterly 68:94-101.
Kiernan, Nancy Ellen, Michaela Kiernan, Mary Ann Oyler, and Carolyn Gilles. 2005. Is a web
survey as effective as a mail survey? A field experiment among computer users. American
Journal of Evaluation 26:245-52.
Krosnick, Jon A. 1999. Survey research. Annual Review of Psychology 50:537-67.
McCabe, Sean Esteban. 2004. Comparison of web and mail surveys in collecting illicit drug
use data: A randomized experiment. Journal of Drug Education 34:61-72.
Mertler, Craig. 2003. What . . . another survey? Patterns of response and nonresponse from
teachers to traditional and web surveys. Paper presented at the annual meeting of the
Mid-Western Educational Research Association, October 15–18 in Columbus, OH.
Occupational Outlook Handbook. 2006. Bureau of Labor Statistics. http://www.bls.gov/oco/home.htm (accessed April 2, 2006).
Paolo, Anthony M., Giulia Bonaminio, Cheryl Gibson, Ty Partridge, and Ken Kallail. 2000.
Response rate comparisons of e-mail and mail-distributed student evaluations. Teaching
and Learning in Medicine 12:81-4.
Sax, Linda J., Shannon Gillmartin, and Alyssa Bryant. 2003. Assessing response rates and
nonresponse bias in web and paper surveys. Research in Higher Education 44:409-32.
Schonlau, Mathias, Ronald Fricker, and Marc Elliott. 2002. Conducting research surveys via
e-mail and the web. Santa Monica, CA: Rand.
Shannon, David, and Carol Bradshaw. 2002. A comparison of response rate, response time,
and costs of mail and electronic surveys. The Journal of Experimental Education
70:179-92.
Shannon, David, Todd Johnson, Shelby Searcy, and Alan Lott. 2002. Using electronic surveys: Advice from survey professionals. Practical Assessment, Research & Evaluation. http://pareonline.net/htm/v8n/htm (accessed June 19, 2009).
Sheehan, Kim. 2001. E-mail survey response rates: A review. Journal of Computer Mediated
Communication 6:1-20.
Sun, Anji, and Randy McClanahan. 2003. Is newer better? A comparison of web and paper-pencil survey administration modes. Paper presented at the annual meeting of the American Educational Research Association, April 21-25, in Chicago, IL.


Corey Greenlaw, EdD, is the evaluation and assessment coordinator for the Fresno County
Office of Education. His work centers on helping teachers and administrators improve the
impact of instruction on student learning through the use of assessment data. In addition, he
is a part-time faculty member teaching statistical and research methods in the Kremen School
of Education and Human Development at California State University, Fresno.

Sharon Brown-Welty, EdD, is a professor of education administration and the director of the
doctoral program in Educational Leadership at California State University, Fresno. She has
conducted research in the areas of program evaluation, educational leadership, and student
achievement.
