
Employee Survey Content 101
Tips and Guidelines to Create an Effective Survey

White Paper
CEB Workforce Surveys & Analytics
Contents

Executive Summary
Survey Items: Types and Guidelines
Single Response Items
Multiple Response Items
Guidelines for Writing Effective Survey Items
Employee Segmentation: Demographic Questions
Be Judicious in Your Use of Background or Demographic Questions
Choosing the Right Response Scales/Options
Common Rating Scales
A Special Note on Frequency Scales
Guidelines for Response Options and Response Scales
Branching Items or Skip Patterns
Open-Ended Comment Questions
Survey Length
About Long Surveys
How to Determine the Appropriate Length
Measuring the Length of a Survey
Conclusion
About CEB

Executive Summary
Employee surveys are a valuable and efficient tool for collecting critical
information for your business. A well-constructed survey uncovers
data that, when combined with good judgment, lead to key insights and
business improvements. But how do you create a good survey? This white
paper provides tips and best practices for creating an effective survey,
including writing effective survey items, choosing appropriate response
scales, using open-ended questions to maximize insights, and controlling
survey length to minimize survey fatigue.
Creating good survey items starts with defining the key areas of interest
and ensuring that a survey is the best strategy for gathering this type of
information. This white paper focuses on how to translate these concepts
into effective survey items.

Survey Items: Types and Guidelines
Single Response Items
There are many different types of survey items (e.g., multiple choice, check
all that apply, ranking, open-ended), but single response items that ask
the respondent to provide a “rating” or express an opinion tend to be the
most common survey item type. Thus, we’ll focus on this specific item
type, but many of these guidelines will apply across item types. Special
considerations for multiple response, open-ended, and demographic
questions are provided in later sections of this paper.
Examples of this type of item are provided below, with the question stem
followed by the response scale or options:
■■ The people I work with offer suggestions for product improvements.
(Strongly Disagree to Strongly Agree)
■■ How would you rate the service provided by your customer service
representative? (Very Poor to Very Good)
■■ Have you called the HR Help Desk in the last month? (Yes, No)
■■ How often do you have a meeting with your immediate manager?
(At Least Once a Day to Almost Never)

Multiple Response Items


In many cases, survey items will allow for multiple responses. These
question types are typically used when you wish to know which of a series
of things a respondent has done, likes, would choose, etc. These types of
survey items can take a variety of forms, including a “check all that apply”
type item, a ranking item, “hundred pennies” exercise, and so on. An
example of a multiple response item (check all that apply) is as follows:
Which of the following internal communications methods have you
used in the past month? (Please select all that apply.)
Company Intranet
E-Mail
Jabber/Company Instant Messenger
Company Mobile Phone
Company File Transfer Site
Other (specify) ____________________
As you can see, this item includes an “Other” option with a write-in
space to capture options that you have not considered when designing the
question. This is a good practice to follow and provides for a more
complete understanding of an issue; in addition, it may identify additional
response options to include in the future should the survey be repeated.
Often, what is of most interest is the relative weight or importance of
these options to respondents. Having respondents rank-order their
responses in terms of preference or frequency of use can provide this type
of information. When the list of response options is long, a recommended
strategy is a follow-up question in which the respondent rank-orders the
top or bottom three responses.

Guidelines for Writing Effective Survey Items
Writing effective survey items looks deceptively simple, but creating
a well-constructed survey takes skill and practice. Follow these
guidelines to improve the quality of the survey content that you develop.

Ensure That the Meaning of the Item Stem Is Clear


Review the items to ensure that multiple interpretations are not possible.
Complex items (e.g., those that are very long or that involve double
negatives) can cause confusion and result in varying interpretations by
respondents and by the key stakeholders who will be consuming the data.
Some respondents may even choose to skip an item if its meaning is
unclear to them. If an item is complex or extremely long, it may need to be
rewritten or separated into two or more items.

Confirm That a Response to the Item Has a Clear Meaning


Think through how you would interpret both positive and negative
responses to the item. Will the meaning of the chosen response be
unambiguous? For example, consider responding to the following item on a
Strongly Disagree to Strongly Agree rating scale: “The amount of work that
I am expected to do is acceptable.” What does an unfavorable response to
this item mean? It could mean that the respondent has insufficient work
to occupy their time or is overwhelmed by a workload that exceeds their
capacity. Yet these two reasons have very different implications.

Avoid Language That May Not Be Universally Understood


Technical terms, slang, or acronyms that are not part of participants’
everyday language generally should be avoided. However, the use
of terminology that is specific to an organization, when universally
understood, can give survey instructions or items more meaning to
respondents and better fit your organizational culture. One safeguard
is to add definitions of such terms to the survey instructions.

Avoid the Use of “Double-Barreled” Items


A double-barreled item addresses two different issues in a single item,
as in this example: “Leadership makes sound strategic decisions and
communicates the reasons behind them.” It is impossible to determine
whether negative responses to this item are due to opinions about
leadership’s decisions or about the communication of those decisions or
both. Yet, the actions you would take to address this item as an opportunity
area will vary significantly depending on what the root issue is.

Include Items That Are Specific and Actionable


To the extent that survey items are specific, it is easier to determine where
to focus improvement efforts. For example, in a customer service survey,
it is easier to take action in response to poor results on an item that asks
about wait time on phone calls to a call center than a general item about
the quality of customer service. It is certainly appropriate to have some
more general items included in a survey, but a survey full of broad items
will make it difficult to pinpoint the underlying issues that need to be
addressed. When such general items are included, they are best placed
at the end of the survey, after the specific issues have been covered.

Avoid Items That Are Stated as Absolutes
We generally recommend that you refrain from using absolutes such
as “always” or “never” in the item stem itself, although there are some
exceptions to this rule. Typically, it is better to use the item text to describe
the behavior or topic of interest and rely on the response scale to measure
degree or frequency. Responses to items that include absolutes can be
difficult to interpret. Consider, for example, the item, “My supervisor
always provides meaningful performance feedback.” An unfavorable
response to this item could mean that the supervisor does not provide
meaningful feedback or that he or she provides meaningful feedback some
or most of the time, just not all of the time. Absolutes also set a very high
(or, in some cases, a very low) bar for performance. For example, can you
reasonably expect anyone to always provide meaningful feedback?
However, these types of questions can be appropriate in circumstances
where the risks of not consistently doing something are significant, such
as a team reporting that it always stops work when a safety issue occurs
or that new regulatory expectations are always communicated to
employees.

Use Consistent Terminology


Avoid alternating between terms such as management and leadership or
customer and client if the terms really are being used interchangeably.
Consistent terminology helps ensure a common understanding among
respondents and makes it easier for them to detect truly meaningful
differences in word choice.

Avoid Leading Items with an Obvious “Right” Answer


Items should not be written in a way that suggests to respondents that you
want them to answer in a particular way. Similarly, in most situations, you
should avoid wording questions in such a way that respondents may feel
that providing a negative response reflects poorly on them. For example, if
you want to assess the effectiveness of new-hire training, it is preferable to
ask respondents to rate the relevance and quality of the training rather
than asking them whether they feel prepared for their job. The desire to
present oneself favorably may make it difficult for respondents to answer
objectively if the question is targeted at them rather than the training.

Be Careful Not to Change the Meaning of Trend or Benchmark Items


Sometimes, even minor changes to the wording of an item can make
comparisons over time or to external benchmark data inappropriate. This
is especially the case when the referent of an item has been changed (e.g.,
changing the focus from “senior management” to “management” or “my
manager”), when the standard conveyed by the item has changed in some
way (e.g., adding or removing an absolute such as “always” or “never” to
item text), or when the response scale has changed. Take care when editing
items for which trend or benchmark data is being used.

Employee Segmentation: Demographic Questions
Demographics are an important component of surveys, helping to
pinpoint employee segments where opportunity areas may be greatest
or, conversely, where best practices can be identified and leveraged.
Demographics provide background information about survey respondents
and usually relate to personal characteristics such as age or gender, or
to the respondent's role in the company, such as team, job level, or
length of service.
Often, to reduce survey length and provide more complete data, this
information is precoded for participants in advance of the survey using
data from an HRIS feed. When this information is not preloaded into
the survey, demographic questions are included in surveys to gather this
information and typically appear grouped together at the beginning or end
of the survey. The primary considerations when including demographic
questions should be suitability to the audience and preserving anonymity.

Be Judicious in Your Use of Background or Demographic Questions
It is important to balance the need for respondent background information
with instilling trust in the anonymity or confidentiality of the process.
When asked many background questions such as gender, years of service
with the company, function within the organization, or job level, some
respondents become suspicious that the combination of their responses
to questions will be used to determine their identity. Therefore, it is
important to include only those background questions that:
■■ You have reason to expect might be related to differences in opinions
on the topics being measured in the survey, and
■■ Have enough variability in responses to allow for meaningful
comparisons.
Regardless of the number of background questions, it is important that
respondents understand why these questions are included in the survey
and how these responses will be used. To minimize these concerns, it
often makes sense to include an “I would prefer not to answer” option for
these types of questions.

Response Options
With demographic or background questions in particular, it is also
important to choose response options that are at the appropriate level
of detail to allow for meaningful data analysis. The level of detail that
is appropriate will vary across organizations and, sometimes, within
segments of an organization. For example, a common background question
asked in employee surveys is length of service with the company. The
response options provided for a relatively young organization and a more
established one should differ.

In addition, how broadly response options are defined is an important
consideration; overly broad response options may prevent important
comparisons from being made. Using length of service with the company
again as an example, we know that employees with less than a year of
service tend to respond more favorably to employee surveys than do more
tenured employees. An important comparison would be lost if the lowest
response option is defined as “3 years or less” rather than “less than 1 year.”

Choosing the Right Response Scales/Options
Just as important as the wording of the survey item is the choice of
response options. The use of ambiguous or confusing response options can
jeopardize the quality of the data, and inconsistent use of response scales
within a survey can complicate comparisons of results across items.

Common Rating Scales


There are many different types of rating scales used on employee surveys.
The table below includes some of the typical scales used, with the first two
being the most common.

Type of Scale | Option 1 | Option 2 | Option 3 | Option 4 | Option 5
Agreement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree
Satisfaction | Very Dissatisfied | Dissatisfied | Neither Satisfied nor Dissatisfied | Satisfied | Very Satisfied
Extent | Not At All | Limited Extent | Moderate Extent | Considerable Extent | Very Great Extent
Frequency | Almost Never | Seldom | Sometimes | Usually | Almost Always
Quality | Very Poor | Poor | Fair | Good | Very Good
Probability | Certainly Not | Probably Not | Not Sure | Probably | Certainly

Source: CEB analysis.

Key, though, is that the response scale chosen should be aligned with the
item stem. Some items lend themselves to a frequency rating, others to
a quality rating, and others to a rating of agreement. For example, if the
item is designed to assess the quality of customer service, a rating scale
using evaluative terms such as “poor” and “excellent” is preferable to
an agreement scale (e.g., “strongly disagree” to “strongly agree”). When
multiple response scales are used throughout a survey, it makes it easier
and quicker for respondents if items using the same response scale are
grouped together.

A Special Note on Frequency Scales


Frequency scales can provide for a range of responses from general (e.g.,
almost never to almost always, as shown in the table above) to more specific
options (0–2, 3–5, and so on). A more specific scale allows you to better
understand how frequently a certain action is taking place. When you
design such a scale, ensure that:
■■ There are options for all situations that are possible for that
population;
■■ The options do not overlap; and
■■ The options provide sufficient granularity for effective analysis.
When in doubt over the level of granularity needed, keep in mind that you
can combine response options for reporting if they are too granular, but you
can’t break down a response option into a finer level of detail.
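
The first two of these guidelines can be checked mechanically. As an
illustration only, here is a minimal Python sketch (the band boundaries
are hypothetical examples, not options prescribed by this paper) that
confirms a set of numeric frequency bands neither overlaps nor leaves
gaps, and ends with an open-ended top band so every respondent has
an option:

```python
# Minimal sketch with hypothetical bands: verify that numeric frequency
# options cover all cases without overlapping, per the guidelines above.

bands = [(0, 2), (3, 5), (6, 10), (11, None)]  # None marks "11 or more"

def validate(bands):
    bands = sorted(bands, key=lambda b: b[0])
    expected_start = 0
    for low, high in bands:
        if low != expected_start:
            raise ValueError(f"Gap or overlap at {low}")
        if high is None:
            return  # open-ended top band reached; every count has an option
        expected_start = high + 1
    raise ValueError("No open-ended top band; high counts have no option")

validate(bands)  # silent if the scale passes both checks
```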

Guidelines for Response Options and Response Scales
Below are some important issues to consider when choosing or creating
a response scale.

Number of Response Options


Rating scales can use different numbers of scale points, but five to seven
tend to be the most common across surveys of different types. Common
practice within the employee survey space is to use a five-point scale,
consistent with the scales provided in the table above. To avoid confusion
and enhance meaningful comparisons, consistency within and across
surveys needs to be a consideration, as discussed below.

Use of a Midpoint in the Scale


There are differing points of view on this issue. In some cases, a midpoint
is excluded out of a desire to force respondents to “pick a side.” However,
in many cases this may be forcing a significant number of people to
respond in a way that does not accurately represent their opinion; on most
important issues, a significant percentage of people may be truly neutral
or ambivalent in their opinion. Thus, we generally recommend that a
midpoint be included in scales.

Consistency with Scales Used for Trend or Benchmark Items
Using a trend or benchmark item but modifying its response scale could
make comparisons tenuous. While this is a judgment call, we recommend
that you use the response scale associated with the trend or benchmark
item. Even just reversing the order of a scale can impact the way in which a
scale is used.

Influence of Other Established Survey Response Scales


The response scale used for a survey generally should be influenced by
what is common practice in the organization, striving for consistency
across initiatives. For example, if your organization typically uses the low
end of a scale to indicate more favorable results, your response scales
should follow suit to avoid confusion across surveys. This will be important
not only for participants when they respond to the survey items but later
for individuals as they interpret results.

Use of Non-Response Option


For some items, it is beneficial to include a response such as “Not
Applicable” or “Can’t Evaluate” (i.e., a “non-response” option). For example,
questions that ask an employee to rate the quality of service provided by
the benefits hotline should allow for a non-response option for employees
who have not called the benefits hotline. Absent a non-response option,
some people may skip the item, but there is a risk that others may select
the neutral option instead.

Make Sure There Is an Option for Everyone


It’s important to ensure that for each question in your survey, there is
a response option that applies to every situation. For example, if you’re
asking respondents what function they work in or which medical plan
they use, be sure to account for all possibilities. For the function question,
what should the CEO and his/her administrative assistant choose? For the
medical plan question, what does someone select if they use the medical
plan offered by their spouse’s company? These situations are avoided by
carefully thinking through all possible scenarios, and in some cases by
adding an “other” option where respondents can write or type in their
responses if none of the options provided applies.

Uniqueness of Response Options


The response options provided for an item should not overlap with each
other. Overlapping response options (e.g., 1–2 years, 2–4 years) result in
ambiguity over which response option a respondent should select.

Branching Items or Skip Patterns
Surveys can be designed so that not all questions are asked of all
respondents. This process is usually called branching or using skip
patterns. These are used so that employees only answer questions that are
relevant to them.
The questions that respondents see vary depending on the answers they
give. For example, only those who say “Yes” when asked whether they have
called the HR Help Desk are asked to rate the service the Help Desk
provides, while those who answered “No” move on to the next question
outside the Help Desk topic.
The other typical use of survey branching is to provide a set of questions
that is unique to employees from different divisions or teams. In this
situation, there is usually a core set of questions that all invited
employees will see; then, depending on their division (either asked as
a question in the survey or precoded from the people file used for the
survey), they see a set of questions targeted at their division or team.
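
Survey platforms implement branching in different ways; as a minimal,
hypothetical sketch (the question IDs are illustrative and not any
particular vendor's API), the HR Help Desk example above amounts to a
rule mapping an answer to the next question shown:

```python
# Hypothetical sketch of the HR Help Desk skip pattern described above.

def next_question(current_id: str, answer: str) -> str | None:
    """Return the ID of the next question to display, or None when done."""
    if current_id == "q_called_help_desk":
        # Only "Yes" respondents rate the Help Desk; "No" respondents skip ahead.
        return "q_rate_service" if answer == "Yes" else "q_next_topic"
    if current_id == "q_rate_service":
        return "q_next_topic"
    return None

assert next_question("q_called_help_desk", "No") == "q_next_topic"
assert next_question("q_called_help_desk", "Yes") == "q_rate_service"
```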

Open-Ended Comment Questions


Many surveys include “open-ended” or “comment” questions. These
questions have the advantage of allowing respondents to:
■■ Provide greater detail in their feedback on a topic covered by the
survey,
■■ Address an issue that was not included in the survey, and
■■ Offer concrete suggestions to address areas of opportunity to aid action
planning efforts and, ultimately, lead to organizational improvement.
An open-ended question is useful when all of the possible responses to the
question are not known or when very detailed feedback is desired. For tips
and best practices on writing good comment questions and more, see our
white paper.

Survey Length
About Long Surveys
Why should you care about survey length? Because survey respondents
care. Respondents may be less likely to complete a long survey or may rush
through it without carefully considering their responses, both of which
can negatively impact the quality of the data collected. In addition, an
often-overlooked concern is that long surveys can be costly in terms of lost
productivity. But one size doesn’t fit all for surveys. A well-designed survey
that is rolled out with a strong communication plan and with content
that is seen as relevant can get high response rates and high-quality data
regardless of its length.

How to Determine the Appropriate Length


It is important to balance the length of the survey with the value of the
data that will be obtained from it. The appropriate length for a survey
depends on several factors. Keeping the survey relatively short is most
critical when one or more of the following are true:
■■ The target audience does not have a vested interest in the results
and how the data are used. Employees may be willing to complete a
relatively long survey if they believe that the results will have a direct
impact on them and their work environment; they may be less willing
to respond to a long survey that has an indirect impact on them,
such as a survey about the effectiveness of the company’s advertising
campaign.
■■ Participants do not have any formal responsibility to the sponsoring
party (e.g., most employees are more likely to complete a survey
sponsored by their division than a survey distributed by another
division in the company).
■■ The organization has numerous ongoing survey efforts, causing people
to be over-surveyed.
■■ Those invited are extremely busy at the time the survey is deployed
(e.g., the holiday season at a retail company). During busy operational
times, even a short survey may be inappropriate.
■■ A very high response rate is critical, such as when the credibility of
results will be called into question if not based on responses from all
(or nearly all) who were invited to participate.

Measuring the Length of a Survey


Avoid the temptation to simply make a ballpark estimate of how long it
takes to complete a survey. A better approach is to ask several colleagues
to take the survey and track how long it takes them to complete it. Once
you have a realistic estimate of how long the survey will take, you should
provide this information to respondents before they begin a survey so that
they can make an informed decision about whether and when they will
complete it.
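
For instance (a minimal sketch with made-up timings, not part of the
paper's guidance), the pilot times can be summarized with a median,
which resists distortion by one unusually slow tester:

```python
# Illustrative only: estimate survey length from pilot completion times.
import statistics

pilot_minutes = [9, 11, 12, 14, 18]  # made-up times from five colleagues

estimate = statistics.median(pilot_minutes)
print(f"This survey takes approximately {estimate:.0f} minutes to complete.")
```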
Conclusion
Surveys are a key component of evaluating opinion, but they are only as
good as the questions that they contain. Surveys should be designed with:
■■ A clear goal in mind of what information is to be sought,
■■ Questions that are relevant to the audience,
■■ Questions that are clear and easy to understand,
■■ Response scales or options that allow respondents to appropriately
express their opinions, and
■■ An appropriate overall length.
A well-designed survey provides illuminating, valuable data that helps
decision making while at the same time allowing respondents to feel that
they have contributed to the process.

About CEB
CEB is a best practice insight and technology company. In partnership
with leading organizations around the globe, we develop innovative
solutions to drive corporate performance. CEB equips leaders at more
than 10,000 companies with the intelligence to effectively manage talent,
customers, and operations. CEB is a trusted partner to nearly 88% of the
Fortune 500 and FTSE 100, and more than 70% of the Dow Jones Asian
Titans.

Contact Us to Learn More


Phone: +1-866-754-1854
E-Mail: survey-solutions@cebglobal.com
Web: cebglobal.com/workforcesurveys

© 2016 CEB. All rights reserved.
Detail about CEB Inc. and its subsidiaries can be found at cebglobal.com/offices.
