Surveys help you discover what customers think about your products and make it easier to understand the needs of various demographics. But survey research is complicated, and biased survey questions can easily lead to flawed data, poor decisions, and failed strategies.
In this article, we’ll help you avoid some of the most common survey weaknesses to ensure you ask effective survey questions that lead to better decisions for your business.
Customers can’t accurately respond to questions if they don’t know the answer. Asking questions such as “Will you use our service again in the future?” causes response bias because customers can’t predict the future.
This type of question tends to limit answer choices to yes, no, and maybe, which might lead to an artificially high number of maybes since it’s the safest choice. Instead, use a Likert scale question that asks “How likely are you to use our service again?” with answers ranging from “very likely” to “very unlikely.”
Prevent impossible questions by providing a full range of plausible responses (e.g., include “unsure” if applicable) and making sure none of the answer choices overlap (e.g., “1 to 5” and “5 to 10”) or can be confused with one another (e.g., “mobile device” and “cell phone”).
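The overlap rule is easy to check mechanically. Here is a minimal sketch, assuming numeric answer ranges are represented as (low, high) tuples sorted by their lower bound; the function name and data format are illustrative, not from any survey tool:

```python
def ranges_overlap(ranges):
    """Return True if any adjacent numeric answer ranges overlap.

    Shared endpoints (e.g., "1 to 5" and "5 to 10") count as overlap,
    because a respondent whose answer is exactly 5 fits both choices.
    """
    for (_, prev_high), (next_low, _) in zip(ranges, ranges[1:]):
        if next_low <= prev_high:
            return True
    return False

print(ranges_overlap([(1, 5), (5, 10)]))   # True: 5 appears in both choices
print(ranges_overlap([(1, 5), (6, 10)]))   # False: choices are distinct
```

A check like this is worth running whenever answer choices are generated programmatically rather than written by hand.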
Social desirability bias happens when people answer questions in ways they think will be viewed as socially acceptable by others.
Any topics that touch on socioeconomics, personal habits (e.g., alcohol use), or other controversial issues are prone to social desirability bias and might draw a politically correct response rather than an honest one. This is why behaviors perceived as good tend to be overreported relative to behaviors perceived as bad.
Reduce social desirability bias by:
Strongly emphasizing the confidentiality of survey responses
Making it clear that there are no right or wrong answers
Using a third-party platform to conduct your survey
Keeping the survey’s purpose vague
Leading questions are those that suggest a desired response. Rather than gaining unbiased information, these questions typically seek to confirm the survey creator’s assumptions and often alienate the reader.
A survey question like, “Our restaurant was rated number one by Food Eater Monthly. How much did you enjoy your meal?” uses persuasive framing and assumes the customer had a pleasant experience. Instead, simply ask, “How would you describe the quality of your meal?”
Similarly, leading answer options leave the survey taker feeling manipulated: “Would you volunteer to help sick animals?” with the options “Yes, in a heartbeat!” and “No, I’m too busy to help sick animals.”
Avoid leading questions and answer options by keeping opinions and personal preferences out of your survey. Take care not to influence respondents with unnecessary framing.
Loaded questions put the respondent in an awkward position by creating a false or biased premise. If you ask, “What do you like most about our service?” you’re assuming the customer likes your service, when it may simply be the most convenient option or the customer hasn’t taken the initiative to make a change.
To avoid loaded questions, think about whether each question applies to every respondent. You might first ask a question such as “Did you enjoy using our service?” and then branch the survey for those who did and did not enjoy using the service.
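This branching idea can be sketched as a simple routing function. The screening question and follow-up wording below are hypothetical examples, not taken from any survey platform:

```python
def next_question(enjoyed_service: bool) -> str:
    """Route the respondent to a follow-up based on a screening question
    ("Did you enjoy using our service?"), so neither branch assumes an
    experience the respondent didn't have."""
    if enjoyed_service:
        return "What did you like most about our service?"
    return "What could we improve about our service?"

print(next_question(True))
print(next_question(False))
```

Most survey platforms implement this pattern natively as "skip logic" or "display logic," so you rarely need to code it yourself; the point is that each branch only asks questions whose premise the screening answer has already established.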
While loaded questions are often unintentional, they might come across as manipulative and cause people to quit your survey before finishing. If you have an abnormally high drop-off rate, check carefully for a loaded question (or leading question for that matter) before running it again.
Remember that the order in which questions are asked is often just as important to preventing survey bias as the questions themselves. The carry-over effect occurs when an earlier question colors the respondent’s thinking about the questions that follow.
For example, if you poll a customer about their satisfaction with your returns process, you force them to think about a potentially negative experience. Following this question with a general question about satisfaction might lead the respondent to focus on that negative interaction with your company and provide a lower score than they otherwise would. In this case, simply separating the question about a specific scenario from one about their general impression might be enough to prevent carry-over bias.
Carefully consider the order in which questions are asked to ensure they flow logically and that they don’t unintentionally influence one another.
Multiple-choice questions that use long lists tend to cause order bias—either primacy bias (the tendency to select one of the first choices) or recency bias (the tendency to select the last response option). If a multiple-choice question has eight response options, survey takers are generally less likely to select choices in the middle of the pack.
To address this, most survey platforms allow answer choice randomization. With this strategy, each survey respondent is presented the same choices, but in a different order. However, if a final answer choice is “other”, it should not be randomized with the other choices. In this scenario, be sure to include an open response field so the respondent can clarify the answer in their own words.
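If your platform doesn’t handle this for you, the strategy is straightforward to sketch. This is a hypothetical helper, not a real platform API; it assumes choices are plain strings and the pinned option is identified by name:

```python
import random

def randomized_choices(choices, pin_last="Other"):
    """Shuffle answer choices for each respondent, keeping the
    `pin_last` option (if present) fixed at the end of the list."""
    movable = [c for c in choices if c != pin_last]
    random.shuffle(movable)          # a fresh order per respondent
    if pin_last in choices:
        movable.append(pin_last)     # "Other" stays last, never shuffled
    return movable

options = ["Email", "Phone", "Live chat", "Social media", "Other"]
print(randomized_choices(options))   # "Other" always appears last
```

Each call produces a new order for the first four options, which spreads primacy and recency effects evenly across them instead of letting position favor any one choice.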
Reduce order bias by limiting the number of options for multiple-choice questions. If you find it difficult to keep the number of choices down, consider consolidating similar options or splitting the question into two.
Sample size is the number of participants your survey requires to statistically represent the population. Your population is the demographic of people you want to learn about (e.g., male vegetarians in California from 18 to 35 years old).
Too few respondents for a population magnifies any response bias and leads to shallow analysis of your survey results. While you may not always be able to determine the exact size of your population, a rough estimate is usually sufficient.
Generally, a sample size of 30 is the bare minimum from which meaning can be derived for any population, and larger populations require larger sample sizes (up to a certain point). Online sample size calculators can do the math for you, accounting for factors such as margin of error and confidence level.
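One common way such calculators work is Cochran’s formula with a finite population correction. Here is a minimal sketch, assuming a 95% confidence level (z = 1.96) and the most conservative estimate of response variability (p = 0.5):

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Cochran's formula with a finite population correction.

    z = 1.96 corresponds to a 95% confidence level; p = 0.5 is the
    most conservative assumption about how varied responses will be.
    """
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population size
    n = n0 / (1 + (n0 - 1) / population)                  # correct for finite population
    return math.ceil(n)

print(sample_size(10_000))   # 370 respondents at ±5% and 95% confidence
```

Note how the required sample grows slowly with population size: a population of 10,000 needs about 370 respondents, while even one of a million needs only around 385, which is the "up to a certain point" mentioned above.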
Check out our new Survey Software Category Leaders Report to find the tool that’s best for your business.