Administering preliminary market research using surveys can save businesses both time and money. However, accurate feedback and insights are contingent on the surveyor asking unbiased questions that get honest answers.
While it may seem impossible to completely avoid biased survey questions, there are ways in which you can order your questions to avoid answer bias. You can also strategically plan to use certain question types to help ensure that you’re not adding your own bias to the survey design. When you're conducting any type of research – whether it's gathering data for a political poll or feedback on a new product idea – honest feedback will provide the most accurate data.
Get pre-written survey templates, created by experts, to help ensure that you’re asking the right questions in the best way.
You may already have an opinion about the topic you’re researching, but it’s still possible to collect unbiased responses from the people taking your survey. This article explains how bias can affect the outcome of a survey, walks through the different types of survey bias and what they look like, and shows you how to reduce bias in your own surveys.
Survey bias is a deviation in feedback caused by influences from the surveyor, the respondent or both. Sampling bias also plays a part in how unbiased your feedback and insights can be.
Sampling bias occurs when certain people are systematically more likely to be chosen for a sample than others. A common source of it is purposive sampling, in which respondents are deliberately selected rather than chosen at random. Purposive sampling has its advantages in certain situations, particularly for smaller groups. Yet when it comes to sampling a larger population, it’s critical to reduce the amount of bias in your surveys if you want to gain the most accurate insights.
Tip: Learn about four leading types of bias and how to stop them having an impact on your next survey’s results.
Response bias is skewed feedback from respondents whose answers deviate from how they actually feel, and it can stem from many factors. Respondents who speed through a survey to finish it quickly may give biased answers; for example, a user may complete only the multiple-choice answers and skip the text responses. Other response biases arise when respondents decline to disclose demographic information, don’t understand a question or simply feel uncomfortable answering it. Hasty purposive sampling can also mean the survey isn’t relevant to the respondents at all, and the structure of the survey itself can encourage a particular answer. Overall, there are essentially seven types of response bias.
Tip: Eliminate question order bias to improve your survey data.
Non-response bias, also termed systemic bias, occurs when people included in a survey don’t respond. It represents a gap in your feedback and insights that will result in inaccurate data. Non-response bias also covers respondents who start a survey but then drop out for any reason. If a high percentage of the people you invite aren’t responding to your survey at all, you may need to redesign it to encourage them to take part.
There are many reasons why a respondent refuses to participate in a survey. It could be personal, or it might have something to do with how the survey is built. Non-response bias can also be the result of timing. Give respondents enough time to complete the survey relative to the feedback they're providing. Don't send a transactional survey one or two weeks after the transaction, because your respondents won't remember the interaction. You can even send reminders, but be mindful of the cadence.
Tip: Response bias and non-response bias are the two main forms of biased feedback. Discover five ways to avoid non-response errors.
Survey bias can negatively affect research results because it leaves you with limited data and inaccurate analyses. Such inaccuracies are the result of response and non-response biases. More specifically, errors in survey results lead to data issues, poor strategies and investments, low ROI, dissatisfaction and inconclusive results. Understand how a little bias can cause big issues.
Survey bias can also affect different types of interview surveys. Which survey is most likely to be influenced by bias? Group interviews, one-to-one interviews, panel interviews and phone interviews can all suffer from interviewer bias, and even online surveys reflect the biases of whoever writes the questions. It’s difficult not to hold an inherent position on a subject you’re researching, especially if the outcome benefits your business. However, it’s still possible to take an unbiased approach and obtain the most accurate survey results.
Acquiring the most accurate survey results means understanding different types of biased survey questions. The six survey bias examples we’ll examine here are leading questions, loaded questions, double-barrelled questions, absolute questions, ambiguous questions and multiple answer questions. With each biased survey question, you’ll see how it can be written in an unbiased way.
Leading questions involve a surveyor inserting their opinion into the question. This bias influences respondents to answer the question in a way that the enquiry suggests is correct. Consequently, this response results in skewed data that won’t help your overall business objective.
Example:
A good survey question about Company A’s customer service might look like this:
How helpful are the employees at Company A?
A leading question might look like this:
Do you think the customer service at Company A is better than your experience with employees at Company B?
This question suggests that Company A is better than Company B, because the phrasing is too specific. If Company A’s objective is to compare its customer service with a particular company, the question is satisfactory. Having a clear business objective is so important when building a survey. Learn what you need to know about creating good questions for your next survey.
Loaded questions push respondents to answer in a certain way. This type of query assumes too much about the respondent. Even with a buyer persona profile, it’s still best to maintain an objective approach in your survey questions.
Example:
If Company A is a supermarket that also sells pet food, its survey question should be:
Do you currently have a pet where you live, or not?
This question separates respondents who have pets from those who don’t. In a case like this, you can use question logic to send respondents who answer yes and those who answer no down different paths (see the sketch below).
If Company A leads its survey with a question like “What brand of dry food does your dog like?”, the surveyor assumes two things: the consumer has a pet and it’s a dog. This assumption will lead respondents who don’t have a dog to believe that the survey is for dog owners instead of supermarket shoppers. Consequently, they’ll exit the survey, leaving you with inconclusive results.
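To make the question logic mentioned above concrete, here is a minimal sketch in Python. The ask() helper and the follow-up questions are hypothetical; in practice, a survey platform such as SurveyMonkey applies this branching for you through built-in skip logic rather than code.

def ask(question, options):
    # Hypothetical helper: a real survey tool would render the question
    # and record the respondent's choice.
    print(question, options)
    return input("> ")

# The screening question keeps the wording neutral for everyone.
has_pet = ask("Do you currently have a pet where you live, or not?", ["Yes", "No"])

if has_pet == "Yes":
    # Only pet owners see the follow-up, so it is no longer a loaded question.
    ask("What type of pet do you have?", ["Dog", "Cat", "Other"])
else:
    # Respondents without pets skip straight to general supermarket questions.
    ask("How often do you shop with us?", ["Weekly", "Monthly", "Less often"])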
Tip: Write smarter survey questions and refrain from asking leading and loaded questions.
Double-barrelled questions are two survey questions asked as one. The question asks the respondent to give an opinion on two topics but offers only one opportunity to respond. Here is an example of how to avoid double-barrelled questions:
Let’s suppose that a doctor's surgery is interested in monitoring its customer service. In that case, it may want to assess patients' opinions about how they were treated, from the time they checked in through to any follow-up visits, if applicable. A good survey question for this scenario might look like this:
Overall, how responsive has our surgery been to your questions or concerns?
The double-barrelled survey question that you need to avoid looks something like this:
How responsive was our team during your visit to us and did someone follow up with you after the appointment?
Wording questions in this way is likely to produce a single answer that covers only part of what you were hoping to learn from the double-barrelled query.
Absolute questions force respondents to give absolute answers, with no leeway. Such questions typically demand a yes or no answer and include words such as “always”, “never”, “every” or “all”. Responses like this lump respondents together on the basis of assumptions, producing categories that overlook important variables.
Example:
Did Product X’s Outdoor Insect Repellent eliminate every mosquito?
It’s unlikely that an outdoor repellent will get rid of every single mosquito, so the respondent will most likely answer no. However, the product might still reduce the number of mosquitoes within the perimeter of its use. In this scenario, we're missing critical information to help assess the product's effectiveness. A better question could be worded like this:
How satisfied are you with the reliability of Product X?
- Very satisfied
- Satisfied
- Neither satisfied nor dissatisfied
- Dissatisfied
- Very dissatisfied
Providing a selection of answers like this allows respondents to rate the product's effectiveness. This type of rating will help the researcher analyse how well Product X performs.
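If you draft or review questions in bulk, a quick automated check can flag absolute wording before a survey goes out. The following Python sketch is purely illustrative and isn't a feature of any particular survey platform:

ABSOLUTE_WORDS = {"always", "never", "every", "all", "none"}

def flag_absolute_words(question):
    # Return any absolute words found in the question text.
    words = question.lower().replace("?", "").split()
    return sorted(ABSOLUTE_WORDS.intersection(words))

print(flag_absolute_words("Did Product X's Outdoor Insect Repellent eliminate every mosquito?"))
# ['every'] - consider rewording the question as a satisfaction scale instead
print(flag_absolute_words("How satisfied are you with the reliability of Product X?"))
# [] - no absolute wording detected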
Ambiguous survey questions leave room for interpretation because the wording isn’t clear. The query may be too broad, or it may rely on abbreviations, acronyms or terminology that is specific to the business or industry. Ambiguous questions let respondents interpret the query in whatever way makes sense to them, resulting in muddled responses.
Example:
Suppose a dental practice's business objective is to get referrals. One of the questions it might want to ask patients could look like this:
How likely are you to encourage others to visit our office?
An ambiguous question might look like this:
Do you think your friends and colleagues would like us?
Also, avoid asking broad questions that force respondents to rephrase the query so it makes sense to them. Their interpretation might differ from the true intent of your question.
Multiple answer survey questions provide a more controlled approach to collecting feedback and insights. The challenge, however, lies in phrasing the answer options so they don’t produce inconclusive responses. A good practice is to create answer choices that don’t overlap.
Example:
If you’re trying to get an assessment of a sample population’s annual income, phrase the survey question like this:
How much money did you personally earn last year?
- £0 to £24,999
- £25,000 to £49,999
- £50,000 to £74,999
- £75,000 or more
Don’t phrase it like this, because the overlapping ranges are unclear to the respondent:

How much money did you personally earn last year?
- £0 to £25,000
- £25,000 to £50,000
- £50,000 to £75,000
- £75,000 or more
Providing overlapping answers leaves respondents stuck between multiple choices. A respondent earning £50,000 a year falls into both the second and third categories above, whereas in the first example that respondent clearly belongs in the third category. Details like this matter when analysing feedback and insights. Get more tips about writing good survey questions and use survey templates with pre-written questions.
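If you assemble answer ranges yourself, a small check can catch overlapping bands before respondents ever see them. Here is a minimal Python sketch using income bands like those in the examples above; the function and the specific values are illustrative only.

def find_overlaps(bands):
    # bands is a list of (low, high) tuples sorted by lower bound.
    # Adjacent bands overlap when one starts at or before the previous one ends.
    overlaps = []
    for (low1, high1), (low2, high2) in zip(bands, bands[1:]):
        if low2 <= high1:
            overlaps.append(((low1, high1), (low2, high2)))
    return overlaps

# Overlapping bands: £50,000 falls into both the second and third choices.
print(find_overlaps([(0, 25_000), (25_000, 50_000), (50_000, 75_000)]))

# Non-overlapping bands: every income belongs to exactly one choice.
print(find_overlaps([(0, 24_999), (25_000, 49_999), (50_000, 74_999)]))  # []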
When surveys are done correctly, they can yield valuable feedback and insights that will help you make better-informed business decisions. To get the most honest and unbiased feedback from your surveys, keep your personal opinions out of the questions and avoid framing questions in ways that influence respondents to answer in a certain way. Be clear and concise, keep your language simple to avoid misinterpretation, and provide straightforward answer options for better assessment results.
SurveyMonkey offers customisable surveys for any industry to help you achieve any business objective. Select a plan that works for you today.