Research 101

Common biases in surveys and how to avoid them

The way you word and order questions in a survey can influence a respondent's opinion or behaviour.
Antarika Sen
April 5, 2022

Survey design often feels like detective work. You want to get at the truth without tricking respondents into an answer or nudging them towards the responses you wish to hear.

It is important to structure your survey and wordsmith your questions carefully. Failing to do so runs the risk of biased responses.

Survey bias occurs when a poorly crafted survey influences respondents’ answers, causing them to provide inaccurate or inconsistent information.

There’s no magic formula for designing the perfect survey - it’s a mix of best practice, experience, and intuition. To help you avoid the pitfalls of biased questions, we highlight some of the most common biases that can creep in while crafting a survey. For each, we provide examples of how a poorly designed question can bias responses, and some tips on how you can fix them!

1. Question order bias

The Problem

What you ask a respondent before a particular question often influences how they respond to it.

Let’s take a simple example. Consider asking your respondents the following two questions in this order:

Q1. What is your favourite TV show?

Q2. What is your favourite hobby?

There is a good chance that they might be tempted to answer “watching TV” for Q2 even if it wasn’t a top-of-mind answer. This phenomenon is called priming, and it underscores how important it is to take into account the context that directly precedes a question.

Question order bias may creep into a survey in other non-obvious ways too. 

Consider this famous example from the 1950s, when Americans were polled on these questions:

Q1. Should American reporters be allowed to enter Russia and report back what they see?

Q2. Should reporters from communist countries be allowed to enter the US and report back what they see?

When the questions were shown in the above order, support for communist reporters in Q2 was 37 percentage points higher than when the order was reversed. Because the respondents were American, they were more likely to answer Q1 favourably, and the common human tendency to stay consistent and fair in one’s answers explains the higher support for the subsequent question (read more: assimilation-contrast theory).

How to minimise the bias?

  • Ask generic, broad-based questions before asking questions on specifics. In the first example, “What is your favourite hobby?” should come before you get respondents to think about TV shows with “What is your favourite TV show?”
  • Place questions that require top-of-mind recall (open-ended questions) before a close-ended question on a related topic, so that respondents aren’t primed with responses and you capture their true opinions. For instance, an open-ended question such as “Which brand comes to mind when you think of sustainability?” should come before a close-ended question such as “Which of the following brands do you purchase?”; otherwise respondents might be conditioned to name a brand from the list that was shown to them.
  • When you can’t control or predict how the order may bias responses (as in the case of US vs communist reporters above), randomise the order of the questions across respondents where possible (see the sketch after this list).
  • In the case of cross-sectional surveys (e.g., brand tracking surveys), where you ask the same set of questions across multiple time points to measure change over time, ensure that your question order is consistent across the different time points so that you don’t introduce order effects, which can be mistaken for changes in trends.
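To make the randomisation point concrete, here is a minimal sketch in Python of how question order could be varied per respondent. The question wording reuses the reporter example above, but the function name and the idea of seeding with a respondent ID are illustrative assumptions rather than a prescription from any particular survey platform.

```python
import random

# Illustrative question bank, reusing the reporter example from above.
questions = [
    "Should American reporters be allowed to enter Russia and report back what they see?",
    "Should reporters from communist countries be allowed to enter the US and report back what they see?",
]

def ordered_questions_for(respondent_id: int) -> list:
    """Return the questions in a random order that is stable for a given respondent."""
    rng = random.Random(respondent_id)  # seeding with the respondent ID keeps re-renders consistent
    shuffled = questions[:]             # copy so the master list is left untouched
    rng.shuffle(shuffled)
    return shuffled

# Across many respondents, roughly half will see each ordering.
print(ordered_questions_for(101))
print(ordered_questions_for(102))
```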

2. Acquiescence bias

The Problem

Acquiescence bias, also known as "yea-saying", refers to the tendency to agree with statements or say yes to questions, irrespective of their content or what one actually believes. Questions that ask respondents to confirm an opinion are especially susceptible to this bias, e.g., yes/no, agree/disagree, true/false.

There are a few reasons for this.

Respondents may believe that saying “yes” to a question will prevent them from being screened out, qualify them for more questions, and hence earn them more points or rewards. This is particularly true for questions that appear at the start of a survey.

For some, coming across as agreeable is an inherent trait, making them less likely to challenge a statement. 

In some instances, it could be a sign of satisficing, i.e., putting minimal effort into one’s answers. This is especially the case in lengthy, tedious surveys where respondent fatigue kicks in. Agreeing with something is cognitively less effortful than disagreeing with or challenging a presented notion.

How to minimise the bias?

  • Avoid questions that provide binary yes/no options to confirm or support a stance. A better alternative would be to offer respondents opposing statements or a range of options to choose from.
  • Another alternative to a yes/no question is to field it as a multi-select question instead. This masks your intention of getting respondents’ opinion on a particular option (in this instance, Facebook), so they are more likely to give an honest opinion rather than simply agree with the question (a sketch after this list illustrates the reframing).

  • Agreement scales, even when they are not binary, are particularly susceptible to acquiescence bias. A great way to minimise the bias in these instances is to rephrase the scale so that it closely mirrors the nature of the opinion being asked. For instance, rather than asking how much respondents agree that an issue is serious, ask them how serious they think the issue is.
Reconfigure agreement scales into scales that are customised to measure a particular behaviour or opinion.
  • Similarly, agreement scales on questions related to frequency (how often a respondent does an activity) can be replaced with frequency scales to reduce bias.
Reconfigure agreement scales to frequency scales when you want to measure a behavioural frequency.
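To illustrate both reframings, here is a minimal sketch of how such questions could be represented before and after the fix. The exact wording and option labels are illustrative assumptions (only the Facebook example is drawn from the text above), not the output of any specific survey tool.

```python
# Biased framing: a yes/no question about a single option invites "yea-saying".
biased_yes_no = {
    "question": "Do you use Facebook?",  # hypothetical wording
    "options": ["Yes", "No"],
}

# Better: a multi-select question masks which option you actually care about.
multi_select = {
    "question": "Which of the following social media platforms do you use? Select all that apply.",
    "options": ["Facebook", "Instagram", "TikTok", "Twitter", "None of the above"],
    "multi_select": True,
}

# Biased framing: an agreement scale used to measure a behavioural frequency.
biased_agreement = {
    "question": "I exercise regularly.",  # hypothetical wording
    "options": ["Strongly agree", "Agree", "Neither agree nor disagree", "Disagree", "Strongly disagree"],
}

# Better: a frequency scale that mirrors the behaviour being measured.
frequency_scale = {
    "question": "How often do you exercise?",
    "options": ["Daily", "A few times a week", "A few times a month", "Rarely", "Never"],
}
```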

3. Leading questions

The Problem

As the name suggests, leading questions sway respondents towards answering questions in a certain way due to the manner in which they are framed. This can harm your data because respondents may lose objectivity and be unknowingly biased towards picking an answer. A good question is neutrally phrased and does not sway respondents to select one response over another.

How to minimise the bias?

  • Be neutral in the way you phrase a question. If you have to ask an agreement-scale question, avoid asking only how much a respondent "agrees" with a statement. Take a neutral stance and indicate that they are free to agree or disagree with it.

  • Always avoid presenting an opinion before a question that can sway respondents to answer a certain way. For example, prefacing a question about vaccines with a statement on their benefits conditions the respondent to think about vaccines favourably. If you have to provide some context before the question is asked, ensure that it is neutral.

  • One way to spot leading questions is to keep a lookout for words or phrases that are emotionally charged, whether negative or positive. Additionally, never assume you know what the respondent is feeling without having enough information. Always start from a neutral standpoint and work your way towards the respondent's opinion.

4. Double-barreled questions

The Problem

A double-barreled question is one that clubs two separate issues or topics together but allows for only one answer. This introduces bias because respondents can provide only one opinion when, in fact, they may have a different stance on each issue.

How to minimise the bias?

  • Keep things simple and focused while writing your question, and always ensure you are asking about only one topic. For example, a question that asks whether a new feature is interesting and useful allows for only one opinion, yet a respondent may find the feature interesting but not useful. The solution is to ask two separate questions, one for each aspect.

5. Double negatives

The Problem

An easy way to confuse and tire your respondents is to ask double-negative questions, i.e., questions that use two negative words. This can not only frustrate respondents but also cause them to misinterpret the question and answer in the opposite way to what they intended.

How to minimise the bias?

The solution is to phrase the question in a simple and neutral manner. Avoid combining negative words such as “do not”, “no”, and “not” in one question. Also be wary of negatively worded questions paired with response-scale words carrying prefixes like “dis-”, “non-”, “un-” or “mis-”.

6. Social desirability/conformity bias

The Problem

Social desirability bias refers to the tendency of respondents to answer in ways they feel are more appropriate or socially acceptable to others, even at the cost of honesty.

This can especially be an issue when the content of the question is sensitive or political in nature. Respondents may avoid giving answers that portray them in what they perceive to be a negative light. This leads to answers being inflated to reflect “good behaviour” or deflated to hide “poor behaviour”.

How to minimise the bias?

  • If the topic of the survey is sensitive in nature (e.g., personal experiences with sexual harassment), it is best to inform respondents about it at the outset and provide a way for them to opt out if they are not comfortable giving honest answers. If only certain questions are sensitive, consider providing an option along the lines of “I prefer not to answer”.
  • In some cases you may want to keep the topic of the survey vague so that respondents don’t know what the ideal answer you are looking for is. For instance, if you are an organisation working in environmental conservation and want to find out people’s attitudes and behaviour around it, it’s best not to declare your intentions at the start. Doing so may cause respondents to skew their answers to align with your interests.
  • It is also important to anonymise responses and assure respondents that their data will be kept confidential, so that they feel safe providing honest answers without fearing consequences (a minimal sketch of one approach follows below).
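As one possible way to put the last point into practice, here is a minimal sketch of pseudonymising respondent identifiers before analysis. The salt, field names, and sample data are assumptions for illustration only; a real deployment would follow your organisation’s data-protection policy.

```python
import hashlib

SALT = "replace-with-a-secret-salt"  # assumption: stored securely, never published with the data

def pseudonymise(respondent_email: str) -> str:
    """Replace a direct identifier with a salted one-way hash.

    The hash still lets you deduplicate or link a respondent's answers across
    waves without storing the raw identifier alongside their responses.
    """
    digest = hashlib.sha256((SALT + respondent_email.lower()).encode("utf-8"))
    return digest.hexdigest()[:16]

# Hypothetical raw responses containing a direct identifier.
responses = [
    {"email": "jane@example.com", "answer": "I prefer not to answer"},
]

# Store only the pseudonym alongside the answer.
anonymised = [{"respondent": pseudonymise(r["email"]), "answer": r["answer"]} for r in responses]
print(anonymised)
```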

7. Complex jargon

A good question should be interpreted the same way by all respondents. Always use simple, clear, and precise wording that everyone can understand. Steer clear of jargon, technical terms, and acronyms that only a few will understand. It’s also good to ensure that the words or phrases you use are not ambiguous or open to interpretation.

If you must introduce terms or acronyms that may not be easily understood by the average layperson, or that may be open to multiple interpretations, it is best to provide a definition before or within the question so that everyone is on the same page.

Feel free to check out more survey design best practices under our Learn Section.

If you have questions related to Milieu's survey platform and/or other product/service offerings feel free to reach out to sales@mili.eu.

Milieu Insight is a survey software and market research agency, enabling businesses to thrive through data analytics.