
In this article, we will discuss in detail what cognitive biases are, the types of bias in UX, and how you can lessen or avoid them.


What are cognitive biases?

Cognitive biases are not an entirely new idea: they were first recognized in the 1970s, when two acclaimed scientists, Daniel Kahneman and Amos Tversky, were researching people's innumeracy. They found that most of their test subjects made decisions that were far from rational, especially when it came to large numbers.

Their experiments showed that instead of sticking to the facts, people use mental shortcuts to reach a conclusion.

We call these shortcuts heuristics. They help us solve problems faster, but they can also produce systematic errors, which we call cognitive biases.

A cognitive bias is a systematic, non-random way in which context affects judgment and decision-making. In short, it's all about how information is framed.

Generally, we define a bias as a tendency, inclination, or prejudice toward or against something or someone. Biases are often based on stereotypes rather than facts about individuals or circumstances.

These biases may result in prejudgments that lead to rash decisions or discriminatory practices in UX. In short, you end up forming a description of someone before you've really gotten to know them.


As humans, we focus on different aspects of our surroundings, which is why our reasoning is rarely fully rational. Tversky and Kahneman found that framing the same information differently can lead to different outcomes.

Since biases are all about favoring something or someone based on little information, they can have a major impact on user research and a negative influence on the design of the final product.

We all have biases, and they’re often unconscious. While we can’t completely get rid of biases, we can be more aware of them and work to overcome them. In UX design, this is critical to product success and to your professional development.


The importance of cognitive biases in UX

Before we discuss cognitive biases further, it is important to acknowledge that designers and product developers alike are likely to have cognitive biases. No one is immune to heuristics and prejudices. Accepting this makes us aware of our own biases, helps us see things more clearly, and lets us make decisions more logically.

As mentioned, we're all prone to framing. Context and previous experiences can influence our design decisions. These external factors make us focus on specific aspects of an issue and ignore others.

Kathryn Whitenton of the Nielsen Norman Group (NNG) used a brilliant example to describe how this works for UX designers.

For example, say we have just conducted a usability test with 20 users. The outcome can be described in two ways:

  • 20% of users couldn’t find the function on the website

  • 80% of users found the function on the website

See the difference between the two? The researchers at NNG presented both framings in an online quiz. Here are the results:

  • 39% of the UX designers who saw the success rate voted for a redesign

  • 51% of the respondents who saw the failure rate thought the feature needed to be redesigned

The data showed how framing research results and statistics in different ways can lead to significantly different design decisions.

However, cognitive biases can affect test users too. This is most visible in the perception of pricing: what seems expensive versus what feels like a bargain.

Let's now discuss the most common cognitive biases that apply to users and UX designers alike.


Common cognitive biases

Anchoring bias

The anchoring bias, also known as the anchoring principle, is the tendency to rely on a single factor and ignore others that may be involved. It is considered a judgment heuristic, or a form of framing, since people use it to make decisions.

One positive side of the anchoring bias is that it can help users make sense of an interface: a user can latch onto a single cue and therefore learn to use the application faster.

For UX researchers, the first piece of information received (the primacy effect) tends to become the anchor most of the time.

This is useful to remember during user testing. Respondents may prefer the first version they were introduced to simply because it was the first one, not because it was actually better. One way to avoid this is A/B testing, where different users are shown different versions, as in the sketch below.
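To make that concrete, here is a minimal sketch in Python of how an A/B-style comparison can be set up so that no participant is anchored by a "first" version: each person is randomly assigned to see only one version, and the two success rates are then compared. The participant names and success counts are entirely hypothetical, not data from this article.

```python
import math
import random

# Randomly assign each participant to a single version, so nobody is
# anchored by whichever version they happened to see first.
participants = [f"user_{i}" for i in range(40)]          # hypothetical panel
assignment = {p: random.choice(["A", "B"]) for p in participants}
print("assignment split:", {v: list(assignment.values()).count(v) for v in ("A", "B")})

# Hypothetical task-success counts collected after the sessions.
results = {"A": {"success": 14, "total": 21}, "B": {"success": 17, "total": 19}}

def two_proportion_z(a, b):
    """Compare the success rates of two versions with a two-proportion z-test."""
    p1, p2 = a["success"] / a["total"], b["success"] / b["total"]
    pooled = (a["success"] + b["success"]) / (a["total"] + b["total"])
    se = math.sqrt(pooled * (1 - pooled) * (1 / a["total"] + 1 / b["total"]))
    z = (p1 - p2) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

z, p = two_proportion_z(results["A"], results["B"])
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests a genuine preference, not order effects
```

The point here is the random assignment rather than the statistics: because each participant sees only one version, whatever preference emerges cannot be explained by anchoring on the first thing they saw.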

Bandwagon Effect

The bandwagon effect is the tendency to do or believe things because many other people do or believe the same. It is also known as herd behavior.

One company that uses this bias effectively is Zeplin, which showcases the top brands that use its software. The interesting part is that Zeplin also displays the number of designs exported to its database, which signals even more value.

Clustering illusion

Another cognitive bias is the clustering illusion, which shows up in data clustering: the process of organizing large amounts of data into groups or themes based on their relationships.

A lot of UX research guidance suggests that to make data-informed design decisions, UX professionals should cluster both qualitative and quantitative data sources. This works well for experienced researchers. For beginners in the UX field, however, it can be a trap: new UX researchers often create false clusters when analyzing data and tend to see patterns even when none exist.

In most cases, the issue is a small sample size, which makes it hard to tell whether the observed behavior is typical for larger market segments and therefore increases the risk of incorrect assumptions. Only when you have a sufficient sample size can you cluster data and suggest data-informed changes, as the rough estimate below illustrates.
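As a rough illustration of why sample size matters, the standard formula for estimating a proportion shows how quickly the required number of participants grows as you tighten the margin of error. The function name and numbers below are illustrative only, not from the article.

```python
import math

def min_sample_size(margin_of_error=0.10, confidence_z=1.96, p=0.5):
    """Rough minimum sample size for estimating a proportion of users.

    Standard formula n = z^2 * p * (1 - p) / e^2, using p = 0.5 as the
    most conservative assumption about how users behave.
    """
    return math.ceil(confidence_z**2 * p * (1 - p) / margin_of_error**2)

# To report "X% of users did Y" within roughly +/-10 points at 95% confidence:
print(min_sample_size(0.10))  # 97
# Halving the margin of error to +/-5 points roughly quadruples the requirement:
print(min_sample_size(0.05))  # 385
```

With only a handful of sessions, the margin of error is so wide that apparent clusters are just as likely to be noise.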

Wording bias

A wording bias example: "What is the best quantitative research method? Why is it surveys?"

The above is a classic example of wording bias, also known as the question-wording cognitive bias. This bias happens when the question itself influences the answer.

From the perspective of a user taking a survey, consider the question: "How easy is it for you to use this app?" The question already implies that this particular app is easy to use, at least to some extent.

For a UX researcher, the way a question is worded greatly affects the validity of a survey. Below are the various kinds of biased questions that affect research results:

The leading question

Do you prefer the most updated version of the app or the outdated one? 

The words "most updated" and "outdated" suggest the new version is better than the old one.

The double-barreled question

Did you enjoy using the new editor of the messenger app?

This question bundles two things together: the new editor and the messenger app itself, and it implies that the user already enjoys the editor. But what if they are skeptical of this new feature even though they love the messenger app?

The absolute question

An absolute question forces a yes/no answer, typically with words like "always" or "never" (for example, "Do you always use the search function?"). This rarely works, as a yes/no answer may miss important points. So when you do use this type of question, leave enough space for the user to comment and make suggestions; this way, you gain more in-depth insights.

Implicit bias

Implicit cognitive bias is a collection of attitudes and stereotypes that we associate with people without our conscious knowledge. Implicit bias is also known as unconscious bias.

The most common form of implicit bias in user research is interviewing only people who fit a limited set of identity profiles (race, age, gender, status, etc.).

These profiles are mostly based on assumptions we hold about certain people, for example when an implicit bias makes you uncomfortable interviewing people whose life experiences differ from your own.

On the other hand, we may choose to interview people from typically excluded groups, but then ask offensive questions because of the stereotypes we have internalized about them.

Both of these are examples of problems that can lead to a lack of representation in the research and design process. The most important thing to note about implicit biases is that everybody has them.

To overcome this kind of bias, we need to ponder and reflect on our own behaviors. We can also ask others to point out our implicit biases.

Decoy Effect

The decoy effect, or asymmetric dominance effect, is a framing bias in which consumers change their preference between two options when presented with a third option that is asymmetrically dominated, that is, clearly worse than one option but not clearly worse than the other. Pricing pages often include a deliberately unattractive "decoy" plan for exactly this purpose: it makes a more expensive plan look like the better deal.

Framing effect

The framing effect is a bias where people react differently to the same information depending on how it’s worded.

The framing effect is especially noticeable when we ask these questions:

Do you enjoy this feature?

This is an example of a leading question: it frames the answer around the word "enjoy." By asking it, you prompt the person to think only about the positive parts of the product experience. Avoid asking leading questions at all costs.

What do you think about this [product/feature/design]?

This question works better because it allows the user to think beyond one particular part of the experience (positive or negative) and focus on their genuine opinion of the product instead.

Social desirability bias

The social desirability bias is the tendency to want to seem likable and be accepted. Also known as the friendliness bias, it encourages users to answer in the way they think the researcher expects.

For the user, social desirability bias typically appears unconsciously. Respondents rarely realize they are answering in a kinder, more favorable manner. For example, when asked what they like about the new platform, users tend to give a higher rating just to make the researcher feel better.

For UX professionals, knowing how social desirability bias works is the first step to preventing it. This bias can be avoided by asking indirect questions instead. So rather than asking about the user's feelings about the new platform, you can ask: "How do you think an average user would interact with this feature?" This question won't make the user feel the urge to be friendly or nice.

Barnum–Forer effect

The Barnum–Forer effect is a common psychological phenomenon whereby individuals give high accuracy ratings to descriptions of their personality that are supposedly tailored specifically to them, yet are in fact vague and general enough to apply to a wide range of people.

Spotify creates a year-end playlist tailored to every user to showcase their favorite songs. It feels like a personalized experience, yet Spotify also manages to sneak popular, generic songs into the list that the user hasn't listened to frequently before.

Confirmation bias

In the confirmation bias, people tend to give more weight to evidence that confirms their assumptions and to discount data and opinions that don’t support those assumptions.

Confirmation bias is probably the most dangerous bias on this list because it severely affects the whole approach to UX design.

This type of bias prevents UX professionals from keeping an open mind, which causes problems during ideation and brainstorming sessions. Practitioners who suffer from confirmation bias tend to strengthen their beliefs even in the face of contrary evidence.

During user research, pain points may get ignored when the users who suffer from them don't fit your existing assumptions. For example, a UX practitioner might hear users complain about a poorly designed navigation system but discount the feedback because the design looks logical to them.

The peak-end rule

The peak-end rule is a cognitive bias whereby we judge an experience largely by how it felt at its most intense point and at its end, rather than by the total sum or average of every moment of the experience.

The most intense point can be the most memorable experience that a user can have with a product. For example, the first call on the new phone to one of the user's family members will elicit a memorable experience whether it’s pleasant or unpleasant.

Zeigarnik Effect

The Zeigarnik effect is about uncompleted or interrupted tasks being remembered better than completed ones.

LinkedIn uses this effect to its fullest advantage on its profile page. Have you noticed that every time you open your LinkedIn profile, it tells you the profile is incomplete, nudging you to finish it? Many other websites now use the same strategy.

Sunk cost fallacy

The sunk cost fallacy is about the effect of past decisions on current choices. This bias means that if we have already invested a lot of resources in something, we keep investing more, because we don't want our efforts to go to waste.

The sunk cost fallacy goes hand in hand with another phenomenon called loss aversion, which states that our brains perceive losses as more severe than equivalent gains.

For the user, the sunk cost bias is often part of the user flow, especially when signing up for a service or making a purchase. This mechanism can be triggered by different design tricks, such as:

Progress bar

This element encourages users to complete their actions, even if they require a certain commitment.

Sunk rewards

Used in loyalty programs, sunk rewards encourage the user to stick to your product. It’s a bit like collecting stamps in a cafe so you can get the 10th coffee for free. When you’ve got just one or two purchases left to get something extra, you probably won’t resist the temptation.

Top-up suggestions

Companies use this element to get the user to commit to the service. For example, the user needs to top up their balance before they can make purchases in the app or use its other features.

UX designers sometimes stick with a bad design because of the sunk cost fallacy. For example, after spending long hours on a certain feature, designers tend to stick with it even when it turns out to be a bad idea.

This situation can be avoided through agile development. Short sprints and iterations make it easier to apply changes along the way and avoid wasting resources.


Steps to prevent bias in UX research and design

It’s important to note that everyone has biases. It’s just a natural part of being human. Being able to recognize your own biases and prevent them from affecting your work is what really matters. As a UX researcher or a UX designer, you’ll need to know how to anticipate, identify, and overcome biases in your research, in particular.

Here are the recommended tips to prevent cognitive biases in UX research:

Choose your words carefully

When conducting your UX research, it is essential to choose your words carefully to avoid leading the user in one direction or another.

Of course, as a UX designer, you're going to be partial to the designs you've created, and you'll likely assume that users will appreciate them too. However, when asking users about their experience with the product, you don't want answers that merely please you. For example, we covered leading questions, which can cause a framing effect: users make a decision or choice based on the way information was presented to them.

This is especially important in usability studies. Imagine a test participant evaluating your user interface designs. You ask: "Do you like or dislike the improved layout of these web banners?" Because you used the word "improved," the participant will most likely reply positively.

However, this type of feedback is not going to help you because you framed the question in a way that led the participant to respond accordingly. To improve your product, you need honest, unfiltered user feedback.

So instead, a better way to frame the same question is: “Explain how you feel about the layout of the web banners.” The rephrased question allows the user to come to their own conclusions without any outside influence, which will give you better data about their thought process and experience.

Take note of all the existing assumptions before conducting the research

When conducting user research, be aware of any general and specific assumptions you may have about the project. Using an assumptions map to list them all, together with your team's input, helps you avoid this kind of user research bias.

Avoid presumptions when it comes to gathering evidence and proving a hypothesis

Cognitive biases creep in as soon as we start looking for evidence to prove a hypothesis. Since we believe we already have the answer, we tend to draw conclusions from the information we already have, which is shaped by our beliefs and preconceptions.

Choose participants who are representative of your target audience

It isn't always possible to recruit the same number of participants for every usability study. So, rather than fixating on a single number, determine how many participants you need based on the number of target personas your team has developed. That way, the insights you collect apply to your whole target audience and you avoid user research bias.

Conduct user research on a large sample of users

Another way to avoid cognitive bias is to include a large sample of users. Make sure you're not just recruiting a small group of people who fit your preconceived ideas; you want a large sample of users with diverse perspectives so you can gather a good range of insights.

When doing user interviews, ask open-ended questions

One of the most effective methods for overcoming confirmation bias during research is to ask open-ended questions when conducting interviews. An open-ended question lets the person being interviewed answer freely, instead of with a yes or no.

You also want to get into the habit of actively listening without adding your own opinions. That means you aren’t leading your interviewees toward the answer that you want them to give.

Banish user research bias by learning how to structure and write a user test script

When gathering user feedback, intentions, and preferences about your product or service, always ask open-ended questions, and avoid questions whose answer options are based only on your own assumptions. Similarly, you can present a task as a goal or a scenario so that you can learn more about how users interact with your website or app.

You should also never push a user into confirming a specific outcome or problem; if you do, you might not uncover the underlying problems. When writing scripts, keep your wording clear, neutral, and straightforward so you can better understand what users are actually thinking. Ask follow-up questions so that you understand what is important to them.

Collect a mixture of quantitative and qualitative data

Why is this important? Quantitative metrics let us look at insights objectively, while qualitative data captures sentiment, which helps us understand the user better.

Gain additional perspectives from competitor research and other members of the team

It doesn't hurt to conduct a competitor analysis on top of your unmoderated or moderated interviews. You can unearth what participants like or dislike about your competitors and use that data as another point of reference for improving your overall user experience.

In addition, also consider the opinion of your team members by sharing any videos or insights from studies with them.

Avoid user research bias by listening and watching your body language

When moderating a user interview, avoid user research bias by letting participants talk more than you do. Try not to interrupt; if you need to clear something up, ask a follow-up question such as "Why do you think so?"

Learn to control any of your emotional reactions so the participants are encouraged more to reveal their true feelings about a product or service.


Conclusion

For UX professionals, overcoming cognitive biases is an essential part of UX practice. Never become a practitioner who believes that "these biases do not affect me": that is what we call the "blind-spot bias," the failure to notice your own cognitive biases. The danger is that it may cause you to ignore serious issues in your UX design. The first rule of product design is simple: always be open-minded. You must first be aware of your own biases before you can avoid them.


Mary Ann Dalangin

About the author

A content marketing strategist and a UX writer with years of experience in the digital marketing industry.

