When running design sprints, you can't avoid encountering biases. While design sprints give everyone participating a say, the way you conduct the sprint and user testing, and how you interpret the results, can introduce biases that affect your final product.
Whether you realise it or not, cognitive biases are part of being human. They are simply what our brains do. But that doesn't mean we should embrace them. We need to find ways to reduce, or even eliminate, them if we want to build products that are truly valuable and inclusive.
What is a cognitive bias?
A cognitive bias is a systematic thought pattern that is irrational and ungrounded. It is an incorrect assumption, a wrong perception, or a false conclusion that the brain jumps to, affecting how we make decisions and live our lives.
But why does the brain do this? One theory is that cognitive biases are a way for the brain to work efficiently, reducing the time needed to process information and reach a conclusion. That is great for survival, but bad for innovation.
There are hundreds of different cognitive biases. In this article, I will cover the common ones that pop up in design sprints, user testing, and UX research, along with examples and suggestions for how to reduce them.
Implicit bias
This is arguably the most common bias you hold and encounter in daily life, existing in its everyday form as societal bias. An implicit bias is a stereotype our brain applies to a specific gender, culture, age, nationality, religion, background, or other group based on our limited experience and information. Unfortunately, this kind of bias exists even among the best of us.
Implicit bias examples
- Not inviting younger recruits to design sprints because you think they are new to the company and have nothing to contribute.
- Discounting or eliminating an idea based on the person giving the idea, not the merit of the idea.
- When conducting user tests, you assume a certain group is less tech-savvy based on their profession or age. For example, "I think this group of participants may not know how to use this new feature compared to others because they are older." In reality, it could be that your new feature isn't user-friendly to begin with, and that's a flaw you must fix.
How to reduce implicit bias in your sprint
- Make it clear to everyone early in the sprint that all ideas should be treated equally.
- Invite a diverse set of participants from different departments to the sprint.
- During user testing, ensure you investigate the root cause of the issue that users are experiencing, rather than making immediate assumptions about the user.
- Recruit a group of users that are diverse in background, age, profession, and experience.
Confirmation bias
Confirmation bias occurs when you already believe a hypothesis and then cherry-pick the results you get to match that belief. You tend to highlight the results that are consistent with your beliefs and disregard the ones that run counter to your hypothesis. Left unchecked, this is dangerous: you'll waste valuable UX research and test results, and end up with a product that doesn't satisfy your users.
Confirmation bias examples
- Voting on ideas that are similar to yours without considering different ideas that were proposed.
- Creating test questions that validate your hunch while ignoring other possibilities, excluding them from the questions asked or the options given.
How to reduce confirmation bias in your sprint
- Ensure everyone gives their opinion on the UX research prepared at the start of the sprint.
- Double check the user testing questions to ensure you have covered all angles.
- Have a different person create the user test rather than the people involved in designing the prototype, or make it a collective effort.
- Discuss the results together to produce a less biased conclusion.
Framing bias
Framing bias occurs when you decide or answer based on how information is presented, or framed, rather than on cold, hard facts. Framing helps when you are trying to persuade someone to your side of an argument, but it is terrible for user testing. Framing a question a certain way will only give you the answer you framed for, instead of the real answer that can help identify issues with your prototype.
Framing bias example
You’re revamping the onboarding process for new users in your app. You think that this new user experience design is the best way to do this. One of the questions you ask users is, “How easy is this onboarding process after signup?” This question assumes that the new process is easy for the users, but it may not be the case.
A better way to ask is, “How do you feel about this onboarding process after signup?” From there, you can ask further questions to delve deeper into whether it was effortless, fast, easy to understand or not.
How to reduce framing bias in your sprint
- Use positive or negative words, such as "like" or "dislike", sparingly in your questions.
- Use a sliding scale for answers, so you get a better picture of the design you’re testing.
Sunk cost fallacy effect
The sunk cost fallacy effect is when you stick with an option that isn't right simply because you've already invested a large amount of money, time, and resources into it. This bias can be seriously damaging, because you will burn more money going further down the wrong path. Rectifying it later will be extremely costly, or may even sink your business.
Sunk cost fallacy examples
- Pushing ahead with a prototype that tested poorly because the team has already spent days building it.
- Voting for an idea based on how much has already been invested in it, rather than on its merit.
How to reduce sunk cost fallacy effect in your sprint
- Leave cost and time out of the picture when brainstorming and generating ideas.
- Do a quick cost-benefit analysis on the popular ideas, so you get a clearer picture of your options.
- Run design sprints before making big decisions, so you are always testing your ideas before making huge investments.
Friendliness bias
Friendliness bias arises when people agree with whatever is presented because the person presenting it is someone they like or are familiar with. It's like asking your friends and family about your business idea: they may not tell you the bitter truth, to avoid hurting your feelings. And psychologically, you are more likely to concur even with a stranger if they go out of their way to make you like them.
Friendliness bias examples
- Inviting a group of close work friends to your sprint so that they make up most of the participants. You may find there are fewer contradicting opinions, which is bad for product innovation.
- For user testing, you end up inviting people who have participated in past rounds. Because you are on friendly terms, they may not answer truthfully, instead trying to please you when asked about your design.
- Treating users differently based on your perception of them during testing. For example, you may be overly friendly with someone you feel is at a disadvantage, whether older, sicker, or disabled.
How to reduce friendliness bias in your sprint
- Hire an external facilitator, so you at least have a neutral third party coordinating the sprint.
- Recruit new users regularly for testing and interviews.
- When conducting interviews, treat every user with respect and empathy, but don’t go overboard to make them like you.
- Use more online tests, instead of face-to-face sessions.