An investment said to have an 80% chance of success sounds far more attractive than one with a 20% chance of failure. The mind can’t easily recognize that they are the same.
- Daniel Kahneman, Economics Nobel Laureate
Imagine that as a Product Manager, you’re researching a new potential product. You think that the market for this product is growing, and, as part of your research, you find information that supports this belief. You report back to the team that this is a “Billion Dollar” opportunity.
As a result, you convince your management that the product will do well, and you launch it.
However, the product fails.
The market hasn’t responded, and there are fewer customers than you expected. You can’t sell enough of your products to cover their costs, and you make a loss.
If you are wondering what this story is about: your decision was affected by a cognitive bias. It led you to interpret market information in a way that confirmed your notions and preconceptions instead of seeing it objectively, and so you made the wrong decision.
What is Cognitive Bias?
In our fast-paced world, we almost always use rules of thumb to make decisions, which helps reduce mental effort. This is all the more true when we are confronted with uncertain situations. These shortcuts, mostly occurring without our conscious control, are cognitive biases. They often lead to non-rational and suboptimal outcomes.
In their seminal work, psychologists Daniel Kahneman and Amos Tversky argued that most human decision making happens in an intuitive, non-rational manner that is fundamentally different from the rational models proposed by researchers before them.
Their theory in behavioral economics suggests that our decision making splits into "System 1" and "System 2" thinking. System 1 operates quickly and with very little effort. System 2 is responsible for allocating attention to difficult mental activities, which means its effort is greater and its results slower than System 1's.
For example, when we learn to drive a car, the process of driving is a difficult mental effort requiring extreme levels of concentration. This is classic System 2 thinking. But later on, once we’ve been driving for a few years, we can arrive at our destination with no memory of the journey there. All the effortful concentration required to drive has been off-loaded to System 1. Our muscle memory has taken over, and we can drive without any conscious awareness of the decisions we make as we drive.
Biologists offer a perspective on why this happens: the brain is a metabolically expensive organ to run, estimated to consume approximately 20% of the body's available energy. System 2 tries to optimize this drain of energy by making use of System 1 thinking as much as possible, since System 1 is metabolically less expensive to use.
On a deeper look, the human brain is capable of roughly 10^16 processes per second, which makes it far more powerful than any computer currently in existence. But that doesn’t mean our brains don’t have major limitations. The lowly calculator can do math thousands of times better than we can, our memories are often unreliable, and we’re subject to cognitive biases, those annoying glitches in our thinking that cause us to make questionable decisions and reach erroneous conclusions.
Some social psychologists believe our cognitive biases help us process information more efficiently, especially in dangerous situations. Still, they lead us to make grave mistakes. We may be prone to such errors in judgment, but at least we can be aware of them.
Cognitive Biases Plaguing the PM’s world
Wikipedia lists more than 100 types of cognitive bias, but here we pick the key ones that a Product Manager should be aware of in order to avoid the suboptimal outcomes they cause.
Confirmation Bias
Most people don’t like having their opinions contradicted.
With this bias, we tend to interpret data in a way that reinforces our existing opinion. When analyzing data, we place greater weight on the data points that support our view, and less on those that contradict it.
In other words, when we judge any kind of metric, we need to pay particular attention to how the test was constructed, and try to treat the outcomes neutrally when making decisions based on the data.
This common bias is the reason we must be very careful about how questions are worded in surveys, interviews or focus groups. Unfortunately, it is harder to combat this bias in data interpretation: internal politics, personal goals or simply lack of knowledge can turn research users into cherry pickers. Confirmation bias also affects the way people view statistics. Researchers report that people tend to infer information from statistics that supports their existing beliefs, even when the data support an opposing view. That makes confirmation bias a potentially serious problem to overcome when you need to make a statistics-based decision.
UX practitioners are taught to be aware of confirmation bias in, for example, usability testing.
All of this makes it virtually impossible to construct true double-blind experiments in practices such as usability testing and ethnographic research.
The problem is, biases invariably creep into any team’s reasoning—and often dangerously distort its thinking. A team that has fallen in love with its recommendation, for instance, may subconsciously dismiss evidence that contradicts its theories, give far too much weight to one piece of data, or make faulty comparisons to another business case.
That’s why, with important decisions, managers need to conduct a careful review not only of the content of recommendations but of the recommendation process. To that end, the authors—Kahneman, who won a Nobel Prize in economics for his work on cognitive biases; Lovallo of the University of Sydney; and Sibony of McKinsey—have put together a 12-question checklist intended to unearth and neutralize defects in teams’ thinking. These questions help leaders examine whether a team has explored alternatives appropriately, gathered all the right information, and used well-grounded numbers to support its case. They also highlight considerations such as whether the team might be unduly influenced by self-interest, overconfidence, or attachment to past decisions.
By using this practical tool, executives will build decision processes over time that reduce the effects of biases and upgrade the quality of decisions their organizations make. The payoffs can be significant: A recent McKinsey study of more than 1,000 business investments, for instance, showed that when companies worked to reduce the effects of bias, they raised their returns on investment by seven percentage points.
Avoiding Confirmation Bias:
Always look for ways to challenge what you think. Seek out information from a range of sources, and use an approach such as the Six Thinking Hats to consider situations from multiple perspectives.
Discuss your thoughts with a diverse group of people, and don’t be afraid to listen to dissenting views. Seek out people and information that challenge your opinions, and play the proverbial “Devil’s Advocate”.
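The effect of cherry-picking data to confirm a belief is easy to demonstrate with a toy example. The numbers below are hypothetical daily conversion rates invented purely for illustration: keeping only the “supportive” days makes the product look like it clears the target, while the full data set tells a different story.

```python
# Hypothetical daily conversion rates for a product test (illustrative only).
daily_conversion = [0.021, 0.034, 0.018, 0.041, 0.016,
                    0.038, 0.019, 0.022, 0.040, 0.017]

target = 0.03  # the rate we *want* to see to justify a launch

# Confirmation bias in action: keep only the days that support our belief...
supportive = [rate for rate in daily_conversion if rate >= target]
biased_estimate = sum(supportive) / len(supportive)

# ...versus treating all the data points neutrally.
honest_estimate = sum(daily_conversion) / len(daily_conversion)

print(f"cherry-picked estimate: {biased_estimate:.3f}")  # 0.038, looks like a winner
print(f"all-data estimate:      {honest_estimate:.3f}")  # 0.027, below the target
```

Same data, opposite conclusions: the only difference is which observations we allowed ourselves to see.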
Anchoring
“First impression”, or anchoring, happens when you form an initial picture of a situation too early and have a tendency to jump to conclusions. You base your judgment on information gained too early in the decision-making process. Anchoring tends to happen when you are under pressure to make quick decisions, or are being asked by others to decide quickly.
Avoiding anchoring effect:
Take time to make decisions slowly, and be ready to ask for longer if you feel under pressure to decide quickly. This can help you ensure that you’ve made a thorough, well-considered decision.
Overconfidence Bias
Overconfidence bias happens when we place too much faith in our own knowledge and opinions. We may also believe that our influence and contribution to a decision is more valuable than it actually is.
A manager might combine overconfidence bias with anchoring, acting on hunches and holding an unrealistic view of his own decision-making ability.
In one study, researchers found that entrepreneurs are more likely to display overconfidence bias than the general population. They can fail to spot the limits of their knowledge, so they perceive less risk. Some succeed in their ventures, but many do not.
Avoiding Overconfidence Bias
Consider the following questions:
- What sources of information do you tend to rely on when you make decisions? Are these fact-based, or do you rely on hunches?
- Who else is involved in gathering information? Are these sources credible and reliable?
If you suspect that you might be depending on potentially unreliable information, think about what you can do to gather comprehensive, objective data.
Gambler’s Fallacy
With the gambler’s fallacy, we expect past events to influence the future. A classic example is a coin toss. If you toss a coin and get heads seven times consecutively, you might assume that there’s a higher chance that you’ll toss tails the eighth time.
Often, the longer the run, the stronger your belief can be that things will change the next time. However, in this example, the odds are always 50/50.
The gambler’s fallacy can be dangerous in a business environment. For instance, imagine that you’re an investment analyst in a highly volatile market. Your four previous investments did well, and you plan to make a new, much larger one, because you see a pattern of success.
In fact, outcomes are highly uncertain. The number of successes that you’ve had previously has only a small bearing on the future.
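The independence of coin tosses is easy to check with a quick simulation. This self-contained sketch (with hypothetical parameters) estimates the probability of heads on the toss that immediately follows a run of seven heads:

```python
import random

random.seed(42)  # fixed seed so the run is reproducible

def prob_heads_after_streak(streak_len=7, tosses=200_000):
    """Estimate P(heads) on the toss immediately following a run of `streak_len` heads."""
    heads_after = 0
    follow_ups = 0
    streak = 0
    for _ in range(tosses):
        heads = random.random() < 0.5
        if streak >= streak_len:   # this toss follows a long run of heads
            follow_ups += 1
            heads_after += heads
        streak = streak + 1 if heads else 0
    return heads_after / follow_ups

p = prob_heads_after_streak()
print(round(p, 2))  # hovers around 0.5: the streak has no effect on the next toss
```

However long the streak, the estimated probability stays near 50%, which is exactly what the gambler’s fallacy denies.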
How to Avoid Gambler’s Fallacy
An HBR study reported that the gambler’s fallacy was less likely to occur when decision makers avoided looking at information chronologically.
So, to avoid the gambler’s fallacy, make sure that you look at trends from a number of angles, and drill deep into the data using analytics tools. Also try to notice patterns in behavior or product success. For example, if several projects fail unexpectedly, look for trends in your environment, such as changed customer preferences or wider economic circumstances. Tools such as PEST analysis can be really helpful.
Fundamental Attribution Error
Fundamental attribution error is a tendency to blame others when things go wrong, instead of looking objectively at the situation. In particular, you may blame or judge someone based on a stereotype or a perceived personality flaw.
For example, if you’re in a car accident, and the other driver is at fault, you’re more likely to assume that he or she is a bad driver than you are to consider whether bad weather played a role.
The flip side of fundamental attribution error is actor-observer bias: when we ourselves are involved, we tend to place the blame on external events rather than on ourselves.
For example, if you have a car accident that’s your fault, you’re more likely to blame the brakes or the wet road than your reaction time.
How to Avoid Fundamental Attribution Error
It’s essential to look at situations, and the people involved in them, non-judgmentally and empathetically, and to understand why people behave in the ways that they do.
Build emotional intelligence so that you can look objectively at your own behavior and direct your actions.
Framing Effect
The framing effect is the tendency to draw different conclusions from the same information depending on how that information is presented. For example, respondents will reply with a different set of brands if we ask them which clothing retailer, versus which online clothing retailer, comes to mind. The framing effect is also omnipresent in research reporting: presenting the results from different perspectives can lead to different conclusions.
The IKEA Effect
We tend to value the things we have worked on more than equivalent things we have not.
This was verified in a study by behavioural economist Dan Ariely. In the experiment, college students were paid $5 to assemble an IKEA box. After they had completed it, they were asked how much they would pay to take the box home with them, compared with an already-assembled box. Students were willing to pay more for the boxes they had assembled themselves.
The implication for Product Managers is typically seen when they work on a product over a period of time, which leads to a higher perceived valuation in their minds. This can be quite a boon when execution is heading in the right direction, but it can also blind them to evidence that the product is underperforming.
Social Proof
In situations of uncertainty, what usually works is looking at what others around us are doing.
People generally discount their own beliefs and imitate others as a way to overcome uncertainty and avoid the costs of searching for more information, or the blame for making the wrong choice. As a result, using champions or advocates to support adoption can be extremely effective.
Team members tend to follow the strong personalities in the organization in their bid for “social proof”. However, this halo effect can turn negative too: if the new adoptions do not live up to expectations, they may generate disappointment.
Status Quo Bias
Status quo bias describes people’s tendency to maintain their current situation even when the alternative may be better. It has the potential to impact negatively on organizations as they upgrade or replace their ways of thinking, tools and methodology. This may arise through subconscious habits (for example, automatically thinking in a certain way that has worked in the past) and the evaluation of switching costs.
Effort invested and skills gained in the existing setup act as a deterrent: even when there are clear indications that change is necessary, we fail to respond to them.
Sunk Cost Fallacy
This is also termed dysfunctional commitment: sticking with a losing course of action, such as a poorly performing stock in a portfolio.
The sunk cost fallacy comes into play when we decide whether to spend additional energy on an existing program or to invest in a new one. Take the example of a poorly performing stock. One of the best strategies for a rational investor is to cut losses as early as possible and invest only in winning stocks. But most non-professional investors have trouble letting go of poorly performing stocks until it’s too late.
In the context of management, it is tempting to continue investing in a product in the hope that new features will drive customer conversion. The decision to invest in a new feature in an existing product should be evaluated against all the available alternatives. This would include investing in a completely new product and its chances of success compared to investing in the existing product.
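One way to keep the sunk cost out of the evaluation is to compare only the forward-looking expected values of the alternatives. The probabilities and payoffs below are made-up numbers for illustration; the point is simply that money already spent never enters the calculation.

```python
# Hypothetical numbers for illustration only.
sunk_cost = 500_000   # already spent on the existing product: irrelevant to this decision
investment = 100_000  # the next tranche of money we are deciding how to spend

options = {
    # option: (assumed probability of success, assumed payoff if it succeeds)
    "new feature on existing product": (0.30, 400_000),
    "completely new product": (0.20, 900_000),
}

# Forward-looking expected value; note that sunk_cost is deliberately excluded.
expected_values = {
    name: prob * payoff - investment
    for name, (prob, payoff) in options.items()
}

for name, ev in expected_values.items():
    print(f"{name}: expected value {ev:,.0f}")
# new feature on existing product: expected value 20,000
# completely new product: expected value 80,000
```

Under these (made-up) assumptions, the new product wins even though far more has already been spent on the existing one; including the sunk 500k would only distort the comparison.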
A few related decision-making biases are also worth watching for:
- Knee-jerk bias: making fast, intuitive decisions when slow, deliberate decisions are necessary.
- Occam’s razor bias: assuming the most obvious decision is the best decision.
- Silo effect: using too narrow an approach in making a decision.
- Myopia bias: seeing and interpreting the world through the narrow lens of your own experiences, baggage, beliefs, and assumptions.
- Shock-and-awe bias: believing that our intellectual firepower alone is enough to make complex decisions.
- Overconfidence effect: excessive confidence in our beliefs, knowledge, and abilities.
Think about the bad decisions we have made as individuals or organizations over the years, both minor and high-impact ones, and we will probably see the fingerprints of some of these cognitive biases all over them.
Biases come in groups
We might assume that biases surface only one at a time, but this is far from reality. Most organizations and leaders fail to make the right call in a particular situation because of a group of biases acting together against them.
Take, for example, the case of the Columbia Shuttle failure:
The Columbia Shuttle disaster was caused by a piece of foam insulation breaking off the propellant tank and damaging the wing. The problem with the foam sections was known, but management had assumed that it posed no risk.
In their analysis, researchers found the following biases to be relevant to the failure of this project:
- Conservatism: Management failed to take into account negative data.
- Overconfidence: Management was confident there were no safety issues.
- Recency: Although foam insulation had broken off on previous flights, it had not caused any problems.
Mitigating Cognitive Biases
Cognitive biases are a given; we cannot escape falling prey to them. The good news, however, is that we can take care to mitigate them to a large extent:
- Staying in Awareness is key to reducing the influence of cognitive biases on decision making. Simply knowing that cognitive biases exist and can distort our thinking helps lessen their impact. Learn about them, acknowledge their inevitability, and take a balanced view whenever we are in situations that call for decisions.
- Mutual Collaboration may be an effective tool for mitigating cognitive biases. Simply put, it is easier to see biases in others than in ourselves. When we are in decision-making meetings, keep the cognitive-bias radar turned on and look for biases in our colleagues.
- Mindset of Inquiry is fundamental to challenging the perceptions, judgments and conclusions that can be impacted by cognitive biases. Using our understanding of cognitive biases, ask the right questions of ourselves and others to shed light on the presence of biases and on the best decisions that avoid their trap.
- Disciplined Decision Frameworks help as well. While brainstorming and free-wheeling discussions can be valuable in generating multiple decision options, they can also provide the miasma in which cognitive biases float freely and contaminate the resulting decisions. When you establish a disciplined and consistent framework and process for making decisions, you increase your chances of catching cognitive biases before they hijack your decision making.