Cognitive bias is a bit of a loaded term. It can be a problem for companies striving to become more data-driven, although many cognitive biases are the flip side of an often useful thought process.

A cognitive bias is what Olivier Sibony, author of You’re About to Make a Terrible Mistake!1 and professor at HEC Paris, calls a heuristic gone bad.

A heuristic is a kind of mental shortcut that helps us navigate life. For instance, “look to the left before you cross the street” is, in the U.S., a heuristic. It’s a useful mental habit—the most immediate danger to a pedestrian comes from the left. However, when an American visits the United Kingdom, this same habit becomes a cognitive bias, because there people drive on the left side of the road. A useful habit in one context can become useless or even dangerous in another.

These mental habits illustrate what Nobel Prize-winning psychologist Daniel Kahneman refers to as “thinking, fast and slow,” which is also the name of his book.2 One form of cognitive bias, for example, is called the first instinct fallacy. Individuals often learn to go “with their gut,” or first instinct, but research shows that thinking about a question longer often gives a better chance of getting it right.

Still, a Verywell Mind article holds that mental heuristics originally developed for evolutionary reasons to enable faster decisions.3 So, yes, the process of thinking fast can be good and bad. Company leaders face the need to figure out when a heuristic tips over into a bias and then institute measures to guard against and repair that bias. Without such a repair, an organization is likely to repeat past decisions merely because they provide a precedent, even when the situation has changed radically over time. In the data context, cognitive biases cause incorrect analyses—and bad business decisions result.

Companies need to experiment with different models and get input from a broad range of contributors. That’s the only antidote.

Cognitive Bias Is Remarkably Persistent 

Cognitive bias comes in many forms and can act as a drag on an organization’s efficiency. In addition to the first-instinct fallacy, these biases include the sunk cost fallacy (throwing good money after bad), the planning fallacy (underestimating the time needed to complete a task), and many more.

According to Sibony,4 “Just being aware of your biases doesn’t make them go away. The only thing that is going to improve the quality of your decisions is if, at the organizational level, you change the way decisions are made.”

Sibony recommends structuring the decision-making process to include a number of different parties, so that participants can catch a bias that one person alone would not. Including different parties in decision-making counters some biases; making certain the group assessing the decision is diverse in background is likely to eliminate more.

As Bernhard Günther, CFO of German electric utility RWE, told The McKinsey Quarterly for an article called “A case study in combating bias,”5 when his organization analyzed its decision-making dynamics, certain things became clear.

“We had fallen victim to a number of cognitive biases in combination,” said Günther. “We could see that status quo and confirmation biases had led us to assume the world would always be what it used to be.” 

One of the countermeasures RWE instituted was a “devil’s advocate” for every big decision. 

The devil’s advocate is “someone who has no personal stake in the decision and is senior enough in the hierarchy to be as independent as possible, usually a level below the executive board,” according to Günther. The advocate’s task is to create constructive tension, ensuring greater confidence in the decision that is ultimately made. 

“When you’re planning to launch a new product, or you’re planning to build a new factory, or you are organizing the Olympics or something like that, you’re extremely likely to underestimate the time and the cost of the project and to end up with an overrun,” said Sibony. “This keeps happening over and over again, despite the fact that we know about it, that we’ve read about it, that we’ve done studies about it, that we’ve been warned about it, and that we think we’ve taken precautions about it. That characteristic of a bias is that it is a mistake that keeps recurring despite the fact that we’ve been warned against it, sometimes profusely.”

The notion of the businessperson as an individual is so powerful that we have novels about it, from The Rise of Silas Lapham by William Dean Howells to The Fountainhead by Ayn Rand. So executives may feel they are going against the grain when they ensure decisions are made by a number of people. But simply knowing that people are susceptible to cognitive bias won’t protect against it.

However, learning to embrace an iterative and collaborative process can take the sting out of having your decision questioned. Jennifer Belissent, Principal Data Strategist for Snowflake, suggests transforming decisions—planning decisions and others—from a one-and-done model to an iterative one. Set the expectation that budgets, as an example, will be created in a series of drafts, with input from multiple stakeholders, not by a lone person working until he or she believes it is perfect.

Cognitive Biases Do Not Appear Solely at the Moment of Decision 

According to Belissent, cognitive bias often shows up as “business as usual.” 

“Where do you go for doing certain things?” asked Belissent. “Who’s your go-to for something? Or, how do you typically address a specific problem? What are your shortcuts for doing things?” 

This tendency to default to old habits and reflexes becomes a problem when the goal is creativity and innovation. 

“If you’re looking for creativity, if you’re looking for new ways of doing things, if you’re looking for new inspiration, you’ve got to fight those habits,” said Belissent. 

In “Three keys to faster, better decisions,”6 McKinsey Quarterly noted a decision innovation instituted by Amazon CEO Jeff Bezos: “In his April 2017 letter to Amazon shareholders, CEO Jeff Bezos introduced the concept of ‘disagree and commit’ with respect to decision-making. It’s good advice that often goes overlooked. Too frequently, executives charged with making decisions…leave the meeting assuming that once there’s been a show of hands—or nods of agreement—the job is done. Far from it.”

Transforming from a collection of individuals to a team, from a one-and-done model to an iterative one, is necessary to become both more effective and more creative. There is an element of play in this change. It is fun to work with people you like and people you respect to create something you can all get behind. This approach does not rob us of our individuality; it adds to what we can do when we are also members of a team.

In the final analysis, the easiest way to build bias into an organization is by consolidating decision-making in the hands of one person or one group. The most reliable way of breaking out of bias is to increase the number of voices—and diversity of experience and perspective—participating in the discussion of a given decision. 

Not all cognitive biases are bad; it’s up to business leaders to recognize them and address them when they lead to bad interpretations of data and bad business decisions.

1 bit.ly/3B5ztIh
2 bit.ly/36E0X9X
3 bit.ly/3yWeGVN
4 bit.ly/3B5ztIh
5 mck.co/3AtRIHf
6 mck.co/3hEvoC9


Read more about overcoming cognitive bias:

You’re About to Make a Terrible Mistake! by Olivier Sibony

Thinking, Fast and Slow by Daniel Kahneman

Think Again by Adam Grant

The Hidden Brain by Shankar Vedantam