It has often been said that people tend to “hear what they want to hear.”
In other words, it can be all too easy to accept information or research that matches what we already believe, and to filter out information or research that contradicts it.
This tendency can also increase over time: with experience comes wisdom… and confirmation bias, the inclination to pursue and embrace information that matches our existing beliefs. A simple pre-existing opinion can easily distort the way our minds absorb confirming and contradictory information.
We are making this point today as a follow-up to our previous post, which focused on devoting more resources to value-added work. Four suggestions were made:
Work On Bottlenecks
Increase Understanding of And Alignment With What Customers Truly Value
Get At The Root Causes
Eliminate The Non-Value-Adding Administrative Work
Note that each of these four approaches requires us to conduct research and to compile facts and data. However, each also involves studying areas of our organization with which we are already familiar. Thus the likelihood of “confirmation bias” creeping into our research is high.
So, as the saying goes, forewarned is forearmed! Beware of “happy ears.”
Our previous post explained the concept of “confirmation bias,” which is the tendency to pursue and embrace information that matches our existing beliefs.
Here are some general examples of how confirmation bias can creep into our day-to-day thinking, and three proven ways to avoid the pitfall:
Decision-Driven Data
As previously noted, the inclination to look for supportive data can easily lead us to serious mistakes. Social scientists report that analyses of investments we favor invariably look rosier than analyses of investments we doubt.
Many small choices go into collecting data, crunching numbers, analyzing opportunity and risk, and presenting results. Absent a conscientious effort to avoid confirmation bias, these small choices, each valid on its own, tend to be made in support of our initial opinion. We think we are making data-driven decisions, but we are really collecting decision-driven data.
For example, author Daniel Kahneman once described a study of high-performing schools that set out to determine whether size played a role in the quality of educational outcomes. The data indicated that the top quartile in educational performance contained a disproportionate number of small schools, supporting the hypothesis that small schools provided better quality education. This led to some expensive policy decisions that produced no educational benefit. It turned out that small schools were disproportionately represented in the worst-performing quartile as well: smaller samples produce more variable averages, so small schools land at both extremes, while larger schools “regress to the mean,” become more “average,” and are under-represented in both the top and bottom quartiles.
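The statistical effect behind the small-schools story can be seen in a quick simulation. This is a minimal sketch with made-up numbers (the score distribution, school counts, and school sizes are all assumptions): every student is drawn from the same distribution, so school size has no real effect on quality, yet small schools dominate both the top and bottom quartiles.

```python
import random

random.seed(42)

def school_average(n_students):
    # Every student's score comes from the SAME distribution (mean 500),
    # so any difference between schools is pure sampling noise.
    return sum(random.gauss(500, 100) for _ in range(n_students)) / n_students

schools = [("small", school_average(20)) for _ in range(500)] + \
          [("large", school_average(500)) for _ in range(500)]
schools.sort(key=lambda s: s[1])

quartile = len(schools) // 4
bottom, top = schools[:quartile], schools[-quartile:]

small_in_top = sum(1 for size, _ in top if size == "small")
small_in_bottom = sum(1 for size, _ in bottom if size == "small")
print(f"Small schools in top quartile:    {small_in_top}/{quartile}")
print(f"Small schools in bottom quartile: {small_in_bottom}/{quartile}")
# Small schools crowd BOTH extremes, even though size has no effect
# on the underlying score distribution.
```

Looking only at the top quartile “confirms” that small schools are better; looking at the bottom quartile as well shows the pattern is sampling variance, not quality.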
First Impressions
Confirmation bias also plays an important role in the inordinate impact of first impressions. A first impression provides a very small and possibly unrepresentative sample of a candidate’s qualities and qualifications. Yet people who believe a candidate is very intelligent before the interview tend to notice more signs of high intelligence during it.
Here are three things we can do to protect our decision-making process from confirmation bias and potential distortion:
Recognize the bias and remind yourself to look for it in your decisions and analyses. Remind yourself that the authors of everything you read (including this article) are making a point that is supported by the data they present, but not necessarily by the data they do not present; indeed, if they did not look hard enough for contrary data, they may never have seen it. Remind yourself that the talented and well-meaning people providing you with analysis and recommendations are also subject to confirmation bias. Ask for contrary data.
Ask “what else could it be?” Think creatively about alternative explanations and alternative solutions. Explore the whole feasible set, if possible.
Encourage the expression of contrary views and ideas. “If you value the differences in people, the differences will produce value.” Aggressively seek out and try to understand contrarian views. For many people, the first impulse is to refute contrarian views and argue for our own. But the best decisions are likely to be made by those who “seek first to understand rather than be understood.”
It has happened to most of us. Has it happened to you?
That is, has there been a time when data supported a decision you knew to be the right one, but for some reason or reasons you did not get the outcome you expected?
Perhaps you find an exciting investment opportunity like the winners you have spotted before, but it yields mediocre or poor results. Or despite your experience and successful track record when judging candidates, a person you just “knew” would be a good fit turns out to be a bad hire.
With experience can come wisdom… but also confirmation bias.
Confirmation bias is the tendency to pursue and embrace information that matches our existing beliefs. We tend to seek out and enjoy people who write or say exactly what we think. We gravitate toward these sources not for information but for confirmation.
Researcher and writer Thomas Gilovich posits the “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively.” It’s easier to think what we think!
Yet confirmation bias in business can be especially hazardous and costly for highly experienced and successful individuals. These minds are adept at spotting patterns, learning from experience, scanning the horizon, and connecting the dots. If that describes your talents, take a look at this classic puzzle nicely presented by The Upshot.
If you attempted the puzzle, how did you do?
For those who opted out: in this puzzle, participants are given a numerical pattern and asked to determine the underlying rule. The pattern is quite simple, and participants can test their theories as often as they like before specifying the rule. Yet 77% of participants fail to identify the rule because, as soon as they find a pattern that supports their theory, they conclude it is the correct rule.
In other words, 77% of participants succumb to confirmation bias.
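The puzzle’s mechanics can be sketched in a few lines. The hidden rule below matches the published puzzle (each number is simply larger than the last); the function name and the specific test sequences are illustrative choices. Testing only sequences that fit a fancier “doubling” theory confirms it every time, yet the theory is wrong; only a test the theory predicts should fail can reveal that.

```python
def follows_hidden_rule(seq):
    # The hidden rule: the numbers are strictly increasing.
    return all(a < b for a, b in zip(seq, seq[1:]))

# Confirmation-seeking: test only sequences that fit our "doubling" theory.
confirming_tests = [(2, 4, 8), (3, 6, 12), (10, 20, 40)]
print([follows_hidden_rule(t) for t in confirming_tests])
# All True, so the doubling theory looks "confirmed."

# Disconfirmation-seeking: test sequences our theory says should FAIL.
disconfirming_tests = [(1, 2, 3), (5, 6, 7)]
print([follows_hidden_rule(t) for t in disconfirming_tests])
# Also all True, which proves the doubling theory is wrong.
```

Every confirming test passes, so confidence in the wrong rule grows; a single disconfirming test exposes it immediately.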
This is a common occurrence in business. When trying to solve problems or make decisions we overwhelmingly look for patterns that support our theories rather than looking for data that would clue us in that we have missed the mark. And with each piece of data that does not refute our theory, we become more confident in our belief.
This exercise shows how people tend to work at proving their theories right, instead of robustly testing the theories to prove them wrong. Once we have seen enough supporting evidence to confirm we are right, it is far more natural for us to fully embrace our premise or idea.
For instance, maybe we are tasked with determining why a certain work process is not being done well. Is the work done less well by inexperienced employees, or when the machine is overdue for maintenance, or when the materials have a certain characteristic?
We could test all three of these ideas with data. But our natural confirmation bias makes us far more likely to look for evidence that the idea we favor is correct than to look for ways it may be mistaken. So we start testing the idea we think is most likely, and as soon as we find enough evidence to support it, we risk diving into the solution and excluding the other possibilities. We could very well be headed down a path of action that is suboptimal for our organization.
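As a hypothetical illustration, here is how all three candidate causes could be checked against the same defect records before committing to a fix. The data, field layout, and function names are invented for the example, not drawn from any real process:

```python
# Each record: (operator_experienced, maintenance_overdue, material_batch_A, defective)
records = [
    (False, False, True,  True),
    (True,  True,  True,  True),
    (True,  False, True,  True),
    (False, True,  False, False),
    (True,  False, False, False),
    (False, False, True,  True),
    (True,  True,  False, False),
    (True,  False, False, False),
]

def defect_rate(records, condition):
    # Defect rate among records matching the condition.
    subset = [r for r in records if condition(r)]
    return sum(1 for r in subset if r[3]) / len(subset) if subset else 0.0

hypotheses = {
    "inexperienced operator": lambda r: not r[0],
    "maintenance overdue":    lambda r: r[1],
    "material batch A":       lambda r: r[2],
}

# Compare every hypothesis side by side before committing to a fix.
for name, cond in hypotheses.items():
    rate_if = defect_rate(records, cond)
    rate_if_not = defect_rate(records, lambda r: not cond(r))
    print(f"{name}: {rate_if:.0%} defective vs {rate_if_not:.0%} otherwise")
```

Note that in this made-up data the “inexperienced operator” hypothesis also finds some supporting evidence (67% vs. 40%): testing it alone could easily have “confirmed” the wrong cause, while the side-by-side comparison points squarely at the material batch (100% vs. 0%).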
In our next post we’ll take a closer look at examples of confirmation bias in the workplace and steps that can be taken to avoid it.
As noted in our previous post, confirmation bias is the tendency to pursue and embrace information that matches our existing beliefs; and this inclination to look for supportive data can easily lead us to serious mistakes…
In other words, we become more likely to look for evidence that the idea we favor is correct rather than look for ways it may be wrong.
Here are three ways in which we can protect our decision-making from the related distortion:
Recognize the bias and remind ourselves to look for it in our decisions and analyses. Remember that the authors of everything we read are making a point that is supported by the data they present, but not necessarily by the data they do not present.
Make a habit of asking ourselves, “What else could it be?” We must think creatively about alternative explanations and alternative solutions, and do our best to explore them.
Encourage the expression of contrary views and ideas.
Confirmation bias is the tendency to pursue and embrace information that matches our existing beliefs.
Consider that we tend to seek out and enjoy people who write or say exactly what we think, and that we tend to gravitate toward these sources not for information but for confirmation.
The trouble with confirmation bias begins when it gets in the way of seeking out facts. In other words, we become more likely to look for evidence that the idea we favor is correct rather than look for ways it may be mistaken. In many cases we may start testing an idea we think is most likely right, and as soon as we find enough evidence to support it, we dive into the solution.
We think we are making data-driven decisions, but we are really collecting decision-driven data.
If instead we can develop the habit of seeking contrary information it will enable us to be more informed, make better decisions, and work on the right things.
In our next post we’ll share some specific action-steps for avoiding confirmation bias.