
Confirmation Bias Part 2: Examples & Avoidance

Our previous post explained the concept of “confirmation bias,” which is the tendency to pursue and embrace information that matches our existing beliefs.

Here are some general examples of how confirmation bias can creep into our day-to-day thinking, and three proven ways to avoid the pitfall:

Decision-Driven Data
As previously noted, the inclination to look for supportive data can easily lead us to serious mistakes. Social scientists report that analyses of investments we favor almost invariably take on a rosier look than analyses of investments we are doubtful about.

Many small choices go into collecting and crunching data, analyzing opportunity and risk, and presenting results. Absent a conscientious effort to avoid confirmation bias, those small choices — each valid on its own — tend to be made in ways that support our initial opinion. We think we are making data-driven decisions, but we are really collecting decision-driven data.

For example, author Daniel Kahneman once described a study of high-performing schools that set out to determine whether size played a role in the quality of educational outcomes. The data indicated that the top quartile in educational performance contained a disproportionate number of small schools, supporting the hypothesis that small schools provided better-quality education. This led to some expensive policy decisions that produced no educational benefit. It turned out that small schools are disproportionately represented in the worst-performing quartile as well, because averages computed from small populations are simply more variable: large schools cluster closer to the mean (they are more “average”) and so are under-represented in both the top and bottom quartiles.
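A quick simulation makes the point. The school sizes and score distribution below are assumptions for illustration, not figures from the study; the sketch simply shows that averages over smaller groups swing more widely, so small schools land in both extreme quartiles more often than large ones.

```python
import random

random.seed(0)

def school_average(n_students):
    """Average test score of a school whose students are all drawn from the same distribution."""
    return sum(random.gauss(500, 100) for _ in range(n_students)) / n_students

# 500 small schools (50 students each) and 500 large schools (1,000 students each)
schools = [("small", school_average(50)) for _ in range(500)] + \
          [("large", school_average(1000)) for _ in range(500)]

schools.sort(key=lambda s: s[1])          # sort by average score
q = len(schools) // 4
bottom, top = schools[:q], schools[-q:]   # bottom and top quartiles

print("small schools in top quartile:   ", sum(1 for kind, _ in top if kind == "small"))
print("small schools in bottom quartile:", sum(1 for kind, _ in bottom if kind == "small"))
```

Running this, the small schools dominate both the top and the bottom quartile, even though every student was drawn from the same distribution.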

First Impressions
Confirmation bias also plays an important role in the inordinate impact of first impressions. A first impression provides a very small and possibly unrepresentative sample of a candidate’s qualities and qualifications. Yet interviewers who believe a candidate is highly intelligent before the interview tend to notice more signs of high intelligence during it.

Here are three things we can do to protect our decision-making process from confirmation bias and potential distortion:

  1. Recognize the bias and remind yourself to look for it in your decisions and analyses. Remind yourself that the authors of everything you read (including this article) are making a point that is supported by the data they present, but not necessarily by the data they do not present — data they may never have seen if they did not look hard for contrary evidence. Remind yourself that the talented and well-meaning people providing you with analysis and recommendations are also subject to confirmation bias. Ask for contrary data.
  2. Ask “what else could it be?” Think creatively about alternative explanations and alternative solutions. Explore the whole feasible set, if possible.
  3. Encourage the expression of contrary views and ideas. “If you value the differences in people, the differences will produce value.” Aggressively seek out and try to understand contrarian views. For many of us, the first impulse is to refute contrarian views and argue our own. But the best decisions are likely to be made by those who “seek first to understand rather than be understood.”

Confirmation Bias – Has it Happened to You?


It has happened to most of us. Has it happened to you?

That is, has there been a time when data supported a decision you knew to be the right one, but for some reason you did not get the outcome you expected?

Perhaps you find an exciting investment opportunity that looks like the winners you have spotted before, but it yields mediocre or poor results. Or, despite your experience and successful track record in judging candidates, a person you just “knew” would be a good fit turns out to be a bad hire.

With experience can come wisdom… but also confirmation bias.

Confirmation bias is the tendency to pursue and embrace information that matches our existing beliefs. We tend to seek out and enjoy people who write or say exactly what we think. We gravitate toward these sources not for information but for confirmation.

Researcher and writer Thomas Gilovich posits the “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively.” It’s easier to think what we think!

Yet confirmation bias in business can be especially hazardous and costly to highly experienced and successful individuals. These minds are adept at spotting patterns, learning from experience, scanning the horizon and connecting the dots. If that describes your talents, take a look at this classic puzzle nicely presented by The Upshot.

If you attempted the puzzle, how did you do?

For those who opted out, in this puzzle participants are given a numerical pattern and are asked to determine the underlying rule. The pattern is quite simple, and participants can test their theories as often as they like before specifying the rule. Yet 77% of participants fail to identify the rule because as soon as they find a pattern that supports their theory they conclude it is the correct rule.

In other words, 77% of participants succumb to confirmation bias.

This is a common occurrence in business. When trying to solve problems or make decisions we overwhelmingly look for patterns that support our theories rather than looking for data that would clue us in that we have missed the mark. And with each piece of data that does not refute our theory, we become more confident in our belief.

This exercise shows how people tend to work at proving their theories right, instead of robustly testing the theories to prove them wrong. Once we have seen enough supporting evidence to confirm we are right, it is far more natural for us to fully embrace our premise or idea.
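Here is a minimal sketch of that difference, in the spirit of the puzzle above. The hidden rule and the pet “doubling” theory are illustrative assumptions, not the actual puzzle; the point is that only tests our theory predicts should fail can reveal that the theory is wrong.

```python
def hidden_rule(seq):
    """Hypothetical hidden rule: each number is larger than the one before."""
    return all(a < b for a, b in zip(seq, seq[1:]))

def doubling_theory(seq):
    """Our pet theory: each number is double the previous one."""
    return all(b == 2 * a for a, b in zip(seq, seq[1:]))

# Confirmation-biased testing: only try sequences our theory predicts will pass.
# Every answer comes back "yes," so we grow more confident -- and stay wrong.
confirming_tests = [[2, 4, 8], [3, 6, 12], [5, 10, 20]]

# Disconfirming testing: also try sequences our theory predicts will FAIL.
# When the hidden rule still says "yes," we learn the theory is wrong.
disconfirming_tests = [[1, 2, 3], [2, 3, 50], [10, 9, 8]]

for seq in confirming_tests + disconfirming_tests:
    print(seq, "theory:", doubling_theory(seq), "hidden rule:", hidden_rule(seq))
```

Only the disconfirming tests ever expose the gap between the theory and the rule; the confirming tests simply pile up reassurance.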

For instance, maybe we are tasked with determining why a certain work process is not being done well. Is the work done less well by inexperienced employees, or when the machine is overdue for maintenance, or when the materials have a certain characteristic?

We could test all three of these ideas with data. But our natural confirmation bias makes us far more likely to look for evidence that the idea we favor is correct than to look for ways it may be mistaken. So we start testing the idea we think is most likely and, as soon as we find enough evidence to support it, we risk diving into the solution and excluding the other possibilities; we could very well be headed down a path of action that is suboptimal for our organization.

In our next post we’ll take a closer look at examples of confirmation bias in the workplace and steps that can be taken to avoid it.

Conventional Wisdom & Utilization

As you are most likely aware, “utilization” is a measure of the actual number of units produced divided by the number possible when machines and people work at full capacity.
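For example (with illustrative numbers): a department that could turn out 1,000 units per shift at full capacity but actually produces 800 is running at 800 ÷ 1,000 = 80% utilization.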

Conventional wisdom says that the best way to maximize profits is to encourage every department within an organization to achieve 100% utilization. Like so much of conventional wisdom, this has a ring of truth to it, and it has the added beauty of simplicity: we can evaluate and reward each department independently, and if everyone is given incentives to get as close as possible to 100% utilization, then the company will surely be maximally profitable.

But this premise will fail us in the real world… a world riddled with variation.

For example, let’s say a company has three operations:
• Glass Blowing
• Filament Insertion
• Cap & Wrap

Utilization of the 3 departments is 50% in Glass Blowing, 100% in Filament Insertion, and 80% in Cap & Wrap. So where do you focus your improvement efforts? The natural conclusion is that you would focus on increasing utilization in Glass Blowing: either by increasing production (which would simply increase the inventory of bulbs waiting for insertion) or by decreasing capacity.

But if you look at the throughput of the process as a whole, you see that Filament Insertion is the bottleneck. At 100% utilization, they are unable to produce enough to keep the next operation, Cap & Wrap, fully utilized. Furthermore, Glass Blowing, despite the lousy utilization numbers, is already piling up inventories of bulbs waiting for filaments. The utilization numbers suggest that Filament Insertion is the last area needing improvement, but to improve the process flow, it must be the first area to improve.

If the world were perfectly predictable, we could reduce the capacity in Glass Blowing and Cap & Wrap to exactly match Filament Insertion and achieve 100% utilization everywhere. But in ‘Murphy’s world,’ any variation in Glass Blowing production — such as machine downtime, absenteeism, yield deterioration, material availability or quality issues — will not only hurt Glass Blowing’s utilization numbers; it will also leave the bottleneck, Filament Insertion, idle. Production opportunity lost at the bottleneck is lost forever.

Instead of trying to optimize individual operations, identify the bottleneck and make sure there is enough capacity in the feeder operations so that disruptions do not impact the utilization of the bottleneck. Instead of aiming to maximize utilization at each operation, as conventional wisdom would have us do, we must find and eliminate waste at the ‘bottleneck’ or ‘rate-limiting’ step in order to increase profitability.
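A minimal sketch of the arithmetic, using assumed capacities chosen to reproduce the utilization figures above: in a serial process, throughput is set by the slowest step, so raising utilization elsewhere does not raise output.

```python
# Assumed capacities (units per hour each department COULD produce); these are
# illustrative numbers picked to match the 50% / 100% / 80% example above.
capacities = {
    "Glass Blowing": 200,
    "Filament Insertion": 100,   # the bottleneck
    "Cap & Wrap": 125,
}

# In a serial process, the whole line can only move as fast as its slowest step.
throughput = min(capacities.values())

for dept, cap in capacities.items():
    print(f"{dept:18} capacity {cap:4}/hr  utilization {throughput / cap:.0%}")

print("Process throughput:", throughput, "units/hr")
# Adding capacity at Glass Blowing or Cap & Wrap improves their utilization
# numbers but not the output; only added capacity (or less waste) at
# Filament Insertion raises throughput.
```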

Decision-making Pitfalls: Part 3

4 Pitfalls to Avoid

Our previous two posts focused on the decision-making process, as outlined in a Wall Street Journal article by Robert I. Sutton, a professor in the department of management science and engineering at Stanford University. The premise is that “how” leaders make decisions is just as important as the decisions themselves.

In his article, Sutton identified four bad habits associated with “how” bosses make decisions. As discussed in our previous two posts, the first two of these pitfalls are:

  • Telling people they have a voice in decision-making when, in reality, they don’t
  • Treating final decisions as anything but

The final two habits to be avoided are:

  • Moving too fast: While some leaders suffer from indecision and procrastination, others err in the opposite direction. Some decisions require more careful thought — “especially risky, important and complicated ones that are costly (or even impossible) to reverse,” Sutton says. Employees generally like working with managers who are confident and don’t waste time, but they are also leery of snap decisions, which are more likely to turn out wrong. Snap decisions are also more likely to undermine employees’ faith in their leader and in the decision itself, and can make employees less motivated to implement it. It’s the difference between a smart, confident decision and a rash one made without proper research or sufficient facts and data.
  • Using decision-making as a substitute for action: “A decision by itself changes nothing,” says Sutton. Simply “deciding” to change a protocol or process doesn’t help unless someone actually does it! The gap between “knowing” and “doing” is real, yet too many leaders act as if, once they make a decision and perhaps spread the word, their work is done.

Decision-Making Pitfalls: Part 2

Decision-Making Pitfalls

Our previous post shared data from a Wall Street Journal article about decision-making, which indicated that the way in which leaders make decisions (the process) is just as important as what decisions they make.

In that article, author Robert I. Sutton described four specific pitfalls associated with the decision-making process that can compromise a leader’s effectiveness as well as the effectiveness and attitudes of people throughout the organization.

The first of these pitfalls, which was the subject of our previous post, involves telling people they have a voice in decision-making when, in reality, they don’t.

Next on the list is the poor habit some leaders have of “treating final decisions as anything but!”

“Many insecure bosses have a habit that is especially damaging: After a decision has been made and communicated and implementation has begun, their insecurity compels them to revisit the choice too soon and too often. A few complaints, a small early setback, or simply anxiety about the decision can provoke such unnecessary reconsideration.”

Sutton goes on to explain that the insecurity and waffling “infects their teams.”  In addition, many of the people involved lose faith in their leaders’ ability to make good decisions, and also lose interest in implementing new directives that could soon become subject to change.

We will take a look at two additional decision-making pitfalls in our next post.

Decision Making Pitfalls – Part 1

In a recent Wall Street Journal article, Robert I. Sutton, a professor in the department of management science and engineering at Stanford University and co-author of “Scaling Up Excellence,” shared some interesting and important insight into decision-making.

In his article, Sutton makes several points consistent with the fact that all work (including decision making) is part of a process, and every process can be improved.

For example, he first explains that in organizations of all types, how  leaders make decisions (the process) is just as important as what decisions they make.

Sutton then described four specific pitfalls associated with the decision-making process that can compromise a leader’s effectiveness as well as the effectiveness and attitudes of people throughout the organization.

The first of these pitfalls involves telling people they have a voice in decision-making when, in reality, they don’t.

“Good decision-making entails consulting key stakeholders—and using their input to shape final choices,” Sutton said.  “Doing so improves the quality of the decisions, and makes employees more motivated to implement them.”

Unfortunately, in too many cases the consultation of others is only make-believe… it starts out looking like the real thing, but in the end leaders are just pretending that others’ input has some influence over the final decision.

While the motivating force behind make-believe consultation can vary — some bosses do it to fool people into getting behind the decision’s implementation, and others because they think the mere opportunity to voice opinions somehow makes people feel better — the reason matters little. In the end, pretending to consult others for decision-making purposes and then ignoring their input is demoralizing. Further, the associated deception and disrespect often causes employees or stakeholders to lose faith in their leaders.

In upcoming posts we’ll look at three additional pitfalls related to “how” decisions are made, and how each impacts all of the people involved.

“Classified” Decisions

A past post focused on how we make decisions, and noted that while people often think they could make better decisions if they had more facts and data, in practice the presence of “too much information” often complicates decision-making.

In fact, behavioral economists report that “data driven” decisions tend to increase confidence in the decision far more than the quality of the decision.

While the above-mentioned post shares a five-step process for making critical or complex decisions, a simpler approach might serve equally well for certain decisions. Consequently, we might want to first consider the “type” of decision we face. In other words, if we “classify” the decision first, we can then proceed more strategically.

For example, in their Harvard Business Review article “A Leader’s Framework for Decision Making,” authors David Snowden and Mary Boone explain that decisions can be categorized by context:

  • Known knowns: That is, we know what information we need in order to make a good decision and we can acquire that information. These decisions can be mapped out with simple decision-trees to reliably and quickly produce good outcomes. For example, a supplier selection process can be mapped out and reliably executed to produce good results.
  • Known unknowns: The problem is knowable, but not simple. We require an expert to gather and process the information to arrive at a reliably good decision. Decisions about how to design a website to maximize traffic or where to position a power plant relative to cooling sources are examples of “known unknowns.” Snowden and Boone refer to these as “complicated decisions — the domain of experts.”
  • Unknown unknowns: Complex decisions, in which we may not even know all the right questions, are increasing in frequency. Many strategic decisions organizations face today carry a great deal of uncertainty. Since best practices are by definition past practices, we have little to go on when faced with unknown unknowns. Thus the more detailed 5-step decision-making process outlined in the above-referenced post can help us achieve the best results.