Category Archives: Conventional Wisdom

The Pros and Cons of Conventional Wisdom


Our previous post shared an example of how what “seemed” to be a sure way of increasing the bottom line turned out to do quite the opposite! It was also a classic example of the perils that are often attached to “conventional wisdom.”

Conventional wisdom is typically considered to be an asset, and in many situations it can be. It speeds up consensus and increases our confidence in our decision making, leaving us to focus our attention on challenges for which there is no conventional wisdom to guide us. And conventional wisdom has much truth within it — having been developed over decades of observations.

But in a dynamic world, when the underlying assumptions shift, we follow conventional wisdom at our peril: it can easily lead an organization to make some big mistakes.

For example, conventional wisdom holds that specialization is good. A person can get very fast and reliable doing the same thing the same way again and again. Henry Ford’s assembly line put this principle to work, breaking the complex craft of auto assembly into a sequence of very specialized jobs that could be easily taught to a relatively unskilled workforce. Assembly line efficiencies put automobile ownership within reach of a much larger portion of the country and made the benefits of specialization part of our national business psyche.

But to achieve the benefits of specialization, you need something increasingly uncommon in today’s world: high volume/low variation work.

A service organization learned this lesson the hard way when it implemented a plan it “thought” would speed up throughput and reduce overtime costs for processing new account applications. It organized its processors into different groups to handle different clients. This enabled each processor to complete an account set-up faster because they could easily memorize the steps and forms for their small group of clients. Nonetheless, the efficiency of the operation as a whole declined substantially: variation in the incoming volume meant one group was swamped and working overtime while another sat nearly idle.

For work that is low volume/high variation, as in the service organization example, specialization tends to reduce throughput. In such an environment, multi-skilled generalists are far more valuable. Specialization may maximize the speed of the individual, but sub-optimize the process as a whole.
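The effect is easy to demonstrate with a toy simulation (all numbers below are hypothetical and the model is deliberately minimal): two specialists, each locked to one client group, versus two generalists drawing from a single shared queue. Demand variation alone makes the pooled arrangement need less overtime.

```python
import random

random.seed(42)

CAPACITY = 50   # applications one processor can handle per day (hypothetical)
DAYS = 250      # roughly one business year

def overtime(demand, capacity):
    """Units that spill past capacity and must be worked as overtime."""
    return max(0, demand - capacity)

specialized_ot = 0  # each group served only by its dedicated processor
pooled_ot = 0       # generalists draw from one shared queue

for _ in range(DAYS):
    # Two client groups with the same average volume but day-to-day swings
    demand_a = random.randint(20, 80)
    demand_b = random.randint(20, 80)

    # Specialized: each processor absorbs only their own group's spike
    specialized_ot += overtime(demand_a, CAPACITY) + overtime(demand_b, CAPACITY)
    # Pooled: one group's slack can absorb the other's spike
    pooled_ot += overtime(demand_a + demand_b, 2 * CAPACITY)

print(f"specialized overtime: {specialized_ot} units")
print(f"pooled overtime:      {pooled_ot} units")
```

Because slack in one group can offset a spike in the other only in the pooled case, pooling can never require more overtime than specialization under this model, and with variable demand it typically requires much less.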

Perhaps Mark Twain summed it up best: “It ain’t what you don’t know that gets you into trouble; it’s what you know for sure that just ain’t so.”

Another Pitfall Akin to Confirmation Bias

Thinking outside the box?

Our previous post shared some thoughts on the pitfall of “confirmation bias,” which is the tendency to pursue and embrace information that matches our existing beliefs.

A somewhat related concept that can, surprisingly, be equally as dangerous is “conventional wisdom,” which has been defined as “the body of ideas or explanations generally accepted as true by the public or by experts in a field.” It is frequently referenced as “inside-the-box” thinking, as opposed to taking an approach that challenges convention (i.e., “outside-the-box” thinking).

But contrary to popular belief (or, to ‘conventional wisdom’ – ha ha!), this seemingly safe practice can be both an asset and a liability!

On the plus-side, conventional wisdom speeds up consensus and increases our confidence in our decision making, leaving us to focus our attention on challenges for which there is no conventional wisdom to guide us. And conventional wisdom has much truth within it — having been developed over decades of observations.

For example, conventional wisdom holds that specialization is good. A person can get very fast and reliable doing the same thing the same way again and again, à la Henry Ford’s production line.

However, while specialization can increase both efficiency and quality when demand is consistent at optimum levels, it can quickly become counterproductive, costly, and even wasteful if the demand for work is uncertain.

For example, a commercial bakery could purchase one large-capacity mixer that can produce 100,000 loaves at far less cost per loaf than two smaller mixers. The large mixer gets its great efficiencies from large batch sizes. But if the market wants variety, none of it ordered in bulk, the large mixer delivers the worst of both worlds: either you produce large batches and scrap whatever the demand does not absorb in time, or you waste the purchased capacity by running batch sizes closer to current demand for each variety. Either way, you can never really produce enough variety for the market, because the equipment produces only one variety at a time.
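The trade-off can be sketched with back-of-the-envelope arithmetic (the demand figures and batch sizes below are invented for illustration): when each variety must be produced in full batches, the big mixer’s minimum run dwarfs the daily demand for any single variety.

```python
import math

# Hypothetical daily demand per variety, in loaves
DAILY_DEMAND = {"rye": 180, "sourdough": 120, "multigrain": 90}
LARGE_BATCH = 500   # minimum economical run on the big mixer (hypothetical)
SMALL_BATCH = 100   # minimum run on a small mixer (hypothetical)

def scrap(demand, batch):
    """Unsold loaves when production must round up to full batches."""
    batches = math.ceil(demand / batch)
    return batches * batch - demand

large_scrap = sum(scrap(d, LARGE_BATCH) for d in DAILY_DEMAND.values())
small_scrap = sum(scrap(d, SMALL_BATCH) for d in DAILY_DEMAND.values())

print(f"big-mixer scrap:   {large_scrap} loaves/day")   # 1110
print(f"small-mixer scrap: {small_scrap} loaves/day")   # 110
```

The per-loaf mixing cost of the big machine may be lower, but under varied, low-volume demand its batch-size floor turns that saving into scrap.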

So like many things in life, when we find ourselves needing to research the marketplace, assess root causes, or study work processes, we must beware of both confirmation bias and its kin conventional wisdom, lest we make sub-optimum (or worse!) choices that feel good at the start but come back to bite us in the end.

Researching with Happy Ears

Confirmation Bias

It has often been said that people tend to “hear what they want to hear.”

In other words, it can be all too easy to accept information or research that confirms what we already believe, and to filter out information or research that does not.

This tendency can also increase over time, as with experience comes wisdom… and confirmation bias, which is the tendency to pursue and embrace information that matches our existing beliefs. A simple pre-existing opinion can easily distort the way our minds absorb confirming and contradictory information.

We are making this point today as a follow-up to our previous post, which focused on devoting more resources to value-added work. Four suggestions were made:

  1. Work On Bottlenecks
  2. Increase Understanding of and Alignment with What Customers Truly Value
  3. Get At The Root Causes
  4. Eliminate the Non-Value-Adding Administrative Work

Note that each of these four approaches requires us to conduct research and to compile facts and data. However, each also involves studying areas of our organization with which we are already familiar, so the likelihood of “confirmation bias” creeping into our research is high.

So, as the saying goes, forewarned is forearmed! Beware of “happy ears.”

Confirmation Bias Part 2: Examples & Avoidance

Our previous post explained the concept of “confirmation bias,” which is the tendency to pursue and embrace information that matches our existing beliefs.

Here are some general examples of how confirmation bias can creep into our day-to-day thinking, and three proven ways to avoid the pitfall:

Decision-Driven Data
As previously noted, the inclination to look for supportive data can easily lead us to serious mistakes. Social scientists report that analyses of investments we favor invariably take on a rosier look than analyses of investments we are doubtful about.

Many small choices go into collecting data, crunching numbers, analyzing opportunity and risk, and presenting results. Absent a conscientious effort to avoid confirmation bias, these small choices, each valid on its own, tend to be made in ways that support our initial opinion. We think we are making data-driven decisions, but we are really collecting decision-driven data.

For example, author Daniel Kahneman once described a study of high-performing schools to determine if size played a role in the quality of educational outcomes. The data indicated that the top quartile in educational performance contained a disproportionate number of small schools, supporting the hypothesis that small schools provided a better quality education. This led to some expensive policy decisions that produced no educational benefit. It turned out that small schools were disproportionately represented in the worst-performing quartile as well, because small samples simply vary more. Larger populations tend to “regress to the mean” (their averages cluster closer to the overall average), so large schools are under-represented in both the top and bottom quartiles.
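A quick simulation makes the statistical effect visible (the school counts, sizes, and score distribution below are entirely hypothetical): even when every student draws from the same score distribution, small schools dominate both the best and worst quartiles, simply because small samples vary more.

```python
import random
import statistics

random.seed(0)

# Every student draws from the SAME score distribution,
# so school size is the only thing that differs between schools.
schools = []
for _ in range(400):
    size = random.choice([20, 500])               # "small" vs "large" school
    scores = [random.gauss(70, 15) for _ in range(size)]
    schools.append((size, statistics.mean(scores)))

schools.sort(key=lambda s: s[1])                  # rank by average score
quartile = len(schools) // 4
bottom = schools[:quartile]
top = schools[-quartile:]

small_in_top = sum(1 for size, _ in top if size == 20)
small_in_bottom = sum(1 for size, _ in bottom if size == 20)
print(f"small schools in top quartile:    {small_in_top}/{quartile}")
print(f"small schools in bottom quartile: {small_in_bottom}/{quartile}")
```

Small schools crowd both extremes of the ranking even though no school is actually better or worse: looking only at the top quartile would “confirm” that small schools outperform.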

First Impressions
Confirmation bias also plays an important role in the inordinate impact of first impressions. A first impression provides a very tiny and possibly serendipitous sample of a candidate’s qualities and qualifications. Yet people who believe a candidate is very intelligent before the interview tend to notice more signs of high intelligence during it.

Here are three things we can do to protect our decision-making process from confirmation bias and potential distortion:

  1. Recognize the bias and remind yourself to look for it in your decisions and analyses. Remind yourself that the authors of everything you read (including this article) are making a point that is supported by the data they present, but not necessarily by the data they do not present, data they may never even have seen if they did not look hard enough for contrary evidence. Remind yourself that the talented and well-meaning people providing you with analysis and recommendations are also subject to confirmation bias. Ask for contrary data.
  2. Ask “what else could it be?” Think creatively about alternative explanations and alternative solutions. Explore the whole feasible set, if possible.
  3. Encourage the expression of contrary views and ideas. “If you value the differences in people, the differences will produce value.” Aggressively seek out and try to understand contrarian views. For many people, the first impulse is to refute contrarian views and argue our own. But the best decisions are likely to be made by those who “seek first to understand rather than be understood.”

Confirmation Bias – Has it Happened to You?


It has happened to most of us. Has it happened to you?

That is, has there been a time when data supported a decision you knew to be the right one, but for some reason or reasons you did not get the outcome you expected?

Perhaps you find an exciting investment opportunity like the winners you have spotted before, but it yields mediocre or poor results. Or despite your experience and successful track record when judging candidates, a person you just “knew” would be a good fit turns out to be a bad hire.

With experience can come wisdom… but also confirmation bias.

Confirmation bias is the tendency to pursue and embrace information that matches our existing beliefs. We tend to seek out and enjoy people who write or say exactly what we think. We gravitate toward these sources not for information but for confirmation.

Researcher and writer Thomas Gilovich posits the “most likely reason for the excessive influence of confirmatory information is that it is easier to deal with cognitively.” It’s easier to think what we think!

Yet confirmation bias in business can be especially hazardous and costly to highly experienced and successful individuals. These minds are adept at spotting patterns, learning from experience, scanning the horizon and connecting the dots. If that describes your talents, take a look at this classic puzzle nicely presented by “The Upshot.”

If you attempted the puzzle, how did you do?

For those who opted out, in this puzzle participants are given a numerical pattern and are asked to determine the underlying rule. The pattern is quite simple, and participants can test their theories as often as they like before specifying the rule. Yet 77% of participants fail to identify the rule because as soon as they find a pattern that supports their theory they conclude it is the correct rule.

In other words, 77% of participants succumb to confirmation bias.

This is a common occurrence in business. When trying to solve problems or make decisions we overwhelmingly look for patterns that support our theories rather than looking for data that would clue us in that we have missed the mark. And with each piece of data that does not refute our theory, we become more confident in our belief.

This exercise shows how people tend to work at proving their theories right, instead of robustly testing the theories to prove them wrong. Once we have seen enough supporting evidence to confirm we are right, it is far more natural for us to fully embrace our premise or idea.
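A few lines of code capture the trap (the hidden rule below is the one from the classic 2-4-6 version of this task; the guessed rule and test triples are illustrative): every positive test of a too-narrow theory passes, and only a deliberately disconfirming test reveals the truth.

```python
def true_rule(seq):
    """Hidden rule from the classic 2-4-6 task: any strictly increasing triple."""
    a, b, c = seq
    return a < b < c

# Confirming strategy: test only triples that fit the guessed rule, "each number doubles".
confirming_tests = [(1, 2, 4), (2, 4, 8), (5, 10, 20)]
print(all(true_rule(t) for t in confirming_tests))  # True -- "doubling" looks correct

# Disconfirming tests: deliberately break the guessed rule.
print(true_rule((1, 2, 3)))  # True  -- doubling was never required
print(true_rule((6, 4, 2)))  # False -- increasing order is what actually matters
```

No amount of confirming evidence can distinguish “doubling” from the broader true rule; a single test designed to fail does it immediately.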

For instance, maybe we are tasked with determining why a certain work process is not being done well. Is the work done less well by inexperienced employees, or when the machine is overdue for maintenance, or when the materials have a certain characteristic?

We could test all three of these ideas with data. But our natural confirmation bias makes us far more likely to look for evidence that the idea we favor is correct than to look for ways it may be mistaken. So, we start testing the idea we think is most likely and as soon as we find enough evidence to support it, we risk diving into the solution and excluding the other possibilities; and we could very well be headed down a path of action that is sub-optimum for our organization.

In our next post we’ll take a closer look at examples of confirmation bias in the workplace and steps that can be taken to avoid it.

Conventional Wisdom & Utilization

As you are most likely aware, “utilization” is a measure of the actual number of units produced divided by the number possible when machines and people work at full capacity.

Conventional wisdom says that the best way to maximize profits is to encourage every department within an organization to achieve 100% utilization. Like so much of conventional wisdom, this has a ring of truth to it; and it has the added beauty of simplicity. We can evaluate and reward each department independently of one another, and if everyone is given incentives to get as close as possible to 100% utilization, then the company will surely be maximally profitable.

But this premise will fail us in the real world… a world riddled with variation.

For example, let’s say a light bulb manufacturer has three operations:
• Glass Blowing
• Filament Insertion
• Cap & Wrap

Utilization of the three departments is 50% in Glass Blowing, 100% in Filament Insertion, and 80% in Cap & Wrap. So where do you focus your improvement efforts? The natural conclusion is to raise utilization in Glass Blowing: either by increasing production (which would simply grow the inventory of bulbs waiting for filament insertion) or by decreasing capacity.

But if you look at the throughput of the process as a whole, you see that Filament Insertion is the bottleneck. At 100% utilization, they are unable to produce enough to keep the next operation, Cap & Wrap, fully utilized. Furthermore, Glass Blowing, despite the lousy utilization numbers, is already piling up inventories of bulbs waiting for filaments. The utilization numbers suggest that Filament Insertion is the last area needing improvement, but to improve the process flow, it must be the first area to improve.

If the world were perfectly predictable, we could reduce the capacity in Glass Blowing and Cap & Wrap to exactly match Filament Insertion and achieve 100% utilization everywhere. But if we did so in ‘Murphy’s world,’ any variation in glass blowing production (machine downtime, absenteeism, yield deterioration, material availability or quality issues) would not only hurt Glass Blowing’s utilization numbers; the bottleneck, Filament Insertion, would also sit idle. Production opportunity lost at the bottleneck is lost forever.

So instead of trying to optimize individual operations, identify the bottleneck and make sure there is enough capacity in the feeder operations that disruptions never starve it. Rather than maximizing utilization at each operation, as conventional wisdom would have us do, we must find and eliminate waste at the ‘bottleneck’ or ‘rate-limiting’ step in order to increase profitability.
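The arithmetic behind the bulb example can be sketched in a few lines (the hourly capacities below are illustrative, loosely consistent with the utilization figures above, with Filament Insertion normalized to 100 bulbs per hour): a serial line ships at the rate of its slowest step, so added capacity anywhere but the bottleneck buys nothing.

```python
# Hypothetical hourly capacities (bulbs/hour) for the three operations
capacities = {"glass_blowing": 200, "filament_insertion": 100, "cap_and_wrap": 125}

def throughput(caps):
    """A serial line can ship no faster than its slowest step."""
    return min(caps.values())

base = throughput(capacities)

# Conventional wisdom: push the low-utilization area (Glass Blowing) harder.
more_glass = dict(capacities, glass_blowing=300)

# Bottleneck thinking: add capacity at Filament Insertion instead.
more_filament = dict(capacities, filament_insertion=120)

print(f"baseline:           {base} bulbs/hr")                       # 100
print(f"improve glass:      {throughput(more_glass)} bulbs/hr")     # 100 -- no gain
print(f"improve bottleneck: {throughput(more_filament)} bulbs/hr")  # 120
```

Adding 50% more glass-blowing capacity leaves shipments unchanged, while a 20% improvement at the bottleneck lifts the whole line by 20%.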