
WYSIATI?

Continuing the theme of keeping improvement projects on track, CI leaders should be very careful to avoid falling prey to “theory blindness.”

Theory blindness is an “expensive” pitfall that exacts a huge economic toll on organizations of all types and sizes. In some cases it leads companies to invest in expensive solutions that completely miss the real cause. In other instances, organizations live with costly problems for years because of a shared but erroneous theory about the cause of the problem.

Psychologist Daniel Kahneman (one of the few non-economists to win the Nobel Memorial Prize in Economic Sciences) describes the phenomenon in his book, Thinking, Fast and Slow.

Drawing on decades of research, he shows that the human brain is wired to apply a number of biases, theory blindness being one of them. Understanding these biases gives us the tools to overcome them.

The most powerful mental bias underlying a great deal of flawed decision making is what he calls WYSIATI, an acronym for “what you see is all there is.” It occurs because we are inordinately influenced by what we see and greatly undervalue information we do not have. As a result, paradoxically, the less we know, the more sure we are of our conclusions.

Based on research and many years of experience, we’ve determined that the best way to avoid theory blindness is to rigorously adhere to an improvement process, one that includes a comprehensive method of identifying and quantifying root causes and the real waste.

The less we know… the more we think we’re right!


Just as continuous improvement teams face hidden perils associated with confirmation bias (see related post), there is another frequently unrecognized pitfall that plagues many a project and merits our constant attention.

“Theory blindness” is a remarkably common condition in which our theory about the way the world works blinds us to the way the world really works.

When afflicted, we readily accept evidence (however meager or flawed) that supports our assumption or theory, and we explain away or simply fail to notice or correctly interpret evidence that contradicts it.

Daniel Kahneman is an Israeli-American psychologist notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences.

In his book, Thinking, Fast and Slow, he suggests the human brain is wired to apply a number of biases, theory blindness being one of them. The impact of theory blindness is that we are inordinately influenced by what we see, and greatly undervalue information we do not have. As a result, paradoxically, the less we know, the more sure we are of our conclusions.
It’s just how we are wired.

The Less We Know…
When engaged in improvement projects, it is important to maintain an open mind and a heightened awareness of theory blindness, lest we fall prey to it and “assume” things that just aren’t so.

But beware!

Confidence, it turns out, depends much more on coherence (whether all the information at hand points to the same conclusion) than on completeness. Thus, the less we know, the less likely we are to be right, but the more likely we are to think we are right!