The less we know… the more we think we’re right!


Just as continuous improvement teams face hidden perils associated with confirmation bias (see related post), there is another frequently unrecognized pitfall that plagues many a project and merits our constant attention.

“Theory blindness” is a remarkably common condition in which our theory about the way the world works blinds us to the way the world really works.

When afflicted, we readily accept evidence (however meager or flawed) that supports our assumption or theory, and we explain away or simply fail to notice or correctly interpret evidence that contradicts it.

Daniel Kahneman is an Israeli-American psychologist and economist notable for his work on the psychology of judgment and decision-making, as well as behavioral economics, for which he was awarded the 2002 Nobel Memorial Prize in Economic Sciences.

In his book, Thinking, Fast and Slow, he suggests the human brain is wired to apply a number of biases, theory blindness being one of them. The impact of theory blindness is that we are inordinately influenced by what we see, and greatly undervalue information we do not have. As a result, paradoxically, the less we know, the more sure we are of our conclusions.
It’s just how we are wired.

The Less We Know…
When engaged in improvement projects, it is important to maintain an open mind and a heightened awareness of theory blindness, lest we fall into this pitfall and "assume" things that just aren't so.

But beware!

Confidence, it turns out, depends much more on coherence (whether all the information at hand points to the same conclusion) than on completeness. Thus, the less we know, the less likely we are to be right, yet the more likely we are to think we are right!