I recently spent time with the class members of The Fedcap Group’s Leadership Academy. We spent a fair amount of time on the topic of critical thinking. I shared with them my love of the book by Daniel Kahneman, laureate of the Nobel Memorial Prize in Economic Sciences, Thinking, Fast and Slow.
In his book, Kahneman discusses two systems of thinking. System 1 operates automatically and quickly, with little or no effort and no sense of voluntary control. We just know. We may not even know why we know, but we do. We don’t have to think, as this knowing comes fast: that driver is driving erratically, 1 + 1 = 2, the snow is cold. We know it because it is predictable—we have seen it before and we understand it. System 2, by contrast, allocates attention to the effortful mental activities that demand it: complex computations (23 x 65 = ?), figuring out the fastest way to a restaurant, or parking in a narrow space. System 1 can be completely involuntary. You can’t stop your brain from completing 2 + 2 = 4. You can’t un-see optical illusions, even if you rationally know what’s going on.
The theory Dr. Kahneman posits is that System 1 automatically generates suggestions, feelings, and intuitions for System 2. If endorsed by System 2, intuitions turn into beliefs, and impulses turn into voluntary actions. What gets interesting is when he discusses the trouble that can arise when a “lazy” System 2 accepts what the faulty System 1 gives it, without questioning, and how that leads to cognitive biases.
Without spending much more time on the book itself, the concept for me is foundational. There are things that, for whatever reason, one believes to be true. They may be, they may not be, but for that person, they are. And when one accepts this “truth” as fact, they often stop being curious. They stop imagining that something could be different from what they believe and, as a result, cognitive biases develop.
While I would not have structured the discussion with the brilliance of Dr. Kahneman, every day I see the impact on people who have come to believe something to be “truth.” I see what happens due to the lack of curiosity, the lack of asking questions. And I see the mistakes that result from thinking we know. This is why, when people come to me, they rarely get a simple yes or no. If they want me to give them an answer about something, they can expect that I will ask many, many questions. As frustrating as this can be, I want to understand, and in order to understand, I have to ask questions.
I believe that it is incumbent upon leaders to role model this kind of critical thinking—every day. We need to create a climate of curiosity, one that by its very nature invites questions and potentially disrupts cognitive bias…even our own.
As always, I welcome your thoughts.