It’s hard for me to change my mind. I’m impatient, so when given a choice, like which taco to order, which car to buy, which candidate to hire, I want to decide quickly and then find reasons not to reconsider. It’s painful to step back and start over. And yet, I know this about myself. I try to compensate. But the possibility that I’ve committed to the wrong path sometimes keeps me awake.
My hardest decisions are in software design. I might work for weeks to choose among several options and get my colleagues to agree, and then wonder: am I confident that I chose correctly? Or am I fooling myself because I don’t want to keep working on the problem? If I chose wrong, the consequences could bite us years later.
This is just one situation where I’m aware of my motivated reasoning. There are many others, and even more where I’m not aware. So I was excited by the promise of Julia Galef’s “The Scout Mindset”: to recognize and compensate for motivated reasoning, and to become more skilled at finding the truth. (I was also excited because I have long enjoyed her podcast on rationalism.) Galef’s central metaphors are the “scout mindset” and the “soldier mindset”. A soldier defends her opinions against all comers, but a scout has few preconceptions and seeks only the truth. An army sends a scout with a bad map to reconnoitre, and whenever the scout finds a mistake on her map, she gladly updates it. “Aha! My opinion has become more accurate.”
The best description of motivated reasoning I’ve ever seen comes from psychologist Tom Gilovich. When we want something to be true, he said, we ask ourselves, “Can I believe this?,” searching for an excuse to accept it. When we don’t want something to be true, we instead ask ourselves, “Must I believe this?,” searching for an excuse to reject it. … In contrast to directionally motivated reasoning, which evaluates ideas through the lenses of “Can I believe it?” and “Must I believe it?,” accuracy motivated reasoning evaluates ideas through the lens of “Is it true?”
“The Scout Mindset” is an efficient little book if you share my desire for self-correction. It drew my attention to how much motivated reasoning I engage in, and it encouraged me to be more rigorous. It’s full of little tricks for better thinking. For example, I learned about Chesterton’s Fence, which is a good principle for changing old code:
I try to abide by the rule that when you advocate changing something, you should make sure you understand why it is the way it is in the first place. This rule is known as Chesterton’s fence, after G. K. Chesterton, the British writer who proposed it in an essay in 1929. Imagine you discover a road that has a fence built across it for no particular reason you can see. You say to yourself, “Why would someone build a fence here? This seems unnecessary and stupid, let’s tear it down.” But if you don’t understand why the fence is there, Chesterton argued, you can’t be confident that it’s okay to tear it down.
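Applied to old code, the fence might be a guard clause that looks pointless at first glance. This is a hypothetical sketch (the function, the guard, and its backstory are invented for illustration), but it shows the habit the principle encourages: investigate why the code is there, and record what you find, before tearing it down.

```python
def normalize_price(price_cents):
    """Return a non-negative price in cents."""
    # Chesterton's Fence: this clamp looks unnecessary, but before deleting
    # it, find out why it was built (e.g. `git log -S "price_cents < 0"` to
    # locate the commit that added it). In this invented example, a legacy
    # importer emitted negative values for refunds, and removing the clamp
    # once broke the revenue reports.
    if price_cents < 0:
        return 0
    return price_cents
```

Only once you can explain why the fence was built are you entitled to remove it.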
The book offers some tests for detecting motivated reasoning in myself. I haven’t applied them yet, but they sound promising. For example, the Outsider Test: imagine someone else were in your shoes; what would they do? The Conformity Test: if everyone disagreed with you, would you still believe what you do? The Status Quo Bias Test: would you choose your current situation if it weren’t already the status quo? If not, maybe you should change it. The tests are all in Chapter 5; perhaps I’ll refer back to them the next time I’m making a hard decision, especially in a software design.
I recommend this book to anyone who must make decisions in the face of uncertainty that will affect their lives and the lives of others: that is, I recommend it to everyone.
Further reading: Julia Galef interview with Dylan Matthews.