A Different Way to Think About Confirmation Bias

Ethan Milne
5 min read · Jul 26, 2020


In his book The Righteous Mind, Jonathan Haidt describes an odd dynamic that occurs when humans think about new evidence:

“… When we want to believe something, we ask ourselves, ‘Can I believe it?’ Then (as Kuhn and Perkins found), we search for supporting evidence, and if we find even a single piece of pseudo-evidence, we can stop thinking. We now have permission to believe. We have a justification, in case anyone asks. In contrast, when we don’t want to believe something, we ask ourselves, ‘Must I believe it?’ Then we search for contrary evidence, and if we find a single reason to doubt the claim, we can dismiss it.”

This happens all the time. You’ve done it, I’ve done it, everyone does it. It’s just how our minds work. But the world has changed faster than our minds can evolve, so sometimes this tendency leads us astray.

Can I Believe It?

This is what most people think of when they say “confirmation bias”. We look for any evidence — no matter how weak — that confirms our prior intuitions. Tim Urban from the Wait But Why blog describes this as thinking like a sports fan:

[Image from Wait But Why: thinking like a scientist vs. thinking like a sports fan]

Whereas a person thinking like a scientist wants to find the truth even if it contradicts their preferred belief, a sports fan values truth a little bit less and confirmation a little bit more. Note that these are stereotypes: lots of scientists are motivated reasoners, and lots of sports fans, I assume, are rational people.

This sports fan mode of thinking isn’t limited to sports. Imagine Janice, a recent MBA grad trying to start her own business. She wants to revolutionize the world of consumer waste disposal, but first she needs to establish that consumers actually have problems with their existing waste disposal services.

Janice might start by looking at survey data on how consumers rate waste disposal. She sees lots of studies asking people to rate their waste disposal on a scale from 1 to 7, with 1 being Awful and 7 being Amazing. A graph of all these studies might look something like this:
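If you wanted to mock up this kind of chart yourself, here’s a minimal sketch in Python. Every number in it is invented purely for illustration; none of it comes from real survey data.

```python
# Hypothetical illustration: average satisfaction ratings from many
# studies (1 = Awful, 7 = Amazing), mostly clustered around "fine",
# plus a single outlier that says everything is awful.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(seed=0)

# 40 made-up study averages centred near 5.3...
study_means = rng.normal(loc=5.3, scale=0.5, size=40).clip(1, 7)
# ...and the one study Janice will put in her slide deck.
study_means = np.append(study_means, 1.8)

plt.hist(study_means, bins=np.arange(1, 7.5, 0.5), edgecolor="black")
plt.xlabel("Average rating (1 = Awful, 7 = Amazing)")
plt.ylabel("Number of studies")
plt.title("Most studies say fine; one says awful")
plt.show()
```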

So most people are ok with their waste disposal services. But what study do you think Janice looks at? Probably the one study that says everything is awful.

[Image caption: “Why I wasn’t hired by a consulting firm, Example #1092”]

… Her slide deck looks slick, though.

Must I Believe It?

This is another component of confirmation bias, albeit a less direct one. While we may leap at any evidence that confirms our beliefs, we also discount evidence we don’t like.

An example Haidt uses frequently is an experiment in which participants read a study linking coffee drinking to bad health outcomes. Participants who drank coffee regularly were, unsurprisingly, much better at identifying flaws in the study’s design than their decaffeinated counterparts.

“Must I believe it?” is a mindset that’s very hard to recognize as confirmation bias at first. It often shows up as what Scott Alexander calls an isolated demand for rigour: it’s all well and good to criticize a study for bad methodology, or to point out the flaws of treating a single study as proof, but if that rigour is only applied to positions you disagree with, it’s probably “Must I believe it?” applied in a biased manner.

For a sample of Alexander’s post on the subject, here’s an imagined Wild West showdown between philosophers unscrupulously wielding isolated demands for rigour:

The old man stamped his boot in the red dirt, kicking up a tiny cloud of dust. “There’s a new sheriff in town,” he told them.

“No, I’m pretty sure that’s impossible,” says Parmenides. “There’s no such thing as change, only the appearance thereof.”

“Well then,” says the old man, “I reckon you won’t mind the false illusion of your surroundings appearing to change into a jail cell.” And he took out his six-shooter and held it steady.

“Hold on,” said Thales. “We don’t want any trouble here. All is water, so all we did was steal a little bit of water from people. We can give you some water back, and everything will be even, right?” He gestured to a watering trough for horses on the side of the street, which was full of the stuff.

“Just so long as you don’t mind being sprayed with some very hard water from my squirt gun,” the old man answered, and the six-shooter was pointed at the Milesian now.

“Ha!” said Zeno of Elea. “You don’t scare us. In order to hit Thales, your bullet would have to get halfway to him, then half of the remaining distance, and so on. But that would require an infinite number of steps, therefore it is impossible.”

“Sorry,” said the old man, “I couldn’t hear you because it’s logically impossible for the sound waves encoding your speech to reach my ears.” — Scott Alexander, Beware Isolated Demands For Rigor
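A quick aside, mine rather than Alexander’s: the standard answer to Zeno is that infinitely many ever-smaller steps add up to a finite distance, because the geometric series converges:

$$\frac{1}{2} + \frac{1}{4} + \frac{1}{8} + \cdots = \sum_{n=1}^{\infty} \frac{1}{2^{n}} = 1$$

Infinitely many steps, a finite total; the bullet gets there just fine.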

In other words, if you’re going to apply a “must I believe it” mindset, be consistent. Don’t just use it when it’s convenient; you’re only doing yourself a disservice through self-deception.

Why You Should Care

For entrepreneurs, academics, doctors — anyone who needs an accurate picture of reality — a “can I believe it” mindset can be disastrous. It raises the risk of obviously bad startup ideas, biased research, misdiagnoses, and more.

The “must I believe it” mindset is similarly dangerous when it leads us to dismiss ideas asymmetrically. Scientists need to apply it consistently, and entrepreneurs need to be wary of reserving that level of rigour for their critics alone.

Of course, it’s not reasonable to expect anyone to apply either mindset with perfect consistency. There’s a sweet spot, as Tim Urban points out:

[Image from Wait But Why illustrating the sweet spot]

So long as you know where you stand, you’ll be ok.
