A Skeptical Look at Confirmation Bias

How powerful is confirmation bias? As skeptic Michael Shermer says: “It’s the mother of all biases.” In truth, we all believe in things we can’t prove. If you’re intelligent, well-educated, and good at pulling facts, you’re going to be even better at confirming your own beliefs than someone who’s less intelligent or well-educated.

Our brains are wired so that we make better lawyers than we do scientists. A lawyer looks for and uses evidence that supports their client while ignoring, or flat-out refusing to use, evidence that points the other way. A scientist, at least in principle, is supposed to follow the evidence wherever it leads.

I know — depressing.

We only want to watch television shows or read articles or listen to podcasts that confirm what we already believe — what we already think we know.

Right now, you might be thinking, “I don’t do that!” Well, the reality is that you probably do. Because everybody does.

It’s called confirmation bias, and it’s one of the most powerful biases we have. Confirmation bias means that whatever we want to be true, we end up finding to be true.

This is why Michael Shermer is a skeptic.

Now, skeptics aren’t deniers. Skepticism is simply a position you take, or don’t take, on a claim based on the evidence.

For example, it’s not that Bigfoot doesn’t exist, or can’t exist. Maybe it does. But show me the evidence. Same with Santa Claus. Same with God.

Skepticism is a scientific way of thinking, and seeing as the entire scientific process — for centuries — has been all about working around the different kinds of biases we have, it might not be the worst way of thinking to adopt — especially if you consider yourself to be quite intelligent. Because here’s the thing: if you’re intelligent, well-educated, and good at pulling facts, you’re going to be even better at confirming your own beliefs than someone who’s less intelligent or well-educated.

I can hear some of you now: “What? That can’t be true!” Ah, the irony.

It’s good to remember that intelligent and well-educated people are better at confirming their own beliefs, because we tend to think it’s those other people, those idiots over there, who believe in nonsense and absurdities. But the reality is that we all believe in things we can’t prove.

It’s also good to actually take some action on this. For example: on social media, we primarily follow people we agree with. There’s that confirmation bias again. But when we do this, we only get one side of the story, only half of the information that’s out there. So try following people and companies and news outlets that you know you don’t agree with, so you can stop being so shocked about what those other people actually believe, and start understanding that yes, those other people actually do believe those things. You don’t have to agree, just understand. Because let’s be real: it’s unlikely that all those other people are evil or stupid or both. Maybe they’re simply misguided. Just like they think you are.

Here’s an example of the confirmation bias: if you tell someone the person they’re about to meet is extroverted, they’re more likely to describe them as extroverted. If you tell someone the person they’re about to meet is introverted, they’re more likely to describe them as introverted. Because they’re only looking for those qualities. Not consciously, but unconsciously.

Another example: a health care reform bill is shown to self-identified Democrats. If these Democrats are told the bill was written by a Democrat, they praise it, like it, and find fewer errors in it. If they’re told it was written by a Republican, they rip it to shreds. And vice versa for Republicans, of course.

They’re reading the exact same thing. That’s how powerful the confirmation bias is. As Michael Shermer says: “It’s the mother of all biases.”

If an animal hears a rustle in the grass, it might be a predator. Of course, it might just as easily be the wind, or a smaller animal, or nothing at all. But if it is a predator, that would be dangerous. The end of the animal’s life, probably. It would be removed from the gene pool forever. So it’s much better, in evolutionary terms, to assume that every rustle in the grass is a predator, because the cost of not assuming it’s a predator is too high.

You might be thinking, “But why can’t this animal collect more data about these rustles in the grass? Why couldn’t it have evolved the capacity to do a more rational analysis?” Well, because that would take more time, and a predator doesn’t wait around for its prey to make a decision. Assuming is what keeps an animal alive, and therefore gives it more of a chance to reproduce — and so it’s the animals that developed the ability to assume and make decisions rapidly that were more effective at reproducing.

This is why, as research in rapid cognition confirms, we humans make our decisions intuitively most of the time.

It’s like buying a house. You make a list of all the things you want, and then usually it comes down to a feeling. “I don’t know why, but I just like this one.”

We’re lucky our emotions evolved like this, because we don’t have the time to process information like a computer does, weighing every single characteristic of every house we look at or considering every variable of every toothpaste we’re thinking of buying. Imagine if we had to do that to make a decision. We’d never buy or do anything, would we? And this is confirmed in studies of people with damage to the emotional centers of their brains: they become like Star Trek’s Mr. Spock. Everything is utterly rational. And because of that, they can barely get out of bed in the morning. Or they’ll go to buy toothpaste, face 25 options, and be unable to choose, because there’s too much information and they have no way to pick the one that looks or feels right.

That’s why it’s good and important to be able to make decisions intuitively. To be able to buy the house that feels right.

Reading all of this, you might be thinking, “What can we do about this, then? How do we counteract this?”

Well, one good thing is that when you teach people about confirmation bias, they become adept at seeing it in other people. So you’ll now find yourself spotting it in everyone around you. Congratulations.

Unfortunately — but maybe not surprisingly — it’s still almost impossible to see in ourselves. It’s like some cruel game, isn’t it? And to make it even worse, when we’re told we’re wrong, that our belief just isn’t true, we tend to double down on that belief. We find even more reasons (and excuses) to believe in it.

I know. Seems kind of hopeless, right?

For now, maybe. But we used to believe the world was flat, after all. Science always progresses. At least that’s confirmed, right?

And thank goodness for Michael Shermer, who’s making it a little less hopeless for us all.

Listen to the full podcast episode at The Art of Charm here: Michael Shermer | Why We Believe Weird Things (Episode 531)

Tidbits from the podcast not discussed in this article:

  • How the spoon-bender Uri Geller started the modern skeptic movement.
  • John Edward and the paranormal.
  • How to “cold read” someone (I can confirm that it’s much easier than you probably think).