I’ve been thinking a lot recently about what it means to have an open mind.
It’s hard.
Background Reading
Before I go any further, I’m going to ask you to look at two things other people have created.
First, this brilliant comic by The Oatmeal, which describes the “Backfire Effect.” I’m not going to summarize it; you really need to read it, if you haven’t already. In a moment, I’m going to assume you’re familiar with the backfire effect.
And then there’s one from the early dark ages of the Internet (2002) that in my opinion deserves a place in the Internet Hall of Fame. This essay by a man named Glenn Morton, about something he calls “Morton’s Demon,” is focused on religious belief, but reading it was the first time I truly understood what an insidious force confirmation bias is. You may think you understand confirmation bias – but you probably don’t, not really. Give it a read, if you haven’t.
Going forward, I’m going to assume that you have a working knowledge of the backfire effect, confirmation bias, and cognitive dissonance.
Open Mindedness is a Verb
So I suspect everybody thinks they have an open mind. It just means having a childlike curiosity, right? It means being open to the wonder of the world, being ready for lifelong learning – it’s just a great thing, and we all should sit back and enjoy having it.
Nope. It’s none of that.
Open mindedness is hard work.
The problem is that after you’ve learned something, really learned it, it embeds in your worldview. Everything else that comes along is interpreted against that conception.
It’s uncomfortable when you encounter something that doesn’t fit; it triggers the backfire effect. Encountering something that really doesn’t fit, or worse, that makes you realize that you’re harming others, can cause anguish. Strong backfire.
So that’s where the hard work of open mindedness starts. If you want an open mind, you need to push through that discomfort, you need to push through that anguish, and decide if it’s true. You need to compare your cherished beliefs against what the world is telling you.
If you aren’t feeling discomfort, you aren’t doing the work. If you don’t learn to recognize the backfire effect and discover that it’s bracing – exciting, even – the first sign that you’re about to achieve real insight…
Then you don’t have an open mind.
Open Mindedness is never comfortable
The other problem with trying to keep your mind open is that we typically don’t notice when things come along that should make our jaws drop. We see them, we process them, we can even apply them – like Glenn Morton did with the fossil record – but because it conflicts with our world view, Morton’s Demon snaps the door shut and we don’t notice that it undermines our current understanding of the world.
We need other people to tell us when we’re overlooking important things. And when those smart people do so, it will trigger the backfire effect. Always. It’s wired into our amygdala.
So if you encounter a smart person who disagrees with you, you can choose to ignore or mock or dismiss them. That’s a nice, comfortable place to live.
Or you can engage with them, mutually explore the painful or even scary places, and then try to synthesize a world view that reconciles the differences.
But that’s hard, uncomfortable work. If you aren’t doing it, you don’t have an open mind.
An open mind isn’t a blank check
Starting in the 1950s, there has been a blizzard of research telling us that it’s hard to think independently. The classics in the field are known as the Asch conformity studies. The basic study examined a group that was told they were participating in a vision study, and asked to choose – aloud – which of three options matched an example (“which of these three lines is the same length as this one?”). The truth was that only one member of the group was an actual subject; that sucker always went last, and the rest all sometimes intentionally chose wrong. Unsurprisingly, the test subject often went along with the group: “I honestly thought I must be mistaken,” or “I knew they were wrong, but I didn’t want everybody to think I was weird.”
It’s easy to convince ourselves that we wouldn’t fall for that. It’s easy to believe that we’re among those who are self-confident, independent, clear thinkers, and we’d totally stand up for what we see or believe, no matter what. But, really… no. Probably not.
Are you really sure you’d be the one person in the audience who didn’t stand up? For myself, I hope so… but I can’t be sure. Don’t want to hurt anybody’s feelings, right? Don’t want to look weird.
But we need to try. We all want to be that person.
Which means that while we need to seek to disconfirm our biases, we cannot blithely accept every theory that comes our way; down that road lies a feckless society that believes nothing at all. We need to be particularly suspicious of ideas that come with a group of people who will mock or worse exclude us for challenging their ideas, or the status quo.
We must simultaneously be open to new ideas, but also be skeptical of them.
“Enthusiastic skepticism is the perfect partner to an open mind.”
— Astro Teller
How do we find that balance?
It turns out, we have the tools we need. It’s actually not hard: we use science.
Well, not science per se, but the scientific method – or if not that, the science community.
See, scientists have been fighting this battle for many, many generations. And they’ve developed some pretty good tools for doing so. There are two key elements:
Present testable hypotheses, and then test them.
That seems straightforward, but it’s amazing how many people fall back on logic, argument, philosophy, rhetoric, and overt sophistry. Try to ignore all that. Always seek the data; more than that, seek the test.
Don’t bother arguing with sophists. Just ask for the data.
Look for the repeatable experiment.
By the way, it’s not enough to just show data. The scientific community has been rocked, many times, by experiments that seem to imply amazing things (anybody old enough to remember when cold fusion was just around the corner?) but then nobody can repeat the results. This is as often due to error as malice, but there are plenty of examples of malice.
Which means that data is not enough; it is the repeatability of the experiment that proves the hypothesis. And if you can’t repeat the experiment yourself, you have to lean on people who have. You have to trust the scientists.
Great! Solved!
Uh… no.
It turns out that scientists are people, and they have an amygdala just like everybody else. Scientists have egos, and biases, and sometimes even bills. There are any number of examples of scientific bias either due to a rush for credit, or because the source of funding had a particular goal for the research. The entire anti-vaccination movement sprang up because of a single, utterly fraudulent paper (that has since been retracted).
In a more egregious example, consider the case of Philipp Lenard and Albert Einstein. The two were peers, both brilliant German physicists. Lenard won the Nobel Prize in physics in 1905; Einstein referenced Lenard’s work in his work on relativity. But Lenard’s history turns dark later in life. He became a Nazi and was the “Chief of Aryan Physics.” He rejected work by non-Germans or non-Aryans (you know, like Einstein, who was Jewish). “Ironically, [his] disdain for ‘Jewish physics’ was one of the main reasons [the Nazis were unable to] develop nuclear weapons.” He rejected an entire body of important scientific insight because he was blinded by his own prejudices.
So there is no inoculation against bias. Still, there is no better approach than science; over the long haul, it has proven to be pretty resilient.
Don’t Feed the Trolls
So that may all seem pretty obvious, but I hope you can see that it’s really hard.
The real point here is that on today’s internet, I almost never see people cite original research. It’s rare enough to see citations at all, but more often than not those citations lead to a Twitter thread that summarizes an article that turns out to have been posted on a self-proclaimed “parody” website. Or to a post that extracts half of a graph from a well-researched, peer-reviewed study to support a position that completely misrepresents the intent of the research. Chase citations back to the source.
And if you don’t have time to do that, find people who do. Don’t listen to random people on the web – no matter their credentials – unless and until they have established that they do their homework.
Otherwise, all you’re doing is amplifying their bullshit.