Podcast Summary: EconTalk – David McRaney on How Minds Change

The duration of the original EconTalk podcast with David McRaney is 1 hour 28 minutes.
The estimated reading time for this podcast summary (excluding “My Thoughts”) is less than 10 minutes.


Key Takeaways

  • Traditionally, people thought that you could change people’s minds by giving them more facts and information. This was known as the information deficit hypothesis.
    • The information deficit hypothesis is wrong. Everyone engages in motivated reasoning. Our beliefs are not formed by a careful analysis of facts and logic.
    • Telling someone they’re wrong generates reactance and threatens their sense of autonomy. It’s more likely to backfire, causing them to do the opposite.
  • Changing someone’s mind requires cognitive empathy.
    • You need to get out of the “debate” mindset where you are looking to beat the other person, and into a “cooperative” mindset where you are working together to find the truth.
    • This could mean you find your own beliefs to be wrong – you have to be open to this.
  • There are a variety of effective conversational techniques, such as street epistemology and deep canvassing.
    • Though they were developed independently, all techniques involve basically the same steps in the same order (the steps are outlined below).
    • The goal is not to convert the other person to your view, but to encourage them to form a more complex, nuanced view that’s grounded in good epistemics. This preserves the other person’s agency.
    • These techniques are not magic bullets that will cause complete 180s. More often, they work incrementally, gradually nudging a person towards better-founded beliefs.
    • However, studies have shown that deep canvassing, at least, is very successful in generating lasting change.

Detailed Summary

Our beliefs are based on cherry-picked “facts”

Sharing facts is not very effective in getting people to change their minds.

The information deficit hypothesis is the idea that people will come to the same beliefs if they have access to the same information. It’s a really old idea:

  • The Founding Fathers of the US thought that building public libraries would make democracy reach this utopian ideal.
  • The 19th century rationalist philosophers similarly thought public education was a panacea.
  • People later thought the Internet would be the answer.

Unfortunately, the hypothesis seems to be wrong. There’s a lot of research in psychology, neuroscience and other social sciences to support this, and the evidence keeps building. Smarter, more educated people just become better at rationalising and justifying their existing beliefs, rather than updating those beliefs.

“Reasoning” (in the psychological sense, not logical sense) is simply coming up with reasons for what you believe. A desire to be well-regarded by our peers motivates those reasons more than truth. Facts aren’t what drive most of our beliefs because we cherry-pick our “facts”. They feel like reasons, but they are merely justifications and rationalisations picked out of all potential evidence on an issue.

How people become radicalised

Imagine you’re in a tent in the woods and you hear a sound. You feel anxious because it might be a bear. You decide to look for information to check if your anxiety is justified. That’s all good.

But apply this same process to other things in the world that make us feel anxious. You go online to look for information to check whether your anxiety is justified. And online, you’ll inevitably find something to justify it. There may be other people talking about it, too. You might end up talking with them and, over time, you slowly associate with those people more and more, weakening your ties with people who don’t share that community’s views. Once you’re in the community, the fear of social death drives your beliefs.

“The fear of social death is greater than the fear of physical death.”

Brooke Harrington, sociologist

Covid, for example, showed that when wearing a mask or getting vaccinated becomes politicised, your decision to do so (or not) becomes a “badge” showing which side you belong to. People were willing to risk death over something that was previously as politically neutral as volcanoes.

Why it’s hard to change another person’s mind

A lot of the frustration we feel when we try to change someone’s mind comes from using the wrong tools (applying the information deficit hypothesis). When they fail, we blame the other side and say they’re dumb, ignorant or unpersuadable.

People come to their views through a “reasoning” (again, in the psychological not logical sense) process. What you see is the end of that process. You can’t copy and paste your reasoning into another person – you have to give them space to engage in their own reasoning, their own cognitive journey.

Russ uses the analogy of being in Rome and saying to someone, “Rome is great! Come join me. Watch this video, read this book, look at this postcard.” But he’s in Rome and has already taken the journey to get there. Whereas the person he’s trying to convince won’t be persuaded by a mere postcard of Rome.

There are several things which make it hard:

  1. Reactance. People react badly when they feel their sense of agency is under threat, even when the advice you’re giving is good. This seems to be universal to human beings across all cultures. In the past, when therapists gave advice, patients would go to therapy wanting to curb a behaviour and, because of reactance, leave more likely to engage in it.
  2. Making people feel ashamed of their current beliefs. If you make someone feel they should be ashamed of their current beliefs (even if they arguably should be), you’ll trigger their fear of ostracism and push them away.
  3. Different moral frameworks. You can’t make an argument from your moral framework to a person who is in a different moral framework. You have to couch your argument in terms of the other person’s moral framework and values.

Get out of the debate frame and get into a cooperative frame

Changing someone’s mind requires cognitive empathy and respecting the other person’s autonomy. Get out of the “debate” frame where you try to win. One suggestion is to say something like:

“You seem to know a lot about this issue and care about it. I care about this, too. I wonder why we disagree. Maybe we could look at it together and work out exactly what we disagree on?”

By doing this, you go from being in opposition to each other to working together on the same problem. That avoids reactance on their part. But you have to be genuine – it’s not about manipulation or brainwashing. If you do this thinking it’ll make you more persuasive, it won’t be very effective.

You should also be prepared for the possibility that the other person has something to teach you. You may end up persuaded by their view, or you may both end up with a view that neither of you started with.

Effective conversational techniques are pretty much all the same

Lots of organisations promote different conversation techniques (deep canvassing, street epistemology, smart politics) and therapeutic models (motivational interviewing, cognitive behavioural therapy, etc). Most of these groups had never heard of each other, and the non-therapeutic ones weren’t aware of the science. Yet, independently, through A/B testing and research, they all came up with pretty much the same technique, with the steps in the same order! McRaney says this was the most surprising thing he found when writing the book.

Conversational techniques that successfully shift attitudes and get past resistance all pretty much work the same way because brains resist for universal reasons. McRaney’s book describes different techniques, some of which work well on politics, others which work well on attitudes and values. One in particular works best with fact-based claims like whether the earth is flat.

Basic steps to effective conversational techniques

The basic steps to techniques like street epistemology are as follows (a toy code sketch of the sequence appears after the list):

  1. Establish rapport and assurance. Assure the person you’re not out to change their mind or shame them – you just want to explore their reasoning. For example, “I would love to have a conversation with you in which we explore your reasoning on a topic and see what your views are and understand it better” (but use your own language).
  2. Establish a very specific claim. Ask the other person for a claim, such as “The earth is flat”. Give them space to introspect and listen non-judgmentally. Then repeat it back to them in their own words. Using their words and their definitions is very important – e.g. they may have a different definition of “government” from you, but you have to use theirs.
  3. Ask how confident they are about that claim on a scale of 0 to 100. The scale is a great way to get out of the debate frame and show the other person this is not going to be a binary, right/wrong thing.
  4. Ask why they assessed it at the number they did. Often the first reason someone gives, particularly for a belief that is more cultural than factual (e.g. is gay marriage OK?), will not be a genuine reason – they may just say it to look good. They will also engage in motivated reasoning and come up with various justifications and rationalisations. But this encourages the other person to engage in reasoning, and it frequently turns out they’ve never really thought about it this way before.
  5. Ask why they didn’t assess their confidence higher. For example, if they’re 80% confident, you ask why they’re not 90%, or 100%. This forces them to generate counterarguments against their own position. But because you aren’t generating those arguments, you avoid triggering reactance. [Unfortunately he doesn’t talk about what to do if the person’s already at 100%.]
  6. Ask what methods they used to judge the quality of the reasons they presented. You then summarise and repeat back what the other person says, staying in this conversation space as long as they’re willing to.
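
Since the steps always run in the same order, the sequence reads like a checklist. Here’s a toy Python sketch of that checklist – my own illustration, not anything McRaney provides, with the prompts paraphrased from the steps above – for walking through the questions in order:

    # Toy sketch only: the six steps above as an ordered checklist an
    # interviewer can walk through. Prompts are paraphrases of this
    # summary, not McRaney's wording.
    STEPS = [
        "Build rapport; assure them you only want to explore their reasoning.",
        "Ask for one specific claim and repeat it back in their own words.",
        "Ask how confident they are in the claim, on a scale of 0 to 100.",
        "Ask why they chose that number (their reasons for the claim).",
        "Ask why the number isn't higher (their own counterarguments).",
        "Ask how they judged the quality of those reasons; summarise it back.",
    ]

    def run_interview():
        """Walk through the steps in order, recording a note at each one."""
        notes = []
        for i, prompt in enumerate(STEPS, start=1):
            print(f"Step {i}: {prompt}")
            notes.append(input("Summary of their answer: "))
        return notes

    if __name__ == "__main__":
        run_interview()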

[I watched one of these on YouTube and found it very interesting.]

How successful is this?

It won’t be a 100% success rate, and changes will usually be incremental rather than complete 180s.

The goal is to make the other person examine how they vet their beliefs and become better at critical thinking, rather than changing their mind per se. Sometimes the person may lie to you in order to save face. Let them. Don’t point out their inconsistencies, which will make them defensive. Remember the goal isn’t to “win”, but to get them to introspect. They may have a private epiphany later.

There have been many studies on deep canvassing, which have found a very high success rate. Moreover, the results seem to be “sticky” in that even when people return to their social networks, those networks don’t have as much influence as they did before over the particular topic. Although changes are just incremental, moving a point or two in some direction, that can be enough to win elections and create cascades that lead to social change.

Can these techniques promote intolerance?

Russ points out that, to the extent political views are involved, most examples in McRaney’s book seem to move people from the political right to the left. For example, encouraging tolerance about sexual preferences or religious differences. He wonders whether the same technique could be used to promote intolerance.

McRaney’s response is yes – the techniques are completely neutral. Usually they succeed in getting people to form a more robust, complex and nuanced view of the issue, and of the epistemology required to hold a concrete belief about it. Sometimes that may end up strengthening views we disagree with, or weakening views we consider to be “correct”. But we should never be afraid of someone having a more complex and nuanced view of anything.

With attitude-based beliefs, deep canvassing involves asking about someone’s real lived experiences with the issue. For example, what experiences have you had with transgender people? Usually the person realises they have little to no real-life experience and that most of their beliefs have come from someone else. Occasionally they do have a real-life experience with the relevant people, though that experience is rarely negative. However, McRaney admits that there is a risk someone has an existing, negative prejudice against a group of people, and letting them talk about why they don’t like those people could just inflame those prejudices.

Other interesting points

  • McRaney doesn’t believe anyone is unpersuadable. He didn’t always think this – before he wrote the book, he told a student whose father had fallen for a conspiracy theory that there was nothing they could do about it.
  • A movie or TV show can be much more persuasive than giving advice because of something called narrative transport, where the person gets so immersed in a story they forget to mount counterarguments. The movie itself may generate arguments for or against an issue, but those issues are left to stand alone without being weighed against counterarguments. [Assuming, of course, that the movie is sufficiently immersive.]
  • Some people can’t experience cognitive dissonance and effectively “shut down”:
    • When a normal person experiences cognitive dissonance, they go through some discomfort and then resolve the discomfort. For example, you may experience dissonance when someone you think well of does something bad. You can resolve that dissonance by interpreting their action as actually good, or as never having happened at all.
    • David Eagleman had a patient who suffered from anosognosia. Anosognosia is when someone cannot recognise that they have a disorder. Eagleman’s patient could only close one of her eyes, but believed she could close both. So Eagleman told her to close her eyes, held up some fingers and asked her how many she saw. She answered correctly, because she had one eye open. When asked to explain how she answered correctly if both her eyes were closed, she just stopped.
    • Eagleman later found damage to her anterior cingulate cortex, which is the part of the brain that experiences cognitive dissonance. The patient didn’t feel the urge to resolve any dissonance. Instead, she went through what McRaney calls a “cognitive filibuster” and ended up “rebooting” to before the experience.

My thoughts

This was an unusually useful podcast in that I feel like I learned things I might actually be able to apply in my everyday life. (I do learn from EconTalk podcasts, but the learnings tend to be more theoretical than practical.)

I have mixed views about whether I found McRaney’s overall message encouraging. On the one hand, it’s heartening to hear that people’s minds can be changed. I also agree with Russ’s comments that this is just good advice for having better conversations generally, even if you’re not interested in changing anyone’s mind.

But if you do want to change people’s minds – even if just in the benign sense of improving their epistemology, rather than convincing them of your particular views – then this podcast is a little disheartening. Changing a person’s mind seems like such a slow and time-consuming process that you have to do with one person at a time over many different conversations. It’s all the more disheartening because it seems much easier to be a con-artist and “trick” people into certain views, as suggested in The 48 Laws of Power, for example. (I’m not advocating that, by the way – I’m just pointing out the imbalance: it’s a lot harder to be “good” than “bad”.) Counteracting that, I suppose, is that the changes generated by the conversational techniques discussed are relatively “sticky”.

I also don’t think all people are that intractable about all beliefs. It’s probably true that this slow, incremental process is the only effective way to change beliefs that are based on emotions or are important to someone’s identity. I’ve definitely changed my mind in the face of new information in the past and I think I’ve convinced some people to change their minds by pointing out facts or arguments they hadn’t thought of. But I agree that – even in those cases – it’s usually more effective if you:

  1. Make very clear the specific claim being asserted. I’ve found that the mere process of clarifying a claim forces you to be more thoughtful and nuanced about what you assert. My personal experience also suggests that many, many disagreements actually come down to semantic differences.
  2. Approach the other person in a co-operative, non-threatening way. Particularly at work, I find asking questions to be very effective at getting others to reconsider their opinions and assumptions. It’s especially effective when I disagree with those who are more senior/experienced than me. And, as this podcast points out, sometimes it’ll turn out I’m wrong, in which case I learn something new in a way that allows me to save face.
