Book Summary: The Scout Mindset by Julia Galef


In The Scout Mindset: Why Some People See Things Clearly and Others Don’t, Julia Galef argues that we don’t need to deceive ourselves in order to feel good, stay motivated, or persuade others. The book also offers practical tips for seeing the world more accurately.

Estimated reading time: 32 mins

Buy The Scout Mindset at: Amazon (affiliate link)

Key Takeaways from The Scout Mindset

  • We’re in soldier mindset when we try to defend the views we already hold or wish were true. We’re in scout mindset when we try to suss out the actual lay of the land.
  • It’s not binary—no one is entirely soldier or entirely scout. But some people are more scout-like than others.
  • A common myth is that we need self-deception to cope with reality, motivate ourselves to do hard things, or persuade others to trust us. However:
    • There are other coping strategies like reframing or perspective-taking that don’t require deluding ourselves.
    • You can be motivated to take moon shots by thinking in expected value terms without overestimating your odds of success.
    • You can be socially confident and epistemically humble—and social confidence is what inspires trust.
  • Overall, we probably use scout mindset less than would be optimal. We have far more choice today than in the environments humans evolved in, so accurate judgments are more valuable than ever.
  • How to become more of a scout:
    • Assign probabilities to your beliefs and update incrementally. Changing your mind doesn’t have to be embarrassing and you shouldn’t have to apologise.
    • Learning from disagreements is hard; we shouldn’t expect it to be easy. But we can give ourselves a better chance of success by finding the right representatives for the other side.
    • Hold your identity lightly so it doesn’t get in the way of accurate beliefs. Better yet, identify as a scout and take pride in being able to spot errors in your own thinking.

Detailed Summary of The Scout Mindset

What is the scout mindset?

Soldier mindset

Soldier mindset is Galef’s term for motivated reasoning. When we want something to be true, we ask, “Can I believe this?” But when we don’t want something to be true, we ask ourselves, “Do I have to believe this?”

In soldier mindset, we defend our beliefs against threatening evidence. Think of all the militaristic terms we use to describe beliefs and arguments:

Beliefs can be deep-rooted, well-grounded, built on fact, and backed up by arguments.

If we’re not careful, someone might poke holes in our logic or shoot down our ideas. … Our positions might get challenged, destroyed, undermined, or weakened. So we look for evidence to support, bolster, or buttress our position.

And if we do change our minds? That’s surrender. … If we realize our position is indefensible, we might abandon it, give it up, or concede a point, as if we’re ceding ground in a battle.
— Julia Galef in The Scout Mindset

Scout mindset

In contrast, scout mindset is when we’re trying to form an accurate view of ourselves and the world. It prompts us to question our assumptions and consider other interpretations, even when it’s painful. Scouts aren’t completely indifferent—they might hope to learn that the path is safe or that the other side is weak. But they want their “map” of the landscape to be as accurate as possible. New information doesn’t “threaten” a scout’s beliefs—it just improves their map.

No one is a perfect scout or pure soldier. We all switch between the two depending on the context. But some people are better at being scouts than others.

Why we likely undervalue scout mindset

Galef’s book isn’t a moralising tome claiming that the world would be better if everyone was a scout all the time. She just points out we’re probably using scout mindset less than would be optimal.

Some biases that suggest we tend to undervalue scout mindset are:

  • Present bias. We overvalue immediate rewards and undervalue longer-term ones. The soldier mindset tends to offer instant emotional and social benefits, but with the longer-term cost of impaired judgment. The scout mindset has the benefit of learning from our mistakes, but that’s a longer-term benefit.
  • Tangibility bias. The scout mindset may carry the tangible cost of losing the immediate argument, but its benefit of improved judgment in general is broad and intangible. Likewise, one of the less tangible, longer-term costs of the soldier mindset is the ripple effect that inaccurate beliefs can have, since beliefs are usually interconnected.
  • Overvaluing social costs. The costs of scout mindset tend to be social, and research suggests we overvalue social costs. In reality, other people don’t think about us nearly as much as we think they do. A possible reason for this bias is that we evolved in a world of small social groups where social exclusion could literally mean death.

Conceding an argument earns me credit. It makes me more credible in other cases, because I’ve demonstrated that I don’t stick to my guns just for the sake of it. It’s like I’m investing in my future ability to be convincing.
— Julia Galef in The Scout Mindset

Today, social groups are more malleable and we have a far greater ability to shape our lives. [See The WEIRDest People in the World for a theory on how this happened.] Having an accurate map doesn’t help you very much if you can only travel one path, but it’s far more valuable when life involves many judgment calls. The more we can avoid distorting our perceptions of reality, the better those calls will be.

Why do we use soldier mindset?

The “fixes” typically proposed for motivated reasoning are things like teaching people about cognitive biases, reason and logic. Unfortunately, this doesn’t seem to work in the long run or outside the classroom. It’s like how knowing you should exercise won’t necessarily improve your health.

One reason it’s so hard to shed the soldier mindset is because it provides short-term benefits. It provides both:

  • Emotional benefits like comfort, self-esteem, morale and motivation; and
  • Social benefits like persuasion, image and belonging.

Comfort

Examples of comforting beliefs include:

  • Sour grapes. Aesop’s Fables tell the tale of a fox who tries, unsuccessfully, to grab some grapes hanging from a vine and concludes the grapes were likely sour anyway. Similarly, we may convince ourselves it’s for the best if we get rejected for a job or by someone we like.
  • Sweet lemons. If we have a problem we can’t fix, we convince ourselves the problem is actually a blessing. [Daniel Gilbert calls this our ‘psychological immune system’.]
  • Denial. If we pretend that a problem doesn’t exist, we won’t have to deal with it.
  • Fatalism. If there’s nothing we can do, we don’t need to worry about it. If we tell ourselves the future is inherently unpredictable, we won’t blame ourselves for inaccurate forecasts.
  • Blaming a scapegoat. Blaming others can make us feel better about our decisions. For example, Spock from Star Trek frequently shrugs off his errors in predicting others’ behaviour with, “Well, they were illogical.”

We have a fundamental human need to feel like things are basically okay and that we’ll be able to cope. That’s why we commonly use motivated reasoning during emergencies (which, ironically, is when we need to be the most clear-eyed).

Coping strategies don’t have to involve self-deception

Some writers like Carol Tavris and Elliot Aronson in Mistakes Were Made (But Not By Me) argue that motivated reasoning helps our mental health because it means we don’t have to torture ourselves with regret.

But not all coping strategies involve self-deception. For example:

  • Make a plan. When you’re tempted to deny a problem, making a simple plan for how you’d deal with it anyway can quickly dissolve denial.
  • Notice any silver linings (without convincing yourself the whole cloud is silver) — e.g. “At least this will make a good story”, “I’ve just demonstrated I can change my mind”, or “I can learn from this mistake”.
  • Remind yourself of a helpful true fact — e.g. “I’ve handled worse problems before” or “I’m doing my best.”
  • Put things in perspective — e.g. “I won’t care about this 5 years from now”.

Different strategies work for different people. There are also strategies that don’t involve beliefs (e.g. take a deep breath and count to 10) and so don’t require delusion. Just don’t settle for a delusional coping strategy—your ability to see clearly is precious.

Self-esteem

We may resort to the soldier mindset to protect our own egos—e.g. “I’m not wealthy because I have integrity.” “I don’t have friends because others are intimidated by me.” “My desk is messy because it’s a sign of creativity.”

A 2001 study followed 500 students through 4 years in college and found that those who consistently underperformed their own expectations began to believe grades weren’t really that important.

Protecting your self-esteem doesn’t necessarily mean thinking good things about yourself. Sometimes people use masochistic beliefs to protect their own ego—assuming the worst about themselves to avoid further blows.

Aren’t Happy People Slightly Deluded?

Some prominent articles claim that depressed people see the world more realistically and that happy people might be slightly delusional. But most of these studies (see two examples) are majorly flawed in conflating self-esteem with self-enhancement.

Positive beliefs are assumed to be positive illusions, even when there’s no reason to doubt the accuracy of those positive beliefs. People can rate themselves as better than average without being deluded. All the studies really find is that people who think they have many positive traits and hold positive beliefs about the future have higher self-esteem and are happier—a thoroughly unsurprising finding.

Morale and motivation

The self-belief model of success holds that you need “irrational optimism” to motivate yourself into taking moon shots and persevering against the odds.

A survey of almost 3,000 entrepreneurs found that over 80% estimated their own chance of success was at least 7 out of 10—way higher than the average start-up success rate of around 10%. Their optimism did not seem to correlate with factors previously found to be associated with success, such as their personal background or the nature of their business.

Accurate assessments help you make choices

The main problem with the self-belief model is that it overlooks how an accurate assessment of your chances helps you make choices. The self-belief model implicitly assumes there’s only one path, and no other options out there worth pursuing. In reality, there are usually multiple paths to success or happiness, so it’s useful to have an accurate picture of each path’s odds.

Some argue that having a clear assessment is helpful at the “decision-making” phase but not the “execution” phase when you should just go all in. But the two phases are rarely so distinct. The execution phase can still involve decisions, including whether to continue or quit. Besides, even after you’ve decided to go down one path, you may still be able to choose how much to stake on it.

Example: Elon Musk’s estimated chances of success

When Elon Musk decided to start SpaceX, he estimated the chance of ever sending a spacecraft into orbit at 10%. When he decided to join Tesla, he similarly gave it around a 10% chance of success.

Although he thought these businesses had a 90% chance of failure, Musk still thought they were worth trying because the potential upside was so huge. Instead of convincing himself he had a very high chance of success, Musk applied expected value thinking to his decision. By contrast, most people assume things are only worth doing if they’re likely to succeed.

With Tesla, Musk also comforted himself with the belief that, even if it failed, the company could at least change the perception that electric cars had to be ugly and slow.
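To make the contrast with expected value thinking concrete, here’s a minimal sketch in Python. The numbers are purely illustrative (not Musk’s actual estimates); the point is that a long shot with a large enough upside can have a higher expected value than a safe bet, even with a 90% chance of failure.

```python
def expected_value(p_success, payoff_if_success, payoff_if_failure):
    """Probability-weighted average of the possible outcomes."""
    return p_success * payoff_if_success + (1 - p_success) * payoff_if_failure

# Illustrative figures only (not from the book):
long_shot = expected_value(p_success=0.10, payoff_if_success=1000, payoff_if_failure=-50)
safe_bet = expected_value(p_success=0.90, payoff_if_success=20, payoff_if_failure=-5)

print(long_shot)  # 55.0 -> worth trying despite a 90% chance of failure
print(safe_bet)   # 17.5
```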

Accepting the possibility of failure can be liberating

People also use self-deception to motivate themselves by refusing to consider alternative plans or downsides. The self-belief model assumes that acknowledging the possibility of failure (especially when it’s high) will discourage you from taking risks. Your pessimism then becomes a self-fulfilling prophecy.

But Galef argues that accepting the possibility of failure can be liberating. All decisions involve luck. Even a positive expected value bet may not work out. You should judge your decisions based on the long-run trend rather than any individual outcome. If you know you can feel happy with a bet even if it doesn’t work out, you’ll be more likely to take the risk. In this way, the scout mindset can provide a more robust kind of morale that doesn’t require self-delusion.

Persuasion

Common wisdom states that confidence is magnetic and helps you persuade and influence others. We may use soldier mindset to believe something in order to sell it to other people. But overconfidence can backfire.

One study found that law students were more likely to believe their side of the case was right, even though the sides were randomly assigned. However, the students who felt more confident in the merits of their case turned out to be worse advocates, possibly because they underprepared for the rebuttals.

Galef points out you can get people psyched up without lying to them or being overconfident. There’s a difference between:

  • Epistemic confidence — how certain you are about what is true.
  • Social confidence — how you carry yourself and your level of ease in social situations. It’s more like self-assurance.

These two types of confidence don’t always go together. One study found that people’s assessments of others’ competence correlated strongly with those people’s social confidence; epistemic confidence, by contrast, hardly affected the competence ratings.

Other studies examining patients’ reactions to their doctors’ uncertainty find mixed results. A closer inspection reveals this is because different studies test different things. Patients don’t react well when doctors say something like, “I haven’t come across this before”; they may well assume it’s because of the doctor’s own incompetence or inexperience. But patients don’t penalise uncertainty when doctors also show their expertise by explaining why they’re uncertain.

3 rules to communicate uncertainty

  1. Have a plan. People crave certainty because it tells them what to do. You can help by following up your uncertainty with a plan or recommendation for next steps.
  2. Show that uncertainty is justified. Sometimes people just don’t know how messy the issue you’re discussing is. If you can show that certainty is unrealistic, you can come off more credible than someone who is overly certain.
  3. Give informed estimates. You can still show you’re well-informed by giving an estimate and explaining how you got there.

Image and belonging

We choose beliefs that make us look good. Robin Hanson has compared beliefs to clothing—they tell others what kind of person we are and can fall in and out of fashion. As one belief starts to gain traction in your social circle, you may become motivated to adopt it in order to fit in (unless your image involves being a contrarian).

All social groups are built on shared beliefs and values. In religious communities, this can be extreme—the “wrong” belief can mean losing your entire social support system. But even groups that don’t kick you out can still alienate people who hold different beliefs. Below, Galef suggests ways to prevent our identities from clouding our judgment and instead leverage them into seeing the world more accurately.

Are you a scout?

It’s far easier to spot motivated reasoning in others than in ourselves. When we scrutinise our own reasoning, it feels dispassionate and sound. But we can apply different standards to an issue depending on what we’re motivated to believe. This is more likely to happen on issues we haven’t previously considered.

Example: Should the loser pay the winner’s costs?

After a legal case is decided in favour of one side, the question of costs remains. One study found that people answer very differently depending on how the question is worded:

  • 85% agreed that if you get sued and you win the case, the person who sued you should pay your legal costs.
  • But only 44% agreed that if you sue someone and lose the case, you should have to pay their costs.

Being smart or knowledgeable doesn’t make you a scout

A high IQ can help you find the right answer in ideologically neutral domains like solving math problems, but it doesn’t seem to protect you from bias on ideologically charged questions. In fact, intelligence and knowledge can give us a false sense of security.

Example: Science intelligence and polarisation

Dan Kahan conducted a study comparing Americans’ levels of “science intelligence” with their views on politically charged questions like global warming. Contrary to what you might expect (or hope for), he found that opinions diverged as science intelligence increased:

  • At the highest levels of science intelligence, around 100% of liberals believed in human-caused global warming while only 20% of conservatives did.
  • At the lowest levels of science intelligence, there was no polarisation—roughly a third of both liberals and conservatives believed in human-caused global warming.

Another study, by Drummond and Fischhoff, found the same pattern for other politically charged scientific issues like stem cell research and human evolution, but not for issues like genetically modified foods that are controversial for non-political reasons.

Ultimately, intelligence and knowledge are just tools. You can use them to help see the world more clearly, or you can use them to defend a particular viewpoint. There’s nothing inherent in the tools that makes you more of a scout.

Signs of a true scout

Galef sets out 6 signs of a scout mindset (you may be stronger on some of these than others):

  1. Telling other people when you realised they were right.
  2. Reacting well to personal criticism. Sometimes people claim to “welcome criticism” but then get defensive when criticised or unconsciously deter others from criticising them.
  3. Proving yourself wrong. What steps have you taken to try to do this?
  4. Taking precautions to avoid fooling yourself. For example, when describing a disagreement to a friend, you might hide which side you were on until your friend’s given their view. Or, when starting a new project, decide in advance what counts as “success” so you can’t move the goalposts later.
  5. Having true critics. Can you name some people who disagree with your beliefs, profession, or life choices who you consider to be thoughtful? Or at least list some good reasons why someone might disagree with you?
  6. Recognising your soldier mindset. Can you point to occasions where you were in soldier mindset? If not, it doesn’t mean you haven’t done it. Since motivated reasoning is our default state, it’s more likely you’re just unaware of it.

How to become more of a scout

Going from soldier to scout doesn’t happen overnight. Galef recommends starting with no more than 2 or 3 scout habits. Humans didn’t evolve to do unbiased evaluations of scientific evidence, so we should be proud of how far we’ve come—and strive to be a bit better still.

Forming your beliefs

Look for motivated reasoning in yourself

“Knowing” something in the sense that you’ve read about it is different from having internalised it in a way that actually changes how you approach the world. But it’s hard to internalise something until you’ve derived it for yourself and experienced the realisation that you were wrong.

The first step is just to notice instances when you’ve fallen into soldier mindset. Some thought experiments that may help are:

  1. Double Standard Test. Are you applying inconsistent standards to different people’s behaviour?
  2. Outsider Test. Imagine someone stepped into your shoes without any emotional attachment to your past decisions. What would you expect them to do?
  3. Conformity Test. When you find yourself agreeing with another person, imagine them telling you they don’t actually hold that view. Would your view change?
  4. Selective Sceptic Test. Imagine the study you’re relying on reached the opposite finding. Would you still find it credible?
  5. Status Quo Bias Test. Imagine the current situation was not the status quo. Would you still choose it? Of course, change comes with transaction costs and risks, so it’s not a perfectly clean thought experiment, but it can still help you better understand your own preferences.

Think in bets

Evolutionary psychologist Robert Kurzban uses the analogy of a company’s press secretary vs its board of directors. The press secretary just wants to make the company look good and doesn’t care about what is true. In contrast, the board of directors is incentivised to find out the truth. The press secretary makes claims; the board makes bets.

A bet is any decision in which you stand to gain or lose something of value (e.g. money, time, reputation). When you have something on the line, you become incentivised to find out the truth. This trick can help you think more objectively even if the bet is imaginary.

You may have to reframe your situation so that it can be proved true or false. For example, say you had a fight with your partner and you want to know who was being unreasonable. You could reframe it by betting on whether an impartial third party, given all the relevant details of the fight, will agree you were more reasonable.

Assign probabilities

Most people, including experts, state their claims with too much certainty and are overconfident. Ideally, our claims would be well-calibrated: if you’re 50% sure of something, you should be correct 50% of the time; if you’re 80% sure, you should be correct 80% of the time. (The book includes an exercise to test your calibration.) The good news is that calibration has a quick learning curve—just a few hours of practice can greatly improve it.
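As a rough illustration (my own sketch, not the book’s exercise), you could check your calibration by logging claims together with your stated confidence, then comparing each confidence level with how often you were actually right:

```python
from collections import defaultdict

def calibration_report(predictions):
    """predictions: list of (stated_confidence, was_correct) pairs.
    Groups claims by stated confidence and prints the actual hit rate."""
    buckets = defaultdict(list)
    for confidence, correct in predictions:
        buckets[confidence].append(correct)
    for confidence in sorted(buckets):
        outcomes = buckets[confidence]
        hit_rate = sum(outcomes) / len(outcomes)
        print(f"Said {confidence:.0%} sure -> right {hit_rate:.0%} of the time ({len(outcomes)} claims)")

# Hypothetical track record: well calibrated at 50%, overconfident at 80%.
calibration_report([
    (0.5, True), (0.5, False), (0.5, True), (0.5, False),
    (0.8, True), (0.8, False), (0.8, False), (0.8, True), (0.8, False),
])
```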

Changing your mind

Update incrementally and often

A common misconception is that changing your mind is a big deal. But you can change your mind incrementally. It’s like correcting your course as you steer a ship—you’ll rarely make a drastic 180-degree turn, going from one belief to the exact opposite. More likely, you’ll just revise a probability up from 60% to 70%. This is much lower-stakes and should be easier.

An update is routine. Low-key. It’s the opposite of an overwrought confession of sin.
— Julia Galef in The Scout Mindset

Philip Tetlock’s study of superforecasters found that the people who were best at making predictions and judgments were those who revised their judgments a lot. The highest scorer usually changed his mind at least a dozen times on a single forecast.
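One way to picture an incremental update (my illustration; the book doesn’t prescribe a formula) is a simple Bayesian revision, where one piece of moderately strong evidence nudges a 60% belief rather than flipping it:

```python
def bayes_update(prior, likelihood_if_true, likelihood_if_false):
    """Posterior probability of a claim after seeing one piece of evidence."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

# Hypothetical numbers: you start 60% sure, then see evidence that is 1.5x
# more likely if the claim is true than if it is false.
print(round(bayes_update(0.60, 0.30, 0.20), 2))  # 0.69 -> roughly a 60% -> 70% nudge
```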

Example: Admitting you were wrong

Galef describes how one friend, Mark, accused another, Andrew, of never admitting he was wrong. In response, Andrew pointed out two recent occasions on which he’d admitted being wrong in front of Mark.

This came as a surprise to Mark. Andrew was so matter-of-fact about his errors that they barely registered. He’d just say, “Yup, you’re correct. Scratch what I said earlier” in a straightforward and cheerful way. Mark had implicitly (and incorrectly) assumed that admitting a mistake had to be humbling. But you don’t have to feel embarrassed about updating your belief, unless you’d initially been negligent somehow.

Lean in to confusion

When something conflicts with your expectations, don’t just explain it away. It’s very tempting to judge other people’s behaviour as stupid, irrational, or crazy. But top negotiators know that you should never write off the other side as “crazy”—that “craziness” can well be a valuable clue. [Galef refers to Chris Voss’s advice on uncovering “Black Swans” in Never Split the Difference.]

Example: The London Homeopathic Hospital

In the 1850s, London suffered regular cholera outbreaks. A council of scientists was set up to survey the city’s hospitals and find the most effective treatments.

The London Homeopathic Hospital’s cholera mortality rate was around 18%—significantly lower than the average of 46%. But the council excluded it from their survey, because homeopathy is bunk (its central theory is that diluting medicine with water somehow makes it more potent).

It turned out that the London Homeopathic Hospital’s success at treating cholera was real. They’d stumbled upon two effective treatments by sterilising blankets used by sick people and giving whey to their patients (which replenished their fluids and electrolytes). These treatments were just lucky hunches—neither had anything to do with homeopathy. If only the council had gotten curious about the lower mortality rates instead of dismissing them, they might have saved millions of lives.

Take note whenever you see an anomaly that conflicts with your model of the world. It may never lead anywhere—sometimes the world is just messy and random. But, over time, you may end up finding a model that better fits your observations, and change your mind in a big, paradigm-shifting way.

Learning from disagreements

Our expectations for learning from disagreement are generally unrealistic. We assume that two reasonable, logical people acting in good faith will find it easy to resolve their differences. When that doesn’t happen, we get frustrated and conclude that the other side is biased or irrational.

We need to lower our expectations—a lot. Even under ideal conditions, learning from disagreements is hard. It’s very easy to misunderstand each other’s views, and beliefs are often interconnected: changing one often requires changing other, possibly unspoken, beliefs about how the world works and which sources are trustworthy. Bad arguments can also “inoculate” us against good ones, because we can mistake a new, good argument for a bad one we’re already familiar with. [This is super common in my experience. It’s part of why it’s much easier to learn from disagreements when you’ve already built up some credibility with each other.]

Select the right representatives from “the other side”

Many people think that being fair-minded means getting out of their echo chamber and listening to “the other side”. Research shows this typically backfires and deepens existing positions and biases.

This doesn’t mean we can’t learn from disagreement. We just need to be careful in selecting our sources. By default, the voices we’re most likely to encounter from the “other side” are naturally disagreeable people (since they’re most likely to disagree with you) or popular representatives, who are more likely to cheer for their own side or caricature your position. These voices are not representative of the other side.

To give yourself the best chance of learning from disagreement, try to listen to people you like or respect but nevertheless disagree with. Dissent is far more useful when it comes from someone you respect and with whom you share some common ground.

Example: Climate change sceptic turned activist

Jerry Taylor was a climate sceptic turned climate activist. He first started to doubt his sceptical position when he discovered that a scientist on “his side” had misrepresented facts, and that the sources he’d been citing were shakier than he’d realised. While he still thought his scepticism was basically correct, he became less sure of his position.

A few years later, he met with a climate activist called Bob Litterman. Taylor respected Litterman, who was a prestigious figure in finance. Litterman spoke in Taylor’s language. Instead of suggesting that Taylor care about climate change because of, say, humanity’s moral responsibility to Mother Earth, Litterman pointed out that climate change was a non-diversifiable financial risk. Normally, investors are willing to pay lots of money to avoid non-diversifiable risks. By that same logic, society should be willing to invest lots of money to avoid catastrophic climate change.

Shortly afterwards, Taylor left the Cato Institute (a libertarian think tank) and became a climate activist.

Truly understand others’ views

Just like it’s hard to learn from people you dislike, it’s similarly hard to change another person’s mind if you feel morally and intellectually superior to them.

Example: Understanding vaccine scepticism

Adam Mongrain admitted he’d once felt nothing but “contempt” for vaccine sceptics. But then he got to know one. He respected her as an intelligent and caring person before the topic of vaccines ever came up, so he couldn’t just dismiss her as an idiot.

He came to realise that vaccine sceptics weren’t so different from the rest of us. After all, experts had previously assured people that things like lead paint, tobacco and bloodletting were safe and turned out to be wrong. Once you’ve learned to become suspicious of mainstream medicine, it’s very easy to find evidence confirming those suspicions.

When Mongrain could demonstrate that he understood her concerns, they began to have reasonable, good faith discussions about vaccines. After several such conversations, she agreed to sign her daughter up for vaccines.

Reading sources that confirm your beliefs, trusting people whom you’re close to—everyone does that. It’s just an unfortunate fact that this universal tendency sometimes yields harmful results.
— Julia Galef in The Scout Mindset

Bryan Caplan has suggested an ideological Turing test to determine if you truly understand an ideology: can you explain their position so convincingly that other people can’t tell the difference between you and a genuine believer? This is a test of both your knowledge (how well do you understand the other side’s beliefs?) and emotions (can you avoid caricaturing the other side?).

Acknowledging the weaknesses in your “side” can go a long way toward showing that you’re not just a zealot parroting dogma.

Using your identity

Hold your identity lightly

Beliefs that form a core part of your identity are much harder to change. Disagreements over those beliefs grow highly charged. We all know this is true for things like politics and religion, but it can also apply to less obvious topics like breastfeeding vs bottlefeeding or even your choice of programming language.

People like Paul Graham have therefore recommended keeping your identity small. For a while, Galef followed this advice and tried to avoid identifying with any ideology, movement or group. However, over time, she realised there were also advantages to belonging to and identifying with some groups.

She now recommends holding your identity lightly instead. This means thinking of identity in a matter-of-fact way, rather than as a central source of pride and meaning in your life. You can still belong to groups while maintaining your own independent values and beliefs. Your identification with a group should be contingent—only applying as long as that label still accurately describes your views.

Adopt a scout identity

The scout mindset can become part of your identity. People who are good at facing hard truths take pride in being able to change their minds in response to evidence. This identity motivates them to check their own thinking and helps them overcome the sting of being wrong.

One of the biggest things you can do to reinforce scout-like habits is to surround yourself with people who value such habits. Galef, for example, is part of the effective altruist (EA) movement, where prominent organisations and individuals publish their mistakes and things they’ve changed their minds about. Humans are social creatures, and our peers and social circles influence us even when we don’t notice. This is true of people we read, follow or talk to both online and offline.

Examples of good scouts

Throughout the book, Galef gives many examples of people who showed some scout-like behaviours. I’ve included a few of my favourite ones here.

  • This Reddit thread with former incels who explain what changed their ideology
  • r/FeMRADebates is a small subreddit where feminists and men’s rights activists (MRAs) discuss questions that divide them. Surprisingly, it’s not the hellish dumpster-fire you might expect. The moderators set rules that facilitate constructive disagreement (e.g. don’t insult other members; disagree with specific people or views rather than with what a certain group allegedly believes). (Galef points to specific examples of users changing their minds over time.)

Example: I Kissed Dating Goodbye

When Joshua Harris was 21, he wrote I Kissed Dating Goodbye, which encouraged Christians to avoid dating before marriage. It sold over a million copies and catapulted Harris to fame.

By the 2010s, Harris started to hear from readers who felt his book had screwed up their lives. At first, he dismissed such critics. One thing blocking his ability to change his mind was the (unconscious) belief that you can’t harm others if you have good intentions. If you’d asked him explicitly, he might not have endorsed that belief, but it was nevertheless lurking in the background.

Another blocker was that much of Harris’ identity had been tied up in the book. It was very hard to accept that the biggest thing he’d done in his life may have been a mistake.

But as the numbers of critics grew, he started to seriously consider if they were right. In 2015, at age 40, he stepped down as a pastor and enrolled in a graduate school of theology. This was the first time he’d attended a traditional school full-time. By 2018, he decided to discontinue publication of the book and announced that he no longer agreed with the central idea that dating should be avoided.

Example: “Dr” vs “Ms”

One morning, Bethany Brookshire opened her email and found replies from two scientists she’d emailed. The female scientist had written, “Dear Dr. Brookshire” while the male one had begun, “Dear Ms. Brookshire”.

She quickly tweeted about how, in her experience, men were far more likely to address her by her first name or as “Ms Brookshire”, while women were more likely to call her “Dr. Brookshire”. Her tweet was liked over 2,000 times.

But then she thought it would be prudent to actually check if her impression was true. She went through her emails and found that men were slightly more likely to call her “Dr.” (around 8%) than women (6%). So she followed up with a new tweet saying: “I took the data on this. It turns out… I was wrong.”

Other Interesting Points

  • The technical term for what we normally call “motivated reasoning” is “directionally motivated reasoning”. In contrast, “accuracy motivated reasoning” is aimed at working out whether an idea is true.
  • In a very relatable letter, Charles Darwin once wrote: “… I am very poorly today & very stupid & hate everybody & everything.”
  • Whether people suppress their doubts or take note of them turns out to be a key determinant of whether they manage to escape a multilevel marketing (MLM) scheme.
  • There’s long been a theory in psychology that conservatism attracts people who are more “rigid”, closed-minded and dogmatic. Turns out the research behind this theory is sketchy. The questions researchers used to measure rigidity simply measured conservative beliefs—e.g. favouring the death penalty or opposing abortion was classified as “rigid”.

My Review of The Scout Mindset

I’d heard of The Scout Mindset a long time ago but was never that interested in it. The word “scout” conjured up images of boring, goody two-shoes, boy/girl scouts—and I’d already read many books about avoiding sloppy thinking like Thinking Fast and Slow, Thinking in Bets or Calling Bullshit.

I’m glad to report that I enjoyed the book more than expected. The book was logically structured and Galef carefully backs her claims up. When I looked up some of the studies she’d relied on (like the one about social vs epistemic confidence) I could tell she had taken the time to read them closely. But she doesn’t get lost in details—the book remains accessible and easy to read. She also doesn’t oversell, acknowledging that she’s not claiming we should be scouts 100% of the time.

Things I particularly liked were:

  • Galef’s point about beliefs being interconnected, such that “changing your mind” tends to happen gradually or incrementally.
  • The stories of Joshua Harris (“I Kissed Dating Goodbye”) and Jerry Taylor (the climate sceptic turned activist). I find some people like Jonathan Haidt are far too defeatist about people’s abilities to change their minds based on a few lab experiments. These stories provided a nice counterpoint.
  • Admitting you’re wrong can be done in a low-key and matter-of-fact way. I personally find it helpful to venture claims I’m not sure of by framing them as questions (e.g. “Isn’t it the case that… ?”, or “Doesn’t that conflict with… ?”), which feels less oppositional.
  • The three tips for communicating uncertainty. The tip to “have a plan” was especially useful.

A small nitpick was that Galef occasionally made some minor claims without much, if any, evidence. One I found peculiar was the idea that prefacing a statement with “I believe” suggests it’s important to your identity, because the words should go without saying. I found this very surprising. I preface claims with “I believe” to signal that it’s something I believe but don’t have good evidence for, and that my belief may turn out to be wrong; omitting those words would sound like I’m asserting a fact. Of course, this may vary across cultures and depend on the tone used.

Lastly, epistemic uncertainty can go too far. Disinformation doesn’t always try to convince us of a specific falsehood; some just seek to sow doubt and erode confidence in our ability to make sense of the world. Now, it’s not like The Scout Mindset recommends radical uncertainty—a scout’s goal is to form an accurate map, not to give up in the face of motivated reasoning. I just wish Galef had addressed this point, as it’s something I personally struggle with.

Let me know what you think of this summary of The Scout Mindset in the comments below!

Buy The Scout Mindset at: Amazon <– This is an affiliate link, which means I may earn a small commission if you make a purchase through it. Thanks for supporting the site! 🙂
