Book Summary: Inadequate Equilibria by Eliezer Yudkowsky


This summary of Inadequate Equilibria: Where and How Civilizations Get Stuck by Eliezer Yudkowsky sets out two competing views on the question: “When should you think that you may be able to do something unusually well?”

Estimated reading time: 24 mins

Buy Inadequate Equilibria at: Amazon (affiliate link) or pay what you want for the ebook

Key Takeaways from Inadequate Equilibria

  • Two competing views on when you should think you may be able to do something unusually well:
    • Inadequacy. We should understand how different systems work and the incentives within them to determine whether we’ll be able to outperform them.
    • Modesty. We should always defer to “experts” or “the crowd” because the average person will not be able to outperform them.
  • Inadequate systems are surprisingly common because of misaligned incentives and coordination problems. Even if someone spots an inadequacy, it’s usually inexploitable in terms of money or prestige, so the inadequacy does not get fixed.
  • Modest epistemology seems primarily motivated by social considerations:
    • It’s not necessarily arrogant to think you can outperform a system of experts on some dimensions. Systems can easily end up ‘dumber’ than the people within them. Rather than worrying about whether you are smarter or dumber in general, treat adequacy as a technical question where the answer shifts depending on the situation.
    • While correcting for overconfidence is good, underconfidence is just as big an epistemological error as overconfidence—just less of a social one.
    • People act to protect status hierarchies when they slap down those who try to “steal” status by attempting to do much better than normal.
  • But sometimes modesty seems to be based on misunderstandings of the following:
    • Applying base rates or the “outside view”. While Yudkowsky agrees this is a good idea, you can get reference class tennis when two sides disagree on which reference class to use.
    • Distrust of theories and models. The concern seems to be that people get too wedded to their theories and apply motivated reasoning to explain away all evidence that conflicts with it. But the answer there is to avoid motivated reasoning, not avoid theories and models.
  • It’s much easier to identify the correct contrarian experts in an existing field than to become one yourself.
  • Don’t overcorrect by falling into sloppy cynicism. The amount of failure to be explained is bounded.

Detailed Summary of Inadequate Equilibria

Two competing views: Inadequacy and Modesty

There are two competing views on when you should think you may be able to do something unusually well:

  • Inadequacy analysis. We should understand how different systems work and the incentives within them to find out how adequate they are at producing particular outcomes. This will help us determine if we should expect to be able to outperform the system.
  • Modest epistemology. We should always defer to “experts” or “the crowd” because the average person will not outperform them and it would be overconfident to do so. Modest epistemology seems driven by social considerations more than anything else.

As the title Inadequate Equilibria suggests, Yudkowsky prefers the former view. He argues that inadequate systems are all around us, and it is a mistake to assume adequacy when staring at a clearly broken system.

Yudkowsky offers up some theories behind what is driving the tendency towards modesty. While he agrees with proponents of modesty in some areas (i.e. overconfidence can indeed be a problem; base rates are important), he argues modesty is a bad heuristic because people are prone to overcorrect.

Efficiency, Adequacy and Exploitability

Inadequacy analysis requires understanding 3 concepts: efficiency; adequacy; and exploitability.

Efficiency

The term efficiency comes from economics but is widely misunderstood. A market or price that is “efficient” doesn’t mean that it produces outcomes that are fair or good for society. It just means that the prices in that market can’t be predicted by you.

A better term might be “relative efficiency”, as a market’s efficiency is relative to your own intelligence and level of information. A market might not be efficient relative to a superintelligence or corporate insiders with secret knowledge, but could still be efficient relative to the average investor.

If I had to name the single epistemic feat at which modern human civilization is most adequate, the peak of all human power of estimation, I would unhesitatingly reply, “Short-term relative pricing of liquid financial assets, like the price of S&P 500 stocks relative to other S&P 500 stocks over the next three months.” This is something into which human civilization puts an actual effort.
—Eliezer Yudkowsky in Inadequate Equilibria

Adequacy

Adequacy is like efficiency, but applied beyond the realm of markets.

Example: Cures for Seasonal Affective Disorder

Yudkowsky’s wife suffered from severe Seasonal Affective Disorder (SAD). She’d tried using a small lightbox, but that didn’t work, and the disorder was serious enough that they considered more extreme options like spending time in South America during winter.

Yudkowsky wondered if simply putting up more lights at home would work. He did some Googling but didn’t find any research on this seemingly obvious solution. But he didn’t take this as particularly strong evidence that the solution wouldn’t work. He saw no reason to think that society would be so adequate at treating (and documenting obvious treatments for) mood disorders that the absence of findings on a “more lights” intervention would tell him much.

So Yudkowsky proceeded to put up 130 LED light bulbs in the house at a cost of around $600. This seemed to do the trick.

To find out whether there’s some good reason why people aren’t already doing what appears to be an obvious solution, ask whether you would expect all the low-hanging fruit at that level to have already been plucked. This leads us to the question of exploitability (or lack thereof).

Exploitability

Large, liquid stock markets are efficient because they are exploitable—i.e. if someone recognises that a stock is under- or over-priced, they can buy or short it and shift the market towards a more adequate state.

But efficiency doesn’t even hold in markets for other financial assets like, say, housing or shares in a startup, because you can’t short-sell a house or startup equity. (You may be able to bet against an aggregate housing market, but you can’t short a particular overpriced house.) So an inefficient housing market may have many overpriced houses, but few underpriced ones.

True efficiency is very rare. The reason we even have a concept of “efficient markets” but not “efficient medicine” or “efficient physics” is because, in the latter systems, agents can’t easily exploit inefficiencies in a way that moves the system towards greater efficiency.

However, one thing efficient markets do have in common with inadequate systems is that neither contains any free energy. Any sufficiently smart agent will readily consume whatever free energy is available, so the equilibrium is one in which no free energy remains. The important point is that a system can be in a horrible state (in terms of the outcomes it produces) and still be in a competitive equilibrium.

For example, even though Yudkowsky’s simple SAD treatment seemed to work, he’s not running out to try and get a researcher to look into it because there’s probably a deeper reason why this research hasn’t been published than “nobody thought of it.”

Systems can be adequate or exploitable at different levels or dimensions

A system may be adequate at one level but not another. For example, you could say that society is adequate at saving 10,000 lives for $100,000 if the obvious interventions at that level have already been taken up. If they haven’t, society is inadequate at that level.

There are also different dimensions in which a system may or may not be exploitable. Systems tend to be inexploitable with respect to things like money or prestige—desirable resources for which many agents compete. A system might therefore be exploitable in terms of personalised SAD treatments, but not in dollars or esteem.

Even where the current way of doing things seems bad, and even when you really do know a better way, 99 times out of 100 you will not be able to make money by knowing better.
—Eliezer Yudkowsky in Inadequate Equilibria

Why are systems inadequate?

Yudkowsky suggests 3 broad categories of reasons why systems might be inadequate (though he acknowledges these are somewhat arbitrary):

  • Decision-makers who are not beneficiaries [aka “principal-agent” problems or misaligned incentives];
  • Asymmetric information and irrationality;
  • Nash equilibria that are inferior to other possible equilibria.

The book focuses primarily on the last point, so I will do so too.

What’s a Nash equilibrium?

A Nash equilibrium is a situation in which every player is making their best move, given the moves that all the other players are making. For example, in the classic two-person prisoner’s dilemma, the Nash equilibrium is “Defect/Defect” because each prisoner’s incentive is to defect regardless of what the other person does.

If someone wanted to do the best thing altruistically, they would need to ignore the incentives in the system pointing elsewhere. But there’s no free energy in the system to feed such altruists. Moreover, if a system is broken in multiple parts at the same time, no individual can unilaterally defy the system (unless perhaps they were a billionaire or something).

What’s frustrating is when a situation has multiple equilibria, but we’re stuck in a Nash equilibrium that’s worse than another one. In the prisoner’s dilemma example, “Cooperate/Cooperate” would make both prisoners better off than “Defect/Defect”, but getting there requires the two players to coordinate. And coordination is hard.
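To make this concrete, here’s a tiny Python sketch (my own illustration, not from the book) that checks which strategy pairs in the one-shot prisoner’s dilemma are Nash equilibria. The payoff numbers are arbitrary but follow the standard ordering:

```python
# Toy prisoner's dilemma. payoffs[(my_move, their_move)] = (my_payoff, their_payoff);
# higher is better. The game is symmetric, so one best_response function serves both players.
payoffs = {
    ("cooperate", "cooperate"): (3, 3),
    ("cooperate", "defect"):    (0, 5),
    ("defect",    "cooperate"): (5, 0),
    ("defect",    "defect"):    (1, 1),
}
moves = ["cooperate", "defect"]

def best_response(their_move):
    """The move that maximises my payoff, holding the other player's move fixed."""
    return max(moves, key=lambda my_move: payoffs[(my_move, their_move)][0])

def is_nash_equilibrium(my_move, their_move):
    """Neither player can do better by unilaterally changing their own move."""
    return best_response(their_move) == my_move and best_response(my_move) == their_move

for profile in payoffs:
    print(profile, "Nash equilibrium?", is_nash_equilibrium(*profile))
# Only ('defect', 'defect') comes out True, even though ('cooperate', 'cooperate')
# pays both players more -- the stable outcome is worse than the cooperative one.
```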

Coordination is hard

Coordination requires trust. To get everyone jumping at the same time, people have to trust the person shouting “jump!” and they have to trust that other people also trust that person so that they won’t be the lone jumper.

Some examples:

  • Hospital statistics. When no hospital publishes statistics, there’s no baseline to compare against if one hospital starts doing so. So no hospital has an incentive to be the first to publish, as its patient death numbers will just look scary in isolation. (Incidentally, anaesthesiologists did start tracking patient outcomes after their professional society released standards in the 1980s. Fatality rates fell by a factor of 100—see Hyman and Silver, 2001.)
  • Use of p-values. Once it became conventional to measure experiments using p-values, researchers could not get the most prestigious journals to publish their papers if they used alternative statistical methods (even if those alternatives were better).

Recursion — a double-edged sword

Recursion can make an equilibrium extra sticky.

Example: Recursion and red-haired entrepreneurs

Startups have to go through multiple rounds of funding before they become profitable and failing any round will kill the startup. So if you’re an angel investor in an early round, you have to take into account the probability that the startup secures funding in later rounds.

Let’s say it’s widely believed that successful entrepreneurs have red hair. Even if you don’t believe that, there’s no way to make an excess profit by betting on promising non-red-haired entrepreneurs—because such people would likely struggle to raise funding in later rounds. So long as people think everyone else believes successful entrepreneurs have red hair, the equilibrium will discriminate against non-red-haired founders.

However, recursion also means you can have sharp tipover without most people actually changing their minds on a particular issue. They just have to change what they believe about what others believe.

For example, journalists thought for a long time that gay marriage was outside the Overton window and that supporting gay marriage would be a political blunder. And if enough journalists think it’s a political blunder, it is. But then some politicians actually tested support for gay marriage and got away with it. Journalists began to realise it wasn’t a political blunder after all, and this created a snowball effect.
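This kind of sharp tipover can be illustrated with a toy threshold model (my own sketch, not from the book; the assumption that people’s thresholds are normally distributed around 25% is arbitrary). Below a tipping point, visible support fizzles out; above it, support snowballs even though nobody’s private opinion changes:

```python
from statistics import NormalDist

# Each person publicly supports a position once they believe a large enough share of
# everyone else already finds it acceptable. Thresholds vary from person to person;
# here they are assumed to be normally distributed around 25%.
thresholds = NormalDist(mu=0.25, sigma=0.10)

def settled_support(initial_visible_share, steps=1000):
    """Let everyone repeatedly react to the currently visible level of support."""
    share = initial_visible_share
    for _ in range(steps):
        share = thresholds.cdf(share)  # fraction of people whose threshold is now met
    return share

for seed in (0.05, 0.10, 0.15, 0.20, 0.30):
    print(f"initial visible support {seed:.0%} -> settles at {settled_support(seed):.0%}")
# Small seeds collapse back towards zero; seeds past the tipping point snowball
# towards near-universal support, without anyone changing their underlying view.
```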

Modest epistemology

Inadequate Equilibria is largely a response to an approach that Yudkowsky calls modest epistemology, which advocates being humble in general and deferring to the crowd or to experts. The argument goes—since most people overconfidently rate themselves above average, people would typically be better off if they deferred to the majority view on most issues.

Yudkowsky argues against modest epistemology because it’s too blunt. He thinks it’s motivated more by social considerations (like anxiety and status) than by epistemological ones. He’s also concerned that misunderstandings of important concepts like empiricism and the outside view may be partly driving the modesty.

Modest epistemology is too blunt

Under the modest epistemology worldview, you either think you’re better than experts and everyone backing them or you admit you’re not as good and should defer to them.

Yudkowsky explains he isn’t arguing for immodest epistemology, where you decide you’re smarter than the experts and can just ignore their judgments. Inadequacy analysis is far more contextual. You don’t decide whether you’re better or worse than experts in general. It’s about identifying ways in which systems fail to deliver certain outcomes.

Better, I think, to not worry quite so much about how lowly or impressive you are. Better to meditate on the details of what you can do, what there is to be done, and how one might do it.
—Eliezer Yudkowsky in Inadequate Equilibria

Being able to outperform a system full of experts doesn’t mean you’re better than the experts. Systems can end up dumber than the people within them because there are layers of diverging incentives and coordination is hard. This is entirely normal.

Treat adequacy as a technical question where different pieces of evidence can shift your conclusions to different degrees. In some ways you may be more competent than the system but in other ways you’ll be less.

The social considerations behind modest epistemology

Yudkowsky thinks that the main forces behind modest epistemology are social ones, particularly anxious underconfidence and status regulation (and these two can overlap).

Anxious underconfidence

It’s true that commenters on the Internet are often overconfident, and it’s also true that avoiding overconfidence is important. But there is research showing that some people overcompensate for a cognitive bias after being warned about it. Yudkowsky estimates that, for the people he meets in person, he advises greater confidence or ambition around 90% of the time, and warns against overconfidence only around 10% of the time.

Moreover, avoiding overconfidence doesn’t mean being underconfident, either. Logically, underconfidence is just as big an epistemological error—just less of a social one. So some people end up convincing themselves that they live in an adequate world simply to avoid coming across as arrogant.

Yudkowsky believes many people have anxious underconfidence in that they experience excess fear of trying and failing at something. (“Excess” in that the fear is disproportionate to the real, non-emotional consequences.) Trying only things that are within your “reference class” is still safe in a social sense—if you fail, you can justify your attempt by noting that others have succeeded on similar tasks. In contrast, if you fail at something ambitious, people might think you’re arrogant or stupid to try. But if you never fail, you’re not being ambitious enough.

Social status regulation

Status is a limited resource, which was valuable in the ancestral environment. Some people are highly attuned to each person’s position in a status hierarchy and will protect the hierarchy by slapping down attempts to “steal” status. This is status regulation. [I’ve written up some thoughts on this in We all play different status games.]

An important aspect of status regulation is that a person needs to already possess a certain amount of status before it’s acceptable for them to reach up for a higher level. So attempts to do much better than normal may come with heated accusations of overconfidence, while you’re unlikely to face any heated accusations of underconfidence for aiming too low.

But, on any given issue, the people or institutions with higher status may not be right. A market’s efficiency doesn’t derive from its social status.

Example: Yudkowsky’s disagreement with Anna

On one occasion, Yudkowsky disagreed with a colleague, Anna Salamon, on how to teach others about sunk costs. Yudkowsky got his way but his approach turned out to be a miserable failure. Anna was right.

At the point they had the disagreement, most outsiders would have thought Yudkowsky had the better track record on teaching rationality. So any advice to follow track records or trust externally observable status would probably have favoured Yudkowsky. But he was wrong anyway, because sometimes that happens even when one person has more externally observable status than another.

After this occasion, Yudkowsky began to hesitate to disagree with Anna, and hesitate even more if she had heard his reasons and still disagreed with him. But he doesn’t do this for all other people because the average person is not like Anna.

Misunderstood concepts

Blind empiricism

Yudkowsky suspects that part of what’s behind modesty is a general fear of theories and models.

Isaiah Berlin drew a famous distinction between hedgehogs and foxes. ‘Hedgehogs’ are the kind of people who rely on theories, models, and global beliefs, while ‘foxes’ rely more on data, observations and local beliefs. The metaphor was more recently popularised by Philip Tetlock in Superforecasting, whose research found that experts who relied heavily on a single overarching model made less accurate predictions.

Unfortunately, some people have gone too far in their skepticism of model-building. They seem to think that having a theory at all makes you a bad ‘hedgehog’ instead of a good ‘fox’ with lots of small observations.

Yudkowsky argues you should have a theory and test it. Indeed, it would be dangerous to rely on a theory instead of running an experiment or looking at data. But you need to have a theory in the first place to run an experiment that potentially falsifies your theory.

The idea isn’t, “Be a hedgehog, not a fox.” The idea is rather: developing accurate beliefs requires both observation of the data and the development of models and theories that can be tested by the data.
—Eliezer Yudkowsky in Inadequate Equilibria

Perhaps the fear is that forming a theory increases the risk of motivated reasoning: once you’ve formed a big theory, you’ll find some way to keep it even when you come across conflicting data. Yet the answer isn’t to stop forming theories—it’s to get better at testing your theories, admitting when you’re wrong, and updating accordingly.

Outside view

A well-known debiasing measure for things like the planning fallacy is to start with the outside view—the average outcome for people in your reference class. People who do this tend to be much more accurate at predicting how long something will take or cost than those who start with the inside view—estimating the time and cost of each step in their particular project.

This is a perfectly good debiasing measure in some situations, like planning your Christmas shopping. In situations where you have a large sample of causally similar situations and good reasons to expect overoptimistic forecasts, the outside view performs well.

But in novel cases where causal mechanisms differ, the outside view fails. The problem is that selecting the right reference class is tricky. You can get reference class tennis where two sides disagree and insist that their chosen reference class is the correct one. For example, if 80% of people in the world believe in God, does that mean you should assign an 80% probability to God existing? The example may seem silly, but poses a serious challenge to modest epistemology.

How to assess your own competence

Make bets and update hard

When estimating a system’s adequacy (especially relative to your own competence), update hard on your experiences of failure and success. One data point may be “just an anecdote”, but it’s far better than zero data points, and you can keep collecting more and more data. And bet on everything, because it helps a lot with learning.

Yudkowsky admits he assumes he produces better judgments than average, but he didn’t start out with this assumption. He makes this assumption only because he learned about the processes for producing good judgments and then invested a lot of time and effort (including lots of updating) to try and develop better judgments.

It’s easier to identify experts than to become one

Correct contrarians understand that, in almost every field, it takes far less work to identify the correct expert in a pre-existing dispute between experts than to make an original contrarian contribution yourself. It’s not easy in absolute terms—you’re still trying to outperform the mainstream in deciding who to trust. But it’s something an amateur with unusually good epistemology could do with a reasonable amount of effort.

Example: Japan’s monetary policy

Yudkowsky once wrote a paper with a footnote that suggested Japan’s economic growth had probably been hampered by its monetary policy. One of his friends questioned how he knew this, since this wasn’t Yudkowsky’s area of expertise and there could be many other reasons for the slow growth (e.g. aging population, low female labour participation rates, high levels of regulation, etc).

But Yudkowsky hadn’t come to this conclusion about Japan’s monetary policy on his own. The argument came from Scott Sumner, who was not quite mainstream at the time, though many economists shared his view. Yudkowsky found Sumner’s argument credible based on his own attempt to understand the arguments and by looking at track records on near-term predictions.

Inadequacy analysis also gave Yudkowsky good reason to think Japan’s monetary policy was suboptimal, even though trillions of dollars were at stake and the policy was set by experienced economists. The system was inexploitable: people who saw the problem couldn’t profit from it, because it had already been priced into Japanese asset prices. And even if one of the Bank of Japan’s governors saw the problem, they had little incentive to fix it—they don’t get financial bonuses when the Japanese economy performs better.

Indeed, once the Bank of Japan changed its monetary policy and started printing a lot more money, the country experienced real GDP growth of 2.3%, whereas the trend had previously been falling GDP. [Note: Yudkowsky may be wrong on this after all.]

Over your lifetime, you’ll probably get between zero and two chances to substantially improve on civilization’s current knowledge by putting years into the attempt. But you’ll likely experience many cases of picking the right side in a dispute between experts, if you can follow the object-level arguments reasonably well and identify the meta-level cues that indicate which expert’s reasoning process is more sound.

Don’t overcorrect

Inadequacy analysis is not the same as sloppy cynicism. There’s only so much failure to be explained, and every possible dysfunction competes against every other possible dysfunction as an explanation. If you aim to do things like beat the stock market or build a successful startup without doing a good inadequacy analysis, you’ll still probably fail.

The benefit of inadequacy analysis for many people will be to break a kind of blind trust, where they assume a system is adequate even when its brokenness is staring them in the face. But have some common sense and take care not to shoot your own foot off.

Somehow, someone is going to horribly misuse all the advice that is contained within this book.
—Eliezer Yudkowsky in Inadequate Equilibria

Yudkowsky hopes his book does more good than harm. He’s not sure it will actually harm the overconfident since he doesn’t know of any cases where previously overconfident people were rescued by modest epistemology and became more effective as a result.

But Yudkowsky expects Inadequate Equilibria will be much more useful to the underconfident than the overconfident. If you’re trying to do something unusually well—whether in the domain of science, business or altruism—you will often need to seek out the most neglected problems and use private information that is not widely known or accepted. Modesty is particularly detrimental in that kind of work, because it discourages breaking new ground and making uncertain bets.

Other Interesting Points

  • Education as signalling (i.e. people waste a lot of time and money going to college to signal desirable traits like intelligence and conscientiousness, rather than to learn things) is one of the examples Yudkowsky gives of a bad but stable equilibrium. The equilibrium is stable because the selfish incentive for a smart student is to go to college and the selfish incentive for an employer is to hire college graduates.

    I think Yudkowsky’s overall point is right but, as noted in my criticisms of The Case Against Education, Bryan Caplan’s signalling estimates are overstated. In fact, I think “education as signalling” is a great example of the problem of recursion. If people actually believe signalling is as strong as Caplan claims, it becomes harder to break out of the equilibrium.

  • If you’re going to argue against an idea, it’s bad form to start off by arguing that the idea was generated by a flawed thought process before you’ve explained why you think the idea itself is wrong.

  • Beware of the fallacy fallacy. Don’t assume a conclusion is false just because a fallacious argument for that conclusion exists. There are just so many fallacious arguments that could exist that if you rejected every argument that is superficially similar to some other fallacy, you’d be left with nothing. [This is similar to what Julia Galef describes as “bad arguments inoculating people from good ones”.]

My Review of Inadequate Equilibria

Yudkowsky is a talented writer who can explain complex ideas in entertaining ways. For the most part, I really enjoyed Inadequate Equilibria and found its ideas useful and thought-provoking.

But I also found it frustrating at times. Yudkowsky favours (unnecessarily, imo) difficult words, and the book’s structure was on the messy side, which led to needless repetition. The book also skims over many concepts such as the market for lemons, the Gell-Mann Amnesia Effect, Aumann’s agreement theorem, and Bayes’s Rule. I expect these ideas are well-known within the rationalist community, but skimming over them may have made the book more challenging for people outside that community. Perhaps this was intentional on Yudkowsky’s part, to limit his audience to those who most needed to hear his message (and were least likely to misapply it).

One concern I had while reading this book was that it might encourage fatalism. I know Yudkowsky’s goal is to discourage a certain kind of fatalism (“the world is self-correcting and adequate so there’s nothing we can do” kind), yet it risks encouraging another kind (“there is so much broken in our society and these equilibria are impossible to shift unless you’re a billionaire” kind). Yudkowsky even writes:

In the same way that inefficient markets tend systematically to be inexploitable, grossly inadequate systems tend systematically to be unfixable by individual non-billionaires.
—Eliezer Yudkowsky in Inadequate Equilibria

I don’t doubt that coordination problems are widespread and usually impossible for an individual to fix. But individuals can work with other individuals, and some individuals can hold power without being a billionaire. After all, the Bank of Japan eventually changed its monetary policy, even though the system’s incentives remained the same. What prompted the shift? It may well have been an individual governor who set things in motion.

Similarly, Yudkowsky spends considerable time on a “dead babies” example, which involves the parenteral nutrition formula (the food given intravenously) given to babies born with short bowel syndrome. For years, the FDA-approved formula was faulty and caused babies to suffer needless liver failures and deaths until, eventually, the FDA let US hospitals import parenteral nutrition bags from Europe. Again, the system’s bad incentives persisted, and we don’t know what precipitated the change. Maybe an individual at the FDA got the ball rolling.

It’s quite possible that fatalists will just find a way to be fatalistic regardless of what anyone writes. However, I wish Yudkowsky had taken a bit more care on this point.

A final note: the main point I see in favour of modest epistemology—i.e. deferring to the crowd and experts—is that it’s fast. Yudkowsky acknowledges that when trying to assess the quantity of effort required to outperform a system in some way, the question is not “Can it be done?” but “How much work?”. Even finding the right contrarian—particularly in a field as arcane as monetary policy—takes a decent amount of time and effort, which most people are unwilling to spend on most issues. So “be modest and defer to experts” has its place as a rule of thumb, rather than as an absolute rule.

Let me know what you think of my summary of Inadequate Equilibria in the comments below!

Buy Inadequate Equilibria at: Amazon <– This is an affiliate link, which means I may earn a small commission if you make a purchase through it. Thanks for supporting the site! 🙂 Alternatively, pay what you want for the ebook.

