Calibration Is Hard

  • People want to be safe more than right. The benefits of being right are not cut-and-dried, so being right has a marketing problem.
    • Confirmation bias rules everything around me (C.R.E.A.M.)
      • People seek confirmation, not truth. We are ego protectors.
      • Many calibration errors are not errors at all; they are motivated reasoning. Everyone is talking their own book, especially on matters without definitive right answers, where reasonable people can disagree.
      • Slate Star Codex: “Of the fifty-odd biases discovered by Kahneman, Tversky, and their successors, forty-nine are cute quirks, and one is destroying civilization. This last one is confirmation bias - our tendency to interpret evidence as confirming our pre-existing beliefs instead of changing our minds. This is the bias that explains why your political opponents continue to be your political opponents, instead of converting to your obviously superior beliefs. And so on to religion, pseudoscience, and all the other scourges of the intellectual world.”
      • Rohit Krishnan: “human ability is normally distributed but the outcomes are power law distributed.” Just because someone builds a company that produces an extraordinary outcome, 10,000x the average, doesn’t mean they were 10,000x as capable. Achievements are the multiplicative product of many different variables. So investing in a “10x founder” doesn’t mean the founder is 10x more capable than everyone else; it means their advantage, compounded with everyone else’s advantages, can get you to a 10,000x outcome. The adulation we pour on such folks creates its own gravitational field and makes others susceptible to falling in love. (A small simulation of the multiplicative point follows the Taleb quote below.)
      • Nassim Taleb: “Risk-conscious hard work and discipline can lead someone to achieve a comfortable life with a very high probability. Beyond that, it is all randomness: either by taking enormous (and unconscious) risks, or by being extraordinarily lucky. Mild success can be explainable by skills and labor. Wild success is attributable to variance.”
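      • A minimal sketch of the multiplicative-outcomes idea (every number here is invented for illustration): give each person several independent, roughly normal advantages, multiply them together, and the distribution of outcomes grows a heavy tail. The best outcome dwarfs the median even though the best performer’s raw abilities are only modestly above average.
```python
import numpy as np

rng = np.random.default_rng(0)
n_people, n_factors = 100_000, 10

# Each person draws several independent, normally distributed "advantages"
# (skill, timing, network, market...). The scale is made up for illustration;
# clipping just keeps the product positive.
abilities = rng.normal(loc=1.0, scale=0.5, size=(n_people, n_factors)).clip(min=0.05)

# Outcomes are the multiplicative product of the advantages.
outcomes = abilities.prod(axis=1)

best = outcomes.argmax()
print(f"best / median outcome: {outcomes[best] / np.median(outcomes):,.0f}x")
print(f"best performer's mean ability vs population mean: "
      f"{abilities[best].mean():.2f} vs {abilities.mean():.2f}")
```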
  • A good filter reduces the chain of coin flips you must win before you reach your goal (toy arithmetic below).
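    • If the goal requires winning k independent stages each with probability p, your odds are p^k, so a good filter either removes stages from the chain or raises the odds on the ones that remain. A toy calculation with invented probabilities:
```python
from math import prod

# Purely illustrative numbers: success requires winning every stage.
unfiltered = [0.5] * 8   # eight coin flips in a row
filtered = [0.8] * 5     # a good filter: fewer stages, better odds each

print(f"unfiltered: {prod(unfiltered):.2%}")  # ~0.39%
print(f"filtered:   {prod(filtered):.2%}")    # ~32.77%
```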
  • No matter how much we say “correlation is not causation,” the admonition remains underrated. An example I see everywhere: someone says “x makes us good at y,” but usually a third factor makes us good at both x and y (simulated below).
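    • A sketch of that confound (all effect sizes invented): a latent trait drives both x and y, x has no causal effect on y at all, yet the two correlate strongly.
```python
import numpy as np

rng = np.random.default_rng(1)
n = 50_000

trait = rng.normal(size=n)                 # latent trait, e.g. conscientiousness
x = trait + rng.normal(scale=0.5, size=n)  # "good at x": driven by the trait
y = trait + rng.normal(scale=0.5, size=n)  # "good at y": driven by the same trait
# x never appears in the equation for y, so x has zero causal effect on y.

print(f"corr(x, y) = {np.corrcoef(x, y)[0, 1]:.2f}")  # ~0.80 despite no causation
```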
  • Post-hoc fallacy: we can’t tell the difference between “because of” and “in spite of.”
  • Berkson’s paradox and tail divergence are horribly underappreciated. Just listen to how anybody speaks about unconventional outliers (see the selection-effect sketch below).
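    • A sketch of Berkson’s paradox with made-up numbers: two traits are independent in the population, but once you condition on being an outlier (selected on the sum of the two), they turn negatively correlated, so among outliers the traits look like trade-offs.
```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

talent = rng.normal(size=n)
luck = rng.normal(size=n)  # independent of talent by construction

print(f"population corr: {np.corrcoef(talent, luck)[0, 1]:+.2f}")  # ~+0.00

# Condition on outlier status: selected only if talent + luck clears a high bar.
selected = (talent + luck) > 3.0
print(f"among outliers:  {np.corrcoef(talent[selected], luck[selected])[0, 1]:+.2f}")  # negative
```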
  • Survivorship bias → overfitting → cargo-culting (sketched below)
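    • A sketch of that chain with fabricated data: simulate many zero-edge strategies, report only the best track record (survivorship bias), and it looks like skill in-sample while reverting to a coin flip out-of-sample; imitating the survivor’s “method” is overfitting to luck, i.e. cargo-culting.
```python
import numpy as np

rng = np.random.default_rng(3)
n_strategies, n_periods = 10_000, 100

# Every strategy is a pure coin flip: zero edge by construction.
in_sample = rng.choice([-1, 1], size=(n_strategies, n_periods))
out_sample = rng.choice([-1, 1], size=(n_strategies, n_periods))

# Survivorship bias: we only ever hear about the best in-sample record.
survivor = in_sample.mean(axis=1).argmax()

print(f"survivor in-sample win rate:  {(in_sample[survivor] == 1).mean():.0%}")   # well above 50%
print(f"survivor out-of-sample rate:  {(out_sample[survivor] == 1).mean():.0%}")  # ~50%
```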
  • The mark of a smart person is asking good questions; it’s a good test of understanding.
  • Improving decisions requires:
    • Better benchmarking and accounting for counterfactuals (a calibration-tracking sketch closes this section)
    • Recruiting others into your dilemmas
      • Daniel Kahneman from Thinking, Fast and Slow: “It is easier to recognize other people’s mistakes than our own.”
      • The best antidote to cognitive bias is getting perspective from others; we can’t see our own blind spots.
      • Julia Galef also believes “peer pressure” can combat bias if people prefer to flatter themselves as “scouts” rather than cling to something they want to believe but suspect is wrong. This makes sense to me because a culture of being well-calibrated will nudge you towards that goal (SIG, the LessWrong community)
        • There’s also an argument I find unsatisfying in Slate Star Codex’s review of The Scout Mindset:
          It reminds me of C.S. Lewis - especially The Great Divorce, whose conceit was that the damned could leave Hell for Heaven at any time, but mostly didn’t, because it would require them to admit that they had been wrong. I think Julia thinks of rationality and goodness as two related skills: both involve using healthy long-term coping strategies instead of narcissistic short-term ones.
          I know some rationalists who aren’t very nice people (I also know others who are great). There are lots of other facets of nice-person-ness beyond just an ability to acknowledge your mistakes (for example, you have to start out thinking that being mean to other people is a mistake!) But all these skills about “what tests can you put your thoughts through to see things from the other person’s point of view?” or “how do you stay humble and open to correction?” are non-trivial parts of the decent-human-being package, and sometimes they carry over.
          In one sense, this is good: buy one “rationality training”, and we’ll throw in a “personal growth” absolutely free! In another sense, it’s discouraging. Personal growth is known to be hard. If it’s a precondition to successful rationality training, sounds like rationality training will also be hard. Scout Mindset kind of endorses this conclusion. Dan Ariely or whoever promised you that if you read a few papers on cognitive bias, you’d become a better thinker. Scout Mindset also wants you to read those papers, but you might also have to become a good person.
          Here Scout Mindset reaches an impasse. It’s trying to train you in rationality. But it acknowledges that this is closely allied with making you a good person. And that can’t be trained.
          (in case this is starting to sound too touchy-feely, Julia interrupts this section for a while to mercilessly debunk various studies claiming to show that “self-deluded people are happier”) [Kris: that’s another argument for being well-calibrated!]
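    • The concrete version of the benchmarking point above (the forecasts below are invented): log probability forecasts next to outcomes, then compare stated confidence to the realized hit rate in each bucket. A Brier score summarizes the gap; always forecasting 50% scores 0.25, so anything lower beats blind coin-flipping.
```python
import numpy as np

# Hypothetical forecast log: (stated probability, did it happen?)
forecasts = np.array([0.9, 0.9, 0.8, 0.7, 0.7, 0.6, 0.6, 0.6, 0.3, 0.2])
outcomes = np.array([1, 1, 0, 1, 0, 1, 0, 1, 0, 0])

# Brier score: mean squared error of the probabilities (lower is better;
# always saying 50% scores exactly 0.25).
print(f"Brier score: {np.mean((forecasts - outcomes) ** 2):.3f}")

# Calibration check: within each confidence bucket, does the hit rate
# match the stated odds?
buckets = {
    "under 50%": forecasts < 0.5,
    "50-79%": (forecasts >= 0.5) & (forecasts < 0.8),
    "80%+": forecasts >= 0.8,
}
for label, mask in buckets.items():
    print(f"stated {label}: hit rate {outcomes[mask].mean():.0%} "
          f"over {mask.sum()} forecasts")
```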