Michael Mentele

learning-review

Book Review: Thinking in Bets

Book Author: Annie Duke

The premise of this book is to think probabilistically. The author, Annie Duke, introduces this idea with a simple assertion: life is like poker, not chess.

Poker has unknown unknowns and known unknowns. Chess is deterministic and bounded; the best move can always be computed. But in poker, information is hidden from you. You don’t get to see your opponent’s hand.

In this world, how do you know whether you made good or bad choices? You can win through dumb luck and lose despite smart play. You have to deal with the fact that good decisions can still result in bad outcomes.

This is a key idea: good decisions can result in bad outcomes, and vice versa. This book advocates several shifts in thinking, and this is one of the most important.

A central premise of thinking in bets is that life is full of uncertainty; this book is about how to deal with that uncertainty.

The first chapter introduces the mistaken intuition that the right decision always gives the right result.

The rest of the book describes how to learn from noisy feedback, how to make good decisions (bets) without judging them purely by their results (since most decisions are probabilistic), and finally Duke hangs a lantern on our biases and presents strategies for overcoming them.

If you’ve read Daniel Kahneman’s ‘Thinking, Fast and Slow’ or similar, you will already be familiar with many of these cognitive errors, but Duke provides a compelling review and a framework (perspective) for reconsidering them.

I’m going to open the same way, but compress the rest of the book into what it really is: biases and how to deal with them.

Specifically: 1) how to make good decisions in a non-deterministic world, 2) why we don’t behave rationally, and 3) mental traps (biases).

How to Make Good Decisions

The premise of this book is that decisions are bets, and that if we treat them as explicit bets (like poker players do) we can make better decisions. Results are probabilistic, so bad outcomes can result from good decisions and vice versa.

If you make a good decision, i.e. one with a high probability of a good outcome, there will still be situations where you get unlucky and end up with a poor result.

An example Duke gives is a play call by Pete Carroll: with twenty-six seconds remaining and trailing by four points, it was second down on the one-yard line. Instead of running the ball, Carroll called for a pass. It was intercepted. Game over.

Was this a bad play call? Watch out! I’ve already biased you by telling you the outcome. Since you know the result, there is a strong pull to infer the decision was wrong. Poker players call this resulting.

But input is not output, not when uncertainty is involved.

Think about this: if I had described what Pete Carroll called but not told you they lost, and asked you if it was a good call, what would you have said?

Before you answer, consider that it was second down, the probability of an interception was extremely low, a pass was something the defense wasn’t expecting, and it had the potential to stop the clock. It was a good opening move. There was one problem, though: it didn’t work, and that completely colors how we interpret the decision.

We want to think Pete Carroll made a stupid choice because we know what the outcome was. But think about it probabilistically: the outcome that happened had a low chance of occurring, the chance of scoring was much larger, and an incompletion, the most likely outcome, was still desirable.
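Resulting can be made concrete with a tiny simulation. The numbers below are illustrative assumptions, not figures from the book: a decision that succeeds 98% of the time still fails occasionally, and a single failure tells you almost nothing about the decision’s quality.

```python
import random

random.seed(42)  # reproducible runs

def success_count(p_success, trials):
    """Count how often a decision with the given success probability pays off."""
    return sum(random.random() < p_success for _ in range(trials))

# A "good" call: assume a 98% chance the pass is NOT intercepted.
# Any single game can still land in the unlucky 2%; only many repetitions
# reveal that the decision was sound.
trials = 10_000
wins = success_count(0.98, trials)
print(wins / trials)  # close to 0.98 over many trials
```

Judging the call by one game is like judging this strategy by one trial: the sample is far too small to separate luck from decision quality.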

Why We Don’t Behave Rationally

We didn’t evolve to be rational; we evolved to make causal connections from events to outcomes, even when they are wrong. Consider this: is it better to associate the rustle of foliage with a lion or not? Even if it’s only a lion 1 in 100 times, it’s better to think it’s a lion every time… Why? Because it only has to be a lion once.

A type II error (a false negative) can result in death, i.e. the lion caused the rustle and you ignored it, whereas a type I error (a false positive) has little cost: the wind caused the rustle and you ran away or became more alert.
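The asymmetry can be sketched as an expected-cost comparison. The probability and costs below are made-up illustration values, not numbers from the book:

```python
# Hypothetical costs for the rustling-foliage decision.
p_lion = 0.01              # 1 in 100 rustles is actually a lion
cost_flee = 1              # small, fixed cost of running away (energy, time)
cost_missed_lion = 10_000  # catastrophic cost of ignoring a real lion

always_flee = cost_flee                    # pay the small cost every time
always_ignore = p_lion * cost_missed_lion  # expected cost of never fleeing

print(always_flee, always_ignore)  # fleeing every time is far cheaper
```

With these numbers the jumpy heuristic wins by two orders of magnitude, which is why evolution favors the false positive here.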

In modern life this isn’t really the case. The stakes usually aren’t as high, so we can afford to lose occasionally if making good decisions is profitable over the long term. For example, in poker you wouldn’t want to assume your opponent has an amazing hand every time they raise or bet; you’d soon go broke.

A key takeaway is that you can’t really know whether a decision is good or bad without enough data. In poker, you need to play a lot of hands before you have a signal that a strategy is good or bad.

In the rest of this article, I’ll go through the biases and relay strategies for dealing with them.

Mental Traps

  • confirmation bias is the grand-daddy: we tend to use evidence to confirm beliefs we already hold. That is, we are already sold on (anchored to) an idea and only take in evidence that bolsters it. One interesting claim is that experts are more biased; the more knowledge you have in a field, the more you tend to use data to confirm existing opinions. The more ‘intelligent’ or fluent you are in a technology, the better you are at rationalizing data to conform to your expectations.
  • cognitive homeostasis and motivated reasoning are where we interpret data so that it supports our current beliefs, very similar to confirmation bias.
  • blind spot bias is the fact that it is easier to recognize bias in others than in ourselves; it gets worse the more intelligent or expert you are
  • self-serving bias is where we take credit for the wins but not for our part in bad outcomes
  • black and white thinking is where it is easier to draw hard lines than to reason through the specific context of an idea or situation
  • hindsight bias is where we assume, looking back with knowledge of the actual outcome, that we could have known it in advance. We cherry-pick the data and draw connections between past events and the result, but knowing the outcome biases our view of the data and our judgement of the decision; we may make a cause-effect connection that is not truly there.
  • outcome bias and resulting: judging whether our decision making is good or bad based on a small set of outcomes. This works in bounded arenas like chess but fails in games like poker and in life. Resulting means judging the quality of decisions by the outcomes that occurred, even though good decisions don’t always produce good outcomes. This is the case for controversial football calls: if the call had succeeded it would have been ‘brilliant’, but because it failed it is ‘stupid’ or a disaster. One good habit is to withhold the outcome when evaluating a decision.
  • credulity bias: we believe stories and anecdotes because sharing experience with others is an evolutionary advantage, but we can challenge the beliefs in these stories by attaching stakes so that people actually evaluate them, i.e. wanna bet on it?
  • authority bias and credibility bias: we use authority as a heuristic for correctness rather than judging the merit of an idea. If it is important, do your own thinking and look at the data yourself
  • temporal discounting is where the future is discounted relative to the present. Unfortunately, success comes from long-range thinking and delayed gratification. To overcome this bias we can use the 10/10/10 rule to time travel and imagine future outcomes in the here and now: put yourself in the shoes of your future self in 10 minutes, 10 months, and 10 years and see how you would feel.
  • Rashomon effect: narratives diverge based on perspective

Other general strategies to help you think in bets:

  • form a truth-seeking group with an explicit agreement; you can improve if you are accountable to a group whose aim is accuracy. Members need to be able to question each other but have a good enough relationship that the questioning is not toxic. Ingredients: focus on accuracy, explicit accountability, openness
  • follow CUDOS
    • Communism: data belongs to the group (nothing to do with politics)
    • Universalism: apply the same standards to information regardless of its source
    • Disinterestedness: fight conflicts of interest; don’t be attached to outcomes
    • Organized Skepticism: encourage dissent
  • you will commonly find people falling prey to these biases, so you need to communicate well and question other people’s assertions by:
    • express uncertainty
    • lead with assent (what you agree with)
    • ask for agreement to be open minded
  • thinking long term will help you act more rationally
  • you can use Ulysses contracts to raise barriers against being irrational
  • pre-decide where possible to avoid bias in the moment
  • engage in scenario planning to explore the potential futures
    • weight rewards by the probability of success when choosing between options i.e. something worth $100 with a 10% chance of succeeding is worth $10
    • architect backwards, aka backcasting (page 219)
    • perform pre-mortems to imagine the ways you are likely to fail (more effective than imagining success)
    • working backwards helps you find reasons (causes) for future outcomes by creating a causal chain
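The probability-weighting rule from the scenario-planning list can be sketched in a few lines. The options and numbers are hypothetical, chosen only to show that a smaller prize with better odds can outweigh a long shot:

```python
def weighted_value(reward, p_success):
    """Weight a reward by its probability of success (expected value)."""
    return reward * p_success

# Hypothetical options: a $100 long shot at 10% vs. a $30 prize at 90%.
options = {
    "long shot": weighted_value(100, 0.10),  # ~ $10
    "safe bet": weighted_value(30, 0.90),    # ~ $27
}
best = max(options, key=options.get)
print(best)  # the safe bet wins despite the smaller headline reward
```

This is the same arithmetic as the $100-at-10%-is-worth-$10 example above, applied as a comparison between bets.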