Life is neither poker nor tennis
From Thinking in Bets by Annie Duke, page 10:
> Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. When we say “I should have known that would happen,” or, “I should have seen it coming,” we are succumbing to hindsight bias.
>
> Those beliefs develop from an overly tight connection between outcomes and decisions. That is typical of how we evaluate our past decisions.
This is from a chapter called “Life is poker, not chess.”
Before I start, I’d like to say that I think “Thinking in Bets” is a very good book and I would recommend that most people read it. It is probably the most accessible practical manual of rationality (there may well be better ones, but I don’t know of them), and most people’s thinking would be improved by a good primer in that.
Which is why I now want to explain why it’s also wrong about everything.
In discussions on Twitter about yesterday’s diatribe against The Inner Game of Tennis, I made the following point:
> The problem being that because he got his start in tennis, where verbal intelligence mostly does need to get out of the way, and because most people don’t have tennis-related trauma, his model works pretty well there but has fatal flaws when you try to generalise it off the court.
Gallwey is making a classic error: He is a nerd, and he thinks his special interest is universally applicable. Most of the advice in The Inner Game of Tennis is very good if you only ever apply it on the tennis court.
It’s absolutely a good thing to take ideas from a specific nerddom and try to generalise them. I love doing that, and claiming that it was bad would be the rankest hypocrisy on my part. The problem is not generalising his tennis-related learning off the tennis court; the problem is that having learned all these skills in a context where they are useful close to 100% of the time, you do not learn to understand the preconditions under which those skills are useful.
Annie Duke has made the same mistake: She has learned a skill set within the context of a game of poker, and is now viewing the relevant skill set as applicable to life as a whole. Unfortunately, poker has some very distinctive features that life as a whole lacks:
- It is intrinsically adversarial.
- It is highly structured.
- The probabilities of different outcomes are well defined and easily calculable.
- It has a single well-defined utility function (money made) describing success in the game.
As you may or may not be aware, I love to hate on Subjective Expected Utility Theory, but poker is an example where utility theory is great, because you really do have a utility function and you really do have probabilities. Regardless of whether you believe in SEU, life outside poker simply isn’t like that: the probabilities and the utility function are at best much more complex and less well defined.
(Note: thanks to the Kelly criterion, the utility function of poker is not straight-up money made but its logarithm; the point still stands.)
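For the curious, the claim about the logarithm is the standard Kelly setup. The book doesn’t spell it out, and what follows is the usual textbook idealisation rather than anything of Duke’s: for a bet that pays $b$ times your stake with probability $p$ (and loses the stake otherwise), staking a fraction $f$ of your bankroll gives an expected log growth rate of

$$g(f) = p \log(1 + fb) + (1 - p)\log(1 - f),$$

which is maximised at the Kelly fraction

$$f^* = p - \frac{1 - p}{b}.$$

The logarithm shows up because bankrolls compound multiplicatively, so maximising money made on a single bet and maximising long-run growth are different targets.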
Poker is an excellent game with which to hone your skills for a particular type of technical rationality, but technical rationality will only take you so far.
C. Thi Nguyen suggests that games are ways of experimenting with different structures of agency. In playing a game, you are entering a world in which your choices are constrained to follow the rules of the game, and so you learn to play as if those were the only choices possible. As a result you learn skills that you have been neglecting in other contexts, where those constraints on your agency do not apply. Learning the inner game of tennis will teach you about embodied cognition, because you can’t think your way out of the problem. Learning poker will teach you about technical rationality, because you can’t sweet-talk your way out of the problem. Both embodied cognition and technical rationality are very good skills that you can apply in life more broadly, but life lacks the restricted structure of agency that the game had. The reason the game taught you the skill so well is precisely that the skill is not universally applicable.
David Chapman has recently published an updated introduction, which I rather like, to his Eggplant book, about what he calls metarationality. In particular:
> [People with metarational skills] produce these insights by investigating the relationship between a system of technical rationality and its context. The context includes a specific situation in which rationality is applied, the purposes for which it is used, the social dynamics of its use, and other rational systems that might also be brought to bear. This work operates not within a system of technical rationality, but around, above, and then on the system.
This applies more broadly than technical rationality: The foundational skill of life is figuring out what other skills you need to bring to bear on the situation. You cannot learn that from a game, because games work precisely by removing the need for that skill. Games are an excellent way to hone certain skills, but they intrinsically lack the ability to teach you the fundamental life skill of knowing which game you are playing.
All of which is a very long-winded way of getting around to a very basic point prompted by the initial quote: Hindsight bias is good, actually.
Hindsight bias is generally bad in poker, where you can have very well-defined strategies and intrinsically quantifiable uncertainty, but in life more broadly hindsight bias serves several fairly important functions. For example:
- It forces you to look at problems much more carefully than you otherwise would have, giving you incentives to improve your skills.
- It lets you hold people accountable for actions even when they are individually harmless.
An example of the first is that when writing it is generally worth assuming that any misunderstandings are your fault. They often won’t be - many people are very bad readers - but by treating them as your fault anyway your writing skills will generally improve.
An example of the second is something I often use as a moral test case, and it requires a bit more of an extended anecdote.
Near my parents’ place there is a more-or-less blind bend in the main road with a small road that joins on to the main road right after that bend. It is very poor design.
Almost nobody comes down that small road. I’ve seen someone joining on to the main road from there maybe two or three times during many years of driving that road.
It’s entirely viable to take the bend at 60mph if you ignore the side road. It’s not that sharp a bend. However, the signage wants you to take it at 25mph in order to account for that basically-never-used side road.
Now, my question is this: How fast should you take that bend?
From a subjective expected utility point of view, it’s actually probably fine to take that bend at 60mph. Slowing down at that point in the drive is quite annoying, and the chances of causing an accident are extremely low given the low usage of that side road (and it’s not as if an accident is inevitable at 60mph any more than it’s entirely avoidable at 25mph; you’re just increasing the likelihood and severity). Yes, you’re trading convenience for an increased probability of your own and/or someone else’s death, but of course you are: you’re driving.
But, if everyone else makes the same calculation, driving in from that side road is now basically a death sentence.
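To make the shape of both calculations concrete, here is a deliberately toy sketch. Every number in it - the probabilities, the costs, the traffic level - is invented for illustration; none of it comes from the anecdote or from real accident statistics.

```python
# A toy model of the two calculations around the blind bend.
# All numbers are invented for illustration only.

# --- The individual driver's calculation --------------------------------
P_PULLING_OUT = 1e-5   # chance someone is mid-pull-out as you round the bend
P_CRASH_AT_60 = 0.5    # chance an encounter at 60mph becomes a crash
P_CRASH_AT_25 = 0.05   # chance an encounter at 25mph becomes a crash
CRASH_COST = 1e7       # disutility of a serious crash, arbitrary units
SLOWING_COST = 100.0   # disutility of braking for the bend, per trip


def per_trip_cost(p_crash: float, slowing_cost: float) -> float:
    """Expected disutility of one trip around the bend."""
    return P_PULLING_OUT * p_crash * CRASH_COST + slowing_cost


print(per_trip_cost(P_CRASH_AT_60, 0.0))           # 50.0  -> speeding "wins"
print(per_trip_cost(P_CRASH_AT_25, SLOWING_COST))  # 105.0

# --- The universalised calculation ---------------------------------------
# Seen from the side road, the question changes: what is the chance that a
# car rounds the bend during your pull-out, and that the encounter becomes
# a crash? If *everyone* takes the bend at 60, that risk is paid on every
# single pull-out, not spread thinly across all the drivers.
P_CAR_DURING_PULL_OUT = 0.5  # busy main road: a car often appears mid-manoeuvre

print(P_CAR_DURING_PULL_OUT * P_CRASH_AT_60)  # 0.25  per pull-out, all at 60
print(P_CAR_DURING_PULL_OUT * P_CRASH_AT_25)  # 0.025 per pull-out, all at 25
```

With these made-up numbers the individual calculation favours 60mph, while anyone actually pulling out faces a one-in-four chance of a crash. That gap between the per-driver view and the universalised view is the point of the anecdote.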
Subjective Expected Utility calculations often look very different if you universalise them by assuming that everyone else will make similar calculations, but SEU calculations are hard enough as it is without reasoning this way, and there’s an easy kludge to achieve much the same effect: Hindsight bias.
If we treat low-probability consequences as if we should have seen them coming, and punish them accordingly, then we change the incentives to shape individual behaviour away from creating outcomes that were inevitable in aggregate even if they were low probability in any given instance.
Is this unfair on the individual who happens to trigger the low-probability consequence? Probably. But so is dying because everyone else was making individually rational calculations that decided a low probability of your death was an acceptable consequence.
Is hindsight bias always good? No, of course not. You have to figure out from context whether it is or not. But that skill of figuring it out from context is not, I’m afraid, something you can learn from a game.