Most people, including “experts”, are poor at forecasting, but we can improve.
Did the outcome of the Brexit vote surprise you? If you were surprised because the “experts” were expecting a “remain” outcome, then you should note that experts getting it wrong is not an unusual event.
Issue: we’re poor at forecasting.
Professor Phil Tetlock ran a twenty-year study in which 284 experts in many fields (two-thirds of whom held PhDs), including government officials, professors, journalists, and others, and holding a wide range of opinions, from Marxists to free-marketeers, were asked to make over 80,000 predictions about the future.
The findings: the experts were only slightly more accurate than chance, and worse than basic computer algorithms. Yet they were systematically overconfident: they thought they knew more than they did.
Further evidence: projects of all kinds invariably cost more, take longer and deliver less than anticipated.
Insight: we’re led astray by our biases (overconfidence, self-serving bias, the planning fallacy, etc.)
Imperatives: Phil Tetlock offers ten commandments for forecasting.
1. Triage. Focus on questions where your hard work is likely to pay off.
2. Break seemingly intractable problems into tractable sub-problems.
3. Strike the right balance between inside and outside views. Start with the outside-view question: How often do things of this sort happen in situations of this sort?
4. Strike the right balance between under- and overreacting to evidence. The world doesn’t work the way we want it to, but it does signal to us when things change. If we pay attention and adapt, we let the world do most of the work for us.
5. Look for the clashing causal forces at work in each problem. For every good policy argument, there is typically a counterargument that is at least worth acknowledging. You need to understand all sides.
6. Strive to distinguish as many degrees of doubt as the problem permits but no more. The more degrees of uncertainty you can distinguish, the better.
7. Strike the right balance between under- and overconfidence, between prudence and decisiveness.
8. Look for the errors behind your mistakes but beware of rearview-mirror hindsight biases. It’s easy to justify or rationalize your failure. Don’t. You want to learn where you went wrong and determine ways to get better.
9. Bring out the best in others and let others bring out the best in you. Master (a) perspective taking (understanding the arguments of the other side so well that you can reproduce them to the other’s satisfaction), (b) precision questioning (helping others to clarify their arguments so they are not misunderstood), and (c) constructive confrontation (learning to disagree without being disagreeable).
10. Master the error-balancing bicycle. Just as you can’t learn to ride a bicycle by reading a physics textbook, you can’t become good at forecasting by reading training manuals. Learning requires doing, with good feedback that leaves no ambiguity about whether you are succeeding or failing (one way to score that feedback is sketched after this list).
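Unambiguous feedback means scoring forecasts numerically rather than arguing about them afterwards; Tetlock’s tournaments do this with Brier scores. Below is a minimal sketch in Python of that kind of scoring; the forecasts and outcomes are invented purely for illustration.

```python
# Minimal sketch: scoring probabilistic forecasts with a Brier score.
# The numbers below are made up for illustration only.

def brier_score(forecasts, outcomes):
    """Mean squared error between forecast probabilities and what happened.

    forecasts: probabilities (0.0-1.0) that each event occurs.
    outcomes:  1 if the event occurred, 0 if it did not.
    0.0 is a perfect score; always guessing 0.5 scores 0.25.
    (Tetlock's tournaments use a variant summed over outcome categories,
    which runs from 0 to 2, but for binary questions it ranks
    forecasters identically.)
    """
    return sum((p - o) ** 2 for p, o in zip(forecasts, outcomes)) / len(forecasts)

if __name__ == "__main__":
    # A forecaster who commits to probabilities vs. one who always hedges at 50/50.
    outcomes   = [1, 0, 1]
    committed  = [0.9, 0.2, 0.6]
    coin_flips = [0.5, 0.5, 0.5]
    print(f"committed forecaster: {brier_score(committed, outcomes):.3f}")   # 0.070
    print(f"always 50/50:         {brier_score(coin_flips, outcomes):.3f}")  # 0.250
```

Scored this way, there is no room to rationalize after the fact: either your probabilities tracked what actually happened, and your score falls, or they didn’t, and it rises.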
References:
1. "Expert Political Judgment: How Good Is It? How Can We Know?", Phil Tetlock (2005)
2. "Superforecasting: The Art and Science of Prediction", Phil Tetlock (2015)