If statistician Nate Silver – who successfully called the outcomes of all 50 states in the 2012 United States presidential election (he was 49 for 50 in 2008) – first piqued my interest in the general application of probability and statistics with “The Signal and the Noise: Why So Many Predictions Fail”, then this book is a helpful extension, and could perhaps even galvanise deeper involvement in the field. In the midst of my public policy education, where quantitative evaluation of government policies has gained traction, learning how to make effective forecasts should pay dividends in the future.
Premised upon “The Good Judgement Project” – which gathered voluntary and often amateur participants to make predictions on world events, and which emerged as a winner in forecasting tournaments organised by the Intelligence Advanced Research Projects Activity in America – Philip E. Tetlock and Dan Gardner’s “Superforecasting: The Art and Science of Prediction” details “superforecasters” who outperformed not only the average participant, but also intelligence professionals. Advocating a “try, fail, analyse, adjust, try again” cycle of forecasting, and drawing on interviews with and performance data of these top “superforecasters”, Tetlock and Gardner also moot ten commandments for aspirants:
1. Triage.
2. Break seemingly intractable problems into tractable sub-problems.
3. Strike the right balance between inside and outside views.
4. Strike the right balance between under- and over-reacting to evidence.
5. Look for the clashing causal factors at work in each problem.
6. Strive to distinguish as many degrees of doubt as the problem permits but no more.
7. Strike the right balance between under- and over-confidence, between prudence and decisiveness.
8. Look for the errors behind your mistakes but beware of rear-view-mirror hindsight bias.
9. Bring out the best in others and let others bring out the best in you.
10. Master the error-balancing bicycle.
They do add a cautious note – the same cautious, healthily sceptical note Tetlock and Gardner espouse throughout the book – that “commandments [should not be treated] as commandments”. Overall, “Superforecasting” is a good mix of findings from “The Good Judgement Project”, anecdotes from the “superforecasters”, and important characteristics of probabilistic thinking, such as having a growth mindset. What might have been even more enriching, I thought, was: i. an example of how a “superforecaster” worked through a question, from start to finish; ii. an example of how a more regular individual might process the same question; and iii. how these two procedures differ. Walking through the steps of a “superforecaster” would have yielded further insights.
Like Nate Silver (who has previously cited Tetlock’s work), the writers allude to the warrior-poet Archilochus’s line that “the fox knows many things but the hedgehog knows one big thing”. Forecasters, they argue, have to be more “foxy”: paying attention to a wide variety of issues and taking a more pluralistic approach in response to questions.