Over the new year I read the perfect book for the start of the year. In Superforecasting, Philip Tetlock and Dan Gardner provide a roadmap for becoming a better forecaster, with small, progressive steps to improve any prediction you make on almost any topic. This is not just a book for political pundits and economists: I would recommend it to anyone in the marketing sciences, including researchers, who makes a living from interpreting and synthesising information to inform business decision-making.
Their overall thesis is very simple. Being an “expert” does not make you better at making predictions, especially about your own specialism. In fact, in many cases experts make worse predictions than those with a wide general knowledge (back to the hedgehog and the fox). However, Philip Tetlock has shown that a small number of people are consistently better than others at making predictions of the future across a wide variety of topics. This book discusses the secrets of their success.
These secrets are really the best practice of interpreting any kind of evidence: gathering evidence from multiple sources, thinking probabilistically (i.e. like a Bayesian), working in teams, being open to learning new things and, above all, learning from your mistakes and making corrections as you go along. As George Box famously said, “All models are wrong, but some are useful”, and this is the guiding principle of the book.
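To make the Bayesian idea concrete, here is a minimal sketch of belief updating. The numbers and the product-launch scenario are entirely hypothetical, invented for illustration; the book itself presents the idea in prose, not code.

```python
# Bayesian updating: revise a probability estimate as new evidence arrives.
# All figures below are hypothetical, for illustration only.

def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return P(H | evidence) via Bayes' rule."""
    numerator = p_evidence_given_h * prior
    denominator = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / denominator

# Start with a 30% belief that a product launch will beat target.
prior = 0.30
# Suppose a strong pre-launch survey is 80% likely if the launch succeeds,
# but only 20% likely if it does not.
posterior = bayes_update(prior, 0.80, 0.20)
print(round(posterior, 3))  # 0.632: belief rises, but well short of certainty
```

The point, as the book stresses, is incremental correction: each new piece of evidence nudges the estimate rather than flipping it to 0% or 100%.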
They summarise the skills and strengths of a superforecaster in a pen portrait, making the following points (among others). Good forecasters:
- are cautious because nothing is certain
- are humble because real life is complex (and so are most forecasting problems)
- are open-minded and test their beliefs rather than protect them
- are intellectually curious and enjoy mental challenges
- are introspective and self-critical
- are comfortable with numbers (with a mix of quantitative and qualitative mindsets in the parlance of market research)
- are pragmatic and willing to change their mind
- are analytical and capable of looking at other points of view
- are “dragonfly-eyed” with a 360 view of problems and an ability to synthesise
- think probabilistically (like a Bayesian)
- are able to update their opinions when the facts change
- have an intuitive understanding of human psychology and cognitive biases
- positively seek to continuously improve
- have strong determination in the face of adversity
What a job description for anyone who analyses any kind of research data! On top of these attributes, the authors also discuss the value of working in groups, as long as the groups bring a diversity of opinion and an openness to discussion and disagreement. Their evidence clearly shows that greater diversity increases the chance that some team members hold important information that others don’t, broadening the evidence available and acting as an important counterbalance to a single-minded approach to making predictions.
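One way to see why pooling diverse opinions helps is a small arithmetic sketch. The forecasts below are made-up numbers, not from the book, but the underlying fact is general: the average forecast's squared error (its Brier score) is never worse than the average of the individual errors.

```python
# Pooling diverse forecasts: the averaged forecast scores at least as well
# as the typical individual forecaster (a consequence of Jensen's inequality).
# The five probabilities below are hypothetical, for illustration.

forecasts = [0.60, 0.75, 0.40, 0.65, 0.55]  # five forecasters, one event
outcome = 1.0                               # the event happened

def brier(p, outcome):
    """Squared error of a probability forecast; lower is better."""
    return (p - outcome) ** 2

crowd = sum(forecasts) / len(forecasts)  # simple average: 0.59
crowd_score = brier(crowd, outcome)
mean_individual_score = sum(brier(p, outcome) for p in forecasts) / len(forecasts)
print(crowd_score <= mean_individual_score)  # True
```

The group average cancels some of the individual errors, which is one reason a diverse team can outperform its typical member.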
I can recommend Superforecasting to anyone interested in improving the predictions that they make. And there’s no better time to start applying these lessons than in the first week of 2016.
Superforecasting: The art and science of prediction by Philip Tetlock & Dan Gardner