Tetlock research
goodjudgment.com
Data Status
Full text fetched Dec 28, 2025
Summary
Philip Tetlock's research on Superforecasting identifies a group of forecasters who consistently outperform traditional forecasting methods by applying rigorous analytical techniques and probabilistic thinking.
Key Points
- Superforecasters consistently outperform traditional experts by 30% in predictive accuracy
- Successful forecasting relies on probabilistic thinking and methodical analysis
- Predictive skills can be systematically identified, trained, and improved
Review
Tetlock's groundbreaking research on Superforecasting emerged from a US intelligence community-funded project that challenged conventional wisdom about predictive accuracy. The Good Judgment Project, led by Tetlock and Barbara Mellers, demonstrated that a select group of forecasters could consistently outperform professional intelligence analysts by approximately 30%, even when those analysts had access to classified information.
The research has profound implications for decision-making across multiple domains, including government, finance, energy, and nonprofit sectors. By identifying and training individuals with specific cognitive traits and methodological approaches, Superforecasting offers a systematic approach to reducing uncertainty and improving strategic planning. The work highlights the importance of probabilistic thinking, continuous learning, and carefully calibrated predictions over dogmatic or overconfident forecasting methods.
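The calibration the review describes is typically scored with the Brier score, the accuracy metric used in Tetlock's forecasting tournaments: the mean squared difference between probability forecasts and binary outcomes, where lower is better. A minimal sketch (the example forecasters and their numbers are hypothetical, for illustration only):

```python
def brier_score(forecasts, outcomes):
    """Mean squared error between probability forecasts (0.0-1.0) and
    binary outcomes (0 or 1). 0.0 is perfect; always guessing 0.5 scores 0.25."""
    return sum((f - o) ** 2 for f, o in zip(forecasts, outcomes)) / len(forecasts)

# Hypothetical outcomes for five yes/no questions (1 = event happened)
outcomes = [1, 0, 1, 1, 0]

# A well-calibrated forecaster hedges appropriately...
calibrated = [0.8, 0.2, 0.7, 0.9, 0.3]
# ...while an overconfident one pushes probabilities to the extremes
overconfident = [1.0, 0.0, 0.3, 1.0, 0.9]

print(brier_score(calibrated, outcomes))     # small penalty on every question
print(brier_score(overconfident, outcomes))  # large penalty when wrong
```

The asymmetry is the point: the overconfident forecaster scores perfectly on questions they get right but incurs severe penalties on misses, so over many questions the calibrated forecaster's average score is lower (better), mirroring the superforecasters' edge the review describes.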
Cited by 3 pages
| Page | Type | Quality |
|---|---|---|
| Metaculus | Organization | 50.0 |
| AI-Augmented Forecasting | Approach | 54.0 |
| Prediction Markets (AI Forecasting) | Approach | 56.0 |
Resource ID:
664518d11aec3317 | Stable ID: MzY3Y2U2ZG