Longterm Wiki

A Gentle Introduction to Risk Frameworks Beyond Forecasting

Author

pendingsurvival

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: LessWrong

A LessWrong post offering an accessible survey of risk analysis frameworks for AI safety audiences who may be overly reliant on probabilistic forecasting; useful background for those working on governance or decision-making under deep uncertainty.

Metadata

Importance: 52/100 · educational

Summary

This post introduces risk management frameworks that go beyond probabilistic forecasting, arguing that uncertainty about AI risks cannot always be captured by probability estimates alone. It surveys approaches like scenario planning, robust decision-making, and precautionary principles that remain useful when probabilities are unknown or contested. The piece aims to broaden the toolkit for thinking about AI safety and existential risk.

Key Points

  • Probabilistic forecasting has limits when dealing with deep uncertainty, novel risks, or situations where we lack reliable base rates.
  • Alternative frameworks such as scenario planning, robust decision-making, and the precautionary principle can complement or replace forecasting in high-stakes contexts.
  • Risk frameworks from finance, engineering, and public health offer transferable lessons for AI safety practitioners.
  • Choosing the right risk framework depends on the type of uncertainty involved and the decision context.
  • A pluralistic approach to risk analysis may be more appropriate for existential and catastrophic AI risks than relying solely on probability estimates.

Cached Content Preview

HTTP 200 · Fetched Apr 10, 2026 · 61 KB
# A Gentle Introduction to Risk Frameworks Beyond Forecasting
By pendingsurvival
Published: 2024-04-11
*This was originally posted on Nathaniel's and Nuno's substacks (*[*Pending Survival*](https://nathanielcooke.substack.com/p/a-gentle-introduction-to-risk-frameworks) *and* [*Forecasting Newsletter*](https://forecasting.substack.com/p/a-gentle-introduction-to-risk-frameworks)*, respectively). Subscribe* [*here*](https://nathanielcooke.substack.com/) *and* [*here*](https://substack.com/@forecasting)*!*  
  
*Discussion is also occurring on the EA Forum* [*here*](https://forum.effectivealtruism.org/posts/yqquAt4JLzqKyRMQB/a-gentle-introduction-to-risk-frameworks-beyond-forecasting) *(couldn't link the posts properly for technical reasons).*

Introduction
------------

When the Effective Altruism, Bay Area rationality, judgemental forecasting, and prediction markets communities think about risk, they typically do so along rather idiosyncratic and limited lines. These overlook relevant insights and practices from related expert communities, including the fields of disaster risk reduction, safety science, risk analysis, science and technology studies—like the sociology of risk—and futures studies.

To remedy this state of affairs, this document—written by Nathaniel Cooke and edited by Nuño Sempere—(1) explains how disaster risks are conceptualised by risk scholars, (2) outlines Normal Accident Theory and introduces the concept of high-reliability organisations, (3) summarises the differences between “sexy” and “unsexy” global catastrophic risk (GCR) scenarios, and (4) provides a quick overview of the methods professionals use to study the future. This is not a comprehensive overview, but rather a gentle introduction.

**Risk** has many different definitions, but this document works with the IPCC definition of the “potential for adverse consequences”, where risk is a function of the **magnitude** of the consequences and the **uncertainty** around those consequences, recognising a diversity of values and objectives.1 Scholars vary on whether it is always possible to measure uncertainty, but there is a general trend to assume that some uncertainty is so extreme as to be practically unquantifiable.2 Uncertainty here can reflect both objective likelihood and subjective epistemic uncertainty.

1\. Disaster Risk Models
------------------------

A common saying in disaster risk circles is that “there is no such thing as a natural disaster”. On this view, hazards may arise from nature, but an asteroid striking Earth can only threaten humanity because our societies currently rely on vulnerable systems that an asteroid could disrupt or destroy.3

This section will focus on how disaster risk reduction scholars break risks down into their components, model the relationship between disasters and their root causes, structure the process of risk reduction, and conceptualise the biggest and most complex of the risks they study.

The field of global catastrop

... (truncated, 61 KB total)
Resource ID: f5b742f8d2794e61 | Stable ID: sid_QG813SozNV