Longterm Wiki

Credibility Rating

Good (3/5)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: 80,000 Hours

This interview accompanies Toby Ord's book 'The Precipice' (2020) and serves as an accessible entry point to existential risk thinking, particularly relevant for understanding how AI risk fits within the broader landscape of civilizational-scale risks.

Metadata

Importance: 72/100
podcast episode · educational

Summary

A comprehensive podcast interview with philosopher Toby Ord discussing his book 'The Precipice', covering quantitative estimates of existential risks from natural and anthropogenic sources including AI, bioweapons, nuclear war, and climate change. Ord argues humanity is at a uniquely dangerous 'hinge of history' and outlines both the moral case for prioritizing existential risk reduction and practical policy recommendations.

Key Points

  • Ord provides specific probability estimates for various existential risks this century, with engineered pandemics and unaligned AI ranked as the highest-probability anthropogenic threats.
  • Natural risks (asteroids, supervolcanoes, stellar threats) are estimated to be much lower than anthropogenic risks, suggesting human-caused dangers dominate the overall risk landscape.
  • The 'hinge of history' concept argues that decisions made now about technology and governance may be uniquely consequential for humanity's long-term trajectory.
  • Climate change is discussed as a risk factor that could amplify other existential risks rather than being an existential risk in isolation for most scenarios.
  • The interview covers career and policy recommendations for those wanting to reduce existential risk, connecting abstract philosophy to actionable priorities.

Cited by 1 page

Page: Toby Ord · Type: Person · Quality: 41.0

Cached Content Preview

HTTP 200 · Fetched Apr 7, 2026 · 98 KB
Toby Ord on the precipice and humanity's potential futures | 80,000 Hours

 On this page:

 Introduction 
 1 Highlights 
 2 Articles, books, and other media discussed in the show 
 3 Transcript 
 3.1 Rob's intro [00:00:00] 
 3.2 The interview begins [00:02:15] 
 3.3 What Toby learned while writing the book [00:05:04] 
 3.4 Estimates for specific x-risks [00:08:10] 
 3.5 Asteroids and comets [00:16:52] 
 3.6 Supervolcanoes [00:24:27] 
 3.7 Threats from space [00:33:06] 
 3.8 Estimating total natural risk [00:36:34] 
 3.9 Distinction between natural and anthropogenic risks [00:45:42] 
 3.10 Climate change [00:51:08] 
 3.11 Risk factors [01:10:53] 
 3.12 Biological threats [01:26:59] 
 3.13 Nuclear war [01:36:34] 
 3.14 Artificial intelligence [01:48:55] 
 3.15 Dealing with big uncertainties [01:59:17] 
 3.16 The hinge of history [02:15:20] 
 3.17 Reasons for optimism [02:34:34] 
 3.18 A vision for how the future could go well [02:44:01] 
 3.19 Policy recommendations [02:59:49] 
 3.20 Careers [03:04:54] 
 3.21 Rob's outro [03:13:29] 
 
 4 Learn more 
 5 Related episodes 

 “The Precipice” is a time where we’ve reached the ability to pose existential risk to ourselves, which is substantially bigger than the natural risks, the background that we were facing before. And this is something where I now think that the risk is high enough, that this century, it’s about one in six.

 — Dr Toby Ord

 This week Oxford academic and advisor to 80,000 Hours Toby Ord released his new book The Precipice: Existential Risk and the Future of Humanity. It’s about how our long-term future could be better than almost anyone believes, but also how humanity’s recklessness is putting that future at grave risk: by Toby’s reckoning, a 1 in 6 chance of being extinguished this century.

 I loved the book and learned a great deal from it.

 While preparing for this interview I copied out 87 facts that were surprising to me or seemed important. Here’s a sample of 16:

 The probability of a supervolcano causing a civilisation-threatening catastrophe in the next century is estimated to be 100x that of asteroids and comets combined.
 The Biological Weapons Convention — a global agreement to protect humanity — has just four employees, and a smaller budget than an average McDonald’s.
 In 2008 a ‘gamma ray burst’ reached Earth from another galaxy, 10 billion light years away. It was still bright enough to be visible to the naked eye. We aren’t sure what generates gamma ray bursts but one cause may be two neutron stars colliding.
 Before detonating the first nuclear weapon, scientists in the Manhattan Project feared that the high temperatures in the core, unprecedented for Earth, might be able to ignite the hydrogen in water. This would set off a self-sustai

... (truncated, 98 KB total)
Resource ID: 35cc64aad5b46421 | Stable ID: sid_KnFvTSOIUn