
CSER (Centre for the Study of Existential Risk)

Last edited: 2026-02-01
  • Founded: 2012
  • Location: University of Cambridge, UK
  • Key focus: Existential risks from AI, biotechnology, climate, and nuclear threats
  • Notable funding: $1M from the Future of Life Institute (2023); over $200K from the Survival and Flourishing Fund
  • Key publications: “The Malicious Use of Artificial Intelligence” (2018); “Climate Endgame” in PNAS (2022)
  • Leadership: Professor Matthew Connelly (Director, from 2023); Dr. Seán Ó hÉigeartaigh (Executive Director)

The Centre for the Study of Existential Risk (CSER) is an interdisciplinary research centre within the University of Cambridge that studies threats capable of causing human extinction or civilizational collapse. Founded in 2012 by philosopher Huw Price, cosmologist Lord Martin Rees, and Skype co-founder Jaan Tallinn, CSER represents one of the first major academic institutions dedicated to existential risk research.[1][2]

CSER’s research spans four primary domains: risks from artificial intelligence, extreme technological risks, global catastrophic biological risks, and extreme environmental risks including climate change.[3] The centre operates through a three-pillar strategy focused on advancing understanding of existential risks through rigorous research, developing collaborative mitigation strategies, and building a global field of researchers, technologists, and policymakers committed to addressing these challenges.[4]

Hosted within Cambridge’s Centre for Research in the Arts, Social Sciences and Humanities (CRASSH), CSER has produced influential research including publications in Nature and other top-tier journals, organized major conferences on catastrophic risk, and advised governments and international organizations including the UN, WHO, and OECD on pandemic preparedness, nuclear risks, and AI governance.[5][6]

CSER was established in 2012 through an unusual collaboration between a philosopher (Huw Price, Bertrand Russell Professor of Philosophy at Cambridge), a scientist (Lord Martin Rees, Astronomer Royal and former President of the Royal Society), and a software entrepreneur (Jaan Tallinn, co-founder of Skype and early investor in Anthropic).[7][8] The founders were motivated by concerns that advancing technologies—particularly artificial intelligence, biotechnology, nanotechnology, and anthropogenic climate change—posed extinction-level risks that were comparatively neglected in academia.[9]

Jaan Tallinn, who had begun engaging with the existential risk community in 2009, provided seed funding for the centre’s establishment.[10] In its early years (circa 2013), CSER submitted ambitious grant applications including a proposal to the European Research Council for a “New Science of Existential Risk” five-year program, which was highly ranked but ultimately not funded.[11]

By 2015, CSER had secured initial time-limited grants primarily focused on philosophy and social science research, funding operations through mid-2018.[12] The centre began building expertise to support future science, technology, and AI safety grant applications. During this period, CSER developed the TERRA bibliography tool for existential risk publications and began organizing academic conferences linking decision theory and AI safety starting in 2017.[13]

In 2018, CSER achieved significant recognition with two major publications: The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation (co-authored with tech companies and security think-tanks) and An AI Race for Strategic Advantage: Rhetoric and Risks, which won the Best Paper award at the AAAI/ACM Conference on AI, Ethics, and Society.[14] The centre also established the UK’s first All-Party Parliamentary Group for Future Generations during this period.[15]

In April 2020, Dr. Catherine Rhodes took on the role of Executive Director, with Dr. Seán Ó hÉigeartaigh serving as Co-Director.[16] The centre’s research output accelerated significantly: in 2022 alone, CSER produced 24 publications including papers in Nature, Nature Sustainability, and the Proceedings of the National Academy of Sciences. Notable works included “Climate Endgame: Exploring catastrophic climate change scenarios” and “Huge volcanic eruptions: time to prepare,” both of which received extensive media coverage.[17]

In July 2023, the Future of Life Institute granted $1 million to the University of Cambridge specifically for CSER, funding a full five-year term for Professor Matthew Connelly as the centre’s new Director.[18] This was complemented by a multi-million-dollar endowment from Carl Feinberg (via Cambridge in America) establishing the Rees Feinberg Professor of Global Risk, aimed at supporting CSER’s long-term expansion and permanence.[19]

Recent activities include the Cambridge Conference on Catastrophic Risk (September 2024), which brought together researchers, diplomats, UN representatives, and government officials to discuss emerging risks including biological and technological threats, space warfare, and systemic resilience.[20]

CSER has been actively engaged in AI safety research since its founding. The centre organized a series of academic conferences on decision theory and AI safety beginning in 2017, exploring the theoretical foundations necessary for developing safe artificial intelligence systems.[21]

The centre’s 2018 report The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation, produced in collaboration with technology companies and security think-tanks, examined how AI could be weaponized for physical attacks, digital security threats, and political disruption.[22] This work helped establish frameworks for understanding dual-use AI risks that continue to inform policy discussions.

CSER researchers have advised multiple governments and international organizations on AI and AGI governance, including consultations with the UN Secretary-General, UK government, EU, and US agencies.[23] The centre’s work emphasizes rigorous, multidisciplinary approaches to AI safety that bridge technical research, policy analysis, and ethical considerations.

Managing extreme technological risks, particularly in biotechnology, represents a core research area for CSER. The centre has conducted horizon-scanning work for the Biological Weapons Convention (BWC), identifying emerging biotechnologies that could pose catastrophic risks.[24] CSER researchers emphasize that governance bodies like the BWC struggle to keep pace with rapid advances in synthetic biology and gene editing technologies.[25]

During and after the COVID-19 pandemic, CSER significantly expanded its work on pandemic preparedness and biological catastrophic risks. In 2022, the centre advised the World Health Organization (WHO), governments, and international bodies on building resilience against future pandemics.[26] This work addresses both naturally occurring pandemic threats and the potential for engineered biological weapons.

CSER’s 2022 paper “Climate Endgame: Exploring catastrophic climate change scenarios,” published in the Proceedings of the National Academy of Sciences, examines worst-case climate scenarios that could lead to civilizational collapse or human extinction.[27] This research challenged the field to take seriously the tail risks of climate change beyond conventional projections.

The centre also published research on huge volcanic eruptions and the need to prepare for low-probability, high-impact geological events.[28] CSER’s environmental work is supported by funding from the Grantham Foundation for the Protection of the Environment.[29]

Martin Rees, CSER’s co-founder, co-organized workshops with the Vatican that influenced the 2015 Papal Encyclical on Climate Change and contributed to momentum for the Paris Agreement.[30]

CSER researchers have provided analysis and policy advice on nuclear risks, particularly following Russia’s invasion of Ukraine. The centre has advised governments on nuclear security, deterrence stability, and the intersection of emerging technologies with nuclear risks.[31]

CSER has developed a diverse funding base combining philanthropic support, institutional grants, and university integration. Major funding sources include:[32][33]

  • Future of Life Institute: $1,000,000 (2023) supporting the Director position and operations
  • Survival and Flourishing Fund: Over $200,000 (as of June 2022)
  • Templeton World Charity Foundation: Major research projects including the Managing Extreme Technological Risks program
  • Grantham Foundation for the Protection of the Environment: Environmental risk research
  • Hauser-Raspe Foundation, Blavatnik Foundation (public lectures), Libra Foundation, Musk Foundation, Milner Foundation: Additional project support
  • Carl Feinberg (via Cambridge in America): Multi-million-dollar endowment for the Rees Feinberg Professor of Global Risk

Jaan Tallinn provided critical seed funding in 2012 that enabled CSER’s establishment.[34]

CSER has established multiple international academic partnerships to expand existential risk research capacity. In 2018, the centre signed a Memorandum of Understanding with the Graduate School of Advanced Integrated Studies in Human Survivability (GSAIS) at Kyoto University, extended in 2023. This partnership supports joint research funding applications, regular workshops, faculty and student exchanges, and collaborative publications on topics including cascading natural risks and space technology in risk mitigation.[35]

CSER has advised on the establishment of global risk research programs at Australian National University, University of California Los Angeles (UCLA), and the University of Warwick.[36] The centre has organized two international Cambridge Conferences on Catastrophic Risk and over 30 specialized workshops on topics including cybersecurity, nuclear security, climate change, and gene drives.[37]

CSER maintains an active public engagement program including the CSER Public Lectures series (supported by the Blavatnik Foundation), which has been viewed over 500,000 times online.[38] The centre’s media engagement has been extensive, with research findings regularly covered by major international outlets.

Within the effective altruism and existential risk research communities, CSER is viewed as a key academic institution lending rigor and legitimacy to the field. The centre has a dedicated topic page on the EA Forum, where community members discuss CSER’s work and its alignment with effective altruism priorities.[39]

In March 2025, CSER hosted an “Exploring Careers in Existential Risk” event featuring speakers from 80,000 Hours and ERA Fellowship, connecting students and early-career researchers with opportunities in the field.[40] The centre has also developed self-guided educational trails and contributed researchers as Lead Authors to the IPCC’s Seventh Assessment Report.[41]

  • Huw Price: Bertrand Russell Professor of Philosophy at Cambridge; Academic Director
  • Lord Martin Rees: Astronomer Royal, former President of the Royal Society, Emeritus Professor of Cosmology and Astrophysics
  • Jaan Tallinn: Co-founder of Skype, early investor in Anthropic; provided seed funding
  • Professor Matthew Connelly: Director (from July 2023), supported by five-year position funded by Future of Life Institute
  • Dr. Seán Ó hÉigeartaigh: Executive Director/Co-Director, manages CSER within CRASSH
  • Dr. Catherine Rhodes: Former Executive Director (appointed April 2020), Academic Project Manager
  • SJ Beard: Senior Research Associate; works on ethics of extinction, extreme event methodologies, decision-maker constraints, and existential hope
  • Dr. Charlotte Hammer: Assistant Professor in Global Risk and Resilience
  • Dr. Luke Kemp: Research Associate; focuses on global catastrophic risks and civilizational collapse
  • Partha Dasgupta: Senior Advisor; co-organized Vatican workshops on climate and extinction

Available sources contain no direct criticisms or controversies targeting CSER specifically. However, some broader concerns affecting existential risk research are relevant to understanding CSER’s context:

Field Immaturity and Methodological Challenges

Existential risk studies remains a young, interdisciplinary subfield still developing consensus methodologies.[42] CSER researchers acknowledge that measuring extreme tail risks and validating models for unprecedented catastrophic events presents fundamental epistemic challenges. The centre’s TERRA bibliography project revealed data limitations in tracking existential risk research productivity, including undercounting of publications and incomplete capture of non-academic work.[43]

Research analyzing the broader existential risk field (including CSER-associated publications) has identified underrepresentation of women researchers, aligning with patterns observed in effective altruism-adjacent communities.[44] While not specific to CSER, this demographic pattern affects the diversity of perspectives in the field.

CSER researchers themselves identify significant barriers to effective existential risk mitigation, including short-termism in government decision-making, institutional lag (exemplified by UN Security Council ineffectiveness), and lack of representation for future generations in policy processes.[45] These structural challenges constrain the real-world impact of academic research, regardless of its quality.

CSER emphasizes that managing extreme technological risks remains “urgent” but “comparatively neglected” in academia, with governance institutions struggling to keep pace with technological advances in areas like synthetic biology.[46]

  1. Impact measurement: While CSER has produced numerous publications and policy engagements, measuring the actual reduction in existential risk resulting from this work remains extremely difficult. How effectively does academic research translate into policy changes that meaningfully reduce catastrophic risks?

  2. Research prioritization: With limited resources and multiple potential existential threats, how should CSER balance attention across AI risks, biological threats, climate scenarios, and other catastrophic risks? Are current priorities optimally calibrated to actual risk levels?

  3. Institutional growth trajectory: With new endowed positions and expanded funding, can CSER scale its research capacity while maintaining quality and interdisciplinary rigor? What is the optimal size for an existential risk research centre?

  4. Policy influence mechanisms: What are the most effective pathways for translating existential risk research into policy action? Should CSER prioritize direct government advising, public engagement, training future policymakers, or other approaches?

  5. Collaboration vs. competition: As the existential risk research field grows with multiple institutions now active, how should CSER balance collaborative field-building with maintaining its distinctive research identity and institutional competitiveness for funding?

  1. EA Forum - Centre for the Study of Existential Risk

  2. CSER Overview - Cambridge

  3. CSER Research Areas

  4. CSER Impact Strategy

  5. Future of Life Institute Grant Announcement

  6. CSER 2022 Activities Report

  7. CSER Founding History

  8. Jaan Tallinn Profile

  9. CSER Mission Statement

  10. Jaan Tallinn and Existential Risk

  11. CSER Early Grant Applications

  12. CSER Funding History 2015-2018

  13. CSER AI Safety Conferences

  14. The Malicious Use of AI Report

  15. UK All-Party Parliamentary Group for Future Generations

  16. CSER Leadership Changes 2020

  17. Climate Endgame Paper - PNAS

  18. FLI Grant to CSER July 2023

  19. Rees Feinberg Professorship Endowment

  20. Cambridge Conference on Catastrophic Risk 2024

  21. CSER Decision Theory Conferences

  22. Malicious AI Report

  23. CSER Policy Advisory Work

  24. CSER BWC Horizon Scanning

  25. Managing Extreme Technological Risks

  26. CSER Pandemic Advisory 2022

  27. Climate Endgame Paper - PNAS

  28. Volcanic Eruptions Paper - Nature

  29. CSER Funding Sources

  30. Vatican Workshop on Climate

  31. CSER Nuclear Risk Advisory

  32. FLI Grant 2023

  33. CSER Major Supporters

  34. Jaan Tallinn Seed Funding

  35. CSER-GSAIS Partnership MoU

  36. CSER Global Risk Project Advisory

  37. CSER Conferences and Workshops

  38. CSER Public Lectures

  39. EA Forum CSER Topic Page

  40. Exploring Careers in Existential Risk Event 2025

  41. CSER Researchers Join IPCC

  42. Existential Risk Studies Field Review

  43. TERRA Database Limitations

  44. Gender Representation in X-Risk Research

  45. Long Problems Lecture 2025

  46. Managing Extreme Tech Risks