Longterm Wiki

Statement on Superintelligence

web
superintelligence-statement.org/

A public petition hosted by the Future of Life Institute calling for a prohibition on superintelligence development until there is broad scientific consensus on safety and strong public buy-in, representing a significant coordinated public stance on existential AI risk.

Metadata

Importance: 55/100 · other · primary source

Summary

This petition, hosted by the Future of Life Institute, calls for a prohibition on the development of superintelligence until there is broad scientific consensus that it can be done safely and controllably, and strong public buy-in. It highlights concerns including human economic obsolescence, loss of civil liberties, national security risks, and potential human extinction. The statement has gathered nearly 70,000 signatures.

Key Points

  • Calls for a prohibition on superintelligence development, not lifted until broad scientific consensus on safe and controllable development exists.
  • Requires strong public buy-in before any prohibition on superintelligence development is lifted.
  • Highlights risks including human economic obsolescence, loss of freedoms, civil liberties, national security threats, and potential human extinction.
  • Distinguishes between beneficial AI tools and the specific goal of building superintelligence that outperforms all humans on all cognitive tasks.
  • Hosted by the Future of Life Institute and has gathered nearly 70,000 signatures including from experts and public figures.

1 FactBase fact citing this source

Cached Content Preview

HTTP 200 · Fetched Apr 7, 2026 · 1 KB

69,817 signatures, including 5,000 from the same petition by Ekō

 

 Statement on Superintelligence 

Context: Innovative AI tools may bring unprecedented health and prosperity. However, alongside tools, many leading AI companies have the stated goal of building superintelligence in the coming decade that can significantly outperform all humans on essentially all cognitive tasks. This has raised concerns, ranging from human economic obsolescence and disempowerment, losses of freedom, civil liberties, dignity, and control, to national security risks and even potential human extinction. The succinct statement below aims to create common knowledge of the growing number of experts and public figures who oppose a rush to superintelligence.

For corrections, technical support, or press enquiries, please contact letters@futureoflife.org

Statement

We call for a prohibition on the development of superintelligence, not lifted before there is

  • broad scientific consensus that it will be done safely and controllably, and
  • strong public buy-in.

Hosted by the Future of Life Institute
Resource ID: kb-489b3a716bd920cc