Comprehensive profile of the Future of Life Institute (FLI) documenting more than $25M in grants distributed (2015: $7M across 37 projects; 2021: a $25M program), major public campaigns (Asilomar Principles, 5,700+ signatories; 2023 Pause Letter, 33,000+ signatories), and the $665.8M Buterin donation (2021). The organization operates primarily as an advocacy and grantmaking institute. Its sister organization, the Future of Life Foundation (FLF), handles incubation of new beneficial-AI organizations. Both share leadership under Anthony Aguirre.
Publications
| Title | Type | Authors | URL | Published | Flagship | Venue |
|---|---|---|---|---|---|---|
| Pro-Human AI Declaration | policy-brief | Future of Life Institute | humanstatement.org | 2026-01 | ✓ | — |
| AI Safety Index: Winter 2025 | report | Future of Life Institute | futureoflife.org | 2025-12 | ✓ | FLI |
| Statement on Superintelligence | policy-brief | Future of Life Institute | superintelligence-statement.org | 2025-10 | ✓ | — |
| AI Safety Index: Summer 2025 | report | Future of Life Institute | futureoflife.org | 2025-06 | ✓ | — |
| Pause Giant AI Experiments: An Open Letter | policy-brief | Future of Life Institute | futureoflife.org | 2023-03 | ✓ | — |
| Lethal Autonomous Weapons Pledge | policy-brief | Future of Life Institute | futureoflife.org | 2018-06 | ✓ | — |
| Asilomar AI Principles | policy-brief | Future of Life Institute | futureoflife.org | 2017-01 | ✓ | — |
| Autonomous Weapons: AI and Robotics Researchers Open Letter | policy-brief | Future of Life Institute | futureoflife.org | 2016-02 | ✓ | — |
| Research Priorities for Robust and Beneficial AI: An Open Letter | policy-brief | Future of Life Institute | futureoflife.org | 2015-10 | ✓ | — |
Divisions
- AI Safety Index: biannual report scoring 7 companies on 33 indicators across 6 domains, graded by an expert panel of 6 AI scientists.
- Autonomous weapons work: Slaughterbots films (100M+ views), the Lethal Autonomous Weapons Pledge (5,218 signatories), and the Autonomous Weapons Watch database.
- Vitalik Buterin PhD and Postdoctoral Fellowships in AI Existential Safety, run with BAIF: 14+ PhD fellows and 4+ postdocs at top universities. Falls under the Operations & Grants team.
- Grant programs: 2015, $7M (Musk-funded); 2021, $25M (Buterin); 2022–2024, ~$16.5M total. Focus areas: AI safety, nuclear risk, autonomous weapons.
- Grantmaking arm: $25M+ distributed since 2015 across AI safety, nuclear risk, governance, and existential-risk reduction. Andrea Berman is Grants Manager.
- Futures work: storytelling, worldbuilding, and scenario planning for beneficial technology futures.
- Policy: campaigns include the Asilomar Principles (5,700+ signatories), the 2023 Pause Letter (33,000+ signatories), and EU AI Act advocacy, with EU and UN engagement. Led by Mark Brakel (Global Director of Policy).
Related Wiki Pages
Max Tegmark
Swedish-American physicist at MIT, co-founder of the Future of Life Institute, and prominent AI safety advocate known for his work on the Mathemati...
Pause Advocacy
Advocacy for slowing or halting frontier AI development until adequate safety measures are in place. Analysis suggests 15-40% probability of meanin...
AI for Human Reasoning Fellowship
A 12-week fellowship program by the Future of Life Foundation (FLF) that brought together 30 fellows to develop AI tools for coordination, epistemi...
Jaan Tallinn
Jaan Tallinn (born 1972) is an Estonian programmer, entrepreneur, and philanthropist who co-founded Skype and Kazaa, then became one of the largest...
Pause / Moratorium
Proposals to pause or slow frontier AI development until safety is better understood, offering potentially high safety benefits if implemented but ...