Longterm Wiki

The Foundation Layer: A Philanthropic Guide to AI Safety

Website: foundation-layer.ai


Summary

A comprehensive philanthropic guide by Tyler John (Effective Institutions Project) aimed at persuading major donors to fund AI safety. It covers AGI timelines and existential risks (loss of control, malicious use, power concentration), and proposes a five-pillar philanthropic strategy: alignment science, nonproliferation, defensive technology, power distribution, and talent mobilization. It also includes a getting-started guide for donors, with specific funds and advisors.

Key Points

  • Synthesizes five years of AI safety philanthropic advising into a single donor guide
  • The Foundation Layer Fund has facilitated 100+ grants totaling more than $70M
  • Proposes five intervention pillars for AI safety philanthropy
  • Covers alignment science, compute governance, biodefense, and political giving

Cited by 2 pages

  Page                           | Type         | Quality
  Longtermist Funders (Overview) | --           | 3.0
  The Foundation Layer           | Organization | 3.0