Longterm Wiki

Survival and Flourishing Fund Recommendations

Web: survivalandflourishing.fund/recommendations

SFF is a notable philanthropic funder in the AI safety ecosystem; its recommendations page is a useful reference for tracking which organizations and projects receive support and understanding funder priorities in the field.

Metadata

Importance: 45/100 · organizational report · reference

Summary

The Survival and Flourishing Fund (SFF) publishes its grant recommendations for organizations working on existential risk reduction, AI safety, and related cause areas. The page documents funding decisions made through SFF's regrantor model, where individual advisors recommend grants to reduce risks to humanity's long-term future. It serves as a transparency record of which organizations and projects receive philanthropic support in the AI safety ecosystem.

Key Points

  • SFF uses a regrantor model where independent advisors make grant recommendations rather than a centralized program officer structure
  • Funding focuses on existential risk reduction, AI safety research, and organizations working to improve humanity's long-term prospects
  • The recommendations page provides transparency into which AI safety and EA-aligned organizations receive philanthropic support
  • SFF is a significant funder in the AI safety space, supporting both technical research and governance organizations
  • Grant amounts and reasoning are often published, offering insight into how funders evaluate AI safety work
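The regrantor model above can be made concrete with a small illustration. The sketch below is not SFF's actual S-Process code; it is a hypothetical greedy allocator in the same spirit, where each organization has a declining marginal-value curve (here invented for illustration) and a fixed budget is handed out step by step to whichever recipient's next increment is judged most valuable:

```python
# Illustrative sketch only (not SFF's implementation): greedy budget
# allocation against declining marginal-value curves, in the spirit of
# a regrantor-style process. All names and curves are hypothetical.

def allocate(budget_steps, step_size, marginal_value):
    """marginal_value maps org -> f(amount_so_far) -> value of the next dollar."""
    totals = {org: 0.0 for org in marginal_value}
    for _ in range(budget_steps):
        # Give the next step of funding to the org whose next increment
        # currently has the highest marginal value.
        best = max(totals, key=lambda org: marginal_value[org](totals[org]))
        totals[best] += step_size
    return totals

# Hypothetical curves: each extra dollar is worth less as funding grows.
curves = {
    "Org A": lambda x: 10.0 / (1.0 + x / 100_000),
    "Org B": lambda x: 6.0 / (1.0 + x / 50_000),
}
grants = allocate(budget_steps=20, step_size=10_000, marginal_value=curves)
```

Under these made-up curves the allocator splits a $200,000 budget unevenly, favoring the organization whose marginal value declines more slowly; the real S-Process aggregates curves from multiple advisors rather than a single set.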

Cited by 2 pages

| Page | Type | Quality |
|---|---|---|
| Centre for Long-Term Resilience | Organization | 63.0 |
| Survival and Flourishing Fund (SFF) | Organization | 59.0 |

6 FactBase facts citing this source

Cached Content Preview

HTTP 200 · Fetched Apr 6, 2026 · 63 KB
Grants | Survival and Flourishing Fund 
Below is a list of grants we have recommended using the S-Process and the Initiative Committee. Recommendations after the final date might not be listed yet.

| Round | Source | Organization | Amount | Receiving Charity | Purpose |
|---|---|---|---|---|---|
| SFF-2025 | Jaan Tallinn | Agent Foundations Field Network (AFFINE) [Algorithm Design] | $79,000 | Ashgro, Inc. | General support of Algorithm Design |
| SFF-2025 | Jaan Tallinn | Agent Foundations Field Network (AFFINE) [Technical Research] | $165,000 | Ashgro, Inc. | General support of Technical Research |
| SFF-2025 | Jaan Tallinn | AI & Democracy Foundation | $195,000 | Thoughtful Tech Project, Inc. | General support |
| SFF-2025 | Jaan Tallinn | AI Futures Project | $1,535,000 +$500,000‡ | AI Futures Project | General support |
| SFF-2025 | Jaan Tallinn | AI Lab Watch | $371,000 | Lightcone Infrastructure, Inc. | General support of AI Lab Watch |
| SFF-2025 | Jaan Tallinn | AI Policy Institute (AIPI) | $1,635,000 | The Hack Foundation | General support of AI Policy Institute |
| SFF-2025 | Jaan Tallinn | AI Safety Camp | $90,000 +$110,000‡ | Ashgro, Inc. | General support of AI Safety Camp |
| SFF-2025 | Jaan Tallinn | AI Standards Lab and Holtman Systems Research | $228,000 +$100,000‡ | Players Philanthropy Fund, Inc. | General support of AI Standards Lab and Holtman Systems Research |
| SFF-2025 | Jaan Tallinn | Alignment Ecosystem Development | $91,000 | Ashgro, Inc. | General support of Alignment Ecosystem Development |
| SFF-2025 | Jaan Tallinn | Alignment in Complex Systems Research Group (ACS Research) | $400,000 +$306,000‡ | Epistea, z.s. | General support of Alignment in Complex Systems Research Group |
| SFF-2025 | Jaan Tallinn | Alliance to Feed the Earth in Disasters (ALLFED) | $30,000 | ALLFED Institute | General support |
| SFF-2025 | Jaan Tallinn | Amplifying AI Safety | $30,000 | Epistea, z.s. | General support of Amplifying AI Safety |
| SFF-2025 | Jaan Tallinn | Association for Long Term Existence and Resilience (ALTER) | $60,000 +$22,000‡ | Association for Long Term Existence and Resilience | General support |
| SFF-2025 | Jaan Tallinn | Augur | $160,000 | Augur LLC | General support |
| SFF-2025 | Jaan Tallinn | BERI-CLTC Collaboration | $141,000 | Berkeley Existential Risk Initiative | General support of BERI-CLTC Collaboration |
| SFF-2025 | Jaan Tallinn | Catalyze Impact | $74,000 | Ashgro, Inc. | General support of Catalyze Impact |
| SFF-2025 | Jaan Tallinn | Center for AI Safety (CAIS) | $289,000 | Center for AI Safety, Inc. | General support |
| SFF-2025 | Jaan Tallinn | Center for AI Safety Action Fund (CAIS AF) | $772,000 | Center for AI Safety Action Fund, Inc. | General support |
| SFF-2025 | Jaan Tallinn | Center for Humane Technology (CHT) | $468,000 | Center for Humane Technology | General support |
| SFF-2025 | Jaan Tallinn | Center on Long-Term Risk (CLR) | $200,000 | Center on Long-Term Risk | General support |
| SFF-2025 | Jaan Tallinn | Centre

... (truncated, 63 KB total)
Resource ID: aebe92781f2a19fb | Stable ID: sid_9E6b3qZYXd