Longterm Wiki

Coefficient Giving

Funder
Founded: Jun 2017 (8 years old) · HQ: San Francisco · coefficientgiving.org

Also known as: Open Philanthropy, OP


Coefficient Giving (formerly Open Philanthropy) has directed $4B+ in grants since 2014, including $336M to AI safety (~60% of all external AI safety funding). The organization spent ~$50M on AI safety in 2024, with 68% going to evaluations and benchmarking, and launched a $40M Technical AI Safety RFP in 2025 covering 8 research areas.

Headcount: 150 (as of 2025)
Total Funding Raised: $5B (as of Jun 2025)
Grants Made: $4B (2,626 grants)

Facts

13 entries
Financial
Total Funding Raised: $5B
Headcount: 150
Funding Received: $100M
Market Share: 60%
People
Founder (text): Cari Tuna
Other
Key Person: Elie Hassenfeld
Organization
Founded Date: Jun 2017
Legal Structure: Limited liability company (LLC)
Country: United States
Headquarters: San Francisco
Biographical
Wikipedia: https://en.wikipedia.org/wiki/Coefficient_Giving
General
Website: https://www.coefficientgiving.org

Other Data

Entity Assessments
6 entries
Dimension | Rating | Evidence | Assessor
ai-safety-focus | Leading funder | $336M+ to AI safety since 2014; ~60% of external AI safety funding | editorial
application-model | Rolling RFPs + regranting | 300-word EOI, 2-week response; supports platforms like Manifund | editorial
key-funders | Good Ventures (primary) | Dustin Moskovitz & Cari Tuna; expanding to multi-donor model | editorial
scale | Dominant | $4B+ total grants; ~$46M AI safety in 2023 | editorial
structure | 13 cause-specific funds | Multi-donor pooled funds since Nov 2025 rebrand | editorial
transparency | High | Public grants database, annual progress reports | editorial
Entity Events
10 entries
Title | Date | Event Type | Description | Significance
Rebrand from Open Philanthropy to Coefficient Giving | 2025-11-18 | pivot | Multi-donor expansion (over $100M directed from non-Good-Ventures donors in 2024); brand clarity to disambiguate from OpenAI and Open Society Foundations; structural reorganization into 13 distinct funds. | major
$40M Technical AI Safety RFP | 2025 | funding | | major
~$50M AI safety committed; 68% to evaluations/benchmarking | 2024 | milestone | | moderate
~$46M AI safety spending; largest funder in the field | 2023 | milestone | AI safety becomes Open Philanthropy's largest longtermist cause area. | major
$150M Regranting Challenge launched (not AI-specific) | 2022 | launch | | moderate
AI safety spending exceeds $20M annually | 2019 | milestone | | moderate
Spun off from GiveWell as independent LLC | 2017 | pivot | Holden Karnofsky publishes detailed AI concerns; the spinoff enables Open Philanthropy to pursue its own strategic priorities while GiveWell continues focusing on evidence-backed global health interventions. | major
First AI safety grants | 2015 | milestone | Began supporting AI safety work in 2015, when the field had ~10 full-time researchers and institutional support was minimal. Early grants helped establish MIRI, CHAI, and the Future of Humanity Institute. | major
Open Philanthropy formalized as project within GiveWell | 2014 | milestone | The advising relationship formalizes into "Open Philanthropy" as a distinct project, focused on identifying high-impact giving opportunities across a broader range of cause areas than GiveWell's traditional global health focus. | major
GiveWell begins advising Good Ventures | 2011 | founding | GiveWell, founded by Holden Karnofsky and Elie Hassenfeld, begins advising Good Ventures (established by Dustin Moskovitz and Cari Tuna) on how to deploy philanthropic capital effectively. | major

Divisions

16 entries
Fund·Matt Clancy

$120M committed over 3 years. Led by Matt Clancy. Economic growth, scientific progress, US-focused.

Fund·Santosh Harish

40+ grants totaling ~$20M. Led by Santosh Harish. Focus on South Asia and high-pollution areas.

Fund·Andrew Snyder-Beattie

140+ grants totaling ~$260M. Led by Andrew Snyder-Beattie. Work began ~2015, five years before COVID-19.

Fund·Melanie Basnak, Sam Donald

Led by Melanie Basnak and Sam Donald. Support for CEA, 80,000 Hours, EA Funds, and community infrastructure.

Fund·Lewis Bollard

Led by Lewis Bollard. Corporate cage-free campaigns, alt-protein research, advocacy.

Fund·Benjamin Tereick

30+ grants totaling ~$50M. Led by Benjamin Tereick. Forecasting infrastructure and research.

Fund·Norma Altshuler

50+ grants totaling ~$30M. Led by Norma Altshuler. Encouraging generous and cost-effective international aid.

Program

Covers AI safety, biosecurity, and other GCR-related grantmaking. 250+ grants across cause areas.

Fund·Eli Rose

250+ grants across GCR cause areas and EA community capacity building.

Fund·Justin Sandefur

$40M+ committed over 3 years. Led by Justin Sandefur. Policy research for economic growth in low/middle-income countries.

Fund·James Snowden

360+ grants; largest fund. Primarily GiveWell-recommended charities; $175M committed for 2026 via GiveWell.

Program

Covers global health, farm animal welfare, and scientific research. 360+ grants; largest program area.

Fund

$100-125M raised; 20+ grants. Multi-donor pooled fund with Gates Foundation, UNICEF, others.

Fund·Claire Zabel, Luke Muehlhauser, Peter Favaloro, Rossa O'Keeffe-O'Donovan

480+ grants totaling ~$500M. Sub-areas: Technical Safety (Favaloro, O'Keeffe-O'Donovan), AI Governance (Muehlhauser), Short Timelines (Zabel). ~$63.6M in 2024 (~60% of all external AI safety funding).

Fund·Jacob Trefethen

330+ grants + 30+ social investments ($90M+). Led by Jacob Trefethen. Treatments, vaccines, diagnostics.

1 inactive division
Fund

Focused on reducing incarceration; wound down in 2022. ~$200M total grants.

Related Wiki Pages

Top Related Pages

Approaches

AI Safety Training Programs

Analysis

Is EA Biosecurity Work Limited to Restricting LLM Biological Use?
XPT (Existential Risk Persuasion Tournament)
Donations List Website

Organizations

Centre for Long-Term Resilience
FTX Future Fund
Johns Hopkins Center for Health Security
Long-Term Future Fund (LTFF)

Other

Nick Beckstead
AI Evaluations
Scalable Oversight
Recoding America

Key Debates

Technical AI Safety Research
AI Accident Risk Cruxes
The Case For AI Existential Risk

Concepts

AI Timelines
EA Epistemic Failures in the FTX Era

Risks

Bioweapons Risk

Historical

Deep Learning Revolution Era
The MIRI Era