
Grant: AI Safety Support — SERI MATS 4.0 (Coefficient Giving → AI Safety Support)

Verdict: confirmed (95%)
1 check · 4/9/2026

Deterministic match: grantee, amount, date matched in source snapshot (2714 rows)

Our claim

entire record
Name
AI Safety Support — SERI MATS 4.0
Amount
$1,240,840
Currency
USD
Date
June 2023
Notes
[Navigating Transformative AI] Open Philanthropy recommended three grants totaling $1,240,840 to AI Safety Support to support their collaboration with Stanford Existential Risks Initiative (SERI) on SERI's Machine Learning Alignment Theory Scholars (MATS) program. MATS is an educational seminar and independent research program that aims to provide scholars with talks, workshops, and research mentorship in the field of AI alignment, and connect them with in-person alignment research communities. These grants will support the MATS program's fourth cohort. They follow our November 2022 support for the previous iteration of MATS, and fall within our focus area of growing and empowering the community of people focused on global catastrophic risk reduction. We also made a separate grant to the Berkeley Existential Risk Initiative for this cohort.

Source evidence

1 src · 1 check
confirmed (95%) · deterministic-row-match · 4/9/2026
Name
AI Safety Support — SERI MATS 4.0
Grantee
AI Safety Support
Focus Area
Navigating Transformative AI
Amount
$1,240,840.00
Date
June 2023
Description
Open Philanthropy recomm

Note: Deterministic match: grantee, amount, date matched in source snapshot (2714 rows)

Case № K4RSQa8epL · Filed 4/9/2026 · Confidence 95%