Longterm Wiki
Type: grant

AI Safety Support — SERI MATS 4.0

Child of Coefficient Giving

Metadata

Source Table: grants
Source ID: K4RSQa8epL
Description: to AI Safety Support, USD 1240840, 2023-06
Source URL: coefficientgiving.org/funds/
Parent: Coefficient Giving
Children:
Created: Mar 12, 2026, 5:54 AM
Updated: Mar 23, 2026, 3:17 PM
Synced: Mar 19, 2026, 8:57 PM

Record Data

id: K4RSQa8epL
organizationId: Coefficient Giving (organization)
granteeId: AI Safety Support (organization)
orgEntityId: Coefficient Giving (organization)
orgDisplayName:
granteeEntityId: AI Safety Support (organization)
granteeDisplayName: ai-safety-support
name: AI Safety Support — SERI MATS 4.0
amount: 1240840
currency: USD
period:
date: 2023-06
status:
source: coefficientgiving.org/funds/
notes: [Navigating Transformative AI] Open Philanthropy recommended three grants totaling $1,240,840 to AI Safety Support to support their collaboration with the Stanford Existential Risks Initiative (SERI) on SERI's Machine Learning Alignment Theory Scholars (MATS) program. MATS is an educational seminar and
programId: EXpTP-ujq6
dataSourceId:

Source Check Verdicts

Confirmed: 95% confidence

Last checked: 4/9/2026

[deterministic-row-match] Deterministic match: grantee, amount, date matched in source snapshot (2714 rows)
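The verdict above suggests the checker keys each grant record on (grantee, amount, date) and looks for an exact match in the source snapshot. A minimal sketch of such a check, with illustrative field names that are assumptions rather than the wiki's actual schema:

```python
# Sketch of a deterministic row match: a grant record is "confirmed" when some
# snapshot row agrees exactly on grantee, amount, and date. Field names and the
# snapshot structure here are hypothetical, not the wiki's real schema.

def deterministic_row_match(record: dict, snapshot_rows: list[dict]) -> bool:
    """Return True if any snapshot row matches on grantee, amount, and date."""
    key = (record["grantee"], record["amount"], record["date"])
    return any(
        (row.get("grantee"), row.get("amount"), row.get("date")) == key
        for row in snapshot_rows
    )

record = {"grantee": "AI Safety Support", "amount": 1240840, "date": "2023-06"}
rows = [
    {"grantee": "AI Safety Support", "amount": 1240840, "date": "2023-06"},
    {"grantee": "Other Org", "amount": 50000, "date": "2022-01"},
]
print(deterministic_row_match(record, rows))  # → True
```

Because the comparison is an exact tuple equality rather than a fuzzy match, a verdict like this can be reported as deterministic with high confidence.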

Debug info

Thing ID: K4RSQa8epL

Source Table: grants

Source ID: K4RSQa8epL

Parent Thing ID: sid_ULjDXpSLCI