Alignment Research Center - OpenBook
openbook.fyi/org/Alignment%20Research%20Center
This OpenBook page tracks philanthropic donations to the Alignment Research Center, useful for understanding ARC's funding sources and financial scale within the broader AI safety funding landscape.
Metadata
Importance: 30/100 · homepage · reference
Summary
OpenBook profile page for the Alignment Research Center (ARC), displaying its funding history totaling $2.52M. Key donors include Jaan Tallinn ($2.18M), Open Philanthropy ($265K), and the EA Funds Long-Term Future Fund ($72K), all directed toward AI safety work.
Key Points
- ARC received $2.52M in tracked incoming grants, primarily in 2022.
- Jaan Tallinn is the largest donor at $2.18M, representing ~87% of total recorded funding.
- Open Philanthropy contributed $265K and the EA Funds Long-Term Future Fund contributed $72K.
- All recorded grants are categorized under the AI safety cause area.
- OpenBook aggregates philanthropic funding data for transparency in the EA and AI safety ecosystem.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Model Organisms of Misalignment | Analysis | 65.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 0 KB
OpenBook — Alignment Research Center Incoming Grants | Total Received: $2.52M

| Date | Amount | Donor | Cause Areas |
|---|---|---|---|
| 2022/12/01 | $2.18M | Jaan Tallinn | AI safety |
| 2022/10/01 | $72K | EA Funds: Long-Term Future Fund | AI safety |
| 2022/03/01 | $265K | Open Philanthropy | AI safety |
Resource ID: 2cae1e372ec79aeb | Stable ID: sid_XVyPLXULXG