AI Safety Camp - EA Forum Topic Page
Credibility Rating
Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: EA Forum
This EA Forum topic page aggregates information about AI Safety Camp (AISC), a non-profit that runs programs for students and early-career researchers interested in reducing existential risk from AI. It covers the organization's funding history and collects related forum posts.
Metadata
Summary
This EA Forum wiki page covers AI Safety Camp (AISC), a non-profit initiative supporting students and early-career researchers in AI safety. It documents AISC's funding history (totaling ~$600,000 from Future Fund, EA Funds, and Survival and Flourishing Fund as of 2022) and aggregates related forum posts including impact assessments and funding cases.
Key Points
- AISC is a non-profit running programs for students and early-career researchers focused on reducing existential risk from AI.
- As of July 2022, AISC had received ~$600K total: $290K from the Future Fund, $180K from EA Funds, and $130K from the Survival and Flourishing Fund.
- The page aggregates top EA Forum posts related to AISC, including annual AI alignment literature reviews and grant recommendations.
- AISC has faced funding uncertainty, with posts such as "This might be the last AI Safety Camp" indicating financial precarity.
- An independent impact assessment of AISC was conducted by Arb Research, indicating external evaluation of the program's effectiveness.
Cached Content Preview
AI Safety Camp - EA Forum

Contributors: Remmelt (3), Pablo (2), Lizka (2), Leo (1)

AI Safety Camp (AISC) is a non-profit initiative that runs programs serving students and early-career researchers who want to work on reducing existential risk from AI.

Funding

As of July 2022, AISC has received $290,000 in funding from the Future Fund, [1] $180,000 from Effective Altruism Funds, [2][3][4][5] and $130,000 from the Survival and Flourishing Fund. [6]

External links

AI Safety Camp. Official website.

Posts tagged AI Safety Camp (Top Relevance)

- 2021 AI Alignment Literature Review and Charity Comparison · Larks · 4y ago · 87 m read · 176 karma
- 2020 AI Alignment Literature Review and Charity Comparison · Larks · 5y ago · 82 m read · 155 karma
- AI safety starter pack · mariushobbhahn · 4y ago · 7 m read · 131 karma
- Long-Term Future Fund: May 2021 grant recommendations · abergal · 5y ago · 69 m read · 110 karma
- This might be the last AI Safety Camp · Remmelt, Linda Linsefors · 2y ago · 1 m read · 87 karma
- Impact Assessment of AI Safety Camp (Arb Research) · Sam Holton · 2y ago · 13 m read · 87 karma
- Long-Term Future Fund: August 2019 grant recommendations · Habryka [Deactivated] · 7y ago · 77 m read · 79 karma
- Talking to Congress: Can constituents contacting their legislator influence policy? · T_W, davekasten, jacob.turn, Felix De Simone, gergo · 2y ago · 22 m read · 51 karma
- Long-Term Future Fund: November 2019 short grant writeups · Habryka [Deactivated] · 6y ago · 11 m read · 46 karma
- Funding case: AI Safety Camp 10 · Remmelt, Linda Linsefors · 2y ago · 7 m read · 45 karma
- Funding Case: AI Safety Camp 11 · Remmelt, Linda Linsefors, Robert Kralisch · 1y ago · 7 m read · 42 karma
- We don't want to post again "This might be the last AI Safety Camp" · Remmelt · 1y ago · 42 karma
- AISC9 has ended and there will be an AISC10 · Linda Linsefors · 2y ago · 36 karma
- Agentic Mess (A Failure Story) · Karl von Wendt · 3y ago · 30 karma
- Invitation to lead a project at AI Safety Camp (Virtual Edition, 2025) · Linda Linsefors · 2y ago · 30 karma