Longterm Wiki · Updated 2026-03-12

Manifund

Type: Funder

Manifund is a $2M+ annual charitable regranting platform (founded 2022) that provides fast grants (<1 week) to AI safety projects through expert regrantors ($50K-400K budgets), fiscal sponsorship, and experimental impact certificates. The platform distributed $2.06M in 2023 (~40% to AI safety research) and raised $2.25M for 10 regrantors in 2025, filling a niche between individual donors and major funders like Coefficient Giving.

Quick Assessment

| Dimension | Assessment | Evidence |
|---|---|---|
| Scale | Growing | $2M+ in 2023; $2.25M raised for 2025 regrants |
| Speed | Very Fast | Grant recommendation to disbursement in under 1 week |
| Mechanism Innovation | High | Pioneered impact certificates, regranting at scale |
| Transparency | Very High | All grants public with explanations |
| Team Size | Small | ≈3 core staff (Austin Chen, Rachel Weinberg, Saul Munn) |
| Fiscal Sponsorship | Yes | 501(c)(3) status; 5% fee for large donors |
| Focus Areas | AI Safety Primary | ≈80% of grants to x-risk/AI safety |
| Related Platform | Manifold Markets | Shared founders, infrastructure, community |

Organization Details

| Attribute | Details |
|---|---|
| Full Name | Manifund (Manifold for Charity) |
| Type | Charitable regranting platform and fiscal sponsor |
| Founded | 2022 |
| Founders | Austin Chen (CEO), Rachel Weinberg (Engineer) |
| Parent/Related | Manifold Markets (prediction market platform) |
| Legal Status | 501(c)(3) nonprofit |
| Annual Volume | $2M+ (2023); $2.25M budget (2025) |
| Website | manifund.org |
| Substack | manifund.substack.com |
| Primary Focus | AI safety, effective altruism, rationalist projects |

Overview

Manifund is a charitable funding platform that emerged from Manifold Markets in 2022 to address a critical gap in the effective altruism funding ecosystem: the need for fast, flexible grantmaking that can move money to promising projects within days rather than months. Founded by Austin Chen and Rachel Weinberg, Manifund operates as both a regranting platform and a fiscal sponsor, enabling donors to support projects that lack formal nonprofit status while receiving tax benefits.

The platform's origin traces to a concrete pain point: Manifold Markets received a $500,000 grant from the FTX Future Fund in 2022 for charity prediction markets, but needed a separate 501(c)(3) entity to process the funds. What began as "Manifold for Charity" evolved into a full funding platform after Scott Alexander approached the team in 2022 wanting to run ACX Grants through an impact certificate mechanism. Austin Chen recruited his then-girlfriend (now wife) Rachel Weinberg to build the platform, and she became the primary engineer for manifund.org, launching it within two weeks.

Manifund distinguishes itself from traditional foundations through three mechanisms. First, it empowers individual regrantors with independent budgets of $50,000-$400,000 to make grant decisions without committee review, enabling faster and more speculative funding. Second, it experimented extensively with impact certificates, a mechanism where funders can retroactively purchase credit for completed work. Third, it provides infrastructure for programs like ACX Grants, where external funders can run their own grantmaking through the platform.

The platform serves a critical "middle layer" function in the EA funding ecosystem. Large funders like Coefficient Giving operate at scale but require months of due diligence. EA Funds and Long-Term Future Fund operate at medium scale with multi-week timelines. Manifund fills the niche for $5,000-$50,000 grants that can be disbursed within a week, often seeding projects that later receive larger grants from major funders. As one regrantor noted, "quick regrants induce further funding from OpenPhil and others."

Founding Team

Austin Chen

Austin Chen co-founded Manifold Markets in December 2021 alongside brothers Stephen and James Grugett. He served as Chief Product Officer before stepping back from day-to-day Manifold operations to focus on Manifund. Chen's entrepreneurial approach emphasizes rapid experimentation and community-driven development.

| Attribute | Details |
|---|---|
| Role | CEO and Co-founder, Manifund |
| Previous | CPO and Co-founder, Manifold Markets |
| Education | MIT (Computer Science) |
| Notable | Pioneered play-money prediction markets at scale |

Chen's philosophy centers on building infrastructure that empowers individual decision-makers rather than committees. When discussing Manifund's regranting model, he emphasized that "regrantors make grants solo and are directly responsible for grant quality, which encourages more speculative grants and avoids problems in review-by-committee."

Rachel Weinberg

Rachel Weinberg serves as the primary engineer and co-founder of Manifund. She joined after Austin Chen approached her about building the impact certificate platform for Scott Alexander's ACX Grants program.

| Attribute | Details |
|---|---|
| Role | Co-founder and Lead Engineer, Manifund |
| Previous | President, EA @ Tufts University |
| Education | Tufts University (Mathematics, partial degree) |
| Community | Organized small-to-medium EA events |

Weinberg's background in effective altruism community building informed Manifund's design philosophy. She founded and ran the EA group at Tufts before transitioning to engineering. She built and launched manifund.org within two weeks of starting, which let the platform process its first grants quickly.

Funding Mechanisms


Regranting Programs

Regranting is Manifund's primary funding mechanism, pioneered by the FTX Future Fund in 2022. The model empowers individual experts ("regrantors") with independent budgets to make grant decisions without committee approval. Manifund launched its regranting program in May 2023 after being introduced to an anonymous donor "D" who provided $1.5 million specifically for this purpose.

| Program Year | Total Pool | Number of Regrantors | Budget per Regrantor | Focus |
|---|---|---|---|---|
| 2023 | $1.4M | 5 | $50K-400K | AI safety, EA causes |
| 2024 | ≈$2M | ≈12 | $50K-400K | AI safety primary |
| 2025 | $2.25M | 10 (announced) | $100K+ | AI safety primary |

Notable 2024 Regrantors:

  • Neel Nanda (Mechanistic interpretability researcher, Anthropic)
  • Leopold Aschenbrenner (Former OpenAI researcher, Situational Awareness author)
  • Dan Hendrycks (Center for AI Safety director)
  • Adam Gleave (FAR AI founder)
  • Ryan Kidd (SERI MATS co-director)
  • Evan Hubinger (Anthropic alignment researcher)

2025 Regrantors include several former Manifund grantees who have become grantmakers:

  • Tamay Besiroglu (Epoch AI)
  • Lisa Thiergart (MIRI)
  • Marius Hobbhahn (Apollo Research)

The regranting model offers several structural advantages over traditional grantmaking. Speed is paramount: Manifund can move money from recommendation to grantee bank account in under one week. The model also encourages "hits-based giving" where regrantors can make speculative bets on early-stage projects that committee processes might reject. Each regrantor writes public explanations for their grants, creating accountability and enabling learning across the ecosystem.

ACX Grants

Scott Alexander's Astral Codex Ten grant program represents one of Manifund's largest partnerships. Alexander committed $250,000 of personal funds in 2024, with other donors contributing an additional ≈$1 million through Manifund's infrastructure.

| Year | Scott's Commitment | Total Distributed | Applications | Grants Made |
|---|---|---|---|---|
| 2024 | $250,000 | ≈$1.25M | 600+ | ≈40 |
| 2025 | $250,000 | TBD | 654 | 42 |

Notable ACX Grants (2024):

| Recipient | Amount | Project |
|---|---|---|
| Elaine Perlman | $50,000 | Lobbying for kidney donation law reform |
| John Lohier & Hugo Smith | $13,000 | Lead-acid battery recycling research in Nigeria |
| Mark Webb | $5,000 | Direct land reform experimentation |
| Various | $5K-50K | Biotech, AI alignment, education, climate projects |

The 2024 ACX Grants introduced an impact market component where applications not receiving direct grants could participate in a secondary market. Retroactive prize funders including ACX Grants 2025, Survival and Flourishing Fund, Long-Term Future Fund, Animal Welfare Fund, and EA Infrastructure Fund (collectively disbursing $5-33M annually) committed to purchasing successful impact certificates.

Impact Certificates

Impact certificates represent Manifund's most experimental funding mechanism. The concept functions like "Kickstarter meets the stock market, for charity": founders create proposals with minimum funding goals, accredited investors bid in auctions, and successful projects issue tradeable certificates representing credit for completed work. Retroactive funders can later purchase certificates for projects that demonstrated impact.
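The lifecycle described above can be sketched as a toy model. All names and numbers below are hypothetical, and this illustrates the auction-to-retroactive-purchase flow rather than Manifund's actual data model:

```python
from dataclasses import dataclass, field

@dataclass
class Bid:
    investor: str
    amount: float

@dataclass
class ImpactCertificate:
    """Toy model of one impact-certificate round: investors fund a
    proposal up front; a retroactive funder later buys the certificate
    if the project demonstrates impact."""
    project: str
    min_funding_goal: float
    bids: list = field(default_factory=list)

    def place_bid(self, investor: str, amount: float) -> None:
        self.bids.append(Bid(investor, amount))

    def total_raised(self) -> float:
        return sum(b.amount for b in self.bids)

    def is_funded(self) -> bool:
        # The proposal only proceeds if bids meet the minimum goal.
        return self.total_raised() >= self.min_funding_goal

    def retroactive_payout(self, purchase_price: float) -> dict:
        # A retroactive funder buys the certificate; proceeds are split
        # among the original investors pro rata to their stakes.
        total = self.total_raised()
        return {b.investor: purchase_price * b.amount / total for b in self.bids}

# Hypothetical round: $5,000 minimum goal, two investors.
cert = ImpactCertificate("Forecasting tool", min_funding_goal=5_000)
cert.place_bid("alice", 3_000)
cert.place_bid("bob", 2_000)
assert cert.is_funded()

# If a retroactive funder later values the work at $10,000,
# alice's 60% stake returns $6,000 and bob's 40% returns $4,000.
payout = cert.retroactive_payout(10_000)
```

The upside for investors is the gap between the auction price and the eventual retroactive purchase price, which is what was supposed to attract speculative capital to promising projects.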

| Aspect | Implementation | Results |
|---|---|---|
| Mechanism | Auction-based initial funding; tradeable certificates | Facilitated 20+ project fundings |
| Investor Interest | Lower than expected | Struggled to attract speculative investors |
| Trading Volume | Limited | AMMs implemented but underutilized |
| Learning Value | High | Informed future mechanism design |

Key Learnings from Impact Certificate Experiments:

Manifund ran three major impact certificate programs through Q1 2024: ACX Grants, Manifold Community Fund, and ChinaTalk essay competition. The team concluded they were "less excited by impact certificates than before" due to several challenges:

  1. Investor acquisition difficulty: The use case of speculating on charitable projects did not attract sufficient investor interest beyond the EA community.

  2. Evaluation burden: Assessing project impact proved time-consuming and "wasn't that fun," with participation declining each evaluation round.

  3. Trading mechanism complexity: Implementing automated market makers facilitated more trading but didn't justify the engineering cost. All AMMs were overpriced because the system supported buying and selling but not shorting.

  4. Brand/prize insufficiency: When partnering with Coefficient Giving's essay contest, most essayists declined to create impact certificates, including all ultimate winners. This demonstrated that "large dollar prizes plus well-known brands are not sufficient to get a robust certificate ecosystem started."

Despite these challenges, Manifund views the experiments positively for the learning they generated. The platform continues to support impact certificates while focusing more resources on regranting, which has demonstrated clearer product-market fit.
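The pricing failure noted in point 3 can be seen in a toy constant-product automated market maker. This is an illustrative sketch, not Manifund's actual trading code: when the system supports buying and selling but not shorting, only current holders can push the price down, so skeptical non-holders cannot correct an overpriced certificate.

```python
class CertAMM:
    """Minimal constant-product AMM (shares * cash = k) for one
    certificate. Parameter choices are hypothetical."""

    def __init__(self, shares: float, cash: float):
        self.shares = shares
        self.cash = cash
        self.k = shares * cash  # invariant preserved by every trade

    def price(self) -> float:
        # Instantaneous price of one share in dollars.
        return self.cash / self.shares

    def buy(self, dollars: float) -> float:
        # Trader pays `dollars` into the pool and receives the number
        # of shares that keeps shares * cash = k.
        new_cash = self.cash + dollars
        new_shares = self.k / new_cash
        shares_out = self.shares - new_shares
        self.cash, self.shares = new_cash, new_shares
        return shares_out

    def sell(self, shares_in: float) -> float:
        # Only existing holders can sell; with no shorting mechanism,
        # a skeptic holding zero shares has no way to move the price
        # down, which biases certificate prices upward.
        new_shares = self.shares + shares_in
        new_cash = self.k / new_shares
        cash_out = self.cash - new_cash
        self.shares, self.cash = new_shares, new_cash
        return cash_out

amm = CertAMM(shares=1_000, cash=1_000)  # initial price $1.00
amm.buy(100)  # any optimist can raise the price...
# ...but no non-holder can take the other side of the trade.
```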

Donor Lottery

Manifund hosts donor lotteries following the model established by Carl Shulman and Paul Christiano in 2017. The mechanism allows donors to pool contributions, with one randomly-selected winner receiving the entire pool to distribute as grants.

| Feature | Details |
|---|---|
| Entry Range | $1,000 - $100,000 |
| Typical Pool Size | $100,000 - $500,000 |
| Winner Selection | Random, proportional to contribution |
| Allocation Period | ≈6 months to distribute |

The donor lottery rationale exploits increasing marginal returns to donation research. A $1,000 donor gains a 1% chance of allocating $100,000, making extensive research worthwhile. Winners often spend dozens of hours investigating giving opportunities, producing higher-quality grant decisions than thousands of small donors making quick choices.
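The selection rule and the expected-value argument can be checked with a short simulation (donor names and pool composition here are hypothetical):

```python
import random

def draw_winner(contributions: dict, rng: random.Random) -> str:
    """Pick a donor-lottery winner with probability proportional to
    contribution, matching the selection rule described above."""
    donors = list(contributions)
    weights = [contributions[d] for d in donors]
    return rng.choices(donors, weights=weights, k=1)[0]

# Hypothetical pool: a $1,000 donor joins a $100,000 pool.
pool = {"small_donor": 1_000, "mid_donor": 24_000, "large_donor": 75_000}
total = sum(pool.values())

# Each donor's expected allocation equals their own contribution:
# P(win) * pool_size = (contribution / total) * total = contribution.
# Pooling changes the variance, not the expected value.
for amount in pool.values():
    assert (amount / total) * total == amount

# The $1,000 donor should win roughly 1% of draws.
rng = random.Random(0)
wins = sum(draw_winner(pool, rng) == "small_donor" for _ in range(10_000))
```

Because expected value is unchanged, the mechanism's benefit comes entirely from concentrating the fixed research effort on one winner.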

Fiscal Sponsorship Model

Manifund operates as a 501(c)(3) fiscal sponsor, providing critical infrastructure for projects that lack formal nonprofit status. This enables tax-deductible donations to individuals, unregistered projects, and even for-profit companies (pending due diligence).

| Feature | Details |
|---|---|
| Tax Status | 501(c)(3) nonprofit |
| Fee Structure | 5% for large donors (covers operations) |
| Eligible Grantees | Registered charities, individuals, for-profits (with review) |
| Restrictions | No political campaigns or lobbying |
| Processing Time | Days to one week |

The fiscal sponsorship model solves a key friction point in EA funding: many promising projects are led by individuals or small teams without formal nonprofit status. Traditional foundations cannot easily fund such projects. Manifund bridges this gap by accepting donations, conducting basic due diligence, and disbursing funds while maintaining tax compliance.
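As a minimal sketch of the fee arithmetic, assuming the 5% fee is deducted from the donated amount (the source states the rate but not where in the flow it is applied):

```python
def fiscal_sponsorship_split(donation: float, fee_rate: float = 0.05) -> tuple:
    """Split a donation into the operations fee and the amount
    disbursed to the sponsored project. The deduct-from-donation
    convention is an assumption for illustration."""
    fee = donation * fee_rate
    return fee, donation - fee

# A $100,000 donation yields a $5,000 fee and $95,000 disbursed.
fee, disbursed = fiscal_sponsorship_split(100_000)
```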

Grant Categories and Distribution

2023 Distribution

| Category | Amount | Percentage | Notes |
|---|---|---|---|
| AI Safety Research | $800K+ | ≈40% | Interpretability, alignment, governance |
| Community Building | $400K | ≈20% | Local groups, events, career transitions |
| Software & Tools | $300K | ≈15% | Forecasting, epistemics, infrastructure |
| Events & Conferences | $200K | ≈10% | Manifest, workshops, retreats |
| Other EA Causes | $300K | ≈15% | Biosecurity, animal welfare, global health |
| Impact Certificates | $45K | ≈2% | Experimental mechanism |
| Total | $2.06M | 100% | $2.012M grants + $45K certificates |

Typical Grant Sizes

| Size Range | Frequency | Typical Use Cases |
|---|---|---|
| $1K - $5K | 25% | Microgrants, travel, small events |
| $5K - $25K | 40% | Early-stage projects, tools, part-time work |
| $25K - $75K | 25% | Research projects, organization support |
| $75K - $200K | 8% | Major projects, multi-month work |
| $200K+ | 2% | Apollo Research (largest grant) |

Notable Funded Projects

AI Safety

| Project | Regrantor(s) | Amount | Impact |
|---|---|---|---|
| Apollo Research | Tristan Hume, Evan Hubinger, Marcus Abramovitch | Largest grant | Founded by Marius Hobbhahn; published research contributing to Anthropic's dictionary learning work |
| Developmental Interpretability (Timaeus) | Multiple | Seed funding | First funding for DevInterp research agenda; accelerated research by months |
| Mechanistic Interpretability Community | Various | ≈$50K | Grew to 500+ members; 40+ reading sessions; multiple publications |
| ChinaTalk AI Coverage | Various | ≈$20K | Reported on DeepSeek developments ahead of mainstream coverage |

Community Building

| Project | Focus | Impact |
|---|---|---|
| Equiano Institute | AI Alignment Research Lab for Africa | Assisted UN on Global Digital Compact; ran governance and alignment fellowships |
| AI Safety Communities | Online coordination | Maintains AI Safety World, EA Domains, AI Safety Training resources |
| Mox SF Coworking | Physical space | Coworking and events space for AI safety researchers |

Research and Tools

| Project | Type | Description |
|---|---|---|
| Shallow Review 2024 | Research | Quick review of AI safety papers; induced further OpenPhil funding |
| Foresight AI Safety Grants | Regranting | Supports neurotechnology, cryptography, multi-agent game theory |
| Inside View Podcast | Media | 43+ AI safety explainers featuring Evan Hubinger, Neel Nanda, Victoria Krakovna |

Connection to Manifold Markets

Manifund maintains close organizational and technical ties to Manifold Markets, the play-money prediction market platform.

| Connection | Details |
|---|---|
| Shared Founders | Austin Chen co-founded both; Stephen and James Grugett lead Manifold |
| Team Overlap | ≈3 people on Manifund, ≈6 on Manifold (as of 2024) |
| Infrastructure | Shared codebase elements; connected user accounts |
| Community | Overlapping user base; cross-promotion |
| Events | Manifest conference serves both communities |

Manifold Markets Overview

| Attribute | Details |
|---|---|
| Founded | December 2021 |
| Founders | Austin Chen, Stephen Grugett, James Grugett |
| Headquarters | Austin, Texas |
| Funding | $1.5M (FTX Future Fund), $340K+ (SFF), ACX Grant (seed) |
| Currency | "Mana" (play money); Sweepcash sunset March 2025 |

Manifold Markets received its original seed funding from Scott Alexander's ACX Grants program, which "kicked off Manifold as a business." The platform later raised $1.5 million from the FTX Future Fund and over $340,000 from the Survival and Flourishing Fund.

Manifest Conference

Manifold hosts Manifest, an annual forecasting and prediction markets festival that brings together the broader rationalist and EA communities.

| Year | Dates | Location | Attendance | Notable Speakers |
|---|---|---|---|---|
| 2023 | Sep 22-24 | Berkeley, Lighthaven | 250 | Nate Silver, Robin Hanson, Eliezer Yudkowsky |
| 2024 | Jun 7-9 | Berkeley, Lighthaven | 600 | Nate Silver, Scott Alexander, Dwarkesh Patel, Emmett Shear (Twitch), Ben Mann (Anthropic) |

The 2024 Manifest featured startup pitch competitions, prediction market workshops, and fireside chats. The event ran alongside LessOnline, hosted by Lightcone Infrastructure, creating a 10-day gathering of the rationalist and forecasting communities.

Application Process

For Projects Seeking Funding

| Step | Details | Timeline |
|---|---|---|
| 1. Create Profile | Sign up on Manifund | Minutes |
| 2. Submit Project | Describe work, budget, timeline, deliverables | 30-60 minutes |
| 3. Identify Funders | Find relevant regrantors or apply to open calls | Same day |
| 4. Review Process | Regrantor evaluates fit with their focus area | Days to 2 weeks |
| 5. Due Diligence | Manifund reviews for legitimacy, legality, mission alignment | 1-3 days |
| 6. Receive Funds | Transfer to Manifund account; request withdrawal | Days |

What Makes Strong Applications

| Factor | Strong Signal | Weak Signal |
|---|---|---|
| Scope | Clear deliverables, measurable outcomes | Vague goals, "exploring" without specifics |
| Team | Relevant track record, domain expertise | No demonstrated capability |
| Budget | Justified costs, efficient allocation | Round numbers without breakdown |
| Timeline | Specific milestones, realistic estimates | Open-ended or unrealistic |
| Theory of Change | Clear path from work to impact | Disconnected from outcomes |
| Transparency | Willing to share updates publicly | Resistant to public accountability |

For Donors and Regrantors

| Feature | Description |
|---|---|
| Regrantor Program | Apply for budget to run independent grantmaking ($50K-400K) |
| Direct Giving | Fund specific projects directly through platform |
| Donor Lottery | Pool funds for chance at larger allocation |
| Account Funding | Add money to account for flexible allocation |
| Tax Documentation | Automatic receipts for 501(c)(3) donations |

Platform Features

Transparency

Manifund distinguishes itself through radical transparency. All grants are publicly visible with full details:

| Information | Visibility |
|---|---|
| Grant Amount | Public |
| Project Description | Public |
| Regrantor Identity | Public |
| Grant Rationale | Public (regrantor writes explanation) |
| Project Updates | Public |
| Comments/Discussion | Public |
| Withdrawal Amounts | Public |

This transparency creates accountability for regrantors (whose track records are visible) and enables ecosystem learning (other funders can see what's being funded and why).

Technical Infrastructure

| Feature | Implementation |
|---|---|
| Open Source | Core platform code publicly available |
| API Access | Programmatic access to grant data |
| Embeddable Widgets | Projects can embed funding progress |
| Notification System | Updates on projects, comments, funding |
| Mobile Support | Responsive web design |

Comparison with Other EA Funders

| Dimension | Manifund | Long-Term Future Fund | Coefficient Giving | SFF |
|---|---|---|---|---|
| Speed | Days to 1 week | 4-8 weeks | 3-12 months | Quarterly |
| Typical Grant Size | $5K-75K | $20K-200K | $100K-10M+ | $50K-500K |
| Due Diligence | Light (regrantor discretion) | Medium | Heavy | Medium |
| Decision Process | Individual regrantors | Committee | Staff + external review | Committee |
| Transparency | All grants public | Grant reports public | Grant reports public | Recommendations public |
| Fiscal Sponsorship | Yes (5% fee) | Through CEA | No | No |
| Focus | AI safety, EA broad | AI safety, longtermism | Strategic causes | X-risk, EA infra |

When to Apply to Manifund vs. Other Funders

Apply to Manifund when:

  • You need funding quickly (weeks, not months)
  • Grant size is $5K-75K
  • Project is early-stage or speculative
  • You lack nonprofit status
  • You want to test an idea before seeking larger funding

Apply to LTFF/Coefficient Giving when:

  • Project requires $100K+
  • You have time for longer review process
  • Project has clear track record to evaluate
  • You need multi-year funding commitment

Strengths and Limitations

Organizational Strengths

| Strength | Evidence |
|---|---|
| Speed | Grant recommendation to disbursement in under 1 week |
| Flexibility | Multiple funding mechanisms (regrants, impact certs, lotteries) |
| Transparency | All grants public with rationales |
| Low Overhead | Small team; 5% fee covers operations |
| Innovation | Pioneered impact certificates at scale in EA |
| Accessibility | Easy application, fiscal sponsorship available |
| Expert Networks | Top AI safety researchers as regrantors |

Limitations

| Limitation | Details | Mitigation |
|---|---|---|
| Scale | $2M annually vs. $100M+ at OpenPhil | Focus on early-stage, speculative grants |
| Due Diligence | Less thorough than major foundations | Leverage regrantor expertise; accept higher variance |
| Regrantor Availability | Grant quality depends on regrantor capacity | Maintain 10+ active regrantors |
| Sustainability | Relies on continued donor participation | Diversify funding sources; build track record |
| Focus | Primarily AI safety/EA community | Intentional; serves specific ecosystem need |

Impact Certificate Challenges

The impact certificate mechanism faced specific challenges that limited its effectiveness:

| Challenge | Observed Result |
|---|---|
| Investor Acquisition | Failed to attract investors outside EA community |
| Evaluation Burden | Team participation declined over time |
| Trading Liquidity | AMMs implemented but underutilized |
| Mainstream Adoption | Coefficient Giving essay contest participants declined certificates |

Manifund continues supporting impact certificates while recognizing that regranting has demonstrated stronger product-market fit.

Strategic Position in EA Funding Ecosystem

Manifund fills a specific niche in the effective altruism funding landscape: fast, flexible, small-to-medium grants that seed early-stage projects. This complements rather than competes with larger funders.

| Ecosystem Role | Description |
|---|---|
| Seed Funder | Provides initial funding that enables projects to later secure larger grants |
| Speed Layer | Moves money in days when other funders take months |
| Risk Tolerance | Regrantors can make speculative bets committees might reject |
| Talent Identification | Regrantors with field expertise spot promising individuals early |
| Infrastructure Provider | Fiscal sponsorship enables funding to non-charities |

Several Manifund-funded projects have subsequently received larger grants from Coefficient Giving and other major funders, validating the "quick regrants induce further funding" thesis.

Timeline

| Date | Event |
|---|---|
| Dec 2021 | Manifold Markets founded by Austin Chen, Stephen Grugett, James Grugett |
| 2022 | Manifold receives $500K from FTX Future Fund for charity prediction markets |
| 2022 | Manifund incorporated as separate 501(c)(3) |
| 2022 | Rachel Weinberg joins; builds manifund.org in two weeks |
| Late 2022 | Scott Alexander approaches team about running ACX Grants via impact certificates |
| May 2023 | Anonymous donor "D" provides $1.5M for regranting program |
| 2023 | Manifund launches regranting program ($50K-400K regrantor budgets) |
| Sep 2023 | First Manifest conference (250 attendees, Berkeley) |
| 2023 | $2.06M distributed ($2.012M grants + $45K impact certificates) |
| Q1 2024 | Impact certificate experiments conclude with mixed results |
| Jun 2024 | Manifest 2024 (600 attendees) |
| 2024 | ACX Grants 2024 includes impact market for 50+ proposals |
| 2025 | $2.25M raised for 10 regrantors |
| Jan 2025 | ACX Grants 2025 funds 42 of 654 applications |

Sources and Citations

Primary sources include EA Forum posts, ACX/Scott Alexander posts, media coverage and interviews, conference materials, and donor lottery background.

References:

  • manifund.org
  • Manifold Markets: manifold.markets

Structured Data

Financial

| Property | Value | As Of |
|---|---|---|
| Total Funding Raised | $2.3 million | 2025 |

Grants (377 records)

| Name | Amount | Date |
|---|---|---|
| Train great open-source sparse autoencoders | $4,025 | May 2024 |
| Develop an accessible, low-cost system for single-cell imaging in multiple regions of freely moving organisms | $25,000 | Feb 2024 |
| Interpretable Forecasting with Transformers | $602.5 | Feb 2023 |
| Forethought | $365,270 | Dec 2025 |
| Luthien | $170,470 | Mar 2025 |
| Investigating the Effects of IF in the reversal of Type 2 Diabetes Mellitus. | $8,500 | Mar 2024 |
| Connect For Animals: a platform for ending factory farming | $21,054 | Aug 2024 |
| Split Personality Training | $3,000 | Apr 2025 |
| Holly Elmore organizing people for a frontier AI moratorium | $5,310 | Jul 2023 |
| Lead-acid battery recycling in Philippines | $50,000 | Oct 2025 |
| AI-Plans.com | $5,370 | Jan 2024 |
| Adjacent News | $1,000 | Aug 2024 |
| African School of Economics in Zanzibar charter city | $100,000 | Feb 2024 |
| Forecasting global disasters | $50,705 | Oct 2025 |
| Hive Highlights Newsletter - farmed animal advocacy upd with 7k/month views | $1,452 | Aug 2024 |
| Tooling + Model Orgs for CoT Faithfulness Research | $3,000 | Jul 2025 |
| Recreate the cavity-preventing GMO bacteria BCS3-L1 from precursor | $40,640 | Jul 2023 |
| 'Making God': a Documentary on AI Risks for the Public | $205,094 | Mar 2025 |
| Transformative AI scenarios for EU policymakers | $30,000 | Oct 2025 |
| LEAH Coworking Space | $3,847 | Aug 2024 |
| Good Ancestors (Australia) | $65,020 | Oct 2025 |
| Coursetexts | $2,540 | May 2025 |
| Make large-scale analysis of Python code several orders of magnitude quicker | $1,000 | Mar 2023 |
| Crystal Ballin' Podcast | $2,790.08 | Mar 2023 |
| Elizabeth and Timothy Podcast on values in Effective Altruism (Funded) | $7,810 | Oct 2024 |
| Stipend to enable a US PhD student to move to the UK for a temporary role with the UK DSIT | $13,000 | Feb 2024 |
| News through prediction markets | $499 | Feb 2024 |
| Convert a hybrid car to chip wood and generate electricity | $25,100 | Feb 2024 |
| Telemedicine platform for Congo | $12,000 | Oct 2025 |
| Mechanistic Interpretability research for unfaithful chain-of-thought (1 month) | $11,000 | Nov 2024 |
| Social Media Strategy for EA Orgs | $1,246 | Aug 2024 |
| 80,000 Hours | $4,915.11 | Aug 2024 |
| CaML - AGI alignment to nonhumans | $30,000 | Nov 2025 |
| Evaluating the Effectiveness of Unlearning Techniques | $30,000 | Jun 2024 |
| Mox, a coworking & events space in SF | $261,820 | Mar 2025 |
| Synthesizing Standalone World-Models | $51,500 | Sep 2025 |
| Animal Advocacy Africa | $5,585 | Aug 2024 |
| Forecast Dissemination Mini-Market 2 of 3: Hurricane Hazards | $450 | Feb 2023 |
| Benchmarking and comparing different evaluation awareness metrics | $3,000 | Aug 2025 |
| Year one of AI Safety Tokyo | $600 | Feb 2024 |
| Hive Slack - an active community space for engaged farmed animal advocates | $3,749 | Aug 2024 |
| Kickstarting EA Latvia community | $507 | Aug 2024 |
| Native psilocybin use in Southern Africa | $13,000 | Oct 2025 |
| [AI Safety Workshop @ EA Hotel] Autostructures | $8,555 | Oct 2024 |
| Joseph Bloom - Independent AI Safety Research | $51,400 | Jul 2023 |
| Videos on UK Free Speech | $7,700 | Aug 2025 |
| Conduct field research on informal lead-acid battery recycling supply chains in Nigeria | $13,000 | Feb 2024 |
| Medical Expenses for CHAI PhD Student | $23,043 | Jun 2023 |
| VaccinateCA | $15,075 | Jul 2023 |
| AI that standardizes and curates bio datasets | $70,000 | Oct 2025 |
| Tardigrade stress tolerance genes that modified for improved performance in human cells. | $8,000 | Feb 2024 |
| Formal Certification Technologies for AI Safety | $128,000 | Nov 2025 |
| Apart incubates and facilitates hundreds of AI safety researchers around the globe. | $59,000 | Feb 2024 |
| BAIS (ex-AIS Hub Serbia) Office Space for (Frugal) AI Safety Researchers | $10,473 | Dec 2023 |
| Pilot for new benchmark by Epoch AI | $200,000 | Dec 2024 |
| Produce therapeutic food for Ethiopia | $50,000 | Oct 2025 |
| FloraForge: Aesthetic Gene Editing in Petunias | $5,000 | Nov 2025 |
| Virtue-Ethical Rationality and Training Dynamics | $20,000 | May 2025 |
| AI Safety & Society | $250,000 | Dec 2024 |
| Understanding SAE features using Sparse Feature Circuits | $11,000 | Jun 2024 |
| Publish a book on Egan education for parents | $5,602.89 | Feb 2024 |
| Covid Work By Elizabeth VN/Aceso Under Glass | $1,484 | Aug 2024 |
| Invest in the Conflux Manifold Media Empire(??) | $310.58 | Nov 2023 |
| "Contemporary Issues in Animal Rights Law" online course | $1,380 | Aug 2024 |
| Manifold x College Admissions | $949.32 | Dec 2023 |
| Impact Assessment of Social Programs | $3,500 | Feb 2023 |
| Improving fish welfare through producer and retail outreach in Türkiye | $15,000 | Feb 2024 |
| Help a Bootstrapped AI Risk Literacy Founder Get To IASEAI 2026 in Paris | $3,036 | Jan 2026 |
| Making God's Post-Production: Aiming for Netflix | $550 | Aug 2025 |
| Good Ancestors Policy expenses | $15,100 | Sep 2023 |
| Influenza Pandemic Protection on Day 1 through universal priming vaccines | $30,000 | Feb 2024 |
| Trading bot guide | $20 | Dec 2023 |
| SDCPNs for AI Safety | $39,009 | Oct 2025 |
| Safe AI Germany (SAIGE) | $9,201 | Jan 2026 |
| Teamwork - professional EA co-working space in Berlin | $1,868 | Aug 2024 |
| Forecasting Meetup Network - Washington, DC pilot (4 meetups) | $677 | Aug 2024 |
| Charge selective large pore membranes for artificial implantable kidney. | $80,000 | Feb 2024 |
| Compute for 4 MATS scholars to rapidly scale promising new method pre-ICLR | $16,047 | Aug 2024 |
| Distribute HPMOR copies in Bangalore, India | $1,000 | Feb 2024 |
| Visa fee support for Australian researcher to join a fellowship with Anthropic | $4,000 | Nov 2025 |
| Mitigating Reward Hacking Through RL Training Interventions | $7,900 | Feb 2026 |
| Funding researcher on frontier AI governance inclusive of global majority | $87,543 | Jul 2025 |
| Gene editing corn for nutrition | $10,000 | Oct 2025 |
| Is electrical stunning a humane slaughter method for decapods? | $5,020 | Oct 2025 |
| The Grass is Greener: Skirting Deflation via Strong Overseas Growth. Again. | $40 | Jan 2024 |
| Manifold merch store | $302 | Nov 2023 |
| YIMBY movement for African cities | $85,000 | Oct 2025 |
| BLAM Liberia Seed Project | $10,000 | Oct 2025 |
| Doom Debates - Podcast & debate show to help AI x-risk discourse go mainstream | $41,195 | Aug 2024 |
| Build anti-mosquito drones | $28,630 | Feb 2024 |
| Field building in universities for AI policy careers in the US | $25,000 | Dec 2023 |
| Demonstration of LLMs deceiving and getting out of a sandbox | $3,110 | Jun 2025 |
| Improve eligibility for organ donation | $105,000 | Oct 2025 |
| Independent research to improve SAEs (4-6 months) | $55,025 | Apr 2024 |
| Finishing a Video on the Google DeepMind Hunger Strike | $1,800 | Oct 2025 |
| Accelerating the AI Safety talent pipeline in South Africa \| Matched donations | $656 | Aug 2024 |
| AI Policy work @ IAPS | $10,050 | Dec 2023 |
| Write a good primer on political change. | $1,500 | Feb 2024 |
| Evitable: a new public-facing AI risk non-profit | $8,982 | Dec 2025 |
| AI-Plans.com Critique-a-Thon $500 Prize Fund Proposal | $500 | Jul 2023 |
| Metaculus x Givewell Forecasting Tournament | $4,500 | Nov 2024 |
| Athena - New Program for Women in AI Alignment Research | $20,000 | Dec 2023 |
| Scaling AI safety awareness via content creators | $21,309 | Apr 2025 |
| Eradicate the hidden factory farming of feeder rodents | $50,200 | Oct 2025 |
| Lost Ones in-person meetups | $10 | Aug 2024 |
| Run a public online Turing Test with a variety of models and prompts | $3,149 | Feb 2024 |
| Calibration City | $8,864 | Aug 2024 |
| Progress Studies YouTube Channel | $20,000 | Dec 2025 |
| Promote Georgism | $100,000 | Feb 2024 |
| Fund a Fellow for the Cooperative AI Research Fellowship! | $2,530 | Sep 2025 |
| Developing a Course on AI x-risk | $10,000 | Aug 2024 |
| Scenario analysis and planning platform for more effective developing world agriculture projects | $20,000 | Feb 2024 |
| Superforecaster predictions of long-term impacts | $4,548.33 | Feb 2023 |
| Forecast Dissemination Mini-Market 3 of 3: Earthquake Risk | $345 | Feb 2023 |
| Scoping Developmental Interpretability | $144,650 | Jul 2023 |
| Charting the Policy Landscape of Advanced Education in America | $8,350 | Oct 2025 |
| Blog about Forecasting Global Catastrophic Risks | $1,449.55 | Feb 2023 |
| Assess online training platform for health workers in Nigeria | $25,000 | Feb 2024 |
| Isolating CBRN Knowledge in LLMs for Safety - Phase 2 (Research) | $150,000.16 | Dec 2025 |
| A Lumenator Company, or: A More Ambitious Life Trajectory | $6,000 | Sep 2023 |
| General Career Support | $38,000 | Nov 2025 |
| AI scam research targeting seniors | $10,000 | Aug 2025 |
| Replicate and extend brain entrainment results | $32,500 | Feb 2024 |
| Attention-Guided-RL for Human-Like LMs | $3,100 | Mar 2025 |
| Estimating annual burden of airborne disease (last mile to MVP) | $8,200 | Oct 2023 |
| Sentience Politics | $535 | Aug 2024 |
| Study land reform | $5,000 | Feb 2024 |
| Train LLMs in honest introspection | $15,000 | Oct 2025 |
| Introductory resources for Singular Learning Theory | $10,650 | Jul 2023 |
| Seed Funding For Geodesic Research | $201,000 | Jan 2026 |
| The AI Governance Archive (TAIGA) | $1,485 | Aug 2024 |
| Support funding for Hardeep Gambhir's gap semester | $3,999 | Aug 2023 |
| AI to screen DNA orders | $26,000 | Oct 2025 |
| Project on optimal government hedging against labour automation | $9,000 | Aug 2025 |
| Platform for migrants to start legal & profitable microbusinesses. | $4,050 | Feb 2024 |
| MATS Program | $290,193 | Oct 2023 |
| Dads Against AGI Inc. | $20,181 | Mar 2025 |
| Support for Deep Coverage of China and AI | $37,567 | Jul 2023 |
| Build an app to track ones impact on animal cruelty | $1,000 | Feb 2024 |
| Rabbitholeathon Event Food | $500 | Dec 2024 |
| PIBBSS - General Programs funding or specific funding | $51,200 | Jul 2024 |
| Establishing the Utilitarian School of Thought in Thai Society | $1,012 | Feb 2025 |
| Animal Advocacy Innovation Hub in the Bay Area | $4,100 | Feb 2025 |
| Coordinal Research: Accelerating the research of safely deploying AI systems. | $125,062 | Apr 2025 |
| Using M&E to increase impact in the animal cause area, by The Mission Motor. | $3,372 | Aug 2024 |
| Implicit planning in LLMs Paper | $1,000 | May 2025 |
| Clean Indoor Air for Schools | $150,000 | Oct 2025 |
| Bayesian modelling of LLM capabilities from evals | $32,000 | Dec 2024 |
| Disentangling Political Bias from Epistemic Integrity in AI Systems | $50,000 | Oct 2025 |
| An online platform to solve coordination problems and generate leverage through collective action campaigns | $17,090 | Feb 2024 |
| BioBind for the Amazon | $1,050 | Aug 2024 |
| Offline AI voice restoration device for people with voice disorders | $25,000 | Oct 2025 |
| 3 Months Career Transition into AI Safety Research & Software Engineering | $32,000 | Sep 2025 |
| Develop technical framework for human control mechanisms for agentic AI systems | $10,030 | Feb 2025 |
| Telegram bot for Manifold Markets | $1,251.41 | Feb 2023 |
| Out of This Box: AI Safety Musical | $19,205 | Mar 2025 |
| Bridge Funding for the Sydney AI Safety Hub (SASH) | $5,167 | Feb 2025 |
| One semester living expenses for MIT-based researcher | $500 | Jul 2023 |
| Build a new nonprofit thinktank that advances technological solutions to animal welfare challenges | $100,000 | Feb 2024 |
| Scale up biologics manufacturing | $50,000 | Oct 2025 |
| Animal Advocacy Strategy Forum | $2,942 | Aug 2024 |
| Artificial General Intelligence (AGI) timelines ignore the social factor at their peril | $650 | Mar 2023 |
| Vaccine manufacturing platform | $50,000 | Oct 2025 |
| AI-assisted low-cost ultrasound scanner | $23,000 | Oct 2025 |
| Athena 2.0 | $8,000 | Aug 2024 |
| Giving What We Can | $3,173 | Aug 2024 |
| AI Animals and Digital Minds 2025 | $17,380 | Aug 2024 |
Kickstarting For Good - High-Impact Nonprofit Incubation Program$2,631Aug 2024
Operating Capital for AI Safety Evaluation Infrastructure$400,000Oct 2025
Emergency travel funding to attend EA Global: New York 2025$500Oct 2025
Manifold Tournaments$262.5Nov 2023
General support for SaferAI$100,025Feb 2025
Act I: Exploring emergent behavior from multi-AI, multi-human interaction$67,832.77Aug 2024
Standardized Tools for Impact Market Reporting$1,263.53Feb 2023
Long-Term Future Fund$141,457Sep 2023
Forecast Dissemination Mini-Market 1 of 3: College Majors$550Feb 2023
human intelligence amplification @ Berkeley Genomics Project$107,502Mar 2025
Support early career journalists covering AI by providing training, funding, and placements at major news outlets.$32,000Feb 2024
Preventing Worst Case Pandemics Symposium @ Cambridge$4,100May 2024
Orexin Pilot Experiment for Reducing Sleep Need$5,170May 2025
Experiments to test EA / longtermist framings and branding$26,760Nov 2023
Advertise Christians for Impact (EA for Christians)$40,000Oct 2025
Mapping neuroscience and mechanistic interpretability $5,950Dec 2023
PauseAI US 2025 through Q2$100,000Nov 2024
Lightcone Infrastructure$237,365May 2024
EEG using a generalizable ML model + 32 channel PCB$2,500Jan 2024
Combat global antibiotic resistance$60,000Feb 2024
Shallow review of AI safety 2024$20,860Oct 2024
Development of a Cautionary Tale Feature Film about Gradual Disempowerment$100,000Jul 2025
Agent Island: An Environment for Interagent Cooperation and Conflict$2,000Feb 2026
OpenReg: AI tools for FDA application and reg compliance$50,000Oct 2025
PauseAI local communities - volunteer stipends$93,574Aug 2024
AI Governance YouTube Channel$732Aug 2024
AI-Driven Market Alternatives for a post-AGI world$15,745Jul 2024
Making 52 AI Alignment Video Explainers and Podcasts$15,269Feb 2024
Finishing The SB-1047 Documentary$118,987Oct 2024
AI Safety Research Organization Incubator - Pilot Program$15,977Nov 2023
3 months full-time contributing software to Inspect$20,000Aug 2025
Fund thebes' model tinkering$26,402Nov 2025
Shanzhai Intelligence and the role of AI in China’s future$40Jan 2024
Democratising charity evaluations$501Sep 2024
AI, Animals, and Digital Minds 2024 Conference and Retreat$5,109Jun 2024
The Deal of the Century: Targeted Persuasion Campaign for a US-China AI Treaty$11,050Oct 2025
Manifold feature to improve non-resolving popularity markets$800Mar 2023
Research Staff for AI Safety Research Projects$26,710May 2024
Optimizing clinical Metagenomics and Far-UVC implementation.$41,800Aug 2023
China’s Incoming AI Ethics Leadership Push$40Jan 2024
Keep Apart Research Going: Global AI Safety Research & Talent Pipeline$130,926May 2025
Systems that "give a damn"$12,000Aug 2025
Software to detect data fabrication$50,000Oct 2025
Finish Video Game about AI Girlfriends and Social Peril$20,000Oct 2025
New Roots Institute: Empowering the Next Generation to End Factory Farming$15,531Aug 2024
Youtube tutorial videos about computational biology and bioinformatics$1,000Feb 2024
12th Edition of AI Safety Camp$40,190Jan 2026
aCFAR 2025/6 Fundraiser$3,250Dec 2025
Human Inductive Bias Project$10,000Oct 2025
Create a platform for assurance contracts that looks nice.$17,000Feb 2024
1200€ to move in SF for an international high-level event and meetings$1,300Nov 2024
TransformerLens - Bridge Funding$30,000Jul 2025
OPTIC$12,926.2Mar 2023
Pro-Animal Future$4,020Aug 2024
AI Safety Los Angeles (AISLA)$2,500Jul 2025
Support EA NZ's operations$1,531Aug 2024
Retroactive: Presenting a poster at the ICML technical AI governance workshop$1,300Nov 2025
Help launch the Technical Alignment Research Accelerator (TARA)!$14,901Nov 2024
Giving free AI safety books for potentially high-impact individuals$761Jun 2025
Advocate for a specialized pandemic response team at the FDA$100,000Feb 2024
make and sell far-UVC lamps for air disinfection$50,000Oct 2025
FarmKind - A donation platform creating new donors for effective animal advocacy$2,581Aug 2024
N.C. Young's Umbrella Project$393.03Nov 2023
Asterisk AI Blogging Fellowship$70,000May 2025
Alignment Is Hard$6,070Jul 2023
Manifold Markets Add-on for Google Sheets$879.69Feb 2023
AI scam research targeting seniors$4,500Jun 2025
Create open source predictors for various genetically influenced traits such as intelligence and disease risk$20,000Feb 2024
Manufacture Manyfold Manifolders in the Maritime Metropolis$800Nov 2023
Inspect Evals$50,000Sep 2025
Comparing forecasting platform accuracy$2,173.59Mar 2023
Help Apart Expand Global AI Safety Research$17,732Dec 2023
Why I think strong general AI is coming soon$60Mar 2023
Talos Network$1,000Dec 2023
CEEALAR (EA Hotel) Needs a New Roof$3,476Sep 2023
Manifold: Live!$1,927Nov 2023
Wasabipesto's Umbrella Project$3,500Nov 2023
EA Christian / CFI Community$4,902Aug 2024
Educate the public about high impact causes$2,050Feb 2024
"Sweeten" training corpuses with stories of nice AI$5,000Oct 2025
AI safety fieldbuilding in Warsaw, Poland (funding for 1 semester)$10,024Nov 2024
Wise AI: Fine-Tune an Open Source Model$5,550Jan 2024
11th edition of AI Safety Camp$45,115Dec 2024
Acausal research and interventions$80,050Jun 2025
Activation vector steering with BCI$30,260Jul 2023
Understanding Trust$100,000May 2025
Forecasting - AI Governance Policies$9,000Aug 2023
Scaling Training Process Transparency$5,150Nov 2023
Database of the Most Impactful Research Questions by Discipline and Cause Area$500Feb 2024
Runway till January: Amplify's funding ask to market EA & AI Safety $520Nov 2025
The Base Rate Times$4,275Jul 2023
Satoshi's Antithesis: China to use CBDC to Boost CNY Trade$40Jan 2024
Make ALERT happen$18,466Dec 2023
Biosecurity bootcamp by EffiSciences$1,300Jan 2025
Avoiding Incentives for Performative Prediction in AI$33,200Aug 2023
Pandemic Interventions Course - Introductory Biosecurity Syllabus$1,117Aug 2024
WhiteBox Research’s AI Interpretability Fellowship$2,005Jul 2024
Graduate School Application Fee for Students from Third World Country$5,000Apr 2024
Thrive Philanthropy$12,646Aug 2024
Personal development and better infrastructure for learning, "Anki V2"$10,000Nov 2023
Help me create a free programming school in my city$800Nov 2024
Enabling a Student to Present his Mechanistic Interpretability Work at NeurIPS $1,100Oct 2025
Commissions for a Cause - a Profit for Good project $801Aug 2024
Support a thriving and talented community of Filipino EAs$43,700Mar 2024
Isaac's Blog Writing$1,034.46Nov 2023
Towards Shared AI Compute Infrastructure$23,000Jan 2026
Update Big List of Cause Candidates$1,000Dec 2023
Travel grant to present AI safety paper at ACM FAccT$1,650Jun 2024
Metaculus' First Animal-Focused Forecasting Tournament$3,400Oct 2025
Building Effective Giving ecosystem in India$30,000Oct 2025
Removing Hazardous Knowledge from AIs$190,000Dec 2023
AI Policy Breakthroughs — Empowering Insiders$20,000Jul 2024
CHAT Diplomacy$50May 2023
Use smartphone pupillometry to triage brain injury in rural Missouri.$60,000Feb 2024
The Looming Super-Bug Crisis$40Jan 2024
Support Riesgos Catastroficos Globales$32,500Sep 2023
Seed fund for boundaries-based empirical AI safety projects after workshop.$40,000Feb 2024
Budget for events aiming to "mainstream" EA in Sweden$1,047Aug 2024
Explainer and analysis of CNCERT/CC (国家互联网应急中心)$3,000Dec 2023
Live Governance$3,000Nov 2024
Mirrorbot$1,593.12Nov 2023
EVN General Support Application$50,000Nov 2024
Visa fee support for US researcher to take on a temporary role with the UK AISI$3,000Feb 2024
Future-Proofing Forecasting: Easy Open-Source Solution$2,071Aug 2024
Travel funding to the International Conference on Algorithmic Decision Theory.$1,170Sep 2024
Guaranteed Safe AI Seminars 2026$30,000Oct 2025
"Litigating and Legislating for Animal Rights" Online Seminar Series$781Aug 2024
Next Steps in Developmental Interpretability$80,680Jul 2024
Effective Altruism Meetup, Abuja, Nigeria$1,448Aug 2024
London Manifold.love dating shows$2,538.91Dec 2023
Play money prediction markets$2,400Aug 2024
Scaling Legal Impact for Chickens to make factory-farm cruelty a liability$5,674Feb 2024
Compute and other expenses for LLM alignment research$400,100Aug 2023
Run five international hackathons on AI safety research$10,950Jul 2023
AI Safety Textbook$41,070Apr 2024
mRNA for airway disease$80,000Oct 2025
Reflective altruism$2,000Jul 2023
Neuronpedia - Open Interpretability Platform$2,500Jul 2023
Apollo Research: Scale up interpretability & behavioral model evals research$339,409Jul 2023
VoiceDeck$538Aug 2024
Unprompted Unfaithful Chain of Thought Dataset Project$2,000Jan 2025
SafePlanBench: evaluating a Guaranteed Safe AI Approach for LLM-based Agents$1,975Jan 2025
Normie-friendly prediction market interfaces$7,000Oct 2025
Fatebook and Quantified Intuitions$1,025Aug 2024
Design budget for rebuilding the dating site we all want back. $5,000Jun 2024
Funding for Solar4Africa app development$500Jul 2023
AI Governance Exchange (focus on China, AI safety), Seed Funding$12,365Mar 2025
Hallucination Detector$6,000May 2025
Impact Accelerator Program: Biggest career program for experienced professionals$978Aug 2024
Exploring feature interactions in transformer LLMs through sparse autoencoders$8,500Dec 2023
Measuring the Epistemic Cost-Effectiveness of AI Safety Content$6,060Oct 2025
HIV/TB clinic in Africa$45,000Oct 2025
Search for more solar system objects$6,000Oct 2025
Journal of Animal Rights Law$911Aug 2024
Fish welfare training in Nigeria$10,000Oct 2025
A tool for making well sized (~Kelly optimal) bets on manifold$10,890.93Feb 2023
AI Crisis Convening at India AI Impact Summit 2026$100,000Jan 2026
AI Security Startup Accelerator Batch #2$355,000Dec 2025
Cadenza Labs: AI Safety research group working on own interpretability agenda$7,810Nov 2023
Philanthropic advising$50,000Mar 2025
The Metascience Observatory $35,000Oct 2025
Testing and spreading messages to reduce AI x-risk$12,600Aug 2024
Fund Sentinel for Q1-2025$42,384Sep 2024
Collaborative Resolution of Political Controversies$1,714Mar 2023
Overcoming inertial barriers to collective action through anonymous coordination$822Aug 2024
Funding for AI safety comms strategy & career transition support$38,952Aug 2024
Start an online editorial journal focusing on paradigm development in psychiatry and psychology$10,020Feb 2024
Empirical research into AI consciousness and moral patienthood$12,020Sep 2023
SirCryptomind Moderation$400Feb 2024
Validate a solution for ozone generation from far-UVC germicidal light$25,000Feb 2024
Provide Rationality / EA Education at University of Maryland$7,800Mar 2023
10th edition of AI Safety Camp$68,463Dec 2023
Automatic circuit discovery on sparse autoencoded features$25,000Dec 2023
Effective Giving in New Zealand$25,000May 2024
Helping to change the stray animal crisis in Mexico$600Aug 2024
Develop an app or website to help parents choose between IVF clinics$20,000Feb 2024
Isaac's power-user Manifold search and dashboard$2,478.55Nov 2023
Discovering latent goals (mechanistic interpretability PhD salary)$1,590Jul 2023
Screwworm Free Future: Seizing the Eradication Window$55,599Oct 2025
Reducing Risk in AI Safety Through Expanding Capacity.$11,025Dec 2025
Trading assistant bot (Remind Me)$230.46Nov 2023
Exploring novel research directions in prosaic AI alignment$30,000Nov 2023
[Urgent] Top-up funding to present poster at the Tokyo AI Safety Conference $700Mar 2025
Improving farmed fish welfare in Egypt$5,000May 2025
Investigating and informing the public about the trajectory of AI$5,975Jan 2025
Evaluating Model Attack Selection and Offensive Cyber Horizons$41,000Dec 2025
Groundless Alignment Residency 2025$15,000Sep 2025
Translation of BlueDot Impact's AI alignment curriculum into Portuguese$3,300Dec 2023
AI Digest$97,730Jun 2025
AI forecasting and policy research by the AI 2027 team$44,022May 2025
Regrant to charities working on developing country nutrition$100,012Feb 2024
Automated forecasting$25,000Oct 2025
Research introduction of new antibiotics$78,000Oct 2025
Civitech Incubator$10,000Jan 2026
The first public benefit biotechnology company with an explicit goal of involuntary suffering abolition.$50,020Feb 2024
Building a Culture of Care: Educating on Animal Welfare in Somalia$520Aug 2024
Survey for LLM Self-Knowledge and Coordination Practices$10,000Aug 2024
Help AIs create AI safety tools$80,000Oct 2025
AI Safety Reading Group at metauni [Retrospective]$815Aug 2024
Grow An AI Safety Tiktok Channel To Reach Ten Million People$29,700Aug 2025
A Happier World (YouTube channel promoting EA ideas)$2,790Apr 2024
WhiteBox Research: Training Exclusively for Mechanistic Interpretability$12,420Aug 2023
AI Alignment Research Lab for Africa$2,450Jul 2023
Building Bridges: Effective Animal Advocacy in the Global South$2,667Aug 2024
The First Workshop on Mechanistic Interpretability for Vision$1,500May 2025
Aquatic Animal Alliance: A Global Movement for Neglected Species$4,212Aug 2024
Travel funding for International Conference on Learning Representations$1,500May 2024
Subsidize Real Money Prediction Markets on High Impact Topics$8,042Mar 2023
A Critical Study of the EU’s Anti-Subsidy Probe into Chinese Electric Vehicles$40Jan 2024
Support for SAELens and other Decode Research Projects$6,800Aug 2025

Related Pages


Approaches

Prediction Markets (AI Forecasting)

Analysis

Stampy / AISafety.info

Other

Leopold Aschenbrenner
Nuño Sempere
Robin Hanson
Evan Hubinger
Eliezer Yudkowsky
Paul Christiano

Organizations

MATS (ML Alignment Theory Scholars program)
Seldon Lab
Lighthaven (Event Venue)
Sentinel (Catastrophic Risk Foresight)
Polymarket
Gratified

Concepts

Funders Overview
EA Funding Absorption Capacity