Longterm Wiki

FLI Grant Program: Mitigating AI-Driven Power Concentration


Credibility Rating

Good (3/5)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: Future of Life Institute

This is FLI's grant program page for funding projects that mitigate AI-driven power concentration. It lists 13 funded projects totaling ~$4M across technical, journalistic, and policy interventions, and is directly relevant to AI governance and to existential risk from concentrated AI power.

Metadata

Importance: 62/100 · other · reference

Summary

The Future of Life Institute allocated up to $4M across 13 projects aimed at mitigating AI-driven power concentration, spanning technical interventions, investigative journalism, and policy research. The program addresses risks including Orwellian surveillance, corporate monopolies, and erosion of individual agency. Funded organizations include CAIS, OpenMined, Mila, and the Collective Intelligence Project.

Key Points

  • FLI awarded 13 grants totaling ~$4M to address AI-driven power concentration across technical and non-technical domains.
  • Largest grant ($1.66M) went to OpenMined Foundation, focused on privacy-preserving AI infrastructure.
  • Projects range from investigative journalism (CalMatters, Bulletin of Atomic Scientists) to technical AI safety research (CAIS, Mila).
  • FLI identifies power concentration risks as: resource hoarding, media/information control, and political authority capture.
  • Program aims to seed a new field of research and action against ungoverned AI acceleration concentrating power.

Cached Content Preview

HTTP 200 · Fetched Apr 11, 2026 · 16 KB
How to mitigate AI-driven power concentration

We're offering up to $4M to support projects that work to mitigate the dangers of AI-driven power concentration and move towards a better world of meaningful human agency.

Status: Funds allocated

Last fall, FLI released a Request for Proposal to support projects to mitigate AI-driven power concentration. Our initial goal was to understand who is working on this problem and how they are doing so, and to fund promising projects.

We awarded grants to 13 projects addressing this issue, ranging from technical interventions to investigative journalism. These projects represent early efforts in what we hope will become a robust field of research and action.

Grants archive

An archive of all grants provided within this grant program (no further details are available for individual grants):

| Project title | Amount recommended | Primary investigator |
| --- | --- | --- |
| Bulletin of the Atomic Scientists | $315,975.00 | Barb Vicory |
| CalMatters | $350,000.00 | Louise Yokoi |
| Cambridge Existential Risk Initiative | $157,565.00 | Keir Reid |
| Center for AI Safety (CAIS) | $500,000.00 | Connor Smith |
| Center for Collective Intelligence Design | $460,096.00 | Kathy Peach |
| Collective Intelligence Project | $400,000.00 | Joal Stein |
| Convergence Analysis | $280,000.00 | David Kristoffersson |
| Formation Research | $80,300.00 | Alfie Lamerton |
| Institute for Security and Technology | $242,981.00 | Emma Hollingsworth |
| Mila | $446,762.00 | Maximilian Puelma Touzel |
| OpenMined Foundation | $1,661,750.00 | Andrew Trask |
| Peace Research and Education Program | $350,000.00 | Marine Ragnet |
| Project Libertas | $395,600.00 | Luke Kemp |

Request for Proposal

I. FLI launching new grants to oppose and mitigate AI-driven power concentration

 AI development is on course to concentrate power within a small number of groups, organizations, corporations, and individuals. Whether this entails the hoarding of resources, media control, or political authority, such concentration would be disastrous for everyone. We risk governments tyrannising with Orwellian surveillance, corporate monopolies crushing economic freedom, and rampant decision aut

... (truncated, 16 KB total)
Resource ID: f70336b31548bb41