Longterm Wiki

Lightspeed Grants

web
lightspeedgrants.org

Lightspeed Grants is a fast-funding initiative run by Lightcone Infrastructure that distributes $5M to projects addressing existential risks (particularly AI safety, biosecurity, and improving global decision-making) through a streamlined multi-funder coordination process.

Metadata

Importance: 42/100

Summary

Lightspeed Grants is a $5M grant program run by Lightcone Infrastructure aimed at funding projects that improve humanity's long-term future, with a focus on AI existential risk reduction, biosecurity, and improving key decision-makers' reasoning. The program uses a novel multi-funder coordination mechanism (S-Process) to align funding decisions while preserving individual funder autonomy. It emphasizes speed as a critical factor, aiming to reduce funding uncertainty that often derails high-impact projects.

Key Points

  • Distributes $5M to projects addressing existential risks, especially AI safety, biosecurity, and improving global decision-making.
  • Uses the S-Process mechanism for philanthropic coordination among multiple funders while preserving individual funder preferences.
  • Prioritizes speed: urgent requests receive responses within 14 days, addressing a key bottleneck for high-impact projects.
  • Run by Lightcone Infrastructure, with evaluators selected for reasoning ability and networks in the longtermist/EA community.
  • Open to projects outside core focus areas; aims to onboard new funders interested in existential risk reduction.

Cited by 1 page

Page | Type | Quality
Jaan Tallinn | Person | 53.0

Cached Content Preview

HTTP 200 · Fetched Apr 9, 2026 · 4 KB
Lightspeed Grants

Fast funding for projects that help humanity flourish among the stars

We are distributing $5,000,000 to projects that make a difference to humanity's future light cone.

Applications close on July 6th, 2023; grant decisions will be communicated by August 6th, 2023. Urgent requests receive a response within 14 days of application. More info and discussion on the LessWrong announcement.

Apply [Applications are closed]
Who should apply?

We are looking for projects that have a chance of substantially changing humanity's future trajectory for the better. The areas in which this kind of change seems most likely to us are:

Reducing the probability of existential catastrophe from Artificial Intelligence
Preventing the development or propagation of novel biological pathogens
Improving the reasoning of key global decision-makers
Uncovering crucial considerations that could shift our understanding of the future

Applications that fall outside of these focus areas are still welcome, and we expect to fund a substantial number of projects that don't narrowly fall into any of the categories above.

For more information and discussion, see our LessWrong launch post.
 

How does the Lightspeed Grants process work?

The basic goal of the process is to allow multiple funders to coordinate on funding decisions while maintaining maximum freedom for each funder to fund whatever projects they are excited about. The process works as follows:

Every grant evaluator decides how much value each project creates at various funding levels.
Funders review the evaluators' reasoning.
We find an allocation of funds that's fair and maximizes the funders' and evaluators' expressed preferences (using a number of somewhat dubious but probably not too terrible assumptions).
Funders can adjust how much money they want to distribute after seeing everyone's evaluations, including fully pulling out.
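The allocation step in the process above can be illustrated with a toy greedy sketch. This is a hypothetical simplification with made-up numbers, not the actual S-Process: assume evaluators report the marginal value of each successive $10k funding increment per project, and dollars flow to the highest-valued increments first.

```python
BUDGET = 50_000
INCREMENT = 10_000

# Hypothetical evaluator judgments: marginal value of the 1st, 2nd, ...
# $10k increment for each project (made-up numbers, assumed diminishing).
marginal_values = {
    "project_a": [9, 7, 2],
    "project_b": [8, 3],
    "project_c": [5, 5, 5],
}

def allocate(marginal_values, budget, increment):
    """Greedily fund the highest-marginal-value increments first."""
    # Rank all (project, increment_index) pairs by value, highest first;
    # ties keep earlier increments of a project ahead of later ones.
    ranked = sorted(
        ((value, project, i)
         for project, values in marginal_values.items()
         for i, value in enumerate(values)),
        key=lambda t: (-t[0], t[2]),
    )
    allocation = {project: 0 for project in marginal_values}
    for value, project, i in ranked:
        if budget < increment:
            break
        # Fund increment i only if increments 0..i-1 are already funded
        # (this relies on the diminishing-values assumption above).
        if allocation[project] == i * increment:
            allocation[project] += increment
            budget -= increment
    return allocation

print(allocate(marginal_values, BUDGET, INCREMENT))
# → {'project_a': 20000, 'project_b': 10000, 'project_c': 20000}
```

In the real process, funders then review the evaluations and can adjust or withdraw their contributions before the final allocation is settled.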

 
 
 Why?

We've been doing various forms of grantmaking for many years, through projects like the Survival and Flourishing Fund and the Long Term Future Fund, and we think we can do better, both in grant quality and in applicant experience.

In our experience running projects aimed at reducing existential risk from AI, we've also found that speed is a key variable that often makes or breaks a project: waiting on funding, and the uncertainty around it, is frequently the breaking point. We aim to do better.

 The Lightspeed Grants process also aims to be a way for new funders who want to contribute to humanity's long-term survival and flourishing to learn about good funding opportunities. This seems particularly important given the recent uptick in interest in existential risks from Artificial Intelligence.

 If you are a funder interested in funding applicat

... (truncated, 4 KB total)
Resource ID: ddf336ad95b26cb6