Lightcone Infrastructure - Manifund
manifund.org/projects/lightcone-infrastructure
This Manifund project page solicits funding for Lightcone Infrastructure, which operates LessWrong and Lighthaven, key community hubs for AI safety discourse and researcher coordination, making it relevant to the AI safety knowledge ecosystem.
Metadata
Importance: 25/100
Summary
A Manifund funding project page for Lightcone Infrastructure, an organization that supports the AI safety research ecosystem by providing physical and operational infrastructure, including hosting the LessWrong platform and operating the Lighthaven campus in Berkeley, where it hosts conferences and programs. The page represents a crowdfunding or regrant opportunity for supporting this research-enabling organization.
Key Points
- •Lightcone Infrastructure maintains key AI safety community platforms including LessWrong and the AI Alignment Forum
- •The organization provides physical event spaces and community infrastructure for AI safety researchers
- •Listed on Manifund, a platform for funding effective altruism and AI safety related projects
- •Supports the broader AI safety research ecosystem through operational and logistical infrastructure
- •Represents an indirect but important support layer for AI safety research coordination
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Lighthaven (Event Venue) | Organization | 40.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 98 KB
Lightcone Infrastructure | Manifund
Collected by: Common Crawl
Wayback Machine capture: https://web.archive.org/web/20260122195523/https://manifund.org/projects/lightcone-infrastructure
Lightcone Infrastructure
Categories: Science & technology · Technical AI safety · AI governance · EA Community Choice · EA community · Forecasting · Global catastrophic risks
Oliver Habryka
Status: Active | Grant
$237,365 raised of a $1,000,000 funding goal
TL;DR: Lightcone Infrastructure, the organization behind LessWrong, Lighthaven, the AI 2027 website[1], the AI Alignment Forum, and many other things, needs about $2M to make it through the next year. Donate directly, send me an email, DM, signal message (+1 510 944 3235), or leave a public comment on this post if you want to support what we do. We are a registered 501(c)3 and are IMO the best bet you have for converting money into good futures for humanity.
We build beautiful infrastructure for truth-seeking and world-saving.
The infrastructure we've built over the last 8 years coordinates and facilitates much of the (unfortunately still sparse) global effort that goes into trying to make humanity's long-term future go well. Concretely, we:
build and run LessWrong.com and the AI Alignment Forum
build and run Lighthaven, a ~30,000 sq. ft. campus in downtown Berkeley where we host conferences, researchers, and various programs dedicated to making humanity's future go better
designed and built the websites for AI 2027, If Anyone Builds It, Everyone Dies, AI Lab Watch, and other public communication projects
act as leaders of the rationality, AI safety, and existential risk communities. We run conferences (less.online) and residencies (inkhaven.blog), participate in discussions on various community issues, notice and try to fix bad incentives, build grantmaking infrastructure, help people who want to get involved, and lots of other things.
In general, we try to take responsibility for the end-to-end effectiveness of these communities. If there is some kind of coordination failure, or part of the engine of impact that is missing, I aim for Lightcone to be an organization that jumps in and fixes that, whatever it is.
As far as I can tell, the vast majority of people who have thought seriously about how to reduce existential risk (and have evaluated Lightcone as a donation target) think we are highly cost-effe
... (truncated, 98 KB total)
Resource ID: 236572a394d569c5 | Stable ID: sid_THLBNuOiaK