LessWrong
Quick Assessment
| Dimension | Assessment | Evidence |
|---|---|---|
| Influence on AI Safety | High | Material influenced formation of MIRI and CFAR; attracted major tech donors; 31% of EA survey respondents in 2014 first heard of EA through LessWrong Wikipedia, Rationality - LessWrong |
| Community Scale | Medium | 2016 survey had ≈3,000 respondents; 2023 survey had 558 respondents; peak 15,000 daily pageviews Wikipedia, 2023 Survey |
| Funding Base | Substantial | Over $5 million from Survival and Flourishing Fund, Future Fund, and Open Philanthropy combined EA Forum |
Overview
LessWrong is a community blog and forum focused on discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence Wikipedia. Founded in 2009, it has grown to become the primary online hub for what is now called the “rationalist community”—described as “a 21st-century movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten” Rationalist community - Wikipedia.
The platform’s intellectual foundations derive primarily from the writings of AI researcher Eliezer Yudkowsky, whose extensive blog posts on epistemology, cognitive science, and AI alignment were compiled into “The Sequences”—over 300 posts that were later published as the ebook Rationality: From AI to Zombies by MIRI in 2015 Wikipedia. According to the Wikipedia article on the rationalist community, this material represents “basically the mainstream cognitive science approach to rationality (plus some extra stuff about language)” Rationalist community - Wikipedia, though critics argue it reflects a more culturally specific worldview.
The site’s influence extends well beyond its modest user base. In the 2010s, the rationalist community emerged as a significant force in Silicon Valley, attracting donations from tech founders including Elon Musk, Peter Thiel, Vitalik Buterin, Dustin Moskovitz, and Jaan Tallinn to rationalist-associated institutions Rationalist community - Wikipedia. The Sequences’ content directly influenced the formation of both the Machine Intelligence Research Institute (MIRI) and the Center for Applied Rationality (CFAR) Rationality - LessWrong.
History
Origins at Overcoming Bias (2006-2009)
LessWrong’s roots trace back to November 2006, when the group blog Overcoming Bias launched with AI researcher Eliezer Yudkowsky and economist Robin Hanson as principal contributors Wikipedia. Some early users were recruited through Yudkowsky’s transhumanist SL4 mailing list A Brief History of LessWrong. Over the next two years, Yudkowsky’s posts accumulated into what would become the foundational texts of the rationalist movement.
One early landmark was the Hanson-Yudkowsky AI-Foom Debate in 2008, which became one of the most influential early discussions on the platform about AI takeoff scenarios—whether advanced AI would emerge gradually or in a rapid “intelligence explosion” LessWrong Wiki.
In February 2009, these posts served as seed material for the launch of LessWrong as a dedicated platform Wikipedia. The site initially ran on a codebase forked from Reddit around 2009 Welcome to LessWrong 2.0, inheriting that platform’s upvote/downvote mechanics and threading system. At its peak, the site attracted over 15,000 pageviews daily A Brief History of LessWrong.
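To make the inherited mechanics concrete, the sketch below models Reddit-style threaded comments with net-score ranking. It is a minimal TypeScript illustration, not LessWrong’s actual data model; the type and field names are assumptions chosen for clarity.

```typescript
// Minimal sketch of Reddit-style voting and threading (hypothetical types,
// not LessWrong's actual schema).
interface ThreadedComment {
  id: string;
  body: string;
  upvotes: number;
  downvotes: number;
  children: ThreadedComment[]; // nested replies form the thread tree
}

// Net karma score; the original Reddit code offered fancier sort orders,
// but plain upvotes-minus-downvotes captures the basic mechanic.
const score = (c: ThreadedComment): number => c.upvotes - c.downvotes;

// Recursively sort each level of the thread so higher-scored siblings
// are displayed first.
function sortThread(comments: ThreadedComment[]): ThreadedComment[] {
  return [...comments]
    .sort((a, b) => score(b) - score(a))
    .map((c) => ({ ...c, children: sortThread(c.children) }));
}
```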
Decline and Revival (2015-2017)
The mid-2010s brought a period of stagnation. In 2015-2016, the site underwent a steady decline in activity, leading some observers to declare it effectively dead Wikipedia. Several factors contributed: key contributors had moved on to other projects, the aging Reddit codebase made development difficult, and much of the community’s energy had dispersed to other venues.
A key part of this dispersion was Scott Alexander, who had written on LessWrong under the pseudonym “Yvain” before launching his blog Slate Star Codex in 2013 Slate Star Codex - Wikipedia. The blog became central to what was called the “LessWrong Diaspora,” drawing significant readership from the rationalist community. In a 2017 survey asking how effective altruists first heard about EA, Slate Star Codex ranked fourth, after “personal contact,” “LessWrong,” and “other books, articles and blog posts” Slate Star Codex - Wikipedia.
The platform’s revival came in 2017, when a team led by Oliver Habryka—then in his final years as a CS undergraduate—took over administration and development Wikipedia, Welcome to LessWrong 2.0. Joined by Ben Pace and Matthew Graves, Habryka spearheaded a complete rebuild Welcome to LessWrong 2.0. For the first time, LessWrong had a full-time dedicated development team Wikipedia.
LessWrong 2.0 and Lightcone (2017-Present)
The relaunched platform, dubbed LessWrong 2.0, abandoned the aging Reddit code for a modern stack built on React, GraphQL, Slate.js, Vulcan.js, and Meteor Welcome to LessWrong 2.0. The rebuild represented not just technical modernization but a renewed commitment to cultivating high-quality intellectual discourse.
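For a sense of what the GraphQL layer in such a stack involves, here is a minimal client-side sketch of querying posts over HTTP. The endpoint URL, query fields, and resolver name are illustrative assumptions, not LessWrong’s documented API.

```typescript
// Illustrative GraphQL client call; the schema and endpoint used here are
// assumptions for the example, not LessWrong's documented API.
const RECENT_POSTS_QUERY = `
  query RecentPosts($limit: Int!) {
    posts(limit: $limit) {
      title
      author
      score
      commentCount
    }
  }
`;

async function fetchRecentPosts(endpoint: string, limit = 10): Promise<unknown> {
  const response = await fetch(endpoint, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: RECENT_POSTS_QUERY, variables: { limit } }),
  });
  if (!response.ok) {
    throw new Error(`GraphQL request failed with status ${response.status}`);
  }
  return response.json();
}
```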
In 2021, Lightcone Infrastructure evolved from the LessWrong team as a distinct organizational entity EA Forum. This organization now operates LessWrong alongside other projects, having received substantial philanthropic support: over $2.3 million from the Survival and Flourishing Fund, $2 million from the Future Fund, and $760,000 from Open Philanthropy EA Forum.
Meanwhile, the broader rationalist blogosphere continued to evolve. After Slate Star Codex was taken down in June 2020 amid a controversy over a planned New York Times article, Scott Alexander launched its successor, Astral Codex Ten (ACX), in January 2021 Slate Star Codex - Wikipedia.
Community Demographics and Culture
The LessWrong community has been surveyed regularly since 2009, with Scott Alexander initiating the LessWrong Community Census tradition 2023 LessWrong Survey Results. Survey participation has fluctuated significantly over the years, reflecting the community’s trajectory—from 166 responses in 2009, to a peak of over 3,000 in 2016, down to 61 in 2020 (during the site’s transition period and pandemic), and recovering to 558 responses in 2023 2023 LessWrong Survey Results.
Demographics
The community skews heavily male and secular. According to 2023 survey data, 75% of users identify as cis male and 9.6% as cis female, with the remainder identifying as trans or non-binary 2023 LessWrong Survey Results. Religiously, 48% identify as “atheist and not spiritual” Simulating Religion.
The community maintains significant overlap with effective altruism. A 2014 survey found that 31% of effective altruist respondents had first heard of EA through LessWrong Wikipedia, and a 2016 survey found that 21.7% of LessWrong users (664 out of 3,060) identified as “effective altruists” Wikipedia. Wikipedia describes LessWrong as having “played a significant role in the development of the effective altruism movement, with the two communities being closely intertwined from the beginning” Wikipedia.
The community has also developed distinctive cultural practices, including the Secular Solstice—an annual winter celebration that had spread to seven cities by 2016 Simulating Religion. Survey data indicates that 13% of community members expressed a preference for polyamory Simulating Religion, a practice that has drawn both internal discussion and external criticism.
Intellectual Influence
The Sequences and Core Ideas
The Sequences—Yudkowsky’s foundational blog posts from 2006-2009—remain the community’s core intellectual resource. Published as Rationality: A-Z (also known as Rationality: From AI to Zombies), this collection covers topics ranging from Bayesian reasoning and cognitive biases to AI alignment and decision theory Rationality - LessWrong, Wikipedia.
Beyond the Sequences, LessWrong served as a platform for developing and popularizing key concepts in AI safety. The concept of instrumental convergence—the idea that sufficiently advanced AI systems would likely develop certain convergent goals regardless of their final objectives—was first articulated by Steve Omohundro, then formalized by Nick Bostrom in his 2012 paper “The Superintelligent Will,” with LessWrong hosting extensive discussion of these ideas LessWrong Wiki. The technical term “corrigibility” was introduced in a 2015 MIRI/FHI paper co-authored by Yudkowsky, Nate Soares, and others, establishing it as a formal subfield of AI safety research MIRI/LessWrong.
Broader Impact
The platform’s ideas have reached mainstream audiences through various channels. Tim Urban’s popular Wait But Why series on AI superintelligence (2015) drew heavily on LessWrong ideas and brought them to millions of readers, with the LessWrong community discussing and responding to the posts LessWrong.
According to Wikipedia, AI safety concerns that were prominent on LessWrong “played a role in the founding of OpenAI, Anthropic, and DeepMind”—with safety being a stated primary concern for OpenAI’s founding, Anthropic founded by researchers who left OpenAI for safety reasons, and DeepMind co-founder Shane Legg being “largely motivated by AI safety” Wikipedia.
Criticisms and Controversies
Cult Allegations
LessWrong has faced recurring accusations of cult-like dynamics. Religious scholar Greg Epstein questioned in The New York Times whether the rationalist community constitutes a cult, asking: “When you think about the billions at stake and the radical transformation of lives across the world because of the eccentric vision of this group, how much more cult-y does it have to be for this to be a cult?” Rationalist community - Wikipedia. RationalWiki, a site known for skeptical coverage of topics it considers pseudoscientific or harmful, “essentially accuses [Yudkowsky] of leading a personality cult” RationalWiki.
The community itself has acknowledged issues including “the cult of genius, ingroup-overtrust, insularity, out-of-touchness, lack of rigor, and lack of sharp culture” RationalWiki.
Philosophical Critiques
Economist Bryan Caplan criticized the rationality community for “dogmatically embracing consequentialism/utilitarianism despite ‘many well-known, devastating counter-examples’” Econlib. He also argued the community assigns excessive credibility to extraordinary claims about “brain emulations, singularities, living in a simulation, hostile AI” without proportional evidence Econlib.
Tyler Cowen offered a different critique, arguing the rationality community functions “like…religion” by claiming an objective vantage point while actually representing “an extremely culturally specific way of viewing the world” Econlib. Critics have also argued that rationalists lack “a more robust anthropology” and employ problematic reductionism that may not adequately preserve human dignity and purpose Simulating Religion.
Adjacent Movements
The Neoreaction (NRx) movement has been described as “notoriously adjacent to the rationalist community” RationalWiki, though the actual presence of neoreactionaries on LessWrong appears minimal. A 2016 survey found that only 28 out of 3,060 respondents (0.92%) identified as “neoreactionary” Eruditorum Press. Both Yudkowsky and Scott Alexander have explicitly rejected the movement—Yudkowsky dismissed it with the same impatience he showed toward Roko’s Basilisk Eruditorum Press, while Alexander wrote a thirty-thousand-word “Anti-Reactionary FAQ” in 2013 critiquing the neoreactionary movement and the work of Curtis Yarvin Slate Star Codex - Wikipedia.
FTX Fallout
The broader rationalist and effective altruist movements faced significant reputational damage following the FTX collapse in 2022. Sam Bankman-Fried had been influenced by William MacAskill during a 2012 lunch at MIT, where MacAskill encouraged the “earn to give” approach Effective altruism - Wikipedia. More troublingly, the UK Centre For Effective Altruism’s board of trustees reviewed allegations that Bankman-Fried was engaging in unethical business practices as early as 2018 during his time at Alameda, but took no action Effective altruism - Wikipedia. LessWrong itself had no direct involvement in FTX, but the scandal affected perceptions of the overlapping rationalist and effective altruist communities.
Current Status
LessWrong today operates as an active platform under Lightcone Infrastructure, with modern web architecture and dedicated development resources. The 2023 survey collected 558 responses 2023 LessWrong Survey Results, indicating a smaller but engaged community compared to the 3,000+ responses of 2016.
The platform continues to serve as a primary venue for technical AI alignment discussions, community surveys, and the ongoing development of rationalist methodology. Its combined funding of over $5 million from major philanthropic sources EA Forum indicates continued institutional support for its mission.
Sources and Further Reading
Primary Sources
- LessWrong - Main Site
- Rationality: A-Z
- A Brief History of LessWrong
- Welcome to LessWrong 2.0
- 2023 LessWrong Survey Results
- The Hanson-Yudkowsky AI-Foom Debate
Reference Sources
- LessWrong - Wikipedia
- Rationalist community - Wikipedia
- Slate Star Codex - Wikipedia
- Effective altruism - Wikipedia
- LessWrong - EA Forum Topic