LessWrong

Dimension | Assessment | Evidence
Influence on AI Safety | High | Material influenced formation of MIRI and CFAR; attracted major tech donors; 31% of EA survey respondents in 2014 first heard of EA through LessWrong Wikipedia, Rationality - LessWrong
Community Scale | Medium | 2016 survey had ≈3,000 respondents; 2023 survey had 558 respondents; peak 15,000 daily pageviews Wikipedia, 2023 Survey
Funding Base | Substantial | Over $5 million from Survival and Flourishing Fund, Future Fund, and Open Philanthropy combined EA Forum

LessWrong is a community blog and forum focused on discussion of cognitive biases, philosophy, psychology, economics, rationality, and artificial intelligence Wikipedia. Founded in 2009, it has grown to become the primary online hub for what is now called the “rationalist community”—described as “a 21st-century movement that formed around a group of internet blogs, primarily LessWrong and Astral Codex Ten” Rationalist community - Wikipedia.

The platform’s intellectual foundations derive primarily from the writings of AI researcher Eliezer Yudkowsky, whose extensive blog posts on epistemology, cognitive science, and AI alignment were compiled into “The Sequences”—over 300 posts that were later published as the ebook Rationality: From AI to Zombies by MIRI in 2015 Wikipedia. According to the Wikipedia article on the rationalist community, this material represents “basically the mainstream cognitive science approach to rationality (plus some extra stuff about language)” Rationalist community - Wikipedia, though critics argue it reflects a more culturally specific worldview.

The site’s influence extends well beyond its modest user base. In the 2010s, the rationalist community emerged as a significant force in Silicon Valley, attracting donations from tech founders including Elon Musk, Peter Thiel, Vitalik Buterin, Dustin Moskovitz, and Jaan Tallinn to rationalist-associated institutions Rationalist community - Wikipedia. The Sequences’ content directly influenced the formation of both the Machine Intelligence Research Institute (MIRI) and the Center for Applied Rationality (CFAR) Rationality - LessWrong.

LessWrong’s roots trace back to November 2006, when the group blog Overcoming Bias launched with Yudkowsky and economist Robin Hanson as principal contributors Wikipedia. Some early users were recruited through Yudkowsky’s transhumanist SL4 mailing list A Brief History of LessWrong. Over the next two years, Yudkowsky’s posts accumulated into what would become the foundational texts of the rationalist movement.

One early landmark was the Hanson-Yudkowsky AI-Foom Debate in 2008, which became one of the most influential early discussions on the platform about AI takeoff scenarios—whether advanced AI would emerge gradually or in a rapid “intelligence explosion” LessWrong Wiki.

In February 2009, these posts served as seed material for the launch of LessWrong as a dedicated platform Wikipedia. The site initially ran on a codebase forked from Reddit Welcome to LessWrong 2.0, inheriting that platform’s upvote/downvote mechanics and threading system. At its peak, the site attracted over 15,000 pageviews daily A Brief History of LessWrong.

The mid-2010s brought a period of stagnation. In 2015-2016, the site underwent a steady decline in activity, leading some observers to declare it effectively dead Wikipedia. Several factors contributed: key contributors had moved on to other projects, the aging Reddit codebase made development difficult, and much of the community’s energy had dispersed to other venues.

A key part of this dispersion was Scott Alexander, who had written on LessWrong under the pseudonym “Yvain” before launching his blog Slate Star Codex in 2013 Slate Star Codex - Wikipedia. The blog became central to what was called the “LessWrong Diaspora,” drawing significant readership from the rationalist community. In a 2017 survey, Slate Star Codex ranked fourth among the ways effective altruists reported first hearing about EA, after “personal contact,” “LessWrong,” and “other books, articles and blog posts” Slate Star Codex - Wikipedia.

The platform’s revival came in 2017, when a team led by Oliver Habryka—then in his final years as a CS undergraduate—took over administration and development Wikipedia, Welcome to LessWrong 2.0. Joined by Ben Pace and Matthew Graves, Habryka spearheaded a complete rebuild Welcome to LessWrong 2.0. For the first time, LessWrong had a full-time dedicated development team Wikipedia.

LessWrong 2.0 and Lightcone (2017-Present)

The relaunched platform, dubbed LessWrong 2.0, abandoned the aging Reddit codebase for a modern stack built on React, GraphQL, Slate.js, Vulcan.js, and Meteor Welcome to LessWrong 2.0. The rebuild represented not just technical modernization but a renewed commitment to cultivating high-quality intellectual discourse.
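
As a rough illustration of what the GraphQL layer in this stack looks like from a client’s perspective, the sketch below requests a handful of recent posts. The endpoint URL and the Vulcan.js-style query shape (a posts query taking input.terms and returning results) are assumptions made for the sake of the example, not a documented LessWrong schema.

```typescript
// Minimal sketch of a client querying a Vulcan.js-style GraphQL endpoint.
// Assumption: the endpoint URL and the shape of the `posts` query are
// illustrative, not a confirmed LessWrong schema.

const GRAPHQL_ENDPOINT = "https://www.lesswrong.com/graphql"; // assumed endpoint

const RECENT_POSTS_QUERY = `
  query RecentPosts {
    posts(input: { terms: { view: "new", limit: 5 } }) {
      results {
        title
        pageUrl
        postedAt
      }
    }
  }
`;

interface PostSummary {
  title: string;
  pageUrl: string;
  postedAt: string;
}

async function fetchRecentPosts(): Promise<PostSummary[]> {
  // GraphQL queries are sent as a JSON POST body with a `query` field.
  const response = await fetch(GRAPHQL_ENDPOINT, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: RECENT_POSTS_QUERY }),
  });
  if (!response.ok) {
    throw new Error(`GraphQL request failed: ${response.status}`);
  }
  const { data } = await response.json();
  return data?.posts?.results ?? [];
}

// Usage: log each post title with its URL.
fetchRecentPosts()
  .then((posts) => posts.forEach((p) => console.log(`${p.title} (${p.pageUrl})`)))
  .catch(console.error);
```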

In 2021, the LessWrong team spun out into a distinct organization, Lightcone Infrastructure EA Forum. This organization now operates LessWrong alongside other projects, having received substantial philanthropic support: over $2.3 million from the Survival and Flourishing Fund, $2 million from the Future Fund, and $760,000 from Open Philanthropy EA Forum.

Meanwhile, the broader rationalist blogosphere continued to evolve. After Slate Star Codex was taken down in June 2020 amid a controversy over a planned New York Times article, Scott Alexander launched its successor, Astral Codex Ten (ACX), in January 2021 Slate Star Codex - Wikipedia.

The LessWrong community has been surveyed regularly since 2009, with Scott Alexander initiating the LessWrong Community Census tradition 2023 LessWrong Survey Results. Survey participation has fluctuated significantly over the years, reflecting the community’s trajectory—from 166 responses in 2009, to a peak of over 3,000 in 2016, down to 61 in 2020 (during the site’s transition period and pandemic), and recovering to 558 responses in 2023 2023 LessWrong Survey Results.

The community skews heavily male and secular. According to 2023 survey data, 75% of users identify as cis male and 9.6% as cis female, with the remainder identifying as trans or non-binary 2023 LessWrong Survey Results. Religiously, 48% identify as “atheist and not spiritual” Simulating Religion.

The community maintains significant overlap with effective altruism. A 2014 survey found that 31% of effective altruist respondents had first heard of EA through LessWrong Wikipedia, and a 2016 survey found that 21.7% of LessWrong users (664 out of 3,060) identified as “effective altruists” Wikipedia. Wikipedia describes LessWrong as having “played a significant role in the development of the effective altruism movement, with the two communities being closely intertwined from the beginning” Wikipedia.

The community has also developed distinctive cultural practices, including the Secular Solstice—an annual winter celebration that had spread to seven cities by 2016 Simulating Religion. Survey data indicates that 13% of community members expressed a preference for polyamory Simulating Religion, a practice that has drawn both internal discussion and external criticism.

The Sequences—Yudkowsky’s foundational blog posts from 2006-2009—remain the community’s core intellectual resource. Published as Rationality: A-Z (also known as Rationality: From AI to Zombies), this collection covers topics ranging from Bayesian reasoning and cognitive biases to AI alignment and decision theory Rationality - LessWrong, Wikipedia.

Beyond the Sequences, LessWrong served as a platform for developing and popularizing key concepts in AI safety. The concept of instrumental convergence—the idea that sufficiently advanced AI systems would likely develop certain convergent goals regardless of their final objectives—was first articulated by Steve Omohundro, then formalized by Nick Bostrom in his 2012 paper “The Superintelligent Will,” with LessWrong hosting extensive discussion of these ideas LessWrong Wiki. The technical term “corrigibility” was introduced in a 2015 MIRI/FHI paper co-authored by Yudkowsky, Nate Soares, and others, which established corrigibility as a formal research topic within AI safety MIRI/LessWrong.

The platform’s ideas have reached mainstream audiences through various channels. Tim Urban’s popular Wait But Why series on AI superintelligence (2015) drew heavily on LessWrong ideas and brought them to millions of readers, with the LessWrong community discussing and responding to the posts LessWrong.

According to Wikipedia, AI safety concerns that were prominent on LessWrong “played a role in the founding of OpenAI, Anthropic, and DeepMind”—with safety being a stated primary concern for OpenAI’s founding, Anthropic founded by researchers who left OpenAI for safety reasons, and DeepMind co-founder Shane Legg being “largely motivated by AI safety” Wikipedia.

LessWrong has faced recurring accusations of cult-like dynamics. Religious scholar Greg Epstein questioned in The New York Times whether the rationalist community constitutes a cult, asking: “When you think about the billions at stake and the radical transformation of lives across the world because of the eccentric vision of this group, how much more cult-y does it have to be for this to be a cult?” Rationalist community - Wikipedia. RationalWiki, a site known for skeptical coverage of topics it considers pseudoscientific or harmful, “essentially accuses [Yudkowsky] of leading a personality cult” RationalWiki.

The community itself has acknowledged issues including “the cult of genius, ingroup-overtrust, insularity, out-of-touchness, lack of rigor, and lack of sharp culture” RationalWiki.

Economist Bryan Caplan criticized the rationality community for “dogmatically embracing consequentialism/utilitarianism despite ‘many well-known, devastating counter-examples’” Econlib. He also argued the community assigns excessive credibility to extraordinary claims about “brain emulations, singularities, living in a simulation, hostile AI” without proportional evidence Econlib.

Tyler Cowen offered a different critique, arguing the rationality community functions “like…religion” by claiming an objective vantage point while actually representing “an extremely culturally specific way of viewing the world” Econlib. Critics have also argued that rationalists lack “a more robust anthropology” and employ problematic reductionism that may not adequately preserve human dignity and purpose Simulating Religion.

The Neoreaction (NRx) movement has been described as “notoriously adjacent to the rationalist community” RationalWiki, though the actual presence of neoreactionaries on LessWrong appears minimal. A 2016 survey found that only 28 out of 3,060 respondents (0.92%) identified as “neoreactionary” Eruditorum Press. Both Yudkowsky and Scott Alexander have explicitly rejected the movement—Yudkowsky dismissed it with the same impatience he showed toward Roko’s Basilisk Eruditorum Press, while Alexander wrote a thirty-thousand-word “Anti-Reactionary FAQ” in 2013 critiquing the neoreactionary movement and the work of Curtis Yarvin Slate Star Codex - Wikipedia.

The broader rationalist and effective altruist movements faced significant reputational damage following the FTX collapse in 2022. Sam Bankman-Fried had been influenced by William MacAskill during a 2012 lunch at MIT, where MacAskill encouraged the “earn to give” approach Effective altruism - Wikipedia. More troublingly, the UK Centre For Effective Altruism’s board of trustees reviewed allegations that Bankman-Fried was engaging in unethical business practices as early as 2018 during his time at Alameda, but took no action Effective altruism - Wikipedia. LessWrong itself had no direct involvement in FTX, but the scandal affected perceptions of the overlapping rationalist and effective altruist communities.

LessWrong today operates as an active platform under Lightcone Infrastructure, with modern web architecture and dedicated development resources. The 2023 survey collected 558 responses 2023 LessWrong Survey Results, indicating a smaller but engaged community compared to the 3,000+ responses of 2016.

The platform continues to serve as a primary venue for technical AI alignment discussions, community surveys, and the ongoing development of rationalist methodology. Its combined funding of over $5 million from major philanthropic sources EA Forum indicates continued institutional support for its mission.