LessWrong is a rationality-focused community blog founded in 2009 that has influenced AI safety discourse. It has received over $5M in funding and was cited as the origin point for roughly 31% of EA survey respondents in 2014. Survey participation peaked above 3,000 in 2016 and declined to 558 by 2023, as the community increasingly focused on AI alignment discussions.
Top Related Pages
Eliezer Yudkowsky
Co-founder of MIRI, early AI safety researcher, and founder of the rationalist community
Eli Lifland
Forecaster and AI safety researcher specializing in AGI timelines forecasting, scenario planning, and AI governance. Ranks #1 on the RAND Forecasti...
Centre for Effective Altruism
Oxford-based organization that coordinates the effective altruism movement, running EA Global conferences, supporting local groups, and maintaining...
Center for Applied Rationality
Berkeley-based nonprofit organization developing and teaching applied rationality techniques through workshops, with connections to AI safety and e...
Sam Bankman-Fried
American cryptocurrency entrepreneur, founder of FTX and Alameda Research, convicted of fraud and sentenced to 25 years in prison following the col...