Longterm Wiki · Updated 2026-03-13

Center for Applied Rationality

Organization


Berkeley nonprofit founded in 2012 that teaches applied rationality through workshops ($3,900 for 4.5 days). It has trained 1,300+ alumni, who report an average satisfaction of 9.2/10 and a 0.17σ life-satisfaction increase at one-year follow-up. The organization received $3.5M+ from Coefficient Giving (formerly Open Philanthropy) and about $5M from FTX (later clawed back); it faced major controversies over its handling of abuse allegations and over cult-like dynamics, and now operates with 8 part-time staff after a multi-year hiatus.

3.4k words · 12 backlinks

Quick Assessment

| Dimension | Assessment |
|---|---|
| Type | Nonprofit organization (501(c)(3)) |
| Founded | 2012 |
| Location | Berkeley, California (now mostly remote) |
| Primary Focus | Teaching rationality techniques through workshops and programs |
| Key People | Anna Salamon (President), Julia Galef, Andrew Critch, Michael Smith (co-founders) |
| Major Funders | Coefficient Giving ($3.5M+), Survival and Flourishing Fund ($1.6M+) |
| Status | Active with reduced operations; resumed workshops in 2025 after hiatus |

| Source | Link |
|---|---|
| Official Website | rationality.org |
| Wikipedia | en.wikipedia.org |
| LessWrong | lesswrong.com |
| EA Forum | forum.effectivealtruism.org |

Overview

The Center for Applied Rationality (CFAR) is a nonprofit organization founded in 2012 that develops and teaches techniques to improve epistemic and instrumental rationality through immersive workshops, coaching programs, and curriculum development. The organization emerged from the rationality movement around Eliezer Yudkowsky's LessWrong community and focuses on applying insights from cognitive science, behavioral economics, psychology, and decision theory to real-world decision-making and problem-solving.[1][2]

CFAR's stated mission is "Developing clear thinking for the sake of humanity's future," with particular emphasis on addressing challenges related to existential risks, including AI safety.3 The organization has trained over 1,300 workshop alumni through approximately 60 flagship workshops held between 2012 and early 2020, with participants reporting an average satisfaction rating of 9.2 out of 10.[4][5] Beyond its core workshop programs, CFAR has contributed to the broader rationality and effective altruism communities through specialized programs, curriculum development, and connections to organizations working on AI alignment and existential risk reduction.

The organization has evolved significantly since its founding, transitioning from a full-time staff of approximately 12 to a mostly-remote operation with eight part-time curriculum developers and instructors as of September 2025.6 After a multi-year hiatus from regular programming, CFAR resumed mainline workshops in November 2025, testing new formats while continuing to refine its approach to teaching applied rationality.7

History and Founding

CFAR was founded in 2012 by Anna Salamon, Julia Galef, Michael Smith (also known as Valentine Smith), and Andrew Critch.[1][8] The organization's origins trace to Anna Salamon's work at the Machine Intelligence Research Institute (MIRI) in 2011, where she began experimenting with teaching rationality techniques.9 CFAR initially functioned as an extension of MIRI before becoming an independent nonprofit organization.9

The founders shared a common observation that intelligence, education, and good intentions were insufficient to guarantee sound decision-making. According to co-founder Julia Galef, they were motivated by the realization that "being smart, and being well educated and even being really well intentioned was far from a guarantee from making what turned out to be really stupid decisions."9 All four founders brought strong backgrounds in mathematics, artificial intelligence, and science to the organization's development.

Early Development and Growth

CFAR officially began offering classes in 2012, developing a flagship workshop model that charged $3,900 for multi-day intensive programs.9 The organization held monthly workshops training diverse participants including scientists, police officers, teachers, entrepreneurs, and high school students.10 Workshop activity varied by year:11

  • 2013: 7 workshops
  • 2014: 9 workshops (peak year)
  • 2015: 4 workshops
  • 2017: 8 workshops (including specialized workshops for MIRI researchers and Effective Altruism Global participants)

Leadership Transitions

Julia Galef served as CFAR's initial president until 2016, when Anna Salamon assumed the role.1 Salamon has continued as president through at least 2022, with Jesse Liptrap and Michael Blume serving on the board of directors.1 The organization's advisors have included Paul Slovic and Keith Stanovich.1

Organizational Evolution

In 2019, CFAR spun off the European Summer Program on Rationality (ESPR) into a separate organization run by Jan Kulveit, marking a shift in how the organization managed related programs.11 By September 2025, CFAR had transitioned to a mostly-remote operation with a significantly reduced staff, theorizing that part-time work might help avoid organizational pitfalls while maintaining curriculum development quality.6

After a hiatus from regular programming beginning in early 2020, CFAR conducted an experimental mini-workshop in June 2025 at Arbor Summer Camp and resumed mainline workshops in November 2025, marking the beginning of a pilot program testing new workshop formats.[7][12]

Core Programs and Methodology

Workshop Model

CFAR's primary educational delivery mechanism has been immersive multi-day workshops, typically running 4.5 days and costing $3,900 to $4,000.[4][13] These workshops emphasize three core pillars: applying critical thinking to real-world problems, having students practice skills rather than merely learn concepts, and building lasting habits.10

The organization has offered scholarship programs, including funding from Jaan Tallinn (Skype co-founder) for Estonian students.1 CFAR has also provided specialized training for organizations including Facebook and participants in the Thiel Fellowship.1

Rationality Techniques

CFAR develops and refines rationality techniques by synthesizing insights from cognitive science, psychology, neuroscience, behavioral economics, mathematics, statistics, and game theory.14 The organization conducts empirical testing of techniques and invents new methods when academic research proves insufficient for practical application.14

Key techniques taught at CFAR workshops include:[14][15]

| Technique | Purpose |
|---|---|
| Double Crux | Improves collaboration and conceptualizing research questions |
| Goal Factoring | Addresses mismatches between goals and plans |
| Focusing | Models second-to-second cognition |
| Resolve Cycles | Builds motivation and action |
| Murphy-Jitsu | Prepares for obstacles through pre-mortems |
| Trigger-Action Plans | Improves habit formation and research practices |
| Urge Propagation | Enhances understanding of motivation and decision-making |

According to CFAR, these techniques aim to bridge System 1 and System 2 cognition, helping participants work with emotions, reframe problems, and understand the modular nature of the mind.16

Specialized Programs

Beyond flagship workshops, CFAR has developed targeted programs for specific communities:

  • SPARC (Summer Program on Applied Rationality and Cognition): Annual summer programs, including events in 2019 and funding from Coefficient Giving ($304,000 over two years in 2016).11
  • MIRI-focused workshops: Specialized sessions for AI safety researchers, including the 2015 MIRI Summer Fellows Program (a 3-week training program for AI safety researchers).14
  • Alumni support: Six-week official follow-up programs and ongoing coaching/mentorship for workshop graduates.17

Reported Impact and Effectiveness

Self-Reported Outcomes

CFAR has collected extensive survey data from workshop participants, reporting several positive outcomes:14

| Metric | Result | Timeframe |
|---|---|---|
| Neuroticism decrease | Statistically significant | Post-workshop |
| Self-efficacy increase | Marked (though less significant than the neuroticism decrease) | Post-workshop |
| Life satisfaction increase | 0.17σ | 1 year after participation |
| Overall satisfaction | 9.2/10 average rating | Post-workshop |

The organization's 2017 Impact Report indicated that alumni reported higher impact through better motivation navigation and clearer thinking.18 However, these results are primarily based on internal surveys without independent replication or external validation.19

Contributions to AI Safety and Effective Altruism

CFAR has documented case studies of alumni contributions to existential risk reduction and AI safety work. Notable examples include:20

  • Scott Garrabrant: Joined MIRI in 2015 after participating in CFAR/MIRI programs; contributed to logical induction research
  • Nate Soares: MIRI researcher who credited techniques like Double Crux with improving research collaboration
  • Ben Hoffman: Contributed to effective altruism and rationality writing, as well as AI risk modeling

The organization has positioned itself as contributing to the social infrastructure and individual skill development of communities working on AI safety and existential risk reduction, though direct causal impact is difficult to attribute.6

Limitations and Criticisms of Impact Claims

CFAR's impact evidence relies heavily on self-reported participant surveys and case studies rather than rigorous experimental design with control groups. A 2016 article in VICE noted that while CFAR's workshops had measurable effects on participants, the organization's approach had both strengths and flaws.1 The lack of recent external evaluations and reliance on internal data collection methods limit the strength of effectiveness claims.19

Funding and Financial Structure

Major Funding Sources

As of June 2022, CFAR had received substantial funding from organizations aligned with effective altruism and existential risk reduction:1

  • Coefficient Giving: Over $3.5 million
    • $1,035,000 grant in 2016 (two years) for operational improvements and research21
    • Multi-year institutional grant renewed in 2018 for 2018-201922
  • Survival and Flourishing Fund: Over $1.6 million
  • Effective Altruism Funds: Over $300,000
    • $150,000 from Long-Term Future Fund in April 2019 (addressing funding shortfall after 2018 controversy)11
    • Additional $150,000 recommended in August 201911

FTX Funding and Collapse

CFAR received substantial funding from FTX before the cryptocurrency exchange's collapse in November 2022. Between March and September 2022, $3,405,000 was transferred from the FTX Foundation to CFAR, with an additional $1,500,000 transferred on October 3, 2022, in ten separate transactions.23 Following FTX's collapse, FTX debtors required CFAR to return approximately $5 million.23

Financial Position

As of December 2019 or 2020, CFAR reported approximately $1.4 million in total liquidity, including $650,000 in cash, $575,000 in expected grants, and $215,000 in accounts receivable.24 The organization's total assets have been reported at $23,039,646, though this figure may include the venue property acquired in 2018.25

In 2018, CFAR's fundraising efforts (including the Coefficient Giving renewal) totaled over $2.5 million and enabled the organization to make a down payment on a permanent venue, reducing costs and expanding program capacity.22

Connection to AI Safety and Existential Risk

CFAR is categorized as an existential risk organization with explicit connections to existential risk from artificial intelligence.1 The organization originated from the rationality movement around LessWrong, which pioneered discussions of AI alignment and existential risks.1

While CFAR does not operate explicit AI alignment research programs, the organization's focus on improving individual and collective reasoning capacity is positioned as supporting the broader AI safety ecosystem. The rationality techniques taught at CFAR workshops—such as Bayesian reasoning, debiasing methods, and systematic decision-making frameworks—are presented as relevant to challenges in AI safety work, including addressing scope insensitivity and improving collaborative research practices.[26][27]

Co-founder Andrew Critch has been involved in multiple AI safety organizations beyond CFAR, including the Center for Human-Compatible AI and FAR.AI, illustrating the overlap between CFAR's rationality-focused mission and AI safety work.1 The organization has conducted specialized workshops for MIRI researchers and other AI safety communities, though the scale and ongoing nature of these programs is unclear given CFAR's reduced operations.14

Controversies and Criticisms

Handling of Abuse Allegations (Brent Case)

In 2018-2019, CFAR faced significant criticism for its handling of allegations against a community member known as Brent (likely Brent Hildebrand). In January 2018, CFAR's semi-independent investigation panel, ACDC, received allegations of physical, sexual, and emotional abuse from one of Brent's former partners.28 Despite these allegations, ACDC recommended against banning the individual in April 2018, and CFAR leadership followed this recommendation.28

CFAR later publicly acknowledged serious failures in this case, stating they "had sufficient evidence long beforehand to notice that Brent might be harmful" and that their "failure to investigate this hypothesis explicitly was a mistake."28 The organization allowed Brent to attend and assist at several CFAR events, which "afforded him social legitimacy and caused significant harm in expectation."28 Staff members reported feeling manipulated by Brent, and CFAR acknowledged that as an organization "which exists to promote epistemic integrity," they should be held to "an especially high bar" on such matters.28

Following public criticism in September 2018, CFAR disbanded ACDC, determining that "the panelists were in over their heads."28 The organization issued a detailed public apology acknowledging that their safety procedures and investigation processes were inadequate and that the harm was preventable.28

The Brent controversy led to CFAR not holding a fundraiser in 2018, creating a funding shortfall that was later addressed by emergency grants from the Long-Term Future Fund in 2019.11

"Zizians" Incident and Associated Violence

On November 15, 2019, four individuals identifying as "Zizians" (a rationalist splinter group wearing Guy Fawkes masks) were arrested for blockading a CFAR retreat event at a wooded venue in Sonoma County.1

According to Wikipedia, alleged Zizian members were later linked to serious violent crimes: an attempted murder in November 2022, and four murders in 2022 and 2025, including the killing of a U.S. Border Patrol officer in a shootout, with two of the alleged members dying violently.1 While these crimes occurred after the 2019 protest and the connection between the protesters and the later violent incidents is not fully detailed in available sources, the association has contributed to negative perceptions of elements within the broader rationality community.

Organizational Culture Concerns

In a 2025 interview with NBC News, CFAR president and co-founder Anna Salamon stated: "We didn't know at the time, but in hindsight we were creating conditions for a cult."23 Salamon characterized the organization's implicit messaging as suggesting that "human thinking is flawed and biased," with the exception that "ours [is unique]. We have a unique method of seeing things clearly"—representing what she described as an overestimation of their own competence.23

Critics have described CFAR workshops as "vaguely culty," blending cognitive science with elements reminiscent of self-help movements and featuring rhetoric about humans as fundamentally flawed beings needing systematic fixes.13 The high cost of workshops ($3,900 for 4 days) has also drawn criticism, with some comparing CFAR's claims about cognitive improvement to Lumosity's claims that resulted in an FTC fine for making unfounded assertions.13

Effectiveness and Institutional Incentives

Some rationality community members have criticized CFAR for limited progress in creating a "real art of rationality" that aids meaningful intellectual advancement, citing "model gaps," impure motives (such as prioritizing AI safety career placement and deference to organizations like MIRI over pure rationality development), and institutional incentives that may distort efforts.29 A perceived tension exists between epistemic rationality (exemplified by the LessWrong principle "Politics is the Mind-Killer") and real-world decision-making contexts where instructors may hold preconceptions about desirable outcomes.29

Broader Effective Altruism Funding Structure

CFAR's funding has been part of a larger effective altruism ecosystem characterized by centralized funding from major grantmakers like Coefficient Giving. According to a 2024 analysis, this structure created "strong incentives to align with funder worldviews to get money," turning what was originally a niche intellectual community into a career track with concentrated power among funders while providing "thin governance/guardrails."23 Bloomberg characterized the broader EA movement as featuring "a culture of free-flowing funding with minimal accountability focused on selective universities."23

Relationship to Rationality and EA Communities

CFAR occupies a central position in the rationality movement and has strong ties to the effective altruism community. The organization originated directly from the LessWrong community and broader rationality movement associated with Eliezer Yudkowsky and the blog Overcoming Bias.[1][27]

Community Infrastructure

CFAR has contributed to building social infrastructure within rationality and EA communities through workshops, alumni networks, and partnerships with organizations like Effective Altruism Global.30 The organization has served as a gathering point for "rationality geeks" to exchange ideas on cognitive improvement and has helped connect individuals interested in existential risk reduction, AI safety, and effective altruism.10

Community-led initiatives have emerged from CFAR's alumni network. The Applied Rationality Unconference series, organized by CFAR alumni, began with a first retreat in 2023, followed by a second in November 2024 that opened participation beyond CFAR alumni.31 Multiple additional events were organized throughout 2025, including the "Blackpool Applied Rationality Unconference," representing unofficial, participant-organized versions of CFAR's official workshops.31

Current Community Role

As of recent reports, CFAR appears to have reduced its role as a central organizing force within the rationality community, though "shards of the organisation still help with AIRCS workshops."32 According to LessWrong discussions, there is no longer "a concentration of force working towards a public accessible rationality curriculum."32 The organization's shift to part-time remote operations and multi-year hiatus from regular programming has reduced its direct community presence, though the resumption of workshops in 2025 may signal renewed engagement.[6][7]

Organizational Challenges and Failures

CFAR has publicly acknowledged several organizational failures and challenges throughout its history:

Early Scaling Difficulties

In 2012-2013, CFAR's early attempts at exponential growth of workshops failed due to small-organization logistics challenges.17 Expanding the instructor base risked what the organization termed "failure mode #1"—enthusiastic but inadequately trained instructors teaching poor-quality classes and potentially damaging rationality's reputation.17 Initial scaling efforts with 2011-era knowledge produced subpar lessons that required significant iteration and refinement.17

Workshop Design Issues

Early workshops (circa May 2012) were criticized as too short and intense, overwhelming participants' capacity to digest ideas.17 CFAR responded by developing informal Skype chat follow-ups, which were later formalized into a six-week official support program.17

Limited Domain-Specific Impact

As of 2012 and continuing forward, CFAR has faced challenges in scaling "heavy-duty rationality skills" into critical fields like medicine, education, government, or software development.17 With capacity for approximately 20 participants per month in workshop programs, the organization's direct reach has remained limited relative to the scale of impact in these sectors.17 Health care professionals have been notably absent from workshops despite the sector's potential need for improved decision-making frameworks.17

Effective Fundraising Project

In 2013, CFAR pointed to the "impressively failed" Effective Fundraising Project (a two-person nonprofit startup founded that year to write grants for effective charities, which shut down gracefully after six months) as an example of "wise failure."[33][34] While CFAR presented this as a case study in productive failure, it illustrates the challenges of expanding the organization's impact model beyond direct workshop delivery.

Staff Transition and Organizational Form

The organization's transition from approximately 12 full-time staff to eight very part-time instructors and curriculum developers represents a significant scaling back of operations.6 While CFAR has theorized that part-time work may help avoid organizational pitfalls like burnout or "going off the rails," this shift also reflects the difficulty of maintaining a sustainable full-time organizational model.6

Current Status and Recent Developments

As of September 2025, CFAR operates primarily as a remote organization with eight part-time curriculum developers and instructors, plus part-time staff for accounting and venue management.6 This represents a significant reduction from the organization's earlier staffing levels of approximately 12 full-time employees.6

Workshop Resumption

After a hiatus from regular programming beginning in early 2020, CFAR conducted several experimental programs in 2025:[7][12]

  • June 2025: Experimental mini-workshop at Arbor Summer Camp, teaching rationality material alongside other content. This pilot project expanded from a single instructor to multiple instructors and marked the first "workshop-like" content CFAR had presented in several years.12
  • November 2025: First mainline workshop since 2022, representing the beginning of a pilot program testing new workshop formats.7

According to CFAR's updates, the June 2025 mini-workshop energized staff to pursue further programming and provided valuable learning experiences for curriculum refinement.12

Leadership and Governance

Anna Salamon continues to serve as CFAR's president as of the most recent updates, with Jesse Liptrap and Michael Blume on the board of directors.1 The organization maintains its 501(c)(3) nonprofit status and continues to operate with a focus on curriculum development and testing techniques through workshops and online sessions.6

Hiring and Operations

CFAR has described itself as "low-key" in its hiring approach, being selective about adding new part-time team members while focusing on iterative refinement of rationality techniques.6 The organization runs seminars, visitor programs, and tests techniques on volunteers as part of its ongoing curriculum development work.30

Key Uncertainties

Several important questions about CFAR's effectiveness, impact, and future remain uncertain or inadequately documented:

  1. Causal impact on AI safety: While CFAR has documented case studies of alumni who work in AI safety, the causal contribution of CFAR training to their work is difficult to isolate from other factors like selection effects and broader community influences.

  2. Effectiveness of specific techniques: The organization's reported outcomes rely primarily on self-reported participant surveys without independent replication or rigorous experimental designs with control groups. The extent to which specific rationality techniques produce lasting behavioral and cognitive changes remains empirically undervalidated.

  3. Scalability limitations: CFAR's challenges in expanding beyond workshop-based delivery and reaching critical professional domains (medicine, policy, education) raise questions about whether the organization's model can achieve broad societal impact.

  4. Organizational sustainability: The transition to part-time remote operations and multi-year hiatus from regular programming may reflect either a strategic refinement of approach or ongoing challenges in maintaining a sustainable organizational model.

  5. Community influence measurement: While CFAR clearly influenced the rationality and EA communities, the magnitude and longevity of this influence—particularly after reduced operations—is difficult to quantify.

  6. Future trajectory: Whether the 2025 workshop resumption represents a sustained return to programming or a limited pilot effort remains to be seen, as does the organization's long-term strategy for curriculum development and community engagement.

Sources

Footnotes

  1. Claim reference cr-d4ee (data unavailable — rebuild with wiki-server access)
  2. Claim reference cr-08b4 (data unavailable — rebuild with wiki-server access)
  3. Claim reference cr-d2e7 (data unavailable — rebuild with wiki-server access)
  4. Claim reference cr-6d9c (data unavailable — rebuild with wiki-server access)
  5. Claim reference cr-3775 (data unavailable — rebuild with wiki-server access)
  6. Claim reference cr-b201 (data unavailable — rebuild with wiki-server access)
  7. Claim reference cr-2291 (data unavailable — rebuild with wiki-server access)
  8. Claim reference cr-12a8 (data unavailable — rebuild with wiki-server access)
  9. Claim reference cr-f4f0 (data unavailable — rebuild with wiki-server access)
  10. YouTube - CFAR Overview
  11. Timeline of Center for Applied Rationality
  12. Claim reference cr-e3e0 (data unavailable — rebuild with wiki-server access)
  13. Claim reference cr-c76a (data unavailable — rebuild with wiki-server access)
  14. Claim reference cr-8fa8 (data unavailable — rebuild with wiki-server access)
  15. Effective Altruism - EA Global 2018 CFAR Workshop
  16. Claim reference cr-277b (data unavailable — rebuild with wiki-server access)
  17. Claim reference cr-0381 (data unavailable — rebuild with wiki-server access)
  18. Claim reference cr-4943 (data unavailable — rebuild with wiki-server access)
  19. Claim reference cr-0a57 (data unavailable — rebuild with wiki-server access)
  20. CFAR 2016 Case Studies
  21. Claim reference cr-759a (data unavailable — rebuild with wiki-server access)
  22. Claim reference cr-4809 (data unavailable — rebuild with wiki-server access)
  23. Claim reference cr-69f9 (data unavailable — rebuild with wiki-server access)
  24. Claim reference cr-f6d7 (data unavailable — rebuild with wiki-server access)
  25. Claim reference cr-3ba5 (data unavailable — rebuild with wiki-server access)
  26. Claim reference cr-9e18 (data unavailable — rebuild with wiki-server access)
  27. Claim reference cr-b4d8 (data unavailable — rebuild with wiki-server access)
  28. Claim reference cr-e7bd (data unavailable — rebuild with wiki-server access)
  29. Claim reference cr-5fa7 (data unavailable — rebuild with wiki-server access)
  30. Claim reference cr-03a0 (data unavailable — rebuild with wiki-server access)
  31. Claim reference cr-c61b (data unavailable — rebuild with wiki-server access)
  32. Claim reference cr-88a5 (data unavailable — rebuild with wiki-server access)
  33. Claim reference cr-38be (data unavailable — rebuild with wiki-server access)
  34. Claim reference cr-3b4b (data unavailable — rebuild with wiki-server access)

References

Claims (1)
As of December 2019 or 2020, CFAR reported approximately \$1.4 million in total liquidity, including \$650,000 in cash, \$575,000 in expected grants, and \$215,000 in accounts receivable. The organization's total assets have been reported at \$23,039,646, though this figure may include the venue property acquired in 2018.
Unsupported0%Feb 22, 2026
Center for Applied Rationality, based in Berkeley, CA, is a public charity with assets of $23,039,646.

The source does not contain any information about CFAR's liquidity, cash, expected grants, or accounts receivable.

2LessWrong - What is Going on with CFARlesswrong.com·Blog post
Claims (1)
As of recent reports, CFAR appears to have reduced its role as a central organizing force within the rationality community, though "shards of the organisation still help with AIRCS workshops." According to LessWrong discussions, there is no longer "a concentration of force working towards a public accessible rationality curriculum." The organization's shift to part-time remote operations and multi-year hiatus from regular programming has reduced its direct community presence, though the resumption of workshops in 2025 may signal renewed engagement.
Accurate90%Feb 22, 2026
I think the conclusion I take from it is ~"There's a bunch of individual people who were involved with CFAR still doing interesting stuff, but there is no such public organisation anymore in a meaningful sense (although shards of the organisation still help with AIRCS workshops); so you have to follow these individual people to find out what they're up to. Also, there is no concentration of force working towards a public accessible rationality curriculum anymore."
Claims (1)
In 2013, CFAR highlighted as an example of "wise failure" the "impressively failed" Effective Fundraising Project—a two-person nonprofit startup founded in 2013 to write grants for effective charities that shut down gracefully after six months. While CFAR presented this as a case study in productive failure, it illustrates challenges in expanding the organization's impact model beyond direct workshop delivery.
4Andrew Critch - CFARacritch.com
Claims (1)
The rationality techniques taught at CFAR workshops—such as Bayesian reasoning, debiasing methods, and systematic decision-making frameworks—are presented as relevant to challenges in AI safety work, including addressing scope insensitivity and improving collaborative research practices.
Minor issues · 85% · Feb 22, 2026
At CFAR, we ask: Can we do more for the world by learning about cognitive biases like scope insensitivity that might thwart our attempts to make altruistic decisions? Can we get more use out of our gut instincts by learning what their strengths and weaknesses are? Can playing cooperative games with intuitive Bayesian reasoning improve our ability to assess arguments and reason collectively in groups?

The source mentions CFAR workshops teaching about cognitive biases like scope insensitivity, but does not explicitly list Bayesian reasoning, debiasing methods, and systematic decision-making frameworks as techniques taught at CFAR workshops. The source does not explicitly state that the techniques taught at CFAR workshops are relevant to challenges in AI safety work or improving collaborative research practices.

Claims (1)
- \$1,035,000 grant in 2016 (two years) for operational improvements and research
Claims (1)
CFAR's primary educational delivery mechanism has been immersive multi-day workshops, typically running 4.5 days and costing \$3,900 to \$4,000. These workshops emphasize three core pillars: applying critical thinking to real-world problems, having students practice skills rather than merely learn concepts, and building lasting habits.
Minor issues · 85% · Feb 22, 2026
I say this because CFAR's $3,900 4-day seminars and their attendees are the subject of a New York Times magazine article this week that will leave you slapping your forehead in front of an imagined group of conference-goers.

The article states the seminars are 4 days, not 4.5 days. The article only mentions the price of the seminars as $3,900, not a range of $3,900 to $4,000. The article does not mention the three core pillars of the workshops.

Claims (1)
Some rationality community members have criticized CFAR for limited progress in creating a "real art of rationality" that aids meaningful intellectual advancement, citing "model gaps," impure motives (such as prioritizing AI safety career placement and deference to organizations like MIRI over pure rationality development), and institutional incentives that may distort efforts. A perceived tension exists between epistemic rationality (exemplified by the LessWrong principle "Politics is the Mind-Killer") and real-world decision-making contexts where instructors may hold preconceptions about desirable outcomes.
Accurate · 90% · Feb 22, 2026
I suspect that "impure motives" (motives aimed at some local goal, and not simply at "help this mind be free and rational") were also a major contributor to what kept us from getting farther at CFAR, and that this interacted with and exacerbated the "model gaps" I was listing in hypothesis 1.
Claims (1)
Key techniques taught at CFAR workshops include:
Unsupported · 0% · Feb 22, 2026
We put on four and a half day rationality workshops, and also some targeted programs for groups like AI researchers, mathematicians, so on and so forth.
Claims (1)
CFAR has contributed to building social infrastructure within rationality and EA communities through workshops, alumni networks, and partnerships with organizations like Effective Altruism Global. The organization has served as a gathering point for "rationality geeks" to exchange ideas on cognitive improvement and has helped connect individuals interested in existential risk reduction, AI safety, and effective altruism.
Unsupported · 20% · Feb 22, 2026
Main workshops & research - cfar has performed literature reviews in psychology, cognitive science, and related fields in order to develop a range of mental techniques designed to help improve clarity of thinking and decision-making, and increase internal alignment towards goals.

The source does not mention CFAR's contribution to building social infrastructure within rationality and EA communities through workshops, alumni networks, and partnerships with organizations like Effective Altruism Global. The source does not mention CFAR serving as a gathering point for 'rationality geeks' to exchange ideas on cognitive improvement. The source does not mention CFAR helping connect individuals interested in existential risk reduction, AI safety, and effective altruism.

Claims (1)
In January 2018, CFAR's semi-independent investigation panel, ACDC, received allegations of physical, sexual, and emotional abuse from one of Brent's former partners. Despite these allegations, ACDC recommended against banning the individual in April 2018, and CFAR leadership followed this recommendation.
11. LessWrong - CFAR A Year Later (lesswrong.com · Blog post)
Claims (1)
- Alumni support: Six-week official follow-up programs and ongoing coaching/mentorship for workshop graduates.
Accurate · 100% · Feb 22, 2026
CFAR now does official followups with participants for six weeks following the workshop.
12. CFAR - About Mission (rationality.org)
Claims (2)
CFAR's stated mission is "Developing clear thinking for the sake of humanity's future," with particular emphasis on addressing challenges related to existential risks, including AI safety. The organization has trained over 1,300 workshop alumni through approximately 60 flagship workshops held between 2012 and early 2020, with participants reporting an average satisfaction rating of 9.2 out of 10. Beyond its core workshop programs, CFAR has contributed to the broader rationality and effective altruism communities through specialized programs, curriculum development, and connections to organizations working on AI alignment and existential risk reduction.
The organization has evolved significantly since its founding, transitioning from a full-time staff of approximately 12 to a mostly-remote operation with eight part-time curriculum developers and instructors as of September 2025. After a multi-year hiatus from regular programming, CFAR resumed mainline workshops in November 2025, testing new formats while continuing to refine its approach to teaching applied rationality.
Claims (1)
CFAR officially began offering classes in 2012, developing a flagship workshop model that charged \$3,900 for multi-day intensive programs. The organization held monthly workshops training diverse participants including scientists, police officers, teachers, entrepreneurs, and high school students. Workshop activity varied by year:
Claims (1)
According to CFAR, these techniques aim to bridge System 1 and System 2 cognition, helping participants work with emotions, reframe problems, and understand the modular nature of the mind.
Unsupported · 30% · Feb 22, 2026
CFAR's workshops aim to give people more understanding and control of their own decision-making.

The source does not mention that the techniques aim to bridge System 1 and System 2 cognition, help participants work with emotions, reframe problems, and understand the modular nature of the mind.

Claims (1)
Between March and September 2022, \$3,405,000 was transferred from the FTX Foundation to CFAR, with an additional \$1,500,000 transferred on October 3, 2022, in ten separate transactions. Following FTX's collapse, FTX debtors required CFAR to return approximately \$5 million.
Accurate · 100% · Feb 22, 2026
CFAR, for example, was required by FTX debtors to return the approximately $5 million it received before the collapse of November 2022. $3,405,000 was transferred from FTX Foundation to CFAR between March and September 2022, and an additional $1,500,000 was transferred on October 3, 2022, in ten separate transactions.
16. EA Forum - Center for Applied Rationality (forum.effectivealtruism.org · Blog post)
Claims (1)
CFAR was founded in 2012 by Anna Salamon, Julia Galef, Michael Smith (also known as Valentine Smith), and Andrew Critch. The organization's origins trace to Anna Salamon's work at the Machine Intelligence Research Institute (MIRI) in 2011, where she began experimenting with teaching rationality techniques. CFAR initially functioned as an extension of MIRI before becoming an independent nonprofit organization.
Minor issues · 85% · Feb 22, 2026
CFAR was founded in 2012 by Anna Salamon, Julia Galef, Valentine Smith and Andrew Critch.

The source does not mention that CFAR initially functioned as an extension of MIRI before becoming an independent nonprofit organization. The source does not mention Anna Salamon's work at the Machine Intelligence Research Institute (MIRI) in 2011, where she began experimenting with teaching rationality techniques. The source refers to Michael Smith as Valentine Smith.

17. YouTube - CFAR Overview (youtube.com · Talk)
Claims (1)
CFAR officially began offering classes in 2012, developing a flagship workshop model that charged \$3,900 for multi-day intensive programs. The organization held monthly workshops training diverse participants including scientists, police officers, teachers, entrepreneurs, and high school students. Workshop activity varied by year:
18. CFAR Press Kit (rationality.org)
Claims (1)
The organization emerged from the rationality movement around Eliezer Yudkowsky's LessWrong community and focuses on applying insights from cognitive science, behavioral economics, psychology, and decision theory to real-world decision-making and problem-solving.
19. CFAR Financial Overview (rationality.org)
Claims (1)
As of December 2019 or 2020, CFAR reported approximately \$1.4 million in total liquidity, including \$650,000 in cash, \$575,000 in expected grants, and \$215,000 in accounts receivable. The organization's total assets have been reported at \$23,039,646, though this figure may include the venue property acquired in 2018.
20. CFAR Official Website (rationality.org)
Claims (1)
CFAR's stated mission is "Developing clear thinking for the sake of humanity's future," with particular emphasis on addressing challenges related to existential risks, including AI safety. The organization has trained over 1,300 workshop alumni through approximately 60 flagship workshops held between 2012 and early 2020, with participants reporting an average satisfaction rating of 9.2 out of 10. Beyond its core workshop programs, CFAR has contributed to the broader rationality and effective altruism communities through specialized programs, curriculum development, and connections to organizations working on AI alignment and existential risk reduction.
Claims (1)
The rationality techniques taught at CFAR workshops—such as Bayesian reasoning, debiasing methods, and systematic decision-making frameworks—are presented as relevant to challenges in AI safety work, including addressing scope insensitivity and improving collaborative research practices.
Unsupported · 20% · Feb 22, 2026
Our aim is therefore to find ways of improving both individual thinking skill, and the modes of thinking and social fabric that allow people to think together . And to do this among the relatively small sets of people tackling existential risk.

The source mentions CFAR's focus on AI safety and improving thinking skills, but it does not explicitly list the specific rationality techniques taught at CFAR workshops (Bayesian reasoning, debiasing methods, systematic decision-making frameworks) or their relevance to specific challenges in AI safety work (addressing scope insensitivity, improving collaborative research practices).

22. Wikipedia - Center for Applied Rationality (en.wikipedia.org · Reference)
Claims (1)
The organization emerged from the rationality movement around Eliezer Yudkowsky's LessWrong community and focuses on applying insights from cognitive science, behavioral economics, psychology, and decision theory to real-world decision-making and problem-solving.
23. CFAR Updates Archive (rationality.org)
Claims (1)
In 2013, CFAR highlighted as an example of "wise failure" the "impressively failed" Effective Fundraising Project—a two-person nonprofit startup founded in 2013 to write grants for effective charities that shut down gracefully after six months. While CFAR presented this as a case study in productive failure, it illustrates challenges in expanding the organization's impact model beyond direct workshop delivery.
Claims (3)
CFAR's stated mission is "Developing clear thinking for the sake of humanity's future," with particular emphasis on addressing challenges related to existential risks, including AI safety. The organization has trained over 1,300 workshop alumni through approximately 60 flagship workshops held between 2012 and early 2020, with participants reporting an average satisfaction rating of 9.2 out of 10. Beyond its core workshop programs, CFAR has contributed to the broader rationality and effective altruism communities through specialized programs, curriculum development, and connections to organizations working on AI alignment and existential risk reduction.
Inaccurate · 50% · Feb 22, 2026
CFAR's founders, Anna Salamon, Julia Galef, Michael Smith, and Andrew Critch all have impressive backgrounds in math, artificial intelligence, science, or a combination. In 2011, Salamon, CFAR's earliest founder, was working at the Machine Intelligence Research Institute (MIRI) an artificial research firm that now shares its offices with CFAR in Berkeley. CFAR originally began as an extension of MIRI, she explained in an email. "I was doing training and onboarding for the Machine Intelligence Research Institute, which in practice required a lot of rationality training. And I began to feel that developing exercises for training 'rationality'—the ability to form accurate beliefs in confusing contexts, and to achieve one's goals—was incredibly important, and worth developing in its own right," Salamon wrote.

Unsupported: The source does not mention CFAR's stated mission. Unsupported: The source does not mention the number of workshop alumni trained by CFAR. Unsupported: The source does not mention the number of flagship workshops held by CFAR between 2012 and early 2020. Unsupported: The source does not mention the average satisfaction rating of participants. Unsupported: The source does not mention CFAR's contributions to the broader rationality and effective altruism communities through specialized programs, curriculum development, and connections to organizations working on AI alignment and existential risk reduction.

The organization's 2017 Impact Report indicated that alumni reported higher impact through better motivation navigation and clearer thinking. However, these results are primarily based on internal surveys without independent replication or external validation.
Unsupported · 0% · Feb 22, 2026
So far, a fair amount of the participants who have experienced CFAR's teachings firsthand see positive results in their lives, even a year later, at least according to survey data that CFAR has internally collected.

The source does not mention a 2017 Impact Report or any specific findings related to alumni motivation, navigation, or clearer thinking. The source does not explicitly state that the results are based on internal surveys without independent replication or external validation.

CFAR was founded in 2012 by Anna Salamon, Julia Galef, Michael Smith (also known as Valentine Smith), and Andrew Critch. The organization's origins trace to Anna Salamon's work at the Machine Intelligence Research Institute (MIRI) in 2011, where she began experimenting with teaching rationality techniques. CFAR initially functioned as an extension of MIRI before becoming an independent nonprofit organization.
Accurate · 100% · Feb 22, 2026
CFAR's founders, Anna Salamon, Julia Galef, Michael Smith, and Andrew Critch all have impressive backgrounds in math, artificial intelligence, science, or a combination. In 2011, Salamon, CFAR's earliest founder, was working at the Machine Intelligence Research Institute (MIRI) an artificial research firm that now shares its offices with CFAR in Berkeley. CFAR originally began as an extension of MIRI, she explained in an email.
Claims (1)
After a hiatus from regular programming beginning in early 2020, CFAR conducted an experimental mini-workshop in June 2025 at Arbor Summer Camp and resumed mainline workshops in November 2025, marking the beginning of a pilot program testing new workshop formats.
Claims (1)
The Applied Rationality Unconference series, organized by CFAR alumni, began with a first retreat in 2023, followed by a second in November 2024 that opened participation beyond CFAR alumni. Multiple additional events were organized throughout 2025, including the "Blackpool Applied Rationality Unconference," representing unofficial, participant-organized versions of CFAR's official workshops.
Accurate · 100% · Feb 22, 2026
The Applied Rationality Unconference started in 2023 as a small unconference-style weekend retreat for CFAR (Center for Applied Rationality) alumni. We had fun, talked a lot about CFAR-style applied rationality and helped each other make genuine progress on some of the most important bottlenecks in our lives. 100% of attendees rated the experience 8/10 or higher, and it was one of my favourite weekends of that year. In November 2024 we ran a second retreat, this time opening it up to non-CFAR alumni. Attendees were excited enough that they decided to organise several more of these events throughout 2025!
27. CFAR 2016 Case Studies (rationality.org)
Claims (2)
CFAR develops and refines rationality techniques by synthesizing insights from cognitive science, psychology, neuroscience, behavioral economics, mathematics, statistics, and game theory. The organization conducts empirical testing of techniques and invents new methods when academic research proves insufficient for practical application.
Notable examples include:
28. CFAR Resources Updates (rationality.org)
Claims (1)
The organization has evolved significantly since its founding, transitioning from a full-time staff of approximately 12 to a mostly-remote operation with eight part-time curriculum developers and instructors as of September 2025. After a multi-year hiatus from regular programming, CFAR resumed mainline workshops in November 2025, testing new formats while continuing to refine its approach to teaching applied rationality.
29. CFAR 2017 Impact Report (rationality.org)
Claims (1)
The organization's 2017 Impact Report indicated that alumni reported higher impact through better motivation navigation and clearer thinking. However, these results are primarily based on internal surveys without independent replication or external validation.
Claims (1)
- Multi-year institutional grant renewed in 2018 for 2018-2019
Citation verification: 6 verified, 1 flagged, 18 unchecked of 34 total

Structured Data

3 facts
Founded Date
2012

All Facts

Organization
Property | Value | As Of | Source
Founded Date | 2012
Headquarters | Berkeley
General
Property | Value | As Of | Source
Website | http://rationality.org/

Related Pages

Top Related Pages

Analysis

Timelines Wiki · AI Watch

Other

Nuño Sempere · Jaan Tallinn · Gwern Branwen · Vidur Kapur · Nate Soares (MIRI)

Organizations

Survival and Flourishing Fund · Coefficient Giving · Machine Intelligence Research Institute · Lighthaven (Event Venue) · Center for Human-Compatible AI · William and Flora Hewlett Foundation

Concepts

Community Building Overview · Diagram Naming Research · Ea Longtermist Wins Losses