Longterm Wiki

Author

niplav

Credibility Rating

3/5 (Good)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: LessWrong

CFAR (Center for Applied Rationality) is closely linked to the LessWrong and AI safety communities; this post provides a community-level update or analysis of the organization's direction, relevant for those tracking the institutional landscape of AI safety.

Forum Post Details

Karma
95
Comments
34
Forum
lesswrong
Forum Tags
Center for Applied Rationality (CFAR), Community

Metadata

Importance: 30/100 · blog post · news

Summary

A LessWrong post examining the status and direction of the Center for Applied Rationality (CFAR), an organization focused on developing and teaching rationality skills. The post likely discusses CFAR's activities, funding, mission, and relationship to the broader AI safety and rationality communities.

Key Points

  • Examines the organizational status and activities of CFAR, a key rationality-training nonprofit in the EA/AI safety ecosystem
  • CFAR has historically been seen as a talent pipeline and community-builder for AI safety researchers
  • Discusses the relationship between rationality training and AI safety work
  • Raises questions about CFAR's effectiveness, funding, or strategic direction
  • Relevant to understanding the organizational landscape of the AI safety community

Cited by 1 page

| Page | Type | Quality |
|------|------|---------|
| Center for Applied Rationality | Organization | 62.0 |

Cached Content Preview

HTTP 200 · Fetched Apr 7, 2026 · 2 KB
# What is Going On With CFAR?
By niplav
Published: 2022-05-28
Whispers [have been going around](https://nitter.hu/casebash/status/1524985482148724736#m) [on the internet](https://nitter.hu/JeffLadish/status/1525022527927382016#m). People [have been talking](https://old.reddit.com/r/slatestarcodex/comments/qcrhc4/can_someone_provide_an_overview_ofintroduction_to/hhit9py/), using words like "defunct" or "inactive" (not yet "dead").

The last update to the website was [December 2020](https://rationality.org/resources/updates/2020/december-newsletter) (the copyright on the website states "© Copyright 2011-2021 Center for Applied Rationality. All rights reserved."), the last [large-scale public communication](https://www.lesswrong.com/posts/96N8BT9tJvybLbn5z/we-run-the-center-for-applied-rationality-ama) was end of 2019 (that I know of).

If CFAR is now "defunct", it might be useful for the rest of the world to know about that, because the problem of making humans and groups more rational hasn't disappeared, and some people might want to pick up the challenge (and perhaps talk to people who were involved in it to rescue some of the conclusions and insights).

Additionally, it would be interesting to hear why the endeavour was abandoned in the end, to avoid going on wild goose-chases oneself (or, in the very boring case, to discover that they simply ran out of funding, though that appears unlikely to me).

If CFAR isn't "defunct", I can see a few possibilities:

* It's working on some super-secret projects, perhaps in conjunction with MIRI (which sounds reasonable enough, but there's still value left on the table in distributing rationality training and raising civilizational sanity)
* They are going about their regular business, but the social network they operate in is large enough that they don't need to advertise on their website (I think this is unlikely, it contradicts most of the evidence in the comments linked above)

So, what is going on?
Resource ID: 12c6782d91fb88a4 | Stable ID: sid_hKYfMNmIjc