Longterm Wiki

VICE - Center for Applied Rationality

web

This VICE article provides an outside media perspective on CFAR, which is relevant context for understanding the social and organizational landscape surrounding the AI safety community, though content is unavailable for direct verification.

Metadata

Importance: 22/100 · news article · news

Summary

A VICE media article covering the Center for Applied Rationality (CFAR), an organization focused on teaching rationality and critical thinking skills, often associated with the AI safety and effective altruism communities. The article likely examines CFAR's methods, culture, or influence on the broader rationalist and AI safety ecosystem.

Key Points

  • CFAR is an organization that runs workshops teaching applied rationality techniques drawn from cognitive science and psychology
  • CFAR has historically had close ties to AI safety research communities, particularly MIRI and the broader LessWrong rationalist sphere
  • The organization trains individuals to think more clearly about high-stakes decisions, including those related to existential risk
  • VICE coverage suggests mainstream media scrutiny of rationalist/AI safety adjacent organizations and their methods
  • CFAR's influence on AI safety talent pipeline and community culture has been a subject of external commentary

Cited by 1 page

Page: Center for Applied Rationality · Type: Organization · Quality: 62.0

Cached Content Preview

HTTP 200 · Fetched Apr 9, 2026 · 14 KB
The 'Rationality' Workshop That Teaches People to Think More Like Computers
 
 Melissa Beswick, a research coordinator at the University of Pennsylvania and one of my closest friends, has tried for years to force herself into the habit of swimming laps.

 
 

 “There have been times when I’ve tried to swim consistently, but it’s only lasted a couple weeks, never more,” she told me.

 Finally, that’s changed. Beswick now swims two to three times a week, and she’s confident she can stick with it.

 
 

 She credits her newfound motivation at least in part to a curious trip she took out to the Bay Area last fall. She flew across the country to spend five days in a cramped hostel with about three dozen others—mostly young tech workers—all there to attend a workshop hosted by the Center for Applied Rationality, or CFAR.

 As far as self-help seminars go, CFAR is definitely unique. Instead of invoking spirituality or pointing toward a miracle cure-all, the organization emphasizes thinking of your brain as a kind of computer. 

 
 

 Throughout the workshop, participants and facilitators described their thinking patterns using programming and AI terms, Beswick said. 

 “One of the first things we had to do was create a ‘bugs list,’” Beswick told me, meaning a list of personal thinking errors. The exercise was a nod towards the term programmers use to refer to problems in computer code.

 “We had all noticed in different ways in different contexts that being smart, and being well educated and even being really well intentioned was far from a guarantee from making what turned out to be really stupid decisions.”

 
 

 Even the names of classes are sometimes derived from tech terms. One class, dubbed “propagating urges,” comes from the machine learning term backpropagation.

 CFA

... (truncated, 14 KB total)
Resource ID: c58992c9e437a569 | Stable ID: sid_ecgVdrtGA1