Longterm Wiki

The FTX crisis highlights a deeper cultural problem within EA - we don't sufficiently value good governance

web

Author

James Fodor

Credibility Rating

3/5
Good (3)

Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.

Rating inherited from publication venue: EA Forum

Written in November 2022 shortly after the FTX collapse, this post is a community critique relevant to understanding how organizational governance failures in EA-adjacent institutions can create systemic risks, with lessons applicable to AI safety organizations.

Forum Post Details

Karma
452
Comments
55
Forum
eaforum
Forum Tags
Building effective altruism, Community, Criticism of the effective altruism community, Criticism of effective altruist organizations, FTX collapse, Nonprofit governance, Criticism of work in effective altruism, Transparency

Metadata

Importance: 52/100 · opinion piece · commentary

Summary

James Fodor argues that the FTX collapse reflects a systemic pattern of governance failures across EA organizations, not an isolated incident. He documents specific cases of weak financial oversight, inadequate accountability, and insufficient transparency across multiple EA orgs since the movement's inception, attributing these failures to a cultural devaluation of institutional best practices in favor of philosophical and strategic discourse.

Key Points

  • EA organizations have repeatedly exhibited weak governance, including failures at Singularity Institute, CEA, 80k Hours, MIRI/CFAR, Leverage Research, and FTX/Future Fund.
  • EA culture tends to prioritize philosophical and strategic discussions over organizational governance, making accountability norms low-status within the community.
  • Governance failures share common threads: poor record-keeping, inadequate board oversight, conflicts of interest, and insufficient protection for those in power-asymmetric relationships.
  • The FTX collapse should be understood as a predictable outcome of systemic cultural neglect of governance, not a black swan event.
  • Fodor calls for the EA community to establish stronger norms around transparency, accountability, external auditing, and stakeholder engagement.

Cited by 2 pages

Cached Content Preview

HTTP 200 · Fetched Apr 9, 2026 · 9 KB
# The FTX crisis highlights a deeper cultural problem within EA - we don't sufficiently value good governance
By James Fodor
Published: 2022-11-14
Introduction
------------

In this piece, I will explain why I don't think the collapse of FTX and the resulting fallout for the Future Fund and the EA community in general is a one-off or 'black swan' event, as some have argued on this forum. Rather, I think that what happened was part of a broader pattern of failures and oversights that have been persistent within EA and EA-adjacent organisations since the beginning of the movement.

As a disclaimer, I do not have any inside knowledge or special expertise about FTX or any of the other organisations I will mention in this post. I speak simply as a long-standing and concerned member of the EA community.

Weak Norms of Governance
------------------------

The essential point I want to make in this post is that the EA community has not been very successful in fostering norms of transparency, accountability, and institutionalisation of decision-making. Many EA organisations began as *ad hoc* collections of like-minded individuals with very ambitious goals but relatively little career experience. This has often meant that adequate organisational structures and procedures were not established for personnel management, financial oversight, external auditing, or accountability to stakeholders. Let me illustrate my point with some major examples I am aware of from EA and EA-adjacent organisations:

1.  Weak governance structures and financial oversight at the [Singularity Institute](https://www.lesswrong.com/posts/6SGqkCgHuNr7d4yJm/thoughts-on-the-singularity-institute-si), leading to the theft of over $100,000 in 2009.
2.  Inadequate record keeping, rapid executive turnover, and insufficient board oversight at the [Centre for Effective Altruism](https://www.centreforeffectivealtruism.org/our-mistakes) over the period 2016-2019.
3.  Inadequate financial record keeping at [80,000 Hours](https://80000hours.org/about/credibility/evaluations/mistakes/#accounting-behind) during 2018.
4.  Insufficient oversight, unhealthy power dynamics, and other harmful practices reported at [MIRI/CFAR](https://www.lesswrong.com/posts/MnFqyPLqbiKL8nSR7/my-experience-at-and-around-miri-and-cfar-inspired-by-zoe) during 2015-2017.
5.  Similar problems reported at the EA-adjacent organisation [Leverage Research](https://medium.com/@zoecurzi/my-experience-with-leverage-research-17e96a8e540b#3b11) during 2017-2019.
6.  'Loose norms around board of directors and conflicts of interests between funding orgs and grantees' at [FTX and the Future Fund](https://forum.effectivealtruism.org/posts/efGNMe6uB87qXozXJ/ny-times-on-the-ftx-implosion-s-impact-on-ea) from 2021-2022.

While these specific issues are somewhat diverse, I think what they have in common is an insufficient emphasis on principles of good organisational governance. This ranges from the most basic such as clear objectives and good record

... (truncated, 9 KB total)
Resource ID: 72746ecdf8d4b160 | Stable ID: sid_8TQJYKh7zg