Nick Beckstead
Nick Beckstead is a philosopher and effective altruism/longtermism figure whose 2013 dissertation formalized longtermist ethics. This article traces his career from academic philosopher, to grantmaker at Coefficient Giving (then Open Philanthropy), to CEO of the FTX Future Fund, to founder of the Secure AI Project, with substantive coverage of controversies around FTX forewarnings, longtermist prioritization critiques, and alleged research suppression.
Quick Facts
| Field | Detail |
|---|---|
| Full Name | Nicholas Beckstead |
| Born | 1985 |
| Nationality | American |
| Field | Philosophy, AI safety, longtermism, effective altruism |
| Current Role | Co-founder and CEO, Secure AI Project |
| Key Contribution | PhD dissertation On the Overwhelming Importance of Shaping the Far Future (2013) |
| Affiliated Organizations | Secure AI Project, formerly Coefficient Giving, FTX Future Fund, Future of Humanity Institute, Centre for Effective Altruism |
Key Links
| Source | Link |
|---|---|
| Official Website | nickbeckstead.com |
| Wikipedia | en.wikipedia.org |
| EA Forum | forum.effectivealtruism.org |
| PhilPeople | philpeople.org |
| Secure AI Project | secureaiproject.org |
Overview
Nick Beckstead is an American philosopher whose work has been central to the development of longtermism as a philosophical and philanthropic framework. His 2013 Rutgers University PhD dissertation, On the Overwhelming Importance of Shaping the Far Future, is widely cited as an early systematic defense of the view that actions affecting the trajectory of civilization over millions of years deserve enormous moral weight—potentially outweighing the near-term benefits of conventional charitable interventions.1 Supervised by philosopher Larry Temkin, the dissertation addresses existential risk, population ethics, and decision theory under uncertainty, and continues to be a foundational reference in the effective altruism and AI safety communities.2
Beyond academic philosophy, Beckstead has played a significant operational role in the effective altruism ecosystem. He was among the earliest employees at Coefficient Giving, serving as a Program Officer overseeing grantmaking in global catastrophic risk reduction, machine learning, science philanthropy, animal product alternatives, and mechanisms of aging.3 He subsequently served as CEO of the FTX Foundation and FTX Future Fund until that organization's collapse in November 2022, after which he transitioned to policy-focused AI safety work. He co-founded and now leads the Secure AI Project, which focuses on developing pragmatic policies to reduce severe harms from advanced AI.4
Beckstead also has deep roots in the effective altruism movement's organizational history. As a graduate student, he co-founded the first US chapter of Giving What We Can and was one of three founding trustees of the Centre for Effective Altruism.5 These early contributions earned him recognition from EA co-founders as a key figure in establishing the movement's institutional infrastructure in the United States.
Education and Early Career
Beckstead completed a bachelor's degree in mathematics and philosophy at the University of Minnesota before pursuing a PhD in philosophy at Rutgers University, which he completed in 2013.1 During his graduate studies, he became involved with the effective altruism movement early in its formation—he joined Giving What We Can shortly after its 2009 launch and went on to co-found the organization's first US chapter at Rutgers, pledging to donate half of his post-tax income until retirement to cost-effective global poverty organizations.5
His early community-building contributions extended beyond Giving What We Can. He helped organize a GWWC talk at Rutgers that attracted over 500 attendees, served as GWWC's director of research, and was one of the three founding trustees of what would become the Centre for Effective Altruism. According to an EA Forum post, William MacAskill described him as essentially a co-founder of GWWC and credited him as crucial to bridging GiveWell and Coefficient Giving (then Open Philanthropy) with the broader EA movement.5
Following his PhD, Beckstead was a Research Fellow at the Future of Humanity Institute at Oxford University, where he worked on long-term priorities in effective altruism and existential risk. He held this position from June 2013 until November 2014.6
Research and Publications
PhD Dissertation
Beckstead's central academic contribution is his 2013 dissertation, On the Overwhelming Importance of Shaping the Far Future. The core argument holds that, from a global perspective, what matters most in expectation is doing what is best for the general trajectory along which humanity develops over millions of years or longer.7 This framing implies that existential risk reduction and trajectory changes—interventions that alter the long-run course of civilization—can dwarf proximate benefits such as disease treatment or poverty alleviation in terms of expected moral value.
The dissertation includes substantive chapters on empirical and normative defenses of longtermism, rebuttals to population ethics objections (including critiques of person-affecting views and diminishing marginal value arguments), and early discussion of decision theory under extreme uncertainty. Beckstead distinguishes between broad far-future shaping—general interventions robust across many scenarios—and more targeted interventions, arguing for a preference toward the former while acknowledging significant uncertainty.8
Other Academic Work
In collaboration with Toby Ord, Beckstead co-authored "Managing Existential Risk From Emerging Technologies," published in the UK Government Chief Scientific Adviser's annual report Innovation: Managing Risk, Not Avoiding It (2014).7 He also published "How much could refuges help us recover from a global catastrophe?" in the journal Futures in 2015, examining the potential role of isolated refuges in civilizational recovery from catastrophic events.7
A more recent academic contribution, co-authored with philosopher Teruji Thomas, is "A Paradox for Tiny Probabilities and Enormous Values," which appeared as a Global Priorities Institute working paper in 2020 and was later published in the journal Noûs in 2023–2024.9 The paper argues that every theory of uncertain prospects faces at least one of three unpalatable properties: timidity, recklessness, or problematic trade-offs. This work draws on a chapter from Beckstead's 2013 dissertation and has implications for decision theory, axiology, and utilitarian ethics more broadly.9
During his tenure at Coefficient Giving, Beckstead's output shifted primarily toward grantmaking and management rather than peer-reviewed publications, though he produced a number of web pages and reports on topics including animal product alternatives, potential risks from advanced AI, mechanisms of aging, and the long-term significance of reducing global catastrophic risks.7
Career at Coefficient Giving (then Open Philanthropy)
Beckstead joined Coefficient Giving as one of its earliest employees at the end of 2014.3 As a Program Officer, he oversaw a broad portfolio of grantmaking spanning several cause areas. Notable grants associated with his oversight included support for Target Malaria (a gene drive project for malaria control), Impossible Foods (animal product alternatives), and Ed Boyden's laboratory at MIT.3 His portfolio also encompassed EA community support and existential risk reduction, reflecting his philosophical background.
In an EA Global 2017 talk, Beckstead described his approach to community grantmaking as oriented toward empowering talented individuals in their career choices rather than scaling a mass movement. He argued that EA had already succeeded in attracting funds, and that the marginal need was for high-quality intellectual contributions and responsive support for emerging projects rather than additional fundraising infrastructure.10
He remained at Coefficient Giving (then Open Philanthropy) until approximately 2021, when he transitioned to lead the FTX Foundation and FTX Future Fund.1
FTX Future Fund and Subsequent Work
Beckstead joined the FTX Foundation as its CEO in November 2021 and served as the leader of the FTX Future Fund, which made grants focused on longtermist and EA-aligned causes.1 He resigned from both roles in November 2022 following the collapse of FTX.1
After leaving FTX, Beckstead took on a role as Policy Lead at the Center for AI Safety (CAIS) and undertook various AI safety and governance consulting projects.4 He subsequently co-founded the Secure AI Project, where he serves as CEO. According to the organization's description, the Secure AI Project develops and advocates for pragmatic policies to reduce risks of severe harm from advanced AI.4 Commentator Zvi Mowshowitz has described the organization's work favorably, noting a private track record that includes improving safety practices at a major AI lab and characterizing the project as able to generate substantial impact relative to its funding.11
Beckstead also stepped down from the boards of Effective Ventures UK and Effective Ventures US on August 23, 2023.12
Philosophical Views
Beckstead's philosophical commitments center on longtermism and the moral significance of future generations. In discussions documented on the EA Forum and in podcast appearances, he has advocated for weighting future generations roughly equivalently to the present, and for prioritizing causes that are neglected and where additional resources can significantly bend humanity's long-run trajectory.8
On population ethics, Beckstead has engaged critically with strict person-affecting views—the position that only identifiable individuals' welfare matters morally—arguing these views generate counterintuitive implications when applied to scenarios involving potential future people. He favors approaches that aggregate across moral frameworks under uncertainty rather than committing to a single theory.8
Beckstead has also discussed the practical implications of moral uncertainty for cause selection. He has expressed support for improving collective judgment through mechanisms like forecasting tournaments and prediction markets, citing Phil Tetlock's research as an example of empirically grounded progress on these questions.8
On AI development strategy specifically, Beckstead has argued that racing to build AI capabilities without a viable alignment solution in place is strategically unsound, even if the developers have pro-safety intentions.13 He has also noted the challenge of gaining mainstream AI researcher buy-in for safety-focused approaches, attributing part of this friction to differences in research culture between machine learning (which emphasizes empirical, code-based demonstrations) and some AI safety methodologies.14
Criticisms and Controversies
Association with FTX and Sam Bankman-Fried
The most significant controversy surrounding Beckstead concerns his role in leading the FTX Future Fund despite prior warnings about Sam Bankman-Fried's conduct. Reporting in Time magazine described how multiple EA leaders, including Beckstead, were warned as early as 2018–2019 about concerns regarding Bankman-Fried's trustworthiness, including allegations of lying and problematic business practices at Alameda Research.15 Despite these warnings, Beckstead joined the FTX Future Fund as its leader in 2021. Critics have argued that EA leadership, including Beckstead, prioritized FTX's philanthropic resources over adequate due diligence on Bankman-Fried's ethics.
Following the FTX collapse, Beckstead recused himself from Effective Ventures board matters related to FTX but remained on both EV UK and EV US boards for over nine months before departing in August 2023.12 Some observers described this prolonged tenure as problematic given the circumstances of his departure from FTX.
Longtermism and Prioritization of Rich Countries
A passage from Beckstead's 2013 PhD dissertation attracted public criticism: he wrote that it seemed more plausible to him that saving a life in a rich country is substantially more important than saving a life in a poor country, other things being equal, due to the greater potential "ripple effects" from wealthier, more connected individuals.15 Critics in outlets including Jacobin and Current Affairs cited this as evidence of elitism embedded in longtermist thinking, framing it as a form of trickle-down ideology that deprioritizes existing poor populations in favor of speculative future benefits.16
Beckstead's broader longtermist framework has been critiqued by philosophers and journalists as potentially justifying neglect of present harms in pursuit of speculative long-run gains, with some critics characterizing longtermism as a dangerous ideological framework when operationalized at scale.17
Alleged Research Suppression
Philosopher Simon Knutsson has reported that managers associated with the Effective Altruism Foundation (EAF), cooperating with Beckstead in his capacity as a grant investigator and CEA trustee, pressured him to revise his paper "The World Destruction Argument" (which critiqued views associated with Beckstead and others) to conform to EA communication guidelines, or risk losing funding. Knutsson's account claims that this involved outreach to his academic supervisor and that it reflected an attempt to suppress pessimistic ethics research.18 The nature and extent of Beckstead's direct involvement in these communications is not fully documented in publicly available sources.
Key Uncertainties
- The full scope of Beckstead's decision-making authority and situational awareness regarding FTX prior to its collapse remains unclear from public sources.
- The degree to which the Secure AI Project's claimed policy impacts are independently verifiable is difficult to assess; accounts of its effectiveness are primarily from affiliated commentators.
- Beckstead's precise views on longtermist prioritization have evolved since the 2013 dissertation; public statements suggest he has moderated some positions, but the extent of revision is not fully documented.
Sources
Footnotes
1. EA Forum – Nick Beckstead topic page ↩ ↩2 ↩3 ↩4 ↩5
2. Nick Beckstead – Research page ↩
3. 80,000 Hours Podcast – Nick Beckstead on giving billions ↩ ↩2 ↩3
4. Nick Beckstead – Personal website ↩ ↩2 ↩3
5. EA Forum – Nick Beckstead is leaving the Effective Ventures boards ↩ ↩2 ↩3
6. Timeline of Future of Humanity Institute – Issa Rice ↩
7. Nick Beckstead – Research publications list ↩ ↩2 ↩3 ↩4
8. 80,000 Hours Podcast – Nick Beckstead on giving billions ↩ ↩2 ↩3 ↩4
9. Global Priorities Institute – A Paradox for Tiny Probabilities and Enormous Values (working paper) ↩ ↩2
10. EA Forum – Nick Beckstead: EA Community Building (2017 talk) ↩
11. Zvi Mowshowitz – The Big Nonprofits Post 2025 ↩
12. Citation rc-390c (data unavailable — rebuild with wiki-server access) ↩ ↩2
13. AI Alignment Forum – Let's think about slowing down AI ↩
14. EA Global 2018 – Nick Beckstead fireside chat ↩
15. Time Magazine – Sam Bankman-Fried, Effective Altruism, Alameda, FTX ↩ ↩2
16. Jacobin – Effective Altruism, Longtermism, Nick Bostrom, racism ↩
17. Aeon – Why longtermism is the world's most dangerous secular credo ↩
18. Simon Knutsson – Problems in Effective Altruism and Existential Risk ↩