For Our Posterity – Leopold Aschenbrenner's Personal Site
web · forourposterity.com
Personal website of Leopold Aschenbrenner, former OpenAI Superalignment researcher and author of the influential 'Situational Awareness' essay series on AGI strategy, offering links to key AI safety and AGI policy writings.
Metadata
Importance: 62/100 · homepage
Summary
This is the personal website of Leopold Aschenbrenner, a former OpenAI Superalignment team member who authored the widely discussed 'Situational Awareness: The Decade Ahead' essay series. The site hosts his writings on AGI alignment, superalignment research directions, economic growth, and AI geopolitics. It serves as a hub for his views on the urgency and tractability of AGI safety.
Key Points
- Home to the 'Situational Awareness' essay series, a major public document on the AGI strategic landscape and safety timelines.
- Features posts on superalignment research, including weak-to-strong generalization and scalable oversight.
- Includes commentary on AGI alignment resourcing gaps and the argument that alignment is undersupported.
- Covers geopolitical dimensions of AGI development, including US-China competition framing.
- Author founded an AGI-focused investment firm backed by prominent tech figures including Patrick and John Collison.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Leopold Aschenbrenner | Person | 61.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 21, 2026 · 5 KB
Hi, I'm Leopold Aschenbrenner. I recently founded an investment firm focused on AGI, with anchor investments from Patrick Collison, John Collison, Nat Friedman, and Daniel Gross.
Before that, I worked on the Superalignment team at OpenAI.
In a past life, I did research on economic growth at Oxford's Global Priorities Institute. I graduated as valedictorian from Columbia at age 19. I originally hail from Germany and now live in the great city of San Francisco, California.
My aspiration is to secure the blessings of liberty for our posterity. I'm interested in a pretty eclectic mix of things, from First Amendment law to German history to topology, though I'm pretty focused on AI these days.
Follow me on Twitter. You can email me here.
Featured Posts
14 Jun 2024
SITUATIONAL AWARENESS: The Decade Ahead
Virtually nobody is pricing in what's coming in AI. I wrote an essay series on the AGI strategic picture: from the trendlines in deep learning and counting the OOMs, to the international situation and The Project.
14 Jun 2024
Dwarkesh podcast on SITUATIONAL AWARENESS
My 4.5-hour conversation with Dwarkesh. I had a blast!
14 Dec 2023
Weak-to-strong generalization
A new research direction for superalignment: can we leverage the generalization properties of deep learning to control strong models with weak supervisors?
29 Mar 2023
Nobody’s on the ball on AGI alignment
Far fewer people are working on it than you might think, and even the alignment research that is happening is very much not on track. (But it’s a solvable problem, if we get our act together.)
26 Jul 2021
Burkean Longtermism
People will not look forward to posterity, who never look backward to their ancestors.
9 Jul 2021
My Favorite Chad Jones Papers
Some of the very best, and most beautiful, economic theory on long-run growth.
23 Nov 2020
Europe’s Political Stupor
On the European obsession with America, the dearth of the political on the Continent, and the downsides of homogeneity.
19 Oct 2020
The Risks of Stagnation (Article for Works in Progress)
Human activity and new technologies can be dangerous, threatening the very survival of humanity. Does that mean economic growth is inherently risky?
Recent Posts
14 Dec 2023
Superalignment Fast Grants
We’
... (truncated, 5 KB total)
Resource ID: 928c0d953a6b3a4d | Stable ID: sid_jRqtlL3tKA