Credibility Rating
4/5
High (4): High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: Epoch AI
Data Status
Not fetched
Cited by 7 pages
| Page | Type | Quality |
|---|---|---|
| The Case For AI Existential Risk | Argument | 66.0 |
| Is Scaling All You Need? | Crux | 42.0 |
| AGI Development | -- | 52.0 |
| Novel / Unknown Approaches | Capability | 53.0 |
| AI Risk Critical Uncertainties Model | Crux | 71.0 |
| Epoch AI | Organization | 51.0 |
| Long-Timelines Technical Worldview | Concept | 91.0 |
Cached Content Preview
HTTP 200 | Fetched Feb 23, 2026 | 98 KB
Can AI scaling continue through 2030? | Epoch AI
We investigate the scalability of AI training runs. We identify electric power, chip manufacturing, data and latency as constraints. We conclude that 2e29 FLOP training runs will likely be feasible by 2030.
Published
Aug 20, 2024
Authors
Jaime Sevilla, Tamay Besiroglu, Ben Cottier, Josh You, Edu Roldán, Pablo Villalobos, Ege Erdil
Resources
Source Code
Introduction
In recent years, the capabilities of AI models have significantly improved. Our research suggests that growth in computational resources accounts for a significant portion of these performance improvements.1 The consistent and predictable gains from scaling have led AI labs to aggressively expand the scale of training, with training compute growing at a rate of approximately 4x per year.
To put this 4x annual growth in AI training compute into perspective, it outpaces even some of the fastest technological expansions in recent history. It surpasses the peak growth rates of mobile phone adoption (2x/year, 1980-1987), solar energy capacity installation (1.5x/year, 2001-2010), and human genome sequencing (3.3x/year, 2008-2015).
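As a rough check on what this growth rate implies, the sketch below compounds 4x/year forward from an assumed ~5e25 FLOP frontier training run in 2024 (the baseline value and the helper `projected_flop` are illustrative assumptions, not figures or code from the report); by 2030 the compounding lands near the 2e29 FLOP estimate in the abstract.

```python
# A minimal sketch (not from the report's source code): compounding the
# article's ~4x/year training-compute growth rate forward to 2030.
# Assumed baseline: a ~5e25 FLOP frontier training run in 2024; this
# baseline is an illustrative assumption, not a figure from the excerpt.

BASE_YEAR = 2024
BASE_FLOP = 5e25   # assumed 2024 frontier training run (illustrative)
GROWTH = 4.0       # ~4x per year, per the article

def projected_flop(year: int) -> float:
    """Frontier training compute projected to `year` at 4x/year growth."""
    return BASE_FLOP * GROWTH ** (year - BASE_YEAR)

for year in (2026, 2028, 2030):
    print(f"{year}: {projected_flop(year):.1e} FLOP")
# 2030: 2.0e+29 FLOP -- consistent with the report's 2e29 FLOP conclusion.
```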
Here, we examine whether it is technically feasible for the current rapid pace of AI training scaling—approximately 4x per year—to continue through 2030. We investigate four key factors that might constrain scaling: power availability, chip manufacturing capacity, data scarcity, and the “latency wall”, a fundamental speed limit…
... (truncated, 98 KB total)
Resource ID: 9587b65b1192289d | Stable ID: NTZlMGFiY2