Longterm Wiki

Epoch AI, "Frontier LLM training runs can't get much longer" (https://epoch.ai/data-insights/longest-training-run)

web

Data Status

Not fetched

Cited by 1 page

Page | Type | Quality
Capability-Alignment Race Model | Analysis | 62.0

Cached Content Preview

HTTP 200 | Fetched Feb 23, 2026 | 9 KB
Frontier LLM training runs can’t get much longer | Epoch AI 
 
Epoch AI’s work is free to use, distribute, and reproduce provided the source and authors are credited under the Creative Commons Attribution license.

 Cite this work as

 Luke Emberson and Yafah Edelman (2025), "Frontier training runs will likely stop getting longer by around 2027". Published online at epoch.ai. Retrieved from: 'https://epoch.ai/data-insights/longest-training-run' [online resource] 
 
 BibTeX citation

 @misc{epoch2025longesttrainingrun,
 title={Frontier training runs will likely stop getting longer by around 2027},
 author={Luke Emberson and Yafah Edelman},
 year={2025},
 url={https://epoch.ai/data-insights/longest-training-run},
 note={Accessed: }
 } 
 
 Data Insight 
 
 Frontier training runs will likely stop getting longer by around 2027
 
In “The Longest Training Run”, we argue that training runs that last too long are outclassed by training runs that start later and benefit from additional hardware and algorithmic improvements. Based on our latest numbers, this suggests that training runs lasting more than 9 months may be inefficient. At the current pace, training runs will reach this length around 2027 (90% CI: Aug 2025 to Sept 2029).
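To make the break-even logic concrete, here is a toy calculation; it is a sketch under stated assumptions, not Epoch's model. Suppose hardware price-performance and algorithmic efficiency each improve at a fixed exponential rate, and a run's hardware is frozen at its start date. Then for a fixed finish date, a run of length T trades duration against the progress it would have captured by starting later, so its effective compute scales as T · k^(−T), with k the combined annual progress multiplier, and the break-even length is 1/ln k. The growth rates below are illustrative placeholders, chosen so the toy lands near the nine-month figure.

import math

# Toy model of the break-even argument (illustrative rates, not Epoch's
# estimates). A run that starts later benefits from more hardware and
# algorithmic progress, but trains for less time if it must finish by the
# same date.
HW_GROWTH = 1.4    # assumed hardware price-performance multiplier per year
ALGO_GROWTH = 2.5  # assumed algorithmic-efficiency multiplier per year

def effective_compute(run_years, finish_years):
    """Effective compute of a run lasting run_years that finishes
    finish_years from now: it waits finish_years - run_years before
    starting, capturing that much progress (hardware frozen at start)."""
    wait = finish_years - run_years
    progress = (HW_GROWTH * ALGO_GROWTH) ** wait  # gain from starting later
    return run_years * progress  # raw FLOP scales linearly with duration

# For a fixed finish date, effective compute ~ T * k**(-T), which is
# maximized at T = 1 / ln(k), where k is the combined annual multiplier.
k = HW_GROWTH * ALGO_GROWTH
best = 1.0 / math.log(k)
print(f"break-even run length: {best:.2f} years ({12 * best:.1f} months)")

for run in (0.25, 0.5, best, 1.25, 2.0):
    print(f"  {run:.2f}-year run -> effective compute "
          f"{effective_compute(run, finish_years=2.0):.2f}")

With these placeholder rates the break-even comes out to roughly 0.8 years, i.e. about nine and a half months; the real estimate depends on the measured growth rates, which is what drives the wide confidence interval above.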

[Interactive visualization omitted from the cached preview.]

Longer training runs are a significant driver of the rapid growth in training compute. If training time stops increasing, training compute growth will slow – unless developers ramp up hardware scaling even faster. This could be achieved by speeding up the build-out of larger clusters, or by spreading training across multiple clusters.
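As a back-of-the-envelope decomposition (all numbers below are hypothetical placeholders, not Epoch's data): total training compute is sustained cluster throughput multiplied by training time, so once the time factor is capped near nine months, every further multiple of compute must come from the throughput factor.

# Back-of-the-envelope: training compute = throughput x utilization x time.
# All numbers are hypothetical placeholders, not Epoch's data.
def training_compute(peak_flop_per_s, utilization, run_days):
    """Total training FLOP at the given sustained fraction of peak throughput."""
    return peak_flop_per_s * utilization * run_days * 24 * 3600

today = training_compute(peak_flop_per_s=5e18, utilization=0.4, run_days=120)
# With run length capped near 9 months (~270 days), a bigger or faster
# cluster is the only remaining lever in this decomposition:
capped = training_compute(peak_flop_per_s=5e19, utilization=0.4, run_days=270)

print(f"120-day run on today's cluster: {today:.2e} FLOP")
print(f"270-day run on a 10x cluster:   {capped:.2e} FLOP "
      f"({capped / today:.1f}x total, only {270 / 120:.2f}x from duration)")

Once run_days stops growing, the ratio between successive frontier runs collapses to the ratio of their cluster throughputs, which is why the insight points to faster cluster build-out or multi-cluster training.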

... (truncated, 9 KB total)
Resource ID: 9d535d8e91127085 | Stable ID: MzRiNDYyNT