Longterm Wiki
Back

Credibility Rating

4/5 (High)

High quality. Established institution or organization with editorial oversight and accountability.

Rating inherited from publication venue: Epoch AI

Data Status

Not fetched

Cited by 1 page

Page | Type | Quality
AI Capability Threshold Model | Analysis | 72.0

Cached Content Preview

HTTP 200 · Fetched Feb 23, 2026 · 98 KB
Training compute of frontier AI models grows by 4-5x per year | Epoch AI
Report: Training compute of frontier AI models grows by 4-5x per year

Our expanded AI model database shows that the compute used to train recent models grew 4-5x yearly from 2010 to May 2024. We find similar growth in frontier models, recent large language models, and models from leading companies.

Published: May 28, 2024
Authors: Jaime Sevilla, Edu Roldán
Resources: Dataset, Source Code

 Introduction

Over the last ten years, we have witnessed a dramatic increase in the computational resources dedicated to training state-of-the-art AI models. This strategy has been incredibly productive, translating into large gains in generality and performance. For example, we estimate that about two-thirds of the improvements in performance in language models in the last decade have been due to increases in model scale.

Given the central role of scaling, it is important to track how the computational resources (‘compute’) used to train models have grown in recent years. In this short article, we provide an updated view of the trends so far, having collected three times more data since our last analysis.

We tentatively conclude that compute growth in recent years is currently best described as increasing by a factor of 4-5x/year. We find consistent growth across recent notable models, the running top 10 of models by compute, recent large language models, and top models released by OpenAI, Google DeepMind and Meta AI.
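
As a rough illustration of what a 4-5x/year figure means in practice, the sketch below fits an ordinary least-squares line to log10(training compute) versus release date and converts the slope into a per-year growth factor and a doubling time. This is a minimal sketch of the standard log-linear approach, not Epoch AI's actual analysis code, and the (year, FLOP) values are made-up placeholders rather than entries from their database.

import numpy as np

# Illustrative (release year, training FLOP) pairs -- placeholder values,
# NOT taken from Epoch AI's database.
years = np.array([2012.0, 2014.5, 2017.0, 2019.5, 2022.0, 2024.0])
flop  = np.array([1e17,   1e19,   5e20,   2e22,   5e23,   5e25])

# Fit log10(compute) = slope * year + intercept by ordinary least squares.
slope, intercept = np.polyfit(years, np.log10(flop), deg=1)

# The slope is orders of magnitude of compute added per year;
# 10**slope is the implied year-over-year growth factor ("Nx per year").
growth_per_year = 10 ** slope
doubling_time_months = 12 * np.log10(2) / slope

print(f"Estimated growth factor: {growth_per_year:.1f}x per year")
print(f"Implied doubling time: {doubling_time_months:.1f} months")

A 4-5x annual factor corresponds to a doubling time of roughly five to six months, which is why a decade of this trend amounts to many orders of magnitude of additional training compute.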

 There are some unresolved uncertainti

... (truncated, 98 KB total)
Resource ID: 7d0515f6079d8beb | Stable ID: ZWE3MjAyOT