Fortune AI training costs
Credibility Rating
3/5
Good (3): Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: Fortune
Useful context for understanding compute concentration and resource barriers in frontier AI development; relevant to governance discussions about who controls AI progress and the sustainability of scaling-based approaches.
Metadata
Importance: 42/100 · news article
Summary
This Fortune article examines the rapidly escalating costs of training frontier AI models, with some models potentially requiring billions of dollars and computational demands doubling roughly every six months. It raises concerns about whether current scaling trajectories are economically and practically sustainable for ongoing AI development.
Key Points
- Training costs for leading AI models are reaching into the billions of dollars, representing a dramatic acceleration from prior generations.
- Computational requirements for frontier models are doubling approximately every six months, outpacing even Moore's Law trends.
- The sustainability of this cost trajectory is questioned, with implications for which organizations can compete at the frontier.
- Rising costs may concentrate AI development among a small number of well-resourced companies like OpenAI, Anthropic, and Microsoft.
- The trend raises broader questions about the long-term viability of current scaling approaches to AI advancement.
Review
The source examines the escalating costs of training advanced AI models, revealing a remarkable trend of exponential growth in computational requirements. Researchers from Epoch AI have tracked how the computational power needed to train cutting-edge AI models has been doubling approximately every six months since the early 2010s, with training costs roughly tripling annually. This trajectory suggests potential training costs could reach $140 billion by 2030, though the projection is acknowledged as a speculative extrapolation.
The implications for AI development are profound, with potential economic and technological limitations emerging. Experts like Lennart Heim warn that training costs could theoretically surpass entire national GDPs by the mid-2030s, raising critical questions about the sustainability of current AI development approaches. Alternative strategies are being explored, such as smaller, task-specific models, open-source collaboration, and innovative data sourcing techniques like synthetic data generation. The research highlights the complex interplay between technological advancement, economic constraints, and the pursuit of increasingly sophisticated artificial intelligence.
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Dense Transformers | Concept | 58.0 |
Cached Content Preview
HTTP 200 · Fetched Apr 9, 2026 · 12 KB
The cost of training AI could soon become too much to bear | Fortune
By David Meyer · April 4, 2024, 6:13 AM ET

The cost of training the most advanced AI models may soon be too much to bear, some experts forecast. (Image: Getty Images)

Although companies like OpenAI and Google don’t disclose the precise costs of training AI models like GPT-4 and Gemini, it’s clearly a fiendishly expensive business—and the bigger and more capable these so-called frontier models get, the more it costs to train them.
When OpenAI released GPT-3 in 2020, cloud provider Lambda suggested the model—which had 175 billion parameters—cost over $4.6 million to train. OpenAI hasn’t disclosed the size of GPT-4, which it released a year ago, but reports range from 1 trillion to 1.8 trillion parameters, and CEO Sam Altman vaguely pegged the training cost at “more than” $100 million. Anthropic CEO Dario Amodei suggested in August that models costing over $1 billion would appear this year and that “by 2025 we may have a $10 billion model.”
Is this kind of exponential cost growth realistic? It’s certainly a real trend, say researchers, but whether it can be sustained is another matter.
In 2022, researchers in the U.K., the U.S., Germany, and Spain found that, since the deep learning field took off in the early 2010s, the amount of computational power needed to train the most capable new models doubled roughly every six months. According to Epoch AI director Jaime Sevilla, who was the lead author on that paper, the trajectory has held since then, with the cost of training roughly tripling each year: the roughly 4x annual growth in compute requirements, partly offset by a roughly 1.3x annual gain in hardware efficiency, nets out to about 3x cost growth.
“It’s still a straight line, and it keeps pointing up,” says Sevilla.
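The growth figures above can be sanity-checked with simple arithmetic. This is a back-of-the-envelope sketch, not Epoch AI's methodology; the only inputs are the two rates stated in the article (compute doubling every six months, efficiency improving about 1.3x per year):

```python
# Compute doubling every 6 months means two doublings per calendar year.
compute_growth_per_year = 2.0 ** (12 / 6)   # 4x per year

# Hardware efficiency gains reduce the cost per unit of compute.
efficiency_gain_per_year = 1.3

# Net annual growth in training cost.
cost_growth_per_year = compute_growth_per_year / efficiency_gain_per_year
print(round(cost_growth_per_year, 2))  # about 3.08, i.e. costs roughly triple annually
```

This matches the "roughly tripling each year" figure attributed to Sevilla.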
Here’s Epoch AI’s projection of the hardware cost involved in training the most expensive AI models, through 2030. This excludes AI researchers’ salaries, which are considerable these days. There is also huge uncertainty—as is clear from the vast range in Epoch’s estimations—about the exact trajectory. This results from how little is publicly known about the size and cost of models like GPT-4 and the statistical effects of disparate estimates of the exact investment growth rate, which compounds the more years out one tries to project. Sevilla describes the median forecast—the line that hits $140 billion by 2030—as “a naive extrapolation based on historical data, rather than an all-things-considered forecast.”
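The point about disparate growth-rate estimates compounding over time can be made concrete with a toy projection. Everything here is hypothetical for illustration: the $0.1 billion (roughly GPT-4-class) 2023 starting cost and the three candidate growth rates are not Epoch AI's figures, only stand-ins to show how the spread widens:

```python
# Illustrative only: small disagreements about the annual growth rate
# compound dramatically over a 7-year projection (2023 -> 2030).
base_cost_billion = 0.1   # hypothetical ~$100M-class model in 2023
years = 7

for rate in (2.5, 3.0, 3.5):
    projected = base_cost_billion * rate ** years
    print(f"{rate}x/year -> ${projected:,.0f}B by 2030")
```

A one-unit difference in the assumed annual multiplier changes the 2030 figure by an order of magnitude, which is why Epoch's estimate band is so wide.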
And here come the additional caveats, apart from that uncertainty. The first and most obvious is that, if this trend continues, the cost of training relative to the capabilities that are gained will at some point become too much for any company
... (truncated, 12 KB total)

Resource ID: b2534f71895a316d | Stable ID: sid_0aG9lfcBXZ