Our World in Data GPU performance
Web Credibility Rating
4/5
High (4): High quality. Established institution or organization with editorial oversight and accountability.
Rating inherited from publication venue: Our World in Data
A useful reference dataset for researchers analyzing AI compute trends, hardware scaling, and their implications for AI capability trajectories and safety timelines.
Metadata
Importance: 52/100
dataset
Summary
This Our World in Data visualization tracks GPU computational performance per dollar over time, focusing on hardware used in large AI model training. The data is inflation-adjusted and illustrates the dramatic improvements in cost-efficiency of AI training compute. It provides empirical grounding for understanding AI capability scaling trends.
Key Points
- Tracks calculations per dollar for GPUs used in AI training, adjusted for inflation over time.
- Illustrates exponential improvements in GPU price-performance, relevant to understanding AI scaling dynamics.
- Provides empirical data supporting analysis of compute trends driving AI capability growth.
- Focuses specifically on hardware relevant to large AI models, not general consumer GPUs.
- Useful for forecasting future compute availability and its implications for AI development timelines.
Review
This source tracks GPU price-performance: how many floating-point operations per second can be achieved per dollar of hardware investment. By covering GPUs actually used to train large AI models (those with over 1 billion parameters), the dataset offers insight into the evolving landscape of AI computational infrastructure. The methodology is notably careful in acknowledging that raw hardware metrics tell only part of the story: software and algorithmic advances can deliver substantial performance improvements independent of hardware upgrades. It also reports theoretical peak performance at 32-bit precision while noting that real-world performance may be higher, since modern AI training typically uses faster, lower-precision formats. The result is a balanced, forward-looking perspective on AI computational capabilities.
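The metric the dataset reports can be illustrated with a short sketch. The arithmetic below is a hedged reconstruction of the general approach described on the page (peak FP32 FLOP/s divided by an inflation-adjusted purchase price); the function name, the CPI-ratio deflation method, and every number are illustrative assumptions, not values or code from Our World in Data or Epoch AI.

```python
# Sketch: FLOP/s per inflation-adjusted dollar.
# All figures are hypothetical placeholders, not values from the dataset.

def flops_per_dollar(peak_flops_fp32: float, price_usd: float,
                     cpi_at_release: float, cpi_base_year: float) -> float:
    """Peak FP32 FLOP/s per dollar, with the purchase price inflated
    to base-year (e.g. 2024) dollars via a simple CPI ratio."""
    price_in_base_dollars = price_usd * (cpi_base_year / cpi_at_release)
    return peak_flops_fp32 / price_in_base_dollars

# Hypothetical GPU: 30 TFLOP/s FP32 peak, $1,500 at release,
# CPI of 260 at release vs. 313 in the base year (both made up).
metric = flops_per_dollar(30e12, 1500.0,
                          cpi_at_release=260.0, cpi_base_year=313.0)
print(f"{metric:.3e} FLOP/s per base-year dollar")
```

Note that, as the source cautions, this captures purchase price only; electricity, cooling, and infrastructure costs are excluded, and real training throughput at lower precision can exceed the FP32 peak used here.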
Cached Content Preview
HTTP 200 · Fetched Apr 9, 2026 · 11 KB
GPU computational performance per dollar - Our World in Data
What you should know about this indicator
This measures computing power per dollar—specifically, how many calculations per second you get for each inflation-adjusted dollar when buying a GPU.
GPUs are specialized chips that can perform many calculations simultaneously, making them the primary hardware for training AI systems. The data includes only GPUs used to train major AI models (those with over 1 billion parameters) or specifically designed for machine learning.
The chart shows theoretical peak performance using a standard precision format (32-bit precision). Modern AI training typically uses lower precision calculations that are faster, so real-world performance may be higher than shown here.
These figures reflect purchase prices only (adjusted to 2024 dollars). Running costs—electricity, cooling, and infrastructure—are not included here.
Raw hardware improvements tell only part of the story. Software and algorithmic advances often deliver substantial speedups, independent of better hardware.
GPU computational performance per dollar
Hardware computational performance shown in floating-point operations per second (FLOP/s) per US dollar, adjusted for inflation.
Source: Epoch AI (2025); U.S. Bureau of Labor Statistics (2026) – with major processing by Our World in Data
Last updated: October 10, 2025
Next expected update: October 2026
Unit: FLOP/s/$
Sources and processing
This data is based on the following sources
Epoch AI – Machine Learning Hardware
This dataset contains detailed information about machine learning hardware, including GPUs, NPUs, and other specialized AI chips. It includes technical specifications such as computational performance across different precision levels (FP64, FP32, FP16, INT8, etc.), memory configurations, release dates, pricing, and manufacturing details.
Retrieved on February 27, 2026 from https://epoch.ai/data/machine-learning-hardware
Citation: This is the citation of the original data obtained from the source, prior to any processing or adaptation by Our World in Data. To cite data downloaded from this page, please use the suggested citation given in Reuse This Work below.
Epoch AI, 'Data on Machine Learning Hardware'. Published online at epoch.ai. Retrieved from 'https://epoch.ai/data/machine-learning-hardware' [online resource].
... (truncated, 11 KB total)
Resource ID: 84cf97372586911e | Stable ID: sid_9pQnLcNOog