Longterm Wiki

DeepSeek V3

DeepSeek V3 was released on December 26, 2024. It is a 671B-parameter mixture-of-experts model that activates 37B parameters per token, and it reached GPT-4o-level performance at a fraction of the usual training cost: reportedly just \$5.6M, using FP8 mixed precision on 2,048 H800 GPUs. It scored 88.5% on MMLU and 90.2% on MATH. The weights are released under the MIT license, and API pricing of \$0.27/\$1.10 per million input/output tokens makes it one of the cheapest frontier-class models.
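At the listed rates, estimating a request's API cost is simple per-million-token arithmetic. A minimal sketch (the function name and example token counts are illustrative, not from any DeepSeek SDK):

```python
def api_cost_usd(input_tokens, output_tokens,
                 input_rate=0.27, output_rate=1.10):
    """Estimate API cost in USD from per-million-token rates.

    Defaults are DeepSeek V3's listed rates: $0.27 per million
    input tokens and $1.10 per million output tokens.
    """
    return (input_tokens * input_rate + output_tokens * output_rate) / 1e6

# A hypothetical job with 2M input tokens and 0.5M output tokens:
print(round(api_cost_usd(2_000_000, 500_000), 2))  # → 1.09
```

Note that output tokens cost roughly 4x input tokens, so generation-heavy workloads dominate the bill.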
Developer: DeepSeek
Released: 2024-12-26
Context Window: 128K tokens

Modality: text

Capabilities: tool-use

Details

Model Family: DeepSeek
Generation: 3
Release Date: 2024-12-26
Parameters: 671B
Context Window: 128K tokens
Open Weight: Yes

Tags

deepseek, mixture-of-experts, open-weight, cost-efficient