Longterm Wiki
Fact f_ZencK2XFDA

OpenAI — Model Parameters: 175 billion

Verdict: confirmed (99%)
1 check · 4/16/2026

The source text explicitly confirms that GPT-3 has 175 billion parameters. The abstract states "we train GPT-3, an autoregressive language model with 175 billion parameters," and Table 2.1 lists the exact parameter count as 175.0B. The arXiv submission date (2005.14165, submitted May 28, 2020) aligns with the claimed "as of 2020-06" timeframe. The stored value in the claim (175,000,000,000) matches the source's specification of 175 billion parameters.
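The two consistency checks described above (the stored value matching "175 billion", and the submission date falling within the "as of" window) can be sketched in a few lines. This is a hypothetical illustration; the function names and logic are assumptions, not the wiki's actual verification pipeline.

```python
# Hypothetical sketch of the two consistency checks; not the wiki's real pipeline.

def parse_arxiv_month(arxiv_id: str) -> tuple[int, int]:
    """Derive (year, month) from an arXiv ID like '2005.14165' (YYMM.number)."""
    yymm = arxiv_id.split(".")[0]
    return 2000 + int(yymm[:2]), int(yymm[2:])

def billions(text: str) -> int:
    """Normalize a phrase like '175 billion' to an integer count."""
    number = text.split()[0]
    return int(float(number) * 1_000_000_000)

# Check 1: the stored value matches the source's "175 billion".
assert billions("175 billion") == 175_000_000_000

# Check 2: the submission month (May 2020) is on or before the claim's
# "as of" month (June 2020).
year, month = parse_arxiv_month("2005.14165")
assert (year, month) <= (2020, 6)
```

Note that arXiv IDs issued since 2007 encode the submission year and month in their first four digits, which is what makes the date check possible from the identifier alone.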

Our claim

Subject: OpenAI
Property: Model Parameters
Value: 175 billion
As Of: June 2020
Notes: GPT-3 parameter count.

Source evidence

1 src · 1 check
confirmed (99%) · primary · Haiku 4.5 · 4/16/2026


Case № f_ZencK2XFDA · Filed 4/16/2026 · Confidence 99%