OpenAI — Model Parameters: 175 billion
The source text explicitly confirms that GPT-3 has 175 billion parameters. The abstract states "we train GPT-3, an autoregressive language model with 175 billion parameters", and Table 2.1 gives the exact figure as 175.0B. The arXiv submission date (2005.14165, submitted May 28, 2020) is consistent with the claimed "as of 2020-06" timeframe. The value stored in the claim (175,000,000,000) matches the source's specification of 175 billion parameters.
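The value match described above can be sketched as a small check. This is a hypothetical illustration, not part of the record's actual verification pipeline; the `parse_billions` helper and the stored-value variable are assumptions introduced here.

```python
# Hypothetical check that the claim's stored value matches the
# "175.0B" figure reported in Table 2.1 of the GPT-3 paper.
stored_value = 175_000_000_000  # exact integer recorded in the claim

def parse_billions(text: str) -> int:
    """Convert a string like '175.0B' into an integer parameter count."""
    assert text.endswith("B"), "expected a 'B'-suffixed figure"
    return int(float(text[:-1]) * 1_000_000_000)

source_value = parse_billions("175.0B")  # figure as printed in the source
assert source_value == stored_value
print(source_value)  # 175000000000
```

The comparison is done on exact integers rather than floats, so "175.0B" and 175,000,000,000 agree without rounding concerns.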
Our claim
- Subject
- OpenAI
- Property
- Model Parameters
- Value
- 175 billion
- As Of
- June 2020
- Notes
- GPT-3 parameter count.
Source evidence
1 source · 1 check