DeepSeek - Wikipedia
Credibility Rating
3/5 (Good)
Good quality. Reputable source with community review or editorial standards, but less rigorous than peer-reviewed venues.
Rating inherited from publication venue: Wikipedia
Useful background reference on DeepSeek as a key actor in the current AI landscape; relevant to discussions of compute governance, export controls, and the geopolitics of frontier AI development.
Metadata
Importance: 52/100 · wiki page · reference
Summary
Wikipedia overview of DeepSeek, a Chinese AI company founded in 2023 that develops large language models and AI research. The company gained significant attention for releasing competitive open-weight models at lower training costs than Western counterparts, raising questions about compute efficiency and AI development dynamics.
Key Points
- DeepSeek is a Chinese AI lab founded in 2023, notable for developing frontier-level LLMs at reportedly low training costs
- Their models (DeepSeek-R1, V3, etc.) demonstrated competitive performance with leading Western models, challenging assumptions about compute requirements
- The company's releases prompted discussions about export controls, compute governance, and the effectiveness of chip restrictions on AI development
- DeepSeek publishes open-weight models, influencing debates around open vs. closed AI development and proliferation risks
- Their efficiency claims raised questions about the relationship between compute spend and AI capabilities
6 FactBase facts citing this source
| Entity | Property | Value | As Of |
|---|---|---|---|
| DeepSeek | Headquarters | Hangzhou, Zhejiang, China | — |
| DeepSeek | Country | China | — |
| DeepSeek | Wikipedia | https://en.wikipedia.org/wiki/DeepSeek | — |
| DeepSeek | Founded Date | Jul 2023 | — |
| DeepSeek | Legal Structure | Private subsidiary of High-Flyer | — |
| DeepSeek | Website | https://www.deepseek.com | — |
Cached Content Preview
HTTP 200 · Fetched Apr 7, 2026 · 57 KB
From Wikipedia, the free encyclopedia
Chinese artificial intelligence company
This article is about the company. For the chatbot, see DeepSeek (chatbot).
Hangzhou DeepSeek Artificial Intelligence Basic Technology Research Co., Ltd.

| Field | Value |
|---|---|
| Native name | 杭州深度求索人工智能基础技术研究有限公司 |
| Company type | Private |
| Industry | Information technology, Artificial intelligence |
| Founded | 17 July 2023 [1] |
| Founder | Liang Wenfeng |
| Headquarters | Hangzhou, Zhejiang, China |
| Key people | Liang Wenfeng (CEO) |
| Products | DeepSeek |
| Owner | High-Flyer |
| Number of employees | 160 (2025) [2] |
| Website | deepseek.com |
Hangzhou DeepSeek Artificial Intelligence Basic Technology Research Co., Ltd.,[3][4][5][a] doing business as DeepSeek,[b] is a Chinese artificial intelligence (AI) company that develops large language models (LLMs). Based in Hangzhou, Zhejiang, DeepSeek is owned and funded by the Chinese hedge fund High-Flyer. DeepSeek was founded in July 2023 by Liang Wenfeng, the co-founder of High-Flyer, who also serves as CEO of both companies.[7][8][9] The company launched an eponymous chatbot alongside its DeepSeek-R1 model in January 2025.
Released under the MIT License, DeepSeek-R1 provides responses comparable to those of other contemporary large language models, such as OpenAI's GPT-4 and o1.[10] Its training cost was reported to be significantly lower than that of other LLMs. The company claims that it trained its V3 model for US$6 million, far less than the US$100 million cost of OpenAI's GPT-4 in 2023,[11] and with approximately one-tenth the computing power consumed by Meta's comparable model, Llama 3.1.[11][12][13] DeepSeek's success against larger and more established rivals has been described as "upending AI".[14][15]
DeepSeek's models are described as "open weight", meaning the exact parameters are openly shared, although certain usage conditions differ from typical open-source software.[16][10] The company reportedly recruits AI researchers from top Chinese universities[14] and also hires from outside traditional computer science fields to broaden its models' knowledge and capabilities.[12]
DeepSeek significantly reduced training expenses for its R1 model by incorporating techniques such as mixture of experts (MoE) layers.[17] The company also trained its models during ongoing trade restrictions on AI chip exports to China, using weaker AI chips intended for export and employing fewer units overall.[13][18] Observers say this breakthrough sent "shock waves" through the industry.
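As background on why MoE layers cut training cost: a router selects only a few "expert" sub-networks per token, so each forward pass activates a small fraction of the model's total parameters. The sketch below is a toy illustration of that idea in NumPy, not DeepSeek's actual implementation; the names (`MoELayer`, `top_k`) and the use of plain linear maps as experts are illustrative assumptions.

```python
import numpy as np

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

class MoELayer:
    """Toy mixture-of-experts layer: a learned router picks the top-k
    experts per token, so only k of n_experts run on any forward pass."""

    def __init__(self, d_model, n_experts, top_k, seed=0):
        rng = np.random.default_rng(seed)
        self.router = rng.standard_normal((d_model, n_experts)) * 0.02
        # Each expert is a single linear map here; real experts are MLPs.
        self.experts = [rng.standard_normal((d_model, d_model)) * 0.02
                        for _ in range(n_experts)]
        self.top_k = top_k

    def forward(self, x):
        # x: (d_model,) hidden state for one token.
        logits = x @ self.router                 # router scores, (n_experts,)
        top = np.argsort(logits)[-self.top_k:]   # indices of selected experts
        gate = softmax(logits[top])              # renormalised gate weights
        # Only the selected experts compute: cost scales with top_k,
        # while parameter count scales with n_experts.
        y = sum(w * (x @ self.experts[i]) for w, i in zip(gate, top))
        return y, top

layer = MoELayer(d_model=8, n_experts=16, top_k=2)
y, chosen = layer.forward(np.ones(8))
print(y.shape, len(chosen))  # output has shape (8,); 2 experts were active
```

The key property is the cost/capacity split in the comment above: a model can hold many experts' worth of parameters while each token pays only for `top_k` of them, which is the efficiency lever the article attributes to DeepSeek's training setup.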
... (truncated, 57 KB total)
Resource ID: kb-ccf8fffd26810a51