The AI Chip Cold War - Part V: China's Answer: DeepSeek, Local Chips, and Aiming for AI Independence
Anna's Deep Dives
Just facts, you think for yourself
A. DeepSeek Shakes Things Up
A Chinese AI startup called DeepSeek, founded in 2023, has made a striking entrance. Its DeepSeek-R1 model, released in January 2025, quickly gained worldwide attention. The model reportedly cost less than $6 million to develop, a sharp contrast with Western companies that have spent hundreds of millions of dollars on comparable models.
DeepSeek-R1 became the most downloaded free app in the U.S. within weeks of its release, a sign of how large its impact could be. Stock markets reacted strongly: Nvidia's stock dropped nearly 17%, wiping out roughly $600 billion in market value, with much of the decline attributed to DeepSeek's debut.
DeepSeek's strength is not limited to a single model. Its DeepSeek-Prover-V2 model is strong at mathematical reasoning: it solved 49 of the 658 problems in the difficult PutnamBench and handled 6 of 15 problems from recent AIME math contests. In healthcare, DeepSeek-V3 performed well at diagnosis, scoring 4.70 out of 5 for diagnostic accuracy and 4.48 out of 5 for treatment suggestions across 125 patient cases, results on par with well-known models like GPT-4o.
B. Smarter, Not Harder? DeepSeek's Efficient AI
DeepSeek's approach shows striking efficiency. The company said it trained its powerful DeepSeek-V3 model for only about $6 million, using a comparatively small cluster of 2,048 Nvidia H800 GPUs. This contrasts sharply with Western AI labs: OpenAI, for example, reportedly spent $80 million to $100 million to train its GPT-4 model, using about 16,000 advanced Nvidia H100 GPUs.
The DeepSeek-R1 model uses a Mixture-of-Experts (MoE) design, which is a big reason for its efficiency. The MoE system activates only a small fraction – about 37 billion – of the model's 671 billion total parameters for any given task, which sharply reduces the computation each request needs. DeepSeek reported strong benchmark results after training the model on a huge 14.8 trillion tokens of data, all reportedly for about $6 million. Older approaches typically need far more money at this scale: against competitors that may use more than 16,000 GPUs, DeepSeek's entire training run reportedly came in at roughly $5.6 million.
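To make the routing idea concrete, here is a minimal, hypothetical sketch of top-k expert routing in plain Python/NumPy. It is not DeepSeek's actual code, and every name and size below is invented for illustration; the point is simply that only a few experts' weight matrices touch each token, which is why the active parameter count can be far smaller than the total.

```python
import numpy as np

# Toy Mixture-of-Experts (MoE) layer: many experts exist, but each token is
# routed to only its top_k experts, so only a fraction of the weights run.
# Sizes are hypothetical and far smaller than any production model.
rng = np.random.default_rng(0)
d_model, n_experts, top_k = 64, 16, 2

gate_w = rng.standard_normal((d_model, n_experts)) * 0.02   # router weights
experts = [rng.standard_normal((d_model, d_model)) * 0.02   # one weight matrix per expert
           for _ in range(n_experts)]

def moe_layer(x):
    """x: (n_tokens, d_model). Each token is processed by only top_k experts."""
    scores = x @ gate_w                                      # (n_tokens, n_experts)
    chosen = np.argsort(scores, axis=1)[:, -top_k:]          # indices of the top_k experts
    picked = np.take_along_axis(scores, chosen, axis=1)
    weights = np.exp(picked - picked.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)            # softmax over the chosen experts

    out = np.zeros_like(x)
    for tok in range(x.shape[0]):
        for slot in range(top_k):
            e = chosen[tok, slot]
            out[tok] += weights[tok, slot] * (x[tok] @ experts[e])
    return out

y = moe_layer(rng.standard_normal((8, d_model)))
print(y.shape)                                               # (8, 64)
print(f"experts active per token: {top_k}/{n_experts}")      # 2 of 16 here; ~37B of 671B parameters in DeepSeek's case
```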
The running costs also differ sharply. DeepSeek's model reportedly costs $2.19 per million output tokens, while OpenAI's comparable model can cost around $60 for the same volume, almost thirty times more. DeepSeek uses techniques such as DualPipe pipeline scheduling and FP8 mixed-precision training, which make better use of hardware during training and inference. However, some industry experts remain skeptical, questioning whether DeepSeek's reported costs are complete and suggesting the true figures may be much higher.
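As a quick arithmetic check on those reported prices (the figures below are simply the ones cited above, not official price lists):

```python
# Reported inference prices per million output tokens, as cited above.
deepseek_per_m = 2.19   # USD, reported figure for DeepSeek
openai_per_m = 60.00    # USD, reported figure for the OpenAI comparison

print(f"price ratio: {openai_per_m / deepseek_per_m:.1f}x")   # ~27.4x, i.e. "almost thirty times"

# What generating one billion output tokens would cost at each reported price:
tokens = 1_000_000_000
print(f"DeepSeek: ${deepseek_per_m * tokens / 1e6:,.0f}")     # $2,190
print(f"OpenAI:   ${openai_per_m * tokens / 1e6:,.0f}")       # $60,000
```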
C. DeepSeek vs. Big Spenders: A New Way for AI?
DeepSeek's success with cheap, highly efficient AI models goes against the usual way things are done in tech. It makes people question the common idea that AI development needs a lot of money. The company's work suggests that top AI results might be possible without the huge spending on hardware that people used to think was necessary. This could help countries and groups with less money. They might now see a way to create their own AI.
The big, established companies work on a different level. For instance, Meta (Facebook's parent company) recently said it plans to spend a massive $60 billion to $65 billion on its AI systems. This shows the kind of big spending that DeepSeek is trying to change. DeepSeek's R1 model showed that using fewer, or even older, Nvidia chips could still give results similar to top AI systems. This again shows how it's challenging the idea that you need the newest hardware.
D. DeepSeek's Open Approach: Good for All or Smart Business?
DeepSeek has made some of its models open source. This approach seems to encourage people around the world to work together and be more open about AI development. It lets developers everywhere use and improve advanced AI without paying huge costs upfront. This could make AI innovation available to more people.
This open-source idea generally fits with Europe’s efforts for better data privacy and more local tech control. It could also help local tech groups worldwide by giving them access to basic AI models. But the open approach has its critics and possible problems. It brings up worries about spreading biases that might be in the AI's training data. The global community needs to constantly and carefully test and watch these models.
Security and data privacy concerns around powerful AI models remain. DeepSeek's technology has reportedly been banned for government workers in several countries, including the U.S. and Australia, because of these worries. Some reports also say DeepSeek's chatbot, despite its strengths, failed accuracy tests at a high rate (reportedly 83% of the time in some evaluations), showing how hard it still is to make AI reliable, even with new designs. It also remains an open question whether DeepSeek's openness is driven purely by a desire to foster innovation or is a strategic business move to build worldwide adoption.
E. More Than DeepSeek: China's Own AI Chip Makers Are Growing
While DeepSeek draws most of the attention for its AI models, a broader group of Chinese AI chip makers is also expanding. In 2025, China's AI chip landscape includes big tech companies and many ambitious startups. Important players include Huawei, Alibaba (through its T-Head chip unit), and Baidu. Each is developing its own AI chips, aiming to strengthen China's domestic capabilities and reduce reliance on foreign suppliers, especially with U.S. export controls in place.
Many startups are also making notable progress. Cambricon Technologies, one of China's first AI chip companies, recently raised CNY5 billion (about USD693 million) to develop new chips for large language models. Zhipu AI, another well-known AI startup, is valued at about CNY20 billion (around USD2.7 billion), a sign of investor confidence. Companies like Moore Threads and Biren Technology are working to catch up to Western leaders like Nvidia; some estimates put the technology gap at roughly a factor of ten in certain performance metrics. They are focused on building powerful GPUs for the Chinese market.
F. Huawei, Alibaba, and Other New Chinese Chip Makers
Huawei is now a key player in China's effort to build its own AI chips. Its Ascend series is getting more advanced: the Ascend 910B can deliver up to 400 TFLOPS, 80 TFLOPS more than earlier versions, and the company also has the Ascend 910D in its lineup. An important new chip is Huawei's Ascend 910C, which reportedly combines the compute of two 910B chips and aims to match the performance of Nvidia's H100 GPU. Shipments of the Ascend 910C were planned to start in May 2025.
Showing how companies are working together, DeepSeek said it released an AI model running on Huawei's Ascend 910B chip, and that this setup was 97.3% cheaper to run than systems built on OpenAI's GPT-4. Huawei's CloudMatrix 384 system also shows substantial computing power: it delivers 300 PFLOPS, reportedly about 1.7 times the 180 PFLOPS of Nvidia's GB200 NVL72 system, and offers 49.2 TB of memory versus the Nvidia system's 13.8 TB.
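For reference, the reported CloudMatrix-vs-GB200 figures imply the following ratios; these are the vendor-reported numbers cited above, not independent benchmarks, and the small script is only an illustrative check:

```python
# Vendor-reported system figures as cited above.
cloudmatrix_384 = {"compute_pflops": 300, "memory_tb": 49.2}
gb200_nvl72 = {"compute_pflops": 180, "memory_tb": 13.8}

for metric, value in cloudmatrix_384.items():
    ratio = value / gb200_nvl72[metric]
    print(f"{metric}: {ratio:.1f}x")   # compute ~1.7x, memory ~3.6x
```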
Alibaba, through its T-Head chip unit, is another major local chip maker. Its Yitian 710 server CPU, released in 2021, has 128 Arm cores running at up to 3.2 GHz and scored 440 on the SPECint2017 benchmark, reportedly 20% better than top international processors at the time. Alibaba also built the Zhenyue 510 chip to cut latency by 30% in its large cloud computing services.
Baidu, the major Chinese search company, is also important with its Kunlunxin AI chips. Baidu started making chips in 2019, and by early 2025 it had deployed a cluster of 10,000 GPUs, reportedly the largest AI computing cluster of its kind in China. Its Kunlunxin chips run a range of AI workloads, including Baidu's self-driving robotaxi service.
G. China's AI Chips: Catching Up or Doing Things Differently?
China's own AI chip industry is clearly growing stronger. Local companies like Huawei are directly challenging Nvidia's long-held lead, at least in the Chinese market. Semiconductor Manufacturing International Corporation (SMIC), China's biggest chip foundry, is pushing to make more advanced chips, aiming for volume production at the 7-nanometer node and, later, at 5 nanometers. But U.S. export rules greatly slow China's ability to obtain the key foreign technology and advanced equipment needed for these leading-edge chips.
The mix of AI chips used in China is changing. Because of U.S. rules, the share of foreign-made AI chips in China is expected to fall from 63% in 2024 to about 41.5% in 2025. This drop reflects both the effect of the rules and the growing number of Chinese companies switching to local options. Big Chinese tech companies like Baidu and Tencent are now putting more money into local chip making, a clear push toward self-sufficiency. Local chip production is predicted to cover 35% of China's needs by the end of 2025.
Even small and medium-sized businesses (SMEs) in China are changing their AI plans. Many are now choosing to rent AI computing power from cloud companies instead of buying expensive AI hardware themselves. A recent study showed that 39.5% of Chinese SMEs doing AI work are choosing to rent for 6 to 12 months. They mostly use this rented power for training deep learning models and for science computing.
But Chinese companies haven't completely given up on foreign chips yet. Many still use foreign tech that they know works well. This is partly because there aren't clear government rules forcing them to use local products for everything. This continued use of foreign tech, even though it's decreasing, makes it harder for China to reach its big goal of being fully independent in chip technology.
