Analysis. The shockwave that followed the release of DeepSeek’s AI models has led many to doubt the assumption that spending more money on expensive large-scale GPU infrastructure will deliver the best results.
DeepSeek's apparent success calls into question the assumption that billions of dollars must be spent on datacenter infrastructure to build ever larger and more complex models, if a Chinese lab can achieve similar results with a limited supply of older hardware. Nvidia, which has enjoyed record profits from demand for its GPU accelerators for AI, lost almost $600 billion in market value in a single day.
The hysteria stems from growing concern that ever more money is being poured into AI development and the infrastructure to support it, with little return to show so far.
The initial panic may have been unfounded: the freefall in US tech shares was soon arrested, and experts pointed out that DeepSeek appears to have used output from existing models developed by Anthropic and OpenAI in its training. The China-based company's claims that DeepSeek is comparable to existing models, and that it cost less than $6 million to train, are also unverified. Manoj Sukumaran, Principal Analyst for Datacenter IT at Omdia, told The Register that such innovations are essential to making GenAI accessible to more users and would instead accelerate adoption, adding that he believes the massive AI buildouts will continue.
DeepSeek isn't the only Chinese LLM maker OpenAI and friends need to be concerned about. Right, Alibaba?
READ MORE.
However, Taiwan-based TrendForce predicts that organizations will conduct more rigorous assessments of AI infrastructure investments and focus on adopting efficient models to reduce their reliance on GPUs.
The analyst also envisages growth in the adoption of infrastructure using custom ASICs (application-specific integrated circuits) to lower deployment costs, and says demand for GPU-based products could see "notable changes" from 2025 onward. "DeepSeek has adopted model distillation techniques to compress large models, improve inference speed, and reduce hardware dependencies," TrendForce reports.
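For readers unfamiliar with the term, here is a minimal sketch of what model distillation generally involves: a small "student" model is trained to mimic the output distribution of a larger "teacher" model, which is how a compact model can approach a bigger one's quality at a fraction of the inference cost. The layer sizes, temperature, and loss weighting below are illustrative assumptions, not DeepSeek's actual recipe.

```python
# Minimal knowledge-distillation sketch (illustrative, not DeepSeek's method).
import torch
import torch.nn as nn
import torch.nn.functional as F

# A large "teacher" (frozen) and a much smaller "student" to be trained.
teacher = nn.Sequential(nn.Linear(128, 1024), nn.ReLU(), nn.Linear(1024, 10)).eval()
student = nn.Sequential(nn.Linear(128, 64), nn.ReLU(), nn.Linear(64, 10))
optimizer = torch.optim.Adam(student.parameters(), lr=1e-3)

temperature = 2.0   # softens the teacher's output distribution
alpha = 0.5         # balance between distillation loss and hard-label loss

def distillation_step(x, labels):
    with torch.no_grad():
        teacher_logits = teacher(x)
    student_logits = student(x)
    # KL divergence between the softened student and teacher distributions
    soft_loss = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    ) * (temperature ** 2)
    # Standard cross-entropy against the ground-truth labels
    hard_loss = F.cross_entropy(student_logits, labels)
    loss = alpha * soft_loss + (1 - alpha) * hard_loss
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# One illustrative training step on random data
loss = distillation_step(torch.randn(32, 128), torch.randint(0, 10, (32,)))
```

The temperature term softens the teacher's probabilities so the student learns how the teacher ranks the wrong answers as well as the right one, which is what makes the compressed model cheaper to serve without giving up much accuracy.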
- Microsoft launches DeepSeek R1 on Azure AI Foundry and GitHub
- DeepSeek sparks doubt and intrigue in the tech world
- Microsoft touts 'significant investments' in AI to counter DeepSeek
- Guess whose database was left wide open exposing chat logs and API keys? DeepSeek.
This week, IBM CEO Arvind Krishna said he found in DeepSeek some confirmation of his company's approach to AI. Krishna made the claim during IBM's recent earnings call.
“We see as much as 30 times reduction in inference costs using these approaches. As other people begin to follow that route, we think that this is incredibly good for our enterprise clients. And we will certainly take advantage of that in our business, but I believe that others will also follow that route.”
According to Gartner's note on the implications of DeepSeek, efficient scaling of AI will matter more in the future than how much computing power can be assembled.
But it said the Chinese AI does not set a new standard for model performance, as it often matches but does not exceed existing models. In terms of infrastructure, Gartner stated, "it's not proof that scaling models via additional compute and data doesn't matter, but that it pays off to scale a more efficient model."
The takeaway is that DeepSeek won't suddenly lead to a drop in demand for AI infrastructure, so Nvidia investors, and those pumping money into datacenters, can rest a little easier. It is not the harbinger of the AI bubble bursting that some have predicted.
It does, however, serve as a reminder that throwing money and resources at a problem may not be the best solution, and that there are smarter ways of doing things. As Neil Roseman, CEO at security firm Invicti, put it:
"DeepSeek's superior price-to-performance ratio serves as a reality check for the AI industry, particularly US companies and their venture capital backers. While companies make massive bets on AI, current results don't justify these investments. Success will come from efficient, focused development addressing genuine needs." ®