About a year ago, Xiao Li saw a flood of Nvidia chip listings on WeChat. A former real estate contractor, he had switched to AI infrastructure in 2023, becoming a data center project manager attracted by the promise of China’s AI craze.
At the time, traders in his circle boasted about securing shipments of high-performance Nvidia GPUs that were subject to US export restrictions. Many had been smuggled into Shenzhen through overseas channels. At the height of demand, a single Nvidia chip of the kind crucial for training AI models sold on the black market for up to 200,000 yuan ($28,000).
Now his WeChat feed and the industry group chats tell a completely different story. Traders have become more discreet, and prices are down. Meanwhile, two data center projects Li is familiar with are struggling to secure further funding from investors who anticipate poor returns, forcing project leaders to sell surplus GPUs. “It appears that everyone is selling, but few are buying,” he says.
Just a few months ago, data center construction was booming, fueled by both government and private investors. Yet many of the newly built facilities now sit empty. According to people who spoke with MIT Technology Review, including contractors, an executive at a GPU server company, and project managers, most of the companies running these data centers are struggling. The local Chinese outlets Jiazi Guangnian and 36Kr report that up to 80% of China’s newly built computing resources remain unused. Renting out GPUs to companies that need them for AI model training, the primary business model for this latest wave of data centers, was once considered a sure bet. But with the rise of DeepSeek and a sudden shift in the economics of AI, the industry is in trouble.
“The growing pain China’s AI industry is going through is largely a result of inexperienced players–corporations and local governments–jumping on the hype train, building facilities that aren’t optimal for today’s need,” says Jimmy Goodrich, senior advisor for technology to the RAND Corporation.
The result is that projects are failing, energy is being wasted, and data centers have become “distressed assets” whose investors are keen to unload them at below-market rates. Goodrich says the situation may eventually prompt government intervention: “The Chinese government is likely to step up, take over, and then hand them off to operators who are more capable.”
A chaotic building boom
China’s response when ChatGPT exploded onto the scene in late 2022 was swift. The central government declared AI infrastructure a national priority and urged local governments to accelerate the development of “smart computing centers,” a term coined for AI-focused data centers. According to the market research firm KZ Consulting, more than 500 new data center projects were announced in 2023 and 2024, from Inner Mongolia to Guangdong. The China Communications Industry Association Data Center Committee, a state-affiliated industry group, reports that at least 150 of the newly built data centers were completed and operational by the end of 2024. State-owned enterprises, publicly traded firms, and state-affiliated funds lined up to invest in them, hoping to position themselves as AI leaders. Local governments heavily promoted the projects in the hope that they would stimulate the economy and establish their regions as major AI hubs.
Even as these expensive construction projects pressed on, however, the Chinese frenzy for large language models was losing steam. In 2024, over 144 companies were registered with the Cyberspace Administration of China, the country’s central internet regulator, to develop their own LLMs. But according to the Economic Observer, a Chinese publication focused on the economy, only about 10% of them were still actively investing in large-scale model training by the end of the year.
China’s political system is highly centralized, and local government officials typically rise through regional appointments. As a result, many local leaders prioritize short-term projects that deliver quick, visible results, often to win favor with higher-ups, over long-term development. For years, officials have used large, high-profile projects to boost their careers.
The post-pandemic economic slowdown only intensified this dynamic. China’s once-dominant real estate sector, long the backbone of local economies, was slumping for the first time in decades, and officials scrambled to find alternative growth drivers. Meanwhile, the country’s once-high-flying internet industry had also entered a period of stagnation. In this vacuum, AI infrastructure became the new stimulus of choice.
Li says that, for investors, AI felt like a shot in the arm. “A lot of the money that used to flow into real estate is now going into AI data centers.” Fang Cunbao, a data center project manager based in Beijing, says some companies also saw AI infrastructure as a way to justify business expansion or boost their stock price. Among them were Lotus, an MSG maker, and Jinlun Technology, a textile company, hardly names one would associate with cutting-edge AI technology.
This gold-rush approach meant that the push for AI data centers was driven largely from the top down, often with little regard for actual demand or technical feasibility, according to Fang, Li, and multiple on-the-ground sources, who asked to remain anonymous for fear of political repercussions. They say many projects were led or funded by executives and investors with little expertise in AI infrastructure, and that in the rush to keep pace, many facilities were built hastily and fell short of industry standards.
Assembling large clusters of chips is a difficult task, Goodrich says, and very few companies or individuals can handle it at scale. “This is really cutting-edge computer engineering. I’d be surprised to find out that most of these smaller companies know how to do this. Many of the newly built data centers are hastily strung together and do not offer the stability that a company like DeepSeek wants.”
Making matters worse, project managers often relied on brokers and middlemen, some of whom exaggerated demand forecasts or manipulated procurement processes to pocket government subsidies.
By the end of 2024, the excitement that once surrounded China’s data center boom had waned, and renting out GPUs was no longer a reliably profitable business.
A DeepSeek reckoning
In theory, the business model for these data centers is simple: they make money by renting GPU clusters to companies that need computing power for AI training. In reality, however, securing clients is difficult. Only a handful of top tech companies in China now draw heavily on computing power to train AI models. Since the rise of DeepSeek and its open-source reasoning model R1, which matches the performance of OpenAI’s o1 but was built at a fraction of the cost, many smaller players have given up on pretraining their own models or have changed their strategy.
“DeepSeek is a moment of reckoning for the Chinese AI industry,” says Hancheng Cao, an assistant professor at Emory University.
The rise of reasoning models such as DeepSeek’s R1 and OpenAI’s o1 and o3 has also changed what businesses need from a data center. With these models, most of the computing power is consumed when responding to user queries, rather than during the training and creation of the model. This reasoning process often yields better answers but takes significantly longer, so hardware with low latency is essential. Ideally, data centers should be located near major tech hubs to minimize transmission delays and to ensure access to highly skilled operations and maintenance staff.
Because of this shift, many data centers built in rural, central, and western China, where electricity and land are cheaper, are losing their appeal to AI companies. In Zhengzhou, a city in Li’s home province of Henan, one newly built data center is even distributing free computing vouchers to local firms but still struggles to attract clients.
In addition, many of the data centers that have sprung up in recent years were optimized for pretraining workloads, which involve large, sustained computations over massive data sets, rather than for inference, the process of running trained models to respond to user inputs in real time. Hardware optimized for inference differs from the hardware traditionally used for large-scale AI training.
Nvidia GPUs such as the H100 and A100 were designed for massive data processing, prioritizing speed and memory capacity. As AI shifts toward real-time reasoning, the industry is seeking chips that are more efficient and responsive. Even a minor mismatch in infrastructure can render a data center unsuitable for the tasks clients need.
Under these circumstances, GPU rental prices have dropped to an all-time low. According to a recent report from the Chinese media outlet Zhineng Yongxian, an Nvidia server with eight GPUs now rents for 75,000 yuan a month, down from peaks of around 182,000 yuan. Fang says some data center owners would rather leave their facilities empty than risk losing even more money to high operating costs. Paradoxically, despite the country’s shortage of cutting-edge chips, there is now an oversupply of computational power, particularly in central and western China.
But not all brokers got into the business hoping to make money from the data centers themselves; many were more interested in capturing government benefits. According to Fang and several Chinese media reports, some operators exploit the green-energy sector by obtaining permits to generate and sell electricity, then reselling that power back to the grid at a premium instead of using it for AI workloads. In other cases, according to the local outlet Jiazi Guangnian, companies buy land for data center development to qualify for state-backed loans and credits, leaving the facilities unoccupied while still collecting state funding.
By 2024, Fang says, “no clear-headed contractor or broker would enter the business expecting direct profit. Everyone I spoke to is using the data center deal to get something else the government can offer.”
A necessary evil
Despite the underutilization, China’s central government is still pushing hard for AI infrastructure, convening an AI industry symposium in early 2025 to emphasize the importance of self-reliance.
Major Chinese tech companies are taking notice, making investments aligned with this national priority. Alibaba Group has announced plans to invest more than $50 billion in cloud computing infrastructure and AI hardware over the next three years, while ByteDance plans to invest about $20 billion in GPUs and data centers.
Meanwhile, companies in the US are doing the same: OpenAI, SoftBank, and Oracle, among other major tech firms, have joined forces on the Stargate initiative, which plans to invest $500 billion over the next four years to build advanced data centers and computing infrastructure. Given the AI race between the two countries, experts say China is unlikely to scale back its efforts. Goodrich, the RAND technology policy advisor, says infrastructure could determine the success of generative AI.
Goodrich says the Chinese central government is likely to see the underused data centers as a necessary evil on the way to developing a significant capability, a growing pain of sorts, and expects the state to consolidate the distressed assets and failed projects. “They see the end and not the means,” he says.
Demand for Nvidia chips remains strong, especially for the H20, a chip designed specifically for the Chinese market. One industry source, who asked not to be named because of company policy, confirmed that the H20 is now the most popular Nvidia chip, followed by the H100, whose sales remain steady in China even though it is officially restricted by US export controls. Some of the new demand comes from companies deploying their own versions of DeepSeek’s open-source models.
Many data centers across China now sit in limbo, built for a future that has not yet arrived, and it is unclear whether they will find a second life. For Fang Cunbao, DeepSeek’s success has been a moment of reckoning, casting doubt on his assumption that endless expansion of AI infrastructure would guarantee progress.
That assumption, he now realizes, was a myth. Fang quit the data center business at the beginning of this year. “The market is chaotic,” he says, adding that the early adopters made money but that now people are just chasing policy loopholes. He has decided to move into AI education instead.
The only thing standing between us and a world where AI is everywhere, he says, is a solid plan for deploying the technology.