Alibaba and China Telecom announced Tuesday they’re building a computing facility in southern China that will use chips designed by the e-commerce company. It’s part of China’s broader push to develop its own tech infrastructure.
The center will house 10,000 of Alibaba's Zhenwu semiconductors, which the company designed for AI work. The chips can run AI models with hundreds of billions of parameters. China Telecom will own and operate the facility.
That’s big. It shows Chinese tech companies are getting serious about making their own chip designs, especially as Beijing wants less dependence on foreign technology.
Washington has spent the last few years blocking China from buying certain semiconductor equipment and chips. That includes AI processors from Nvidia. The restrictions have basically forced Chinese companies to work faster on building their own alternatives.
Alibaba makes its chips through a unit called T-Head. The Hangzhou-based company is also one of China's biggest cloud computing players. It designs chips, runs data centers, and builds AI models that get sold through its cloud business. Cloud has been growing faster than most of its other divisions.
CEO Eddie Wu said Tuesday he’ll head up a new technology committee. Zhou Jingren, the company’s chief AI architect, will be on it. So will Li Feifei, who runs technology for Alibaba Cloud, and Wu Zeming, the group’s chief technology officer.
China’s been building more large-scale data centers with domestic tech. Last month, a computing system using Huawei’s Ascend 910C AI chips went live.
U.S. tech giants are expected to drop around $700 billion this year on AI infrastructure. Chinese companies are taking a different route. They’re spending less and focusing on AI applications they think will actually make money and deliver returns.
The data center is in Shaoguan, which is in Guangdong province. China Telecom and Alibaba said they’re planning to expand it to 100,000 chips. The computing power could be used for healthcare, advanced materials, and other industries. Alibaba’s stock (NYSE: BABA) was up 4.68 percent on Wednesday.
SMIC and Hua Hong Semiconductor both hit sales highs in 2025. AI demand and U.S. export restrictions that push China to build domestic tech faster have fueled the growth.
SMIC, China’s biggest chipmaker, grew revenue 16 percent to $9.3 billion. Analysts think it’ll hit $11 billion in 2026. Hua Hong had its best fourth quarter ever with sales of $659.9 million and expects steady growth into early 2026.
Smaller Chinese firms also reported record numbers last year. ChangXin Memory Technologies, which is privately held, saw revenue jump 130 percent to $8 billion. Moore Threads Technology Co., a GPU design company, saw its 2025 revenue rise between 231 and 247 percent.
The homegrown approach is paying off in the Chinese market. Nvidia, the California chipmaker that's now the world's most valuable company, used to dominate AI chip sales in China. Not anymore.
Chinese GPU and AI accelerator makers grabbed 41 percent of the local market in 2025, shipping 1.65 million cards. Nvidia still leads with 55 percent and 2.2 million cards, but that’s a big drop from where it was before.
Uber is expanding its deal with Amazon Web Services to run more of its platform on Amazon's own AI and compute chips. That includes more use of Graviton, AWS's Arm-based processors, and a trial of Trainium, its AI training chip that's positioned as an Nvidia competitor.
It's a shift in Uber's cloud strategy. In 2023, the company said it would move infrastructure to Google Cloud and Oracle, but now it's leaning more on AWS, especially for AI workloads. Amazon is using its custom silicon to win over big customers who want alternatives to traditional chip providers.
The deal shows how intense the competition is in AI infrastructure. AWS is using its own hardware to win enterprise business. Uber joins Anthropic, OpenAI, and Apple in using more AWS chips as AI compute demand keeps growing.
Even with record results and strong forecasts, Nvidia’s stock (NASDAQ: NVDA) has been stuck in place for over eight months. There’s no single reason holding back the AI chipmaker. It’s more like a bunch of things at once.
Geopolitics, inflation that won’t quit, and questions about AI’s future have all weighed on Nvidia’s stock. Some experienced investors are starting to lose confidence.
Hedge funds are getting in on it, too. They sold stocks last month at the fastest rate in 13 years, according to Goldman Sachs data, and Nvidia was one of the big tech names that got hit. Fund managers also shorted U.S. exchange-traded funds, a bearish bet that stock prices will drop. Historically, moves like that haven't boded well for the market.
However, Bank of America analyst Vivek Arya just raised the firm’s global semiconductor forecast for 2026 to $1.3 trillion. That’s $300 billion higher than what the bank predicted just four months ago.
Arya said Nvidia and Broadcom are still the main drivers behind AI spending.