Alibaba Deploys 10,000 Local AI Chips as China’s Big Tech Accelerates Nvidia Exit Strategy

Alibaba has launched a major offensive in the semiconductor independence race, deploying 10,000 of its proprietary 'Zhenwu' AI chips through a new computing center operated by China Telecom. The deployment signals China's accelerating pivot away from foreign semiconductors, and it comes as U.S. restrictions block access to critical AI processors such as Nvidia's. It also shows that Chinese tech giants are no longer just designing alternatives but executing at scale: Alibaba's T-head unit is producing chips capable of running AI models with hundreds of billions of parameters, a direct challenge to Western AI infrastructure dominance.
Chinese chip companies have been posting record revenues
SMIC and Hua Hong Semiconductor both hit sales highs in 2025. The growth has been fueled by AI demand and by U.S. export restrictions that are pushing China to build domestic technology faster.
SMIC, China’s biggest chipmaker, grew revenue 16 percent to $9.3 billion. Analysts think it’ll hit $11 billion in 2026. Hua Hong had its best fourth quarter ever with sales of $659.9 million and expects steady growth into early 2026.
Smaller Chinese firms also reported record numbers last year. ChangXin Memory Technologies, which is privately held, saw revenue jump 130 percent to $8 billion. Moore Threads Technology Co., a GPU design company, saw its 2025 revenue rise somewhere between 231 and 247 percent.
The homegrown approach is paying off in the Chinese market. Nvidia, the California company that’s now the world’s most valuable, used to dominate AI chip sales in China. Not anymore.
Chinese GPU and AI accelerator makers grabbed 41 percent of the local market in 2025, shipping 1.65 million cards. Nvidia still leads with 55 percent and 2.2 million cards, but that’s a big drop from where it was before.
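As a quick sanity check on those figures, the share and shipment numbers imply a consistent total market size. This is an illustrative back-of-the-envelope calculation, not from the article itself:

```python
# Consistency check on the reported 2025 China GPU market figures
# (shipments in millions of cards, shares as reported in the article).
chinese_cards = 1.65   # Chinese GPU/accelerator makers, 41% share
nvidia_cards = 2.2     # Nvidia, 55% share

# Implied total market size, back-solved from each share figure
total_from_chinese = chinese_cards / 0.41
total_from_nvidia = nvidia_cards / 0.55

print(round(total_from_chinese, 2))  # ~4.02 million cards
print(round(total_from_nvidia, 2))   # 4.0 million cards
```

Both figures back-solve to a total market of roughly 4 million cards, so the reported shares and shipment counts line up.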
It’s been a rough year so far for Nvidia
Uber is expanding its deal with Amazon Web Services to run more of its platform on Amazon's own AI and compute chips. That includes more use of Graviton, AWS's Arm-based processors, and a trial of Trainium, its AI training chip positioned as an Nvidia competitor.
It's a shift in Uber's cloud strategy. In 2023, the company said it would move infrastructure to Google Cloud and Oracle, but it's now leaning more on AWS, especially for AI workloads. Amazon is using its custom silicon to win over big customers that want alternatives to traditional chip providers.
The deal shows how intense the competition is in AI infrastructure. AWS is using its own hardware to win enterprise business. Uber joins Anthropic, OpenAI, and Apple in using more AWS chips as AI compute demand keeps growing.
Even with record results and strong forecasts, Nvidia’s stock (NASDAQ: NVDA) has been stuck in place for over eight months. There’s no single reason holding back the AI chipmaker. It’s more like a bunch of things at once.
Geopolitics, inflation that won’t quit, and questions about AI’s future have all weighed on Nvidia’s stock. Some experienced investors are starting to lose confidence.
Hedge funds are getting in on it, too. They sold stocks last month at the fastest rate in 13 years, according to Goldman Sachs data, and Nvidia was one of the big tech names that got hit. Fund managers also shorted U.S. exchange-traded funds, a bearish bet that stock prices will fall. Historically, moves like that don't bode well for the market.
However, Bank of America analyst Vivek Arya just raised the firm’s global semiconductor forecast for 2026 to $1.3 trillion. That’s $300 billion higher than what the bank predicted just four months ago.
Arya said Nvidia and Broadcom are still the main drivers behind AI spending.