The AI Tax: How Nvidia Could Drain Every Fortune 500’s Wallet by 2035
Silicon Valley's new toll booth is open for business—and Jensen Huang's grinning behind the counter. Nvidia's GPUs have become the de facto currency of the AI gold rush, and every corporate giant is paying the vig.
The hardware hustle
Forget software-as-a-service—we're living through hardware-as-a-shakedown. Those H100 chips aren't just expensive; they're corporate crack. Once you've trained your models on Nvidia's silicon, good luck kicking the habit when the next-gen drops.
The 2035 payout
Wall Street analysts whisper about 'accelerator addiction' in boardrooms. By 2035, the collective AI infrastructure spend could swell Nvidia's revenue to the size of a Fortune 500 company in its own right, which is ironic for a firm busy turning the Fortune 500 into tributary states.
Here's the kicker: while CFOs weep over capital expenditures, Huang just announced the B200 like a dealer unveiling a new product line. The house always wins—especially when it owns the only casino in town.
Image source: Getty Images.
The path to doubling
The math is simpler than skeptics think. Nvidia's revenue hit $130.5 billion in fiscal 2025, more than doubling from the prior year. Wall Street already expects $254 billion by fiscal 2027. But that's just the beginning. The real explosion could come if AGI transforms every industry.
A compound growth rate of 19% from 2027 to 2035 gets you to $1 trillion in revenue. At 45% net margin and an earnings multiple of 20, reasonable for critical infrastructure, that's a $9 trillion market cap. With 24.39 billion shares outstanding, that translates to $369 per share, a double from today.
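For anyone who wants to check that arithmetic, here is a minimal back-of-the-envelope sketch in Python. The growth rate, margin, multiple, and share count are the assumptions stated above, not forecasts, and because the script doesn't round revenue down to an even $1 trillion it lands a few dollars above the quoted $369.

```python
# Back-of-the-envelope valuation sketch using the assumptions stated above.
# Every input here is an assumption from the article, not a forecast.

FY2027_REVENUE = 254e9        # Wall Street's fiscal 2027 revenue estimate ($)
CAGR = 0.19                   # assumed compound annual growth, 2027-2035
YEARS = 8                     # fiscal 2027 -> fiscal 2035
NET_MARGIN = 0.45             # assumed net margin
PE_MULTIPLE = 20              # assumed earnings multiple
SHARES_OUTSTANDING = 24.39e9  # shares outstanding

revenue_2035 = FY2027_REVENUE * (1 + CAGR) ** YEARS   # roughly $1 trillion
net_income = revenue_2035 * NET_MARGIN                # roughly $460 billion
market_cap = net_income * PE_MULTIPLE                 # roughly $9 trillion
price_per_share = market_cap / SHARES_OUTSTANDING     # roughly $370, about a double

print(f"2035 revenue:    ${revenue_2035 / 1e12:.2f} trillion")
print(f"Market cap:      ${market_cap / 1e12:.1f} trillion")
print(f"Price per share: ${price_per_share:.0f}")
```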
Bull case? If Nvidia captures 50% of a $5 trillion AGI computing market, the stock could reach $615. Not the 20x return some imagine, but doubling your money as a company grows from $4 trillion to $9 trillion is hardly settling.
Fantasy? Look at what has already been committed: $100 billion for AI data centers. Saudi Arabia, the United Arab Emirates, and Japan pledged $90 billion for sovereign computing. OpenAI alone spends $7 billion annually on Nvidia hardware. The smart money isn't betting on chatbots. It's prepaying for infrastructure in the AGI economy.
The trillion-dollar question
The entire investment case hangs on one question: Will AGI arrive by 2030? Answering it correctly is the difference between owning a stock that merely grows with the semiconductor industry and one that doubles as it becomes the world's first $9 trillion company. This chasm exists because today's valuation models price only the present: GPU sales for AI training. They fail to price in the explosion that happens when AGI becomes the engine of every industry.
Imagine pharmaceutical companies simulating every possible drug interaction, enterprises deploying millions of autonomous AI workers, and billions of users adopting personal AI companions. The first signals are already here. One major enterprise software company reports more than 8,000 customers using its AI agents, yet the infrastructure required to power this future will dwarf today's entire market.
The CUDA moat
Nvidia's competitive moat wasn't dug overnight. It was built over 15 years by an army of 2 million developers, making its Compute Unified Device Architecture (CUDA) software the native language of AI. The cost to leave this ecosystem is a multibillion-dollar tax paid in retraining models (a $100 million expense for a GPT-4 class model), porting years of code, and accepting major performance penalties.
The proof is that even the world's richest tech giants, after spending billions building their own custom chips, still buy Nvidia's GPUs and pay the toll. This powerful lock-in is why every serious contender in the race to AGI -- OpenAI, Anthropic, xAI -- has aligned with Nvidia. They aren't just choosing a vendor; they're choosing the only battle-tested stack.
But even the strongest fortress can have a vulnerability. The moat is deepest around the specialized work of training AI. In the high-volume world of AI inference, rival armies of chipmakers and cloud giants are gathering at the gates, building alternative routes that bypass the CUDA toll.
The risk of thinking small
Yes, a Taiwan invasion could crater the stock 70%. Custom application-specific integrated circuits (ASICs) might compress margins. AGI could arrive late. But here's the real calculation:
| Factor | Bull case | Bear case |
| --- | --- | --- |
| Market role | Planetary-scale intelligence utility | Excellent but standard semiconductor leader |
| Margin | 70% or higher gross margin, sustained | Compresses to 30% to 50% because of competition |
| AGI timeline | Arrives by 2030, driving exponential demand | Slips to 2040 or later, or never arrives in this form |
| CUDA moat | Remains the dominant standard | Open source and ASICs create viable alternatives |
| Market share | Captures 30% to 50% of the multitrillion-dollar AGI market | Loses share to AMD, hyperscalers, and custom chips |
Nvidia isn't just riding the AI cycle. It's laying the rails for AGI itself. When every company needs AI the way it needs electricity, when every decision runs through neural networks, when intelligence becomes a utility, that's when early investors realize they didn't buy a semiconductor stock. They bought the toll booth on the cognitive infrastructure of the 21st century.