OpenAI’s $100 Billion Nvidia Deal: Leased Chips, Upfront Equity, and Debt to Come
Tech giants place their biggest bet yet on AI infrastructure.
How the Money Actually Moves
Nvidia has agreed to put as much as $100 billion into OpenAI, with the money arriving in stages as OpenAI brings Nvidia systems online. OpenAI, in turn, leases the GPUs rather than buying them outright, spreading the cost over years instead of paying up front. It is still an enormous bet on immediate AI hardware needs, just not a cash-up-front one.
Silicon Valley's New Currency
Compute, not cash, is the scarce resource in the race for GPU dominance. While startups scramble for cloud credits, established players lock in supply through multi-year deals struck directly with hardware suppliers.
Market Implications
The deal reshapes how tech titans approach partnerships: the chip supplier doubles as an investor, and the customer's spending cycles straight back to the supplier. Wall Street analysts are already questioning that circularity, asking whether an arrangement this self-referential inflates everyone's numbers without creating much outside the loop.
OpenAI delays costs by leasing Nvidia chips instead of buying
Jensen Huang, the CEO of Nvidia, described the deal as “monumental in size.” He said building a single gigawatt AI data center could cost about $50 billion. Out of that, around $35 billion goes straight to Nvidia for its chips. The remainder covers everything else. But OpenAI isn’t paying that up front. By leasing the GPUs instead, the company avoids taking a financial hit all at once.
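For a rough sense of the split Huang described, here is a back-of-the-envelope sketch in Python. It uses only the two figures quoted above; the function name and output format are illustrative, not anything published by OpenAI or Nvidia.

```python
# Back-of-the-envelope split of Huang's per-gigawatt estimate:
# ~$50B of total buildout cost per gigawatt, of which ~$35B is Nvidia hardware.
# These are rough figures quoted in the article, nothing more precise.

COST_PER_GW_TOTAL_B = 50    # ~$50B to build one gigawatt of AI data center
COST_PER_GW_NVIDIA_B = 35   # ~$35B of that goes to Nvidia for its chips

def buildout_split(gigawatts: float) -> dict:
    """Split an estimated buildout cost into Nvidia chips vs. everything else."""
    total = gigawatts * COST_PER_GW_TOTAL_B
    nvidia = gigawatts * COST_PER_GW_NVIDIA_B
    return {
        "total_$B": total,
        "nvidia_chips_$B": nvidia,
        "everything_else_$B": total - nvidia,  # everything besides the chips
    }

# One gigawatt: ~$50B total, ~$35B to Nvidia, ~$15B left for everything else.
print(buildout_split(1))
```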
OpenAI will get an initial $10 billion from the deal soon. That money helps kick off the first wave of deployment. And while some of the money will be used for hiring, operations, and other expenses, the majority of it will go straight to compute. More specifically, to Nvidia’s processors. These GPUs are the engines behind AI training, powering models like ChatGPT and everything that runs on them.
Sarah Friar, OpenAI’s chief financial officer, said in Abilene that the plan wouldn’t be possible without partners. She pointed to Oracle, which is leasing the Abilene data center, and Nvidia, which is providing equity up front in return for long-term payments.
“Folks like Oracle are putting their balance sheets to work to create these incredible data centers you see behind us,” Sarah said. “In Nvidia’s case, they’re putting together some equity to get it jumpstarted, but importantly, they will get paid for all those chips as those chips get deployed.”
Debt talks begin while Nvidia chips eat most of OpenAI’s cash
OpenAI is not profitable. It doesn’t have positive cash flow, and it doesn’t hold investment-grade credit. That makes borrowing difficult and leaves equity, the most expensive form of capital, as the default way to fund data centers. Executives inside the company said they’re preparing to take on debt to handle the rest of the expansion. And thanks to the lease structure with Nvidia, banks are more comfortable lending. The terms look better when a company isn’t trying to buy everything outright.
Sarah said the compute shortage is the bigger issue. “What I think we should all be focused on today is the fact that there’s not enough compute,” she said. “As the business grows, we will be more than capable of paying for what is in our future — more compute, more revenue.”
But not everyone is thrilled about the way this is structured. Nvidia’s $4.3 trillion market cap has been built on selling chips to OpenAI, Google, Meta, Microsoft, and Amazon. At the same time, OpenAI’s $500 billion private valuation is only possible because of cash injections from Microsoft and others. That money doesn’t sit around. It goes right back to Nvidia.
Jamie Zakalik, an analyst at Neuberger Berman, told CNBC the deal shows OpenAI raising capital and pouring it straight into the same company providing the tech. “It’s goosing up everyone’s earnings and everyone’s numbers,” Jamie said. “But it’s not actually creating anything.”
When asked about those concerns, OpenAI CEO Sam Altman didn’t push back. “We need to keep selling services to consumers and businesses — and building these great new products that people pay us a lot of money for,” Sam said. “As long as that keeps happening, that pays for a lot of these data centers, a lot of chips.”