AWS Unveils Revolutionary Liquid Cooling Tech for Nvidia’s Next-Gen AI GPUs – The Future of High-Performance Computing

Published: 2025-07-10 04:03:16

AWS developed a custom liquid cooling system to cool Nvidia’s next-gen AI GPUs

Amazon Web Services just dropped a game-changer in data center innovation—custom liquid cooling for Nvidia's bleeding-edge AI chips. No more thermal throttling, just raw computational power.

Why it matters: As AI models grow more insatiable, traditional air cooling hits its limits. AWS's solution could redefine efficiency benchmarks—while racking up those sweet, sweet cloud revenue margins.

The cynical take: Another 'groundbreaking' infrastructure play that'll get Wall Street analysts frothing—until the next quarterly earnings miss.

AWS launches P6e instances featuring Nvidia Blackwell GPUs

AWS has also just introduced P6e instances that leverage Nvidia's GB200 NVL72, a dense supercomputing platform that packs 72 Blackwell GPUs into a single rack. These instances are designed to handle the computational demands of training massive AI models and running generative AI workloads.
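Since each NVL72 rack bundles 72 Blackwell GPUs, sizing a training cluster reduces to simple rack arithmetic. A quick sketch (illustrative only, based on the 72-GPU-per-rack figure above):

```python
import math

GPUS_PER_NVL72_RACK = 72  # one GB200 NVL72 rack, as described above


def racks_needed(total_gpus: int) -> int:
    """Minimum number of NVL72 racks to provide at least total_gpus GPUs."""
    return math.ceil(total_gpus / GPUS_PER_NVL72_RACK)


# e.g. a 1,000-GPU training job spans 14 racks (13 full plus 1 partial)
print(racks_needed(1000))
```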

Until now, only providers such as Microsoft and CoreWeave have offered this class of GPU cluster. Now AWS customers can access the newest and most advanced custom GPU machine learning training infrastructure available in the cloud, powered by Nvidia's latest-generation, liquid-cooled Blackwell GPUs.

AWS's In-Row Heat Exchanger (IRHX) keeps these dense GPU clusters at safe temperatures, providing optimal performance without overheating. By baking the IRHX directly into its data center design, AWS avoids waiting to retrofit entire structures for liquid cooling or paying for costly construction.
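The basic physics of why liquid cooling outperforms air is straightforward: the heat a coolant loop removes scales with mass flow, the coolant's specific heat, and its temperature rise (Q = ṁ·c_p·ΔT). A back-of-envelope sketch with illustrative numbers (not AWS's actual IRHX specifications):

```python
def cooling_capacity_kw(flow_lps: float, delta_t_c: float) -> float:
    """Heat removed by a water loop, in kilowatts.

    flow_lps: coolant flow rate in litres per second
    delta_t_c: coolant temperature rise across the rack, in deg C
    """
    c_p = 4186.0            # specific heat of water, J/(kg*K)
    rho = 1.0               # ~1 kg per litre for water
    m_dot = flow_lps * rho  # mass flow, kg/s
    return m_dot * c_p * delta_t_c / 1000.0  # W -> kW


# Example: 2 L/s with a 15 C rise removes ~126 kW, roughly the order
# of a dense GPU rack's heat load -- far beyond what air can carry away.
print(cooling_capacity_kw(2.0, 15.0))
```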

Announcing the P6e launch, AWS vice president Dave Brown noted that combining the GB200 NVL72 system with Amazon's IRHX lets customers leverage unmatched computing power at scale. It also allows developers, researchers, and companies to train much larger AI models more quickly and efficiently than they could in the past.

Amazon strengthens its lead in cloud infrastructure

Bringing its cooling technology in-house reveals even more about Amazon's broader play to own more of its infrastructure. In recent years, AWS has spent heavily developing its own chips, storage systems, and networking gear to power its cloud services.

These advancements enable Amazon to further reduce its reliance on third-party suppliers and balance operational performance against cost.

This approach has paid off. In the first quarter of 2025, AWS notched its highest operating margin since the unit was created and is now the chief engine of Amazon’s overall profitability. The IRHX launch expands AWS’s innovation leadership and infrastructure footprint in the cloud industry.

Other tech titans are following suit. Microsoft, for example, built its own AI chips and a custom cooling system, Sidekicks, to go with them. Google and Meta are also investigating ways to construct hardware and systems tailored to AI workloads.

However, Amazon has a crucial advantage — its sprawling global footprint of data centers and years of experience building and deploying custom hardware at scale. The IRHX could add to that by streamlining its AI-ready infrastructure, making it more efficient, sustainable, and scalable.
