Historically, data centers have relied on air cooling, in which mechanical chillers circulate chilled air to absorb heat from servers and keep them within optimal operating conditions. But as AI models grow in size and the use of AI reasoning models rises, maintaining those conditions is becoming not only harder and more expensive, but also more energy-intensive.
Where data centers once operated at 20 kW per rack, today's hyperscale facilities can support over 135 kW per rack, making it an order of magnitude harder to dissipate the heat generated by high-density racks. To keep AI servers running at peak performance, a new approach is needed for efficiency and scalability.
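To make the scale of that challenge concrete, here's a rough back-of-the-envelope sketch of the airflow an air-cooled rack would need. The air properties and the assumed 15 K supply-to-exhaust temperature rise are illustrative assumptions, not figures from this article:

```python
# Rough sketch: airflow needed to remove rack heat with air alone.
# Assumptions (illustrative): air density ~1.2 kg/m^3, specific heat
# ~1005 J/(kg*K), and a 15 K rise from supply to exhaust air.
AIR_DENSITY = 1.2   # kg/m^3
AIR_CP = 1005.0     # J/(kg*K)
DELTA_T = 15.0      # K, assumed supply-to-exhaust temperature rise

def required_airflow_m3_per_s(rack_watts: float) -> float:
    """Volumetric airflow from Q = rho * V_dot * cp * dT."""
    return rack_watts / (AIR_DENSITY * AIR_CP * DELTA_T)

for kw in (20, 135):
    flow = required_airflow_m3_per_s(kw * 1000)
    print(f"{kw} kW rack: ~{flow:.1f} m^3/s (~{flow * 2119:.0f} CFM) of air")
# ~1.1 m^3/s for 20 kW vs. ~7.5 m^3/s for 135 kW: nearly 7x more air
# pushed through the same rack footprint.
```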
One key solution is liquid cooling. By reducing dependence on chillers and enabling more efficient heat rejection, liquid cooling is driving the next generation of high-performance, energy-efficient AI infrastructure.
The NVIDIA GB200 NVL72 and the NVIDIA GB300 NVL72 are rack-scale, liquid-cooled systems designed to handle the demanding task of trillion-parameter large language model inference. Their architecture is also specifically optimized for test-time scaling accuracy and performance, making them an ideal choice for running AI reasoning models while efficiently managing energy costs and heat.

Driving Unprecedented Water Efficiency and Cost Savings in AI Data Centers
Historically, cooling alone has accounted for up to 40% of a data center's electricity consumption, making it one of the most significant areas where efficiency improvements can drive down both operational expenses and energy demands.
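To put that share in perspective, a small sketch helps: when cooling consumes up to 40% of facility electricity, even partial reductions translate into large absolute savings. The facility size, electricity price and reduction percentages below are illustrative assumptions; only the 40% share comes from the article.

```python
# Sketch: cooling at up to 40% of facility electricity (article figure) means
# even partial reductions save a lot. Facility size, energy price and the
# reduction percentages are illustrative assumptions.
FACILITY_MW = 50        # assumed total facility power draw
COOLING_SHARE = 0.40    # cooling fraction of total electricity (from article)
PRICE_PER_MWH = 80.0    # assumed $/MWh
HOURS_PER_YEAR = 8760

cooling_mwh_per_year = FACILITY_MW * COOLING_SHARE * HOURS_PER_YEAR
for reduction in (0.10, 0.30, 0.50):
    saved_mwh = cooling_mwh_per_year * reduction
    print(f"{reduction:.0%} less cooling energy: ~{saved_mwh:,.0f} MWh/year, "
          f"~${saved_mwh * PRICE_PER_MWH:,.0f}/year")
```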
Liquid cooling helps mitigate costs and energy use by capturing heat directly at the source. Instead of relying on air as an intermediary, direct-to-chip liquid cooling transfers heat within a technology cooling system loop. That heat is then cycled through a coolant distribution unit via a liquid-to-liquid heat exchanger and ultimately transferred to a facility cooling loop. Because this heat transfer is so much more efficient, data centers and AI factories can operate effectively with warmer water temperatures, reducing or eliminating the need for mechanical chillers in a wide range of climates.
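The sensible-heat relationship Q = ṁ·c_p·ΔT sizes the liquid side of that loop. Here's a minimal sketch assuming water-like coolant properties, a roughly 120 kW rack and a 10 K coolant temperature rise across the rack, all illustrative assumptions rather than figures from the article:

```python
# Minimal sketch of the liquid side of a direct-to-chip loop: the coolant flow
# a coolant distribution unit must circulate to carry rack heat to the
# facility loop. Coolant properties, the 120 kW rack load and the 10 K
# coolant temperature rise are illustrative assumptions.
WATER_CP = 4186.0       # J/(kg*K), water-like coolant
WATER_DENSITY = 1000.0  # kg/m^3
RACK_WATTS = 120_000    # assumed high-density liquid-cooled rack
DELTA_T = 10.0          # K, assumed coolant rise across the rack

mass_flow = RACK_WATTS / (WATER_CP * DELTA_T)    # kg/s, from Q = m_dot * cp * dT
liters_per_min = mass_flow / WATER_DENSITY * 1000 * 60
print(f"~{mass_flow:.1f} kg/s of coolant (~{liters_per_min:.0f} L/min)")
# A few kilograms per second of warm water replaces thousands of CFM of chilled
# air, and the warmer return water is easier to reject without chillers.
```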
The NVIDIA GB200 NVL72 rack-scale, liquid-cooled system, built on the NVIDIA Blackwell platform, offers exceptional performance while balancing energy costs and heat. It packs unprecedented compute density into each server rack, delivering 40x higher revenue potential, 30x higher throughput, 25x more energy efficiency and 300x more water efficiency than traditional air-cooled architectures. Newer NVIDIA GB300 NVL72 systems, built on the Blackwell Ultra platform, boast 50x higher revenue potential and 35x higher throughput with 30x more energy efficiency.
Data centers spend an estimated $1.9-2.8 million per megawatt (MW) per year, which amounts to nearly $500,000 spent annually on cooling-related energy and water costs. By deploying the liquid-cooled GB200 NVL72 system, hyperscale data centers and AI factories can achieve up to 25x cost savings, leading to over $4 million in annual savings for a 50 MW hyperscale data center.
For data center and AI factory operators, this means lower operational costs, enhanced energy efficiency metrics and a future-proof infrastructure that scales AI workloads efficiently, without the unsustainable water footprint of legacy cooling methods.
Moving Heat Outside the Data Center
As compute density rises and AI workloads drive unprecedented thermal loads, data centers and AI factories must rethink how they remove heat from their infrastructure. The traditional heat-rejection methods that supported predictable CPU-based scaling are no longer sufficient on their own. Today, there are several options for moving heat outside the facility, but four main categories dominate current and emerging deployments.
Key Cooling Methods in a Changing Landscape
- Mechanical Chillers: Mechanical chillers use a vapor compression cycle to cool water, which is then circulated through the data center to absorb heat. These systems are typically air-cooled or water-cooled, with the latter often paired with cooling towers to reject heat. While chillers are reliable and effective across various climates, they are also highly energy-intensive. In AI-scale facilities where power consumption and sustainability are top priorities, reliance on chillers can significantly impact both operational costs and carbon footprint.
- Evaporative Cooling: Evaporative cooling uses the evaporation of water to absorb and remove heat. This can be achieved through direct or indirect systems, or hybrid designs. These systems are far more energy-efficient than chillers but come with high water consumption. In large facilities, they can consume millions of gallons of water per megawatt annually. Their performance is also climate-dependent, making them less effective in humid or water-restricted regions.
- Dry Coolers: Dry coolers remove heat by transferring it from a closed liquid loop to the ambient air using large finned coils, much like an automotive radiator. These systems don't rely on water and are ideal for facilities aiming to reduce water usage or operating in dry climates. However, their effectiveness depends heavily on the temperature of the surrounding air. In hotter environments, they may struggle to keep up with high-density cooling demands unless paired with liquid-cooled IT systems that can tolerate higher operating temperatures.
- Pumped Refrigerant Systems: Pumped refrigerant systems use liquid refrigerants to move heat from the data center to outdoor heat exchangers. Unlike chillers, these systems don't rely on large compressors inside the facility, and they operate without using water. This method offers a thermodynamically efficient, compact and scalable solution that works especially well for edge deployments and water-constrained environments. Proper refrigerant handling and monitoring are required, but the benefits in power and water savings are significant.
Each of these methods offers different advantages depending on factors like climate, rack density, facility design and sustainability goals. As liquid cooling becomes more common and servers are designed to operate with warmer water, the door opens to more efficient and environmentally friendly cooling strategies, reducing both energy and water use while enabling higher compute performance. The toy sketch below illustrates how these trade-offs might be weighed.
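As a purely illustrative toy rather than an engineering sizing tool, the following sketch encodes those trade-offs; the thresholds and the decision order are assumptions chosen only for demonstration:

```python
# Toy illustration (not an engineering tool) of how the trade-offs above
# might be weighed: climate, water availability, and whether the IT systems
# tolerate warm water all shift the preferred heat-rejection method.
def suggest_heat_rejection(ambient_c: float, water_restricted: bool,
                           warm_water_ok: bool) -> str:
    """Return a rough heat-rejection suggestion; thresholds are illustrative."""
    if warm_water_ok and (ambient_c < 30 or water_restricted):
        return "dry coolers"                # no water use, needs warm-water IT loop
    if water_restricted:
        return "pumped refrigerant system"  # water-free, compact, edge-friendly
    if ambient_c < 35:
        return "evaporative cooling"        # energy-efficient, but water-hungry
    return "mechanical chillers"            # works anywhere, highest energy use

print(suggest_heat_rejection(ambient_c=38, water_restricted=True, warm_water_ok=False))
# -> "pumped refrigerant system"
```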
Optimizing Data Centers for AI Infrastructure
As AI workloads grow exponentially, operators are reimagining data center design with infrastructure built specifically for high-performance AI and energy efficiency. Whether they're transforming their entire setup into dedicated AI factories or upgrading modular components, optimizing inference performance is crucial for managing costs and operational efficiency.
To get the best performance, high-compute-capacity GPUs aren't enough on their own; they also need to communicate with one another at lightning speed.
NVIDIA NVLink boosts that communication, enabling GPUs to operate as one massive, tightly integrated processing unit for maximum performance at a full-rack power density of 120 kW. This tight, high-speed communication is crucial for today's AI tasks, where every second saved on moving data can mean more tokens per second and more efficient AI models.
Traditional air cooling struggles at these power levels. To keep up, data center air would need to be either cooled to below-freezing temperatures or moved at near-gale speeds to carry the heat away, making it increasingly impractical to cool dense racks with air alone.
At nearly 1,000x the density of air, liquid excels at carrying heat away thanks to its superior heat capacity and thermal conductivity. By efficiently moving heat away from high-performance GPUs, liquid cooling reduces reliance on energy-intensive and noisy cooling fans, allowing more power to be allocated to computation rather than cooling overhead.
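That advantage can be made concrete with standard textbook property values for air and water; the figures below are common reference values, not numbers from this article:

```python
# Quick comparison of how much heat a given volume of water vs. air can carry
# per degree of temperature rise (standard textbook property values).
AIR = {"density": 1.2, "cp": 1005.0}      # kg/m^3, J/(kg*K)
WATER = {"density": 1000.0, "cp": 4186.0}

def volumetric_heat_capacity(fluid: dict) -> float:
    """J per cubic metre per kelvin: density * specific heat."""
    return fluid["density"] * fluid["cp"]

ratio = volumetric_heat_capacity(WATER) / volumetric_heat_capacity(AIR)
print(f"Water carries ~{ratio:,.0f}x more heat per unit volume than air")
# ~3,500x per unit volume, which is why modest liquid flow rates can replace
# enormous volumes of chilled air and power-hungry fans.
```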
Liquid Cooling in Action
Innovators across the industry are leveraging liquid cooling to slash energy costs, improve density and drive AI efficiency.
Cloud service providers are also adopting cutting-edge cooling and power innovations. Next-generation AWS data centers, featuring jointly developed liquid cooling solutions, increase compute power by 12% while reducing energy consumption by up to 46%, all while maintaining water efficiency.
Cooling the AI Infrastructure of the Future
As AI continues to push the boundaries of computational scale, innovations in cooling will be essential to meeting the thermal management challenges of the post-Moore's law era.
NVIDIA is leading this transformation through initiatives like the COOLERCHIPS program, a U.S. Department of Energy-backed effort to develop modular data centers with next-generation cooling systems that are projected to reduce costs by at least 5% and improve efficiency by 20% over traditional air-cooled designs.
Looking ahead, data centers must evolve not only to support AI's growing demands but to do so sustainably, maximizing energy and water efficiency while minimizing environmental impact. By embracing high-density architectures and advanced liquid cooling, the industry is paving the way for a more efficient, AI-powered future.
Learn more about breakthrough solutions for data center energy and water efficiency presented at NVIDIA GTC 2025, and discover how accelerated computing is driving a more efficient future with NVIDIA Blackwell.