Sustainability · 10 March 2026 · 7 min read

Liquid cooling in GPU datacenters: the real efficiency numbers

We're publishing our PUE and energy consumption numbers after 6 months running HGX B200 under direct liquid cooling.


Every datacenter vendor advertises a 1.2 PUE. If that number were real, the industry would have solved its energy problem. It hasn't: the average Tier III datacenter operates around 1.45-1.55, according to annualized data from the Uptime Institute Global Data Center Survey, and that is before heatwave peaks. The gap between marketing and measurement is what this post is about.

What PUE actually measures

PUE = total datacenter energy / IT energy. A 1.2 PUE means that for every watt the GPU consumes, 0.2 W go to cooling, UPS, lighting, and distribution losses. A 1.5 PUE means 0.5 W of overhead per IT watt, two and a half times as much.

The problem is that PUE gets measured under wildly different conditions depending on who reports it. Some vendors quote a ‘design’ PUE (theoretical, at ideal load), others an ‘annualized’ PUE (a real average including summer and winter), others simply their best quarter. The only comparable numbers are annualized figures backed by independent physical metering.
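As a minimal sketch of why the definitions diverge, assuming hypothetical hourly meter readings: annualized PUE is the ratio of summed energies over the whole period, which is not the same as averaging hourly ratios.

```python
# Sketch: annualized PUE from hourly meter readings (values are illustrative).
# PUE = total facility energy / IT energy.

# (facility_kwh, it_kwh) per hour - hypothetical readings
samples = [(1180.0, 1000.0), (1560.0, 1300.0), (1020.0, 900.0)]

# Annualized PUE is the ratio of summed energies over the period...
total = sum(f for f, _ in samples)
it = sum(i for _, i in samples)
annualized = total / it

# ...not the mean of hourly ratios, which weights every hour equally
# regardless of load.
mean_of_ratios = sum(f / i for f, i in samples) / len(samples)

print(round(annualized, 3))      # 1.175
print(round(mean_of_ratios, 3))  # 1.171
```

The two values differ whenever load varies, which is one reason a self-reported "average PUE" is ambiguous without the underlying methodology.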

Forced air vs direct liquid cooling

HGX B200 dissipates 10.2 kW per node. With 8 GPUs at 1 kW each plus CPU, memory, and networking, a 4-node rack carries over 40 kW of thermal load. Forced air stops working at that density: return temperature rises past the ASHRAE A2/A3 envelopes, GPUs enter thermal throttling, and MTBF drops by an order of magnitude.

Direct liquid cooling (warm water, 40-45°C inlet) solves the problem with physics. Water has roughly 3,400× the volumetric heat capacity of air; one water hose does the work of forty extractors. GPUs run 8-12°C cooler at the same load, which cuts silicon leakage current and improves MTBF.
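The comparison can be checked with the basic heat-transport equation Q = ṁ·cp·ΔT. The 10 K coolant temperature rise below is an assumption for illustration, not our loop's actual spec:

```python
# Back-of-envelope: coolant flow needed to remove a 4-node rack's heat.
# Q = m_dot * c_p * dT; the 10 K delta-T is an assumed value.

Q = 4 * 10_200.0    # four HGX B200 nodes at 10.2 kW each, in watts
dT = 10.0           # coolant temperature rise, K (assumed)

# Water: c_p ~ 4186 J/(kg*K), density ~ 1000 kg/m^3
water_kg_s = Q / (4186 * dT)         # ~ 0.97 kg/s
water_l_min = water_kg_s * 60        # ~ 58 L/min through one hose

# Air: c_p ~ 1005 J/(kg*K), density ~ 1.2 kg/m^3, same 10 K rise
air_m3_s = Q / (1005 * 1.2 * dT)     # ~ 3.4 m^3/s of airflow

# Volumetric ratio reduces to (c_p * rho) water / (c_p * rho) air
ratio = air_m3_s / (water_kg_s / 1000)
print(f"water: {water_l_min:.1f} L/min, air: {air_m3_s:.2f} m3/s")
print(f"volumetric ratio ~ {ratio:.0f}x")  # close to the 3,400x figure
```

The ratio comes out near 3,500×, consistent with the heat-capacity claim above: moving the same heat in air takes three and a half orders of magnitude more volume flow.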

Our numbers, September 2025 – March 2026

Six months of continuous operation in Madrid on the HGX B200 cluster under direct-to-chip liquid cooling. Measured with certified class 0.5s meters, hourly readings, monthly publication.

  • Annualized PUE: 1.18 (σ = 0.03)
  • Peak PUE (August in Madrid, 41°C): 1.24
  • Minimum PUE (January): 1.13
  • Cooling-to-IT consumption ratio: 11.6%
  • Water per IT MWh: 0.09 m³ (closed loop)
  • Mean GPU inlet temperature: 36.2°C
  • GPU thermal throttling events: 0

The difference between PUE 1.5 and 1.2 over the life of an HGX B200 is 21 tons of CO₂ avoided per node — equivalent to the annual electricity consumption of about 40 Spanish homes. Multiply by racks. Then multiply by fleet.
Computed with Spain's 2024 grid emission factor (REE/CNMC: 0.16 kgCO₂eq/kWh) and an average household consumption of 3,200 kWh/year (IDAE).
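The claim can be reproduced from the stated factors; the 5-year service life below is an assumption on our part, chosen because it makes the published figures line up:

```python
# Sanity check of the CO2 claim, using the source's stated factors.
node_kw = 10.2           # HGX B200 node IT load, kW
lifetime_h = 5 * 8760    # assumed 5-year service life, hours
grid_kg_per_kwh = 0.16   # Spain 2024 emission factor (REE/CNMC)
home_kwh_year = 3200     # avg. household consumption (IDAE)

# Overhead energy difference between PUE 1.5 and 1.2 over the node's life
delta_kwh = node_kw * lifetime_h * (1.5 - 1.2)
co2_tons = delta_kwh * grid_kg_per_kwh / 1000
homes = delta_kwh / home_kwh_year

print(f"{co2_tons:.1f} t CO2 avoided, ~ {homes:.0f} homes")
```

This lands at roughly 21 t of CO₂ and about 42 households' worth of annual electricity, matching the figures in the text.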

What sustainability marketing doesn't tell you

  • Closed-loop water doesn't evaporate; it recirculates. It doesn't compete with local agriculture for supply.
  • Waste heat is technically reusable for district heating or industrial hot water — an open efficiency frontier as European district heating networks mature.
  • The datacenter is on the Spanish grid; Spain closed 2024 with ~56% annualized renewable generation (REE 2024 electricity system report). Carbon footprint per MWh is roughly half the European average.
  • Hardware manufacturing remains the dirtiest part of the lifecycle — no greenwashing fixes that. The only answer is extending service life and cutting operational consumption.

We publish these numbers because until the industry measures the same way, conversations about AI sustainability will stay in slide decks. Our joint chair with the University of Granada is working on a public, reproducible energy-benchmarking protocol. When it ships, we'll publish that too.