Everyone in the industry knows how this story ends.
Air cooling had a great run, but its physics-limited ceiling has been in sight for years. What’s less clear, and what’s shaping the next phase of data center design, is how fast operators can make the leap to liquid.
AI has simply accelerated the timeline.
When racks jump from 5-10 kW to 30, 60, even 120 kW each, airflow and containment strategies can’t keep up. Fans spin harder, energy bills climb, and GPUs throttle performance to survive. Nearly 80% of data centers are still air-cooled, but few operators expect to stay that way five years from now; in fact, we believe the ratio will flip to closer to 80% liquid within five years.
The question isn’t if liquid cooling will take over. It’s how long operators can afford to wait, and how they’ll get there without tearing their facilities apart in the process.
Air Cooling Had a Good Run
Air cooling deserves some credit. When racks were smaller and workloads were lighter, it was the king of the data hall: fans humming, filters catching dust, everyone happy.
Then AI showed up.
Now those same halls are filled with racks pulling 100 kW and up, and air cooling is sweating it out. The best containment designs can’t fight physics forever. You can almost hear them wheezing at 20 kW, clinging to airflow charts and hope.
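To see why containment tricks can’t close the gap, here’s a rough back-of-envelope sketch. The temperature rise, air properties, and conversion factor below are our own illustrative assumptions, not figures from any vendor; the point is simply that the airflow a single rack needs grows linearly with its power draw, and at AI densities it becomes impractical to push that much air through one cabinet.

```python
# Back-of-envelope: airflow needed to remove a rack's heat with air alone.
# Assumed values (illustrative, not vendor data): 15 K inlet-to-exhaust rise,
# air density 1.2 kg/m^3, specific heat 1005 J/(kg*K).

AIR_DENSITY = 1.2        # kg/m^3
AIR_CP = 1005            # J/(kg*K)
DELTA_T = 15             # K, typical server inlet-to-outlet temperature rise
M3S_TO_CFM = 2118.88     # cubic meters per second -> cubic feet per minute

def required_airflow_cfm(rack_kw: float) -> float:
    """Airflow (CFM) needed to carry rack_kw of heat at the assumed delta-T."""
    mass_flow = rack_kw * 1000 / (AIR_CP * DELTA_T)   # kg/s of air
    return mass_flow / AIR_DENSITY * M3S_TO_CFM

for kw in (10, 30, 60, 120):
    print(f"{kw:>3} kW rack -> ~{required_airflow_cfm(kw):,.0f} CFM")
# A 10 kW rack needs roughly 1,200 CFM; a 120 kW rack needs roughly 14,000 CFM
# through a single cabinet -- which is where air-side designs run out of road.
```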
The Real Question Isn’t If, It’s How Fast
Liquid cooling isn’t “on the horizon.” It’s already here, powering the most advanced AI deployments on the planet.
The conversation now is about strategy: will operators retrofit existing facilities, build new AI-native campuses, or take a hybrid path that evolves over time?
That’s where the next generation of cooling infrastructure comes in.
Why Liquid Cooling Wins the Long Game
Physics has spoken: air just can’t carry the heat AI generates.
Water, on the other hand, can carry roughly 3,000× more heat per unit volume, pulling waste heat directly from CPUs, GPUs, and memory before it ever hits the room.
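That oft-quoted multiplier comes from volumetric heat capacity. A quick sketch using textbook properties of water and air (our assumptions, not Nautilus data) shows where the number comes from and what it means in practice:

```python
# Where the "~3,000x" figure comes from: per unit volume, water absorbs on the
# order of 3,500x more heat than air for the same temperature rise.
# Illustrative textbook property values, not vendor specifications.

WATER_DENSITY, WATER_CP = 997, 4186   # kg/m^3, J/(kg*K)
AIR_DENSITY, AIR_CP = 1.2, 1005       # kg/m^3, J/(kg*K)

ratio = (WATER_DENSITY * WATER_CP) / (AIR_DENSITY * AIR_CP)
print(f"Volumetric heat capacity ratio, water vs air: ~{ratio:,.0f}x")

# Practical upshot: removing 100 kW with a 10 K coolant rise takes only a
# modest water flow.
heat_w, delta_t = 100_000, 10
flow_lpm = heat_w / (WATER_CP * delta_t) / WATER_DENSITY * 1000 * 60
print(f"100 kW at a 10 K rise: ~{flow_lpm:.0f} liters per minute of water")
```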
Precision Where It Matters
The real evolution isn’t just in the racks; it’s in the systems managing the flow.
Facility-scale Cooling Distribution Units (CDUs) like Nautilus’s EcoCore FCD make high-density liquid cooling practical, reliable, and fast to deploy. Each unit supports up to 4 MW of heat rejection capacity and can be installed in as little as 12-16 weeks, turning what used to be a multi-year infrastructure project into a quarter-long upgrade.
Nautilus’s variable-pressure Leak Prevention System ensures true leak-proof operation, even under extreme workloads. And with 400,000+ unit hours of real-world operational experience, Nautilus has already refined every element, from pressure management to degassing, to remove risk and accelerate rollout.
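As a rough illustration of how that 4 MW figure maps to an AI hall, here is a hypothetical sizing sketch. The rack count, per-rack density, and N+1 redundancy policy are our own assumptions for the example, not a Nautilus sizing guide.

```python
# Hypothetical sizing sketch: how many 4 MW facility-scale CDUs might a
# liquid-cooled AI hall need? Rack count, density, and redundancy below are
# illustrative assumptions only.
import math

CDU_CAPACITY_MW = 4.0   # per the EcoCore FCD capacity figure cited above

def cdus_needed(racks: int, kw_per_rack: float, spare_units: int = 1) -> int:
    """Duty units required to reject the hall's IT heat, plus spare units."""
    it_load_mw = racks * kw_per_rack / 1000
    return math.ceil(it_load_mw / CDU_CAPACITY_MW) + spare_units

# A hypothetical 100-rack hall at 120 kW per rack is a 12 MW IT load:
print(cdus_needed(racks=100, kw_per_rack=120))   # 3 duty units + 1 spare = 4
```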
Air vs. Liquid: The Gap Is Only Getting Wider
| Criteria | Air Cooling | Liquid Cooling |
| --- | --- | --- |
| Rack Density | 15-20 kW max | 100-120 kW+ |
| Heat Transfer Efficiency | Baseline (1×) | ~3,000× higher per unit volume |
| Deployment Timeline | Months to years (retrofits) | 12-16 weeks typical |
| Environmental Impact | High water use, refrigerants | Zero refrigerants, low-to-no water draw |
| Future Readiness | Built for yesterday’s densities | Built for AI-scale workloads |
Retrofitting the Old Guard
Most data centers weren’t designed for this era. Electrical panels cap out at 250 A, raised floors buckle under 6,000-lb racks, and mechanical rooms were never meant to distribute liquid.
That’s why retrofit-ready CDUs are critical. The EcoCore FCD installs off the data-hall floor, in mechanical galleries or alleys, allowing operators to scale cooling from 500 kW to 10 MW+ without major reconstruction.
It’s the difference between shutting down for a year and scaling up in weeks.
The Next Generation Is Liquid-Cooled
Air cooling isn’t disappearing; it’s just retiring from the front lines.
For legacy racks and edge workloads, it’ll still have a place. But the compute powering AI, HPC, and large-scale modeling? That’s all moving to liquid.
The operators who act now will gain density, efficiency, and speed to market. The ones who wait will be left fighting physics.
At Nautilus, we’ve already seen what that future looks like (400,000+ unit hours of it), and it’s unmistakably liquid-cooled.
Discover how Nautilus EcoCore FCD is powering AI-ready cooling infrastructure.