How Liquid Cooling Prevents “I asked ChatGPT…” from Damaging the Environment

How many times a day do you use Artificial Intelligence (AI)?

You’re more than likely having ChatGPT draft an email or two, maybe creating auto-transcripts of your Google Meet all-hands call, and using one or dozens of the thousands of SaaS tools in existence.

If you answered zero, you’re wrong. Even using Siri or Google or Alexa to ask a question, scrolling through Facebook, going to your bank, or popping a destination into Google Maps uses AI—often without you realizing it. In fact, 38% of SaaS companies alone have incorporated generative AI features into their services in some fashion.

Which means every single day, you are using AI to some degree. 

Unfortunately, behind the marvel of AI lies a massive environmental paradox: while AI unlocks unprecedented possibilities and can potentially improve environmental outcomes, it also demands massive energy resources.

That means Bob from the sales team may type a simple question into ChatGPT that uses nearly 10 times as much electricity as a traditional Google search. With this in mind, the onus of responsibility cannot be placed on Bob or other users.
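To put Bob’s query in perspective, here’s a back-of-envelope sketch. The per-query figures are commonly cited estimates (roughly 0.3 Wh for a traditional search and 2.9 Wh for a generative-AI query), not measurements—actual numbers vary by study and model:

```python
# Back-of-envelope per-query energy comparison.
# Figures are commonly cited estimates; exact values vary by study.
SEARCH_WH = 0.3    # estimated Wh per traditional web search
AI_QUERY_WH = 2.9  # estimated Wh per generative-AI query

ratio = AI_QUERY_WH / SEARCH_WH
print(f"An AI query uses roughly {ratio:.1f}x the electricity of a search")

# Scale it up: 10 million AI queries per day, every day for a year
queries_per_day = 10_000_000
annual_kwh = AI_QUERY_WH * queries_per_day * 365 / 1000
print(f"10M daily AI queries ≈ {annual_kwh:,.0f} kWh per year")
```

Individually tiny, collectively enormous—which is why the fix has to happen at the facility level, not the user level.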

As a wise man once said: “with great power comes great responsibility.” The data center industry and companies are responsible for prioritizing liquid cooling technologies to mitigate AI’s energy consumption and carbon emissions, and they are accepting this responsibility. But is it as fast and universal as we need it to be? Not quite.

Before we discuss further: no, this doesn’t mean we have to stop improving and advancing with AI. We can still support AI’s growth responsibly while protecting the environment—the two are not mutually exclusive.

The Hidden Costs of AI

AI models don’t just operate in the cloud—LLMs and other AI-based algorithms demand immense computational power for training, execution, and inference alike. That computational power consumes electricity and strains our existing infrastructure.

Electricity Usage and Infrastructure Strain

Current estimates show data centers on a path to consume 3-4% of global electricity before 2030, and 8% of US power by the same date. Beyond the resources it takes to generate that electricity, this level of power consumption simply isn’t supported by our current infrastructure. Power grids across the globe are, on average, at least 20 years old. For context, 20 years ago major industry publications like Data Center Knowledge were just being founded, Data Center Frontier didn’t exist, and Data Center Dynamics was merely 4 years old.

Our global infrastructure is not built to withstand the continued growth of AI if data centers are built, equipped, and run with the traditional methods of cooling and rack density. If most facilities are only packing in 6-12 kW per rack, and using air cooling, there simply aren’t enough racks and facilities to handle the demands of AI and HPC without constant threat of outages.

Water Resources and Environmental Impact

In 2019, researchers at the University of Massachusetts Amherst found that training a single AI model can emit as much carbon dioxide as five cars over their entire lifespans—626,000 pounds to be exact.

That was in 2019…in 2025, even with strides in efficiency and optimization, estimates are likely the same, if not higher. Carbon emissions are only one part of the resources equation, as water consumption increases to cool these massive data centers: by 2027 it’s estimated that the global AI demand may require 4.2-6.6 billion cubic meters of water withdrawal. That’s more than the entire country of Denmark.

Even accounting for water that is discharged rather than consumed, those figures don’t capture the pollutants, refrigerants, and other chemicals that degrade the quality of the water that is discharged. This is especially alarming considering 20% of data centers in the United States rely on moderate to high stress watersheds.

How Does Liquid Cooling Reduce AI’s Impact on the Environment?

The good news? The data center industry doesn’t need to reinvent the wheel to address AI’s environmental challenges. The solution is already here: liquid cooling. Unlike traditional air-based systems, liquid cooling technology offers the efficiency, scalability, reliability, and sustainability required to support the growing demands of AI and high-performance computing (HPC). 

There’s no universal agreement on how to deploy AI-friendly high-density racks or the liquid cooling that supports them. What we do have information on is that the environmental impact of AI is already concerning, and continues to grow. 

Meaning we as an industry need to agree on one thing: the shift to liquid cooling can’t wait.

Nautilus’ own Gabe Andrews recently highlighted that liquid cooling needs to become the backbone of sustainable and scalable data center operations. “Cooling and power failures account for 71 percent of all outages…and most are not using liquid cooling,” he said. “We know traditional cooling is not capable of supporting high-density without backsliding [on PUE].”
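PUE (Power Usage Effectiveness)—the metric Andrews references—is the standard ratio of total facility power to IT equipment power; an ideal facility approaches 1.0, and cooling overhead is what pushes it higher. A quick illustration, with hypothetical load figures chosen purely for the example:

```python
def pue(total_facility_kw: float, it_load_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.
    1.0 is ideal; cooling and overhead push it higher."""
    return total_facility_kw / it_load_kw

# Hypothetical figures: an air-cooled hall spending 600 kW on cooling
# and overhead for a 1,000 kW IT load, vs. a liquid-cooled hall
# spending 150 kW of overhead for the same load.
print(pue(1600, 1000))  # 1.6
print(pue(1150, 1000))  # 1.15
```

The point of the quote: cramming high-density racks into an air-cooled hall drives that cooling overhead—and the PUE—back up.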

Yes, opting to power facilities with renewable energy sources like solar, wind, geothermal, or otherwise is smart. Yes, educating the public on the natural resources AI impacts is necessary. Yes, we need to improve our energy grids across the globe to support our technological advancements. Yes, a multifaceted approach is the best.

But…we have a solution that we know works, now. While our industry and nations work to move toward the long-term solutions, liquid cooling combined with high-density racks is the short and long-term solution for reasons even beyond sustainability:

  1. Environment & Performance
    • Improved efficiency: Because liquids have much higher thermal conductivity than air, liquid cooling dissipates heat far more effectively, so data centers can:
      • Have smaller physical footprints
      • Condense operations into high-density racks
      • Add more computational power per square foot with high-density racks
      • Consume less power while cooling a larger space
    • Reduced water waste: When set up for zero water usage, liquid cooling consumes less water across the board. Nautilus’ water technology can save 380 million gallons of water annually in a 100MW center, and other liquid cooling players note yearly water savings as high as 33 million gallons per year.
    • Enhanced reliability: Liquid cooling maintains a consistent cooling temperature, reducing the risk of equipment failure caused by overheating. This extends the lifespan of servers and minimizes costly downtime, and it becomes particularly necessary as high-temperature days that threaten operations grow more frequent.
  2. Economic
    • Cost savings: As an example, Nautilus EcoCore COOL is rated at 50 kW, with a typical draw of 18 kW that reduces operational costs through efficient power usage. Beyond savings from energy efficiency, liquid cooling can reduce costs through:
      • The reduction or near eradication of downtime from overheated servers
      • Substantial long-term savings on physical equipment—less server downtime = longer hardware lifespans
      • Rapid technology deployment to save significant CapEx—EcoCore COOL CDU and infrastructure, for example, can deploy in less than 9 months
    • Regulatory compliance: With governments around the world introducing stricter regulations on energy efficiency and carbon emissions, transitioning to liquid cooling is a proactive way to stay ahead of compliance requirements and fines.
    • Rapid deployment: Companies can deploy liquid cooling that supports high-density racks—like the EcoCore COOL CDU noted above—in under 9 months, rather than the years a traditional build-out can take.
  3. Business Growth
    • Differentiation: Customers and investors prioritize efficiency, and increasingly, sustainability. Data centers that adopt liquid cooling not only reduce their environmental footprint, but position themselves as leaders in innovation and green initiatives beyond renewable energy.
    • Scalability: AI and HPC demands will only grow, and liquid cooling is the only method that offers the scalability necessary to support future workloads. Companies that implement this technology now are better equipped to handle larger, next-generation processors and GPUs that generate more heat.
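The water-savings figures above are easier to compare when put in common units. A quick conversion of the cited Nautilus figure (380 million gallons per year at a 100 MW facility)—only standard unit conversions here, the savings number itself is from the article:

```python
# Convert the cited figure (380 million gallons/year saved at a
# 100 MW facility) into per-MW and metric terms. Only unit
# conversions; the savings figure itself comes from the article.
GALLONS_PER_M3 = 264.172

annual_gallons = 380_000_000
facility_mw = 100

per_mw_gal = annual_gallons / facility_mw
annual_m3 = annual_gallons / GALLONS_PER_M3

print(f"{per_mw_gal:,.0f} gallons saved per MW per year")
print(f"≈ {annual_m3:,.0f} cubic meters per year")
```

Roughly 3.8 million gallons per MW per year—set against the billions of cubic meters AI may demand by 2027, per-facility savings at that scale add up quickly.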

Looking Forward

As more and more folks turn to ChatGPT for a search, and with AI in general projected to consume an ever-growing share of global electricity and water resources, liquid cooling is a necessity. These cooling technologies effectively balance the needs of high-density computing without sacrificing sustainability, efficiency, or performance. As an industry, it’s our responsibility to make decisive moves about adopting and deploying liquid cooling technology—before we’re in crisis mode.
