AI Server Cooling: The Water Consumption Issue
Generative models like GPT-4 and Midjourney are not only energy-intensive; they also require significant amounts of water for cooling, potentially putting AI systems in competition with human needs for this vital resource. The issue, highlighted by researchers at the University of California, Riverside and the University of Texas at Arlington, grows more pressing as data centers' demand for fresh water rises.

Surprisingly, a simple exchange of 20-50 questions with ChatGPT consumes about 500 ml of water, and training the GPT-3 language model required a staggering 700,000 liters – roughly the amount used in manufacturing 320 Tesla electric vehicles. As AI development accelerates, the water consumption of such systems could balloon to enormous proportions unless data center cooling is optimized.
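The per-query figure follows directly from the numbers above. A minimal sketch of the arithmetic (the helper function and constant names are illustrative, not from the cited study):

```python
# Back-of-envelope estimate of per-query water use from the quoted figures:
# ~500 ml of water per exchange of 20-50 questions.

ML_PER_EXCHANGE = 500.0      # ml per conversation, as reported
QUERIES_LOW, QUERIES_HIGH = 20, 50

def water_per_query_ml(total_ml: float = ML_PER_EXCHANGE) -> tuple[float, float]:
    """Return the (low, high) range of millilitres of water per single query."""
    return total_ml / QUERIES_HIGH, total_ml / QUERIES_LOW

low, high = water_per_query_ml()
print(f"Roughly {low:.0f}-{high:.0f} ml of water per query")
# With the figures above: roughly 10-25 ml per query.
```

Small per-query amounts add up quickly at the scale of hundreds of millions of users.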
The situation becomes more urgent given predictions that vast areas of the United States could experience drought by mid-century. Data centers are often reluctant to share information about water usage, employing inconsistent accounting methods or obscuring figures outright, as Google has been accused of doing, which further complicates the issue. The researchers have therefore had to rely mostly on indirect data, proposing a cooling model developed by SPX Cooling Technologies as a universal way to estimate the water cost of training and operating language models.
However, Dell’Oro Group suggests that AI itself is not the issue; rather, the focus should be on rationalizing thermal control systems. The location of an AI data center significantly affects its water consumption, and while many data centers do not use liquid cooling, alternatives exist. Microsoft, for instance, reports using zero-water systems at its Arizona data center, albeit at a higher energy cost.
A range of solutions, from air cooling to water and immersion cooling, is available, each with its own pros and cons. Companies like Submer and LiquidStack offer immersion cooling systems with a Power Usage Effectiveness (PUE) below 1.05, compared to the usual 1.4-1.5 of air-cooled systems.
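PUE is the ratio of total facility energy to IT equipment energy, so the overhead spent on cooling and other infrastructure is simply (PUE - 1) times the IT load. A short sketch of what the figures above imply (the function name and sample load are illustrative):

```python
# What the quoted PUE values mean in practice: PUE = total facility energy /
# IT equipment energy, so overhead energy = IT load * (PUE - 1).

def overhead_kwh(it_load_kwh: float, pue: float) -> float:
    """Energy spent on cooling and other overhead for a given IT load and PUE."""
    return it_load_kwh * (pue - 1.0)

IT_LOAD = 1_000.0  # kWh of IT load, a sample value

for label, pue in [("air-cooled, PUE 1.45", 1.45),
                   ("immersion, PUE 1.05", 1.05)]:
    print(f"{label}: {overhead_kwh(IT_LOAD, pue):.0f} kWh overhead")
```

For every 1,000 kWh of IT load, an air-cooled facility at PUE 1.45 burns roughly 450 kWh on overhead versus about 50 kWh for immersion cooling at PUE 1.05, which is why the vendors cited above lead with this metric.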

The researchers propose not only ways to optimize cooling systems but also ways to generate less heat in the first place. Recommendations include building data centers in cooler climates, scheduling certain tasks for cooler periods of the day, and drawing on backup batteries rather than generators for daytime energy storage.
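The "cooler periods of the day" idea amounts to a simple scheduling problem: given a temperature forecast, defer flexible batch work to the coolest hours. A hypothetical sketch (the forecast values and hour budget are made-up examples, not from the study):

```python
# Hypothetical scheduler for the recommendation above: pick the coolest
# hours of the day for deferrable batch workloads such as model training.

def coolest_hours(forecast_c: list[float], hours_needed: int) -> list[int]:
    """Return the hour-of-day indices with the lowest forecast temperatures."""
    ranked = sorted(range(len(forecast_c)), key=lambda h: forecast_c[h])
    return sorted(ranked[:hours_needed])

# Sample 24-hour forecast in Celsius: cool overnight, hot mid-afternoon.
forecast = [18, 17, 16, 16, 17, 18, 20, 23, 26, 29, 31, 33,
            34, 35, 35, 34, 32, 30, 27, 24, 22, 21, 20, 19]

print(coolest_hours(forecast, hours_needed=4))  # → [1, 2, 3, 4]
```

Real schedulers would also weigh electricity prices, grid carbon intensity, and job deadlines, but the principle is the same: cooler outside air means less energy and water spent rejecting heat.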
The key to efficient energy and cooling system use, scientists suggest, is greater data center transparency. In Europe, revisions to the Energy Efficiency Directive could soon require all but the smallest data centers to report on numerous parameters. The secretive nature of the data center industry often makes it difficult to gather accurate data for model building. Despite this, the rapid pace of AI industry development might not afford hyperscalers enough time to implement high-quality reporting. The need for effective and sustainable solutions is becoming increasingly urgent.
Author Profile

- I'm Vasyl Kolomiiets, a seasoned tech journalist regularly contributing to global publications. Having a profound background in information technologies, I seamlessly blended my technical expertise with my passion for writing, venturing into technology journalism. I've covered a wide range of topics including cutting-edge developments and their impacts on society, contributing to leading tech platforms.