Artificial Intelligence Has A Hidden Water Cost
Sep 3, 2025 | Pratirodh Bureau
Cooling towers, like these at a power plant in Florida, US, use water evaporation to lower the temperature of equipment (Paul Hennessy/SOPA Images/LightRocket via Getty Images)
Artificial intelligence systems are thirsty, consuming as much as 500 milliliters of water – a single-serving water bottle – for each short conversation a user has with the GPT-3 version of OpenAI’s ChatGPT system. They use roughly the same amount of water to draft a 100-word email message.
That figure includes the water used to cool the data center’s servers and the water consumed at the power plants generating the electricity to run them.
But the study that calculated those estimates also pointed out that AI systems’ water usage can vary widely, depending on where and when the computer answering the query is running.
To me, as an academic librarian and professor of education, understanding AI is not just about knowing how to write prompts. It also involves understanding the infrastructure, the trade-offs, and the civic choices that surround AI.
Many people assume AI is inherently harmful, especially given headlines calling out its vast energy and water footprint. Those effects are real, but they’re only part of the story.
When people move from seeing AI as simply a resource drain to understanding its actual footprint, where the effects come from, how they vary, and what can be done to reduce them, they are far better equipped to make choices that balance innovation with sustainability.
Two hidden streams
Behind every AI query are two streams of water use.
The first is on-site cooling of servers that generate enormous amounts of heat. This often uses evaporative cooling towers – giant misters that spray water over hot pipes or open basins. The evaporation carries away heat, but that water is removed from the local water supply, such as a river, a reservoir or an aquifer. Other cooling systems may use less water but more electricity.
The second stream is used by the power plants generating the electricity to power the data center. Coal, gas and nuclear plants use large volumes of water for steam cycles and cooling.
Hydropower also consumes significant amounts of water through evaporation from reservoirs. Concentrated solar plants, which run more like traditional steam power stations, can be water-intensive if they rely on wet cooling.
By contrast, wind turbines and solar panels use almost no water once built, aside from occasional cleaning.
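The two streams add up to a simple water-per-query estimate: the electricity a query consumes, multiplied by the water used per unit of electricity both at the data center and at the power plant. The sketch below illustrates that arithmetic with made-up coefficients; the real values vary widely by site, season, grid mix and cooling technology, as the studies above note.

```python
# Back-of-envelope model of AI water use, combining the two streams
# described above. All coefficients are illustrative assumptions,
# not measured values for any real data center or power grid.

ONSITE_L_PER_KWH = 1.8   # assumed: evaporative cooling water per kWh of server load
OFFSITE_L_PER_KWH = 2.0  # assumed: power-plant water per kWh of electricity generated
KWH_PER_QUERY = 0.003    # assumed: electricity per short chatbot exchange

def water_per_query_ml(queries: int = 1) -> float:
    """Estimated water consumed, in milliliters, for a number of queries."""
    liters = queries * KWH_PER_QUERY * (ONSITE_L_PER_KWH + OFFSITE_L_PER_KWH)
    return liters * 1000  # liters -> milliliters

# A short conversation of, say, 25 exchanges under these assumptions:
print(round(water_per_query_ml(25)), "mL")
```

Under these hypothetical numbers, a 25-exchange conversation lands in the hundreds of milliliters, the same order of magnitude as the single-serving bottle cited above; swapping in a dry-cooled site or a wind-heavy grid shrinks both coefficients and the total.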
Climate and timing matter
Water use shifts dramatically with location. A data center in cool, humid Ireland can often rely on outside air or chillers and run for months with minimal water use. By contrast, a data center in Arizona in July may depend heavily on evaporative cooling. Hot, dry air makes that method highly effective, but it also consumes large volumes of water, since evaporation is the mechanism that removes heat.
Timing matters too. A University of Massachusetts Amherst study found that a data center might use only half as much water in winter as in summer. And at midday during a heat wave, cooling systems work overtime. At night, demand is lower.
Newer approaches offer promising alternatives. For instance, immersion cooling submerges servers in fluids that don’t conduct electricity, such as synthetic oils, reducing water evaporation almost entirely.
And a new design from Microsoft claims to use zero water for cooling, by circulating a special liquid through sealed pipes directly across computer chips. The liquid absorbs heat and then releases it through a closed-loop system without needing any evaporation. The data centers would still use some potable water for restrooms and other staff facilities, but cooling itself would no longer draw from local water supplies.
These solutions are not yet mainstream, however, mainly because of cost, maintenance complexity and the difficulty of retrofitting existing data centers. For now, most operators still rely on evaporative systems.