AI growth is making data centers more important, but also more resource-intensive. The International Energy Agency (IEA) estimates that data centers used about 415 terawatt-hours (TWh) of electricity in 2024, around 1.5% of global electricity consumption, and it expects data-center electricity demand to more than double to about 945 TWh by 2030, with AI as the biggest driver. More electricity does not automatically mean more water everywhere, but in many facilities it does mean more cooling pressure and more scrutiny from local communities.
The mistake people make is talking about “data center water use” as if it were one simple number. It is not. Water use depends heavily on design, climate, workload density, cooling method, and where the facility sits. Uptime Institute notes that water use is highly local, saying each site has a distinct water signature, while Microsoft says water-usage effectiveness (WUE) varies with location, humidity, and temperature.
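To make that concrete, water-usage effectiveness is conventionally defined as a site’s annual water use divided by its annual IT equipment energy, in liters per kWh. Here is a minimal sketch of the metric; the site figures are invented for illustration, not real facility data.

```python
# Water-usage effectiveness (WUE): site water use divided by IT energy.
# All numbers below are hypothetical, chosen only to show how climate
# and cooling design produce very different "water signatures".

def wue(annual_water_liters: float, annual_it_energy_kwh: float) -> float:
    """WUE in liters of water per kWh of IT equipment energy."""
    return annual_water_liters / annual_it_energy_kwh

# Same assumed IT load, three assumed designs/climates:
sites = {
    "humid climate, evaporative cooling": wue(450_000_000, 250_000_000),
    "arid climate, evaporative cooling": wue(900_000_000, 250_000_000),
    "any climate, chip-level cooling": wue(10_000_000, 250_000_000),
}
for label, value in sites.items():
    print(f"{label}: {value:.2f} L/kWh")  # e.g. 1.80, 3.60, 0.04
```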

Why data centers use so much water
Water is used mainly for cooling. Google says water helps cool servers, regulate indoor temperatures, and keep data centers running, and it notes that water cooling can reduce energy use and related emissions compared with some air-based approaches. Google’s 2025 AI-inference sustainability work likewise notes that data centers often consume water for cooling, and that improving energy efficiency can reduce the associated water use.
That creates a tradeoff people like to ignore. Cutting electricity use can sometimes increase water use, and cutting water use can sometimes require more energy-intensive designs. This is why the issue is not “water bad, air good” or the reverse. It is an infrastructure balancing act, and AI is making that balance harder because newer facilities are being optimized for denser, hotter workloads. Microsoft says its newer data-center designs are optimized for AI workloads, and in 2024 it launched a design that uses zero water for cooling, relying on chip-level cooling instead of evaporative cooling.
Why the local impact matters
The water issue is local because communities do not experience resource pressure in abstract global averages. EESI reported in 2025 that large data centers can consume up to 5 million gallons of water per day, roughly comparable to the daily use of a town of 10,000 to 50,000 people. That does not mean every site uses that much, but it shows why residents and utilities are paying closer attention when AI infrastructure expands in water-stressed areas.
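A quick back-of-envelope check shows where that town comparison comes from. Per-capita municipal water use varies widely, so the rates below are assumptions chosen to bracket typical U.S. figures, not measurements.

```python
# Sanity check on the EESI comparison: how many people's daily water use
# does a 5-million-gallon-per-day facility correspond to? The per-capita
# rates are assumptions spanning a plausible municipal range.

facility_gallons_per_day = 5_000_000  # EESI's upper-end figure

for per_capita in (100, 250, 500):  # assumed gallons per person per day
    people = facility_gallons_per_day / per_capita
    print(f"At {per_capita} gal/person/day: ~{people:,.0f} people")

# Output spans 50,000 down to 10,000 people, matching the
# 10,000-to-50,000-person town range quoted above.
```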
| Issue | Why it matters |
|---|---|
| Cooling demand | More servers usually mean more heat to remove |
| Location | Arid and hot regions face sharper tradeoffs |
| Design choice | Evaporative, air, and chip-level systems use water differently |
| Community pressure | Local water systems feel the impact, not global averages |
| AI density | AI-ready sites can intensify cooling needs |
What major companies are doing
Cloud companies already know this is becoming a reputational and operational issue. AWS says it aims to become water positive by 2030 and was 53% of the way there at the end of 2024. It also says it is expanding recycled-water use from 24 sites to more than 120 U.S. locations by 2030, which it expects to preserve more than 530 million gallons of drinking water annually. Microsoft says its operational data centers have cut water intensity 18% from a 2022 baseline, and that its next-generation design can avoid more than 125 million liters of cooling water per data center per year.
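One wrinkle in comparing those claims: AWS reports in gallons and Microsoft in liters. A minimal conversion sketch, using only the figures quoted above:

```python
# Put the AWS and Microsoft figures in the same unit (US gallons).
# Figures are the companies' own claims as quoted above, not new data.

LITERS_PER_US_GALLON = 3.785

aws_preserved_gal_per_year = 530_000_000  # AWS, fleet-wide recycled-water claim
msft_avoided_l_per_site = 125_000_000     # Microsoft, per data center per year
msft_avoided_gal_per_site = msft_avoided_l_per_site / LITERS_PER_US_GALLON

print(f"Microsoft per-site figure: ~{msft_avoided_gal_per_site:,.0f} gallons/year")
# ~33 million gallons/year per site, versus AWS's ~530 million gallons/year
# across its whole recycled-water program: a per-site vs fleet-wide claim.
```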
That sounds positive, but do not confuse mitigation with disappearance. The broader buildout is still huge: in the IEA’s base case, data-center electricity demand more than doubles from about 415 TWh in 2024 to roughly 945 TWh by 2030. If infrastructure keeps scaling that fast, efficiency gains will matter, but so will siting decisions, local water conditions, and whether companies actually deploy lower-water designs where communities are already stressed.
What this means in the AI era
The real story is not that data centers suddenly started using water. It is that AI is making the scale, pace, and location of that demand much harder to ignore. Water use is becoming a bigger issue because AI infrastructure is arriving at a pace many local water systems were not built to absorb. That is why this is shifting from a technical sustainability metric into a public-policy and community issue.
Conclusion
Data center water use is becoming a bigger issue in the AI era because cooling massive computing infrastructure puts real pressure on local resources. The lazy version of this debate is to pretend all data centers are equally wasteful or that efficiency claims solve everything. Neither is true. The real question is where these facilities are built, how they are cooled, and whether water efficiency is improving fast enough to keep up with AI-driven expansion.
FAQs
1. Why do data centers use water?
Mostly for cooling. Water helps remove heat from servers and building systems, especially in facilities using evaporative or hybrid cooling methods.
2. Does AI make data-center water use worse?
AI can increase cooling pressure because AI-ready data centers tend to run denser, hotter workloads and overall data-center demand is growing rapidly.
3. Are companies reducing water use?
Some are trying. Microsoft says newer facilities can use zero water for cooling, and AWS says it is expanding recycled-water use and pursuing water-positive goals.
4. Why is this such a local issue?
Because water stress is local. A facility’s impact depends on the region’s climate, water supply, and infrastructure, not just on a company’s global sustainability goals.