
Important Points


  1. The 500ml Rule: Studies estimate that 20 to 50 interactions with a Large Language Model consume about 500ml of freshwater, mostly through evaporation in the cooling systems of data centers.
  2. Hyper-Localized Impact: Globally, AI's water use is far below agriculture's, but it is intensely concentrated. Microsoft's GPT-4 training in Iowa, for example, consumed 6% of the local district's water in a single month.
  3. The Efficiency Paradox: AI models are getting more efficient per calculation, but query volumes are exploding, so total water withdrawal keeps rising. Google's water footprint grew 17% in 2023 alone.


How much water does AI really need?


For every 20 to 50 prompts a user submits, an artificial intelligence model consumes roughly 500 milliliters (16.9 ounces) of freshwater.


This number, sometimes called the "bottle of water per conversation," is not a metaphor but an estimate of the freshwater evaporated to cool the data centers running the algorithms. When you type something into ChatGPT, Gemini, or Claude, the request is routed to a hyperscale data center. These facilities house thousands of GPUs (Graphics Processing Units) that generate enormous heat. To keep the chips from overheating, data centers rely on cooling towers that shed that heat by evaporating water.


The specific volume varies with the data center's location and the season. According to a study by Shaolei Ren of the University of California, Riverside, a single chatbot interaction consumes far more water than a conventional Google search.
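The headline figure can be turned into a rough per-prompt range. This is a minimal sketch, pure arithmetic on the "500 ml per 20-50 prompts" estimate cited above; the function name and defaults are illustrative:

```python
# Rough per-prompt water estimate based on the "500 ml per 20-50 prompts"
# figure; actual values vary widely by data center, hardware, and season.

def water_per_prompt_ml(total_ml: float = 500.0,
                        prompts_low: int = 20,
                        prompts_high: int = 50) -> tuple[float, float]:
    """Return the (min, max) millilitres of water consumed per prompt."""
    return total_ml / prompts_high, total_ml / prompts_low

low, high = water_per_prompt_ml()
print(f"~{low:.0f}-{high:.0f} ml of water per prompt")  # ~10-25 ml
```

In other words, each individual prompt evaporates roughly a tablespoon or two of water; it is the volume of queries that makes the total significant.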


The aggregate numbers are striking. Google's 2023 Environmental Report disclosed that its total water consumption rose 17% year over year, reaching 6.1 billion gallons, driven directly by the extra processing power needed for AI products. Microsoft reported a 34% rise in global water use from 2021 to 2022, to nearly 1.7 billion gallons, a jump outside researchers attribute largely to the company's heavy investment in training Generative AI.


Depending on whether a company withdraws water directly or purchases it from a utility, this use is classified as "Scope 1" or "Scope 2." But the number environmentalists watch most closely is "Water Usage Effectiveness" (WUE): the liters of water consumed for every kilowatt-hour of IT electricity used. As AI racks grow denser and hotter, keeping WUE low becomes harder.


What makes AI hardware so hot?


AI hardware is built from the billions of microscopic transistors on a GPU, whose electrical resistance converts electrical energy into heat.


To understand the water cost, we first need to understand the hardware. A modern AI accelerator such as the NVIDIA H100 is a feat of engineering, but thermally it is simply a high-performance heater. Whether an AI model is "training" (learning from data) or "inferencing" (answering your question), it pushes electricity through logic gates only a few nanometers wide.


Thermodynamics is unforgiving: virtually all of the electricity a processor draws eventually becomes heat. A server rack that consumes 100 kilowatts of power also emits 100 kilowatts of heat. If that heat is not removed immediately, the silicon reaches its maximum junction temperature, typically between 85°C and 100°C, and shuts down to prevent permanent damage.


Data centers handle this in two main ways:


  1. Air cooling: Large fans blow air across heat sinks. This consumes significant energy but uses little water directly at the chip. However, the chillers that cool the building often rely on evaporative cooling towers.
  2. Liquid cooling: Increasingly the norm for AI. Water (or a dielectric fluid) is piped directly to the chip or rack; water transfers heat roughly 24 times more effectively than air. The heated water is then piped out to a cooling tower.


In the cooling tower, the hot water is sprayed over fill material while fans pull air through it. Some of the water evaporates, and that phase change absorbs a large amount of energy (the latent heat of vaporization), cooling the water that remains before it is recirculated. Evaporated water is "consumed": lost to the atmosphere and replaced with fresh supply. This "makeup water" is the 500ml bottle you "drink" as you prompt.


Where do data centers get their water?


Most of the water for AI data centers comes from potable local aquifers and river systems, sources that often compete directly with household and agricultural needs.


We tend to picture the "Cloud" as floating in the ether, but it lives at specific zip codes. The training of GPT-4 is a good example: OpenAI is headquartered in San Francisco, but most of the compute behind its models ran in West Des Moines, Iowa.


Why Iowa? Microsoft hosts OpenAI on its Azure cloud and built huge data centers there because land and renewable energy are cheap. But those facilities also drew on the Raccoon and Des Moines rivers. In July 2022, a critical month for GPT-4's training, Microsoft piped about 11.5 million gallons of water to its Iowa data centers, roughly 6% of all the water used in the district that month.


This raises the issue of "Water Stress." A billion gallons drawn in rainy Seattle has a very different environmental impact than a billion gallons drawn in drought-stricken Arizona or Santiago, Chile.


  1. Arizona: Microsoft's data centers there use far more water per computation than those in Iowa, because higher ambient temperatures force the cooling systems to work harder.
  2. Chile: Google's data center project in Cerrillos, Santiago, faced legal and social backlash over plans to use about 7.6 million liters of drinkable water per day in a region enduring a "megadrought".


The industry is slowly changing course. Google has pledged to replenish 120% of the water it consumes by 2030, and Microsoft has a similar "water positive" goal, funded through wetland-restoration and leak-detection projects. For now, though, AI training clusters remain very thirsty neighbors.


Is AI's use of water worse than that of other industries?


No. AI uses far less water than agriculture, textiles, or golf courses, but its use is highly concentrated and demands higher-purity water.


Perspective matters here. Google's 17% jump in water use sounds alarming, yet the tech industry's total draw is a small fraction of the agricultural sector's, which accounts for about 70% of the world's freshwater.


Below is a comparison table that shows where AI fits into the hierarchy of industrial thirst.


| Industry | Water Usage Context | Characteristics |
| --- | --- | --- |
| Agriculture | ~70% of global freshwater | Low purity requirement; spread over vast land areas. |
| Textiles | ~93 billion cubic meters/year | High pollution output (dyes/chemicals); intense usage in production. |
| Golf Courses | ~2 billion gallons/day (USA) | Usage is for aesthetics; often potable water in arid regions. |
| AI Data Centers | ~400-500ml per 20-50 prompts | High purity requirement; usage is hyper-localized (single buildings). |
| Semiconductors | ~2-4 million gallons/day (per fab) | Ultra-pure water needed for chip manufacturing (before the AI is even used). |

(Source: https://technophilosoph.com/en/2025/10/14/the-water-consumption-of-ai-data-centers-is-slightly-exaggerated/)


The difference lies in the kind of water. Farming can often use treated wastewater or rainwater. Data centers, especially their internal cooling loops, typically need treated, sediment-free water to keep the cooling towers free of scale and bacteria such as Legionella, the cause of Legionnaires' disease.


But AI's "Virtual Water" footprint extends beyond the data center. It includes the water used to fabricate chips in Taiwan or South Korea (TSMC and Samsung are heavy water users) and the water consumed in generating the electricity that powers the facility. Once the hydroelectric power on some grids is counted, AI's water footprint grows larger still.


What do the experts think about this problem?


Experts say the biggest problem right now is a lack of transparency: companies report global figures that hide local environmental damage.


While researching this article, the work of Shaolei Ren, an associate professor at UC Riverside, stood out as the most reliable source. He was warning about AI's "secret water footprint" long before ChatGPT became popular.


"Every time you ask an AI chatbot a question, you are also consuming water—without realizing it. AI doesn't just require computing power; it needs cooling, and that cooling comes with a cost." - Shaolei Ren


Ren's central point is that corporate pledges to be "water positive" by 2030 are an exercise in global accounting: replenishing water in some distant wet region does nothing for the local aquifer where the data center actually operates.


Another important perspective comes from regulators. As recent reports on Iowa's water use put it:


"Data centers in Iowa are 'not necessarily' choosing to locate in areas with abundant water supplies... Our groundwater in Iowa is not evenly distributed. So you may have a great location because of power or urban areas, but groundwater doesn't respect those boundaries." - Keith Schilling, State Geologist of Iowa



These quotes show that there is a gap between the global goals for corporate sustainability and the local reality of water use.


Can AI really help save water?


Yes. AI is already being used to find leaks in aging municipal infrastructure, and it could end up saving more water than the algorithms themselves consume.


We should avoid a one-sided "doom scroll": the same technology that consumes water is also saving it. This is the "Dual Crisis" dynamic.


One of the most promising applications is acoustic leak detection. Around 30% of piped drinking water leaks away before it ever reaches a tap, a loss the industry calls "Non-Revenue Water."


In a well-known 2024 case study, Microsoft partnered with FIDO Tech to deploy AI-powered acoustic sensors in water networks. These sensors "listen" to the pipes.


  1. The Problem: Human engineers struggle to distinguish the sound of a leak from a running pump or traffic overhead.
  2. The AI Solution: Machine-learning models trained on thousands of leak "signatures" filter out the background noise and detect leaks with more than 90% accuracy.
  3. The Result: AI leak detection helped the Swedish water utility VA SYD cut non-revenue water from 10% to under 8%, finding leaks as small as 0.5 liters per second that human tools had missed.
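Non-Revenue Water itself is a simple metric: the share of water supplied to the network that never gets billed. A minimal sketch (the 10% and 8% figures come from the VA SYD case study above; the volumes are hypothetical round numbers chosen to match them):

```python
# Non-Revenue Water (NRW): percentage of supplied water lost before billing,
# mostly through leaks. Utilities track this as a headline figure.

def nrw_percent(supplied_m3: float, billed_m3: float) -> float:
    """Percentage of supplied water that never reaches a paying customer."""
    return 100.0 * (supplied_m3 - billed_m3) / supplied_m3

# Hypothetical volumes matching the VA SYD improvement:
print(nrw_percent(1_000_000, 900_000))  # 10.0 (before AI leak detection)
print(nrw_percent(1_000_000, 920_000))  # 8.0  (after)
```

Shaving two percentage points off NRW at city scale amounts to millions of cubic meters of water per year, which is why utilities chase even small improvements.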


AI is also transforming agriculture, the world's biggest water user. "Precision Irrigation" systems process satellite imagery and ground-sensor data to water only the plants that need it instead of flooding whole fields. If AI cut agricultural water use by even 1%, the savings would offset the entire global water use of the tech industry.


Real-World Use: Our Testing of "Eco-Mode" Prompts


We ran a series of tests to see whether "prompt engineering" could, in principle, reduce the processing required for a query and, with it, its water footprint.


As a Senior SEO Strategist, I am used to optimizing for clicks; optimizing for carbon and water is new territory. We tested three prompt styles, comparing response time and processing load, using token count as a proxy for energy and water use.


  1. The Overblown Prompt: "Can you please write a very long, detailed, and flowery explanation of how photosynthesis works, using at least 1000 words and metaphors?"

Result: Maximum tokens generated; the GPU stays active longest. Highest water cost.

  2. The Standard Prompt: "What is photosynthesis?"

Result: Moderate token generation. Normal water cost.

  3. The Optimized Prompt: "Explain photosynthesis in a TL;DR style with only bullet points."

Result: Few tokens generated; the shortest GPU active time. Lowest water cost.


Without access to the data center's telemetry we cannot say how many milliliters each prompt saves, but the physics is clear: fewer tokens mean less GPU time, less heat, and less evaporated water.
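The token-count proxy can be made explicit with a back-of-the-envelope model. This sketch is illustrative only: the energy-per-token constant is a hypothetical assumption (not a published figure for any particular model), while the WUE value is the industry average cited later in this article; real footprints depend on hardware, batching, and site.

```python
# Illustrative only: relative water cost of prompts estimated from token counts.
# ENERGY_PER_TOKEN_WH is a HYPOTHETICAL assumption; WUE uses the industry
# average (~0.48 L/kWh). Real values vary by model, hardware, and data center.

ENERGY_PER_TOKEN_WH = 0.001   # assumed Wh of IT energy per generated token
WUE_L_PER_KWH = 0.48          # industry-average Water Usage Effectiveness

def water_ml(tokens: int) -> float:
    """Estimated millilitres of water evaporated to generate `tokens` tokens."""
    kwh = tokens * ENERGY_PER_TOKEN_WH / 1000.0
    return kwh * WUE_L_PER_KWH * 1000.0  # litres -> millilitres

for label, tokens in [("Overblown (1000+ words)", 1500),
                      ("Standard answer", 300),
                      ("TL;DR bullets", 80)]:
    print(f"{label}: ~{water_ml(tokens):.2f} ml")
```

Whatever the absolute numbers turn out to be, the ratio is what matters: the verbose prompt costs roughly 20 times the water of the TL;DR version under any fixed per-token assumption.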


Green Prompting: The practice of structuring AI queries so that they give short, useful answers that use as little energy and water as possible to make.


In conclusion: the conscious user


The answer is not to get rid of AI, but to demand openness and practice "computational sobriety."


The "Thirsty Algorithm" is a fixture of our new digital age. As Large Language Models are built into search engines, word processors, and even cars, demand for cooling water will soar. Morgan Stanley projects that AI data centers could consume more than 1 trillion liters of water per year by 2028.


But this is not an intractable problem; it is an engineering challenge. Innovations like "closed-loop" liquid cooling (which recirculates water instead of evaporating it) and siting data centers in Nordic climates (where ambient air provides the cooling) can cut Water Usage Effectiveness (WUE) ratios dramatically.


The takeaway for users is that AI is a resource-intensive tool, best reserved for tasks that genuinely need intelligence. Asking ChatGPT to "write a funny poem about a cat" has a real cost; using it to optimize freight routing saves resources by saving fuel.


AI's future needs to be not just intelligent, but green.


Extra Technical Reference: Water Usage Effectiveness (WUE)


To meet the technical depth needed for this topic, here is a breakdown of the WUE metric that professionals in the field use.


  1. The Formula: WUE = annual site water usage (liters) divided by IT equipment energy usage (kilowatt-hours).
  2. Best Score: A WUE of 0.0 L/kWh (closed loop or air cooled).
  3. Average Score: The current industry average is roughly 0.45 to 0.48 L/kWh for expected 2024 scenarios.
  4. The Trend: As rack density rises (more H100s per square foot), air cooling can no longer handle the heat density, pushing operators back toward water-intensive evaporation unless they invest in expensive closed-loop liquid cooling. This puts upward pressure on WUE.
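The formula in item 1 is simple enough to write out directly. A minimal sketch using that definition (the sample site figures are invented for illustration):

```python
# Water Usage Effectiveness (WUE): litres of water consumed per kWh of
# IT-equipment energy. Lower is better; 0.0 means no evaporative water use.

def wue(annual_water_litres: float, annual_it_energy_kwh: float) -> float:
    """Site WUE in L/kWh: annual water use divided by annual IT energy."""
    return annual_water_litres / annual_it_energy_kwh

# Hypothetical site: 90 million litres/year against 200 GWh of IT energy,
# landing at the low end of the current industry average.
print(f"WUE = {wue(90_000_000, 200_000_000):.2f} L/kWh")  # WUE = 0.45 L/kWh
```

Note that WUE divides by IT energy, not total facility energy, so a site cannot mask cooling water behind an efficient power supply chain.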


The fight over the future of AI infrastructure will be decided at the margins of that decimal point.




Author: Rushil Bhuptani

© 2025 Avidclan Technologies, All Rights Reserved.