Does ChatGPT Use Water? Here Is Exactly How Much Water ChatGPT Uses
Does ChatGPT use water? Find out exactly how much water ChatGPT uses per prompt, why data centers need water cooling, and what this means for our environment.
Pulkit Porwal
Apr 9, 2026 · 8 min read

When I first heard that typing a question into ChatGPT uses water, I thought it was a joke. Water and software felt like two completely different worlds to me. But after researching this topic seriously, I can tell you that yes, ChatGPT does use water — and the numbers are surprisingly big when you add them all up.
Why Does ChatGPT Use Water at All?
The first thing I want to clear up is that ChatGPT itself — the software — does not drink water. But it runs on thousands of powerful computer servers packed tightly inside massive buildings called data centers. These servers work so hard processing your questions that they generate enormous amounts of heat. If that heat is not removed quickly, the servers break down.
The most common and cost-effective way to remove this heat is through water-based cooling towers. Water absorbs the heat from the servers, then evaporates into the air, carrying the heat away. Fresh water is continuously pulled from local rivers, groundwater, or municipal supplies to replace what evaporates. According to researchers at the University of Illinois, data centers can evaporate between 1 and 9 liters of water per kilowatt-hour of server energy for cooling alone.

There is also a second layer of water use that most people miss: the power plants that generate electricity for these data centers also use water for cooling their turbines. This is called "virtual water" — you don't see it, but it is being used on your behalf every time you send a message to ChatGPT.
How Much Water Does ChatGPT Use Per Prompt?
This is the question I get asked most often, and the honest answer is: it depends. But here are the best estimates based on published research so far. A 2023 paper titled *Making AI Less Thirsty* by researchers at the University of California, Riverside is the most widely cited source on this topic. It estimated that every 20 to 50 prompts to a large language model like ChatGPT requires the equivalent of a 500 ml bottle of water. That works out to roughly 10–25 ml per single prompt.
However, newer and more efficient models like GPT-4o are significantly less power-hungry than older ones. Independent analysis by software engineer Sean Goedecke suggests that for a typical short conversation with a modern ChatGPT model, the real figure is closer to around 5 ml of water — not 500 ml. OpenAI CEO Sam Altman himself mentioned in a blog post that an average ChatGPT query uses roughly one-fifteenth of a teaspoon of water, though he did not clarify all the details behind that number. Here is a simple breakdown to make this easier to understand:
- 1 short prompt (a quick question): 5–25 ml of water
- 20–50 prompts (a working session): ~500 ml (one water bottle)
- 100 prompts in a day: 0.5–2.5 liters of water
- Millions of global users daily: Billions of liters of water per day
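The per-prompt ranges above are easy to sanity-check with a few lines of arithmetic. This sketch assumes the 5–25 ml per-prompt range cited earlier; the numbers are illustrative estimates, not measurements.

```python
# Rough per-prompt water estimates from the research cited above (milliliters).
# These are assumptions, not measured values.
ML_PER_PROMPT_LOW = 5
ML_PER_PROMPT_HIGH = 25

def daily_water_liters(prompts_per_day: int, ml_per_prompt: float) -> float:
    """Water footprint of a day's prompting, converted to liters."""
    return prompts_per_day * ml_per_prompt / 1000

low = daily_water_liters(100, ML_PER_PROMPT_LOW)    # 0.5 liters
high = daily_water_liters(100, ML_PER_PROMPT_HIGH)  # 2.5 liters
print(f"100 prompts/day: {low}-{high} liters of water")
```

The point of the exercise is that the per-prompt figure only matters once it is multiplied by volume, which is exactly what the next section does at global scale.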
For more context on how AI tools compare, you can also read about the LMSYS Chatbot Arena Leaderboard 2026 to see how different AI models stack up in performance and efficiency.
The Global Scale: Why Small Numbers Become a Big Problem
Here is where things get serious. Individually, 5 to 25 ml per prompt sounds like nothing. But ChatGPT is used hundreds of millions of times every single day. When you multiply even a tiny per-prompt water cost by that volume, the totals become enormous. According to the Environmental and Energy Study Institute, U.S. data centers alone consume approximately 449 million gallons of water per day — or about 163.7 billion gallons annually.
A medium-sized data center can use up to 110 million gallons of water per year — roughly equal to the annual water use of 1,000 households. The largest ones can each consume up to 5 million gallons per day, which is comparable to the daily water needs of a town with 50,000 people. According to projections cited by the Brookings Institution, water used for data center cooling could increase by a staggering 870% in the coming years as more AI facilities come online. Nearly half of the world's 9,000+ data centers are already located in regions of high water stress. This is not an abstract problem — it directly affects farming, drinking water supplies, and entire communities near these facilities. If you are looking at comparing different AI tools and their resource usage, check out this guide on the 7 best alternatives to ChatGPT in 2026.
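To see how the quoted figures relate to each other, here is a quick check of the scale arithmetic. The inputs are the numbers cited above, taken at face value.

```python
# Figures quoted in this section, taken at face value.
US_DC_GALLONS_PER_DAY = 449_000_000       # EESI estimate for all U.S. data centers
MEDIUM_DC_GALLONS_PER_YEAR = 110_000_000  # one medium-sized data center

# Daily national use scaled to a year:
annual_billion_gal = US_DC_GALLONS_PER_DAY * 365 / 1e9
print(f"~{annual_billion_gal:.1f} billion gallons/year")  # ~163.9; the ~163.7
# figure quoted above reflects rounding in the source

# A medium data center spread across the ~1,000 households it equals:
gal_per_household = MEDIUM_DC_GALLONS_PER_YEAR / 1000
print(f"{gal_per_household:,.0f} gallons per household per year")  # 110,000
```

That last number — about 110,000 gallons per household per year, or roughly 300 gallons a day — is in line with typical U.S. household water use, which is why the 1,000-household comparison holds up.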
Where Does This Water Actually Come From?
I think this part is the most important for people to understand, because "using water" sounds vague. Let me walk you through exactly where the water goes in a data center. There are three main sources of water consumption that researchers track:
- On-site cooling water (Scope 1): Water physically used inside the data center, mostly in cooling towers that use evaporative cooling. This water evaporates and is lost from the local water supply.
- Power plant water (Scope 2): The electricity needed to run the servers comes from power plants — and most thermal power plants also use water for cooling. The U.S. national average is about 7.6 liters of water evaporated per kilowatt-hour of electricity consumed.
- Manufacturing water (Scope 3): Producing a single microchip requires 8–10 liters of ultra-pure water to clean wafers and cool machinery. This is part of the upstream supply chain water footprint.
The water used on-site is the most visible. Cooling towers work through a process called evaporative cooling — water absorbs heat from the servers and then evaporates into the atmosphere. This water is not returned to the local water supply. It is simply gone. Most data centers use a closed-loop system that recirculates water, but these systems still require regular freshwater top-ups to prevent mineral buildup and bacterial growth in the pipes. IE Insights notes that Morgan Stanley projects AI-related data centers could consume more than 1 trillion liters annually by 2028 — an elevenfold increase from 2024 levels. That is not a typo.
Can ChatGPT's Water Use Be Reduced?
The good news is that this problem is not unsolvable. I have been tracking developments in this space and there are several real solutions already being deployed. The most promising ones are:
- Direct-to-chip liquid cooling: Instead of cooling the entire room with water-based air systems, pipes bring coolant directly to the chip surface. This can reduce water use by up to 90% compared to traditional methods.
- Immersion cooling: Servers are submerged in a special non-conductive liquid. This is extremely efficient and uses almost no water.
- Geographic optimization: Building data centers in naturally cooler regions like Scandinavia or Iceland dramatically cuts cooling needs. Some facilities there use zero water for cooling, relying entirely on cold outside air.
- Nighttime training: Running AI model training jobs at night or in cooler seasons reduces the heat load and evaporation losses.
- Recycled and non-potable water: Using reclaimed water or greywater instead of fresh drinking water for cooling towers.
Google disclosed in 2025 that each query to its Gemini assistant consumes just 0.26 ml of water, showing that with the right infrastructure design, per-query water use can be dramatically reduced. Advances in reinforcement learning have also enabled 14–21% energy savings through real-time cooling optimization at major data centers. AI itself is being used to make AI more water-efficient — which is an interesting twist. For developers thinking about building AI-powered tools more efficiently, see this guide on the 10 best AI prompts for web development. Also, if you are curious about how coding tools like Claude Code vs Cursor compare in 2026, that is worth a read too.
What You Can Do as a ChatGPT User
I want to be honest with you here: your individual usage has a much smaller impact than the infrastructure choices made by tech companies. Researchers and analysts consistently point out that one data center switching from traditional cooling towers to direct-to-chip liquid cooling saves far more water than thousands of users cutting their prompt count. That said, there are still things worth doing:
- Write clear, specific prompts so you get a good answer in fewer back-and-forth messages. This saves water and your time.
- Avoid using AI for things you genuinely don't need it for — not out of guilt, but out of good habit.
- Support and choose AI providers that publish transparency reports on energy and water use. Google and Microsoft have made progress here.
- Advocate for data center regulations that require water use disclosures and mandate sustainable cooling practices.
"Infrastructure choices matter 10–25 times more than individual usage. Your 50 daily queries barely register compared to one data center switching cooling systems." — Whitney A. Foster, AI environmental impact researcher
According to a Washington Post investigation, roughly a quarter of Americans have used ChatGPT since its 2022 launch — and every query carries an environmental cost. But that cost can be reduced significantly with smarter infrastructure, better regulation, and more transparency from AI companies. As users, our biggest power is in demanding that transparency and rewarding companies that take it seriously.