TheHowPage


AI Is Thirsty

Every AI prompt has a price tag nobody shows you — water, electricity, carbon. Real 2025–2026 data on ChatGPT, Gemini, Grok, Claude, and every major AI company.


The Invisible Price Tag

When you ask ChatGPT a question, something physical happens. Somewhere in a data center — Iowa, Virginia, Singapore — thousands of GPU cores light up. Fans spin at full speed. Water evaporates through cooling towers. Electricity flows from a grid that, in many US states, is still largely coal and gas.

The companies that run these systems have every incentive not to talk about this. The ones that do disclose rely on accounting methods that minimise the numbers. The ones that don't disclose anything — Anthropic, OpenAI, xAI — face no legal obligation to change that. Not yet.

Here is what we actually know, sourced from peer-reviewed research, company sustainability reports, IEA data, and investigative journalism.

  • 500 ml — water per ChatGPT reply. A full drinking glass (UC Riverside, 2023).
  • +51% — Google's emissions growth since 2019. They then quietly deleted their net-zero pledge.
  • 945 TWh — projected global data center electricity by 2030, driven by AI. More than Japan's entire grid (IEA).

Every Major AI Model — Side by Side

Energy, water, and CO₂ per single query. Google Search is the baseline. Numbers are per-inference, not amortised training cost.

Wh = watt-hours (1 Wh ≈ leaving a phone screen on for 6 minutes, or a 60W bulb for 1 minute). ml = millilitres. g CO₂ = grams of CO₂ equivalent.

| Model | Source | Energy (Wh) | Water (ml) | CO₂ (g) | vs Google Search |
|---|---|---|---|---|---|
| Google Search | Google | 0.04 | 0.05 | 0.02 | baseline |
| Gemini (text) | Google · official disclosure, Aug 2025 | 0.24 | 0.26 | 0.03 | 6× |
| Claude Sonnet | Anthropic · estimated | 0.30 | 300 | 3.5 | 7.5× |
| ChatGPT (GPT-4o) | OpenAI · Altman disclosure, Jun 2025 | 0.34 | 500 | 4.32 | 8.5× |
| Llama 405B | Meta · arXiv benchmark | 1.86 | 1,800 | 2.5 | 47× |
| AI image (Midjourney) | various · up to 2 kWh | 2,000 | 25,000 | 800 | 50,000× |
| ChatGPT-5 | OpenAI · launched Aug 7, 2025 | 18.9 | 18,000 | 25 | 472× |

Water figures use UC Riverside methodology (scope-3, including power plant cooling). Company-disclosed figures typically count only on-site cooling water. Sources: UC Riverside arxiv:2304.03271, Epoch AI, Sam Altman Jun 2025, Google Cloud Blog Aug 2025, arXiv:2505.09598.
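The "vs Google Search" column is simple division against the baseline row. A minimal sketch, using the table's figures (the dictionary keys and the function name are ours, for illustration):

```python
# Per-query footprint figures from the comparison table above:
# Wh of electricity, ml of water, g of CO2 per single inference.
PER_QUERY = {
    "google_search": {"wh": 0.04, "ml": 0.05,  "g": 0.02},
    "gemini_text":   {"wh": 0.24, "ml": 0.26,  "g": 0.03},
    "claude_sonnet": {"wh": 0.30, "ml": 300,   "g": 3.5},
    "chatgpt_gpt4o": {"wh": 0.34, "ml": 500,   "g": 4.32},
    "llama_405b":    {"wh": 1.86, "ml": 1800,  "g": 2.5},
    "chatgpt_5":     {"wh": 18.9, "ml": 18000, "g": 25},
}

def vs_search(model: str) -> float:
    """Energy multiplier of one query relative to a Google Search."""
    return PER_QUERY[model]["wh"] / PER_QUERY["google_search"]["wh"]

print(f"{vs_search('chatgpt_gpt4o'):.1f}x")  # 8.5x
print(f"{vs_search('chatgpt_5'):.1f}x")      # 472.5x
```

Note that the multipliers track energy only; the water ratios are far more extreme, because data center and power plant cooling dominate the water column.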

Your AI Footprint Today

How much AI do you actually use in a day? Here is the real environmental cost — the one nobody puts on the label — for a fairly typical day's usage (no ChatGPT-5, Gemini, Claude, or image generation):

  • 💬 10 ChatGPT (GPT-4o) messages → 💧 5.00 L · 3.40 Wh · 💨 43.2 g
  • 🧑‍💻 50 GitHub Copilot completions → 💧 2.50 L · 5.00 Wh · 💨 25.0 g
  • 🔍 5 AI-powered searches (Perplexity etc.) → 💧 0.50 L · 0.75 Wh · 💨 5.0 g
  • 🎙️ 3 voice assistant queries (Siri AI, Alexa) → 💧 0.06 L · 0.15 Wh · 💨 0.6 g

Your AI cost today:

  • 8.06 L of water — more than a toilet flush (6 L)
  • 9.30 Wh of electricity — ≈ 233 Google Searches
  • 73.8 g of CO₂ — ≈ driving 351 m by car

Across a year, these habits would use roughly 2,942 litres of water and emit 26.9 kg of CO₂.

Multiply your number by 600 million daily ChatGPT users to get the industry picture.
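The daily totals above are just per-action rates multiplied by counts and summed. A minimal sketch of that arithmetic — the rates are derived from the worked example, and the action names are ours:

```python
# Per-action footprint rates: (litres of water, Wh, g CO2) per action,
# derived from the worked example above. Names are illustrative.
PER_ACTION = {
    "chatgpt_gpt4o": (0.500, 0.34, 4.32),  # per message
    "copilot":       (0.050, 0.10, 0.50),  # per code completion
    "ai_search":     (0.100, 0.15, 1.00),  # per Perplexity-style search
    "voice_query":   (0.020, 0.05, 0.20),  # per Siri/Alexa AI query
}

def daily_footprint(usage: dict) -> tuple:
    """Sum (litres, Wh, g CO2) over a day's usage counts."""
    litres = wh = co2 = 0.0
    for action, count in usage.items():
        l, w, c = PER_ACTION[action]
        litres += l * count
        wh += w * count
        co2 += c * count
    return litres, wh, co2

# The worked example: 10 GPT-4o messages, 50 Copilot completions,
# 5 AI searches, 3 voice queries.
litres, wh, co2 = daily_footprint(
    {"chatgpt_gpt4o": 10, "copilot": 50, "ai_search": 5, "voice_query": 3}
)
print(f"{litres:.2f} L · {wh:.2f} Wh · {co2:.1f} g CO2")  # 8.06 L · 9.30 Wh · 73.8 g CO2
print(f"{litres * 365:.0f} L of water per year")          # 2942 L of water per year
```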

Why the Numbers Look So Different

Google says one Gemini prompt uses 0.26ml of water. UC Riverside says a ChatGPT prompt uses 500ml. Both are correct. They measure different things.

Company-disclosed (Scope 1)

Only counts water physically evaporated at the company's own data center for cooling. Does not include water used by the power plants generating the electricity.

Google Gemini: 0.26ml

Full lifecycle (Scope 3)

Includes water used at power plants to generate the electricity that runs the data center. Thermal power plants (coal, gas, nuclear) use enormous amounts of water for cooling.

ChatGPT 100-word reply: ~500ml

The scope-3 figure is the real-world impact. Power plants in water-stressed regions — Texas, Arizona, the US Southwest — use up to 2 litres of water per kWh of electricity generated. When your data center is powered by a gas plant in Texas during a drought, that water is gone.
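The accounting gap can be sketched numerically: take a query's energy, multiply by the grid's water intensity, and add it to the on-site figure. This is a simplified model, not the UC Riverside methodology (which includes further terms and different intensity factors, which is how it reaches 500 ml per reply); the function name and the 2 L/kWh upper bound from the paragraph above are our assumptions:

```python
# Simplified scope-3 water estimate: on-site cooling water plus water
# evaporated at the power plant generating the electricity.
def scope3_water_ml(energy_wh: float, onsite_ml: float,
                    grid_l_per_kwh: float = 2.0) -> float:
    """On-site water (ml) + power-plant water at grid_l_per_kwh (L/kWh)."""
    plant_ml = energy_wh / 1000 * grid_l_per_kwh * 1000  # Wh -> kWh -> L -> ml
    return onsite_ml + plant_ml

# Gemini text query: 0.24 Wh of energy, 0.26 ml on-site (official figures).
print(round(scope3_water_ml(0.24, 0.26), 2))  # 0.74 -- roughly 3x the disclosed number
```

Even this crude model triples the disclosed figure, which is the core of the dispute: the number you report depends entirely on where you draw the boundary.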

Company Report Cards

Every major AI company made (or avoided) environmental commitments. Here is what the data actually shows.

3 of 7 companies have made no environmental pledges whatsoever. 1 deleted their pledge after AI made it unreachable. 1 is broadly on track.

Case Study · xAI / Grok

Memphis Is Burning

In 2024, Elon Musk's AI company built the world's largest supercomputer in South Memphis. To power it, they installed 35 gas turbines. They had permits for 15. The facility sits in one of the most polluted, predominantly Black neighbourhoods in Tennessee. No environmental impact assessment was ever conducted.

200,000

H100/H200 GPUs

World's largest cluster

1M gal/day

water from Memphis city

No recycling disclosed

1,200–2,000t

NOx per year

Likely city's largest emitter

0

Environmental pledges

xAI has made none


Why this matters beyond xAI: The Memphis case is the clearest example of AI's environmental cost falling disproportionately on communities that had no say in where these facilities are built. South Memphis residents didn't choose to host a supercomputer. They got the NOx, the formaldehyde, and the noise. The executives got the compute. The users got a faster chatbot.

Sources: SELC.org · CNBC Apr 2025 · Inside Climate News Jul 2025 · HPCwire May 2025

The Scale Problem

Individual queries seem small. The aggregate is staggering. The IEA projects that global data centers, their growth driven by AI, will consume more electricity than Japan's entire grid by 2030.

Global data center electricity (2024)

415 TWh

≈ France's entire grid (400 TWh, population 68M) — 1.0× bigger than France

IEA confirmed

AI-specific share (2024)

62 TWh

15% of all data center electricity
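The jump from 415 TWh in 2024 to the IEA's projected 945 TWh in 2030 implies a steep compound growth rate. A quick check (function name is ours):

```python
# Compound annual growth rate implied by the IEA figures:
# 415 TWh (2024) -> 945 TWh (projected 2030).
def cagr(start: float, end: float, years: int) -> float:
    """Constant yearly growth rate that turns `start` into `end` over `years`."""
    return (end / start) ** (1 / years) - 1

rate = cagr(415, 945, 2030 - 2024)
print(f"{rate:.1%} per year")  # 14.7% per year
```

For comparison, total global electricity demand has historically grown around 2–3% per year, which is why grid operators are alarmed.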

ChatGPT annual electricity

~17 TWh

More than Slovenia's entire national grid

BestBrokers / Ember Energy

US data centers by 2030

8% of all US electricity

Up from 3% in 2022 — nearly tripling the share in eight years

Goldman Sachs 2024

Data center investment needed

$720 billion

Grid upgrades required through 2030

Goldman Sachs 2024

AI water footprint (2025)

≈ global bottled water

AI water consumption could equal annual global bottled water production

Peer-reviewed study, Euronews Dec 2025

Texas data centers by 2030

399 billion gallons

Texas used 25B gallons in 2025. Projected 16x growth.

Texas Tribune Sep 2025

Data centers in water-stressed regions

72%

72% of new data centers since 2022 built in areas with significant water stress

Bloomberg 2025

Ireland data centers by 2026

32% of national electricity

Nearly a third of one country's grid consumed by data centers — IEA April 2025

IEA Energy and AI, April 2025

US grid crisis (Feb 2025)

2,000 MW dropped instantly

40 data centers simultaneously dropped 2,000 MW in one incident. PJM projects 6 GW shortfall by 2027.

PJM Interconnection / Common Dreams

Scope 3 emissions increase (Big 4)

+150% (2020–2023)

Amazon, Google, Meta, Microsoft combined increased indirect emissions 150% in 3 years

ITU / World Benchmarking Alliance 2025

Moratorium bills (US states)

6 states in 2025

New York, Virginia, Georgia, Oklahoma, Vermont, Maryland introduced data center moratorium bills due to grid strain

The Deep Dive / TechPolicy.Press 2025

Governments Are Pushing Back

Bans, moratoria, and regulations enacted since 2021

🇳🇱

Netherlands

2021 · ongoing

Moratorium on new data centers

Amsterdam Metropolitan Area banned new data center construction through at least 2030

🇮🇪

Ireland

2021 · ended Dec 2025

Grid connection moratorium

Dublin-area moratorium ended Dec 2025 but new connections require on-site generation. Data centers could be 32% of national electricity by 2026.

🇩🇪

Germany (Frankfurt)

2023 · ongoing

Grid connection ban

Grid operator implemented effective ban on new data center connections in Frankfurt region

🇺🇸

USA (6 states)

2025 · in legislature

Moratorium bills introduced

NY, VA, GA, OK, VT, MD introduced bills to halt new data center construction due to grid strain

🇺🇸

Texas

2025 · enacted

Senate Bill 6 — grid curtailment

ERCOT can now curtail or disconnect large loads during emergencies; developers must fund grid upgrades

🇺🇸

Arizona (Phoenix/Tempe)

2025 · enacted

Water cooling regulations

New guidelines limit evaporative cooling by data centers to address water scarcity

The Training Cost Nobody Talks About

Every conversation about AI energy focuses on inference — the cost of each query. But models have to be trained first. And training costs are orders of magnitude larger.

| Model | Energy (MWh) | CO₂ (metric tons) | Real-world equivalent |
|---|---|---|---|
| GPT-3 | 1,287 | 552 | 300 NYC–SF round trips by plane |
| GPT-4 (estimated) | 51,773–62,319 | ~13,000 | Annual emissions of ~1,000 Americans |
| GPT-4 vs GPT-3 | 40–48× | ~24× | Each new generation costs dramatically more |

GPT-4 training estimates based on leaked GPU cluster data (Kasper Groes Albin Ludvigsen, Towards Data Science). OpenAI has not disclosed official figures. GPT-3 from Patterson et al. Google Brain paper.

And training is not a one-time cost. Models are retrained, fine-tuned, and updated continuously. The Gemini Ultra that Google uses today has been through multiple training runs. GPT-5 will cost more than GPT-4 to train. Each new generation resets the counter.
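Training and inference costs live on very different scales, which a rough amortisation makes concrete. This sketch uses the GPT-3 training figure from the table; the query volume and service period are purely our assumptions, chosen only to show orders of magnitude:

```python
# Amortising a one-off training cost over queries served.
# 1,287 MWh is GPT-3's training energy (Patterson et al.);
# the query volume and service lifetime below are assumed, not reported.
TRAINING_MWH = 1_287
QUERIES_PER_DAY = 1e9   # assumed order of magnitude
DAYS_IN_SERVICE = 365   # assume one year before the model is replaced

training_wh_per_query = TRAINING_MWH * 1e6 / (QUERIES_PER_DAY * DAYS_IN_SERVICE)
print(f"{training_wh_per_query:.4f} Wh per query")  # 0.0035 Wh per query
```

Under these assumptions, amortised training adds ~0.0035 Wh per query — roughly 1% of the 0.34 Wh inference cost. Flip the assumptions (a model retrained often but queried rarely) and training dominates instead; the point is that at ChatGPT's scale, inference is where the ongoing energy goes.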

The Nuclear Bet

Microsoft, Google, and Amazon all came to the same conclusion in 2024: renewables alone cannot power AI at the scale they need. In the span of 12 months, Big Tech contracted over 10 gigawatts of potential new nuclear capacity in the US. This has never happened before.

Why not just use more renewables? Wind and solar are intermittent — they don't generate power at night or when the wind stops. AI training runs can't pause because a cloud covered the solar farm. Battery storage at the required scale doesn't exist yet. Nuclear is the only carbon-free source that runs 24/7 at gigawatt scale.
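The 10 GW figure can be sanity-checked against the demand projections above, assuming the ~90% capacity factor typical of US nuclear plants (the variable names and the capacity-factor assumption are ours):

```python
# What 10 GW of contracted nuclear capacity could actually deliver per year.
GW_CONTRACTED = 10
CAPACITY_FACTOR = 0.9   # assumed; typical for US nuclear fleets
HOURS_PER_YEAR = 8760

twh_per_year = GW_CONTRACTED * CAPACITY_FACTOR * HOURS_PER_YEAR / 1000
print(f"{twh_per_year:.0f} TWh/year")                         # 79 TWh/year
print(f"{twh_per_year / 945:.0%} of projected 2030 demand")   # 8% of projected 2030 demand
```

In other words, even this unprecedented buying spree covers less than a tenth of where the IEA expects data center demand to be by 2030.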

The Case for Nuclear

  • Zero carbon emissions during operation
  • 24/7 reliable power — unlike wind and solar
  • 1 GW plant takes only ~1 km² of land
  • Modern SMRs can be built faster than large plants
  • Restarts like Three Mile Island reuse existing infrastructure

The editorial take: Nuclear is almost certainly the right call for AI power — it's the only credible path to carbon-free, always-on power at this scale. But note the irony: AI companies built unsustainable data centers first, created an energy crisis, then positioned nuclear as the solution. The sequencing matters. And nuclear plants also consume significant water for cooling — it's not a complete answer to AI's environmental footprint.

What You Can Actually Do

The point of this page is not guilt — it's informed choice. Here are actions that genuinely move the needle. Toggle the ones you'll actually commit to.

Use a smaller model when you don't need GPT-5

Easy

Gemini Flash, Claude Haiku, Llama 8B — these are 10–50× more energy-efficient than frontier models for simple tasks. Most coding help, summaries, and simple Q&A don't need GPT-5.

Batch your prompts — one good prompt beats five bad ones

Easy

Each prompt spins up compute. A well-crafted single prompt uses significantly less energy than 5 iterative ones to reach the same result.

Use on-device AI where available

Medium

Apple Intelligence, Gemini Nano, and local models (Ollama) run on your device with no data center involved. Dramatically lower energy and zero water consumption.

Demand environmental disclosure from AI companies

Systemic

3 of 7 major AI companies publish zero environmental data. The EU AI Act now requires GPAI providers to disclose energy consumption. The US has no such requirement yet.

Choose AI tools from renewable-powered providers

Medium

Amazon/AWS matches 100% of electricity with renewables. Google and Microsoft are buying nuclear. The grid your AI runs on matters enormously.

Skip AI when a search or a calculator will do

Easy

Not every question needs an LLM. A Google Search uses 0.04 Wh. ChatGPT-5 uses 18.9 Wh. For factual lookups, the old way is 470× more efficient.

AI is not the enemy. Opacity is. Demand that the companies building this infrastructure show their work. An industry that won't disclose its footprint has no incentive to reduce it.

How much water does one ChatGPT message actually use?

About 500ml — a full drinking glass — per 100-word response, according to the UC Riverside study (arxiv:2304.03271). This includes water used to cool both the data center and the power plants generating the electricity. Google's official Gemini figure (0.26ml) only counts water at the data center itself, not the power plant. Neither is wrong — they measure different things. The UC Riverside method gives you the full picture.

Why is ChatGPT-5 so much worse than GPT-4o?

ChatGPT-5 (launched August 7, 2025) averages 18.9 Wh per prompt — ranging from 2 to 45 Wh depending on complexity. That's roughly 55× a GPT-4o reply and over 470× a Google Search. Larger, more capable models require more compute per response. The tradeoff is better answers at higher environmental cost. OpenAI has not published efficiency improvement plans for GPT-5.

Is this actually a problem, or is AI getting more efficient?

Both are true, but efficiency isn't winning. Google's Gemini efficiency improved 33x over 12 months — impressive. But Google's total emissions rose 51% since 2019 and they deleted their net-zero pledge. The Jevons Paradox is at work: as AI gets cheaper and more efficient, usage grows faster than efficiency improves, so total consumption keeps rising. IEA projects global data center demand will hit 945 TWh by 2030 — more than Japan's entire grid.
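The Jevons Paradox is just two multipliers fighting each other: per-query energy falls, query count rises, and the total is their product. The 33× efficiency gain is Google's reported figure; the 50× usage multiplier below is illustrative, not measured:

```python
# Jevons Paradox in one line: total = per-query cost x query volume.
def total_energy(per_query_wh: float, queries: float) -> float:
    return per_query_wh * queries

before = total_energy(per_query_wh=8.0, queries=1e9)        # hypothetical baseline
after = total_energy(per_query_wh=8.0 / 33, queries=50e9)   # 33x efficiency, 50x usage
print(f"total change: {after / before:.2f}x")  # total change: 1.52x
```

Whenever the usage multiplier exceeds the efficiency multiplier, total consumption rises — which is exactly what the aggregate data shows.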

What is xAI's Memphis situation?

xAI built a supercomputer cluster called Colossus in South Memphis in 2024. To power it, they installed 35 gas turbines — but only had permits for 15. They operated the rest without permits. The Southern Environmental Law Center filed a notice of intent to sue on behalf of the NAACP. The facility is in a predominantly Black, low-income neighbourhood. Estimated NOx emissions: 1,200–2,000 tons per year, increasing local smog by 30–60%. Some turbines were later removed after legal pressure.

Why are tech companies buying nuclear power plants?

Because AI's power demand is so large and growing so fast that renewables alone can't keep up — wind and solar are intermittent, and you can't build enough grid storage fast enough. Nuclear provides 24/7 carbon-free power at massive scale. Microsoft bought rights to restart Three Mile Island (835 MW). Google signed the first corporate small modular reactor deal with Kairos Power. Amazon signed three nuclear deals in October 2024. Collectively, Big Tech contracted over 10 GW of new nuclear in 12 months.

Is on-device AI (like Apple Intelligence) better for the environment?

Significantly better for individual queries. On-device AI eliminates the network round-trip to a data center, uses your phone's chip (which is optimised for efficiency), and doesn't consume data center cooling water. Apple's strategy of running a ~3B parameter model on-device for most tasks is genuinely greener per query. The tradeoff: the chip manufacturing itself has a carbon cost, and larger queries still go to servers.

Which AI company is most transparent about environmental impact?

Amazon/AWS is the most transparent, with detailed water efficiency data (0.15 L/kWh WUE), renewable matching disclosures, and clear progress metrics toward their 2030 water-positive goal. Google published per-query Gemini stats in August 2025 — an industry first. Anthropic discloses nothing. OpenAI discloses almost nothing (Sam Altman disclosed one per-query figure in June 2025). xAI has no disclosures and no pledges.

Does using a VPN or Tor affect AI's environmental impact?

No. The energy consumption happens at the data center when the AI model runs — your network path doesn't change that.

What can I actually do about this?

A few things that genuinely help: (1) Use smaller, more efficient models when you don't need maximum capability — Gemini Flash, Claude Haiku, Llama 8B. (2) Batch your queries — one detailed prompt beats five short ones. (3) Use on-device AI where it's available (Apple Intelligence, Gemini Nano). (4) Demand disclosure — companies that don't publish emissions data have no pressure to improve. The EU AI Act now requires GPAI model providers to disclose energy consumption. (5) Support open-source models run on renewable infrastructure.

How do AI emissions compare to other industries?

AI's 2025 carbon footprint is estimated at 32–80 million metric tons CO2e — comparable to a small European country like Denmark or Chile. For reference: global aviation emits ~800 million tons per year. AI is roughly 4–10% of aviation today, but growing at 3–5x the rate of aviation. The concern isn't the current number — it's the trajectory.
