Does a single AI query use a bottle of water?


We’re living through boom-times for Artificial Intelligence, with more and more of us using AI assistants like ChatGPT, DeepSeek, Grok and Copilot to do basic research and writing tasks.

But what is the environmental impact of these technologies?

Many listeners have got in touch with More or Less to ask us to investigate various claims about the energy and water use of AI.

One claim in particular has caught your attention - the idea that the equivalent of a small bottle of drinking water is consumed by computer processors every time you ask an AI a question, or get it to write a simple email.

So, where does that claim come from, and is it true?

Reporter: Paul Connolly
Producer: Tom Colls
Production co-ordinator: Brenda Brown
Sound mix: Donald McDonald
Editor: Richard Vadon


Transcript

This BBC podcast is supported by ads outside the UK.


Hello and thanks for downloading the More or Less podcast with me, Paul Connolly.

Each week we take a closer look at the numbers in the news and in everyday life.

Not to spoil anyone's fun, you understand, but instead to see if they're real, right, accurate.

Now, today we're going to try and break open the robot's brain and see what makes artificial intelligence tick along.

Specifically, language models like ChatGPT, DeepSeek, Grok, Copilot and the like.

Lots of you have written in wondering how much water AI systems use when you ask them simple questions.

And it's one watery claim in particular that has caught your attention.

This idea that the equivalent of a small bottle of water is guzzled by computer processors every time you ask an AI a question or get it to write a short, straightforward email.

So where does the claim come from and can we tell if it's true?

Before our study, when people look at the environmental impacts, they primarily look at the carbon emission, which is of course very important, but I think there are other important aspects such as the water consumption.

That's Dr. Shaolei Ren, an associate professor at the University of California, Riverside.

It's his research that's the main source of the water claim.

In 2023, he co-authored a study titled Making AI Less Thirsty.

We want to bring a more complete picture of the environmental impacts of AI to the public.

Dr. Ren didn't have full access to the inner workings of OpenAI, the creators of ChatGPT, or to the other big tech firms behind language models.

So he did what academics so often do when faced with a data door that's bolted shut.

He estimated, he calculated, and he extrapolated.

The energy figure was cited from OpenAI's own paper about GPT-3, and we also cross-validated the number with a few other sources, including a paper recently published by Microsoft about their production system.

And when we look at the water efficiency, we refer to Microsoft's own disclosure.

Microsoft runs and maintains OpenAI's data centers.

So what is AI using water for?

Well, when you fire a query ChatGPT's way, it's processed by a server, a computer, basically, in a massive data center.

All that digital processing generates heat, and to stop the servers overheating, these data centers run some heavy, juicy cooling systems.

They get rid of that heat by evaporating water into vapor in cooling towers, which then carries the heat away.

Some people have assumed that Shaolei's figures refer to just this kind of water use, the water used by data centers that are running AI systems.

But it's not just that.

We include both the direct water consumption for cooling down the data center facility and also the indirect water consumption for generating electricity.

The calculation also includes water that ends up being evaporated off-site in the generation of the electricity that powers the servers.

Water that evaporates, for example, in the cooling towers of coal power stations.

And that water is most of the water we're talking about here.

87% of it, for an AI query going to an average US data center.

So, important context uploaded.

Let's get to the calculation.

Using the only relevant data Dr. Ren and his colleagues could get their hands on, plus some fancy analysis, they came up with some numbers.

If you have about 10 to 50 queries with a medium-sized large language model, there will be 500 milliliters of water consumption, or water evaporated into the atmosphere.

And that was for just answering the question without using reasoning or very sophisticated models.

So, Shaolei's paper gave an estimate for how many medium-sized AI questions, think maybe of an AI-generated email about 100 words long or so, it would take to evaporate 500 milliliters of water.

On average, according to Shaolei's calculations, if the data centers are in the US, then you can ask around 30 questions of an AI chatbot like ChatGPT before using that much water.
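As a back-of-envelope check, using only the two numbers quoted above (a 500 ml bottle and roughly 30 queries), that figure works out to something like this:

```python
# Rough sanity check of the "Making AI Less Thirsty" estimate:
# about 500 ml of water evaporated per ~30 medium-sized queries
# sent to an average US data center.
BOTTLE_ML = 500          # small bottle of water, in milliliters
QUERIES_PER_BOTTLE = 30  # the paper's US-average estimate

ml_per_query = BOTTLE_ML / QUERIES_PER_BOTTLE
print(f"about {ml_per_query:.1f} ml per query")  # about 16.7 ml per query
```

So under that estimate, a single GPT-3-era query accounts for roughly a tablespoon of water, on-site and off-site combined.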

But that jars, doesn't it, with the widely circulated claim that a solitary query, prompt or question uses the same amount?

But here is where there's a twist in this tale.

A while later, journalists for the Washington Post came knocking.

Shaolei's calculations were all for a slightly older version of the AI system, that's GPT-3, but the Post wanted to pin down how much water the newer version, GPT-4, sinks when writing the same 100-word email.

First, I provided some raw data, but they did some other follow-up analysis.

GPT-4 is a lot more powerful than the earlier version, GPT-3, meaning it uses a lot more processing power.

The Washington Post took Shaolei's calculations, applied them to the new system, and, weirdly, the resulting number was almost exactly the same.

So 519 milliliters.

Except this time it was for just one query, not 30.

Now, remember, the vast majority of that water use will be in power stations, not in data centers.

And in terms of limitations or obstacles, this data deep dive was even harder than before.

Because GPT-4 is a completely closed source.

It's not like GPT-3, which we have some concrete information about, especially the model size.

But for GPT-4, it's totally closed.

So there's naturally more uncertainties associated with our study in GPT-4.

So the accuracy of this extrapolation, it's not something we can check.

OpenAI responded to the Washington Post's findings, saying they are constantly working to improve efficiency.

But then, in June of this year, the company's CEO, Sam Altman, broke his silence on the issue in a blog post, saying that the average query uses about 0.000085 gallons of water, roughly one-fifteenth of a teaspoon.
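Altman's figure is easier to picture after a unit conversion. A quick sketch (the conversion constants are standard US liquid measures, not from the blog post) confirms the teaspoon comparison:

```python
# Convert Altman's per-query figure (0.000085 US gallons) into
# milliliters and teaspoons to check the "1/15th of a teaspoon" claim.
ML_PER_US_GALLON = 3785.411784      # definition of the US liquid gallon
ML_PER_US_TEASPOON = 4.92892159375  # definition of the US teaspoon

altman_ml = 0.000085 * ML_PER_US_GALLON     # about 0.32 ml per query
teaspoons = altman_ml / ML_PER_US_TEASPOON  # about 0.065, i.e. ~1/15
print(f"{altman_ml:.2f} ml, or about 1/{1 / teaspoons:.0f} of a teaspoon")
```

That's around a third of a milliliter, over a thousand times smaller than the Washington Post's per-query figure.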

With Altman's numbers, the devil lies, you see, in the lack of detail.

What, for example, does he mean by the average query?

Well, essentially, in my experience, average doesn't mean much.

That is Sasha Luccioni, a computer scientist and AI energy specialist at the curiously named Hugging Face, a machine learning and data science platform.

Because especially for a quote-unquote general purpose tool like ChatGPT, people are doing all sorts of things with it.

I think it could be reliable if you define the boundaries better.

If you really say for a text-only query that requires maximum, I don't know, whatever, 20 words generated, 20 tokens we say generated, then it could be that number.

But it doesn't make sense to give a single number for all queries.

It would have to be a range or much more granular than that.

A further caveat here: experts like Sasha think he's probably only talking about the on-site water used to cool the processors, not the off-site evaporation in power stations.

So then, back to our original question.

Does every AI query use a small bottle of water?

I think that the true number is somewhere between Altman's number and the thirsty AI number.

I think 500 milliliters seems a lot to me.

I think they tried to make an estimate of something that is inherently unquantifiable, given the amount of information that's available to us.

So all told, when it comes to the tech Goliaths and their data, we're mostly in the dark.

But smaller players in the AI game are starting to break ranks and fess up.

In late July, Mistral, a French company, went public, saying each typical response from Le Chat, that's their version of ChatGPT, uses around 45 milliliters of water per 200-to-300-word response.

So quick and elementary maths then tells us that ten of those responses use just shy of a small bottle's worth of water.
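That elementary maths, using Mistral's quoted 45 ml per response and a 500 ml bottle, looks like this:

```python
# Mistral's disclosed figure: ~45 ml of water per 200-300 word Le Chat response.
ML_PER_RESPONSE = 45
BOTTLE_ML = 500

total_ml = 10 * ML_PER_RESPONSE  # 450 ml for ten responses
print(f"{total_ml} ml, just shy of a {BOTTLE_ML} ml bottle")
```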

Now, this data is so sizzlingly hot off the press that key voices in the space, like Shaolei and Sasha, haven't had a chance to fully fact-check the findings.

But early analysis suggests that between 80 and 90% of the water usage they're reporting happens off-site.

Remember, that's in power stations, not data centers.

Watch this space.

Thanks to Dr. Shaolei Ren and Dr. Sasha Luccioni.

If you've seen a number you want us to try and make some kind of sense of, then email us: moreorless@bbc.co.uk.

Until next time, take it handy.
