AI: Is It Ruining the Environment?

The internet is abuzz with accusations that artificial intelligence is using up tons of energy and water. People are even protesting the building of new AI data centers, saying they’ll put a huge strain on local resources. But some AI defenders say that this fear is overblown and that AI isn’t actually that bad for the environment. So who’s right? We talk to science and tech reporters Casey Crownhart and James O’Donnell, and computer scientist Prof. Shaolei Ren.

UPDATE, 11/13/25: This episode has been updated to note that some AI companies advertise on the show.

Find our transcript here: https://bit.ly/ScienceVsAIEnvironment

Read James and Casey's article here: https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/

Check out the Mythbusters GPU/CPU demonstration here: https://www.youtube.com/watch?v=WmW6SD-EHVY

In this episode, we cover:

(0:00) Chapter One: No More AI For Dank Memes?!

(3:34) Chapter Two: How Much Energy Does Your AI Query Use?

(15:37) Chapter Three: How Much Energy Does AI Use Total?

(21:18) Chapter Four: Is AI Drinking All Our Water?

(29:29) Chapter Five: Should You Quit Using AI?

This episode was produced by Rose Rimler and Blythe Terrell, with help from Meryl Horn and Michelle Dang. We’re edited by Blythe Terrell. Fact checking by Diane Kelly. Mix and sound design by Bobby Lord. Music written by Emma Munger, So Wylie, Peter Leonard, Bumi Hidaka and Bobby Lord. Thanks to all the researchers we reached out to, including Prof. Melissa Scanlan, and special thanks to Andrew Pouliot and Jesse Rimler.

Science Vs is a Spotify Studios Original. Listen for free on Spotify or wherever you get your podcasts. Follow us and tap the bell for new episode notifications.

Learn more about your ad choices. Visit podcastchoices.com/adchoices


Runtime: 38m

Transcript

Speaker 1 Hi, I'm Rose Rimler. I'm filling in for Wendy Zukerman, and this is Science Versus.

Speaker 1 This is the show that pits facts against filling the world with AI data centers.

Speaker 1 Today on the show, AI and the Environment.

Speaker 1 Lately, we've been hearing a lot about how power-hungry AI is.

Speaker 2 AI uses a shit ton of electricity, straining the nation's aging power grid and creating more planet warming emissions.

Speaker 1 And how thirsty it is.

Speaker 2 The amount of water that AI uses is astonishing. Asking ChatGPT to write one email is the equivalent of pouring out an entire water bottle.
One bottle of water.

Speaker 3 Let that sink in.

Speaker 2 Let that sink in.

Speaker 1 And the major culprit here is the data centers. Warehouses full of computer servers that AI needs to function.

Speaker 1 And tech companies are trying to build more of these data centers, but people who live nearby are protesting them, often saying that they're gonna compete for their electricity and use up their water

Speaker 2 We do not want a data center built in St. Charles City.

Speaker 1 Because of all this, around some corners of the internet, using AI has become kind of a faux pas, especially if you use it for something silly. You are actively contributing to global warming and climate change, all because you want to Photoshop Chris Brown into your pictures.

Speaker 2 So next time you use AI to generate an image for a meme, think about the impact on the environment first. Stop ruining your planet for a f***ing Instagram post.

Speaker 1 But on the flip side, we've got people pushing back against this idea.

Speaker 1 They say that these reports are skewed or misleading and that the impact of AI on the environment isn't nearly as bad as a bunch of other stuff we're already doing, like eating meat or taking international flights.

Speaker 1 In fact, recently, some of the big AI companies have said that their products only use a tiny bit of power and a few drops of water for each prompt. So, what's really going on here?

Speaker 1 Is AI actually ruining the planet, or have the bots been framed?

Speaker 1 Because when it comes to AI and the environment, there's a lot of stop ruining your planet for a fucking Instagram post, but then there's science,

Speaker 1 and that's coming up after the break. And full disclosure: some AI companies do advertise on Science Versus.

Speaker 2 This episode is brought to you by Ford Blue Cruise. It's not just where you're going, it's how you feel when you get there.

Speaker 2 With hands-free highway driving, Ford Blue Cruise helps you recharge and reset behind the wheel so you can be more present in your vehicle and more connected to the moments that truly matter.

Speaker 2 One moment that I always love is when I'm driving with a friend and a song comes on the playlist that we both love, just like an absolute jam, a favorite.

Speaker 2 And without even thinking about it, we'll stop what we're doing, stop talking, mid-conversation, maybe even mid-sentence, put the windows down and sing basically at the top of our lungs.

Speaker 2 And then when the song is done, you know, usually one of two things will happen.

Speaker 2 Either we'll play it again, rinse and repeat, do the same thing all over again, or we'll continue the conversation, pick up exactly where we left off, even if it was mid-thought, maybe even mid-sentence.

Speaker 2 And both of those options are pretty good.

Speaker 2 Arrive energized, ready for your hands-on life with Blue Cruise. Consumer Reports' top-rated active driving assistance system.
Visit ford.com slash Blue Cruise to learn more.

Speaker 2 Available driver assist feature does not replace safe driving or driver's need to control the vehicle. Terms apply.
Consumer Reports does not endorse products or services.

Speaker 2 To read the full report and for additional details, visit www.ford.com/bluecruise.

Speaker 2 This episode is brought to you by Ford Blue Cruise. There's something to be said about long drives, the playlists, the games, the snacks, the goofy stuff we do for our entertainment.

Speaker 2 Ford Blue Cruise makes these moments even better. With hands-free highway driving, it takes one thing off your plate so you can fully enjoy the drive and the company and every mile of the journey.

Speaker 2 It makes me think of this trip I took with my best friend years and years ago.

Speaker 2 We were in the car for 17 hours each way across a lot of desert and we decided to keep a notebook, kind of a log for the entire drive.

Speaker 2 Every silly inside joke, every vanity license plate that made us laugh, every single weird place we stopped along the highway, like the haunted wax museum with a lot of weird and creepy artifacts.

Speaker 2 Everything, it all went into the notebook. And it ended up being such a cool thing because that notebook became this memento, this souvenir of that very, very long, very, very fun road trip.

Speaker 2 And I still have it.

Speaker 2 Make more memories with Blue Cruise, Consumer Reports' top-rated active driving assistance system. Visit ford.com slash Blue Cruise to learn more.

Speaker 2 Available driver assist feature does not replace safe driving or driver's need to control the vehicle. Terms apply.
Consumer Reports does not endorse products or services.

Speaker 2 To read the full report and for additional details, visit www.ford.com/bluecruise.

Speaker 1 Welcome back. I'm Rose Rimler.
I'm a senior producer at Science Versus, and I'm here with our editor, Blythe Terrell. Hi, Blythe.

Speaker 2 Hey, Rose.

Speaker 1 Blythe, it seems like you're my AI buddy. I invite you to talk to me about controversies when it comes to AI.

Speaker 2 It's because I'm part robot.

Speaker 1 I would say that of the team, you're the person who is like most tuned into this idea about AI using up all the energy, using up all the water. You've been into this for a while.

Speaker 2 Yes, this is actually, I am one of those people who

Speaker 2 probably shared a meme, Rose, without knowing if it was true

Speaker 1 on the water use or the whatever.

Speaker 2 Like, I do remember seeing those memes and being like, is this true?

Speaker 2 And then, and then honestly, for for me, it did, it did make me take a step back from AI and be like, before I get involved in this, I, I do want to know the truth.

Speaker 2 Like, is this actually terrible for the environment? Is this actually terrible for the water? Because why would I want to integrate it into my life if it is?

Speaker 1 Right. So the first question is, why do we think AI would use so much more energy than all the other stuff that we do in our digital lives?

Speaker 1 Just like messing around on the computer, posting on Instagram, watching Netflix. Like this is all stuff that we do pretty routinely and don't think a lot about the like footprint of that behavior.

Speaker 1 Right.

Speaker 2 Like looking at pictures of Jeff Goldblum. Yeah.
How much energy is that using?

Speaker 2 Just as an example, hypothetically.

Speaker 1 Photoshopping Jeff Goldblum as your prom date.

Speaker 2 You know, I don't know who you might be talking about, Rose, but that sounds like a pretty good use of electricity and energy

Speaker 1 and water, no matter how much it takes.

Speaker 1 So that kind of stuff also requires data centers and energy to run them.

Speaker 1 But the thing that's different about AI is that their servers are using a different kind of computer chip. So normal computing uses a CPU, but AI uses a GPU.

Speaker 1 And if you're familiar with video games, you might think of this as like a graphics card. But it's actually become the powerhouse behind machine learning.

Speaker 1 This is an extremely visual and potentially copyrighted

Speaker 1 analogy, but I was looking right on YouTube.

Speaker 2 I love a copyrighted analogy.

Speaker 1 The Mythbusters guys, they did a demonstration of a CPU versus a GPU. And in their demonstration, they used paintball guns.
Okay.

Speaker 1 The CPU was like programming one paintball gun to draw a happy face, with like one paintball pellet at a time, firing at a piece of paper on a wall. Okay.
That's a CPU. A GPU was like...

Speaker 1 200 paintball guns all bound together,

Speaker 1 making it like like a mega paintball gun. And like one switch is hit, and they all fire at once.
And the image that they create is the Mona Lisa. Oh, man, those guys are good.

Speaker 2 Mythbusters.

Speaker 1 So the point is that while CPUs are good at doing one task after another, the GPUs are good at doing a bunch of tasks at once. And that requires a lot more energy.
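To make that serial-versus-parallel idea a bit more concrete, here's a minimal Python sketch (an illustrative aside, not something from the episode): a plain loop does one multiplication after another, CPU-style, while a single vectorized call hands the whole batch over at once, which is loosely the batched style of work GPU hardware is built for.

```python
import time
import numpy as np

n = 500_000
a = np.random.rand(n)
b = np.random.rand(n)

# "One paintball at a time": a plain Python loop, one multiplication after another.
start = time.perf_counter()
serial = [a[i] * b[i] for i in range(n)]
serial_s = time.perf_counter() - start

# "All the guns fire at once": a single vectorized call over the whole array,
# loosely standing in for the massively parallel work a GPU does in hardware.
start = time.perf_counter()
batched = a * b
batched_s = time.perf_counter() - start

print(f"one at a time: {serial_s:.3f} s, all at once: {batched_s:.5f} s")
```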

Speaker 2 Okay. How much energy? Are you ready for that? Yeah, we're ready.

Speaker 1 And I got a bit of an assist here. I talked to some journalists who have covered this stuff for years.

Speaker 3 My name is James O'Donnell. I'm a senior reporter for AI at MIT Technology Review.

Speaker 2 I'm Casey Crownhart. I'm a senior climate reporter at MIT Technology Review.

Speaker 1 So James and Casey both report a lot on AI and energy use. Okay.

Speaker 1 And about a year ago, they started a project trying to figure out, like, how much does an average query or prompt to, say, ChatGPT, how much energy does that use?

Speaker 1 And they were inspired to do that because they were seeing all these numbers out there floating around that just didn't seem all that reliable. Here's Casey.

Speaker 2 These kind of wild estimates of, you know, oh, a query to something like ChatGPT uses this much water and this much energy and isn't that so much?

Speaker 2 And so I think that that started to kind of get our gears turning and wondering, you know, is that right? How can we add all of this up?

Speaker 2 What does it all add up to?

Speaker 1 So Casey and James looked around for the real number, but...

Speaker 3 We learned very quickly that it's not going to be so easy to know that number.

Speaker 3 Companies, they're not particularly willing to share the details of how much energy their AI models require to answer one question. And so, you know, we weren't going to get it from them.

Speaker 1 And so. What did they say when you reached out and asked?

Speaker 3 They said in so many words, no.

Speaker 1 So James and Casey went a different route. There are AI models that are not proprietary.

Speaker 1 Anybody can use them, even download them, host them on their own computer as long as they have the power to do that. These are open source models.

Speaker 1 And you can kind of open the hood, poke and prod them.

Speaker 1 And so James and Casey teamed up with experts, including academics at the University of Michigan, to measure it themselves.

Speaker 1 So they ran a bunch of different prompts through an open source large language model called Llama.

Speaker 1 And then they were able to actually measure how much energy those requests required. And so they got some answers.
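For a rough sense of how a measurement like that can work, here's a sketch in Python. It is not the harness James, Casey, and the University of Michigan researchers actually used; it assumes a recent NVIDIA GPU that exposes an energy counter, the pynvml and transformers packages, and a locally downloaded open-weights model (the model name below is only a placeholder).

```python
import pynvml
from transformers import AutoModelForCausalLM, AutoTokenizer

pynvml.nvmlInit()
gpu = pynvml.nvmlDeviceGetHandleByIndex(0)

model_name = "meta-llama/Llama-3.1-8B-Instruct"  # placeholder; any open-weights model you have locally
tok = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto").to("cuda")

prompt = "Suggest some travel tips for a long weekend in Lisbon."
inputs = tok(prompt, return_tensors="pt").to("cuda")

# Cumulative GPU energy counter (millijoules since driver load), read before and after.
before_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu)
outputs = model.generate(**inputs, max_new_tokens=200)
after_mj = pynvml.nvmlDeviceGetTotalEnergyConsumption(gpu)

joules = (after_mj - before_mj) / 1000
print(tok.decode(outputs[0], skip_special_tokens=True))
print(f"~{joules:.0f} J on the GPU for this answer (excludes CPU, RAM, networking, cooling)")
```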

Speaker 1 Are you curious?

Speaker 2 Yes, I would love to know. Give me the answers.

Speaker 1 Well, there's a range here.

Speaker 2 I was really struck throughout this project of, I think we went in and I was looking for kind of one definitive answer. You know, like, what is AI's energy burden?

Speaker 2 And I think that one of my biggest takeaways was just how much it depends. It depends on the model.
It depends what you're asking.

Speaker 2 And so there's just this really big range. That was one of my biggest takeaways.

Speaker 1 So,

Speaker 1 you know, basically when

Speaker 1 people like us say, I asked AI, you know, we kind of act like AI is this one thing. And it's totally not.
There are all these different models. And these models come in different sizes.

Speaker 2 So a model, sorry, a model is like a ChatGPT or a Gemini or a Claude or whatever.

Speaker 1 Yeah. And within ChatGPT, Gemini, Claude, there are multiple models.
Within Llama, there are multiple.

Speaker 1 And some are bigger, some are smaller.

Speaker 1 If you imagine that this AI model, like imagine it's

Speaker 1 like the command of a spaceship, or actually my favorite is like a switchboard with tons of knobs and dials.

Speaker 1 You can kind of imagine that's what these parameters are. James says.

Speaker 3 Each of those knobs is helping the AI come up with a better answer, but also each of those knobs requires energy to operate.

Speaker 1 So the smallest model that the team looked at for this analysis had 8 billion parameters. So 8 billion.

Speaker 1 That sounds big. The biggest one they looked at had 400 billion parameters.
400 billion. And when it comes to the

Speaker 1 big players here,

Speaker 1 we actually don't know how many parameters they have.

Speaker 1 But James said, if he had to guess, it's...

Speaker 3 You know, in the order of trillions.

Speaker 2 Whoa.

Speaker 3 Really, really big. A lot of knobs.

Speaker 1 A lot of knobs. I mean, I'm imagining basically like a switchboard, but now I have to completely change it because it's like a switchboard that goes on for miles.

Speaker 3 Yes, that's right.

Speaker 2 So these parameters, these parameters you're talking about, which is sort of what underpins the model, I guess.

Speaker 1 They are, yeah, they're numbers, values,

Speaker 1 and the more of them that there are, the better the model is at learning patterns and making predictions, which is how large language models work. Okay.
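Since "parameters" can sound abstract, here's a tiny sketch (illustrative only, not from the episode) that builds a toy two-layer network in PyTorch and counts its knobs the same way you'd count Llama's 8 billion or 400 billion.

```python
import torch.nn as nn

toy_model = nn.Sequential(
    nn.Linear(1024, 4096),   # weight matrix of 1024 x 4096 numbers, plus 4096 biases
    nn.ReLU(),
    nn.Linear(4096, 1024),   # weight matrix of 4096 x 1024 numbers, plus 1024 biases
)

n_params = sum(p.numel() for p in toy_model.parameters())
print(f"{n_params:,} parameters")   # 8,393,728 -- about 8 million knobs, versus
                                    # 8 billion for the smallest Llama model discussed here
```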

Speaker 2 Okay, so say I have like one request and I pop it into a model with eight billion parameters,

Speaker 2 and then I pop that like same request into a model with like 400 billion parameters.

Speaker 2 That same request is going to use different amounts of energy based on the model that I'm using.

Speaker 1 Yes, and actually, we can move into, we don't even have to hypotheticalize here. I have real numbers for you.
Oh, nice.

Speaker 2 Okay.

Speaker 1 So the smallest Llama model that the team used, they fed in some prompts like "teach me about quantum computing" or "suggest some travel tips."

Speaker 1 And then they measured, you know, how much energy that used. The smallest model, when it spit out an answer, used on average 114 joules.

Speaker 2 Oh, great. Great.

Speaker 2 That's very helpful. Thank you.

Speaker 1 Are you being sarcastic? Do you want some other way to think about this?

Speaker 2 Yes, please give give me something more.

Speaker 1 Well, I didn't, you know, it was James and Casey. So they came up with something for some context.

Speaker 1 One thing they converted these energy units into is a fun new type of unit called microwave seconds. I love the microwave seconds unit.

Speaker 2 It's so much more relatable than joules or watt hours.

Speaker 2 Casey gets me.

Speaker 1 All right, so 114 joules is roughly a tenth of a second in a microwave.

Speaker 2 Okay, so that's one query, small model, a tenth of a second in the microwave.

Speaker 1 Yeah. Okay.
The biggest model, which was 50 times bigger,

Speaker 1 that was like zapping something in a microwave for eight seconds. Oh.

Speaker 2 So the biggest model, for one query, was still only eight seconds. Okay, that doesn't even get my rice remotely hot.

Speaker 1 Right. I mean, and after James and Casey published their article,

Speaker 1 frustratingly for them, OpenAI and Google did release a little bit of information on how much energy their text prompts use on average.

Speaker 1 And what they said suggests that a text query is equivalent to one or two seconds in the microwave.

Speaker 2 Okay.

Speaker 1 So basically what we can tell you is like a text prompt to a large language model is probably on the order of zapping something in the microwave for less than 10 seconds.

Speaker 2 Okay.

Speaker 1 And then for images.

Speaker 1 This is a different kind of machine learning, but it also uses a fair amount of energy. And I would have assumed that this image making thing is inherently more energy sucking than text making.

Speaker 1 But as it turns out, that is not necessarily the case. Here's James.

Speaker 3 If you have a really big, large language model that's generating text and answers, it may actually use more energy than generating an image.

Speaker 3 And that was kind of counterintuitive for me because, you know, you think about like these AI models that come up with fantastical images that we've all seen over the past few years.

Speaker 3 And it just seems like such an intense process to kind of create that from scratch.

Speaker 1 Yeah, because it always takes longer too than getting your text back.

Speaker 3 Yeah, exactly.

Speaker 3 But what we found was that if you have a really large text model, it has so many parameters, so it has so many knobs and dials that it actually can use up more energy than generating, you know, certain types of images.

Speaker 1 They found that making an image was like running a microwave for five and a half seconds.

Speaker 2 So like a big language model can be like eight seconds of microwave time. So that's a little less.

Speaker 1 Okay, right. I'm with you.
And you know, for all this stuff, if you don't like microwave time,

Speaker 1 you could also think about it in light bulb time. So it's like running an LED light bulb for somewhere between 10 seconds and two minutes.
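The microwave and light-bulb framings are just unit conversions: energy in joules divided by an appliance's power draw in watts gives seconds of runtime. A small sketch, using the 114-joule figure quoted above and assumed typical wattages (roughly 1,100 W for a microwave, 10 W for an LED bulb):

```python
# t = E / P: energy in joules divided by power draw in watts gives seconds of runtime.
def runtime_seconds(energy_joules: float, power_watts: float) -> float:
    return energy_joules / power_watts

MICROWAVE_W = 1100   # assumed typical microwave
LED_BULB_W = 10      # assumed typical LED bulb

query_joules = 114   # smallest Llama model, one text answer (figure quoted above)
print(f"{runtime_seconds(query_joules, MICROWAVE_W):.2f} s of microwave time")  # ~0.1 s
print(f"{runtime_seconds(query_joules, LED_BULB_W):.1f} s of LED bulb time")    # ~11 s
```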

Speaker 2 So I guess, Rose, what this maybe tells me is that if I wanted to make my Jeff Goldblum prom picture with AI,

Speaker 2 that is a slightly less energy intensive process than perhaps using AI to write romantic Jeff Goldblum fanfiction.

Speaker 1 Possibly if you use a really big model to write your Jeff Goldblum fan fiction.

Speaker 2 Obviously, I would need a very large model for this work, Rose.

Speaker 2 Okay, I've got it.

Speaker 1 And then there's video generation.

Speaker 1 This might not surprise you to hear that that used the most energy.

Speaker 1 So the team, what they did was they looked at an open source video generation model, and they just made like a crap video.

Speaker 1 It was 16 frames a second, five seconds long. They compared it to, like, the quality of a silent-film-era type film.

Speaker 1 Okay. And that one would be the equivalent of over an hour in the microwave. I don't think I've ever microwaved anything for an hour. I don't think I have either, um. A long time in the microwave, for sure. That one scares me more, because I'm seeing a lot of AI-generated videos out there.

Speaker 2 Yeah, that's what's huge right now and getting bigger, right? There's a ton of, what is it, Sora?

Speaker 1 Yeah, Sora is big right now. Yeah.
And they can look incredibly realistic, right? Right. We don't know if that's also using up as much energy.

Speaker 1 We asked OpenAI, which makes Sora, and they didn't give us any information on Sora's energy use. And James and Casey didn't want to speculate.
It's a different model.

Speaker 1 But it's probably using a fair amount of electricity. I think that's safe to assume.

Speaker 2 So, I mean, but, okay, so what we have so far is all about like individual use. Yeah.
But I want to, but what I want to know, though, is like, obviously,

Speaker 2 lots of us are doing this, lots of us are using this. Like, what is the impact if you like add it all up? If you like scoop up all the AI use that we're doing, like, what do we know about that?

Speaker 1 Right. And it's interesting, like, on an individual level, certainly like the texting and image generation stuff, they're not that crazy energy intensive.

Speaker 1 But that doesn't leave AI off the hook because when you do zoom out, and to answer your question, it really adds up fast because OpenAI says it receives two and a half billion prompts per day from people around the world.

Speaker 2 Wow.
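As a rough back-of-the-envelope scaling exercise (an editorial sketch, with assumed per-prompt energy and household figures, not numbers from the episode's sources): take those two and a half billion prompts a day and the "one or two seconds in the microwave" per text prompt mentioned earlier, and see what it adds up to.

```python
PROMPTS_PER_DAY = 2.5e9              # OpenAI's reported daily prompt volume
JOULES_PER_PROMPT = 1.5 * 1100       # assumed ~1.5 "microwave seconds" at 1,100 W
JOULES_PER_KWH = 3.6e6
KWH_PER_US_HOME_PER_YEAR = 10_500    # assumed rough U.S. household average

kwh_per_day = PROMPTS_PER_DAY * JOULES_PER_PROMPT / JOULES_PER_KWH
homes = kwh_per_day * 365 / KWH_PER_US_HOME_PER_YEAR

print(f"~{kwh_per_day / 1e3:,.0f} MWh per day for text prompts alone")  # ~1,150 MWh/day
print(f"~{homes:,.0f} U.S. homes' worth of annual electricity")         # ~40,000 homes
```

And that is only chat-style text prompts, not training runs, images, video, or everyone else's models, which is part of why the data-center totals discussed below are so much bigger.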

Speaker 1 And AI in general is getting integrated into all these institutions, which I think a lot of us are noticing.

Speaker 1 In fact, one survey of a variety of organizations around the world found that 78% of them are now using AI to some extent.

Speaker 2 That is a lot.

Speaker 1 And so nerds have looked at how much electricity is going to data centers to see if AI has made an impact. And they saw that from 2014 to 2023, the electricity consumption of data centers tripled.

Speaker 1 That's according to a report from the Lawrence Berkeley National Laboratory.

Speaker 2 Oh, wow. So like in this, and that's the AI period? Like that's like sort of

Speaker 1 the age of AI like taking off. Taking off.
And the energy suck is expected to keep sucking up more and more.

Speaker 1 One analysis predicts that by 2028, AI data centers will use as much electricity as a quarter of U.S. households use per year.

Speaker 2 Wow.

Speaker 1 Imagine adding 25%

Speaker 1 more households to the U.S. in 2028.
That's how much electricity these AI data centers are predicted to use. That doesn't sound good.

Speaker 1 Well, I mean, Casey, who's a climate reporter, she was like, you know, it's not the electricity per se that's the problem here.

Speaker 2 That's kind of the crucial thing that I like to bring up and really harp on is that, you know, if we had abundant solar and wind power and batteries, you know, we might be less concerned about some of this, this energy demand.

Speaker 2 But the reality is that grids around the world are still largely relying on fossil fuels. So it's not good.

Speaker 1 Right now in the U.S., only 9% of the country's power comes from renewable sources. It's still mostly fossil fuels that we use to power our electric grid.
A third of our energy comes from petroleum.

Speaker 1 A third comes from natural gas, which is another fossil fuel. Both are greenhouse gas emitters.
But what about coal?

Speaker 1 Coal is in the mix too. It's 8%.

Speaker 2 So just a lot of this energy is dirty.

Speaker 1 And of course, there are other countries with cleaner energy grids than the US. But more than half of the data centers for the world are here in the US.

Speaker 2 Well, and you know, I feel like the headlines, some of the headlines I've seen around this, Rose, have been like related to nuclear energy because there were headlines a while back that one of these companies was going to reopen Three Mile Island, which is this nuclear plant that was shut down because of an accident.

Speaker 2 And

Speaker 2 so there was talk of like that being reopened and like, you know, really a lot of these companies being very interested in what's going on with nuclear.

Speaker 2 So it does make me wonder, could nuclear help if we can get that ramped up?

Speaker 1 I asked Casey about that and she was like, the thing about nuclear reopening or building a new nuclear plant, it takes so long.

Speaker 1 The last nuclear plant that we built in the U.S. took 15 years to complete.
Yeah. And companies are just not going to wait for that to happen.
And they're not. They're not waiting for it.

Speaker 1 I mean, look at XAI.

Speaker 1 They brought in gas-burning generators to run their data center in Tennessee.

Speaker 2 Right. Okay.

Speaker 2 Oh, okay. Well, that sucks.

Speaker 1 Yeah, so I reached out to XAI and I didn't hear back.

Speaker 1 I also contacted Google and Anthropic just to ask about all this stuff that we've been talking about. I didn't get answers from them by our deadline.

Speaker 1 OpenAI did get back to me.

Speaker 1 They mostly pointed me to stuff that's already publicly available, open letters and blog posts, that kind of thing, talking about their energy use and how they see that in the future.

Speaker 1 And basically, what OpenAI is saying is that they want to work with the government to add capacity to the grid.

Speaker 1 And they say that they want that energy to come from all kinds of sources, including renewables.

Speaker 2 Uh-huh. Okay.

Speaker 1 And just overall, I will say there might be some changes coming for the positive.

Speaker 1 So the energy that AI requires to answer your query or make your image or your video, that could be going down because a lot of the tech companies are trying to make their models more efficient.

Speaker 1 One way they're doing that is by turning off some of the parameters that we talked about earlier when they don't necessarily need them to answer a particular question or do a task.

Speaker 1 So that's like shrinking the switchboard essentially as needed.
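One common way that "turning off knobs" gets implemented is mixture-of-experts routing, where a small gate picks a couple of expert sub-networks for each input and the rest sit idle. Here's a toy sketch of the idea in PyTorch; it's illustrative only, not any particular company's implementation.

```python
import torch
import torch.nn as nn

class TinyMoELayer(nn.Module):
    """Toy mixture-of-experts layer: route each input to only top_k of the experts."""

    def __init__(self, dim: int = 256, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.experts = nn.ModuleList([nn.Linear(dim, dim) for _ in range(num_experts)])
        self.gate = nn.Linear(dim, num_experts)  # tiny router that scores the experts
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (batch, dim)
        scores = self.gate(x)                              # (batch, num_experts)
        weights, picked = scores.topk(self.top_k, dim=-1)  # keep only the top_k experts
        weights = weights.softmax(dim=-1)
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = picked[:, slot] == e                # inputs routed to expert e
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(1) * expert(x[mask])
        return out  # for each input, all the non-picked experts stayed idle

layer = TinyMoELayer()
print(layer(torch.randn(4, 256)).shape)  # torch.Size([4, 256])
```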

Speaker 2 So there is some evidence that the tech companies are, like, trying to adjust to make this better.

Speaker 1 Yes, it might get better.

Speaker 2 Okay, so that's energy, Rose, but

Speaker 2 I know there's another piece to this. Yeah.

Speaker 2 What about water? What is going on with water? Right.

Speaker 1 So we're going to talk about that after the break.

Speaker 2 This episode is brought to you by ServiceNow. AI is only as powerful as the platform it's built into.
That's why it's no surprise that more than 85% of the Fortune 500 use the ServiceNow AI platform.

Speaker 2 While other platforms duct tape tools together, ServiceNow seamlessly unifies people, data, workflows, and AI connecting every corner of your business.

Speaker 2 And with AI agents working together autonomously, anyone in any department can focus on the work that matters most. Learn how ServiceNow puts AI to work for people at servicenow.com.

Speaker 2 This episode is brought to you by Ford Blue Cruise. So, some of the most unexpected, unforgettable moments happen in the car.

Speaker 2 A life-changing decision, a quiet, I'm proud of you, and the kind of conversation that can only happen when you're both looking straight ahead.

Speaker 2 And Ford Blue Cruise is about making space for those moments. Because with hands-free highway driving, you can be more present in the vehicle and more connected to the moments that truly matter.

Speaker 2 Like this trip I took recently with some really close friends. We were all ready to go.

Speaker 2 We had our songs, we had our snacks, we had our suitcases, but it also turned out that each of us had some things we were working through.

Speaker 2 Maybe it was a health thing, a relationship thing, a work thing, and we all ended up bringing up like what was going on with us kind of one by one.

Speaker 2 You know, the other people in the car would ask questions, they would offer advice where it made sense. And we ended up spending hours like this on the road.

Speaker 2 It was almost like a group therapy session. Very impromptu.
And by the time we got to our destination, we realized that we hadn't listened to a single song on the playlist we'd made.

Speaker 2 We'd all been too busy, just talking.

Speaker 2 Hands free on the highway, fully present in the moment. That's Blue Cruise.
Consumer Reports' top-rated active driving assistance system. Visit ford.com slash Blue Cruise to learn more.

Speaker 2 Available driver assist feature. Does not replace safe driving or driver's need to control the vehicle.
Terms apply. Consumer Reports does not endorse products or services.

Speaker 2 To read the full report and for additional details, visit www.ford.com/bluecruise.

Speaker 2 Think your lashes have hit their limit? Discover limitless length and full volume with Maybelline Sky High Mascara.

Speaker 2 The Flex Tower brush bends to volumize and extend every single lash from root to tip. And the lightweight bamboo infused formula makes lashes feel weightless.

Speaker 2 Now in eight bold shades so you can take your lashes to new heights every day. Visit Maybelline.com to shop Sky High Mascara Now.

Speaker 2 Sparkle throughout the night with Born in Roma fragrances by Valentino Beauty. Each bottle holds the energy of Rome after dark.

Speaker 2 Donna Born in Roma blends luxurious jasmine with rich, creamy vanilla, creating a sensual and vibrant signature scent.

Speaker 2 Uomo Born in Roma fuses aromatic sage and smoked vetiver, leaving a lasting impression that lingers well into the early hours. Shop Born in Roma by Valentino Beauty, now at Ulta.

Speaker 1 Welcome back. I'm Rose Rimler.
I'm here with Blythe Terrell. Hello.
So let's talk about water and AI. And if you want to talk about water and AI, you call up Shaolei Ren.

Speaker 1 He is a professor at UC Riverside. He's actually in the computer engineering department, but he focuses on sustainability.

Speaker 1 Most people in his field look at, you know, energy, greenhouse gases, like we were just talking about, but Shaolei has kind of forged his own path, because he's thought about conserving water for a lot of his life.

Speaker 2 I spent my first few years in a small town back in China.

Speaker 2 We just had access to fresh water, drinking water, for half an hour each day. During that half hour, we had to use a big bucket to collect the water and use it for the rest of the day.

Speaker 2 So in my memory, I just never thought water is something unlimited. It's just

Speaker 2 a finite resource.

Speaker 2 You've never taken it for granted. Right.

Speaker 1 And so one reason AI uses a lot of water is something that you've probably heard before.

Speaker 1 The data centers get really hot because they're running all these fancy chips doing all this computation like we were talking about earlier.

Speaker 1 And so these buildings, they often use a cooling tower that uses water to cool everything down.

Speaker 2 Just like our human bodies, we sweat and we feel cooler. For data center, if you use water evaporation, you can take away the heat very naturally, very efficiently.

Speaker 1 And where do they get that water from?

Speaker 2 Most typically, it's from the municipal water infrastructure system.

Speaker 1 So the same as where I'd get water if I lived there and turned on my tap. Yeah.

Speaker 2 So they get water where everyone gets water from the faucet, basically.

Speaker 1 And the reason for that is they want like clean, filtered water because if there was salt or minerals or like gunk in it, then it could gum up this system basically.

Speaker 2 Okay.

Speaker 1 So as the water cools the data centers, it you know it evaporates away. It evaporates.
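A bit of physics behind why evaporation is such an efficient way to shed heat: turning liquid water into vapor absorbs roughly 2.45 million joules per kilogram at everyday temperatures. A rough sketch (illustrative, not from the episode) of how much water a cooling tower would have to evaporate to carry away one kilowatt-hour of server heat:

```python
LATENT_HEAT_J_PER_KG = 2.45e6   # energy needed to evaporate 1 kg of water at ~25 C
JOULES_PER_KWH = 3.6e6

liters_per_kwh_of_heat = JOULES_PER_KWH / LATENT_HEAT_J_PER_KG  # 1 kg of water ~ 1 L
print(f"~{liters_per_kwh_of_heat:.1f} L evaporated per kWh of heat removed")  # ~1.5 L
```

Real cooling systems won't match that idealized figure exactly, but it shows the scale involved: every kilowatt-hour of computing heat that's removed by evaporation costs on the order of a liter of water.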

Speaker 1 And if I remember my like kindergarten, you know, the water cycle, when water evaporates, it eventually comes back as rain, right?

Speaker 2 So why do we need to worry about this? So the evaporated water, yeah, it still stays within our global water cycle system.

Speaker 2 It doesn't go away from the Earth. But still, when the water will be coming back and where it will be coming back, that's highly uncertain.
And it's very unevenly distributed across the globe. So

Speaker 2 due to the

Speaker 2 long-term climate change, we're seeing more and more uneven distribution of the water resources. So essentially, the wetter regions are getting wetter and the drier regions are getting drier.

Speaker 1 So even if the water is evaporated in, say, Arizona, that doesn't mean it'll come back as rain in Arizona, at least not anytime soon.

Speaker 2 You're correct. Okay.

Speaker 2 Okay, so the argument is it's using a bunch of water, it's drawing it out from where everyone else is getting their water, and it's not necessarily going to be replenished that easily.

Speaker 1 Well, yeah, yeah, it's going to evaporate the drinking water in Tucson, and that water might next show up as a flood in... Shanghai.

Speaker 2 Right, okay.

Speaker 1 So let's talk about how much water is actually getting used here.

Speaker 1 Shaolei and his team, they went down this rabbit hole fairly recently and they published a paper that kind of went viral. In fact, a lot of people turned their results into a meme.

Speaker 1 Basically it's saying that every time you use AI, they'll say in different ways, like every time you chat with ChatGPT, every time you write an email with AI, you're consuming a bottle of water.

Speaker 1 Have you seen this, Blythe?

Speaker 2 Yes, yes. This was one of the memes I first saw and shared without evidence.

Speaker 1 I've seen videos of people filming themselves like with a nice, beautiful, fresh bottle of water from the store, opening it up and like pouring it down the drain and saying like, this is what you're doing when you use AI or someone will be like dressed up and pretending to be AI, like dressed as a robot, and they're just like guzzling water.

Speaker 1 But that's not quite accurate.

Speaker 2 Yeah. So that's, yeah, that's a distortion of the message that we show you in the paper. A distortion.

Speaker 1 Here's what they actually found.

Speaker 1 So they found that if you have a back and forth conversation with, in this case, the model they looked at was GPT-3. That's a slightly older model, but if you have a back and forth with GPT-3,

Speaker 1 medium-length messages, if you go back and forth for on average about 30 times, that uses up essentially the volume of a bottle of water, a half liter of water.

Speaker 2 Okay. So it's like a, it's a decent conversation that gets you to that half liter.

Speaker 1 Yeah, and that's where the meme comes from.

Speaker 1 So it's not super duper wrong, but what they're getting wrong or misunderstanding is that the fresh drinking water that's used to cool the data center, that's actually only a small part of this calculation.

Speaker 1 So out of this half liter of water that we're talking about, only about 12% of it is drinking water that's used directly by the data center for cooling.

Speaker 2 Oh.

Speaker 1 And the rest of it is non-potable water from elsewhere. It's drawn out of rivers, lakes, whatever.
It's used in the process of making electricity.

Speaker 1 So that brings us back again to the power plants, you know, that old chestnut.
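Putting numbers on that split, using the figures above (a half liter per roughly 30-message conversation, with about 12% of it being drinking water evaporated on site):

```python
TOTAL_ML = 500          # roughly a bottle of water per ~30-message conversation
ON_SITE_SHARE = 0.12    # share that is potable water evaporated at the data center itself

on_site_ml = TOTAL_ML * ON_SITE_SHARE
off_site_ml = TOTAL_ML - on_site_ml
print(f"cooling the servers: ~{on_site_ml:.0f} mL")          # ~60 mL
print(f"generating the electricity: ~{off_site_ml:.0f} mL")  # ~440 mL
```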

Speaker 2 Okay, but wait. So it's talking, so some of this is drinking water, but some of this is like.

Speaker 2 But most of it is not. But most of it's not.
But, I mean, but still, like, that water in the environment could eventually become drinking water, right?

Speaker 2 So like, why does, so why does that distinction actually really matter? Well,

Speaker 1 If you think the data center moving into your town is a threat because it's going to turn on a bigger tap than yours, that's not quite right. And I asked Shaolei about that.

Speaker 1 Do you think it's possible that a town will accept a data center and it uses up all the town's water, essentially? Like you live next door to a data center, you turn your tap and no water comes out.

Speaker 2 I think in certain towns, it could be possible, but in most of the towns, I think the U.S. infrastructure tends to be,

Speaker 2 at least for the water infrastructure,

Speaker 2 they should be able to have the capacity available for data centers.

Speaker 1 He said that the biggest problems here might be likely to happen in really small towns with really old or limited water infrastructure.

Speaker 1 Okay. So when I see people talking about how data centers are using up water, I think like we might be ignoring the bigger issue here, which is the water used by power plants.

Speaker 1 And by the way, if we had more wind and solar on the grid, the water use would go down.

Speaker 1 But anyway, as of right now, overall, taking into account the water used by power plants and the water used for cooling, we know that data centers consume 0.3% of the nation's water supply.

Speaker 1 I asked Shaolei about this. I don't know what to make of that.
Is that a lot or is that a little? 0.3%.

Speaker 2 So it's roughly the same amount as the

Speaker 2 total public water supply in Rhode Island. So whether this 0.3% is high or not,

Speaker 2 I would say it's modest.

Speaker 2 It's not that much.

Speaker 1 Brings up the question, should we be letting Rhode Island use all that water? I mean, what has Rhode Island done for anyone else lately?

Speaker 2 Finally, the podcast is getting around to that question, which I've also had for years. What is the point of Rhode Island?

Speaker 1 Yeah, I mean, and the water used for the data centers for the power generation and cooling is projected to go up. It's actually expected to double in the next few years.

Speaker 1 But ultimately, Shaolei and another expert I spoke to said that whether or not this becomes a problem is a regional question.

Speaker 1 It makes more sense to be granular about this. Like, is the water being taken from an area that doesn't have the capacity? You just can't paint with a broad brush here.

Speaker 2 So

Speaker 2 complicated, I guess,

Speaker 2 is where we so often land.

Speaker 2 Okay, so taking all this together, Rose,

Speaker 2 where do you land? Like, how, how evil is AI when it comes to the environment?

Speaker 1 I asked all of our guests basically that same question. I kind of put it in terms of like, well, do you personally use AI knowing about all these environmental impacts?

Speaker 1 Because these are people who all care a lot about the environment and these issues. And all of them, Casey, Shaolei, James, they all said that, yes, they do still use AI.

Speaker 2 I'm awful at planning trips. So asking for an itinerary for going on a road trip or something, that's, I found that that's really helpful.

Speaker 2 I use it to polish my text writing, to help me answer some questions. And also my students use AI to generate paper summaries.

Speaker 2 So I've used AI for technical things, like how to do certain repairs on my bike.

Speaker 2 But I've also used it for seeing what people have said on a certain topic, like hikes in New England with the best views.

Speaker 1 But everybody agreed that we should be thoughtful about how we use it, given this energy and water requirement as well.

Speaker 2 So it's annoying, because part of me is like, you know, the companies that make this, and the companies that are using it for the products and services I'm using, they're the ones whose AI use I want to think about. I want them to be thinking about whether they really need to use this or not.

Speaker 2 And I want them to be thinking about that in the context of energy use, water use, climate change, right? Like, that's my dream.

Speaker 1 Yes, it's on the companies, it's on the government. I mean, I think

Speaker 1 my takeaway here is that, like, I'm not sure AI is the villain.

Speaker 1 I think the villain is our reprehensible and baffling inability to switch to renewable energy and to put any kind of real effort into getting off of fossil fuels. Right.
It's the same enemy we've been fighting for 50 years or whatever.

Speaker 2 Right. Right.

Speaker 1 Also, I think that one reason AI is getting people riled up as opposed to like those old climate offenders flying, eating meat, you know, that kind of thing is people see the value.

Speaker 1 in the trade-off of the environmental impact of something like taking a flight or eating a burger.

Speaker 1 There's an obvious benefit to those things. With AI, yes, some people have found it really useful, but a lot of people haven't, and they just don't think it has much value at all.

Speaker 1 In fact, one survey found that 61% of people in the U.S. think that AI has more drawbacks than it has benefits.

Speaker 2 Okay, so more than half of us are just like, no, no, no, we hate this shit. Yeah.

Speaker 2 No, thank you. Exactly.
Okay.

Speaker 1 I think that's one reason AI is our current villain.

Speaker 1 When, in fact, I think the villain is, I think that's like a nostril on the larger villain, which is the evil monster that is keeping us glued to fossil fuels.

Speaker 2 Right.

Speaker 2 The nostril. Okay, I

Speaker 2 appreciate that picture.

Speaker 2 Okay, so I do want to know one last thing though. Has learning this and digging into all of this AI and energy and water stuff, has it changed how you use AI?

Speaker 1 Yeah, a little bit.

Speaker 1 Also, I think the novelty is wearing off a bit. And I was never like using it a ton, but

Speaker 1 I don't know. I was asking people about like, what's some stupid stuff that you've seen generated by AI? And you're like, oh my God, that wasn't worth the energy.

Speaker 1 And I thought of my own playing around with it. And I was like, remember that time I had AI generate an image of

Speaker 1 my boyfriend cuddling with my cat? Because my cat doesn't like him. So I was like, oh, this is what it would be like if you guys got along, you know, and I sent it to him.

Speaker 1 And I was like, I don't think I would do that again. I don't think that was worth the energy.

Speaker 2 So it has changed a little bit how you would make that value assessment, kind of.

Speaker 2 Is using AI for this thing going to like add value? Is it actually really useful for this?

Speaker 1 Or... Or could I just glue salmon to his fingers? And then the cat would actually maybe come over.

Speaker 2 Yeah, you know, yes, yes, Rose.

Speaker 1 Let's go back to

Speaker 1 the basics.

Speaker 2 Let's go back to the basics of gluing salmon to our boyfriend's fingers to get our cat to like him. Yep.

Speaker 1 That's Science Versus.

Speaker 2 Thanks, Blythe. Thanks, Rose.

Speaker 2 Oh, and while we're here, how many citations are in this week's episode?

Speaker 1 There are 66 citations.

Speaker 2 Where can people find them?

Speaker 1 They can find them in our transcript. The link to the transcript is in our show notes.

Speaker 1 Also, in our show notes, we'll put a link to the article that James and Casey wrote for MIT Technology Review. It's really good.
People should go read it.

Speaker 1 And people should also check out our Instagram. We've got some interesting stuff there.
Maybe even a little Jeff Goldblum content for you.

Speaker 1 Give the people what they want.

Speaker 2 Exactly.

Speaker 2 Great. Love it.

Speaker 1 This episode was produced by Rose Rimler and Blythe Terrell with help from Meryl Horn and Michelle Dang. We're edited by Blythe Terrell.
Fact-checking by Diane Kelly.

Speaker 1 Mix and sound design by Bobby Lord. Music written by Emma Munger, So Wylie, Peter Leonard, Bumi Hidaka, and Bobby Lord.

Speaker 1 Thanks to all the researchers we reached out to, including Professor Melissa Scanlan, and special thanks to Andrew Pouliot and Jesse Rimler. Science Versus is a Spotify Studios original.

Speaker 1 Listen for free on Spotify or wherever you get your podcasts. Follow us and tap the bell for new episode notifications.
We'll fact you soon.