Can we afford to power AI?
In the last decade, the number of data centres worldwide has doubled. As the tech bros spend big to render AI an irreplaceable part of our everyday lives, the energy and water required are truly eye-watering.
Nvidia, Google, Microsoft and Meta have already seen their cumulative emissions rise by 72% in the last five years, and given the direction AI is headed, it's likely that our AI footprint today is the smallest it will ever be. Hundreds of millions of people are using AI every day, and that number is growing fast. But is a world with widely available AI actually sustainable?
Follow If You're Listening on the ABC Listen app.
Check out our series on YouTube: https://www.youtube.com/playlist?list=PLDTPrMoGHssAfgMMS3L5LpLNFMNp1U_Nq
Press play and read along
Transcript
Speaker 1 ABC Listen, podcasts, radio, news, music, and more.
Speaker 1 Hi, it's Sam Hawley from ABC News Daily, the podcast that brings you one big story affecting your world each weekday in just 15 minutes.
Speaker 5 I don't think Kamala Harris, any more than Hillary Clinton, has found a way to talk about the MAGA movement, what it means and what it represents, and to deal with that essential problem for the Democrats. Join me for ABC News Daily.
Speaker 1 Find it on the ABC Listen app.
Speaker 2 This podcast was produced on the lands of the Awabakal and Gadigal people.
Speaker 2 There were few more effective pieces of government propaganda in the 1990s than this.
Speaker 6 Leaving the tap running when you clean your teeth wastes up to five litres of water.
Speaker 2 If you were a student at Sacred Heart Primary School in Brisbane in the early 90s, you might remember a strange visit from the Queensland Premier Wayne Goss and an anthropomorphic water drop.
Speaker 8 The Waterwise program has been running since 1992, but today the Premier refreshed it by launching a pointy-headed drip named Mr Wizzy.
Speaker 2 The Premier was there to deputise the children as junior spies.
Speaker 10 I want you to be my water police to make sure not just you, but your brothers and sisters and your parents turn off the tap while they're brushing their teeth.
Speaker 2 I'd be surprised if you could find a 90s kid anywhere in Australia who leaves the tap on while brushing their teeth.
Speaker 2 This particular act of water conservation is very heavily ingrained in the national psyche, and so are various methods of saving electricity.
Speaker 12 Finding simple ways to reduce the amount of power we use, such as turning appliances off at the wall, can make a big difference.
Speaker 2 We've all been spending years trying to do the right thing, and now it kind of feels like the big tech companies are throwing it all out the window to power the AI boom.
Speaker 13 A data center campus like this can consume as much power and water as an entire city.
Speaker 4 20% of the water in this dam is consumed over the course of one year by ChatGPT alone.
Speaker 2 The industry is getting quite large now.
Speaker 2 In the second half of 2025, the tech sector is responsible for a third of the value of the S&P 500 index, something that hasn't happened since the dot-com bubble burst 25 years ago.
Speaker 3 Wall Street has been caught up in AI mania as investors bet on the promise of new products.
Speaker 14 In 2024, 1.25 trillion US dollars was spent building out the infrastructure for AI, including buying chips, building data centers and the power plants to power those data centers.
Speaker 2 So I want to get a sense of the scale of the resources used by all these things.
Speaker 2 When we use generative AI, something like ChatGPT, is it like leaving the tap on while we're brushing our teeth, or is it more like running an air conditioner on full blast with all the windows open?
Speaker 2 Or is it even worse? And with the market betting on AI to keep growing in size for years to come, is any of this sustainable? I'm Matt Bevan, and this is If You're Listening.
Speaker 2 Okay,
Speaker 2 to talk about this properly, you've got to understand how generative AI and neural networks function. This stuff is obviously complicated.
Speaker 2 The people who truly understand these things are paid literally hundreds of millions of dollars these days, which is why we invented metaphors.
Speaker 2 Think about a computer like this show. You might think this is how I sound naturally, but when I record, it actually sounds like this.
Speaker 15 So this is raw audio, which is exported and sent to our producer Adair who does a lot of tasks in succession to get me sounding schmick.
Speaker 15 First she cleans up the background noise.
Speaker 2 Despite these walls being pretty thick they're not thick enough to fully block out the sound of my kids.
Speaker 2 Then the audio is levelled and normalised, which basically evens out all the loud and quiet parts. Next up are EQ and compression, which make me sound like this, and then of course there's the music.
Speaker 2 Choosing the right music can take a very long time, but eventually we get there.
Speaker 2 This whole process takes Adair about a day. And if you asked her if she could do it all in 10 seconds, obviously the answer would be no, that's physically impossible.
Speaker 2 All of the jobs need to be done in that order to get the show sounding right. And doing tasks one by one in a specific order is how a traditional computer chip works using what's called a CPU.
Speaker 16
The CPU is what runs a computer. It's called the central processing unit.
That's where the brains of the computer are.
Speaker 16 It controls the transformation of data or the processing of data from one place to another.
Speaker 2 CPUs do complicated operations very quickly but still sequentially, one at a time. Now this is great for most things you do on a computer.
Speaker 6 Word processing skills can be picked up in a fraction of the time and even a relative novice can produce professional looking paperwork.
Speaker 17 Email is just like sending messages by typing them and putting them in an envelope. The difference is that it's an electronic envelope that disappears out the back of your computer.
Speaker 6 Cruising the internet you'll find items for sale. Photos, video pictures,
Speaker 6 even home movies of cooking hamburgers.
Speaker 2 But it's not so great for doing things that require a huge amount of complicated visuals to be generated simultaneously. For example, every modern video game.
Speaker 18 Certainly there is no more demanding application for a computer than today's graphics intensive games.
Speaker 2 Video games use what's called a GPU or a graphics processing unit which works by breaking down complicated operations into thousands of less complicated operations and doing them all at once.
Speaker 18 So this is all being generated and driven right now and
Speaker 2 So if you apply that to this show, it's basically like, rather than have one producer who has to do all the tasks, you get a hundred producers, each of whom has one tiny task that they all do simultaneously.
Speaker 2 So rather than the show taking a whole day to mix, now it can be done in a matter of seconds. And crucially, each one of those hundred producers has no idea what anybody else is doing; they're just doing their own job.
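If you want to see that difference in miniature, here is a rough Python sketch (my illustration, not anything the show or a GPU actually runs): the same hundred little jobs done one at a time, versus handed to a hundred workers at once. A thread pool is only a loose stand-in for a GPU, but the timing gap makes the point.

import time
from concurrent.futures import ThreadPoolExecutor

def process_segment(segment_id):
    """Stand-in for one small, independent job (e.g. cleaning one slice of audio)."""
    time.sleep(0.01)  # pretend this takes a little while
    return f"segment {segment_id} done"

segments = range(100)

# CPU-style: one worker does every job in order.
start = time.perf_counter()
sequential = [process_segment(s) for s in segments]
print(f"sequential: {time.perf_counter() - start:.2f}s")

# GPU-style (loosely): many workers each do one small job at the same time.
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=100) as pool:
    parallel = list(pool.map(process_segment, segments))
print(f"parallel:   {time.perf_counter() - start:.2f}s")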
Speaker 2 Now, the ABC tells me they do not have the budget for 100 producers on this show, so we're sticking with Adair the CPU Shepherd, as I have just started calling her right now.
Speaker 2 CPUs might take a bit longer, but they can do way more complicated jobs. That meant that for most of computer history, CPUs were way more lucrative than GPUs.
Speaker 2 GPU companies like Nvidia lived in the shadow of their CPU-making rivals.
Speaker 2 But the interesting thing is, lots of small simultaneous operations isn't just the best way to render the Super Mario Bros.
Speaker 2 and Lara Croft.
Speaker 2 It's also how human brains work.
Speaker 19 The human mind, the human soul, is a computer program being run on a wet computer called the human brain.
Speaker 2
What a very upsetting way of putting it. But you get Hannibal Lecter's point.
Lots of little neurons doing very small jobs, which all add up to one very complicated task. A very wet, complicated task.
Speaker 9 We know electrical impulses are passed along neural pathways as we think and send messages to the rest of our body.
Speaker 9 So one branch of AI has replicated this system with a series of neural networks or on-off switches that mimic the brain's electrical activity.
Speaker 9 It's hoped that these systems will be able to learn just like a child.
Speaker 2 Research into neural networks has been going on for decades with scientists hoping to find ways for computers to solve problems for themselves with experiments like this little robot called Stumpy.
Speaker 9 Stumpy has been given a task to walk. He knows his end goal but has to teach himself how to get there by trial and error in much the same way as a toddler would learn.
Speaker 2 When we talk about AI training in a modern context, this is what we mean. It's Stumpy figuring out how to walk through trial and error, but on a massive scale.
Speaker 2 All that trial and error takes a really long time if you're doing things sequentially on a CPU.
Speaker 2 Eventually, researchers realized that running these programs on GPUs, graphics cards, where thousands of tasks were being done simultaneously, made them run much faster.
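For a feel of what "trial and error" means here, this is a toy Python sketch (my illustration, not how Stumpy or any real model was trained): make small random changes to some parameters and keep only the ones that improve a score. Real training uses gradient maths rather than blind guessing, and runs on GPUs at an enormously larger scale.

import random

def score(parameters):
    """Pretend 'how far did Stumpy walk' score: best when both values are near 1."""
    return -((parameters[0] - 1) ** 2 + (parameters[1] - 1) ** 2)

params = [0.0, 0.0]               # start off knowing nothing
best = score(params)

for step in range(10_000):
    candidate = [p + random.gauss(0, 0.1) for p in params]
    if score(candidate) > best:   # only keep changes that help
        params, best = candidate, score(candidate)

print(params)   # ends up close to [1, 1] after enough trial and error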
Speaker 2 At the beginning of 2022, just before this AI boom kicked off, NVIDIA, the GPU company, was making about $10 billion a year, or 40% of its revenue, selling GPUs to companies wanting to use them in building neural networks to train AI programs and mining cryptocurrency.
Speaker 2 Then, in March that year, NVIDIA CEO Jensen Huang made an announcement.
Speaker 18 Introducing NVIDIA H100.
Speaker 20 The H100 is a massive 80 billion transistor chip using the TSMC 4N process.
Speaker 2 This announcement of the H100 Hopper GPU chip was absolutely impenetrable for most people.
Speaker 20 Hopper H100's four petaflops of FP8 is an amazing six times the performance of Ampere A100's FP16.
Speaker 2 What, four petaflops, you say?
Speaker 20 Two petaflops of FP16.
Speaker 20 One petaflops of TF32.
Speaker 20 60 teraflops of FP64 and FP32.
Speaker 2 Very exciting! But petaflops of TF32 aside, the key part of the announcement was that the Hopper chip had been designed specifically to make training AI models faster, particularly transformer models.
Speaker 2 The T in ChatGPT stands for transformer, by the way.
Speaker 20
The transformer is unquestionably the most important deep learning model invented. Hopper introduces a transformer engine.
Transformer model training can be reduced from weeks to days.
Speaker 2 Now, this announcement that barely makes sense to most people was a very big deal in the tech industry.
Speaker 2 The big deal attitude can best be understood through a story of a fancy dinner that was happening in Silicon Valley soon afterwards.
Speaker 7 I went to dinner with Elon Musk, at Nobu, Palo Alto. I went to dinner with Elon Musk, Jensen Huang, and
Speaker 7 I went to dinner.
Speaker 2 This is billionaire Larry Ellison, founder of tech company Oracle. The restaurant he's talking about, Nobu, is a fancy Japanese restaurant in a hotel he owns in downtown Palo Alto.
Speaker 7 And I would describe the dinner as me and Elon begging Jensen for GPUs.
Speaker 7 Please take our money.
Speaker 2 It seems like Musk and Ellison were desperate.
Speaker 7 Please take our money.
Speaker 7 You're not taking enough that we need you to take more of our money, please. It went okay.
Speaker 2 It worked.
Speaker 2 The reason they were so desperate is that both of them and several other big tech company CEOs are in an arms race to create the best AI program, which requires an immense amount of processing power.
Speaker 7 The desire to build the most capable neural network in the world, getting there first, is a big deal.
Speaker 2
See? Big deal. This race, and the demand for Hopper chips, has turned NVIDIA into the world's largest company by market cap.
At the end of October 2025, it became the world's first $5 trillion company.
Speaker 2 It now makes 90% of its income from selling GPUs to AI companies.
Speaker 2 But this isn't just a race for chips. There are a lot of other resources required to create the best generative AI as well.
Speaker 2 Because the thing is, as anyone who has ever tried to play a newly released game on an old computer will know, GPUs get real hot when they're working hard.
Speaker 2 You don't just feel it. You can hear it, as that little fan inside your PC spins so fast it sounds like it's about to take off.
Speaker 2 Running a lot of GPUs in close proximity (they work faster if you put them close together) uses a huge amount of energy.
Speaker 7 We're building data centers, I mean, my God, we're building nuclear reactors. Are you kidding me? That sounds completely made up, but it's not.
Speaker 7 You need a lot of power to power, I mean, it's acres of these GPU clusters.
Speaker 2 And all those clusters create a lot of heat, which needs to be removed.
Speaker 11 There's so many fans and pumping equipment keeping the systems operating and cool, they generate an awful lot of noise.
Speaker 2 These data centers are vast multi-story monoliths doing,
Speaker 2 I mean, at least 60 teraflops of TF32 every second or something like that. And fans aren't enough to keep them cool.
Speaker 22 These pipes are pushing cool water around the system to prevent overheating.
Speaker 2 But how much energy and water are we talking about here?
Speaker 2 So I've got an energy meter attached to my pool pump, which has measured my usage throughout October.
Speaker 2 It's been uncommonly high because we did some landscaping and the pool got filthy and I've been cleaning it all out.
Speaker 2 So as a result, in October, the pool pump was responsible for about half of our household electricity usage. And by the way, don't forget, we've got solar panels, so we're generating more than we use.
Speaker 2 But my pump is a one kilowatt pump, so running it for an hour consumes one kilowatt hour of electricity.
Speaker 2 I've been running it for an average of 8.36 hours a day, so that's 8.36 kilowatt hours every day.
Speaker 2 According to data provided by OpenAI boss Sam Altman, every time someone asks ChatGPT a question, it uses the energy equivalent of 1.23 seconds of my pool pump, which sounds like this.
Speaker 2 So one day of pumping is equivalent to about 25,000 prompts. But obviously, text-based conversations aren't all we've got ChatGPT and other AI tools working on.
Speaker 2 Generating an image, well, that's about five seconds of pool pump, according to the International Energy Agency's latest report.
Speaker 2 Then of course there's video. The IEA's report says a video uses the equivalent of seven minutes of pool pump, but don't worry, we won't subject you to seven minutes of pool pump.
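If you want to check the arithmetic behind the pool pump comparison, here it is as a few lines of Python. The 0.34 watt-hours per prompt is the figure Sam Altman has published; everything else is just unit conversion on the episode's own numbers, so treat it as a rough sketch built on self-reported data.

PUMP_POWER_KW = 1.0          # 1 kilowatt pool pump
PUMP_HOURS_PER_DAY = 8.36    # average daily runtime in October
WH_PER_PROMPT = 0.34         # Altman's published energy per ChatGPT prompt

daily_pump_wh = PUMP_POWER_KW * 1000 * PUMP_HOURS_PER_DAY            # 8,360 Wh per day
pump_seconds_per_prompt = WH_PER_PROMPT / (PUMP_POWER_KW * 1000) * 3600
prompts_per_pump_day = daily_pump_wh / WH_PER_PROMPT

print(f"{pump_seconds_per_prompt:.2f} seconds of pool pump per prompt")   # ~1.2 seconds
print(f"{prompts_per_pump_day:,.0f} prompts per day of pumping")          # ~24,600, i.e. about 25,000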
Speaker 2 And the thing is, it's probably actually way more than that because they were measuring the usage of an unbelievably lo-fi video platform.
Speaker 2 Sam Altman's company just launched a new video platform called Sora 2, which generates three minute long, high resolution videos based on AI prompts.
Speaker 2 Sam's little blog post didn't tell us how much energy that is using, but it's going to be quite a lot.
Speaker 2 But what about water? Well, again, according to Sam Altman, each prompt uses 1/15th of a teaspoon of water, and that's such a small amount, it's kind of impossible to imagine.
Speaker 2 But if you imagine the 25,000 prompts that equate to a day of pumping, it's seven and a half litres. So less than you use to wash the dishes in your kitchen sink.
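And the water maths, under the same caveats. A metric teaspoon is 5 millilitres, so a fifteenth of one is about a third of a millilitre; exactly how many litres you land on depends on how you round that and the prompt count.

ML_PER_TEASPOON = 5.0                  # a metric teaspoon
ML_PER_PROMPT = ML_PER_TEASPOON / 15   # ~0.33 mL, Altman's "1/15th of a teaspoon"
PROMPTS = 25_000                       # one pool-pump day of prompts, from above

litres = ML_PER_PROMPT * PROMPTS / 1000
print(f"about {litres:.1f} litres")    # ~8 litres, the same ballpark as the episode's seven and a half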
Speaker 2 If you're worried that your personal usage of AI is destroying the environment, you can rest assured that it isn't.
Speaker 2 You're doing a hundred other things every day that use more water and electricity than AI.
Speaker 2 Even if we ignore everything except the highest estimates for AI water and electricity consumption, it's not a serious issue at the moment.
Speaker 2 In the United States in 2024, all data centers combined, AI and non-AI, used roughly the same amount of water as was used to grow American lettuce.
Speaker 2 All of those data centers used less than half as much water as the US golf industry.
Speaker 2 But now it's time for caveats, and there's a lot of them.
Speaker 2 For one thing, while I almost exclusively run my pool pump using solar power, AI data centers operate day and night, and they use fossil fuels to make up gaps in renewable and nuclear energy supply.
Speaker 2 Oh yeah, they're buying nuclear power stations, by the way, to make sure they can keep their GPUs humming.
Speaker 3 When I say nuclear, you say energy. Nuclear!
Speaker 2 Energy! Nuclear! Energy!
Speaker 23 Nuclear!
Speaker 2 Energy!
Speaker 2 Also, that data on electricity and water just comes from a blog post by Sam Altman, not an independent audit.
Speaker 2 I'm giving you way more information about my pool pump than he is about OpenAI's data centers.
Speaker 2 Another issue is that while on a global scale, AI data centers aren't using a ridiculous amount of energy or water, data centers aren't spread evenly across the globe.
Speaker 2 They're concentrated in Europe, the US and China. And they're further concentrated in specific regions.
Speaker 24 When you look at electricity use in Ireland, 21% of it is used by data centers which is a massive amount.
Speaker 2 In Ireland tech companies have taken advantage of tax incentives to set up massive data centers around Dublin.
Speaker 24 I don't think it's worth having in this local area. 40 of the 80 data centers in Ireland are located right here.
Speaker 2 In the US state of Virginia the tech companies have started running TV ads to try and soften their image as they press ahead with plans for hundreds more data centers.
Speaker 25 Virginia's data centers are there for you. We hope your families conserve energy and live more sustainably.
Speaker 2 They're already using more than a quarter of the state's electricity. Barring any catastrophic market crashes,
Speaker 2 AI's appetite for water and electricity is projected to keep growing every year. And this is really the challenge.
Speaker 2 Right now, AI is not a big environmental problem, but the thing is, every other part of the economy is finding ways to use less energy and less water. Take the American golf industry, for example.
Speaker 2 In the last 20 years, the amount of water used on golf courses has fallen by about 30%.
Speaker 2 In the last 20 years, everyone in the Western world has been turning off the tap while brushing their teeth, switching off appliances at the wall, and doing other things that are significantly more helpful than those, and all of that means per capita energy use is down.
Speaker 2 Carbon emissions per capita are down, but not for the AI industry. Nvidia, Google, Microsoft and Meta have seen their cumulative emissions rise by 72% in the last five years.
Speaker 2 They have made attempts to build wind and solar farms to offset that, but they cannot keep up.
Speaker 2 So at an individual level, for you, the average person just trying to keep the preteen water police off their back, AI isn't a big deal.
Speaker 2 But at a societal level, we're adding a new drain on resources and a new source of emissions at a time when most Western countries are cutting back on all of that.
Speaker 2 But the AI companies have a counter-argument.
Speaker 22 While these servers are churning through electricity and water, they're also helping to solve some of the world's biggest environmental challenges, things like advanced climate modelling and bushfire predictions.
Speaker 2 Google says in its sustainability report that data centers are projected to represent only about 3% of total global electricity demand in 2030, despite delivering an outsized contribution to economic growth and scientific advancement.
Speaker 2 They say that AI will be a net benefit to the world once the construction phase is over and it's running at full capacity.
Speaker 21 Every working person, non-working person, every child should address and engage AI right away.
Speaker 26 And the reason for that is because AI is the greatest equalizing force.
Speaker 23 If I were 22 right now and graduating college, I would feel like the luckiest kid in all of history. You have access to tools that can let you do what used to take teams of hundreds.
Speaker 27 It is a renaissance. It is a golden age.
Speaker 27 We are now solving problems with machine learning and artificial intelligence that were, you know, kind of in the realm of science fiction for the last several decades.
Speaker 2 While making these last two episodes, I've been testing out how useful various AI tools are in helping me with my work.
Speaker 2 I've had mixed results. Last week, I asked ChatGPT if it could help me find news clips covering the collapse of Pets.com as it was happening in 2000.
Speaker 2
It sent me links to the commercial featuring the Pets.com sock puppet. Today we're coming to you from one crazy dog park.
I said I didn't need the commercial. I found that myself.
Speaker 2 What I needed was archival video or audio news clips. It sent me links to a bunch of articles from US news outlets, none of which had any video or audio clips.
Speaker 2 We went around in circles a few more times until finally it sent me
Speaker 2 the sock puppet dog commercial. Another day, I asked it if it could help me figure out when this dinner took place.
Speaker 7 Elon Musk, Jensen Huang, and
Speaker 7 I went to dinner.
Speaker 2 The best it could do was to tell me that it happened sometime before Larry Ellison told the story of the dinner in that clip. What an excellent use of 1/15th of a teaspoon of water.
Speaker 2 Writing this episode required a lot of data to be gathered and converted into usable units. All the data about golf course water usage, for example, was provided in acre feet.
Speaker 2 And sorry, but we don't use that unit here in the civilized world. ChatGPT could kind of help with that, but occasionally it would spit out data that doesn't make any sense.
Speaker 2 And occasionally, if it couldn't find the numbers I was looking for, it would just make them up. That's a bit of a problem if you're a journalist.
Speaker 2 The amount of time that I spent checking its work and trying to coerce it into doing what I actually asked it to do far outweighed the amount of time that the AI actually saved me.
Speaker 2 Now, I'm not saying that AI is all bad. There are good examples of times when it can be used very productively as a tool, and it's definitely 100%
Speaker 2 way better at coding than me. But the vast majority of AI right now definitely needs a lot of human supervision.
Speaker 2 I think, at the current stage we're at with these AI tools, it's a pretty common experience among people who give AI a go to find that the work it does for you is equivalent to a very, very enthusiastic but ultimately quite incompetent intern.
Speaker 2 Or a slightly better but also slightly worse search engine? One thing it did help me find was that statistic about AI using about as much water as lettuce farming. But you can trust lettuce.
Speaker 2
It does what you expect it to. It grows to a certain size and then you eat it and absorb its nutrients.
Do we know how big AI is going to get?
Speaker 2 How many Hopper chips will be needed to train ChatGPT 6 or 7 or 700?
Speaker 2
Also, you don't have to eat lettuce if you don't like it. AI, meanwhile, is getting shoved into everything like cottage cheese.
AI's costs and benefits are all in the future.
Speaker 2 We've been promised that it's all going to be okay, and that we'll have some magical Zen utopian future where we don't have to work and where AI is going to solve all of the world's problems from loneliness to climate change.
Speaker 2 But the people working in the industry have a truly ungodly amount of money invested in making sure this pays off. The bubble is truly massive, and they really would rather it didn't pop.
Speaker 2 So I'm not exactly sure how much you can trust their promise that it's all going to be okay.
Speaker 2
This episode of If You're Listening was written by Cara Jensen-McKinnon and me, Matt Bevan. It's produced by Adair the CPU Shepherd.
Supervising producer is Cara Jensen-McKinnon.
Speaker 2 Tuesday is the 50th anniversary of the dismissal of Prime Minister Gough Whitlam, the biggest political drama in Australian history.
Speaker 2 You'll hear plenty about it on the day, but what we're going to probe into specifically is whether there was any foreign intelligence involvement.
Speaker 2 You've probably heard the rumours that the CIA was involved, but where did these rumours come from? And what are the odds that they're true? That's next on If You're Listening.