Monologue: OpenAI's Albatross
In this week's monologue, Ed Zitron walks you through how OpenAI's new image generator has become a massive burden on the company, and how OpenAI’s melting GPUs are a bad omen for the future.
Vote for Better Offline's "Man Who Killed Google Search" as the best business podcast episode in this year's Webbys! Open until April 17! Vote today!
https://vote.webbyawards.com/PublicVoting#/2025/podcasts/individual-episode/business
Vote for Weird Little Guys in this year's Webbys! https://vote.webbyawards.com/PublicVoting#/2025/podcasts/individual-episode/crime-justice
OpenAI’s new image generator is now available to all users - https://techcrunch.com/2025/03/31/openais-new-image-generator-is-now-available-to-all-users/
OpenAI Forecast Shows Shift From Microsoft to SoftBank (citation for $13 billion number) - https://www.theinformation.com/articles/openai-forecast-shows-shift-from-microsoft-to-softbank?rc=kz8jh3
---
LINKS: https://www.tinyurl.com/betterofflinelinks
Newsletter: https://www.wheresyoured.at/
Reddit: https://www.reddit.com/r/BetterOffline/
Discord: chat.wheresyoured.at
Ed's Socials:
https://www.instagram.com/edzitron
https://bsky.app/profile/edzitron.com
https://www.threads.net/@edzitron
See omnystudio.com/listener for privacy information.
Transcript
This is an iHeart podcast.
Hi, I'm Morgan Sung, host of Close All Tabs from KQED, where every week we reveal how the online world collides with everyday life.
There was the six-foot cartoon otter who came out from behind a curtain.
It actually really matters that driverless cars are going to mess up in ways that humans wouldn't.
Should I be telling this thing all about my love life?
I think we will see a Twitch stream or a president, maybe within our lifetimes.
You can find Close All Tabs wherever you listen to podcasts.
Cool Zone Media.
Give me that turtleneck shirt, a Tim Pan Apple, a German Shepherd, a wristband standard, and a lurching red bird, and more braves than the Turner Network.
This is your weekly Better Offline monologue, and I'm your host, Ed Zitron.
Now, before we go any further, I need your help.
Look, Better Offline is up for a Webby, and I really need you to vote for it as Best Episode in the Business category.
It's The Man Who Killed Google Search.
It's Prabhakar Raghavan.
Let's get him.
I realize it's a huge pain in the ass to sign up for something and vote, but I've never won an award in my life and I'd really appreciate it.
Link is going to be in the episode notes.
And while you're there, also vote for the wonderful Molly Conger's Weird Little Guys, which I'll also have a link to.
I know signing up to stuff is annoying.
I'm asking a lot from you, but there you go.
I'm doing it.
Anyway, to the monologue.
I feel like we're approaching a choke point in the whole generative AI bubble, the culmination of over a year of different narratives and pressures that I believe will lead to an ultimate collapse.
Last week, OpenAI released an image generator with GPT-4o, which quickly gained massive attention for its ability to create images in the style of famed Japanese animation company Studio Ghibli.
And to be clear, I think these images are an abomination, and everyone involved in launching this tool has committed a mortal sin.
Anyway, nevertheless, creating these disgusting, disgraceful images comes at an incredibly high cost.
And for the last week, OpenAI CEO Sam Altman has been complaining about their GPUs melting, leading to OpenAI having to limit free users to only three image generations a day, along with longer wait times and capacity issues with video generator Sora.
To make matters worse, Altman also announced that, and I quote, users should expect new releases from OpenAI to be delayed, stuff to break, and for services to sometimes be slow as we deal with capacity challenges.
This led me to ask a very simple question that I think everybody in the tech media really should be asking.
Why can't Sam Altman ask Microsoft for more GPUs?
The answer, as you may have guessed from my last monologue, is that there may not actually be capacity for them to do so.
OpenAI's relationship with Redmond has grown kind of chilly over the past year.
I'd speculate that Microsoft has refused to provide additional immediate capacity, or has refused to provide it on the chummy terms that OpenAI previously enjoyed, which included a significant discount on the usual ticket prices.
We know that Microsoft has both walked away from 2 gigawatts of future compute capacity and declined the option to spend another $12 billion on CoreWeave's compute.
And CoreWeave, if you don't remember, is the publicly traded AI data center company, a whole dog's dinner unto itself.
And analyst house TD Cowen suggested that this is a sign that Microsoft is no longer willing to shoulder the immense financial burden of supporting OpenAI.
Even though OpenAI picked that option up, by which I mean they took the $12 billion of compute, it isn't clear if CoreWeave can actually build the capacity they need, and I definitely don't think they're going to be able to do it in the time they need it.
Microsoft allegedly walked away from CoreWeave due to its failure to deliver the services they asked for, and indeed probably the compute as well.
If that's true, it's unclear what has changed to make CoreWeave magically able to support OpenAI, or even how a company that's drowning in high-interest debt can finance the creation of several billion dollars' worth of new data centers.
Also, it's not quite as simple as OpenAI calling up a data center company with a bunch of GPUs and running chatgpt.exe.
OpenAI likely has reams of different requirements, and the number of GPUs they will need will likely vary based on demand, putting them in a problematic situation where they could be committing to a bunch of compute that they don't need if demand slows down.
I've heard that companies generally want a 6-12 month commitment for GPUs too.
The cost is fixed no matter how much they get used, or at least there's a minimum commitment.
But let's assume for a second that demand for ChatGPT continues to rise.
How does OpenAI actually get that compute?
If Microsoft isn't handing it over, and The Information reports that OpenAI still projects to spend about $13 billion on Azure cloud compute in 2025, there aren't really a ton of other options, especially for a company with such gigantic requirements. That means whatever infrastructure OpenAI is building is a patchwork of smaller players, and using so many smaller providers likely creates unavoidable inefficiencies and overhead.
I'm naming another pale horse of the AI apocalypse, by the way.
Limits to service and service degradation across ChatGPT.
OpenAI is running out of compute capacity.
They've talked about it since October of last year.
And ChatGPT's new image generation is a significant drain on their resources, meaning that to continue providing their services, they're going to need to expand capacity or otherwise reduce access to them.
The problem is that expanding is extremely difficult.
Data centers take three to six years to build and OpenAI's planned Stargate data center won't have anything ready before 2026 at the earliest, which means we're approaching a point where there simply might not be enough data centers or GPUs to burn.
While OpenAI could theoretically go to Google or Amazon, both of those companies are invested in Anthropic and have little incentive to align with OpenAI.
Meta is building their own ChatGPT competitor and Elon Musk despises Sam Altman.
Real shithead versus fuckwad situation there.
While I can't say for certain, I can't work out where OpenAI will get the capacity to continue.
And I just don't know how they're going to expand their services if Microsoft isn't providing capacity.
Yes, there's Oracle, which OpenAI has a partnership with, but they're relatively small in this space.
ChatGPT's image generation has become this massive burden on the company, right at the point where it's introducing some of its most expensive models ever.
And the products themselves are extremely expensive to run.
Deep Research is perhaps the best example, using OpenAI's extremely expensive o3 model, which can cost in some cases as much as $1,000 per query.
Deep Research is probably cheaper, but not that much cheaper. I've heard rumors, and this is just a rumor, of something like a dollar or two per query.
If that's the case, that's fucking insane.
Anyway, while OpenAI could absorb the remaining capacity at, say, Crusoe, Lambda, and CoreWeave, this creates a systemic risk where every GPU provider is reliant on OpenAI's money, and this assumes that they'll actually have enough to begin with.
OpenAI also just closed the largest private funding round in history.
Forty billion theoretical dollars, valuing the company at a ridiculous $300 billion, raised from, you guessed it, SoftBank and other investors.
That's good news, right?
Not really.
In truth, OpenAI really only raised $10 billion, with $7.5 billion of those dollars coming from SoftBank and another $2.5 billion coming from other investors including Thrive Capital and Microsoft.
The remaining $30 billion, of which SoftBank is on the hook for $20 billion, will arrive at the end of the year.
That's all we've got.
But OpenAI will only get $10 billion of that from SoftBank, bringing the round down to $30 billion in total, if OpenAI fails to convert from a non-profit to a for-profit company by the end of 2025.
A massive acceleration there.
As a reminder, OpenAI is a weirdly structured non-profit with a for-profit arm, and their last round of funding from October 2024 had another caveat, that if OpenAI failed to become a for-profit company by October 2026, all investment dollars would convert into debt.
I've also read that they would have to hand the money back.
I'm not sure whether that's the case.
Debt is the one that's been reported the most.
Furthermore, OpenAI loses money on every single prompt on ChatGPT, even from their $200 a month ChatGPT pro subscribers.
The burdensome interest payments would make it even harder for OpenAI to reach break-even, which right now it doesn't even seem like they can do anyway.
As another reminder, SoftBank is a company that has now invested in two different fraudulent schemes, Wirecard and Greensill Capital, the latter of which helped put the nail in the coffin of Credit Suisse back in 2023, and it also put $16 billion into WeWork.
It will be incredibly, some might say impossibly, difficult to convert OpenAI into a for-profit company, and I'll cover this in a future episode.
And the fact that SoftBank is putting this caveat on their investment heavily suggests that they have doubts it will happen.
And I must be clear, when the monopoly man is getting nervous, you should get nervous too.
The fact OpenAI accepted these terms also suggests they're desperate and I don't blame them.
They've committed $18 billion to the Stargate data center project, will spend $13 billion on Microsoft compute alone in 2025, according to The Information, and they've now created an incredibly popular product that will guarantee people come and use it like twice and then never use it again.
Now keep a keen eye on any restrictions that OpenAI makes on ChatGPT in the coming months.
I do not see how this company survives nor do I see how they expand their capacity much further.
Price increases, rate limits and other ways of slowing down the pressure on their servers will likely suggest that OpenAI is up against the wall, both in their ability to support the services they provide and the costs they must bear to provide them.
We are entering the hysterical era of the bubble, a time when the craziest stuff will happen as the money does everything it can to keep the dream alive.
I look forward to telling you what happens next.
Be honest, how many tabs do you have open right now?
Too many?
Sounds like you need Close All Tabs from KQED, where I, Morgan Sung, doomscroll so you don't have to.
Every week, we scour the internet to bring you deep dives that explain how the digital world connects and divides us all.
Everyone's cooped up in their house.
I will talk to this robot.
If you're a truly engaged activist, the government already has data on you.
Driverless cars are going to mess up in ways that humans wouldn't.
Listen to Close All Tabs, wherever you get your podcasts.
Ah, Smartwater.
Pure, crisp taste, perfectly refreshing.
Wow, that's really good water.
With electrolytes for taste, it's the kind of water that says, I have my life together.
I'm still pretending the laundry on the chair is part of the decor.
Yep, here you are, making excellent hydration choices.
I do feel more sophisticated.
That's called having a taste for taste.
Huh, a taste for taste.
I like that, Smartwater.
For those with a taste for taste, grab yours today.
There's nothing like sinking into luxury.
At washablesofas.com, you'll find the Anabay sofa, which combines ultimate comfort and design at an affordable price.
And get this, it's the only sofa that's fully machine washable from top to bottom, starting at only $699.
The stain-resistant performance fabric slip covers and cloud-like frame duvet can go straight into your wash.
Perfect for anyone with kids, pets, or anyone who loves an easy-to-clean spotless sofa.
With a modular design and changeable slip covers, you can customize your sofa to fit any space and style.
Whether you need a single chair, love seat, or a luxuriously large sectional, Anabay has you covered.
Visit washablesofas.com to upgrade your home.
Right now, you can shop up to 60% off store-wide with a 30-day money-back guarantee.
Shop now at washablesofas.com.
Add a little to your life.
Offers are subject to change and certain restrictions may apply.
Since 1983, Nissan has been building award-winning vehicles right here in America.
And this summer, they're making it easier to drive one home.
No new tariffs, just lower MSRPs on the Rogue and Pathfinder.
So you can get the car you want at a price that feels right.
But don't wait.
These deals are only here for a limited time and while supplies last.
See why Nissan is number one for new vehicle quality among mainstream brands.
Learn more at nissanusa.com.
For JD Power 2025 award information, visit jdpower.com/awards.
This is an iHeart podcast.