The Hater's Guide To The AI Bubble, Pt. 2
In part two of this week's three-part Better Offline, Ed Zitron walks you through how little money there is in generative AI, how Anthropic and OpenAI are killing their own customers, and why there may never be a profitable LLM company.
YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.
---
LINKS: https://www.tinyurl.com/betterofflinelinks
Newsletter: https://www.wheresyoured.at/
Reddit: https://www.reddit.com/r/BetterOffline/
Discord: chat.wheresyoured.at
Ed's Socials:
https://www.instagram.com/edzitron
https://bsky.app/profile/edzitron.com
https://www.threads.net/@edzitron
See omnystudio.com/listener for privacy information.
Listen and follow along
Transcript
This is an iHeart podcast.
On Fox One, you can stream your favorite news, sports, and entertainment live, all in one app.
It's effing raw and unfiltered.
This is the best thing ever.
Watch breaking news as it breaks.
Breaking tonight, we're following two major stories.
And catch history in the making.
Gibby, meet Freddy.
Debates,
drama, touchdowns.
It's all here, baby.
Fox One.
We live for live.
Streaming now.
Be honest.
How many tabs do you have open right now?
Too many?
Sounds like you need Close All Tabs from KQED, where I, Morgan Sung, Doom Scroll so you don't have to.
Every week, we scour the internet to bring you deep dives that explain how the digital world connects and divides us all.
Everyone's cooped up in their house.
I will talk to this robot.
If you're a truly engaged activist, the government already has data on you.
Driverless cars are going to mess up in ways that humans wouldn't.
Listen to Close All Tabs, wherever you get your podcasts.
There's more to San Francisco with the Chronicle.
There's more food for thought, more thought for food.
There's more data insights to help with those day-to-day choices.
There's more to the weather than whether it's going to rain.
And with our arts and entertainment coverage, you won't just get out more, you'll get more out of it.
At the Chronicle, knowing more about San Francisco is our passion.
Discover more at sfchronicle.com.
Every day has a to-do list, but adding belVita to yours can help you knock out the rest of it.
belVita breakfast biscuits are a tasty and convenient breakfast option when paired with low-fat yogurt and fruit that provide steady energy all morning.
And belVita Energy snack bites give you the perfect mid-morning refuel. Best part: they both taste great.
So make the most out of your morning with a bite of belVita.
Pick up a pack of belVita at your local store today.
Hello and welcome to Better Offline.
I'm your host, Ed Zitron.
Subscribe to the newsletter, buy the merchandise, it's all in the notes.
And we're on the second installment of our three-part Hater's Guide to the AI Bubble: the cracks within the generative AI industry, how they're becoming bigger and scarier, and the potential economic meltdown caused by a collapse in generative AI spending.
Well, it's not really generative AI spending, it's literally just fucking GPUs.
And I think it might be sooner and likelier than many think.
Toward the end of the last episode, we talked about one of the inane comparisons we hear between today's nation-state-sized spending on Gen AI capital expenditures and the investments that Amazon made when scaling Amazon Web Services, which was literally the foundation of cloud computing at scale, I would say.
Someone's going to email and say I'm wrong, not going to read it.
And I had to cut things short because we ran out of time, but I want to continue the conversation because I think it's important to examine this comparison thoroughly, if not just to explain why it doesn't work.
It's also that I want to stop hearing it.
When people say it to me, I just want to send them this fucking episode and say, leave me alone, buddy boy.
But the first point I want to make in this episode is that generative AI and large language models do not resemble Amazon Web Services or the greater cloud compute boom.
And generative AI is not infrastructure.
Now, some people compare LLMs and their associated services to Amazon Web Services or services like Microsoft Azure or Google Cloud.
They're giant multi-billion dollar operations that basically share their server capacity with companies wanting to run stuff on the internet or
within their own systems.
A very fudgy way of putting it: they help make sure that applications work online.
These are very, very useful services.
And by the way, people are wrong to make the comparison between them and LLMs, as I'll get into.
Now, Amazon Web Services, when it launched, comprised things like, and forgive me for how much I'm going to dilute this, Amazon's Elastic Compute Cloud, EC2, where you rent space on Amazon's servers to run applications in the cloud, or Amazon Simple Storage Service, S3, which is enterprise-level storage for applications.
Storing things is not just like a simple hard drive.
It's redundancy, it's making sure it's copied in places so latency comes down, tons of other things.
But in simpler terms,
if you were providing a cloud-based service, you used Amazon to both store the stuff that the service needed and the actual cloud-based processing.
So compute, so like your computer loads and runs applications, but delivered to thousands or millions of people online.
And this is a huge industry.
Amazon web services alone brought in web revenues of over $100 billion in 2024.
And while Microsoft and Google don't break out their cloud revenues, they're similarly large parts of their companies.
And Microsoft has used Azure in the past to patch over shoddy growth.
These services are also selling infrastructure.
You aren't just paying for compute, but the ability to access storage and deliver services with low latency so users have a snappy experiences
wherever they are in the world.
And I know I just said a snappy experiences.
I'm not editing it.
The subtle magic of the internet is that it works at all.
And a large part of that is the cloud compute infrastructure and oligopoly of the main cloud providers having such vast data centers.
This is much cheaper than doing it yourself until a certain point.
Dropbox moved away from Amazon Web Services at scale, for example.
But this also allows someone else to take care of the maintenance of the hardware and make sure it actually gets your stuff to your customers. You also don't have to worry about spikes in usage, because these things are usage-based, hence the elastic, and you could always add more compute to meet demand, or just have it at a particular time. There is, of course, nuance: security-specific features, content-specific delivery services, database services. There's nuance behind these clouds. You're buying into the infrastructure of the infrastructure provider, and the reason these products are so profitable is that, in part, you are handing off the problems and responsibility to somebody else. And also, most web applications are not that demanding of cloud compute.
They might be at scale, expensive to provide to millions of people, but Facebook was not a super complex, I don't know, website depending on thousands or millions of GPUs.
And based on that idea, there are multiple product categories you can build on top of something like AWS because ultimately cloud services are about Amazon, Microsoft, and Google running your infrastructure for you.
Large language models and their associated services are completely different, despite these companies attempting to prove otherwise.
And it starts with a very, very simple problem.
Why did any of these companies build these giant data centers and why did they fill them full of GPUs?
Amazon Web Services was created out of necessity.
Amazon's infrastructure needs were so great that it effectively had to build out the software and hardware necessary to deliver a store that sold theoretically everything to theoretically anywhere, handling the traffic from customers, delivering the software that runs Amazon.com quickly and reliably, and, well, making sure things kept working and stayed stable. And it didn't need to come up with a reason for people to run web applications. They were already running applications client-side on their computers and realized that doing so at scale would be cool, or they were already doing so in a way that was likely not particularly cost-effective, was inflexible, and required specialist skills and indeed physical infrastructure personnel that were quite expensive. So Amazon Web Services took something that people already did, something there was actually proven demand for, and made it better and scaled it.
Eventually, Google and Microsoft copied them because that's all they can do.
And that appears to be the only similarity with generative AI: that due to the ridiculous costs of both data centers and GPUs necessary to provide these services, it's largely impossible for others to enter the market.
Yet, after that, generative AI feels more like a feature of cloud infrastructure rather than the infrastructure itself.
AWS and similar mega clouds are versatile, flexible, and multifaceted.
Generative AI does what generative AI does.
And
well, that's about it.
You can run lots of different things on AWS.
What are the different things you can run using large language models?
What are the different use cases and indeed user requirements that make this the supposed next big thing?
Perhaps the argument is that generative AI is the next AWS or similar cloud service because you can build the next great companies on the infrastructure of others, the models of, say, OpenAI and Anthropic and the service of Microsoft.
Okay.
Okay.
Let's humor this point too.
You can build the next great AI startup and you have to build it on one of the mega clouds because they're the only ones that can afford to build the infrastructure.
One eensy wincy teeny weeny small problem.
Companies built on top of large language models don't make much money, and in fact, they're almost all deeply unprofitable.
But let's establish a few facts to get going.
I just said flacks?
Flacks?
Jesus Christ, facts.
Here are the facts I'm establishing.
Outside of one exception, Midjourney, which claimed it was profitable in 2022, which may not still be the case (I've actually reached out to them and they didn't get back to me), every single LLM model company is unprofitable, often wildly so.
Outside of OpenAI, Anthropic, and Anysphere, which makes the AI coding app Cursor, there are no large language model companies, either building models or services on top of others' models, that make more than $500 million in annualized revenue, meaning monthly revenue times 12.
Outside of Midjourney's $200 million ARR, Ironclad's $150 million ARR, and, also, fucking Perplexity, there are only 12 generative AI-powered companies making $100 million annualized, or $8.3 million a month, in revenue.
Though the database, and this is The Information's generative AI database, doesn't have Replit, which also announced it hit $100 million in annualized revenue, I've included it in my statement of facts.
Of these companies, two of them have been acquired.
MoveWorks, acquired by ServiceNow in March 2025 after the company shit the bed big time, and Windsurf, which was acquired by Google and Cognition in July 2025 in one of the most annoying deals of all time.
But for the sake of simplicity, I've left out companies like Surge, Scale, Turing, and Together, all of whom run consultancies selling services and training data for models.
Otherwise, there are seven companies total that make $50 million or more annual recurring revenue, which is $4.16 million a month.
Now, none of this is to say that $100 million isn't a lot of money to you and me.
I just want to be clear: if you want to give me $100 million, I'll do anything.
I'll oink like a pig for you.
Anyway, but in the world of software as a service or enterprise software, this is chump change.
HubSpot had revenues of $2.63 billion in its 2024 financial year.
We're three years into this crap.
And generative AI's highest-grossing companies are OpenAI, at $10 billion annualized as of June, and Anthropic, at $4 billion annualized as of July.
Don't like saying that word.
Both of them lose billions a year after revenue.
There are really three problems here.
Businesses powered by generative AI do not seem to be popular.
Those businesses that are remotely popular are deeply unprofitable.
And even the less popular generative AI-powered businesses are also deeply unprofitable.
But I want to start somewhere, because I keep hearing about fucking Cursor.
Let's start with Anysphere and their app, Cursor.
It's an AI-powered coding app, and they have $500 million of annualized revenue.
Pretty great, right?
Huh?
It hit $200 million in annualized revenue in March and then hit 500 million in June after raising $900 million.
That's amazing.
Ed, Ed, it's time.
Walk to the garage, Ed.
It's over for you.
Wrong.
It's a mirage.
Cursor's growth was a result of an unsustainable business model that it's now had to replace with opaque terms of service, dramatically restricting access to models, and rate limits that effectively stop its users from using the product at the price point they were used to.
Go to r slash cursor on Reddit.
Take a look.
Take a look at how happy everyone is.
I want to know why my peers in the media don't seem to have the ability to talk to actual fucking customers.
It's ridiculous.
This company is circling the drain and nobody seems to want to talk about it, despite how big a deal that is.
Oh, also, Cursor is horribly unprofitable, and I believe they're a sign of things to come in generative AI.
A couple of weeks ago, I wrote up the dramatic changes that Cursor made to its service in the middle of June on my premium newsletter, and discovered that they timed these changes precisely with Anthropic and OpenAI to a lesser extent, adding service tiers and priority processing, which is tech language for pay us extra if you have a lot of customers or face rate limits or service delays, arsehole.
These price shifts have also led to companies like Replit having to make significant changes to their pricing models that disfavor users.
People are finding in really simple terms that what they used to get for 20 bucks is much, much, much, much, much smaller.
Cursor users hit rate limits.
Replit users are hitting rate limits.
And even then, when they try and do the same things, they're spending way more money if they go pay as you go.
It's a complete farce.
But I'm going to repeat some of this stuff from the premium newsletter, because there is a timeline of events that I believe is going to be in The Big Short 2: AI Boogaloo.
All right.
In or around May 5th, 2025, Cursor closed the $900 million funding round.
In or around May 22nd, 2025, Anthropic launched Claude 4 Opus and Sonnet, new models, with both Sonnet and Opus well-known for coding.
And on May 30th, 2025, they added service tiers, including priority pricing, specifically focused on cache-heavy products like Cursor.
And a cache is where you put stuff that you're going to be looking at regularly, so you can use it more readily.
A cache, that's C-A-C-H-E, by the way, is generally something that's there for efficiency.
The idea that you would add a toll onto the cache is fucking disgusting and only targeted at coding startups.
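To make the cache point concrete, here's a toy sketch of why cache reads are supposed to be cheap, and why coding apps lean on them so hard. The per-token prices, the discount, and the 100,000-token prompt below are all made-up numbers for illustration, not any provider's actual pricing:

```python
# Toy model of prompt caching costs for a coding app that resends a big
# codebase context on every request. All prices are hypothetical.
PRICE_PER_M_INPUT = 3.00    # $ per million uncached input tokens (assumed)
PRICE_PER_M_CACHED = 0.30   # $ per million cache-read tokens (assumed discount)

def request_cost(prompt_tokens: int, cached_tokens: int) -> float:
    """Cost of one API call, billing cache hits at the discounted rate."""
    fresh = prompt_tokens - cached_tokens
    return fresh / 1e6 * PRICE_PER_M_INPUT + cached_tokens / 1e6 * PRICE_PER_M_CACHED

no_cache = request_cost(100_000, 0)         # cold request: full price
with_cache = request_cost(100_000, 95_000)  # warm request: 95% cache hit
print(f"${no_cache:.4f} vs ${with_cache:.4f} per request")
```

The whole point of a cache is that the warm request costs a small fraction of the cold one, which is why a surcharge aimed specifically at cache-heavy products lands squarely on coding startups.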
But on May 30th, 2025, Reuters reported that Anthropic's annualized revenue hit $3 billion with a key driver being code generation.
This translates to around $250 million in monthly revenue.
June 9th, 2025, CNBC reported OpenAI had hit $10 billion in annualized revenue.
And yeah, when they said annual recurring revenue, they meant annualized.
But the very same day, they cut the price of their O3 model by 80%, which competes directly with Claude 4 Opus, by the way.
And this was a direct and aggressive attempt to force Anthropic to either lower its prices or compete.
It's just shitheads fucking around with arseholes.
But on or around June 16th, 2025, Cursor changed its pricing, adding a new $200 a month ultra tier that, in their own words, was made possible by multi-year partnerships with OpenAI, Anthropic, Google, and XAI, which translates to multi-year commitments to spend, which can be amortized as monthly amounts.
A day later, on June 17th, Cursor dramatically changed its offering for its $20-a-month subscriptions to usage-based pricing, where one got at least the value of their subscription in compute, so a $20-a-month person would get more than $20 of API calls, along with arbitrary rate limits and unlimited access to Cursor's own slow model, which its users hate.
Then on June 18th, Replit, another Vibecoding company that I previously mentioned, announced their effort-based pricing increases that were massive.
July 1st, the information reported that Anthropic hit $4 billion of annualized revenue, making $330 million a month, an increase of $83 million a month, or just under 25% in the space of a month.
Hmm, where could that money have come from?
In business, they say you can have better, cheaper, or faster, but you only get to pick two.
What if you could have all three at the same time?
That's exactly what Cohere, Thomson Reuters, and Specialized Bikes have since they upgraded to the next generation of the cloud.
Oracle Cloud Infrastructure.
OCI is the blazing fast platform for your infrastructure, database, application development, and AI needs, where you can run any workload in a high availability, consistently high performance environment and spend less than you would with other clouds.
How is it faster?
OCI's block storage gives you more operations per second.
Cheaper?
OCI costs up to 50% less for computing, 70% less for storage, and 80% less for networking.
Better?
In test after test, OCI customers report lower latency and higher bandwidth versus other clouds.
This is the cloud built for AI and all your biggest workloads.
Right now with zero commitment, try OCI for free.
Head to oracle.com slash strategic.
That's oracle.com slash strategic.
Every business has an ambition.
PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
When it's time to get growing, there's one platform for all business: PayPal Open.
Grow today at paypalopen.com.
Loans subject to approval in available locations.
In simpler terms, Cursor raised $900 million and very likely had to hand large amounts of that money over to OpenAI and Anthropic to keep doing business with them, then immediately change the terms of service to make them worse for their customers.
And as I said at the time, and this is a direct quote from my newsletter,
while some may... no, I can't do the Kevin Roose voice and do my own stuff.
Pardon me.
While some may believe that OpenAI and Anthropic hitting annualized revenue milestones is good news, you have to consider how these milestones were hit.
Based on my reporting, I believe that both companies are effectively doing steroids, forcing massive infrastructural costs onto big customers as a means of covering the increasing costs of their own models.
There is simply no other way to read this situation.
By making these changes, Anthropic is intentionally making it harder for its largest customer to do business.
By the way, Cursor is their largest customer.
It's creating extra revenue by making Cursor's product worse by proxy.
What's sickening about this particular situation is that it doesn't really matter if Cursor's customers are happy or sad.
Anthropic, in this case, like OpenAI's enterprise priority access API, requires a long-term commitment which involves a minimum throughput of tokens per second as part of its tiered access program.
If Cursor's customers drop off, both Anthropic and OpenAI still get their cut, and if Cursor's customers somehow outspend those commitments, they'll either still get rate-limited or Anysphere will incur more costs.
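In really simple terms, a minimum commitment works like a floor on the bill, and the floor shifts the risk onto the startup. The $10 million commitment and the usage figures below are hypothetical, purely to show who carries that risk:

```python
# Sketch of a minimum-commitment tier: the provider bills whichever is
# higher, actual usage or the committed floor. All numbers are made up.
def monthly_bill(actual_usage_cost: float, committed_minimum: float) -> float:
    """Provider revenue from one customer for one month."""
    return max(actual_usage_cost, committed_minimum)

COMMIT = 10_000_000.0  # hypothetical $10M/month minimum commitment

quiet_month = monthly_bill(4_000_000.0, COMMIT)   # customers churned away
busy_month = monthly_bill(13_000_000.0, COMMIT)   # customers outspent the floor
print(quiet_month, busy_month)  # the startup pays at least the floor either way
```

In the quiet month the provider still collects the full commitment even though usage collapsed, and in the busy month the startup eats the overage. Heads the model company wins, tails the startup loses.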
Why do you care about this?
Well, Cursor is the largest and most successful generative AI company by far, and these aggressive and desperate changes to its products suggest, A, that its products are deeply unprofitable, and, B, that its current growth was a result of offering a product that was not the one it would sell in the long term.
Cursor misled its customers, and its current revenue is, as a result, highly unlikely to stay at this level.
Worse still, two Anthropic engineers left the Claude Code team to go and work at Cursor two weeks ago, and they have already come back.
This heavily suggests that whatever they saw over there wasn't compelling enough to make them stay.
As I also said, while Cursor may have raised $900 million, it was really OpenAI, Anthropic, xAI, and Google that got that money.
At this point, there are no profitable enterprise AI startups, and it's highly unlikely that the new pricing models by both Cursor and Replit are going to help.
These are now the new terms of doing business with the big model companies.
A shakedown, where you pay for priority access or tiers or face indeterminate delays or rate limits.
Any startup scaling into an enterprise integration of generative AI, which means in this case, anything that requires a level of service uptime, has to commit to both a minimum amount of months and a throughput of tokens, which means that the price of starting an AI company that gets any kind of real market traction just dramatically increased.
Well, one could say, oh, perhaps you don't need priority access.
The need here is something that can be entirely judged by Anthropic and OpenAI in a totally opaque manner.
They can and they will throttle companies that are too demanding on their systems, as proven by the fact that they've done this to Cursor multiple times.
But okay, why does cursor matter so much?
It's simple.
Generative AI will not get big on selling consumer software.
Without an enterprise SaaS story, they're dead.
And I realize, I know, okay, folks, it's kind of a little boring hearing about software as a service, despite the fact that it's a huge, several-hundred-billion-dollar industry.
But this is the only place where generative AI can really make money.
Companies buying hundreds of thousands of seats are how industries that rely on compute grow.
And without that growth, they're going nowhere.
To give you some context, Netflix makes about $39 billion a year in subscription revenue from consumers, and Spotify about 18 billion.
These are the single most popular consumer software subscriptions in the world.
And OpenAI's 15.5 million subscribers suggests that OpenAI can't rely on them for the kind of growth that would actually make the company worth $300 billion, or more.
Cursor, as it stands, is the one example of a company thriving using generative AI, a software company selling software, and it appears its rapid growth was the result of selling a product at a massive loss.
As it stands today, Cursor's product is significantly worse, and its Reddit is full of people furious at the company for the changes.
In simpler terms, Cursor was the company that people mentioned to prove that startups could make money by building products on top of OpenAI and Anthropic's models.
Yet the truth is, the only way to do so and grow is to burn tons of money.
While the tempting argument is to say that Cursor's customers are addicted and will keep paying, this is clearly not the case, nor is it an actual business model.
Like, to the people that say this: I have never had a drug addiction, but I know people that do.
It's nothing like software.
Stop making that comparison.
It's insulting to the victims of addiction.
But anyway, this story showed that OpenAI and Anthropic are actually the biggest threats to their customers and will actively rent-seek and punish any of their success stories, looking to loot as much as they can from them before they copy their products.
To put it bluntly, Cursor's growth story was a fucking lie.
It reached $500 million in annualized revenue selling a product it can no longer afford to sell and could not afford to sell long term, suggesting material weakness in its business and any and all coding startups.
It's also remarkable, and a shocking failure of journalism, that this isn't in every single article about Anysphere.
I'm doing this part-time.
Why am I the asshole here?
Like, I'm, I don't know.
But, really, though, I do have a question.
Where are all the consumer AI startups?
I'm genuinely serious.
What have you got for me?
Perplexity?
Perplexity.
Perplexity only has $150 million in annualized revenue, and they spent 167% of their revenue in 2024, $57 million of spending on revenues of $34 million, on compute services from Anthropic, OpenAI, and Amazon.
They lost $68 million.
And worse still, they still have no path to profitability.
And it's not even making anything new.
They're a search engine.
They have an AI browser.
But don't worry, professional gasbag Alex Heath just did this insane and flummoxing interview with CEO Aravind Srinivas, who, when asked how Perplexity would become profitable, appeared to experience what seems to be a stroke.
Like, I'm about to read something to you, and it's going to sound strange.
But this is exactly what was said.
Maybe let me give you another example.
You want to put an ad on Meta, Instagram, you want to look at ads done by similar brands, pull that, study that, or look at AdWords pricing of a hundred different keywords and figure out how to price your thing competitively.
These are tasks that could definitely save you hours and hours and maybe even give you an arbitrage over what you could do yourself because AI is able to do a lot more.
And at scale, if it helps you to make a few million bucks, does it not make sense to spend $2,000 for that prompt?
It does, right?
So I think we're going to be able to monetize in many more interesting ways than chatbots for the browser.
I want to be fucking clear about something.
Alex Heath seems like a nice guy.
If someone said that to me, I'd ask them if they could smell toast.
I'd be like, Aravind, mate, are you okay?
How many fingers am I holding up, Aravind?
Are you all right?
Did you hit your head on something?
The ceilings don't seem that low in here, but mate, you're just spewing utter fucking nonsense.
I've read this paragraph multiple times.
I do not know what he's getting at.
I think he's suggesting something about how you could ask it to tell you what to do with ads.
I don't know.
I don't know.
This is probably the biggest consumer AI company that isn't OpenAI, and their CEO speaks like an insane person or a stupid person.
Check out the business idiot trilogy for what I think there.
I also mentioned them earlier, but I don't want you to talk to me about AI browsers.
Anyone humoring AI browsers is being an imbecile for some reason.
They are not a business model.
How are people going to make money on the browser?
Hmm.
What do these products actually do?
Oh, they can poorly automate accepting linked invites.
Wow, it's like God himself has personally blessed my computer. Big fucking deal. In any case, it doesn't seem like you can really build a consumer AI startup that makes any real money, or approaches being a real company, other than ChatGPT, I guess. And that's because the generative AI software market is small, with little room for growth and no profits to be seen. Arguably the biggest sign that things are troubling in the generative AI space is that we use the term annualized revenue at all, which, as I've mentioned repeatedly, means multiplying a month by 12 and saying, that's our annualized, baby.
The problem with this number is that, well, people cancel things.
While your June might look great, if 10% of your subscribers churn in a bad month due to a change in your terms of service, for example, that's a huge chunk of your annualized revenue gone and likely gone forever.
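To sketch that arithmetic: here's the headline number next to what a shrinking subscriber base actually delivers over a year. The 10% monthly churn rate is a made-up example, not any company's real figure:

```python
# Headline "annualized revenue" (one month x 12) versus what a churning
# subscriber base actually delivers over twelve months.
def annualized(monthly_revenue: float) -> float:
    """The flattering number: take your best month and multiply by 12."""
    return monthly_revenue * 12

def actual_year(monthly_revenue: float, monthly_churn: float) -> float:
    """Total revenue over 12 months if the base shrinks by churn each month."""
    total = 0.0
    for _ in range(12):
        total += monthly_revenue
        monthly_revenue *= (1 - monthly_churn)
    return total

peak = 8.33e6  # $8.33M/month, i.e. the "$100M annualized" headline
print(annualized(peak))         # ~$100M on paper
print(actual_year(peak, 0.10))  # roughly $60M once 10% monthly churn bites
```

With churn that size, the business delivers around 60% of its headline annualized figure, which is exactly why nobody quoting annualized revenue wants to talk about monthly numbers.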
But the worst sign is that nobody is saying the monthly figures, mostly because the monthly figures fucking suck.
$100 million of annualized revenue is $8.33 million a month.
To give you some scale, Amazon Web Services hit $189 million, $15.75 million a month in revenue in 2008, two years after founding.
And while it took until 2015 to hit profitability, it actually hit break-even in 2009, though it kept investing its cash in growth for years afterward.
And I should be clear, them doing that justified so many startups burning cash.
So many startups were like, yeah, look at AWS, they were investing in growth, which is a fair thing for companies to do, but I'm being an asshole.
But right now...
There is not a single generative AI software company that's profitable and none of them are showing the signs of the kind of hypergrowth that previous big software companies had.
Well, Cursor technically is the fastest growing software as a service company of all time.
It got there by basically lying.
Cursor is never bringing back the product at the $20
price point that they were selling.
They're never doing it.
The money they earned was earned.
It's not fraud, because they didn't do it... deceptive? I guess it was deceptive, but it's not really... it's just fucking lying. It's just lying.
And who knows what happens to Cursor now. But you know what?
I'm harping on Cursor a bit.
What other software startups are there?
Glean?
Glean?
Fucking Glean.
Everyone loves to talk about enterprise search company Glean, a company that uses AI to search and generate answers from your company's files and documents.
Fun fact also, Salesforce, which owns Slack, has now blocked them from searching Slack.
Just arsehole on arsehole violence.
In December 2024, Glean raised $260 million, proudly stating that it had over $550 million in cash with best-in-class ARR growth.
A few months later, in February 2025, Glean announced that it had achieved $100 million in annual recurring revenue in fourth quarter FY25, cementing its position as one of the fastest-growing SaaS startups and reflecting a surging demand for AI-powered workplace intelligence.
In any case, ARR could literally mean anything as it appears to be based on quarters, meaning it could be an average of the last three months, I guess.
Anyway, in June 2025, Glean announced it had raised another funding round, this time raising $150 million, and troublingly added that since its last round, it had surpassed $100 million in ARR.
Five months into the fucking year, and your revenue is basically the same?
That isn't good.
That isn't good at all.
Also, what happened to that $550 million in cash?
Why did Glean need more?
Hey, wait a second.
Take a look at this.
Glean announced their raise on June 18th, 2025, two days after Cursor's price increase and the same day that Replit announced a similar price hike.
It's almost as if the dramatic pricing increases affected them due to the introduction of Anthropic's service tiers and OpenAI's priority processing, but I'm guessing.
I know I'm guessing, but it is kind of weird that all of these companies raise money and all announce these things around the same time.
Every business has an ambition.
PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
When it's time to get growing, there's one platform for all business, PayPal Open.
Grow today at PayPalopen.com.
Loan subject to approval in available locations.
Mint is still $15 a month for premium wireless.
And if you haven't made the switch yet, here are 15 reasons why you should.
One, it's $15 a month.
Two, seriously, it's $15 a month.
Three, no big contracts.
Four, I use it.
Five, my mom uses it.
Are you playing me off?
That's what's happening, right?
Okay.
Give it a try at mintmobile.com slash switch.
Upfront payment of $45 for three-month plan, $15 per month equivalent required.
New customer offer first three months only.
Then full price plan options available.
Taxes and fees extra.
See mintmobile.com.
Hey, that reminds me.
I got another problem.
I got another problem here, because I think that there is another reason why the cycles kind of keep repeating.
You get a company that grows, and then they kind of go nowhere, because, well, the company doesn't really seem to have a total addressable market much bigger than $100 million ARR.
And I think it's a little simple.
It's quite simple, in fact.
There really are no unique generative AI companies.
And building a moat on top of LLMs is near impossible.
If you look, and man, am I going to get some emails about this, but bring them on.
If you look at what generative AI companies do, and note that the following is not a quality barometer, it's probably one of the following things: they're either a chatbot of some kind, one you either ask questions or talk to, which includes customer service bots; searching, summarizing, or comparing documents, with increased complexity or quantity of documents to be compared, which includes being able to ask questions of documents; web search; deep research, meaning long-form web search that generates a document where some parts of it will inevitably be hallucinated or derived from low-quality sources; generating text, images, voice, or in some rare cases, video; using AI to write, edit, or maintain code; transcription; translation; or photo and video editing.
Every single generative AI company that isn't OpenAI or Anthropic, and honestly, kind of those two, does one or a few of these things.
And I mean every one of them.
And it's because every single generative AI company uses large language models, which have inherent limits on what they can do.
LLMs can generate, they can search, they can kind of edit, they can sometimes transcribe accurately, and they can sometimes translate, well, much less accurately, I guess.
Within weeks of Cursor's change to its services, Amazon and ByteDance released competitors that, for the most part, do exactly the same thing.
Sure, there's a few differences in how they're designed, but design is not a moat, especially in a high-cost negative profit business, where your only way of growing is to offer a product you can't sustain.
The only other moat you can build is the services you provide, which when your services are dependent on a large language model, are dependent on the model developer, who, in the case of OpenAI and Anthropic, could simply clone your startup because the only valuable intellectual property is the models, and those models are theirs.
You may say, well, nobody else has any ideas either, to which I say I fully agree.
My rot-com bubble thesis suggests that we're all out of hypergrowth ideas, and yeah, I think we're out of ideas related to any large language models, too.
At this point, I think it's fair to ask, are there any good businesses you can build on top of generative AI or large language models?
I don't mean add features related to.
I mean an AI company that actually sells a product that people buy at scale that isn't called ChatGPT or Claude.
In previous tech booms, companies would make their own models, their own infrastructure, or the things that make them distinct from other companies.
But the generative AI boom effectively changes that by making everybody build stuff on top of somebody else's models, because training your own models is both extremely expensive and requires vast amounts of infrastructure and just pure power.
As a result, much of this boom is about a few companies, really two if we're honest, getting other companies to try and build functional software for them.
And these companies, OpenAI and Anthropic, are their customers' weak point in a relationship that veers from symbiotic to parasitic at a moment's notice.
I cannot stress enough how bad OpenAI and Anthropic are for their business customers.
Their models are popular, by which I mean their customers' customers will expect access to them, meaning that OpenAI and Anthropic can, as they did to Cursor, arbitrarily change pricing, service availability, and functionality based on how they feel that day or whether they need to pump their annualized revenue for investors.
Don't believe me?
Anthropic cut off access to AI coding platform Windsurf because it looked like they might get acquired by OpenAI.
They never were.
They just harmed that business.
They just cut a hole in them.
Why?
Because they might touch another business.
The most anti-competitive shit in the world and everyone sat there clapping like a fucking seal.
Disgusting.
Even by big tech standards, this fucking sucks.
And these companies will do it again.
But you know what?
Let's talk about the actual uses of generative AI because the limited number of use cases is because large language models are all really, really similar.
Because all large language models require more data than anyone has ever needed, including like four times the amount of data on the internet, they all basically have to use the same thing, either taken from the internet or bought from one of the few companies like Scale, Surge, Turing, Together, or whoever.
While they can get customized data or do customized training and reinforcement learning, these models are all transformer-based and they all function similarly.
And the only way to make them different is by training them, which doesn't make them that much different, just better at things they already do.
And good lord, is generative AI so ungodly expensive.
And the training is as well, by the way.
They have to pay real humans as well, which they hate doing.
And even when they're paying outsourced labor in Kenya at $2 a pop, they're still losing a ton of money.
It's really crazy, actually, how badly built all of this is.
And I already mentioned OpenAI and Anthropic's costs, as well as Perplexity's $50 million bill in a year to Anthropic, Amazon, and OpenAI off of a measly $34 million in revenue.
These companies cost too much to run, and their functionality doesn't make enough money to make them make sense.
And the problem isn't just the pricing, but how unpredictable it is.
As Matt Ashare wrote for CIO Dive last year, generative AI makes a lot of companies' lives difficult through the massive spikes in costs that come from their power users, with few ways to mitigate those costs.
One of the ways that companies manage their cloud bills is by having some degree of predictability, which is difficult to do with the constant slew of new models and demands for new products to go with them, especially when said models can, and often do, cost more with subsequent iterations, not necessarily for much return; and if you're a company like a coding company, your customers are going to actually ask you for the new models.
As a result, it's hard for AI companies to actually budget.
But Ed!
What was that?
Ed, what about agents?
Aren't they the thing that'll eventually make the insane broken calculus behind generative AI actually work?
What is your accent, mate?
Anyway.
Anyway.
Let me tell you about agents.
The term agent is one of the most egregious acts of fraud I've seen in my entire career writing about this crap, and that includes the metaverse.
When you hear the word agent, you are meant to think of an autonomous AI that can go and do stuff without oversight, replacing someone's job in the process, and companies have been pushing the boundaries of good taste and financial crimes in pursuit of them.
Most egregious of them is Salesforce's Agentforce, which lets you deploy AI agents at scale, that's a quote, and brings digital labor to every employee, department, and business process, another quote from Salesforce's website.
These are two blatant fucking lies.
Agentforce is a goddamn chatbot program.
It's a platform for launching chatbots.
They can sometimes plug into APIs that allow them to access other information, but they're neither autonomous nor agents by any reasonable definition.
Not only does Salesforce not actually sell agents, its own research shows that agents in general only achieve around a 58% success rate on single-step tasks.
And I'm going to quote the register here.
This means tasks that can be completed in a single step without needing follow-up actions and more information.
On multi-step tasks, so you know, most tasks, they succeed at a depressing 35% of the time.
Last week, OpenAI announced its own ChatGPT agent that can allegedly go and do tasks on a virtual computer.
In its own demo, the agent took 21 minutes or so to spit out a plan for a wedding with destinations, a valid calendar, and some suit options, and then showed a pre-prepared demo of the agent preparing an itinerary of how to visit every major league ballpark. That's baseball, for the non-Americans out there.
In this example's case, Agent took 23 minutes and produced arguably the most confusing map I've seen in my life.
You can see the map in the newsletter version of this episode.
It's hilarious.
It missed out every single major ballpark on the East Coast, including Yankee Stadium and Fenway Park, which are two of the most well-known stadiums in sports, and added a bunch of random ones, like one in the middle of the Gulf of Mexico.
What team is that, Sammy?
The Deepwater Horizon Devils?
Is there a baseball team in North Dakota, Clammy Sammy?
Sammy!
I also should be clear, this was a pre-prepared example.
This is the best they had.
I want to see the cutting room footage on this, because you best bet that that map looked like straight dog shit.
As with every large language model product, and yes, that's what this is, even if OpenAI won't talk about which model, results are extremely variable.
Agents are difficult because tasks are difficult, even if they can be completed by a human being that the CEO thinks is stupid.
What OpenAI appears to be doing is using a virtual machine to run scripts that its models trigger.
Regardless of how well it works, and it works very, very, very, very poorly and inconsistently, it's also very likely expensive to run.
In any case, every single company you see using the word agent is trying to mislead you.
They're lying.
Glean's AI agents are chatbots with if-this-then-that functions that trigger events using APIs, which means if an event happens, another thing will be triggered.
Not taking actual actions because that is not what LLMs can do.
ServiceNow's AI agents that allegedly act autonomously and proactively on your behalf are, despite claiming they go beyond better chatbots, still ultimately better chatbots that use APIs to trigger different events using if-this-then-that functions.
Sometimes these chatbots can also answer questions that people might have or trigger an event somewhere.
Oh, right.
That's literally the same thing.
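To make that claim concrete, here's a minimal, entirely hypothetical sketch of what an "agent" of this kind amounts to: a chatbot loop where certain model outputs match an if-this-then-that rule and fire an API call. Every name here is made up for illustration; this is not any vendor's actual code or API.

```python
# Hypothetical sketch of an "agent": a chatbot plus if-this-then-that
# API triggers. None of these names are real vendor APIs.

def fake_llm(prompt: str) -> str:
    # Stand-in for a model call; a real system would call an LLM API.
    if "refund" in prompt.lower():
        return "ACTION: create_ticket"
    return "Here is an answer to your question."

def create_ticket() -> str:
    # Stand-in for an API call to a ticketing system.
    return "ticket #1234 created"

# The "if this, then that" table: model output -> API call.
TRIGGERS = {"ACTION: create_ticket": create_ticket}

def agent_turn(user_message: str) -> str:
    reply = fake_llm(user_message)
    # If the model's output matches a trigger, fire the API call.
    if reply in TRIGGERS:
        return TRIGGERS[reply]()
    # Otherwise it's just a chatbot answering a question.
    return reply

print(agent_turn("I want a refund"))       # ticket #1234 created
print(agent_turn("What are your hours?"))  # Here is an answer to your question.
```

That's the whole trick: the "autonomy" is a lookup table between model output and a pre-wired API call, which is why these products are chatbots by any reasonable definition.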
The closest we have to an agent is any kind of coding agent, which is to say they can make a list of things that you might do on a software project and go and generate code and push stuff to GitHub when you ask them to.
And they can do so autonomously, in the sense that you can just let them do what a model that doesn't know anything and has no consciousness thinks is right, based on its corpus of data and the things you give it access to.
And it's about as safe as that sounds.
When I say "ask them to" and "go," I mean that these agents are not intelligent at all.
They do not have intelligence, and when let run rampant, they fuck up everything and create a bunch of extra work.
Also, a study found that AI coding tools made engineers 19% slower.
Nevertheless, none of these products are autonomous agents.
Anybody using the term agent likely means chatbot.
And all of this is working because the media keeps repeating everything these companies say.
It's a disgrace.
We need to stop this.
I realize we've taken a kind of a scenic route here though.
But I needed to lay the groundwork because I really am alarmed.
According to a UBS report from the 26th of June, the public companies running AI services are making absolutely pathetic amounts of money from AI.
Microsoft, according to UBS, is making annual revenues of somehow less than what The Information reported, at $2.1 billion.
ServiceNow is making less than $250 million.
Adobe, less than $125 million.
Salesforce, less than $100 million.
Now, ServiceNow said $250 million ACV, annual contract value.
This may be one of the more honest explanations of revenue I've seen, putting them in the upper echelons of AI revenue, unless, of course, you think about it for a couple seconds and think, are these all AI-specific contracts?
Or perhaps they're in contracts where you've taped AI onto the side.
Ah, who gives a shit.
It's also year-long agreements that could churn.
And according to Gartner, over 40% of agentic AI projects will be cancelled by end of 2027.
And really, you gotta laugh at Adobe and Salesforce, both of whom talk such a goddamn fuckton about generative AI, and yet have only made an amazing $125 million or less in annualized revenue from it.
Pathetic!
Crap.
Dog shit.
These aren't futuristic numbers.
They're barely product categories and none of this seems to include costs.
Oh, well.
Good grief.
Look, a lot of what I've been saying is reminiscent of the previous podcast, and I've gone over this a lot because I really want to make it clear that the signs are very troubling and that the things I've warned you about for the past couple of years are only getting worse.
And the cliff's coming up, things are only getting closer.
When we tumble off of it, things may get really, really bad.
And in the next episode, we'll talk about how and what that tumble might look like and the noises I'm going to make when it happens.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Matt Osowski.
You can check out more of his music and audio projects at mattosowski.com.
M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and, of course, my newsletter.
I also really recommend you go to chat.wheresyoured.at to visit the Discord, and go to r/BetterOffline to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of CoolZone Media.
For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Hi, I'm Morgan Sung, host of Close All Tabs from KQED, where every week we reveal how the online world collides with everyday life.
There was the six-foot cartoon otter who came out from behind a curtain.
It actually really matters that driverless cars are going to mess up in ways that humans wouldn't.
Should I be telling this thing all about my love life?
I think we will see a Twitch stream or president maybe within our lifetimes.
You can find Close All tabs wherever you listen to podcasts.
From Australia to San Francisco, Cullen Jewellery brings timeless craftsmanship and modern lab-grown diamond engagement rings to the US.
Explore solitaire, trilogy, halo, and bezel settings, or design a custom piece that tells your love story.
With expert guidance, a lifetime warranty, and a talented team of in-house jewellers behind every piece, your perfect ring is made with meaning.
Visit our Union Street showroom or explore the range at cullenjewellery.com.
Your ring, your way.
Ready to take advantage of an incredible deal at Mazda?
September is the final month of eligibility for federal $7,500 electric vehicle lease cash on the new Mazda CX70 and CX90 plug-in hybrid.
All Mazda current inventory is unaffected by new tariffs.
See your local Mazda dealer for details.
$7,500 electric vehicle lease cash offer expires at the end of September.
Don't miss out.
$7,500 lease customer cash good toward 2025 CX70 PHEV and CX90 PHEV when leasing through Mazda Financial Services.
Lease customer cash can be combined with other public offers, including lease incentive offers.
Lease customer cash cannot be combined with APR or other customer cash offers.
Lease customer cash is not redeemable as cash or cash back option.
Lease customer cash is only available on approved credit.
Not all customers will qualify for credit approval or offer.
Limit 1 discount per customer per vehicle.
Lease customer cash offer only available in United States regardless of buyer's residency.
Void where prohibited.
Applied within the lease structure as a capital cost reduction.
Lease customer cash is only available on participating Mazda dealer's current inventory, which is subject to availability.
Offer ends 9/30/2025, and you must take delivery prior to the expiration of the offer.
See Participating Mazda Dealer for complete details.