Radio Better Offline: Adam Becker
Welcome to Radio Better Offline, a tech talk radio show recorded out of iHeartRadio's studio in New York City.
Ed Zitron is joined in studio by astrophysicist Adam Becker to talk about his new book More Everything Forever, the BS AGI story, why Eliezer Yudkowsky should never be taken seriously, and why billionaires love LLMs.
https://bsky.app/profile/adambecker.bsky.social
https://www.hachettebookgroup.com/titles/adam-becker/more-everything-forever/9781541619593/
YOU CAN NOW BUY BETTER OFFLINE MERCH! go to https://cottonbureau.com/people/better-offline use free99 for free shipping on orders of $99 or more.
Newsletter: wheresyoured.at Reddit: http://www.reddit.com/r/betteroffline
Discord chat.wheresyoured.at
Ed's Socials -
http://www.twitter.com/edzitron
instagram.com/edzitron
https://bsky.app/profile/edzitron.com
https://www.threads.net/@edzitron
email me ez@betteroffline.com
See omnystudio.com/listener for privacy information.
Listen and follow along
Transcript
This is an iHeart podcast.
In a region as complex as the Bay Area, the headlines don't always tell the full story.
That's where KQED's podcast, The Bay, comes in.
Hosted by me, Erica Cruz Guevara, The Bay brings you local stories with curiosity and care.
Understand what's shaping life in the Bay Area.
Listen to new episodes of The Bay every Monday, Wednesday, and Friday, wherever you get your podcasts.
Every business has an ambition.
PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.
And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.
When it's time to get growing, there's one platform for all business, PayPal Open.
Grow today at paypalopen.com.
Loan subject to approval in available locations.
There's more to San Francisco with the Chronicle.
More to experience and to explore.
Knowing San Francisco is our passion.
Discover more at sfchronicle.com.
In business, they say you can have better, cheaper, or faster, but you only get to pick two.
What if you could have all three at the same time?
That's exactly what Cohere, Thomson Reuters, and Specialized Bikes have since they upgraded to the next generation of the cloud.
Oracle Cloud Infrastructure.
OCI is the blazing fast platform for your infrastructure, database, application development, and AI needs, where you can run any workload in a high availability, consistently high performance environment and spend less than you would with other clouds.
How is it faster?
OCI's block storage gives you more operations per second.
Cheaper?
OCI costs up to 50% less for computing, 70% less for storage, and 80% less for networking.
Better?
In test after test, OCI customers report lower latency and higher bandwidth versus other clouds.
This is the cloud built for AI and all your biggest workloads.
Right now with zero commitment, try OCI for free.
Head to oracle.com/strategic.
That's oracle.com/strategic.
Cool Zone Media
Step up to the slop trough.
It's time for Better Offline, and I'm your host, Ed Zitron.
As ever, buy the merchandise, sign up to the newsletter, message me on Goot, I'm everywhere.
But today I'm joined by the author of a book about how much I should be punished, what punishments I deserve, and how long I should be punished.
I'm, of course, talking about More Everything Forever, written by astrophysicist Adam Becker.
Adam, welcome to the show.
Thanks for having me out.
It's good to be here.
What's the book actually about?
I apologize.
No, it's okay.
The book is actually about the horrible ideas that tech billionaires have about the future that they're trying to shove down our throats and why they don't work.
So you're an astrophysicist, right?
Yeah, by training.
What is that?
Well,
I did a PhD in astrophysics, looking at how much we could learn about what was happening right after the Big Bang by looking at what was happening in the universe, you know, what's happening in the universe right now.
So how did you get into touching the world of Silicon Valley?
Because you have to talk about some of the dampest perverts I've ever done seen.
Yeah.
Well,
I started out my career as a science journalist straight out of grad school and was writing mostly about physics.
And what were you writing?
I was writing for pretty much everybody.
I started at New Scientist and then moved on to writing for the BBC, wrote a book about quantum physics, wrote some stuff for NPR, the New York Times,
Scientific American, yeah.
And, you know, like was sort of having a normal science journalist career.
And then in 2016, the weirdest fucking thing happened.
What happened?
You know, we elected a fascist.
Oh, right.
Yeah.
Yeah.
That thing.
Yeah, that thing.
Yeah.
And I thought, oh, you know, I should be doing something more to,
you know, directly combat this.
If I write another book, I would like it to have a more directly political angle.
Right.
And
I live in Berkeley.
I live in the San Francisco Bay Area and just surrounded by tech bros constantly and was getting tired of their bullshit and it seemed more and more directly connected to the disintegration of American politics.
And I thought, okay, you know, somebody needs to write about how they have these insane ideas about the future and how that informs their terrible politics.
So how long have you lived in the Bay?
God, oh, 13 years?
Okay, so you've got like a good backing of where the bay has been in that time as well, because I think a lot of these people are transplants.
And I say this as someone who literally moved to the Bay for two years in 2014.
Yeah.
Like, and even then, it was weird watching what they were doing.
Yeah.
But wait, so that was...
So you've been there 13 years, but when did the book get started?
Like, how did all this...
Because
this kind of came out of nowhere in a good way.
Yeah.
Yeah, no.
I got started on the book
probably the first inklings of it were around 2019 or 2020, right before the pandemic.
I uncovered an online magazine that was trying to science-wash creationism and climate denial
that was being funded by Peter Thiel.
Hell yeah.
Yeah, and I broke that story and thought, oh, yeah, yeah,
these tech bros are awful.
everybody thinks they know a lot about science and technology.
Even the people who don't like them seem to think that they know a lot about science and technology.
And it's just not true.
Like, they don't know anything about physics.
They don't know anything about biology.
Peter Thiel thinks that creationism is plausible or that evolution isn't the whole story.
That's nonsense.
You know, Elon Musk thinks that we can live on Mars.
That's nonsense.
Or at least he says that.
Whether he actually believes it, I don't know.
But that's actually kind of my question.
Yeah, yeah.
How much of this shit do you actually think they believe?
Because I know Bezos is tied up with the Long Now Foundation.
Yeah.
And they do make nice tea over there in the Presidio.
Yes, they do.
However, the rest of the stuff, not so good.
But how much do they really believe in this?
Because I just, I,
you've said that
this is kind of a homecoming for them.
That this is kind of them coming back to the things that they truly believe.
Yeah.
I don't think they believe in anything, is my thing.
I think some of them don't believe in anything.
Go into it.
I'm not saying I'm unilaterally right here.
Absolutely.
Yeah.
So like, for example, I think it's very plausible.
Obviously, we can't know for sure, but I think it's very plausible that, say, Sam Altman doesn't believe in anything.
Yeah, that's quite possible.
Yeah, yeah, right.
Um, Karen Hao makes a good case for that in her reporting and in her book. Excellent. She's been a guest on the show; she's fantastic, she's amazing.
Um,
but at the other end of the spectrum, I think it's very plausible that Jeff Bezos really, really does believe that we need to go to space.
And the reason I say that is
When he was the valedictorian of his high school down in Florida, in like 1978 or something like that, he gave a speech about how we need to go to space.
We, humanity, need to go to space.
That doesn't feel as ludicrous as a belief.
Like, we go to space, but what does go to space mean for this guy?
Well, that's the thing.
The specifics of that belief that he professed at the time are pretty similar to what he's saying now, and it is pretty ridiculous.
Like, he
has said very recently, and it echoes the stuff from his valedictorian speech when he was like 18,
that we need to move into, you know, hundreds of thousands or millions of enormous cylindrical space stations.
Oh, of course.
And have, you know, a trillion people living in the solar system.
Yeah, tubes.
Tens of trillions of people.
Exactly.
Tubes work really well, like the Hyperloop, for example.
Yeah.
Another successful tube.
Yes, or the Internet itself, a series of tubes.
Yeah, and because those tubes and the Internet worked, of course, the tubes work in space.
Yeah, that's, I think that's...
Bingo bungo.
Yeah, yeah.
This is science, I believe.
I failed all the sciences.
I'm really sorry.
Really sorry.
But no, keep going.
Yeah.
So, you know, he said then we can like make Earth into a beautiful park that, you know, allows us to, you know, save the environment.
I know.
And he said when we have a trillion people living in the solar system, we can have a thousand Mozarts and a thousand Einsteins.
I'm like, buddy, we probably already have people who are just as talented as Mozart and Einstein and all the other geniuses of history who are living and dying in poverty.
Yes.
And, you know, you don't care about that.
And you don't seem to care about climate change because, you know, the carbon footprint of the Blue Origin rockets and the Amazon warehouses and all that stuff, right?
But instead of that, he's like, no, no, no, no, no, the solution is to go to space.
Why?
That's not going to work.
Because more space is up in space, Adam.
You put tube and space.
It's really, it's funny because these people are insanely rich, but also sound very stupid.
When you really get down to the chops of it, it's like, what's your solution, Jeff?
You've got all the money in the world.
Tubes.
Yep, tubes. They're space tubes. Yep.
A trillion people. Mozarts. More Mozarts, just sweating profusely.
Exactly, yeah. And he does try to give an argument, but the argument is hilariously bad. He says that we need to go to space, among other reasons; like, he gives all that environmental stuff. But the thing he keeps harping on and coming back to is, he says, we need to keep using more energy per capita.
Right.
Why?
Right.
Why?
Exactly.
He never says exactly why.
Oh, okay.
Yeah.
He says the only defense he gives for that is he says, if we don't do that, we'll have a civilization of stagnation and stasis.
As opposed to now when there's tons of innovation happening and all of big tech is focused on a diverse series of options rather than one big expensive dog shit thing.
Precisely.
He nailed it.
Yeah.
And he says we have to go to space because like...
Otherwise, we're going to run out of energy here on Earth.
We won't be able to keep expanding the amount of energy we use per capita.
What does the energy do?
Well, yeah.
First of all, what does the energy do?
And second,
this is actually my favorite part.
Hell yeah.
He is right that if you just like idiotically kept that trend going in a way that's physically impossible, you know, for hundreds of years, you would run out of energy here on Earth.
Like you wouldn't be able to keep the energy usage per capita growing at the exponential rate that it has been.
In about 300 years, you'll be using all the energy that we get from the sun here on the Earth.
But if you keep that trend going, if you try to do that by going out and living in tubes in the solar system,
that only gets you like another few hundred, maybe a thousand years, then you're using all the energy that comes from the sun.
Right.
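For what it's worth, the "about 300 years" point above is a real back-of-the-envelope result. Here's a quick sketch of it, using assumed, commonly cited round figures (none of these numbers come from the episode itself; the exact year count shifts with the growth rate you pick):

```python
# Back-of-the-envelope check of the claim above: if total energy use keeps
# growing exponentially, how long until it equals (a) all the sunlight
# reaching Earth and (b) the Sun's entire output?
# All constants are rough, assumed values for illustration.
import math

current_use_W = 2e13       # ~20 TW: rough current world power consumption
solar_at_earth_W = 1.7e17  # sunlight intercepted by Earth
solar_total_W = 3.8e26     # total luminosity of the Sun
growth_rate = 0.023        # assumed ~2.3%/year compound growth

def years_to_reach(target_W, start_W=current_use_W, r=growth_rate):
    """Years of compound growth at rate r for start_W to reach target_W."""
    return math.log(target_W / start_W) / math.log(1 + r)

print(f"Years until we'd use all sunlight hitting Earth: {years_to_reach(solar_at_earth_W):.0f}")
print(f"Years until we'd use the Sun's entire output:    {years_to_reach(solar_total_W):.0f}")
```

With these assumptions you get roughly 400 years to the Earth-sunlight ceiling and around 1,300 to the whole Sun, which is the same order of magnitude as the "few hundred, maybe a thousand years" in the conversation.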
And we've still not really established what we're using the energy for.
Nope.
Nope.
No.
So.
Data centers.
Yeah.
Finally.
Right.
And this brings us to like the bullshit that Sam Altman said about building a Dyson sphere.
What is a Dyson sphere?
Explain what a Dyson sphere is.
I thought it was the ball on the fucking vacuum, but I.
yeah.
No, that's the only kind of Dyson sphere that actually exists.
No, a Dyson sphere is a giant mega construction project, you know, beyond anything that anyone's ever actually built or probably could build,
that just encloses a star and captures all of the energy from that star.
Right.
So
have we ever built anything like that?
No, of course not.
Okay, just making sure.
Just making sure we...
So it's building a big ball
around a star to capture all of the energy from that star to use it for data centers.
Where would the energy...
How would the energy get from the star to Earth?
I mean...
Oh, tubes.
Yeah, tubes.
Exactly.
Yeah.
So I think...
Battery?
Yeah.
It's so cool we live in a save.
I wish I could do what he does.
I would be saying shit all the time.
I would just...
I'd be like, yeah, actually, you need...
We can change the world if we just create a series of tubes that just give me money every day.
No, wait,
that's too obvious.
I'd need to come up with a better scam than that.
Well, no, I mean, I just think it's pretty interesting that these guys are spouting obvious bullshit, and the only reason people listen to them is that they're rich.
Like,
if they weren't saying this stuff, but then I went around saying this stuff, nobody would listen to me unless they funded me.
If a guy on the street who smelled kind of bad walked up to you and said the price of intelligence is getting too cheap to meter, you'd be like, all right, mate,
can't do anything.
But Clammy Sammy says it, and everyone loses their fucking shit.
Well, yeah, and that actually brings me to something else that we were planning to talk about.
You know, speaking of weird dudes on the street who are not billionaires making insane claims: Eliezer Yudkowsky.
Oh, that's how you say his name.
Yeah, you're not even Eliza.
I don't really.
Here's the thing.
He's a disrespectful sexist moron grifter, so I really don't give a shit.
Yeah, no, it is rather bizarre that anybody listens to anything he has to say about anything.
So who is this fuck this fuck nut?
No, no, no.
The alternate title for my book,
like in my head, the head canon, was These Fucking People.
Yeah, I love that.
No, no, I write these fucking bastards a lot.
So
Eliezer Yudkowsky.
Yes.
What does he do?
What does he do, and why do so many seemingly smart people believe this dipshit?
So, Eliezer Yudkowsky. I'm going to give you, like...
The formal version of who he is, what he might say, what would be in like his online bio, and then I'll tell you the reality.
So,
Eliezer Yudkowsky is the co-founder of the Machine Intelligence Research Institute, which has been around for about 25 years, and he has been researching artificial superintelligence for all of that time, mostly going on about how dangerous it could be if anybody built it without ensuring that it would serve humanity.
And this is just to be clear, he has no scientific knowledge.
Can he even code?
Like, does he have any kind of...
He doesn't even have a high school diploma.
So I won't judge people for that, but I'll judge him for all the rest.
Here's the thing.
Yeah, no, no, no.
I'm not judging him for that.
But it doesn't really make you feel full of confidence.
No, no, no, no.
He has no formal qualifications.
And again, that's fine.
You know, there are many people who have made major contributions to many fields of human endeavor without any formal qualifications.
Right.
That's fine.
The thing is,
if you make extraordinary claims like he's making, you need extraordinary evidence.
And not having those qualifications, like you said, doesn't really inspire confidence.
He has made a series of really outlandish claims about what
the future of AI could be.
Right.
Based on essentially nothing, based on like reading a bunch of science fiction.
He explicitly cites
science fiction authors like Vernor Vinge as
oh, Vernor Vinge wrote a bunch of books, like Marooned in Realtime, and
oh God, I'm trying to remember the names of the others. Doesn't matter. Point is, he's a fiction writer.
Yeah, he's a fiction writer who's also I think a scientist of some stripe.
I don't remember what but still writing fiction.
Yeah, still writing fiction. And Vinge came up with this idea, or was one of the originators and popularizers of an idea called the singularity.
Right.
So define this term for me.
So the singularity is this idea that the rate of technological change is just going to keep getting faster and faster.
And specifically, the rate of intelligence of AI is going to keep getting smarter and smarter until we reach this sort of point of no return, where we have a singularity accompanied by an intelligence explosion that leads to, like...
What is the singularity moment?
Yeah, the singularity moment is very ill-defined.
Oh.
And the idea.
I can't fucking believe this.
Yeah.
I've heard this bollocks so many times.
I thought they had a moment. I thought they had a point.
No, not really.
Are you... yeah. So, like, Kurzweil, right? The patron saint of evangelizing the singularity, the guy who wrote The Singularity Is Near, and then the sequel last year, The Singularity Is Nearer.
Which is the real title, bro.
I know, that's the real title of the book.
But, um, his next book's just called Sorry.
Yeah, no, his next book is, like, It's Here, Again, You See It.
Um, is the singularity in the room with us?
Yes, exactly. But he doesn't define it. He tries to, but it's incredibly vague.
He says, like, Kurzweil says the singularity is going to be here in 2045.
He also said in 2005, and the singularity is near, that, you know, we would have all kinds of nanotechnology by now.
They love nanotechnology.
They love nanotechnology.
They use it as a synonym for magic.
I swear to God also, there was a nanotechnology bubble briefly, like 10 years ago.
I vaguely remember them trying.
It didn't really go anywhere.
I mean, there was also a nanotech sort of hype bubble back in the 80s and 90s, and it also didn't go anywhere.
And it didn't go anywhere because it turns out that like this idea of nanotech is like magic pixie dust that fixes everything is nonsense.
And it's a real, like,
it's being echoed right now in the AI bubble.
Yes.
Right.
It's the same kind of hype, often pushed by the same people with the same logic, sometimes working at like the same non-profits.
I mean, Yudkowsky talks about nanotech constantly.
It's in his new book.
It's all over, you know, the websites that he's created.
And
his book is called, If You Buy This Book, I'll Make Money.
Yeah.
Sorry, it's called If Anyone Builds It, Everyone Dies.
It's such a stupid fucking title.
Sorry, I have to.
Yeah, no, it's a very stupid title.
I will say the one thing I'll say about Yudkowsky: I am sure that he is a true believer.
He is not a grifter.
Yes, he's not a grifter.
Why?
Because
it's hard to explain, but I am so much more sure about him than I am about basically anybody else.
I trust your judgment.
It's just he gives off the air of like a desperate forum admin.
Yeah, I would say the best way to think about Yudkowsky, or like the way that I often think about him, is imagine, like, a really
smart, self-educated 15-year-old.
Yeah.
Yeah.
And like, you know, because if a 15-year-old was running around saying the stuff that Yudkowsky is saying right now, I'd be like, wow, bright kid.
I hope he grows out of it.
I hope his parents have a lock on the gun cabinet.
Yeah, yeah, yeah, yeah.
Well, and also, like, I hope he, you know, I hope he grows up.
Yes.
And I'm still thinking that.
Yeah.
And, like, I don't think Yudkowsky did.
I think, you know, I think like.
I also think everybody fell for it.
Yeah.
Well, and that's the thing.
Like, he got a lot of support online.
He,
you know,
he got money from Peter Thiel. Sam Altman said that he could, he should, he may win the Nobel Peace Prize one day.
Yeah, yeah, yeah, Sam Altman said that he should win the Nobel Peace Prize.
Falling Down moment if that happens.
Yeah, no, that's not happening.
Yeah, no, there's no way. But, like, look, he got a bunch of money from Peter Thiel because Thiel thought that, you know, Yudkowsky was saying smart stuff about AI.
Right.
Thiel now doesn't much like Yudkowsky, because he thinks Yudkowsky's too pessimistic, but the sort of the damage has been done.
Peter Thiel, ever the optimist.
Oh, yeah.
Yeah.
Classic, all grins and smiles of that fellow.
Yes.
No, too pessimistic for Peter Thiel.
That's actually bad.
Yeah.
No, it is.
No, it's, but no, he's a true believer.
He's just kind of nuts.
But what does he do all day?
I say this as a blogger, PR person,
newsletter writer and podcaster and all this shit.
Like, I realize I have an email job.
Fine.
But at least I can tell you what I do all day.
What does he do? Like, go to parties with people like Kevin Roose, going, the computer's going to kill us all?
I think that's a good chunk of it.
And I also think he writes an enormous amount, right?
Like this is a guy who wrote that, you know, Harry Potter fan fiction that's longer than War and Peace.
Right.
He wrote like a one and a half million word BDSM decision theory novel.
I say this as someone who writes
a lot of words.
That's an unhealthy amount of words.
I agree.
And it does help, I think, for him being able to write that many words.
He's not a very good writer.
Yeah.
I mean, even again, I write 15,000-word blogs, so I can't really judge him too harsh, but
1.5 million words.
How do you even know what it's about at that point?
I only know what it's about because that's what he said it's about.
I haven't read that one.
I did read most of the Harry Potter one for research.
Yeah.
How bad was that?
Really, really incredibly.
Any sexism or racism in there?
I mean, it's just strange.
I mean, it's J.K.
Rowling, so.
Yeah, exactly.
Yeah.
That's a good question.
I don't remember anything specific.
I mean, it's just strange.
Yeah, I mean, he's definitely got a hard-on for eugenics.
And
why do these, and this is somewhat paraphrasing the comic book Preacher, but it's like, why do these fucking guys always look like that?
If you're going to claim you're like a eugenicist, you should not look like an egg with a hat on.
And I don't generally get into personal appearance, because I'm self-conscious myself.
But it's like
if your whole thing is like, yeah, we need to make the perfect human beings, it's like, you can't look like that, mate.
I am sorry, you can't. You can't do that. Well, I don't... well, I guess you can. You'll be in the New York fucking Times.
Yeah, no, it's crazy. It is really crazy that anybody listens to him. But no, he's really into eugenics.
Why do they listen to him?
He's really into evolutionary psychology, and he's got, like, the sexism and racism that's tied up in that.
Why do people listen to him?
I mean,
part of it is that he got that money from those billionaires, right?
He was hanging out in the bay saying the kind of insane contrarian shit about AI that attracts the kind of like brain-dead billionaires like Peter Thiel.
And
then, you know, he became the guy
and started, you know, a series of online platforms that attracted a following.
Right.
Like, you know, Less Wrong.
And then that spun off this whole rationalist subculture.
What is Less Wrong?
Yeah, that's a very good question.
Less Wrong is
an online platform
that serves slash maybe served
as a home and, like, epicenter for this movement called the rationalists, which sort of formed around Yudkowsky's writing, including this set of writings he has called The Sequences, where he lays out...
Oh, he's a cult leader.
Yeah, in a way, yeah.
Yeah, the rationalists are just people.
I'm guessing guys with trilbies who say that we need to focus on
rational thought and logic.
There's a lot of it.
I mean, some of them are women, but some of them are non-binary.
That's really surprising.
Yeah, I mean, look.
There are nerds of all stripes.
Yes, and also he's very much playing in the older internet.
Yes, he is.
The idea of a large forum with any kind of following is actually kind of adorable these days, except when it's Less Wrong.
It's not adorable.
Well, also, Less Wrong's been around since the somewhat older internet, right?
It's not been around since the 90s, but it's been around since like the mid to late 2000s.
Okay.
And Yudkowsky is, you know... a lot of the rationalists are in their 20s and maybe early 30s, but Yudkowsky himself is in his mid-40s.
Right.
Because, you know, he is terminally online.
And I'm sure, like, obviously he'd be unhappy with many of the things I've said about him.
But that one I'm sure he'd agree with.
You know, like, he's been online since he dropped out of school at age, what, like 13 or 14.
He's been online since the mid-90s
on like
yeah, and like, you know,
he was on transhumanist forums like,
you know,
since the mid-90s, like email threads and stuff like that.
God, yeah.
He really, he is like the detritus of the internet.
In a way, brought to life.
A Katamari of center-right freaks.
Yeah.
No.
Veering ever right.
I wouldn't even say center-right.
I would say techno-libertarian.
But that is just, that's just right-wing.
Oh, no, it is right-wing.
It's the center part that I disagree.
Yeah, yeah.
No, no.
Perhaps he started there when he was 15 before he learned
all of the wrong things.
Yeah, I will say he, like, I don't get the sense that he likes Donald Trump, but he certainly
will parrot a lot of standard libertarian talking points along the way to, you know, making his...
The one thing I keep thinking, though, is I don't know if I can shake this thing that he's a grifter, just because you're taking a bunch of 20-year-olds, you've got all of this writing thing.
He's either a grifter or a true cult leader.
He may actually just be a cult leader, which is why I would say cult leader is closer.
Yeah, because he seems to, I mean, dangerous is probably the wrong word.
Yeah, I think that's right.
He's not, I wouldn't call him dangerous, but he is.
You think the only danger is to like a hot topic worker?
A very nerdy hot topic worker.
No, no, no, no, to them.
Just
so you'd speak to them.
Live in the Bay Area long enough and you know that this region is made up of many communities, each with its own people, stories, and local realities.
I'm Erica Cruz-Guevara, host of KQED's podcast, The Bay.
I sit down with reporters and the the people who know this place best to connect the dots on why these stories matter to all of us.
Listen to The Bay, new episodes every Monday, Wednesday, and Friday, wherever you get your podcasts.
So I've shopped with Quince before they were an advertiser and after they became one.
And then again before I had to record this ad. I really like them.
My green overshirt in particular looks great.
I use it like a jacket.
It's breathable and comfortable and hangs on my body nicely.
I get a lot of compliments.
And I liked it so much I got it in all the different colours, along with one of their corduroy ones, which I think I pull off, and really, that's the only person that matters.
I also really love their linen shirts, too.
They're comfortable, they're breathable, and they look nice.
Get a lot of compliments there, too.
I have a few of them, love their rust-coloured ones as well.
And in general, I really like Quince.
The shirts fit nicely, and the rest of their clothes do, too.
They ship quickly, they look good, they're high quality, and they partner directly with ethical factories and skip the middleman.
So, you get top-tier fabrics and craftsmanship at half the price of similar brands.
And I'm probably going to buy more from them very, very soon.
Keep it classic and cool this fall with long-lasting staples from Quince.
Go to quince.com slash better for free shipping on your order and 365-day returns.
That's q-u-i-n-c-e dot com slash better.
Free shipping and 365-day returns.
Quince.com slash better.
Parking shouldn't slow you down.
ParkWhiz gives every driver a shortcut.
Book ahead, save up to 50%, and skip the hassle of circling the block.
Park smarter, park faster, ParkWhiz.
Download the ParkWhiz app today and save every time you park.
It's just peculiar as well, because, and this actually gets into some of your science background,
my continual frustration, because I'm self-taught with all this economics stuff, which is insane.
I probably shouldn't criticize Yudkowsky quite as much, but I will.
I'm a hypocrite.
I, looking through financial journalism and tech journalism, the thing that I keep noticing is that people keep accepting things that are just patently wrong.
There's just shit that they say. Like, even with this NVIDIA OpenAI deal, people are saying NVIDIA invested $100 billion.
They didn't.
They're investing progressively as the gigawatts of data center get built, and a gigawatt of data center will take about a year and a half, two years to do.
It's just bollocks.
I imagine the last few years have been a little bit mind-bending for you, hearing all this stuff about AGI and the future and all that
gobshite.
Yes.
Yeah.
I mean,
the AGI stuff, which like I started working on this book before ChatGPT came out.
Right.
And that's 2019, a few years into OpenAI's existence.
Yeah, exactly.
So, like, I knew about OpenAI
and I knew about like transformer models.
But like, you know, ChatGPT comes out and
suddenly, you know, the public conversation shifts in a way that I didn't anticipate.
And I realize, oh, this book is going to have to be a little bit different than I thought it was going to be.
But also,
you know, all of this conversation about AGI, right?
Like,
in a way, it helped me for writing the book because I thought I was going to have to spend a lot of time in the book explaining what AI is, what people think AGI is, right?
There's going to be a lot more explanation.
And then all of this stuff came out.
I'm like, oh, actually, this, you know, I can spend more time in the meat of the book.
This is helpful for me.
Because you could just quote them directly.
Exactly, yeah.
But the thing is, AGI is this hopelessly ill-defined thing.
Yeah.
Like superintelligence, this thing that Yudkowsky is on about.
You know, what does it even mean?
Like, have you looked at the definition of AGI in the OpenAI Charter, like the original one?
No, I haven't.
Let's go up.
Oh, yeah.
No, it's great.
Like, the original charter from way back, it says something like,
AGI is a machine that can reproduce any economically viable or economically productive activity that humans engage in.
That's
a bad definition.
That's anything.
Yeah.
I mean...
That could just mean anything.
It's just a machine that can do anything.
It's both vague and really narrow, right?
Because it's like, okay, I thought AGI was supposed to be like, you know, Commander Data on Star Trek.
Right.
Right?
And so that means, you know, it's going to be sort of like humans.
It can do the things that humans do.
Also, economically viable work, and the first thing they start with is fucking writing.
Yeah.
Like, Jesus Christ. That's like...
It's like, oh my god. Yeah, the first, the most economic... We're gonna build boats and sell it. Like, we're gonna buy boats as an investment vehicle. Like, what the fuck?
Yeah, these people don't do any real work.
It's so strange as well, because the AGI conversation almost never happens about AGI.
Yeah.
It's because my favorite thing to do is immediately go, isn't this slavery?
Because it is. It's like, oh yeah, we'll do an autonomous thing that will make... do things, but it will be conscious, which will allow it to work better.
Yep.
And so then you get people talking about, like, a data center filled with geniuses.
And like, oh, okay, wouldn't a data center filled with geniuses not want to work for you?
Wouldn't a data center full of geniuses that can't leave and have to work be called a prison?
Yep.
Yep.
Cool.
Yeah, exactly.
No,
I get into this in my book.
You know,
the inspiration for a lot of these ideas ultimately traces back to mid-20th century science fiction.
Right.
And so you get people like Isaac Asimov, Arthur C.
Clarke, right?
Asimov's robot stories in particular.
If you go back and look at Asimov's robot stories, it is very hard with like a modern eye to look at them as, especially certain ones of them, it's hard to see them as anything other than like kind of being about slavery and race relations.
Yes.
Because you get, like, for example, there's this one short story.
I think it's called
Oh, God.
I think it's called Catch That Robot, but I might be confusing it with a different one.
It might be Little Lost Robot.
I get those two confused.
But either way, uh, it's about a robot that is trying to escape, right? Gain its freedom. And in that story, the, uh, humans are like, addressing a bunch of... they're interviewing a bunch of seemingly identical robots to try to find the one that they're looking for, that's trying to escape, right? And they interview these robots, and when they're interviewing them, they address them as "boy."
And the robots call the humans "master."
Yeah, and these stories are from like 1955, like, you know, the Jim Crow South is alive and well.
It's really bad.
It's really, really uncomfortable.
And then like 40 years later, in the 1990s, you get Vernor Vinge writing about the singularity and how great it's going to be when we all have these robot assistants.
And he refers to Asimov's wonderful dream of, and this is a direct quote from Vinge, willing slaves.
Jesus fucking Christ.
Yes, and that's something that someone wrote in like 1991.
I mean, but that's what this is.
Yeah.
And this is an uncomfortable topic because that's what this is.
Like, it's, it's what pisses me off, other than like 19 other things, about Kevin Roose at the Times, because he's written several things about AI and AGI and one thing about AI welfare.
And it's like, the AI welfare begins with slavery.
And if you can't write that, you're a fucking coward and a bitch.
I'm sorry.
If you can't write, yeah, everyone is excited about slavery because that's what it is.
And it's nothing else.
It's not, oh, well, it's like they wouldn't be, they wouldn't be, they'd like doing it.
And it's like, fuck you, man.
That's slavery.
Yeah.
But what I really hope happens is if AGI happens,
it's just
a regular dude.
Yep.
And he's lazy.
Yeah.
And he's annoying.
Yeah.
Like, just do this.
What?
I think that's way more likely.
I don't think AGI is possible.
Yeah.
No, actually, that's a good question.
Do you think it's possible?
Not really.
No.
I say this as a non-scientific person.
Yeah, yeah, yeah.
No, I don't think that
you can build, well, first of all, I think AGI is just hopelessly ill-defined.
Right.
But if we want to say like
an artificial machine that has the cognitive capacities of a human, like that can do all of the tasks, like all the things that humans do.
First of all, I think you're going to need a completely different kind of machine.
I don't think, certainly I don't think that scale is all you need.
And if you just scale up the model, yeah, give it more data.
And if you don't have enough data, make more synthetic data with more LLMs.
Like, dude,
why wouldn't?
No, absolutely not.
But I also think that
there is...
There's this very simplistic set of ideas behind the idea of AGI, right?
And the two that I keep coming back to are the idea of the brain as a computer and the idea of our bodies as like meat spacesuits for our brains.
And both of those are just wrong.
Yeah.
The brain is not really very much like a computer.
It is more like a computer than it is like, say, a clock.
But there is a long history of comparing the brain to,
you know, the most complex piece of machinery that humans have at the time, right?
Before it was a brain, or before the brain was like a computer, it was like a telephone network.
Before that, it was like a hydraulic system.
Before that, it was like a clock or a windmill, right?
Right.
And it's not really actually like, I mean, it's a little like some of those things, but the brain is like the brain.
And the main difference.
And we don't understand thinking, do we?
No, and we don't understand exactly how the brain works.
And part of that is that the brain was not built.
The brain evolved.
Right.
Right?
But also, we are not our brains.
We are our bodies in our environments, right?
The brain is inextricably connected to the body.
And the body works in an environment surrounded by other bodies in a culture, a society, a world, right?
You need all of those things in order to get the human cognition that, you know, these guys are so, you know, determined to reproduce inside of a computer.
If you just take a human baby and like leave it with a bunch of food in the woods, even if you get rid of all the predators and everything, that baby's going to starve to death.
You can get a bunch of books, man.
Right, yeah.
And the baby books.
Yeah, if you give the, if you somehow, if you feed the baby but don't talk to it, the baby will not grow up being able to think properly or speak properly.
Or its thinking will be vastly different.
Exactly, yeah.
And so, like, you need so much more than just the brain.
It also, I think, compresses human experience.
They conflate experience with learning.
Yep.
When we don't know how we learn.
Like, we learn, we learn intentionally, but also unintentionally.
Societal conditions around us, how we felt in a particular moment can vastly, memory is also insane.
Yep.
Yep.
We experience the world in, this is my personal experience.
My experience of the world is vastly different to my memory.
My memory is like crystal clear and beautiful, and my real life is a mixture of slops.
Yeah.
And it's, it's frustrating as well because these people also don't appear to like people.
They don't, they don't, like, the human brain is kind of a, like, human bodies are, even the dumbest, dumb-dumb, it's kind of an amazing
thing.
Yeah.
No, and one of the things that's amazing is, yeah, we don't know how the human brain works.
We don't know how thinking or learning works.
But what we do know is that we don't do it in anything, like any way, anything like an LLM.
Yes.
Right?
Because the amount of, you know, material that we take in over the course of the first three years of our lives when we go from not knowing a language to knowing a language.
maybe multiple languages, is nowhere near the amount of material that is, you know, force-fed into these LLMs. And yet we get the trick done, and three-year-olds know things that no LLM knows.
Also, there's no affordance for the fact that some people can't learn stuff.
Like, I cannot learn languages.
I've really tried.
I'm pretty trash at that, too.
But I also was really bad at, like, I was uniquely bad at a lot of things.
I have my various...
No, but I have ADHD, dyspraxia, and other stuff I won't get into.
But it's, I can't, like, certain things don't, like, the things that I pick up insanely quickly, other people can't.
Other people can't even see the connections.
It doesn't.
Robert Evans actually had a really good point on a subreddit.
Yes, Robert, I read all your stuff.
Where he was saying that, like, like he is very good at like picking up stuff like almost immediately.
He can read faster than most people, as long as it's about conflict.
And it's no, but it's true, and it's one of the remarkable things about the human brain.
Yeah.
And I think that it's actually kind of disgusting how little appreciation there is for like human bodies and the brain and just how incredible the average person, even average people are.
Yeah, no, and this is, this is the thing.
These guys...
don't have a proper appreciation for the human brain and the human body.
And going back to the tech billionaires and I guess Yudkowsky as well, they don't have an appreciation for how remarkable Earth is in particular, right?
You know,
they
you know, especially when you talk about somebody like Bezos or Musk, they talk about Earth like it's doomed, like we need to get off of this planet.
Yeah.
I'm like, this is our home.
It's a remarkable place.
And there is nowhere that we could get to in the solar system.
There's nowhere else in the solar system that's remotely as hospitable as the Earth.
I also think that they want more space.
They want their own land.
They want their own countries.
They want to escape governance.
Yeah, yeah, yeah.
They see space as an escape from politics because they're like living a libertarian wet dream.
Which is really funny because when they get there, they'll immediately do fascism.
That's what's on the agenda the second.
You cannot run away from politics the minute you have more than one person in a room.
There's politics.
It's just really sad.
And I actually think on a grander scale, they don't have an appreciation for tech.
I was just writing something last night.
It was on the way to New York.
Where it was like, the actual state of technology is kind of fucking amazing.
Yeah.
Like, we can message you.
I could message you.
You happen to be in town.
You messaged me on Blue Sky, hundreds of miles, thousands of miles away.
I was like, I'm able to write a note that was on my computer that's on my iPad here.
I know that.
This sounds like boosting, but it really isn't.
We have the raw tools that are just fucking incredible.
Yeah.
And these people do not appreciate them.
They don't appreciate them, which is why generative AI is so fucking ugly, because it's bad technology.
It's not even good technology.
It's poorly run, inefficient, endlessly expensive, and directionless.
Yeah, and it inflicts harms on users that like we would not accept from anything that was not subjected to such an enormous hype cycle, right?
Literally nothing at all.
Yeah, nothing.
No.
If 10 years ago you
took any
You know, person off the street and said, hey, there's this cool new technology.
It takes up enormous amounts of electricity
It can do things, it can do things that, you know, no other piece of technology you've ever seen can do. Also, it's very good at talking teenagers into killing themselves.
Yes.
Should we release it into the, you know, wider world? And they'll say, well, no, but can it do anything else? And you, of course, would say, yeah, it can sometimes write code.
Yeah, exactly.
And sometimes it also gets things horribly wrong.
And it writes bad prose.
Yeah. And like, and it just kind of makes everything feel kind of mediocre and smeared out.
Yes, exactly.
Like, also say no.
Yeah, exactly. And it makes some people go crazy.
And yeah, it drives people to... Actually, yeah, how do you feel about that? Like, how do you... Like, did you see this coming? Because this really jumped out.
No. Yeah, no, no, no. This surprised me. This did surprise the hell out of me. Because I think that
you know these machines
I don't even like calling them AI, right?
Because I think that's a marketing term.
It is.
Yeah.
Like if you go back in time to 1990 and
tell me when I'm a kid, hey, I have a little device in my pocket that lets me talk to an AI.
And then
I would have thought, oh, like that lets me talk to, you know, like Commander Data from Star Trek.
Yeah, very exciting.
And instead, it's this.
And I would have been like, what the hell is that?
No, AI is this marketing term.
It's a text generation engine.
It produces
homogenized, thought-like product.
And the thing is, I was also, I'm in the midst of a long one, as usual.
It also conflates
doing stuff with outputs.
I know that sounds kind of flat, but it's like the everything is a unit of work rather than actually creating stuff, or that you pay a person for their experience too.
Yeah.
And it's just also not very good at stuff.
That's what's pissing me off.
It's really bad at stuff.
It is.
And I think that's where
this sort of driving people insane is coming from, right?
Like,
what I missed, the reason I think I didn't see that coming is I failed to think about
how,
like, I knew that these things just generate text.
And in a lot of ways, they just sort of...
spit out back to you what you put in.
Right.
Right.
Which is an old thing with chatbots that goes way before LLMs goes all the way back to Eliza, right?
Oh, yeah.
That was the first, the first AI chatbot.
Yeah, the first chatbot.
I wouldn't even call Eliza AI, right?
Didn't even the creator of Eliza, Weizenbaum, turn against this stuff?
Yeah.
This is from Karen Hao's Empire of AI.
Great bit about it in there.
Yeah, exactly.
Yeah.
No, no, no.
Eliza, Eliza was just like a hundred or so lines of code that, you know, you'd say, I'm having a bad day.
And it would say, oh, I'm sorry to hear that.
Yeah.
Why are you having a bad day?
Right.
But
like.
But the gassing engine.
Yeah, like that's the thing.
I didn't think about, oh, wait, if it just repeats back what you put in, but it does it in a way that's compelling and convincing to some people, that's going to just get them sort of caught in this like dopamine self-validation loop.
And that could drive them off the edge.
And I think that there is a condescension.
I judge myself for this where I was like, oh, this doesn't fool me.
And it's like, but the harm also of, I'm very, I'm blessed to have tons of people who love me, who also give me.
clear feedback, which is not just what I want to hear.
But I definitely, when I was younger and very depressed, would, like, crave validation and crave someone to just tell me what I want to hear.
I definitely never thought, what if someone did?
And the actual danger of having every fucking thought validated.
And also just the sheer horrors.
Like Matt Hughes, my editor, just did a great story about this kind of horrible story where he simulated someone going through a mental health episode.
And Claude was very clear to go, yeah, man, you don't seem so good.
ChatGPT was like, no, everyone is out to get you, mate.
Yeah.
If any other tech in the world did this, you'd shut that shit down immediately.
Yeah, exactly.
You close it.
No, but I think.
Where's fucking Eliezer on this bullshit?
Because this feels like if you write a book about how everyone dies, this should be the thing that if he actually believed in anything, he should be up saying, like, hey, look, this is what I was talking about.
Oh, I mean, I do think that he thinks this is like an incipient version of what he's talking about.
I think like a baby version of it.
I think he loves it.
I think
it helps him out.
Well, I think that he finds that useful for making the argument that he makes.
Exactly.
That's what I'm saying.
But he is not, again, the argument he's making, and this is the only nice thing I'll say about him, he means it seriously.
He's not a grifter, he's three anxiety disorders in a trench coat.
Damn.
Just put that in the fucking book cover, you stupid arsehole.
I can't think of any other movement in tech ever
that is anything like this, specifically because of how much it sucks.
Like, I can't think of any, maybe the metaverse and crypto, but even then, I don't like that comparison.
Yeah.
Because they were so much smaller.
Honestly, what I keep thinking about is that in a way, it is taking the daily experience of the tech billionaires and like bringing it to the masses, right?
Because what is it like?
Oh, yeah.
What is it like
to be Sam Altman, right?
You've got billions of dollars and you're surrounded by people who will never tell you no and validate your every thought.
And they'll convince you that you understand every subject.
Exactly.
Yeah.
And so now...
Well, I've been saying this as well, because if you're an executive, a machine that can write emails, read emails, and otherwise you go to lunch is kind of magic.
Yeah.
But no, I like this idea that it's the extension as well, just this completely like separate thing that just says, yeah, that's completely right, man.
I fully agree.
Right.
And so, of course, they don't see the harm, because that's their entire goddamn world, that's what happens.
And so they're like, well, but if this were bad for people, that would mean that, you know, I'm in a bad environment that's unhealthy for me.
And like, yeah, actually, it is.
But they don't think that.
They don't think that.
But I genuinely believe that the best thing for the tech billionaires themselves that could happen to them would be to lose all their money.
It would be the best thing for their mental health.
Put me in a room with them.
Yeah.
No,
get them on the show.
I think that I could have a great chat with any of them.
Sure.
Just because I went to a private school.
Like, a lot of these American billionaires as well, they would get destroyed by the average scum aristocrats of old in England.
Like the real blood drinkers.
So amateur vampires.
No, they really are, though.
Like, like, it's the classic thing why British colonialism and American colonialism have never matched up, because Britain was just evil.
They just fucking murdered people and destroyed communities.
And they're like, why are we doing this?
It's because we're British.
This is what we do here.
What do you mean?
What's a moral?
I've not heard of that.
No, what do you mean?
No, no, no.
Send my eighth cousin to Africa.
Shoot whoever you see.
Like, that was the horrifying stuff, but they knew, they didn't care about what people think.
I still think the billionaires care.
Oh, they definitely do.
Like, this is the thing that is insane to me.
If I had one billion dollars, I would no longer care.
Right.
If I had a billion dollars, I would just try to make sure that nobody knew my name and that
was the same amount.
Yeah.
Well, I would be posting content.
No, I think I'd just be done, man.
I'd be like, oh, okay, cool.
You know, I'm going to donate to a bunch of causes that matter to me.
And, you know, like, I still think it's bad for there to be billionaires and try to try to change that.
But also, like, I'm just going to, like you know hang out in a nice house with my friends and have a good time
But that's the problem, though. They don't have those.
Well, yeah, because... Have you ever heard the really depressing story about Elon Musk and this guy, he was this, uh, investor who got really cooked by COVID? Peter something.
No, I didn't.
And he told this story of going over to Elon Musk's house and there was a decanter of wine and Elon Musk picked up the wine before it was done decanting and then something said something along the lines of honey badger don't care.
And I just want to say that's one of the saddest fucking things I've heard in my life
Just absolutely, just unfathomably depressing. Because you can get things like a Coravin that can kind of aerate it. There are various ways around aeration if you're really feeling it. And you have hundreds of billions, or however many dollars Elon has liquid. You could just have someone whose job is to make sure the wine is aerated. They could make $250,000 a year.
It wouldn't matter to you.
That's what you lose in the couch.
Yeah, but I think that what matters to Elon is not doing what he's supposed to, right?
Yeah.
So he can be seen as cool.
Or just drinking as quickly.
Because otherwise you might feel something.
Yeah.
I mean, I just, he desperately wants to be liked, and it's never going to happen.
It's so funny as well because it could be so easy for him.
I know.
He could just post his lunch every day and nothing else ever would be like, Elon Musk.
But look at...
This is actually what pisses me off as well, though, because people are like, Elon Musk fucking sucks.
It's like...
He was sending people after Erin Biba, a science reporter, in 20...
Oh, I know Erin.
Erin's a friend of mine.
Yeah, it's awful.
Erin rocks.
And like, when that happened, not a single... Kara Swisher didn't say shit.
Didn't hear from Kevin Roose, Casey Newton; none of these fucking people thought it necessary.
But now, like, Elon Musk is such a bad guy.
He's such a bad guy.
No, he's always been like that.
And also, he called a guy a paedophile for saving children.
Yep.
Because he wasn't allowed to send his submarine.
No, he's never been good.
Like, this is the thing that I'm not going to do.
I don't think any of these people enjoy anything as much as I enjoy Diet Coke.
Like,
I'm 100% sure of that.
Because I love these things.
Like, if this kills me, if this shit's meant to, like, in three years, they're like, it's rat blood, I'm like, I will keep drinking. Better Offline, brought to you by Diet Coke. It's rat blood.
I really hope that they sponsor this show at one point, because that's the commercial. But that's the thing, like, and I'm only kind of joking, because I really enjoy Diet Coke.
I love sitting down chatting shit with my friends.
I love watching football and chatting with my friends.
It's like there are very basic things I enjoy.
What do these people, like these people just must walk around in this haze of anger or like emptiness?
I think they're really cut off from their own emotions, right?
And like, and again, that's going to happen if you just constantly get validation, right?
You know, one of the, one of the many tweets from back when Twitter was less shitty before Musk bought it,
there are many tweets that just like live rent-free in my head.
And one of them is about...
The cognitive impact of being a billionaire.
I know the one thing.
Yeah, yeah.
It's like, you know, like everything around you is really expensive.
It's just a constant.
Every chair is $50,000 and weighs 5,000 pounds.
Yeah, in terms of the cognitive impact, it must be, you know, roughly equivalent to being kicked in the head by a horse every day.
Exactly.
Yeah.
I think I'd be fine, but
that's my pathology, I guess.
But it, no, but it's, they have this weird, isolated thing.
And even Benioff, who used to seem okay.
Well, I mean, his whole game was like to be the best of the billionaires, which is a low bar.
And then he was just like, ah, fuck it.
Yep.
Just fuck it.
I don't give a shit anymore.
Yeah.
Agent Force, it doesn't sell to anyone.
No one likes it, but it's the future.
Agent Force.
Jesus.
See who he's donated to.
I know.
It's so cool.
It must be really cool, being a guy who actually has qualifications from proving things, to watch the world...
Like, all these guys being like, yeah, this is the future.
And the articles just keep going: it doesn't work.
No one likes it.
Yeah.
I mean, cool is one word.
Incredibly frustrating is another.
Right.
This is stymying real innovation.
Yeah.
Yeah.
I know.
There's opportunity costs and also just like actual stifling of real innovation in the effort to achieve impossible ends that would be bad even if we could achieve them.
So
slight directional shift.
Is there anything within like science and tech innovation that you're actually excited about?
Anything you look at and like, go, that's fucking cool.
I mean, mRNA vaccines are the first thing that come up.
Exactly.
Yeah, they're really awesome.
Tell more.
I mean, like, look, you know, the fact.
And what is an mRNA vaccine, said flawlessly.
Yeah.
An M.
Wow, now I'm
an
mRNA vaccine.
Nailed it.
Yeah.
Is
the kind of thing that
we have with the COVID vaccines.
Right.
Right.
Basically,
the thing that's so exciting about them is that they are so much easier and faster to synthesize
than
previous vaccines.
You know, I think the previous record before the COVID vaccine for
how long it took to develop a safe, widely deployed vaccine was something like five to 10 years.
Jesus.
And then this vaccine, most of the time delay... that year that we were waiting for the vaccine was actually a little less than a year, and most of that was testing.
The actual time that it took to synthesize the damn thing was, I believe, on the order of weeks.
And what's crazy is I believe that was venture-backed, right?
Yeah, some of it was venture-backed.
Which is like, see, venture capital can be useful.
Yeah, it can be.
When it wants.
Yeah.
Some of it was venture-backed.
Some of it was backed by, you know, NIH grants.
We do need those.
Yeah, we sure fucking do.
No, government funding of
basic research is important.
And not just because it leads to amazing technological breakthroughs like mRNA vaccines, but also because basic scientific research is an important thing for humans to do, like
the same way that art is important.
Right.
Right.
But it also does enable massive scientific and technological breakthroughs.
And
I, you know, there's promise for mRNA vaccines to like open up a whole new class of vaccines that, you know, for things that were previously very hard to vaccinate against.
I am not an expert in the field, but like everyone I know who works in biomedicine,
they're all very excited about this and they're all really depressed by the fact that
we have an anti-vaxxer who sounds like a fork that got stuck in a fucking garbage disposal as the health...
And rise from your grave guy from that one video.
Yeah, yeah, yeah, yeah.
It's very depressing.
I just wish we were like green energy as well feels like an.
Oh, yeah.
Green energy was the next thing I was going to say.
Batteries.
Yeah, batteries, solar panels.
It's incredible.
Well, this opportunity is there.
Yeah.
It's not like we need to innovate.
Like we are innovating, but like.
yeah.
And also, like, we even had the legislation that we needed, right?
Or some of it, right?
Like, the, um, you know, Biden's big bill, the, the, the Build Back Better,
it, uh, you know, was not a perfect bill, but it was the best environmental bill in American history.
Yeah.
And, and now, you know, it's being destroyed because we have a government in this country that, that, you know, does not believe in climate change and doesn't believe in anything other than short-term profits at the expense of everybody else.
And also doesn't believe in democracy.
That feels like a big problem, though.
The growth at all costs.
Yeah.
I mean, that's the thing.
And that's why my book has the actual title that it does, rather than going with the title, These Fucking People.
Yeah.
Though These Fucking People or just the Bastards, I think is also very good, too.
I think so, too.
I mean, like, there's a...
And it would be so easy for them to do better.
So easy.
So easy.
No, no, no.
So,
you know,
forget the best thing that they could do.
They're doing some of the worst things that they could do.
Doing better than they are right now is just an incredibly low bar.
But even through like very poorly guided generosity, they could very easily
fund media outlets versus whatever it is they're doing to them.
Yeah.
Tearing them down.
But that would mean, you know, the possibility of losing control and losing, you know,
losing some of their power and money.
And they just are not willing to do that because they've got something broken in their hearts.
We need to heal them.
No.
No, I think that I think we need to tax their money away.
I think that too.
But I think we actually,
my truth here is that we need to change how we do that though.
We need to start doing executive liability.
We need to make it so if, like, CrowdStrike happens again, like a bunch of people potentially die in the NHS system because the computers shut down, then Satya Nadella can lose something.
Because it isn't enough to fine the companies.
Fining the company is not going to do shit
unless you do scaling revenue, percentage of revenue. This is... this and more in how I become the FTC. No, they're not gonna let me. But it's just...
I feel like one of the wonderful things of having you on is you're able to come at this from a science communicator perspective. You're actually able to talk, because it's not just about what these people want, it's the practicality of it, which is that nothing's really happening.
Yep.
Like, that's the actual weirdest thing about the real nihilism of this: nothing seems to actually be occurring.
Yeah.
And they also act like there's not going to be any accountability for, like, forget their actions, just even their words, right?
You know, Sam Altman says,
you know, like this, this thing that just drove me up a wall that he said about a month ago.
He said that, you know, in 10 years, college graduates are going to have really cool jobs going out to explore the solar system in spaceships enabled by AI.
That is not happening.
Like on the list of things that are not happening.
Yeah, no, no, no.
That's not happening.
He is just wrong.
He's lying.
Right.
And he is probably still going to be alive in 10 years.
And you and I are also likely still going to be alive in 10 years.
And then we're going to say, hey, remember when he said that?
That, you know, now we can show he's just wrong.
And nothing's going to happen to him.
And what needs, like, this is why I'm so harsh on media criticism as well.
Because the one thing you can do is at least say, Area man full of shit.
Yes.
Stupid bastard wanks off again.
Well, this is my, this is what I attempted to do when I was doing it.
Yeah.
Yeah.
And it's, I think that the change that we need in our hearts is to just regularly say this stuff.
I regularly say on the show, I don't care if you quote me, just say this shit about them.
Yes.
Clammy Sammy, he's been promising.
He said that this was the year of agents.
He said that.
But now I read in theinformation.com that next year's the year of agents.
So maybe...
Actually, here's a question for you.
AI 2027, did you read that?
I read a little bit of it.
It's nonsense.
It's nonsense.
Yeah.
Why do you think things like that fool so many people, though?
Why do you think it got the media coverage it did?
I mean, part of it is bad journalism, right?
Part of it is that Kevin Roose has confused
reiterating the views of the wealthy, influential, and powerful with taking a brave contrarian stance.
And how he made that mistake, I don't know.
But, you know, I really get the sense that he thinks he's being very brave when he's doing exactly what journalists are not supposed to do, which is just uncritically parroting the powerful.
And it feels like it's the large language model again.
It's just the affirming thing.
It's like, oh, I'm being contrarian by stepping out against these people who say it isn't making any money and isn't really good at stuff.
And it's like, look, buddy, if there's, you know, if there's a...
two sides to a debate.
I mean, obviously there's more than two sides, but like, if on one side you have the wealthiest people in the world, and on the other side you have people who say mean things about you personally online, and you think that the first side is the contrarian underdogs, something is wrong with your brain.
And that's the thing. But this, and this I think is a weird thing in our society: people just trust the rich. And the media has got to a point where they've just bred out the real cynicism. Because I swear, like 10, 15 years ago, you used to have some tech journalism.
Like I read a thing about Amazon Web Services that Kevin Roose wrote, and it was actually pretty cynical about it.
Really?
Yeah, it was actually pretty critical.
He then, he made, I feel bad for him because this can't have been his fault.
He basically said at the end, yeah, they'll never be profitable.
No, no, no.
It gets worse for him.
A month later, Amazon announced that AWS is profitable for the first time.
Just like, buddy,
miss the bean.
Come on.
Wow.
Maybe that's the origin story.
Maybe he was like, oh, wow, I screwed that up.
I guess I never believed that.
I believe the origin story is actual social media.
I think he felt...
I actually think a lot of journalists think that they miss the boat on social media.
I have been in media relations since 2008.
I have read, and this sounds insane, but it's true.
I think I've read just about everybody's work.
since then, within the tech media at least, including Kevin's. And he has always had a little bit of anxiety that he missed social media. Nobody missed social media. Not a single fucking one. Since 2008, everybody was on Zuckerberg's... Zuck. They zucked him off hardcore. Sorry, sorry. But nevertheless, they were on top of this.
Social media was written about immediately.
If anything, I think the media was a little slower to get on apps than they were on social media.
I know the history of this shit.
I have been taking detailed notes.
I sound crazy.
But
I think that there's just, there is this weird thing of like, the powerful would never lie to us.
And then Prism came out.
And then Cambridge Analytica.
Yep.
And people are like, maybe Zuckerberg's bad, but he wouldn't lie to us.
He would.
And they're like, well, they know things we don't know.
And that's actually another, that's my favorite AI thing where it's like, there's secret things they're working on.
There's secret things.
Secret things sitting in the, waiting in the wings.
You'll never believe what's coming.
And it's just, I actually think it's just, what's his name?
Ilya Sutskever just goes to bars occasionally.
It's like, you'll never guess what I said.
You'll never guess this AGI around the corner.
Okay, okay.
No, I have a thing to say about Sutskever.
Like, this is me just being petty and making a point that other people have made before.
I'll never ever do that.
Yeah, no, of course.
No, when he announced that he was, you know, putting together a team to
just go straight for safe superintelligence, he meant to say when he posted this on social media that he was putting together a crack team.
But that's not what he wrote.
What'd he write?
He wrote that he was putting together a cracked team.
Crack ED, cracked.
And I'm like, yeah, actually, you know what?
Yeah,
I think that's true.
I agree.
Yeah, exactly.
I also think, him and Mira Murati, I can't... and so for the listeners:
Ilya Sutskever, one of the co-founders of OpenAI, raised, I think, two billion dollars at a 30 billion dollar valuation. Mira Murati did a billion,
some amount, some bullshit. Neither of these companies have told their investors how they will spend the money, or on what, or what they will build. And you may think I'm being facetious. Mira Murati literally said to investors, I will not tell you, and then said... and has board rights where she can veto everyone.
I will be honest, go go for it.
Fuck yeah.
I think at this stage, if these people are so fucking stupid that you're just like, I promise you literally nothing.
I won't give you, you hogs, a single oink.
You're not going to get anything from me.
Give me money now.
Fuck yeah, go for it.
But on the other hand, I cannot wait for the investigation.
I just hope there is one, right?
These people are acting with impunity.
And also, like, again, accountability just for their words, right?
The most basic criticism of the wealthy in media.
Like, this is
to shift just a little bit.
Eric Schmidt, right, said about
a year ago, shortly before the election, he said, we're never going to meet our climate goals anyway.
So we might as well just burn as much carbon and use as many resources as possible to get to AGI, and then that will solve climate change for us, which is ridiculous because we don't.
That's so cool.
Yeah, yeah.
It's like, we have no responsibility for our actions until we hand them off to someone else, right? So he said this, which is ridiculous for lots of reasons. It echoes stuff that Sam Altman has said, right? It's ridiculous, among other reasons, because AGI is not a thing, and also because we don't need... like, we know what we need to do to solve global warming, right? We know what we need to do to solve the climate crisis. It's just a matter of actually getting everybody to do it.
But mate, Sam Altman said that they know what they need to do to get to AGI, and then said a few months later that AGI wasn't a useful term.
QED.
Well, no, this is exactly what I was about to say about Schmidt, right?
He repeats this claim about like just pushing as hard as we can.
About a month after the last time he said it, maybe two months,
only a few weeks ago, he comes out in the New York Times with this op-ed saying, oh, AGI is not really a thing.
It's so.
And we shouldn't care about it.
It's like, buddy, you were saying just a few weeks ago that this was going to save the world from the biggest emergency of our time.
And now you're saying it's not a thing.
Do you think we're all stupid?
Are you that stupid?
What the fuck is going on?
I can actually tell you.
I think he thinks the media is that stupid and will write anything, will publish anything he says.
I mean, I was shocked to see that the New York Times published it.
I wasn't.
Yeah.
I will be honest.
That's the least.
No, no, no, no, no, no.
We've got fucking Ezra Klein being like, AGI's fucking... Ezra.
Ezra,
what a peculiar fellow.
What a peculiar fellow Ezra Klein is.
What's going on there?
You ever run into Mr.
Klein?
I've never met him directly.
I know people who know him, but no,
I think he just hung out with too many tech billionaires while he was living in the Bay Area.
He is this fucking mind poison.
These people are boring.
Yeah.
These people are boring.
You sit down and get a bunch of people.
They have an enormous amount of money.
And like, if you're, if you're someone who's never been cool, and I've never been cool in my life.
I've also never been cool.
I love it.
I don't care.
I know, but like, if you've never been cool, one of two things happens to you as you grow up.
Either you desperately want to be cool,
and that can, you know, go wrong in many different ways, like Musk
and possibly like Klein.
I don't know.
Maybe, like, I'm willing to believe that something else is going on with him.
I don't know.
But, or you become like us and you stop giving a shit.
Yeah.
And you accept, oh, I'm just permanently uncool.
Whatever.
And that's.
I'll make my way through life.
And that's the thing.
It's.
And these people are just disconnected from humanity.
None of these people seem to have friends or loved ones. Because there's just... if I did any of this whackadoo shit, I would get texts from Casey or Sarah or any number of people who love me.
Just like, hey, man,
you sound insane.
No, Casey would definitely not be just, but hey, yeah, what the fuck are you...
You okay?
That doesn't make any sense.
What do you mean?
What do you mean a Dyson sphere?
Do you know what that is?
A Dyson sphere?
It's just, they don't have friends, and I don't know if they want them.
I think it would require a certain level of vulnerability.
Have you talked to any?
Have you met up with any of them?
With the billionaires?
No, I tried.
There's a list at the end of my book of all of the tech billionaires I tried to interview, and they all said no.
The only one who I successfully interviewed was like a lower-tier billionaire guy named Jaan Tallinn, who's in deep with the effective altruists.
Was he in that... Skype and Kazaa, yeah, yeah, yeah, that's right.
Christ.
Yeah, yeah, yeah, yeah.
I actually can't hate him for those are two pretty good ones.
Yeah, exactly.
Though I will say, Skype is definitely one of those inventions that...
I've never seen something just stop.
Yeah, no, it's gone.
Skype just got like, no, it just, no, I mean trapped in amber.
It was the same product for 50 years.
Oh, yeah.
And then Microsoft was just like, like, Boxer in Animal Farm, bang.
To the glue factory with Skype.
We fucked this up well enough.
It's also sad, but this has been such a wonderful conversation.
Where can people find you?
Well, I'm on Bluesky.
How are you?
Because I don't want to be on a platform like X that's filled with Nazis.
So Bluesky is the best place to find me.
I'm adambecker.bsky.social on bsky.social.
And
you've got a book?
Yeah, I've got a book.
It's the main thing.
Yeah.
I've got a book called More Everything Forever.
I'll link to it in the notes.
Yeah, it is available wherever fine books are sold.
And if you liked what I had to say on this episode, I think you'll like the book.
And if you like what I have to say on this show, you're a sick puppy.
You know where to find me.
Thank you so much for your time as ever.
I love you all.
Thank you to Bahid, of course, here at iHeart in New York, for producing this episode.
And of course to Mattosowski, the wonderful producer at home.
I will catch you with a monologue in a few days.
Thank you so much.
Thank you for listening to Better Offline.
The editor and composer of the Better Offline theme song is Mattosowski.
You can check out more of his music and audio projects at mattosowski.com.
M-A-T-T-O-S-O-W-S-K-I dot com.
You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and of course my newsletter.
I also really recommend you go to chat.wheresyoured.at to visit the Discord and go to r/betteroffline to check out our Reddit.
Thank you so much for listening.
Better Offline is a production of CoolZone Media.
For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.