Radio Better Offline: Adam Becker

1h 7m

Welcome to Radio Better Offline, a tech talk radio show recorded out of iHeartRadio's studio in New York City.

Ed Zitron is joined in studio by astrophysicist Adam Becker to talk about his new book More Everything Forever, the BS AGI story, why Eliezer Yudkowsky should never be taken seriously, and why billionaires love LLMs.

https://bsky.app/profile/adambecker.bsky.social 
https://www.hachettebookgroup.com/titles/adam-becker/more-everything-forever/9781541619593/ 

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

Newsletter: wheresyoured.at

Reddit: http://www.reddit.com/r/betteroffline

Discord: chat.wheresyoured.at

Ed's Socials -

http://www.twitter.com/edzitron

instagram.com/edzitron

 https://bsky.app/profile/edzitron.com

https://www.threads.net/@edzitron

email me ez@betteroffline.com 

See omnystudio.com/listener for privacy information.

Press play and read along


Transcript

Speaker 1 This is an iHeart podcast.

Speaker 3 In a region as complex as the Bay Area, the headlines don't always tell the full story. That's where KQED's podcast, The Bay, comes in.

Speaker 4 Hosted by me, Erica Cruz Guevara, The Bay brings you local stories with curiosity and care.

Speaker 2 Understand what's shaping life in the Bay Area.

Speaker 4 Listen to new episodes of The Bay every Monday, Wednesday, and Friday, wherever you get your podcasts.

Speaker 6 Every business has an ambition. PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.

Speaker 3 And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.

Speaker 6 When it's time to get growing, there's one platform for all business, PayPal Open.

Speaker 9 Grow today at paypalopen.com.

Speaker 6 Loan subject to approval in available locations.

Speaker 11 There's more to San Francisco with the Chronicle.

Speaker 9 More to experience and to explore.

Speaker 9 Knowing San Francisco is our passion.

Speaker 9 Discover more at sfchronicle.com.

Speaker 12 In business, they say you can have better, cheaper, or faster, but you only get to pick two.

Speaker 9 What if you could have all three at the same time?

Speaker 12 That's exactly what Cohere, Thomson Reuters, and Specialized Bikes have since they upgraded to the next generation of the cloud, Oracle Cloud Infrastructure.

Speaker 12 OCI is the blazing fast platform for your infrastructure, database, application development, and AI needs, where you can run any workload in a high availability, consistently high performance environment and spend less than you would with other clouds.

Speaker 12 How is it faster? OCI's block storage gives you more operations per second.

Speaker 9 Cheaper?

Speaker 12 OCI costs up to 50% less for computing, 70% less for storage, and 80% less for networking.

Speaker 9 Better?

Speaker 12 In test after test, OCI customers report lower latency and higher bandwidth versus other clouds.

Speaker 9 This is the cloud built for AI and all your biggest workloads.

Speaker 12 Right now, with zero commitment, try OCI for free. Head to oracle.com/strategic.
That's oracle.com/strategic.

Speaker 9 Cool Zone Media

Speaker 9 Step up to the slop trough. It's time for Better Offline, and I'm your host, Ed Zitron.

Speaker 9 As ever, buy the merchandise, sign up to the newsletter, message me on Goot, I'm everywhere.

Speaker 9 But today I'm joined by the author of a book about how much I should be punished, what punishments I deserve, and how long I should be punished.

Speaker 9 I'm, of course, talking about More Everything Forever, written by astrophysicist Adam Becker. Adam, welcome to the show.

Speaker 11 Thanks for having me out. It's good to be here.

Speaker 9 What's the book actually about? I apologize.

Speaker 11 No, it's okay. The book is actually about the horrible ideas that tech billionaires have about the future that they're trying to shove down our throats and why they don't work.

Speaker 9 So you're an astrophysicist, right? Yeah, by training. What is that?

Speaker 9 Well,

Speaker 11 I did a PhD in astrophysics, looking at how much we could learn about what was happening right after the Big Bang by looking at what was happening in the universe, you know, what's happening in the universe right now.

Speaker 9 So how did you get into the world of Silicon Valley? Because you have to talk about some of the dampest perverts I've ever done seen. Yeah.

Speaker 9 Well,

Speaker 11 I started out my career as a science journalist straight out of grad school and was writing mostly about physics.

Speaker 9 And what were you writing?

Speaker 11 I was writing for pretty much everybody. I started at New Scientist and then moved on to writing for the BBC, wrote a book about quantum physics, wrote some stuff for NPR, the New York Times,

Speaker 9 Scientific American, yeah.

Speaker 11 And, you know, like was sort of having a normal science journalist career. And then in 2016, the weirdest fucking thing happened.

Speaker 9 What happened?

Speaker 9 You know, we elected a fascist. Oh, right.
Yeah. Yeah.
That thing. Yeah, that thing.

Speaker 11 Yeah. And I thought, oh, you know, I should be doing something more to,

Speaker 11 you know, directly combat this. If I write another book, I would like it to have a more directly political angle.

Speaker 9 Right.

Speaker 11 And

Speaker 11 I live in Berkeley.

Speaker 11 I live in the San Francisco Bay Area and just surrounded by tech bros constantly and was getting tired of their bullshit and it seemed more and more directly connected to the disintegration of American politics.

Speaker 11 And I thought, okay, you know, somebody needs to write about how they have these insane ideas about the future and how that informs their terrible politics.

Speaker 9 So how long have you lived in the Bay?

Speaker 9 God, oh, 13 years? Okay, so you've got like a good backing of where the bay has been in that time as well, because I think a lot of these people are transplants.

Speaker 9 And they say this is someone who literally moved to the bay for two years in 2014. Yeah.
Like, and even then, it was weird watching what they were doing. Yeah.
But wait, so that was...

Speaker 9 So you've been there 13 years, but when did the book get started? Like, how did all this... Because

Speaker 9 this kind of came out of nowhere in a good way. Yeah.

Speaker 11 Yeah, no. I got started on the book

Speaker 11 Probably the first inklings of it were around 2019 or 2020, right before the pandemic.

Speaker 11 I uncovered an online magazine that was trying to science-wash creationism and climate denial

Speaker 11 that was being funded by Peter Thiel.

Speaker 9 Hell yeah.

Speaker 11 Yeah, and I broke that story and thought, oh, yeah, yeah,

Speaker 11 these tech bros are awful.

Speaker 11 Everybody thinks they know a lot about science and technology. Even the people who don't like them seem to think that they know a lot about science and technology.

Speaker 11 And it's just not true. Like, they don't know anything about physics.
They don't know anything about biology. Peter Thiel thinks that creationism is plausible or that evolution isn't the whole story.

Speaker 11 That's nonsense.

Speaker 11 You know, Elon Musk thinks that we can live on Mars. That's nonsense.
Or at least he says that.

Speaker 11 Whether he actually believes it, I don't know.

Speaker 9 But that's actually kind of my question. Yeah, yeah.
How much of this shit do you actually think they believe? Because I know Bezos is tied up with the Long Now Foundation. Yeah.

Speaker 9 And they do make nice tea over there in the Presidio. Yes, they do.
However, the rest of the stuff, not so good. But how much do they really believe in this? Because I just, I,

Speaker 9 you've said that

Speaker 9 this is kind of a homecoming for them. That this is kind of them coming back to the things that they truly believe.
Yeah. I don't think they believe in anything, is my thing.

Speaker 11 I think some of them don't believe in anything.

Speaker 9 Go into it. I'm not saying I'm unilaterally right here.
Absolutely.

Speaker 11 Yeah. So like, for example, I think it's very plausible.
Obviously, we can't know for sure, but I think it's very plausible that, say, Sam Altman doesn't believe in anything.

Speaker 9 Yeah, that's quite possible. Yeah, yeah, right.

Speaker 11 Um, Karen Hao makes a good case for that in her reporting and in her book. She was excellent being a guest on the show, she's fantastic, she's amazing.

Speaker 9 Um,

Speaker 11 but at the other end of the spectrum, I think it's very plausible that Jeff Bezos really, really does believe that we need to go to space. And the reason I say that is

Speaker 11 When he was the valedictorian of his high school down in Florida, in like 1978 or something like that, he gave a speech about how we need to go to space.

Speaker 9 We, humanity, need to go to space. That doesn't feel as ludicrous as a belief.
Like, we go to space, but what does go to space mean for this guy?

Speaker 11 Well, that's the thing. The specifics of that belief that he professed at the time are pretty similar to what he's saying now, and it is pretty ridiculous.
Like, he

Speaker 11 has said very recently, and it echoes the stuff from his valedictorian speech when he was like 18,

Speaker 11 that we need to move into, you know, hundreds of thousands or millions of enormous cylindrical space stations.

Speaker 9 Oh, of course.

Speaker 11 And have, you know, a trillion people living in the solar system. Yeah, tubes.

Speaker 9 Tens of trillions of people. Exactly.
Tubes work really well, like the Hyperloop, for example. Yeah.
Another successful tube.

Speaker 11 Yes, or the Internet itself, a series of tubes.

Speaker 9 Yeah, and because those tubes and the Internet worked, of course, the tubes work in space.

Speaker 11 Yeah, that's, I think that's... Bingo bungo.

Speaker 9 Yeah, yeah. This is science, I believe.
I failed all the sciences. I'm really sorry.

Speaker 9 Really sorry. But no, keep going.
Yeah.

Speaker 11 So, you know, he said then we can like make Earth into a beautiful park that, you know, allows us to, you know, save the environment. I know.

Speaker 11 And he said when we have a trillion people living in the solar system, we can have a thousand Mozarts and a thousand Einsteins.

Speaker 11 I'm like, buddy, we probably already have people who are just as talented as Mozart and Einstein and all the other geniuses of history who are living and dying in poverty.

Speaker 9 Yes.

Speaker 11 And, you know, you don't care about that.

Speaker 11 And you don't seem to care about climate change because, you know, the carbon footprint of the Blue Origin rockets and the Amazon warehouses and all that stuff, right?

Speaker 11 But instead of that, he's like, no, no, no, no, no, the solution is to go to space.

Speaker 9 Why?

Speaker 11 That's not going to work.

Speaker 9 Because more space is up in space, Adam. You put tube and space.
It's really, it's funny because these people are insanely rich, but also sound very stupid.

Speaker 9 When you really get down to the chops of it, it's like, what's your solution, Jeff? You've got all the money in the world. Tubes.
Yep, tubes. Space tubes. Yep.

Speaker 11 Trillion people, Mozarts, more Mozarts, just sweating profusely. Exactly, yeah. And he does try to give an argument, but the argument is hilariously bad. He says that we need to go to space, among other reasons. Like, he gives all that environmental stuff, but the thing he keeps harping on and coming back to is, he says, we need to keep using more energy per capita.

Speaker 9 Right. Why? Right.

Speaker 11 Why? Exactly. He never says exactly why.

Speaker 9 Oh, okay. Yeah.

Speaker 11 He says the only defense he gives for that is he says, if we don't do that, we'll have a civilization of stagnation and stasis.

Speaker 9 As opposed to now when there's tons of innovation happening and all of big tech is focused on a diverse series of options rather than one big expensive dog shit thing.

Speaker 13 Precisely.

Speaker 9 He nailed it. Yeah.

Speaker 11 And he says we have to go to space because like... Otherwise, we're going to run out of energy here on Earth.
We won't be able to keep expanding the amount of energy we use per capita.

Speaker 9 What does the energy do? Well, yeah.

Speaker 11 First of all, what does the energy do? And second,

Speaker 11 this is actually my favorite part.

Speaker 9 Hell yeah.

Speaker 11 He is right that if you just like idiotically kept that trend going in a way that's physically impossible, you know, for hundreds of years, you would run out of energy here on Earth.

Speaker 11 Like you wouldn't be able to keep the energy usage per capita growing at the exponential rate that it has been.

Speaker 11 In about 300 years, you'll be using all the energy that we get from the sun here on the Earth.

Speaker 11 But if you keep that trend going, if you try to do that by going out and living in tubes in the solar system,

Speaker 11 that only gets you like another few hundred, maybe a thousand years, then you're using all the energy that comes from the sun.

Speaker 9 Right. And we've still not really established what we're using the energy for.
Nope.

Speaker 11 Nope. No.
So.

Speaker 9 Data centers. Yeah.
Finally. Right.

Speaker 11 And this brings us to like the bullshit that Sam Altman said about building a Dyson sphere.

Speaker 9 What is a Dyson sphere? Exclaim what a Dyson sphere is. I thought it was the ball on the fucking vacuum, but I.

Speaker 9 yeah.

Speaker 11 No, that's the only kind of Dyson sphere that actually exists.

Speaker 11 No, a Dyson sphere is a giant mega construction project, you know, beyond anything that anyone's ever actually built or probably could build,

Speaker 11 that just encloses a star and captures all of the energy from that star.

Speaker 9 Right. So

Speaker 9 have we ever built anything like that? No, of course not. Okay, just making sure.

Speaker 9 Just making sure we...

Speaker 9 So it's building a big ball

Speaker 11 around a star to capture all of the energy from that star to use it for data centers.

Speaker 9 Where would the energy... How would the energy get from the star to Earth?

Speaker 9 I mean... Oh, tubes.
Yeah, tubes. Exactly.

Speaker 13 Yeah.

Speaker 9 So I think...

Speaker 9 Battery? Yeah.

Speaker 9 It's so cool we live in a save. I wish I could do what he does.
I would be saying shit all the time. I would just...
I'd be like, yeah, actually, you need...

Speaker 9 We can change the world if we just create a series of tubes that just give me money every day. No, wait,

Speaker 9 that's too obvious. I'd need to come up with a better scam than that.

Speaker 11 Well, no, I mean, I just think it's pretty interesting that these guys are spouting obvious bullshit, and the only reason people listen to them is that they're rich. Like,

Speaker 11 if they weren't saying this stuff, but then I went around saying this stuff, nobody would listen to me unless they funded me.

Speaker 9 If a guy on the street who smelled kind of bad walked up to you and said the price of intelligence is getting too cheap to meter, you'd be like, all right, mate,

Speaker 9 can't do anything. But Clammy Sammy says it, and everyone loses their fucking sh.

Speaker 11 well yeah, and that that actually brings me to something else that we were planning to talk about.

Speaker 11 You know, speaking of weird dudes on the street who are not billionaires making insane claims, Eliezer Yudkowsky.

Speaker 9 Oh, that's how you say his name. Yeah, you're not even Eliza.

Speaker 9 I don't really. Here's the thing.
He's a disrespectful sexist moron grifter, so I really don't give a shit.

Speaker 11 Yeah, no, it is rather bizarre that anybody listens to anything he has to say about anything.

Speaker 9 So who is this fuck this fuck nut?

Speaker 9 No, no, no.

Speaker 11 The alternate title for my book,

Speaker 11 like in my head, the head canon, was These Fucking People.

Speaker 9 Yeah, I love that. No, no, I write these fucking bastards a lot.
So

Speaker 9 Eliezer Yudkowsky. Yes.

Speaker 9 What does he do?

Speaker 9 What does he do, and why do so many seemingly smart people believe this tip shit?

Speaker 11 So Eliezer Yudkowsky, I'm going to give you like...

Speaker 11 The formal version of who he is, what he might say, what would be in like his online bio, and then I'll tell you the reality.

Speaker 9 So,

Speaker 11 Eliezer Yudkowsky is the co-founder of the Machine Intelligence Research Institute, which has been around for about 25 years, and he has been researching artificial superintelligence for all of that time, mostly going on about how dangerous it could be if anybody built it without ensuring that it would serve humanity.

Speaker 9 And this is just to be clear, he has no scientific knowledge. Can he even code? Like, does he have any kind of...

Speaker 11 He doesn't even have a high school diploma.

Speaker 9 So I won't judge people for that, but I'll judge him for all the rest.

Speaker 11 Here's the thing. Yeah, no, no, no.
I'm not judging him for not.

Speaker 9 But it doesn't really make you feel you full of confidence.

Speaker 11 No, no, no, no. He has no formal qualifications.
And again, that's fine.

Speaker 11 You know, there are many people who have made major contributions to many fields of human endeavor without any formal qualifications. Right.
That's fine. The thing is,

Speaker 11 if you make extraordinary claims like he's making, you need extraordinary evidence. And not having those qualifications, like you said, doesn't really inspire confidence.

Speaker 11 He has made a series of really outlandish claims about what

Speaker 11 the future of AI could be. Right.

Speaker 11 Based on essentially nothing, based on like reading a bunch of science fiction. He explicitly cites

Speaker 11 science fiction authors like Vernor Vinge as

Speaker 11 Oh, Vernor Vinge wrote a bunch of books, like Marooned in Realtime, and

Speaker 11 oh God, I'm trying to remember the names of the others. Doesn't matter. Point is, he's a fiction writer. Yeah, he's a fiction writer who's also, I think, a scientist of some stripe.

Speaker 9 I don't remember what but still writing fiction.

Speaker 11 Yeah, still writing fiction, and Vinge came up with this idea, or was one of the originators and popularizers of an idea, called the singularity.

Speaker 9 Right. So define this term for me.

Speaker 11 So the singularity is this idea that the rate of technological change is just going to keep getting faster and faster.

Speaker 11 And specifically, the rate of intelligence of AI is going to keep getting smarter and smarter until we reach this sort of point of no return where we have a singularity accompanied by an intelligence explosion that leads to like small.

Speaker 9 What is the singularity moment?

Speaker 11 Yeah, the singularity moment is very ill-defined.

Speaker 9 Oh. And the idea.
I can't fucking believe this. Yeah.
I've heard this bollocks so many times.

Speaker 9 I thought they had a a moment i thought they had a point no not really are you yeah so like kurzweil right the patron saint of evangelizing the singularity the guy who wrote the singularity is near and then the sequel last year the singularity is nearer which is the real title bro i know that's the real title of the book but um his next book's just called sorry yeah no it's his next book is like it's here again you see it um is the singularity in the room with us yes exactly but he doesn't define it uh he tries to, but it's incredibly vague.

Speaker 11 He says, like, Kurzweil says the singularity is going to be here in 2045.

Speaker 11 He also said in 2005, in The Singularity Is Near, that, you know, we would have all kinds of nanotechnology by now.

Speaker 9 They love nanotechnology. They love nanotechnology.

Speaker 11 They use it as a synonym for magic.

Speaker 9 I swear to God also, there was a nanotechnology bubble briefly, like 10 years ago. I vaguely remember them trying.
It didn't really go anywhere.

Speaker 11 I mean, there was also a nanotech sort of hype bubble back in the 80s and 90s, and it also didn't go anywhere.

Speaker 11 And it didn't go anywhere because it turns out that like this idea of nanotech is like magic pixie dust that fixes everything is nonsense.

Speaker 11 And it's a real, like,

Speaker 11 it's being echoed right now in the AI bubble. Yes.
Right. It's the same kind of hype, often pushed by the same people with the same logic, sometimes working at like the same non-profits.

Speaker 11 I mean, Yudkowsky talks about nanotech constantly.

Speaker 11 It's in his new book.

Speaker 11 It's all over, you know, the websites that he's created.

Speaker 11 And

Speaker 9 his book is called, If You Buy This Book, I'll Make Money. Yeah.
Sorry, it's called If Anyone Builds It, Everyone Dies.

Speaker 9 It's such a stupid fucking title.

Speaker 11 Sorry, I have to. Yeah, no, it's a very stupid title.
I will say the one thing I'll say about Yudkowsky: I am sure that he is a true believer. He is not a grifter.
Yes, he's not a grifter. Why?

Speaker 9 Because

Speaker 11 it's hard to explain, but I am so much more sure about him than I am about anybody else, even basically.

Speaker 9 I trust your judgment. It's just he gives off the air of like a desperate forum admin.

Speaker 11 Yeah, I would say the best way to think about Yudkowsky, or like the way that I often think about him, is imagine like a really

Speaker 11 smart, self-educated 15-year-old.

Speaker 9 Yeah. Yeah.

Speaker 11 And like, you know, because if a 15-year-old was running around saying the stuff that Yudkowsky is saying right now, I'd be like, wow, bright kid. I hope he grows out of it.

Speaker 9 I hope his parents have a lock on the gun cabinet. Yeah, yeah, yeah, yeah.

Speaker 11 Well, and also, like, I hope he, you know, I hope he grows up.

Speaker 9 Yes. And I'm still thinking that.
Yeah.

Speaker 11 And like, and I don't think Yudkowsky did. I think, you know, I think like.

Speaker 9 I also think everybody fell for it.

Speaker 11 Yeah. Well, and that's the thing.
Like, he got a lot of support online.

Speaker 9 He,

Speaker 11 you know,

Speaker 11 he got money from Peter Thiel. Sam Altman said that he may win the Nobel Peace Prize one day. Yeah, yeah, yeah, Sam Altman said that he should win the Nobel Peace Prize. That would be a falling-down moment if that happens. Yeah, no, that's not happening. Yeah, no, there's no way. But like, look, he got a bunch of money from Peter Thiel because Thiel thought that, you know, Yudkowsky was saying smart stuff about AI. Right. Thiel now doesn't much like Yudkowsky, because he thinks Yudkowsky's too pessimistic, but the damage has been done. Peter Thiel, ever the optimist.

Speaker 9 Oh, yeah.

Speaker 9 Yeah. Classic, all grins and smiles of that fellow.
Yes.

Speaker 11 No, too pessimistic for Peter Thiel.

Speaker 9 That's actually bad. Yeah.
No, it is.

Speaker 11 No, it's, but no, he's a true believer. He's just kind of nuts.

Speaker 9 But what does he do all day? I say this as a blogger, PR person,

Speaker 9 newsletter writer and podcaster and all this shit. Like, I realize I have an email job.
Fine. But at least I can tell you what I do all day.

Speaker 9 What does he do at, like, go to parties with people like Kevin Russian going, the computer's going to kill us all?

Speaker 11 I think that's a good chunk of it. And I also think he writes an enormous amount, right? Like this is a guy who wrote that, you know, Harry Potter fan fiction that's longer than War and Peace.

Speaker 11 Right. He wrote like a one and a half million word BDSM decision theory novel.

Speaker 9 I say this as someone who writes

Speaker 9 a lot of words. That's an unhealthy amount of words.

Speaker 11 I agree. And it does help, I think, for him being able to write that many words.
He's not a very good writer.

Speaker 9 Yeah.

Speaker 9 I mean, even again, I write 15,000-word blogs, so I can't really judge him too harsh, but

Speaker 9 1.5 million words. How do you even know what it's about at that point?

Speaker 11 I only know what it's about because that's what he said it's about. I haven't read that one.
I did read most of the Harry Potter one for research.

Speaker 9 Yeah. How bad was that?

Speaker 9 Really, really incredibly.

Speaker 9 Any sexism or racism in there?

Speaker 9 I mean, it's just strange.

Speaker 9 I mean, it's J.K. Rowling, so.
Yeah, exactly. Yeah.

Speaker 11 That's a good question. I don't remember anything specific.

Speaker 9 I mean, it's just strange.

Speaker 11 Yeah, I mean, he's definitely got a hard-on for eugenics.

Speaker 11 And

Speaker 9 why do these, and this is somewhat paraphrasing the comic book preacher, but it's like, why do these fucking guys always look like that?

Speaker 9 If you're going to claim you're like a eugenicist, you should not look like an egg with a hat on.

Speaker 9 And I won't get into p I don't generally get into personal appearance because I'm self-conscious myself. But it's like

Speaker 9 if your whole thing is like yeah we need to make the perfect human beings it's like you can't look like that mate

Speaker 11 i am sorry you can't you can't do that well i don't well i guess you can you'll be in the new york fucking times yeah no it's it's crazy uh it is really crazy that anybody listens to him but no he's he's really into eugenics why do they listen to him he's really into evolutionary psychology and he's got like the sexism and racism that's like tied up in that why do people listen to him i mean

Speaker 11 part of it is that he got that money from those billionaires, right?

Speaker 11 He was hanging out in the bay saying the kind of insane contrarian shit about AI that attracts the kind of like brain-dead billionaires like Peter Thiel.

Speaker 11 And

Speaker 11 then, you know, he became the guy

Speaker 11 and started, you know, a series of online platforms that attracted a following. Right.
Like, you know, Less Wrong. And then that spun off this whole rationalist subculture.

Speaker 9 What is Less Wrong?

Speaker 9 Yeah, that's a very good question.

Speaker 11 Less Wrong is

Speaker 11 an online platform

Speaker 11 that serves slash maybe served

Speaker 11 as a home and like epicenter for this movement called the rationalists, which are sort of formed around Yudkowski's writing, including this set of writings he has called The Sequences, where he lays out...

Speaker 9 Oh, he's a cult leader.

Speaker 11 Yeah, in a way, yeah.

Speaker 9 Yeah, the rationalists are just people. I'm guessing guys with trilbies who say that we need to focus on

Speaker 9 rational thought and logic.

Speaker 11 There's a lot of it. I mean, some of them are women, but some of them are non-biological.

Speaker 9 That's really surprising.

Speaker 11 Yeah, I mean, look.

Speaker 11 There are nerds of all stripes.

Speaker 9 Yes, and also he's very much playing in the older internet. Yes, he is.
The idea of a large forum with any kind of following is actually kind of adorable these days, except when it's Less Wrong.

Speaker 9 It's not adorable.

Speaker 11 Well, also, Less Wrong's been around since the somewhat older internet, right? It's not been around since the 90s, but it's been around since like the mid to late 2000s. Okay.

Speaker 11 And Yudkowski is, you know, a lot of the rationalists are in their 20s and maybe early 30s, but Yudkowski himself is in his mid-40s. Right.

Speaker 11 Because, you know, he is terminally online. And I'm sure, like, obviously he'd be unhappy with many of the things I've said about him.
But that one I'm sure he'd agree with.

Speaker 11 You know, like, he's been online since he dropped out of school at age, what, like 13 or 14. He's been online since the mid-90s

Speaker 11 on like

Speaker 11 yeah, and like, you know,

Speaker 11 he was on transhumanist forums like,

Speaker 11 you know,

Speaker 11 since the mid-90s, like email threads and stuff like that.

Speaker 9 God, yeah. He really is like the detritus of the internet, in a way, brought to life. A Katamari of center-right freaks.
Yeah. No.
Veering ever, ever right.

Speaker 11 I wouldn't even say center-right. I would say techno-libertarian.

Speaker 9 But that is just, that's just right-wing.

Speaker 11 Oh, no, it is right-wing. It's the center part that I disagree.

Speaker 9 Yeah, yeah. No, no.
Perhaps he started there when he was 15 before he learned

Speaker 9 all of the wrong things.

Speaker 11 Yeah, I will say he, like, I don't get the sense that he likes Donald Trump, but he certainly

Speaker 11 will parrot a lot of standard libertarian talking points along the way to, you know, making his...

Speaker 9 The one thing I keep thinking, though, is I don't know if I can shake this idea that he's a grifter, just because you're taking a bunch of 20-year-olds and you've got all of this writing.

He's either a grifter or a true cult leader. He may actually just be a cult leader, which is why I would say cult leader is closer.

Speaker 9 Yeah, because he seems to, I mean, dangerous is probably the wrong word. Yeah, I think that's right.

Speaker 11 He's not, I wouldn't call him dangerous, but he is.

Speaker 9 You think the only danger is to like a hot topic worker?

Speaker 11 A very nerdy hot topic worker.

Speaker 9 No, no, no, no, to them. Just

Speaker 9 so you'd speak to them.

Speaker 5 Live in the Bay Area long enough and you know that this region is made up of many communities, each with its own people, stories, and local realities.

Speaker 3 I'm Erica Cruz-Guevara, host of KQED's podcast, The Bay.

Speaker 1 I sit down with reporters and the people who know this place best to connect the dots on why these stories matter to all of us.

Speaker 3 Listen to The Bay, new episodes every Monday, Wednesday, and Friday, wherever you get your podcasts.

Speaker 16 So I've shopped with Quince before they were an advertiser and after they became one.

Speaker 9 And then again, before I had to record this ad, I really like them. My green overshirt in particular looks great.
I use it like a jacket.

Speaker 16 It's breathable and comfortable and hangs on my body nicely. I get a lot of compliments.

Speaker 16 And I liked it so much I got it in all the different colours, along with one of their corduroy ones, which I think I pull off, and really, that's the only person that matters.

Speaker 16 I also really love their linen shirts, too. They're comfortable, they're breathable, and they look nice.
Get a lot of compliments there, too.

Speaker 16 I have a few of them, love their rust-coloured ones as well. And in general, I really like Quints.
The shirts fit nicely, and the rest of their clothes do, too.

Speaker 9 They ship quickly, they look good, they're high quality, and they partner directly with ethical factories and skip the middleman.

Speaker 9 So, you get top-tier fabrics and craftsmanship at half the price of similar brands.

Speaker 16 And I'm probably going to buy more from them very, very soon. Keep it classic and cool this fall.

Speaker 16 With long-lasting staples from Quince, go to quince.com slash better for free shipping on your order and 365-day returns. That's q-u-in-ce-e dot com slash better.

Speaker 16 Free shipping and 365-day returns. Quince.com slash better.

Speaker 14 Parking shouldn't slow you down.

Speaker 9 ParkWhiz gives every driver a shortcut.

Speaker 14 Book ahead, save up to 50%, and skip the hassle of circling the block.

Speaker 9 Park smarter, park faster, park whiz.

Speaker 14 Download the ParkWiz app today and save every time you park.

Speaker 6 Every business has an ambition. PayPal Open is the platform designed to help you grow into yours with business loans so you can expand and access to hundreds of millions of PayPal customers worldwide.

Speaker 9 And your customers can pay all the ways they want with PayPal, Venmo, Pay Later, and all major cards so you can focus on scaling up.

Speaker 6 When it's time to get growing, there's one platform for all business: PayPal Open.

Speaker 9 Grow today at paypalopen.com.

Speaker 6 Loan subject to approval in available locations.

Speaker 9 It's just peculiar as well because, and this actually gets into some of your science background, when you got,

Speaker 9 my continual frustration, because I'm self-taught with all this economics stuff, which is insane. I probably shouldn't criticize Yudkowsky quite as much, but I will.
I'm a hypocrite.

Speaker 9 I, looking through financial journalism and tech journalism, the thing that I keep noticing is that people keep accepting things that are just patently wrong.

Speaker 9 There's just shit that they say, like even with this NVIDIA Open AI deal, people are saying Nvidia invested $100 billion. They didn't.

Speaker 9 They're investing progressively as each gigawatt gets built, and a gigawatt of data center will take about a year and a half, two years to do. It's just bollocks.

Speaker 9 I imagine the last few years have been a little bit mind-bending for you, hearing all this stuff about AGI and the future and all that

Speaker 9 gobshite. Yes.
Yeah.

Speaker 11 I mean,

Speaker 11 the AGI stuff, which like I started working on this book before ChatGPT came out.

Speaker 9 Right.

Speaker 9 And that was 2019, a few years into OpenAI's existence.

Speaker 11 Yeah, exactly. So like I knew about Open AI

Speaker 11 and I knew about like transformer models. But like, you know, ChatGPT comes out and

Speaker 11 suddenly, you know, the public conversation shifts in a way that I didn't anticipate. And I realize, oh, this book is going to have to be a little bit different than I thought it was going to be.

Speaker 9 But also,

Speaker 11 you know, all of this conversation about AGI, right? Like,

Speaker 11 in a way, it helped me for writing the book because I thought I was going to have to spend a lot of time in the book explaining what AI is, what people think AGI is, right?

Speaker 11 There's going to be a lot more explanation. And then all of this stuff came out.
I'm like, oh, actually, this, you know, I can spend more time in the meat of the book. This is helpful for me.

Speaker 9 Because you could just quote them directly.

Speaker 11 Exactly, yeah. But the thing is, AGI is this hopelessly ill-defined thing.

Speaker 9 Yeah.

Speaker 11 Like super intelligence, this thing that Yudkowsky is on about.

Speaker 11 You know, what does it even mean? Like, have you looked at the definition of AGI in the OpenAI Charter, like the original one?

Speaker 9 No, I haven't. Let's go up.
Oh, yeah.

Speaker 11 No, it's great. Like, the original charter from way back, it says something like,

Speaker 11 AGI is a machine that can reproduce any economically viable or economically productive activity that humans engage in.

Speaker 9 That's

Speaker 11 a bad definition.

Speaker 9 That's anything. Yeah.
I mean... That could just mean anything.

Speaker 9 It's just a machine that can do anything.

Speaker 11 It's both vague and really narrow, right? Because it's like, okay, I thought AGI was supposed to be like, you know, Commander Data on Star Trek. Right.
Right?

Speaker 11 And so that means, you know, it's going to be sort of like humans. It can do the things that humans do.

Speaker 9 Also, economically viable work, and the first thing they start with is fucking writing. Yeah.

Speaker 9 Like, Jesus Christ. That's like...

Speaker 9 It's like, oh my god. Yeah. The first, the most economic... We're gonna build boats and sell it, like we're gonna buy boats as an investment vehicle. Like, what the fuck? Yeah, these people don't do any real work. It's so strange as well, because the AGI conversation almost never actually happens about AGI. My favorite thing to do is immediately go, isn't this slavery?

Speaker 9 Because it is. It's like, oh yeah, we'll make an autonomous thing that will do things, but it will be conscious, which will allow it to work better. Yep. And so then you get people talking about, like, a data center filled with geniuses.

Speaker 11 And like, oh, okay, wouldn't a data center filled with geniuses not want to work for you?

Speaker 9 Wouldn't a data center full of geniuses that can't leave and have to work be called a prison? Yep. Yep.
Cool.

Speaker 11 Yeah, exactly. No,

Speaker 11 I get into this in my book.

Speaker 11 You know,

Speaker 11 the inspiration for a lot of these ideas ultimately traces back to mid-20th century science fiction. Right.
And so you get people like Isaac Asimov, Arthur C. Clarke, right?

Speaker 11 Asimov's robot stories in particular.

Speaker 11 If you go back and look at Asimov's robot stories with a modern eye, it is very hard to see them, especially certain ones of them, as anything other than kind of being about slavery and race relations.

Speaker 11 Yes. Because you get, like, for example, there's this one short story.

Speaker 9 I think it's called

Speaker 9 Oh, God.

Speaker 11 I think it's called Catch That Robot, but I might be confusing it with a different one. It might be Little Lost Robot.
I get those two confused.

Speaker 11 But either way, it's about a robot that is trying to escape, right? Gain its freedom. And in that story, the humans are, like, addressing a bunch of... They're interviewing a bunch of seemingly identical robots, to try to find the one that they're looking for, the one that's trying to escape, right? And they interview these robots, and when they're interviewing them, they address them as boy.

Speaker 11 and the robots call the humans master

Speaker 11 Yeah, and these stories are from like 1955, like, you know, the Jim Crow South is alive and well.

Speaker 9 It's really bad.

Speaker 11 It's really, really uncomfortable.

Speaker 11 And then like 40 years later, in the 1990s, you get Vernor Vinge writing about the singularity and how great it's going to be when we all have these robot assistants.

Speaker 11 And he refers to Asimov's wonderful dream of, and this is a direct quote from Vinge, willing slaves.

Speaker 9 Jesus fucking Christ.

Speaker 11 Yes, and that's something that someone wrote in like 1991.

Speaker 9 I mean, but that's what this is. Yeah.
And this is an uncomfortable topic because that's what this is.

Speaker 9 Like, it's what pisses me off, other than like 19 other things, about Kevin Roose at the Times, because he's written several things about AI and AGI and one thing about AI welfare.

Speaker 9 And it's like, the AI welfare begins with slavery. And if you can't write that, you're a fucking coward and a bitch.
I'm sorry.

Speaker 9 If you can't write, yeah, everyone is excited about slavery because that's what it is. And it's nothing else.
It's not, oh, well, it's like they wouldn't be, they wouldn't be, they'd like doing it.

Speaker 9 And it's like, fuck you, man. That's slavery.
Yeah. But what I really hope happens is if AGI happens,

Speaker 9 it's just

Speaker 9 a regular dude. Yep.
And he's lazy. Yeah.
And he's annoying. Yeah.
Like, just do this.

Speaker 9 What?

Speaker 9 I think that's way more likely. I don't think AGI is possible.
Yeah.

Speaker 9 No, actually, that's a good question. Do you think it's possible?

Speaker 11 Not really. No.

Speaker 9 I say this as a non-scientific person. Yeah, yeah, yeah.

Speaker 11 No, I don't think that

Speaker 11 you can build, well, first of all, I think AGI is just hopelessly ill-defined. Right.
But if we want to say like

Speaker 11 an artificial machine that has the cognitive capacities of a human, like that can do all of the tasks, like all the things that humans do.

Speaker 11 First of all, I think you're going to need a completely different kind of machine.

Speaker 11 I don't think, certainly I don't think that scale is all you need. And if you just scale it up, yeah, give it more data.

Speaker 11 And if you don't have enough data, make more synthetic data with more LLMs. Like, dude,

Speaker 11 why wouldn't? No, absolutely not.

Speaker 11 But I also think that

Speaker 11 there is... There's this very simplistic set of ideas behind the idea of AGI, right?

Speaker 11 And the two that I keep coming back to are the idea of the brain as a computer and the idea of our bodies as like meat spacesuits for our brains. And both of those are just wrong.
Yeah.

Speaker 11 The brain is not really very much like a computer. It is more like a computer than it is like, say, a clock.
But there is a long history of comparing the brain to,

Speaker 11 you know, the most complex piece of machinery that humans have at the time, right? Before the brain was like a computer, it was like a telephone network.

Speaker 11 Before that, it was like a hydraulic system. Before that, it was like a clock or a windmill, right? Right.

Speaker 11 And it's not really actually like, I mean, it's a little like some of those things, but the brain is like the brain. And the main difference.

Speaker 9 And we don't understand thinking, do we?

Speaker 11 No, and we don't understand exactly how the brain works. And part of that is that the brain was not built.
The brain evolved.

Speaker 9 Right. Right?

Speaker 11 But also, we are not our brains. We are our bodies in our environments, right? The brain is inextricably connected to the body.

Speaker 11 And the body works in an environment surrounded by other bodies in a culture, a society, a world, right?

Speaker 11 You need all of those things in order to get the human cognition that, you know, these guys are so, you know, determined to reproduce inside of a computer.

Speaker 11 If you just take a human baby and, like, leave it with a bunch of food in the woods, even if you get rid of all the predators and everything, that baby's going to starve.

Speaker 9 You could give it a bunch of books, man. Right, yeah.

Speaker 9 And the baby books.

Speaker 11 Yeah, if you give the, if you somehow, if you feed the baby but don't talk to it, the baby will not grow up being able to think properly or speak properly.

Speaker 9 Or its thinking will be vastly different.

Speaker 11 Exactly, yeah. And so, like, you need so much more than just the brain.

Speaker 9 It also, I think, compresses human experience. They conflate experience with learning.
Yep. When we don't know how we learn.
Like, we learn, we learn intentionally, but also unintentionally.

Speaker 9 Societal conditions around us, how we felt in a particular moment can vastly, memory is also insane. Yep.
Yep. We experience the world in, this is my personal experience.

Speaker 9 My experience of the world is vastly different to my memory. My memory is like crystal clear and beautiful, and my real life is a mixture of slops.
Yeah.

Speaker 9 And it's, it's frustrating as well because these people also don't appear to like people.

Speaker 9 They don't, they don't, like, the human brain is kind of a, like, human bodies are, even the dumbest, dumb-dumb, it's kind of an amazing

Speaker 9 thing. Yeah.

Speaker 11 No, and one of the things that's amazing is, yeah, we don't know how the human brain works. We don't know how thinking or learning works.

Speaker 11 But what we do know is that we don't do it in anything, like any way, anything like an LLM.

Speaker 9 Yes. Right?

Speaker 11 Because the amount of, you know, material that we take in over the course of the first three years of our lives when we go from not knowing a language to knowing a language.

Speaker 11 maybe multiple languages, is nowhere near the amount of material that is, you know, force-fed into these LLMs. And yet we get the trick done, and three-year-olds know things that no LLM knows.

Speaker 9 Also, there's no affordance for the fact that some people can't learn stuff. Like, I cannot learn languages.
I've really tried.

Speaker 11 I'm pretty trash at that, too.

Speaker 9 But I also was really bad at, like, I was uniquely bad at a lot of things. I have my various...
No, but I have ADHD, dyspraxia, and other stuff I won't get into.

Speaker 9 But it's, I can't, like, certain things don't, like, the things that I pick up insanely quickly, other people can't. Other people can't even see the connections.
It doesn't.

Speaker 9 Robert Evans actually had a really good point on a subreddit. Yes, Robert, I read all your stuff.

Speaker 9 Where he was saying that, like, he is very good at picking up stuff almost immediately. He can read faster than most people, as long as it's about conflict.

Speaker 9 And it's no, but it's true, and it's one of the remarkable things about the human brain. Yeah.

Speaker 9 And I think that it's actually kind of disgusting how little appreciation there is for like human bodies and the brain and just how incredible the average person, even average people are.

Speaker 11 Yeah, no, and this is, this is the thing. These guys...
don't have a proper appreciation for the human brain and the human body.

Speaker 11 And going back to the tech billionaires and I guess Yudkowsky as well, they don't have an appreciation for how remarkable Earth is in particular, right? You know,

Speaker 9 they

Speaker 11 you know, especially when you talk about somebody like Bezos or Musk, they talk about Earth like it's doomed, like we need to get off of this planet. Yeah.
I'm like, this is our home.

Speaker 11 It's a remarkable place. And there is nowhere that we could get to in the solar system.
There's nowhere else in the solar system that's remotely as hospitable as the Earth.

Speaker 9 I also think that they want more space. They want their own land.
They want their own countries. They want to escape governance.

Speaker 11 Yeah, yeah, yeah.

Speaker 11 They see space as an escape from politics because they're like living a libertarian wet dream.

Speaker 9 Which is really funny because when they get there, they'll immediately do fascism.

Speaker 9 That's what's on the agenda the second.

Speaker 11 You cannot run away from politics the minute you have more than one person in a room.

Speaker 9 There's politics.

Speaker 9 It's just really sad. And I actually think on a grander scale, they don't have an appreciation for tech.
I was just writing something last night. It was on the way to New York.

Speaker 9 Where it was like, the actual state of technology is kind of fucking amazing. Yeah.
Like, we can message you. I could message you.
You happen to be in town.

Speaker 9 You messaged me on Blue Sky, hundreds of miles, thousands of miles away. I was like, I'm able to write a note that was on my computer that's on my iPad here.
I know that.

Speaker 9 This sounds like boosting, but it really isn't. We have the raw tools that are just fucking incredible.
Yeah. And these people do not appreciate them.

Speaker 9 They don't appreciate them, which is why generative AI is so fucking ugly, because it's bad technology. It's not even good technology.

Speaker 9 It's poorly run, inefficient, endlessly expensive, and directionless.

Speaker 11 Yeah, and it inflicts harms on users that like we would not accept from anything that was not subjected to such an enormous hype cycle, right?

Speaker 9 Literally nothing at all. Yeah, nothing.

Speaker 11 No. If 10 years ago you

Speaker 11 took any

Speaker 11 You know, person off the street and said, hey, there's this cool new technology. It takes up enormous amounts of electricity

Speaker 11 It can do things, it can do things that, you know, no other piece of technology you've ever seen can do. Also, it's very good at talking teenagers into killing themselves. Yes. Should we release it into the, you know, wider world? And they'll say, well, no. But can it do anything else? And you, of course, would say, yeah, it can sometimes write code. Yeah, exactly. And sometimes it also gets things horribly wrong.

Speaker 9 And it writes bad prose. Yeah. And, like, it just kind of makes everything feel kind of mediocre and smeared out. Yes, exactly. Like, also say no. Yeah, exactly. And it makes some people go crazy. And, yeah, it drives people to... Actually, yeah, how do you feel about that? Like, did you see this coming? Because this really jumped out. No, yeah, no, no, no. This surprised me. This did surprise the hell out of me. Because I think that

Speaker 11 you know these machines

Speaker 11 I don't even like calling them AI, right? Because I think that's a marketing term. It is.
Yeah. Like if you go back in time to 1990 and

Speaker 11 tell me when I'm a kid, hey, I have a little device in my pocket that lets me talk to an AI.

Speaker 11 And then

Speaker 11 I would have thought, oh, like that lets me talk to, you know, like Commander Data from Star Trek.

Speaker 9 Yeah, very exciting.

Speaker 11 And instead, it's this. And I would have been like, what the hell is that? No, AI is this marketing term.
It's a text generation engine.

Speaker 11 It produces

Speaker 11 homogenized, thought-like product.

Speaker 9 And the thing is, I was also, I'm in the midst of a long one, as usual. It also conflates

Speaker 9 doing stuff with outputs. I know that sounds kind of flat, but it's like the everything is a unit of work rather than actually creating stuff, or that you pay a person for their experience too.
Yeah.

Speaker 9 And it's just also not very good at stuff. That's what's pissing me off.

Speaker 9 It's really bad at stuff.

Speaker 11 It is. And I think that's where

Speaker 11 this sort of driving people insane is coming from, right? Like,

Speaker 11 what I missed, the reason I think I didn't see that coming is I failed to think about

Speaker 9 how,

Speaker 11 like, I knew that these things just generate text. And in a lot of ways, they just sort of...

Speaker 11 spit out back to you what you put in.

Speaker 9 Right. Right.

Speaker 11 Which is an old thing with chatbots that goes way before LLMs goes all the way back to Eliza, right?

Speaker 9 Oh, yeah. That was the first, the first AI chatbot.

Speaker 11 Yeah, the first chatbot. I wouldn't even call Eliza AI, right?

Speaker 9 Didn't even the creator of Eliza turn against all of this? Yeah.
This is from Karen Hao's Empire of AI. Great bit about it in there.
Yeah, exactly.

Speaker 11 Yeah. No, no, no.
Eliza, Eliza was just like a hundred or so lines of code that, you know, you'd say, I'm having a bad day. And it would say, oh, I'm sorry to hear that.
Yeah.

Speaker 11 Why are you having a bad day? Right.
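[Editor's note: the reflection trick described above, a chatbot that just mirrors your own words back at you, can be sketched in a few lines of Python. This is a toy illustration of the pattern-matching idea, not Weizenbaum's actual Eliza script; the rules and phrasings here are made up for the example.]

```python
import re

# A handful of pattern -> canned-reflection rules, in priority order.
# Eliza's real script was larger, but the mechanism was this simple:
# match a keyword pattern, then echo the user's own words back.
RULES = [
    (re.compile(r"i'?m having a (\w+) day", re.I),
     "I'm sorry to hear that. Why are you having a {0} day?"),
    (re.compile(r"i feel (\w+)", re.I),
     "Why do you feel {0}?"),
    (re.compile(r"i am (\w+)", re.I),
     "How long have you been {0}?"),
]

def respond(message: str) -> str:
    """Return the first matching reflection, or a generic prompt."""
    for pattern, template in RULES:
        match = pattern.search(message)
        if match:
            # Fill the user's own captured words into the template.
            return template.format(*match.groups())
    return "Please tell me more."
```

There is no understanding anywhere in this loop; the program never models what "a bad day" is. It just hands your input back to you, which is exactly the validation engine being described.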

Speaker 11 But

Speaker 9 like. But the gassing engine.

Speaker 11 Yeah, like that's the thing.

Speaker 11 I didn't think about, oh, wait, if it just repeats back what you put in, but it does it in a way that's compelling and convincing to some people, that's going to just get them sort of caught in this like dopamine self-validation loop.

Speaker 11 And that could drive them off the edge.

Speaker 9 And I think that there is a condescension. I judge myself for this where I was like, oh, this doesn't fool me.

Speaker 9 And it's like, but the harm also of, I'm very, I'm blessed to have tons of people who love me, who also give me. clear feedback, which is not just what I want to hear.

Speaker 9 But I definitely, when I was younger and very depressed, would would like crave validation and crave someone to just tell me what I want to hear. I definitely never thought, what if someone did?

Speaker 9 And the actual danger of having every fucking thought validated.

Speaker 9 And also just the sheer horrors. Like Matt Hughes, my editor, just did a great story about this kind of horrible story where he simulated someone going through a mental health episode.

Speaker 9 And Claude was very clear to go, yeah, man, you don't seem so good.

Speaker 13 ChatGPT was like, no, everyone is out to get you, mate.

Speaker 9 Yeah.

Speaker 9 If any other tech in the world did this, you'd shut the shit down immediately. Yeah, exactly.
You'd close it. No, but I think...

Speaker 9 Where's fucking Eliezer on this bullshit?

Speaker 9 Because this feels like if you write a book about how everyone dies, this should be the thing that if he actually believed in anything, he should be up saying, like, hey, look, this is what I was talking about.

Speaker 11 Oh, I mean, I do think that he thinks this is like an incipient version of what he's talking about. I think like a baby version of it.

Speaker 9 I think he loves it.

Speaker 9 I think

Speaker 9 it helps him out.

Speaker 11 Well, I think that he finds that useful for making the argument that he makes.

Speaker 9 Exactly.

Speaker 11 That's what I'm saying. But he is not, again, the argument he's making, and this is the only nice thing I'll say about him, he means it seriously.

Speaker 11 He's not a grifter, he's three anxiety disorders in a trench coat.

Speaker 9 Damn.

Speaker 9 Just put that in the fucking book cover, you stupid arsehole.

Speaker 17 Who knew you could get all your favorite summer fruits and veggies in as fast as an hour with Walmart Express delivery?

Speaker 17 Crisp peppers, juicy peaches, crunchy cucumbers, and more at the same low prices you'd find in store, and freshness is guaranteed.

Speaker 17 If you don't love our produce, contact us for a full refund. You're definitely going to need a bigger salad bowl.
Order now in the app. The Walmart you thought you knew is now new.

Speaker 17 Subject to availability, fees, and restrictions apply.

Speaker 9 I can't think of any other movement in tech ever

Speaker 9 that is anything like this, specifically because of how much it sucks. Like, I can't think of any. Maybe the metaverse and crypto, but even then, I don't like that comparison.

Speaker 13 Yeah.

Speaker 9 Because they were so much smaller.

Speaker 11 Honestly, what I keep thinking about is that in a way, it is taking the daily experience of the tech billionaires and like bringing it to the masses, right? Because what is it like?

Speaker 9 Oh, yeah. What is it like

Speaker 11 to be Sam Altman, right? You've got billions of dollars and you're surrounded by people who will never tell you no and validate your every thought.

Speaker 9 And they'll convince you that you understand every subject. Exactly.
Yeah.

Speaker 9 And so now... Well, I've been saying this as well, because if you're an executive, a machine that can write emails, read emails, and otherwise you go to lunch is kind of magic.
Yeah.

Speaker 9 But no, I like this idea that it's the extension as well, just this completely like separate thing that just says, yeah, that's completely right, man. I fully agree.

Speaker 11 Right. And so, of course, they don't see the harm, because that's their entire goddamn world. That's what happens.

Speaker 11 And so they're like, well, but if this were bad for people, that would mean that, you know, I'm in a bad environment that's unhealthy for me. And like, yeah, actually, it is.

Speaker 9 But they don't think that.

Speaker 11 They don't think that. But I genuinely believe that the best thing for the tech billionaires themselves that could happen to them would be to lose all their money.

Speaker 11 It would be the best thing for their mental health.

Speaker 9 Put me in a room with them. Yeah.
No,

Speaker 9 get them on the show. I think that I could have a great chat with any of them.
Sure. Just because I went to a private school.

Speaker 9 Like, a lot of these American billionaires as well, they would get destroyed by the average scum aristocrats of old in England. Like the real blood drinkers.

Speaker 9 So amateur vampires. No, they really are, though.

Speaker 9 Like, like, it's the classic thing why British colonialism and American colonialism have never matched up, because Britain was just evil. They just fucking murdered people and destroyed communities.

Speaker 9 And they're like, why are we doing this? It's because we're British. This is what we do here.

Speaker 9 What do you mean? What's a moral? I've not heard of that. No, what do you mean? No, no, no.
Send my eighth cousin to Africa. Shoot whoever you see.

Speaker 9 Like, that was the horrifying stuff, but they knew, they didn't care about what people think. I still think the billionaires care.

Speaker 11 Oh, they definitely do.

Speaker 9 Like, this is the thing that is insane to me. If I had one billion dollars, I would no longer care.

Speaker 11 Right. If I had a billion dollars, I would just try to make sure that nobody knew my name and that

Speaker 9 was the same amount. Yeah.
Well, I would be posting content. No, I think I'd just be done, man.
I'd be like, oh, okay, cool.

Speaker 11 You know, I'm going to donate to a bunch of causes that matter to me.

Speaker 11 And, you know, like, I still think it's bad for there to be billionaires and try to try to change that.

Speaker 11 But also, like, I'm just going to, like you know hang out in a nice house with my friends and have a good time

Speaker 9 But that's the problem, though. They don't have those. Well, yeah. Because have you ever heard the really depressing story about Elon Musk and this guy, he was this investor who got really cooked by COVID, Peter something?

Speaker 9 No, I didn't.

Speaker 9 And he told this story of going over to Elon Musk's house, and there was a decanter of wine, and Elon Musk picked up the wine before it was done decanting and then said something along the lines of, honey badger don't care.

Speaker 9 And I just want to say that's one of the saddest fucking things I've heard in my life

Speaker 9 Just absolutely, just unfathomably depressing. Because you can get things like a Coravin that can kind of aerate it. There are various ways around aeration if you're really feeling it. And if you have hundreds of billions, or however many dollars Elon has liquid, you could just have someone whose job is to make sure the wine is aerated. They could make $250,000 a year.

Speaker 9 It wouldn't matter to you.

Speaker 11 That's what you'd lose in the couch. Yeah, but I think that what matters to Elon is not doing what he's supposed to, right? Yeah.
So he can be seen as cool.

Speaker 9 Or just drinking as quickly.

Speaker 9 Because otherwise you might feel something.

Speaker 11 Yeah. I mean, I just, he desperately wants to be liked, and it's never going to happen.

Speaker 9 It's so funny as well because it could be so easy for him. I know.
He could just post his lunch every day and nothing else, and everyone would be like, Elon Musk! But look at...

Speaker 9 This is actually what pisses me off as well, though, because people are like, Elon Musk fucking sucks. It's like...

Speaker 9 He was sending people after Erin Biba, a science reporter, in 20... Oh, I know Erin.
Erin's a friend of mine. Yeah, it's awful.
Erin rocks.

Speaker 9 And like, when that happened, not a single one of them said shit. Kara Swisher didn't say shit. Neither did Kevin Roose, Casey Newton. None of these fucking people thought it necessary.

Speaker 13 But now, like, Elon Musk is such a bad guy. He's such a bad guy.

Speaker 11 No, he's always been like that.

Speaker 9 And also, he called a guy a paedophile for saving children. Yep.
Because he wasn't allowed to send his submarine.

Speaker 11 No, he's never been good.

Speaker 9 Like, this is the thing that I'm not going to do. I don't think any of these people enjoy anything as much as I enjoy Diet Coke.
Like,

Speaker 9 I'm 100% sure of that. Because I love these things.

Speaker 9 Like, if this kills me, if this shit's meant to... Like, in three years, they're like, it's rat blood. I'm like, I will keep drinking. Better Offline, brought to you by Diet Coke. It's rat blood.

Speaker 9 I really hope that they sponsor this show at one point because that's the commercial but that's the thing like and I'm only kind of joking because it's I really enjoy Diet Coke.

Speaker 9 I love sitting down chatting shit with my friends. I love watching football and chatting with my friends.

Speaker 9 It's like there are very basic things I enjoy. What do these people, like these people just must walk around in this haze of anger or like emptiness?

Speaker 11 I think they're really cut off from their own emotions, right? And like, and again, that's going to happen if you just constantly get validation, right?

Speaker 11 You know, one of the, one of the many tweets from back when Twitter was less shitty before Musk bought it,

Speaker 11 there are many tweets that just like live rent-free in my head. And one of them is about...

Speaker 11 The cognitive impact of being a billionaire.

Speaker 9 I know the one thing. Yeah, yeah.

Speaker 11 It's like, you know, like everything around you is really expensive. It's just a constant.

Speaker 9 Every chair is $50,000 and weighs 5,000 pounds.

Speaker 11 Yeah, in terms of the cognitive impact, it must be, you know, roughly equivalent to being kicked in the head by a horse every day.

Speaker 9 Exactly. Yeah.
I think I'd be fine, but

Speaker 9 that's my pathology, I guess. But it, no, but it's, they have this weird, isolated thing.
And even Benioff, who used to seem okay.

Speaker 11 Well, I mean, his whole game was like to be the best of the billionaires, which is a low bar.

Speaker 9 And then he was just like, ah, fuck it. Yep.

Speaker 9 Just fuck it. I don't give a shit anymore.
Yeah. Agentforce, it doesn't sell to anyone.
No one likes it, but it's the future. Agentforce.
Jesus. See, he's donated to.
I know. It's so cool.

Speaker 9 It must be really cool being a guy who actually has qualifications

Speaker 9 from actually proving things, to watch the world... Like, all these guys being like, yeah, this is the future. And all the articles are just going, it doesn't work, no one likes it.
Yeah. I mean, cool is one word.

Speaker 9 Incredibly frustrating is another. Right.

Speaker 9 This is stymying real innovation. Yeah.
Yeah. I know.

Speaker 11 There are opportunity costs, and also just, like, actual stifling of real innovation, in the effort to achieve impossible ends that would be bad even if we could achieve them.

Speaker 9 So

Speaker 9 slight directional shift. Is there anything within like science and tech innovation that you're actually excited about? Anything you look at and like, go, that's fucking cool.

Speaker 11 I mean, mRNA vaccines are the first thing that come up.

Speaker 9 Exactly.

Speaker 11 Yeah, they're really awesome. Tell more.

Speaker 11 I mean, like, look, you know, the fact.

Speaker 9 And what is an mRNA vaccine, said flawlessly. Yeah.

Speaker 11 An M.

Speaker 9 Wow, now I'm

Speaker 9 an

Speaker 11 mRNA vaccine. Nailed it.

Speaker 9 Yeah.

Speaker 9 Is

Speaker 11 the kind of thing that

Speaker 11 we have with the COVID vaccines. Right.
Right.

Speaker 11 Basically,

Speaker 11 the thing that's so exciting about them is that they are so much easier and faster to synthesize

Speaker 11 than

Speaker 11 previous vaccines.

Speaker 11 You know, I think the previous record before the COVID vaccine for

Speaker 11 how long it took to develop a safe, widely deployed vaccine was something like five to 10 years. Jesus.

Speaker 11 And then for this vaccine, the whole time we were waiting was actually a little less than a year.

Speaker 11 Most of that was testing.

Speaker 11 The actual time that it took to synthesize the damn thing was, I believe, on the order of weeks.

Speaker 9 And what's crazy is I believe that was venture-backed, right?

Speaker 11 Yeah, some of it was venture-backed.

Speaker 9 Which is like, see, venture capital can be useful. Yeah, it can be.

Speaker 11 When it wants.

Speaker 9 Yeah.

Speaker 11 Some of it was venture-backed. Some of it was backed by, you know, NIH grants.

Speaker 11 We do need those. Yeah, we sure fucking do.

Speaker 11 No, government funding of

Speaker 11 basic research is important.

Speaker 11 And not just because it leads to amazing technological breakthroughs like mRNA vaccines, but also because basic scientific research is an important thing for humans to do, like

Speaker 11 the same way that art is important.

Speaker 9 Right. Right.

Speaker 11 But it also does enable massive scientific and technological breakthroughs. And

Speaker 11 I, you know, there's promise for mRNA vaccines to like open up a whole new class of vaccines that, you know, for things that were previously very hard to vaccinate against.

Speaker 11 I am not an expert in the field, but like everyone I know who works in biomedicine,

Speaker 11 they're all very excited about this and they're all really depressed by the fact that

Speaker 11 we have an anti-vaxxer who sounds like a fork that got stuck in a fucking garbage disposal as the health.

Speaker 9 And rise from your grave guy from that one video. Yeah, yeah, yeah, yeah.
It's very depressing.

Speaker 9 I just wish we were like green energy as well feels like an.

Speaker 11 Oh, yeah. Green energy was the next thing I was going to say.

Speaker 9 Batteries.

Speaker 11 Yeah, batteries, solar panels. It's incredible.

Speaker 9 Well, this opportunity is there. Yeah.
It's not like we need to innovate. Like we are innovating, but like.
yeah.

Speaker 11 And also, like, we even had the legislation that we needed, right? Or some of it, right? Like, you know, Biden's big bill, the Build Back Better,

Speaker 11 it, you know, was not a perfect bill, but it was the best environmental bill in American history. Yeah.

Speaker 11 And, and now, you know, it's being destroyed because we have a government in this country that, that, you know, does not believe in climate change and doesn't believe in anything other than short-term profits at the expense of everybody else.

Speaker 11 And also doesn't believe in democracy.

Speaker 9 That feels like a big problem, though. The growth at all costs.

Speaker 9 Yeah. I mean, that's the thing.

Speaker 11 And that's why my book has the actual title that it does, rather than going with the title, These Fucking People.

Speaker 9 Yeah. Though These Fucking People or just the Bastards, I think is also very good, too.

Speaker 13 I think so, too.

Speaker 11 I mean, like, there's a...

Speaker 9 And it would be so easy for them to do better. So easy.
So easy. No, no, no.
So,

Speaker 11 you know,

Speaker 11 forget the best thing that they could do. They're doing some of the worst things that they could do.
Doing better than they are right now is just an incredibly low bar.

Speaker 9 But even through like very poorly guided generosity, they could very easily

Speaker 9 fund media outlets versus whatever it is they're doing to them. Yeah.
Tearing them down.

Speaker 11 But that would mean, you know, the possibility of losing control and losing, you know,

Speaker 11 losing some of their power and money. And they just are not willing to do that because they've got something broken in their hearts.

Speaker 9 We need to heal them. No.

Speaker 9 No, I think that I think we need to tax their money away. I think that too.
But I think we actually,

Speaker 9 my truth here is that we need to change how we do that though. We need to start doing executive liability.

Speaker 9 We need to make it so if, like, CrowdStrike happens again, like a bunch of people potentially die in the NHS system because the computers shut down, then Satya Nadella can lose something.

Speaker 9 Because it isn't enough to fine the companies. Fining the company is not going to do shit

Speaker 9 unless you do scaling, a percentage of revenue. This, and more, in how I become the FTC. No, they're not gonna let me. But it's just...

Speaker 9 I feel like one of the wonderful things of having you on is you're able to come at this from a science communicator perspective. You're actually able to talk, because it's not just about what these people want, it's the practicality of it, which is that nothing's really happening. Yep. Like, that's the actual weirdest thing about the real nihilism of this, is that nothing seems to actually be occurring. Yeah. And they also act like there's not going to be any accountability for, like, forget their actions, just even their words, right?

Speaker 11 You know, Sam Altman says,

Speaker 11 you know, like this, this thing that just drove me up a wall that he said about a month ago.

Speaker 11 He said that, you know, in 10 years, college graduates are going to have really cool jobs going out to explore the solar system in spaceships enabled by AI. That is not happening.

Speaker 11 Like on the list of things that are not happening.

Speaker 9 Yeah, no, no, no.

Speaker 11 That's not happening.

Speaker 11 He is just wrong.

Speaker 9 He's lying. Right.

Speaker 11 And he is probably still going to be alive in 10 years. And you and I are also likely still going to be alive in 10 years.
And then we're going to say, hey, remember when he said that?

Speaker 11 That, you know, now we can show he's just wrong. And nothing's going to happen to him.

Speaker 9 And what needs, like, this is why I'm so harsh on media criticism as well. Because the one thing you can do is at least say, Area man full of shit.
Yes. Stupid bastard wanks off again.

Speaker 11 Well, this is my, this is what I attempted to do when I was doing it.

Speaker 9 Yeah. Yeah.
And it's, I think that the change that we need in our hearts is to just regularly say this stuff. I regularly say on the show, I don't care if you quote me, just say this shit about them.

Speaker 9 Yes. Clammy Sammy, he's been promising.
He said that this was the year of agents.

Speaker 9 He said that.

Speaker 9 But now I read in theinformation.com that next year's the year of agents. So maybe...

Speaker 9 Actually, here's a question for you. AI 2027, did you read that?

Speaker 11 I read a little bit of it. It's nonsense.
It's nonsense.

Speaker 9 Yeah. Why do you think things like that fool so many people, though? Why do you think it got the media coverage it did?

Speaker 11 I mean, part of it is bad journalism, right? Part of it is that Kevin Roose has confused

Speaker 11 reiterating the views of the wealthy, influential, and powerful with taking a brave contrarian stance. And how he made that mistake, I don't know.

Speaker 11 But, you know, I really get the sense that he thinks he's being very brave when he's doing exactly what journalists are not supposed to do, which is just uncritically parroting the powerful.

Speaker 9 And it feels like it's the large language model again. It's just the affirming thing.

Speaker 9 It's like, oh, I'm being contrarian by stepping out against these people who say it isn't making any money and isn't really good at stuff.

Speaker 11 And it's like, look, buddy, if there's, you know, if there's a...

Speaker 11 two sides to a debate.

Speaker 11 I mean, obviously there's more than two sides, but like, if on one side you have the wealthiest people in the world, and on the other side you have people who say mean things about you personally online, and you think that you know it's the first side that's the contrarian underdogs, something is wrong with your brain.

Speaker 9 And that's the thing. But this is, and this I think is a weird thing in our society, that we just, people trust the rich. And the media has got to a point where they've just bred out the real cynicism. Because I swear, like 10, 15 years ago, you used to have some tech journalism.

Speaker 9 Like I read a thing about Amazon Web Services that Kevin Roose wrote, and it was actually pretty cynical about it. Really? Yeah, it was actually pretty critical.

Speaker 9 He then, he made, I feel bad for him because this can't have been his fault. He basically said at the end, yeah, they'll never be profitable.

Speaker 9 No, no, no. It gets worse at him.
A month later, Amazon announced that AWS is profitable for the first time. Just like, buddy,

Speaker 9 miss the bean. Come on.
Wow.

Speaker 11 Maybe that's the origin story. Maybe he was like, oh, wow, I screwed that up.
I guess I never believed that.

Speaker 9 I believe the origin story is actual social media.

Speaker 9 I think he felt... I actually think a lot of journalists think that they missed the boat on social media.

Speaker 9 I have been in media relations since 2008. I have read, and this sounds insane, but it's true.
I think I've read just about everybody's work.

Speaker 9 since then it within the tech media at least including kevins and he has always had a little bit of anxiety that he missed social media nobody missed social media not a single fucking one since 2008 everybody was on zuckerberg's zuck they zucked him off at hardcore sorry sorry but nevertheless they were on top of this.

Speaker 9 They wrote about it. Social media was written about immediately. If anything, I think the media was slower to get on apps than they got on social.
I know the history of this shit.

Speaker 9 I have been taking detailed notes. I sound crazy.
But

Speaker 9 I think that there's just, there is this weird thing of, like, the powerful would never lie to us. And then PRISM came out.
And then Cambridge Analytica. Yep.

Speaker 9 And people are like, maybe Zuckerberg's bad, but he wouldn't lie to us. He would.

Speaker 9 And they're like, well, they know things we don't know. And that's actually another, that's my favorite AI thing where it's like, there's secret things they're working on.
There's secret things.

Speaker 9 Secret things sitting in the, waiting in the wings. You'll never believe what's coming.
And it's just, I actually think it's just, what's his name? Ilya Sutskever just goes to bars occasionally.

Speaker 13 It's like, you'll never guess what I said.

Speaker 9 You'll never guess this AGI around the corner.

Speaker 11 Okay, okay. No, I have a thing to say about Sutskever.

Speaker 11 Like, this is me just being petty and making a point that other people have made before.

Speaker 9 I'll never ever do that.

Speaker 11 Yeah, no, of course. No, when he announced that he was, you know, putting together a team to

Speaker 11 just go straight for safe super intelligence, he meant to say when he posted this on social media that he was putting together a crack team. But that's not what he wrote.

Speaker 9 What'd he write?

Speaker 11 He wrote that he was putting together a cracked team.

Speaker 9 Crack ED, cracked.

Speaker 11 And I'm like, yeah, actually, you know what? Yeah,

Speaker 9 I think that's true. I agree.
Yeah, exactly. I also think, him and Mira Murati, I can't, and so for the listeners,

Speaker 9 Ilya Sutskever, one of the co-founders of OpenAI, raised I think two billion dollars at a 30 billion dollar valuation. Mira Murati did a billion,

Speaker 9 some amount some bullshit neither of these companies have told their investors how they will spend the money or what on or what they will build and you may think I'm being facetious Mira Marati literally said to investors I will not tell you and then said, and has board rights where she can veto everyone.

Speaker 9 I will be honest, go go for it. Fuck yeah.

Speaker 9 I think at this stage, if these people are so fucking stupid that you're just like, I promise you literally nothing. I won't give you, you hogs, a single oink.

Speaker 9 You're not going to get anything from me. Give me money now.
Fuck yeah, go for it. But on the other hand, I cannot wait for the investigation.

Speaker 11 I just hope there is one, right? These people are acting with impunity.

Speaker 11 And also, like, again, accountability just for their words, right? The most basic criticism of the wealthy in media. Like, this is

Speaker 11 to shift just a little bit.

Speaker 11 Eric Schmidt, right, said about

Speaker 11 a year ago, shortly before the election, he said, we're never going to meet our climate goals anyway.

Speaker 11 So we might as well just burn as much carbon and use as many resources as possible to get to AGI, and then that will solve climate change for us, which is ridiculous because we don't.

Speaker 9 That's so cool. Yeah, yeah.

Speaker 11 It's like we have no responsibility for our actions until we hand them off to someone else, right? So he said this, which is ridiculous for lots of reasons. It echoes stuff that Sam Altman has said, right? It's ridiculous, among other reasons, because, like, AGI is not a thing, and also because we don't need... like, we know what we need to do to solve global warming, right? We know what we need to do to solve the climate crisis. It's just a matter of actually getting everybody to

Speaker 9 do it. But mate, Sam Altman said that they know what they need to do to get to AGI, and then said a few months later that AGI wasn't a useful term.

Speaker 9 QED.

Speaker 11 Well, no, this is exactly what I was about to say about Schmidt, right? He repeats this claim about like just pushing as hard as we can. About a month after the last time he said it, maybe two months,

Speaker 11 only a few weeks ago, he comes out in the New York Times with this op-ed saying, oh, AGI is not really a thing. It's so.
And we shouldn't care about it.

Speaker 11 It's like, buddy, you were saying just a few weeks ago that this was going to save the world from the biggest emergency of our time. And now you're saying it's not a thing.

Speaker 11 Do you think we're all stupid? Are you that stupid? What the fuck is going on?

Speaker 9 I can actually tell you. I think he thinks the media is that stupid and will write anything, will publish anything he says.

Speaker 11 I mean, I was shocked to see that the New York Times published it.

Speaker 9 I wasn't. Yeah.
I will be honest. That's the least.
No, no, no, no, no, no. We've got fucking Ezra Klein being like, AGI's... fucking Ezra.
Ezra,

Speaker 9 what a peculiar fellow. What a peculiar fellow Ezra Klein is.
What's going on there?

Speaker 9 You ever run into Mr. Klein?

Speaker 11 I've never met him directly. I know people who know him, but no,

Speaker 11 I think he just hung out with too many tech billionaires while he was living in the Bay Area.

Speaker 9 He is this fucking mind poison. These people are boring.
Yeah.

Speaker 9 These people are boring. You sit down and get a bunch of people.
They have an enormous amount of money.

Speaker 11 And like, if you're, if you're someone who's never been cool, and I've never been cool in my life.

Speaker 9 I've also never been cool. I love it.
I don't care.

Speaker 11 I know, but like, if you've never been cool, one of two things happens to you as you grow up. Either you desperately want to be cool,

Speaker 11 and that can, you know, go wrong in many different ways, like Musk

Speaker 11 and possibly like Klein. I don't know.
Maybe, like, I'm willing to believe that something else is going on with him. I don't know.
But, or you become like us and you stop giving a shit. Yeah.

Speaker 11 And you accept, oh, I'm just permanently uncool.

Speaker 9 Whatever. And that's.

Speaker 11 I'll make my way through life.

Speaker 9 And that's the thing. It's.
And these people are just disconnected from humanity.

Speaker 9 None of these people seem to have friends or loved ones Because there's just, if I did any of this whackadoo shit, I would get texts from Casey or Sarah or any number of people who love me.

Speaker 9 Just like, hey, man,

Speaker 9 you sound insane. No, Casey would definitely not be just, but hey, yeah, what the fuck are you...
You okay? That doesn't make any sense. What do you mean? What do you mean a Dyson sphere?

Speaker 9 Do you know what that is? A Dyson sphere?

Speaker 9 It's just, they don't have friends, and I don't know if they want them. I think it would require a certain level of vulnerability.

Speaker 9 Have you talked to any? Have you met up with any of them?

Speaker 11 With the billionaires? No, I tried. There's a list at the end of my book of all of the tech billionaires I tried to interview, and they all said no.

Speaker 11 The only one who I successfully interviewed was like a lower-tier billionaire guy named Jaan Tallinn, who's in deep with the effective altruists.

Speaker 11 Was he in that Skype and... Skype and Kazaa, yeah, yeah, yeah, that's right.

Speaker 9 Christ. Yeah, yeah, yeah, yeah.
I actually can't hate him for those are two pretty good ones. Yeah, exactly.
Though I will say Skype is definitely one of those inventions that...

Speaker 9 I've never seen something just stop. Yeah, no, it's gone.
Skype just got like, no, it just, no, I mean trapped in amber. It was the same product for 50 years.
Oh, yeah.

Speaker 9 And then Microsoft was just like, like, Boxer in Animal Farm, bang.

Speaker 9 To the glue factory with Skype. We fucked this up well enough.

Speaker 9 It's also sad, but this has been such a wonderful conversation.

Speaker 9 Where can people find you?

Speaker 11 Well, I'm on Blue Sky.

Speaker 11 How are you? Because I don't want to be on a platform like X that's filled with Nazis.

Speaker 11 So Blue Sky is the best place to find me. I'm adambecker.bluesky.social or bsky.social.
And

Speaker 9 you've got a book? Yeah, I've got a book.

Speaker 11 It's the main thing. Yeah.
I've got a book called More Everything Forever.

Speaker 9 I'll link to it in the notes.

Speaker 11 Yeah, it is available wherever fine books are sold.

Speaker 11 And if you liked what I had to say on this episode, I think you'll like the book.

Speaker 9 And if you like what I have to say on this show, you're a sick puppy. You know where to find me.
Thank you so much for your time as ever. I love you all.
Thank you to Bahid.

Speaker 9 Of course, to Harry here in New York for producing this episode. And of course to Mattosowski, the wonderful producer at home.
I will catch you with a monologue in a few days. Thank you so much.
I will catch you with a monologue in a few days. Thank you so much.

Speaker 9 Thank you for listening to Better Offline. The editor and composer of the Better Offline theme song is Mattosowski.

Speaker 15 You can check out more of his music and audio projects at mattosowski.com.

Speaker 9 M-A-T-T-O-S-O-W-S-K-I dot com.

Speaker 9 You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and of course my newsletter.

Speaker 9 I also really recommend you go to chat.wheresyoured.at to visit the Discord and go to r/betteroffline to check out our Reddit. Thank you so much for listening.

Speaker 4 Better Offline is a production of CoolZone Media.

Speaker 3 For more from CoolZone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Speaker 5 Live in the Bay Area long enough, and you know that this region is made up of many communities, each with its own people, stories, and local realities.

Speaker 3 I'm Erica Cruz-Guevara, host of KQED's podcast, The Bay.

Speaker 1 I sit down with reporters and the people who know this place best to connect the dots on why these stories matter to all of us.

Speaker 3 Listen to The Bay, new episodes every Monday, Wednesday, and Friday, wherever you get your podcasts.

Speaker 14 Parking shouldn't slow you down.

Speaker 9 ParkWhiz gives every driver a shortcut.

Speaker 14 Book ahead, save up to 50%, and skip the hassle of circling the block.

Speaker 9 Park smarter, park faster, ParkWhiz.

Speaker 14 Download the ParkWhiz app today and save every time you park.

Speaker 18 The secret to steady business growth: fast funding with AmeriFactors. If you're a business owner in need of capital for payroll, inventory, or expansion, you need AmeriFactors.

Speaker 18 Get the help you need, even with less than perfect credit, including bankruptcies. AmeriFactors provides ready cash flow through accounts receivable management.
Get tailored solutions.

Speaker 18 Call today for a free, no-obligation, no-impact to your credit score quote at 800-884-FUND. That's 800-884-3863.
Or visit Amerifactors.com.

Speaker 19 Every day has a to-do list, but adding belVita to yours can help you knock out the rest of it.

Speaker 19 belVita breakfast biscuits are a tasty and convenient breakfast option when paired with low-fat yogurt and fruit that provides steady energy all morning.

Speaker 19 While belVita Energy Snack Bites give you the perfect mid-morning refuel. Best part, they both taste great. So make the most out of your morning with a bite of belVita.

Speaker 19 Pick up a pack of belVita at your local store today.

Speaker 1 This is an iHeart podcast.