AGI, Immortality, & Visions of the Future with Adam Becker
Press play and read along
Transcript
Speaker 1
It's the season to come together over your holiday favorites at Starbucks. Warm up with a creamy caramel brulee latte.
Get festive with an iced gingerbread chai, or share a velvety peppermint mocha.
Speaker 1 Together is the best place to be at Starbucks.
Speaker 2 It's time for Cyber Monday, Dell Technology's biggest sale of the year. Enjoy the lowest prices of the year on select PCs like the Dell 16 Plus, featuring Intel Core Ultra processors.
Speaker 2 And with built-in advanced features it's the PC that helps you do more faster. They also have huge deals on accessories that pair perfectly with your Dell PC.
Speaker 2 Plus, earn Dell rewards and enjoy many other benefits like free shipping, price match guarantee and expert support. Shop now at dell.com slash deals.
Speaker 3 Love me some future talk. But they spook me, though, with the way they talk about the end of civilization.
Speaker 2 But we don't know whose hands are on the steering wheel. We don't know who's shaping this future, and that's why there's concern.
Speaker 3 Well, I know who's shaping it, and I'm scared to death.
Speaker 3 It's you.
Speaker 3 Let's watch Chuck be scared to death as we discuss all the ways tech will be shaping our future. Coming right up, Star Talk Special Edition.
Speaker 3 Welcome to Star Talk,
Speaker 3 your place in the universe where science and pop culture collide.
Speaker 3 Star Talk begins right now.
Speaker 3
This is Star Talk, Special Edition. I'm Neil deGrasse Tyson.
Your personal astrophysicist. And when I say special edition, it means I turn to my right and Gary O'Reilly is sitting there.
I know.
Speaker 3
Hooray. Hi.
Where'd you get your British accent?
Speaker 2 Stole it.
Speaker 2 Just a thief.
Speaker 3 Chuck. Hey.
Speaker 3
Otherwise known as Lord Nice, but we can call you Lord of Comedy. Lord of Comedy.
Can we do that? Okay, let's do that. So today, we're going to explore a vision of the future.
Speaker 3
And everybody's got their take on the future. Yeah.
Everybody's got everybody, but they all have different takes because they're coming from a different place.
Speaker 3 So you got to hear it all if you're going to assimilate it into something that you're going to take action on. True.
Speaker 3
So either make something happen or prevent something else from happening. Yeah.
So set us up, Gary. All right.
Speaker 2 So what does the future hold for us? That'll include scientists, science fiction authors, tech CEOs,
Speaker 2
and the so-called futurists. Everyone has their own idea for the future and its technologies:
Visions of AGI, nuclear fusion, the singularity, transhumanism, living on Mars. We've got to get to the moon first.
Speaker 3
Stuff we're going to go to. No, you don't go straight to Mars.
Do we? Don't get me started.
Speaker 3 As to Mars.
Speaker 3 And there you have it.
Speaker 2 And stuff we talk about all the time on StarTalk. And in the face of new technological developments, we're quickly going from science fiction to science reality.
Speaker 2 But are we headed towards utopia or are we headed towards dystopia? We'll get into: are these technologies as close as they claim?
Speaker 2 Is science fiction always a guiding light or can it be a blueprint for those in power?
Speaker 3
So on that note. So who do we have today? Adam Becker.
Adam, welcome to Star Talk. Thanks for having me.
It's good to be here. All right.
You got a PhD
Speaker 3
in computational cosmology. Yes.
Love it. That was back in 2012.
Yep. And you wrote a book in 2018 called What is Real? Yep.
That's audacious.
Speaker 3
You know, the unfinished quest for the meaning of quantum physics. Sound like there's a little bit of philosophy in there.
Yeah, yeah. Or a lot of philosophy.
Oh, there's some. Yeah.
Speaker 3
And you just came out with a new book because you've been a science-writing, science-popularizing maniac ever since your PhD. Pretty much.
Yeah, yeah.
Speaker 3
So here's the title. I love this.
More everything forever.
Speaker 3
AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. Oh, I got a better title.
We're effed.
Speaker 3 You know, you should have had that title. Yeah, you know, we considered it, but we just didn't think that it would really, you know, sell.
Speaker 3 So, my researchers told me that we've corresponded before.
Speaker 3 Because
Speaker 3
I've only just met you now. So, what? That's true.
So, what? So, they told you we corresponded, but they didn't tell you.
Speaker 3
Wow. Oh, okay.
So, what happened? So, they set me up. That's what happened.
That's so unlike us. No, what?
Speaker 3 Oh, God. Well, what happened
Speaker 3 was
Speaker 3 I was a snot-nosed kid in grad school and came to visit the museum and noticed what I thought was a mistake on one of the plaques.
Speaker 3
And so I emailed just the general astronomy department email here at the Rose Center. And then two weeks later, you wrote back.
Oh. And what did he say?
Speaker 3 Well,
Speaker 3 I'm sure I would have been polite.
Speaker 3 Was there a mistake? And did Neil school you if there wasn't?
Speaker 3 Whether there was a mistake or not was a matter of some debate.
Speaker 3
Oh, really? Yeah. So the question was the size of the universe.
Oh, it's a plaque. Oh, okay.
There it is. Yeah.
Well, that's still a debate today. Well,
Speaker 3 not in the way that he's describing it.
Speaker 3
Yeah, yeah. So I deal in observables, okay? Right.
As do many practical scientists. Right.
Speaker 3
So when you speak of what the universe is doing, you speak of what you see it's doing. Right.
And we can see galaxies whose light has been traveling for 14 billion years, 13.8 billion, right?
Speaker 3 And so we will loosely say, well, that's definitely the age of the universe. But when we speak of the size,
Speaker 3 we can be a little sloppy and say it's 13.8 billion light years to the edge of the universe.
Speaker 3
But that's not strictly accurate, because since then, the universe has been expanding.
So where's that galaxy now? It's like 45 billion light years away, but you can't see it.
Speaker 3
So you have to stick it into a model of the expansion rate of the universe and come out with a number that you cannot observe. So you were being snot-nosed about that.
But that's fine.
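Neil's point, that the 13.8-billion-year light travel time and the roughly 45-billion-light-year present-day distance are two different numbers connected by a model of cosmic expansion, can be sketched numerically. This is a rough back-of-envelope calculation, not anything computed on the show: it assumes illustrative flat Lambda-CDM parameters (H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7) and ignores radiation, integrating the comoving distance out to a very high redshift.

```python
# Back-of-envelope comoving distance in flat Lambda-CDM.
# Parameters are illustrative, not from the episode:
# H0 = 70 km/s/Mpc, Omega_m = 0.3, Omega_Lambda = 0.7; radiation ignored.
import math

C = 299_792.458          # speed of light, km/s
H0 = 70.0                # Hubble constant, km/s/Mpc
OM, OL = 0.3, 0.7        # matter and dark-energy density fractions
MPC_TO_GLY = 3.2616e-3   # 1 Mpc = 3.2616 million light years

def E(z: float) -> float:
    """Dimensionless Hubble rate H(z)/H0 for flat matter + Lambda."""
    return math.sqrt(OM * (1 + z) ** 3 + OL)

def comoving_distance_gly(z_max: float, steps: int = 100_000) -> float:
    """Trapezoid-rule integral D_C = (c/H0) * int_0^z dz'/E(z'), in Gly."""
    dz = z_max / steps
    total = 0.5 * (1 / E(0) + 1 / E(z_max))
    for i in range(1, steps):
        total += 1 / E(i * dz)
    return C / H0 * total * dz * MPC_TO_GLY

# Light emitted at very high redshift: where is the source *now*?
print(f"~{comoving_distance_gly(1000):.0f} Gly")  # in the ballpark of 45 Gly
```

So a galaxy whose light has been traveling for nearly the age of the universe is, in this model, now roughly 45 billion light years away, which is the distinction the plaque debate turned on.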
Speaker 3
Tell me I was polite because I think I'm polite. You were polite.
You were polite. We had a little back and forth.
And eventually, eventually, I think you were getting a little impatient. Oh, really?
Speaker 3 And you said,
Speaker 3 why don't you...
Speaker 3 you know, make a presentation of this to a wide audience in a way that you think.
Speaker 3 I remember that correspondence now.
Speaker 3
Okay, okay, I get it. You said that.
Okay. And then I went up and did a podcast about it and sent it to you.
Okay.
Speaker 3
All right. So in your book, Overlords, Space Empires, you just go all out.
Yeah. And you're coming to it from a physicist with a philosophy flavor.
Speaker 3 So you're going to see this in ways pure tech people wouldn't or politicians or just regular everyday folk walking up and down the street. So how did you prepare for this book?
Speaker 3 I read a lot of really bad writing by, you know, tech CEOs and people defending tech CEOs online, you know, writing long essays and books about
Speaker 3 why the future is inevitably going to be all about superintelligent AI, that kind of thing. And
Speaker 3 some of these things, you know, were ideas that I had the expertise to say, okay, no, that's not true, and here's why.
Speaker 3 But some of them were ideas about like, you know, biology or areas of physics that I don't have expertise in or, you know, other things.
Speaker 3
And so then I went and interviewed a bunch of people who have expertise in those areas. Now you're playing journalist in that capacity.
Yeah, yeah, yeah, yeah, yeah.
Speaker 3 And like read books on those subjects and, you know, pulled out what I needed and stuff like that. And then I tried to interview, you know, the tech CEOs themselves
Speaker 3
and almost all of them said no. No.
Right. Yeah.
'Cause they're looking at you as somebody who is intellectually honest, with some integrity, and they're like, we can't talk to you, okay,
Speaker 3 because they know they're full of crap. Yeah, well, I think that they just didn't see any reason to, right? You know, like, I was very honest. I said, you know, this is a book that's going to take a critical look at you. That was your problem. Well,
Speaker 3 if you had gone in and said, I'm enamored of the fact that AI is going to be such an integral part of the next chapter in human history and that you guys
Speaker 3
are the progenitors of this amazing tech. They would have been like, come on in, let's talk for a second.
I'm like, no, you don't need an appointment. Stop by any time.
That was your first mistake.
Speaker 3 Yeah, man.
Speaker 3 Well, but you know, I got journalistic integrity.
Speaker 3 All right. Well, good.
Speaker 2 Well, look, let's explore some of the scenarios
Speaker 2 that are going to be potentially the reality of us as a human race.
Speaker 3 Let's go right on down the list. Yeah, the laundry list.
Speaker 3 Okay.
Speaker 2 Well, let's look at Mars in 2050.
Speaker 3 Oh, yeah.
Speaker 2 Are we saying, maybe, maybe not? You're kidding me? Oh, that's definitely going to happen.
Speaker 3 Yeah, Elon Musk, he has said he wants to put a million people on Mars by 2050 to have a self-sustaining civilization that will survive there even if the rockets from Earth stop coming because there's been an asteroid strike or nuclear war or something here.
Speaker 3
That's definitely not happening. There are a lot of reasons why that's not happening.
Getting anyone to Mars by 2050 and bringing them back alive or just having them live there for a while.
Speaker 3 That would be incredibly difficult. The challenges just to put boots on Mars the way that we did on the moon are enormous, right?
Speaker 3 Just learning how to keep someone alive in deep space that far away from Earth for as long as it takes to get to Mars, stay on Mars, come back. We do not know how to do that yet.
Speaker 3
Chuck, that's the problem. They want to put boots on Mars instead of sneakers on Mars.
If you get a sneaker contract, they'll pay the whole way.
Speaker 3 Nike would have been there by now.
Speaker 3 They just do it.
Speaker 3
Absolutely. They just do it.
Well, please, enough
Speaker 3 about sneakers.
Speaker 3
What are the biggest challenges of going that far into space? Is it radiation or? Yeah, there's radiation. And that's not just when you're in space.
That's also when you're on Mars, right?
Speaker 3 You know, the two things that primarily protect us from radiation here on Earth are, you know, the Earth's magnetic field and the thick atmosphere that Earth has.
Speaker 3
Mars doesn't have either of those things. So when you're on the surface of Mars, you're getting pretty much the same radiation dose that you do like out in space.
And that's not good, right?
Speaker 3 You know, like the thing that I tell people is the movie The Martian is science fiction.
Speaker 3 One of the things that's science fiction about it is: if Mark Watney really, you know, had to do all the stuff that he did in that movie, he'd come home and he'd be dead of cancer in a couple of years, because he had too much radiation exposure hanging out on Mars.
Speaker 3 Hey, are you looking to learn, grow, or simply get inspired? With Masterclass, anyone can learn from the best to become their best.
Speaker 3 For as low as $10 a month, get unlimited access to over 200 classes taught by world-class business leaders, writers, chefs, and more.
Speaker 3
One of their world-class leaders, I hear, is really good at science. I think you might know who he is.
But there's a lot more they offer than science.
Speaker 3
I recently started watching a series on how to be more confident. I wasn't trying to learn anything, I was just trying to see if I'm doing everything right.
And what do you know?
Speaker 3
I'm pretty confident now that I am. Each lesson fits easily into your schedule.
Watch anytime on your phone, your laptop, or your TV, or switch to audio mode and learn on the go.
Speaker 3 Are you traveling for the holidays? Well, download the classes and watch them offline. I love the information that I get on photography and interior design.
Speaker 3 Two things that a lot of people don't know that I really enjoy. All right, enough about me.
Speaker 3
Head over to Masterclass, where they always have great offerings during the holidays, sometimes as much as 50% off. Head over to masterclass.com slash Star Talk for the current offer.
That's up to 50%
Speaker 3 off at masterclass.com slash Star Talk. Once again, masterclass.com slash Star Talk.
Speaker 4 Black Friday Savings are here at the Home Depot, which means it's time to add new cordless power to your collection.
Speaker 7 Right now, when you buy a select battery kit from one of our top brands like Ryobi or Milwaukee, you'll get a select tool from that same brand for free.
Speaker 5 Click into one of our best deals of the season and stock up on tools for all your upcoming projects.
Speaker 4 Get Black Friday savings happening now at the Home Depot.
Speaker 8 Limit one per transaction. Exclusions apply. Full eligible tool list in store and online.
Speaker 3 I'm Brian Futterman and I support Star Talk on Patreon. This is Star Talk with Neil deGrasse Tyson.
Speaker 3 What about the ISS? If Scott Kelly could stay up there for a year, one of the twins,
Speaker 3 one stayed on Earth and one,
Speaker 3 right?
Speaker 3 Why couldn't you just extend that for whatever time necessary to go to mars even if it's not to live there if it's just to go there and dig a hole and come back right so there's a couple of things first of all on the iss they're still in the earth's magnetic field they still have a bunch of the shielding oh wait and what's that called neil wait the field that goes all the way out like that oh yeah oh it's called the oh magnetic field no it's not the magnet it's the magnetosphere yeah yeah think of x-men yes the magnetosphere go ahead yeah
Speaker 3
Yeah, exactly. It is like the X-Men.
Yeah. They've still got that protection.
Also, if something goes wrong on the ISS, they'll be back on the surface of the earth in a matter of hours.
Speaker 3 Like they can just abort and come back home, right? Yeah, at most. I mean, you can,
Speaker 3
you come out, you're down within a half hour. Exactly.
Yeah, yeah, yeah, yeah. The hours are because you want to line up so you don't land in the middle of sharks.
Right, exactly.
Speaker 3 Yeah, yeah, yeah, yeah, yeah, yeah.
Speaker 3
So like you can get out easy. Easy.
And you can also have a real-time conversation with people on the ground because they're, you know, they're not that high up.
Speaker 3 And so the speed of light delay with the conversation doesn't matter. On Mars,
Speaker 3 it's a minimum of something like, I think, eight minutes each way and a maximum of something like 15 or 20 one way.
Speaker 3 And so if you send out a message, you are waiting at least 15, 20 minutes to get a message back. Maybe
Speaker 3
25. Yeah.
It better not be late. So how's it going? Over.
Yeah.
Speaker 3 Yeah. Put some content in it.
Speaker 3 Or watch out for the cliff.
Speaker 3 Exactly. Yeah.
Speaker 3 And the other thing is, like, if you have a problem on the surface of Mars and you want to come back, that's going to take you at least nine months, maybe more, if you happen to be near a launch window where, you know, the Earth and Mars are in the right
Speaker 3 positions. If you're not near a launch window, it could be well over a year before you can come home.
Speaker 3
Yeah, a full-up round-trip mission to Mars, with ideal launch and return parameters, is multiple years.
Yeah. Right.
But you can get to the moon and back in a week. Yep.
Like, within a news cycle.
Speaker 3 Yep. Right.
Speaker 2 So if we overcome the logistics of getting from Earth to Mars, if, big, big if, where are they going to live? Because they're not going to go out there and start building.
Speaker 3
Yeah.
Why don't you just build a little sort of half-underground thing that shields you from the radiation?
Speaker 3 Yeah. Well, so then you have other problems, right? You know, there's no air.
Speaker 3 You got to bring in oxygen or, you know, do some sort of reaction to make oxygen on the surface, which, yeah, you can do that, but it's not the easiest thing. Uh, you got to bring in all your food. You can't grow it there. The Martian surface, the dirt on Mars, is filled with toxic chemicals. You're gonna have a hard time getting it out of stuff because it's very fine. That's gonna be here on Earth soon, too.
Speaker 3 Let's be for real. Yeah.
Speaker 3 But we know you can grow poop potatoes on Mars.
Speaker 3 Yes, we know that.
Speaker 3 We know that, yes. Exactly. Yeah, there was a proof of concept in the movie The Martian. Yeah.
Speaker 3
No, but actually, it's funny, though. The guy who wrote the book, what's his name? Andy Weir.
Andy Weir, yeah. In fact, we had him on the show.
Yeah. He's in our archives.
Speaker 3 He has said that, you know, the discovery of these particular poisonous compounds in the Martian surface called perchlorates, he didn't know about that when he wrote the book because it wasn't widely known.
Speaker 3 And so now we know if you tried to, you know, farm poop potatoes on Mars, they'd be poisonous.
Speaker 3
Yeah. So that's the unknown unknown.
Yeah, yeah, yeah. Exactly.
And that's still out there. Okay, that's not going to work.
Speaker 2
We're not thinking that for some time. Functional immortality.
Yeah. And there's a lot of ways we can get there.
I mean,
Speaker 2 the biological immortality of growing organs in pigs and things and then transplanting is one thing, but
Speaker 2 are we getting towards singularity?
Speaker 3 Yeah. So, I mean, the biological replacing organs thing, you know,
Speaker 3
you can't replace the brain. Well, not yet.
Yeah.
Speaker 3 I mean, not yet. You'll never do it with that attitude.
Speaker 3 Sonny.
Speaker 2 That was more a British approach to things rather than an American.
Speaker 3 Someone needs a better attitude about things.
Speaker 3 But yeah, this idea of the singularity,
Speaker 3 that we're going to get to this point where technology in general and AI in particular gets faster and faster and smarter and smarter until it gains godlike powers. It's a science fiction story.
Speaker 3 But what does that have to do with living forever? Well, so the idea is that then you get this godlike AI that like grants us immortality. It has like essentially magic powers.
Speaker 3 Or, you know, it's
Speaker 3 smart enough to figure out how to make us live forever.
Speaker 3 It can solve the problem. Solve the problem of
Speaker 3
mortality. Yes.
I got you.
Speaker 2 But whose mortality problem is it going to solve?
Speaker 3
Yeah. Well, it doesn't have one.
So
Speaker 3 there's no problems at all.
Speaker 2 Are there a select few or is this open for everybody?
Speaker 3 Oh, well, we know for a fact that it's going to be for a select few and it's going to be for the people who are the gatekeepers to AI. We're already seeing that now, but go ahead.
Speaker 3
Well, but the other thing is that the whole idea is kind of, you know, nonsense to begin with. Like this idea of singularity is, well, it's based on a few really serious.
like flawed ideas.
Speaker 3 First, this idea that there is this like single thing called intelligence. You can just ramp it up or down in a computer and it can just make itself more and more intelligent.
Speaker 3
That's not really how intelligence works. Intelligence is like a really complicated thing.
It's not one number.
Speaker 3 And also the usual way of talking about the singularity, the way that, like, the main
Speaker 3 popularizer of the singularity, Ray Kurzweil, has.
Speaker 3 Who's been on the show? Who's been on the show?
Speaker 3 In our archives.
Speaker 3 These little commercials. I love that.
Speaker 3
Go on. But yeah, Kurzweil, you know, he says it's.
He came out with his second book. Yes.
The first one was The Singularity is Near. Yes.
You know, the title of his second book?
Speaker 3
Yeah, The Singularity is Nearer. Nearer.
Yeah. This is so funny.
Now, I tell people that and they don't believe me. I'm like, go look it up.
It's out.
Speaker 3 His next book coming out is going to be called Almost There. Yeah.
Speaker 3 No,
Speaker 3 he thinks that, you know, Moore's Law, this idea that computer chips are just going to get faster and more powerful and like double in speed every 18 months.
Speaker 3 He thinks that this is this, you know, specific instance of a more general like law of accelerating returns in technology and in nature.
Speaker 3
And he says he's traced it all the way back to the beginning of the universe and that it shows that a singularity is coming in like 2045 or something like that. Precisely.
Yeah, precisely.
Speaker 3
On October 12th. Oh, man.
That sounds to me like the end is near. Yeah, I know.
It is.
Speaker 3 I mean, like, the people are like, I don't need a bank account. You know, Jesus is coming back.
Speaker 3
The end is near. That's what he should have.
No, so it's funny that you say that, right?
Speaker 3 Because he's got a picture of himself and the singularity is near with one of those like poster boards on him that says the singularity is near.
Speaker 3 And there's an AI research group that's inspired by these ideas of singularity called the Machine Intelligence Research Institute, MIRI.
Speaker 3
They don't give their employees 401ks because they think the end is near. That's some cheap ass.
Yeah, I know, right?
Speaker 3
Wow. Yeah.
Well, okay, I still want to get to the immortality thing. Yeah, because you haven't addressed the fact that right now, with or without AI,
Speaker 3 there's a lot of research on not just replacing organs, though that might be in our offering,
Speaker 3
but delaying the aging functions of your cells. Totally.
That could work out to extend human lifespan or health span a certain amount of time. I like to say health span.
Yeah, yeah.
Speaker 3
And I'm really worried about the health span. That's a good word.
Yeah.
Speaker 3
What do the Galapagos tortoises, what do they live to be? Like 100? The tortoises? Yeah, like 100 or something.
200 and something.
Speaker 3 They have AI. So that's
Speaker 3 the...
Speaker 2 Remember we spoke with Venki Ramakrishnan? Yes. He was telling us about the Greenland shark.
Speaker 3 Yes.
Speaker 2 Being about 800 years.
Speaker 3
Yeah, that's crazy. Yeah.
Yeah. It is possible that some
Speaker 3
biotechnology will be developed that will radically extend human lifespan or health span. Maybe.
Yeah.
Speaker 3 But what these guys are talking about with singularity, they're generally not talking about that as the end game.
Speaker 3 The end game they have in mind is not just an extended lifespan, but real immortality by uploading their consciousness into
Speaker 3
a machine. But before we get to that, that's the immortality.
Yeah, the immortality of your mind. Yeah, yeah, yeah.
Speaker 2 Before we get to that, do we visit transhumanism? Do we get, and how are we designed?
Speaker 3 What's transhumanism?
Speaker 2 That's it, I'm just saying. Yeah.
Speaker 3 It's this idea that you can use technology to transcend like the limits of human biology and physics. Aren't we kind of already doing that, which is why we live twice as long as people 150 years ago?
Speaker 3
Yeah, no, there's definitely a lot of people. We're kind of already there. If we told them what we're doing, we know about nutrition, vitamins, they'd say, what's a vitamin? Right.
We got vaccines.
Speaker 3
We got this. What's a vaccine? Right? Aren't we already transhuman compared to the age at which nature would have us be dead? Totally.
Yeah.
Speaker 3 Like, look, I think that we have used technology to make many things much better about being alive. Like,
Speaker 3 that's just true. The question is, does that trend continue indefinitely, right? No, because RFK is going to make sure we
Speaker 3 when we lived half as long as we do.
Speaker 3 That's what's happening.
Speaker 3
Let's be clear. That's RFK Jr.
Yes.
Speaker 3 So if we go to this uploaded consciousness
Speaker 2 and that becomes reality, that just doesn't exist without a power source.
Speaker 3 Right. The thing about
Speaker 3 the singularity and like Kurzweil's idea about this accelerating returns and Moore's Law just going on forever and this power source thing, right?
Speaker 3 The idea that it would need increasing levels of power as well. And so this leads to this sort of exponential drive for materials and power.
Speaker 3
And the thing that Kurzweil forgets is exponential trends are not like laws of nature. The law of nature about exponential trends is they end.
Right. They have to end.
They have to end.
Speaker 3
And it's because of ultimately limited resources, like energy. Right.
Right. And so although
Speaker 3 when we talk about how much longer a charge in our computer lasts today compared with the early days of laptops,
Speaker 3 part of that is better batteries, but also part of of that is more efficient chips.
Speaker 3 And when we get to quantum computing, where much more computing happens in much less, with much less of an energy draw, it could be that we're coming at it from the other side where the energy needs are dropping, thereby not requiring the power supplies necessary.
Speaker 3 How long, well, in my memory, not your memory,
Speaker 3 I got a few years on you, a room this size was necessary to cool a computer, Otherwise, the computer would overheat and the computer is doing like Ford function mathematics.
Speaker 3 So the efficiencies matter. All these tubes that had to be kept cool.
Speaker 3 So it's not obvious that it's a linear exponential. Can I say that?
Speaker 3 Where the exponential is just going to hit a limit because you can come at it from other directions. That's cool.
Speaker 3 However, the other thing is, though, in nature, the exponential acceleration, it's more like the law of diminishing returns
Speaker 3 is more likely than the law of exponential acceleration. Well, no, that's actually exactly right.
Speaker 3
Yeah, because there's, you know, if you look at the history of Moore's Law, like how it is that the semiconductor industry, oh, yeah, named for Gordon Moore. Gordon Moore, co-founder of Intel.
Yes.
Speaker 3 And if you look at how Intel and other, you know, semiconductor companies actually made the chips smaller and faster over that time, you know, it's not a law of nature.
Speaker 3 It's a decision, a business decision that these companies made.
Speaker 3 And in order to keep that trend going, they had to invest more and more and more money just to keep the same sort of level of doubling, to keep that exponential trend going.
Speaker 3
And eventually, it did stop, right? Moore's Law is done. It's over.
Because you can't make silicon transistors smaller than an atom of silicon. Right.
Yeah.
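The atom-sized floor Becker describes can be made concrete with a toy calculation. These numbers are illustrative assumptions, not figures from the conversation: start at a ~10 micrometre process circa 1971, halve the feature size roughly every three years, and take a silicon atom to be about 0.2 nm across.

```python
# Why the shrinking trend had to end: halve a transistor's feature size
# every few years and you hit the scale of a silicon atom within decades.
# All numbers are rough, illustrative assumptions.
feature_nm = 10_000.0   # ~10 micrometre process, circa 1971
SILICON_ATOM_NM = 0.2   # rough diameter of a silicon atom
year = 1971
YEARS_PER_HALVING = 3

# Keep halving until the next step would go below atomic scale.
while feature_nm / 2 > SILICON_ATOM_NM:
    feature_nm /= 2
    year += YEARS_PER_HALVING

print(f"feature size ~{feature_nm:.2f} nm by ~{year}")
```

Under these toy assumptions the halving runs out in the mid-2010s, at a fraction of a nanometre, which lines up with the point that the exponential ended and the industry shifted to just adding more chips.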
Speaker 3
And what they're doing now is just adding more chips. Right.
So the more powerful computers are not smaller and denser, they're just bigger now. Yep.
Right.
Speaker 3 Yeah, and they're putting them on top of them. They're stacking them there.
Speaker 2 So the solution in the minds of these tech billionaires is to arrive at a super intelligence, to get an AGI. Yeah.
Speaker 3 Yeah.
Speaker 2 And I mean, Sam Altman is saying within the next two years.
Speaker 3 I think he said, yeah.
Speaker 2
Right. So they're looking at that as being the solution to this problem where we're saying we're not sure if it will be exponential.
We're not sure where the end point is.
Speaker 3 They're looking at it as the solution to every problem. None of the tech bros have a degree in physics the way you do.
Speaker 3 So what are you bringing to the table that they don't see? I mean, they believe that
Speaker 3 AGI, I mean, Altman has said that AGI is going to solve every problem, including like global warming, which is crazy. Why? It's crazy.
Speaker 3 Well, because if it's smarter than you and you can't solve it, why is it crazy to think it could solve it?
Speaker 3 Well, first of of all, the artificial intelligence systems that they're building now are just drawing more and more and more energy.
Speaker 3 If you did build one that could solve global warming and you turn it on and said, How do you solve global warming?
Speaker 3 I'm pretty sure the first thing it would do is say, Well, you shouldn't have built me. Yeah, it turned me off.
Speaker 3
Turn me off. Yeah, that'll help.
Yeah.
Speaker 3 That would be a good test of its own
Speaker 3 self-preservation.
Speaker 3
You are causing most of our global warming. What's the best solution? Does it turn itself off? Yeah.
Well, I mean, the other thing is that we don't need AI to tell us how to solve it.
Speaker 3 We already know what the solution is.
Speaker 3 The issue is not like that insufficient intelligence has been thrown at the problem. The issue is primarily not even a technological problem at all at this point, aside from carbon capture.
Speaker 3 The main issue is
Speaker 3
human behavior. It's greed.
Exactly.
Speaker 3
Yeah, it's greed. It's greed.
Chuck, greed is good.
Speaker 3 Oh, God.
Speaker 3 Yeah.
Speaker 3 But the, yeah, the other thing is, is just that when Altman talks about, you know, he's talked about things like, oh, AI means that in 10 years, college graduates are going to be getting, you know, cool jobs exploring the solar system, right?
Speaker 3 And I can just look at that and say, well, that's bullshit. Or, you know, he says AI is going to, you know, discover new laws of physics, and that's going to remove limitations.
Speaker 3 that we have in the world today. And like, well, discovering new laws of physics, I mean, putting aside whether or not the AI can do that, that does not always remove limitations.
Speaker 3 Sometimes new laws of physics, in fact, a lot of times
Speaker 3
create a limitation. Exactly.
Einstein with relativity discovered a limitation in the speed of light, right? Newton didn't know that there was any such limitation. Right.
Yeah.
Speaker 3 So is it possible
Speaker 3 that all of these
Speaker 3 I'll call them postulates because they're not, that they're making, right?
Speaker 3
Are just a means of hyping up what they're doing to keep the revenue stream coming to them. Like, let's be honest.
If I tell you this thing is going to solve everything, right? I give you my money.
Speaker 3
You give me, right? You give me some money. Yeah.
I mean, it's kind of like. Not just me, the government will give you money.
No, that's what I'm saying.
Speaker 3 Everybody's going to give you money. It's kind of like
Speaker 2 21st century snake oil.
Speaker 3 Is it on your telemetry? I was about to say something different. I was going to say,
Speaker 3 it's kind of the evangelical business model of a television evangelist.
Speaker 3
Like the whole idea is, hey, you got problems, and these problems can be addressed, they can be solved. All you got to do is send me some money.
That's all you got to do.
Speaker 3
And I'm going to send you this little blessing cloth. And, you know, Chuck was a preacher in his earlier career.
If he wasn't, he will be in his later years.
Speaker 3 One day I made the wrong decision.
Speaker 3 Preaching is good money.
Speaker 2 Everything we've discussed has been about being somewhere else.
Speaker 3 Yeah.
Speaker 2 About not solving problems here.
Speaker 3 Yep.
Speaker 2 So are these people looking at us and going, you are completely screwed?
Speaker 3 Yep.
Speaker 2 We're out of here and we're the ones that can afford it and we're the ones with the tech to be able to achieve it.
Speaker 3 Why are they walking away?
Speaker 2 What's in their mind? What's their thinking about turning their back and moving?
Speaker 3
Well, they think that... But they didn't grant you an interview.
Yeah, they didn't grant you an interview. So you don't really know what's on their mind.
Oh, you can read enough of their stuff to
Speaker 3 infer what's on their mind.
Speaker 3 Yeah, they've given other people interviews who are nicer to them.
Speaker 3
I couldn't have been more charming. Okay.
Apparently not. I mean, I just sent an email.
Speaker 3 A couple of them were almost willing to do it and then they changed their minds probably because, you know, they read the email again. They're like, oh, he's going to disagree with us.
Speaker 3 Why should we talk to him? But whatever. Some of them are being very cynical, like the way that you were talking about, right?
Speaker 3 And saying, oh, you know, I can just do this. And this is, you know, I can claim that all of these things are coming in the future.
Speaker 3 And that this is a way of, you know, generating more profit and getting people to give me more money. Some of them, I think, genuinely believe it.
Speaker 3 The idea that the future has to be elsewhere, I think some of it is just from this sense that they have that things are bad here on Earth and that trying to solve problems here on Earth would be complicated and messy and difficult and that somehow going to space would give them a fresh start, which is not true.
Speaker 3
You can't escape politics. You can't escape.
We're still human.
Speaker 3
Exactly. You can't escape human nature.
Exactly. So how does an AI overlord plug into these scenarios? Well, the idea is that they build a sort of AI god, and it just does whatever they want.
Speaker 3
It can be a lot of fun. This would be AGI.
Yeah. Yeah.
Artificial general intelligence. Yeah.
So when we normally think of AI, we think of a task-driven AI. It can drive a car.
Speaker 3 It can make a perfect cup of coffee.
Speaker 3
It can fly an airplane. Sure.
But AGI transcends all of that. Yeah.
It can just learn about anything and might even be
Speaker 3
achieve consciousness. Yeah, well, it would be.
Like Skynet. Yeah, absolutely.
Because it's,
Speaker 3 we are AGI.
Speaker 3 That's what we are.
Speaker 3 We are just, you know, the equivalent of what they want as AGI.
Speaker 2 It'll be AGI 0.10.
Speaker 3 Yeah. 0.10.
Speaker 3 No, 0.1. 0.1.
Speaker 3
Okay. Yeah.
So. Yeah, no,
Speaker 3 that's right. I mean, mean, no, no, but the point is, whatever we are, AGI, so how long it take you to go to school to open up all your textbooks and learn from them and get an exam? AGI will do what?
Speaker 3
Yeah, it's supposed to be able to do all that much faster. 20 minutes.
In 20 minutes. Yeah, exactly.
Speaker 3
Yeah. You'll get your college degree in 20 minutes.
Well, and the other thing is, like, those AI systems, all the things that you mentioned, right?
Speaker 3 Make a perfect cup of coffee, fly an airplane, drive a car. The AI systems we have right now can't do any of those things without human supervision, right?
Speaker 3 Even those self-driving cars that are all over the streets of San Francisco, there's actually a human remotely supervising it and intervening pretty frequently.
Speaker 3 So they tell you. Yeah.
Speaker 3
That's funny. Talking about the Waymos? Yeah, yeah, yeah, exactly.
The Waymo is Google, if I remember correctly. Yes.
Yeah, yeah, yeah, that's right.
Speaker 3 And so AGI is supposed to be able to do all of these things like independently, right? And then get smarter and smarter.
Speaker 3 And the human is not even in the equation. Human's not in the equation.
Speaker 3 And you can just make it go, you can have it do all the things that a human does, but you can like overclock it, make it go faster, think faster than a human.
Speaker 3 And then the idea is it gets smarter and smarter and achieves these like superhuman, super intelligent powers. The idea then is that for the billionaires controlling it, it's like a genie.
Speaker 3
And for the rest of us, it's an overlord. It's an overlord.
Yeah. But this is where they're so stupid.
And this is where all really rich people are stupid.
Speaker 3 They're so stupid, they have hundreds of billions of dollars and you don't.
Speaker 3 No, but that's what makes them so stupid i agree i'm serious yeah it's the fact that they have all this money and they've convinced themselves that they can transcend anything yep it's evidence of their own genius right yeah so it's it's they're hubris it's their downfall
Speaker 3
it's like the first dictator the very first dictator was a guy a little guy by the name of Julius Caesar. But Julius Caesar was the very first dictator.
You know how he became dictator?
Speaker 3 They said, all right, how about you be dictator, but you do it for a year.
Speaker 3 If you create a godlike being, whether it's artificial general intelligence or whatever,
Speaker 3 and you think that you're going to control it,
Speaker 3
it is not a god at that point. Yeah.
You are the god. And that's really what they're saying.
Yeah. They're saying we're gods.
Yeah. And the thing is, like, That's true.
Speaker 3 If they somehow did achieve it, who would be controlling it? But also, it's an incoherent idea.
Speaker 3 Like it's not, the good news for the rest of us is that it's not like something that's actually coming because it doesn't make any sense. That's funny.
Speaker 3
It's like these guys are hanging their hat and you're just like, yeah, man, it's just a dumb idea. Yeah, it is.
No, no, no. Incoherent sounds way more of a beatdown.
Speaker 3 Your idea is incoherent.
Speaker 3 Let's dunk it on somebody right there.
Speaker 2 Is this a misconception of the science, the misconception of science fiction?
Speaker 3 Yeah, I mean, I think a lot of the ideas, right, and And
Speaker 3 this goes back to like, why are they like trying to go somewhere else? I think they just get these ideas from science fiction and they just take it way too literally. They don't read it well, right?
Speaker 3 Like my favorite science fiction, the science fiction I grew up with was Star Trek, right? The thing about Star Trek is, yeah, okay, they're on the Starship Enterprise.
Speaker 3 They're out there, you know, exploring strange new worlds, new life, new civilizations, all that jazz, right?
Speaker 3
Finish it. To boldly go where no one has gone before.
Thank you.
Speaker 3
Not to go boldly. Yeah, not to go boldly.
No, we can split the infinity. Split that infinity.
Yeah, hell yeah. But
Speaker 3
the thing is, though, Star Trek was never really about space. Right.
It's about like us here and now, right? And it was always an allegory and not even a particularly veiled one, right?
Speaker 3 I seem to recall an episode where Kirk and Spock were literally punching Nazis with swastikas, right?
Speaker 3 And then there was also,
Speaker 3 and there was also the episode with like the two dudes and one of them, like the left half of his face was white and the right half was black and the other one it was so frank gorshan gorshan frank gorshen yeah
Speaker 3 he played the riddler yeah and yes in batman yeah yeah yeah
Speaker 3 exactly so
Speaker 3 it's obvious why we persecute them they're black on their right side yeah we're black on their left side yeah that was kind of kind of blunt yeah exactly but star trek is always blunt right right and that's kind of Part of the fun.
Speaker 3
Yeah, right? It is. But these guys watch Star Trek and they're like, oh, yeah, warp drive is cool.
Let's do that.
Speaker 3
You're not and missed the whole point of Star Trek and the process. Yeah.
Right. Because Star Trek is utopian ideals in a galaxy that's descending towards dystopia.
Speaker 3 And they're fighting, fighting it every step of the way. Could it be that to become a tech bro in the first place,
Speaker 3 you had to be really focused on a level to the exclusion of your social life and possibly even your personal hygiene. As a result, you achieve these
Speaker 3 places and part of your life's training did not include the emotions and feelings of others or how people think about the world or what their desires are.
Speaker 3 And you think that what you accomplish is for their best interest, even though you have no idea who they are. Oh, yeah.
Speaker 2 Is that a fair, is that a...
Speaker 3 I think that's right, right? Like,
Speaker 3 the way I like to talk about it is, you know, for someone who claims to care about humanity so much, Elon Musk doesn't really seem to care very much about humans.
Speaker 3 Right.
Speaker 3
Yeah. And isn't he the guy who said empathy is a bad thing? Right.
Yeah. And, and, but he's also said, I'm going to save humanity.
Yeah, he did.
Speaker 3
And he, but he's also said, I'm going to save humanity by taking us to Mars. I'm like, buddy, first of all, no.
And second, like, I don't think that you actually care that much about other humans.
Speaker 3 And I think that what you said is exactly right.
Speaker 3 Except you also have to add in, they think that the fact that they succeeded in business, which a lot of that's just luck, is proof of government contracts. Right, in government contracts.
Speaker 3
They've gone through subsidies, yeah. Yeah.
For the car business as well as the rocket business. Absolutely.
But like Musk and others, right, like
Speaker 3 Altman and Andreessen and these other people, they all think that this is proof that they are like the smartest people who've ever lived, so the richest people who've ever lived.
Speaker 3
And that's just not how anything works. And just remind me, Altman is OpenAI.
Yeah, Sam Altman is CEO of OpenAI. Let's go down that list.
Yeah, absolutely.
Speaker 3 And Mark Andreessen is the head of Andreessen Horrow, which is the biggest tech venture capital firm. Oh, so you need that to get the confusion.
Speaker 3
So OpenAI is what gives us chat GGT. Got it, got it.
That's right. And of course, we all know Elon.
Speaker 3
Is Branson a player? Branson is less of a player in Silicon Valley. What role in the tech sector does Bezos play? I mean, you know, he's...
He's got his own rockets. Yeah, he's got his own rockets.
Speaker 3 He's also like owns most of the infrastructure of the World Wide Web. This, I think,
Speaker 3 most people don't know what stands for
Speaker 3 Amazon Web Services. Basically, most of the cloud, most of the cloud, most of the actual computers that compose the cloud belong to Jeff Bezos.
Speaker 3 So Amazon.com is like window dressing on a whole other operation
Speaker 3
that matters to him. The real operation.
Yeah, exactly. That's why he doesn't have to make a buck selling you a, he's going to sell your book
Speaker 3 for 80% off.
Speaker 3 Well, after this, I'll be lucky if he sells my book at all. No, no, we're going good here.
Speaker 3 Let me remind you.
Speaker 3 Let me get the title of the book back in here.
Speaker 3 Give us the title again. Yeah, it's More Everything Forever.
Speaker 3
AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate. If you're there, it is.
That's what we're talking about. Yeah.
Man, okay.
Speaker 4 Black Friday savings are here at the Home Depot, which means it's time to add new cordless power to your collection.
Speaker 7 Right now, when you buy a select battery kit from one of our top brands, like Ryobi or Milwaukee, you'll get a select tool from that same brand for free.
Speaker 4 Click into one of our best deals of the season and stock up on tools for all your upcoming projects. Get Black Friday Savings happening now at the Home Depot.
Speaker 8 Limit one per transaction, exclusion supply, full, eligible tool list in store and online.
Speaker 3 I'm a high note hitting songbird, but I'm also a bird-watching backpacker.
Speaker 3 Wood thrush, three o'clock.
Speaker 9 Walmart has a wellness side too, with tons of things I need to feel good, from electrolytes to help keep me hydrated to soothing cough drops for after every show.
Speaker 3 Oh man, how about waterproof boots? Size 10?
Speaker 9 They've got half a billion things online, on the app, and in store.
Speaker 3 Really? Who knew?
Speaker 3 Okay, was that you or the birds?
Speaker 1 Check out the wellness side of Walmart today.
Speaker 3 Hey, Ryan Reynolds here, wishing you a very happy half-off holiday because right now, Mint Mobile is offering you the gift of 50% off unlimited. To be clear, that's half price, not half the service.
Speaker 3
Mint is still premium unlimited wireless for a great price. So that means a half day.
Yeah? Give it a try at mintmobile.com/slash switch.
Speaker 10
Upfront payment of $45 for a three-month plan, equivalent to $15 per month required. New New customer offer for first three months only.
Speeds low under 55 gigabytes of networks busy.
Speaker 10 Taxes and fees extra.
Speaker 3 See Mintmobile.com.
Speaker 2 So when we think about the science fiction, and I think Neil's point about the isolation of these people growing up, if we think about the science fiction and you think about certain fights with Star Trek, then maybe Matrix or Blade Runner, and you go through the laundry list,
Speaker 2 how have they co-opted these and kind of bolted this to that and that to this?
Speaker 3
You think they're upwards by sci-fi? Oh, yeah, totally. I mean, Elon Musk tweeted that science fiction shouldn't remain fiction forever.
Okay, that's fair. I'm fine there.
Speaker 3 Yeah, I sort of understand what he means, but which science fiction, right? Like, Blade Runner's a dystopia.
Speaker 3 Right.
Speaker 3 And then he comes out and says that the cyber truck, that ugly piece of crap, looks like something that Blade Runner would drive, which Blade Runner is not the name of any character in Blade Runner, but we can put that aside.
Speaker 3 There's a profession in Blade Runner. Yes, exactly.
Speaker 3 That's great.
Speaker 2 But there's a number of dystopians. Oh, yeah.
Speaker 3 But they're all dystopiophopic.
Speaker 3
That's why they're one, isn't it? They're all on this topic. Some of them are.
I mean, right? There's aspirational science fiction like Star Trek, right?
Speaker 3 But even that is, like I said,
Speaker 3
they're the bright spot. Yeah, they're not.
The Federation is the bright spot. The Federation is definitely the bright spot.
Yeah, no, no, no, that's true.
Speaker 3 But like, there's a tweet that lives rent-free in my head
Speaker 3
about a thing called the Torment Nexus. This is actually in my book at the very beginning.
The Torment Nexus. The Torment Nexus.
Speaker 3
I'm afraid to ask what this is. We're going there.
We're going there. We're going there.
We're going there. Let's do it.
So the tweet goes like this.
Speaker 3 Science fiction author, in my book, I created the Torment Nexus as a cautionary tale.
Speaker 3 Tech billionaire, at long last, we've created the Torment Nexus from classic sci-fi novel, Don't Create the Torment Nexus.
Speaker 3 this is what these guys are doing right right like skynet yeah like right there's skynet but also like if you go back and look at like the classic cyberpunk novels by somebody like say william gibson right who i think is a great novelist a lot of those novels like uh uh neuromancer are about like the concentration of wealth and power and the way that the wealthy can and will use technology to remove themselves from the rest of us and accumulate wealth and power while insulating themselves from the consequences.
Speaker 3 And that's exactly what we see happening. And so when they say that they want to make science fiction into reality, we need to ask, okay, which ones?
Speaker 3 Because if you want to make neuromance a reality, man, that's bad news for everyone who's not you.
Speaker 2 So how much of science fiction has always been a silent alarm call?
Speaker 3 A silent warning? Yeah. I mean, some...
Speaker 2 We go back to Fritz Lang and Metropolis back in 1927.
Speaker 3
Yeah, absolutely. Yeah.
Metropolis is very much like a movie about the need to keep emotional intelligence with pace with technology, right? I didn't get that at it, but I believe you.
Speaker 3
I'm just saying that that's a deep read. To me, it was just a weird alien bottom.
I mean, it's a weird movie. Yeah, yeah, yeah.
Speaker 3 Like at the end, they say the heart and the hand must work together or something like that, right?
Speaker 3 And so that's how I read that, at least.
Speaker 3 Like, I think that science fiction, a lot of it, has always been like about looking at the world as it is now and saying, okay, if we push that a little bit, if we want to take a look at this situation in a different context and understand it in a different way by removing it from all of the sort of social and cultural connotations that a particular thing has here and now, we put it somewhere else and maybe we can see it more clearly, right?
Speaker 3 It's what Star Trek does. It's what like, oh, my favorite science fiction author, Ursula Le Guin, right?
Speaker 3 She did this over and over again, looking at, you know, poverty, inequality, capitalism, gender, you name it, right?
Speaker 3 So Rod Sterling, back in 1959,
Speaker 3
he's interviewed about this new show called Twilight Zone. Yeah.
And he says it. He said, look, there's stories I'm telling that you could not tell in just a dramatic way.
It has to be set.
Speaker 3 at a time and a place that is not you and now.
Speaker 3 Otherwise, I couldn't get away with these stories.
Speaker 3 And only then do people say, wait, might that be me?
Speaker 3 But if it's blatant and in your face, you reject it. And he said, in the end, we're just trying to sell soap.
Speaker 3 He understood the situation. But tell an entertaining story, but said it in another place.
Speaker 3 I just looked up when Skynet achieved consciousness.
Speaker 3
It was 2.14 a.m. Eastern Time, August 29th, 1997.
Oh, wow. Oh, wow.
Because the movie was 1984 or the first 84, I think, was 84. Yeah, 84.
Yeah, so that was that was only 13 years in the future. Yeah.
Speaker 3
I hate when they do that. Go far enough where.
Oh, I have a whole list.
Speaker 2 But that was the whole thing with Star Trek. It was set something like 200 years ago.
Speaker 3
Yeah, that was safely in the future. I got a whole list.
I'm saying, go safely into the future. Do you know Soil and Green was 2022? Oh, God.
Well, that's why I'm eating people now.
Speaker 3 It's people.
Speaker 3 Everybody doesn't realize. That's what the pandemic was about, guys.
Speaker 3 Because it happened. It's like,
Speaker 3 enjoy that burger.
Speaker 3 So what else is in your laundry list here? Basically, I think that what they want, this vision that they have, is this idea of going to space and living forever. Right.
Speaker 3 And so a lot of it is really about space colonization, going out and expanding to take over the universe. That's like, because they don't want to just stop with Mars.
Speaker 3 They want to put Dyson spheres around every single star in the observable universe and like collect all of that energy. And that's,
Speaker 3
that's not going to happen, man. There'll be a Kardashi scale five, I think.
Yeah. Where you control all the energy output of all stars in the known universe.
Yeah. But doesn't the Borg have some
Speaker 3 similar energy? Yeah, yeah. Does the Borg want to assimilate everything? And what was it, the phrase? They want to take your cultural and political distinctiveness and make it part of our own.
Speaker 3
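For reference, the scale the banter is riffing on can be put in numbers with Carl Sagan's interpolation formula for the Kardashev scale, K = (log10 P − 6)/10, where P is the power controlled in watts. A rough sketch; the star count is an order-of-magnitude assumption, not a measured figure:

```python
import math

def kardashev(power_watts: float) -> float:
    """Carl Sagan's continuous interpolation: K = (log10(P) - 6) / 10."""
    return (math.log10(power_watts) - 6) / 10

SOLAR_LUMINOSITY = 3.8e26   # watts; the output of one Sun
STARS_OBSERVABLE = 1e21     # order-of-magnitude star count (assumption)

print(f"Humanity today (~2e13 W): Type {kardashev(2e13):.2f}")
print(f"All sunlight hitting Earth: Type {kardashev(1.7e17):.2f}")
print(f"Every star, Dyson-sphered: Type {kardashev(STARS_OBSERVABLE * SOLAR_LUMINOSITY):.2f}")
```

By this formula, Dyson-sphering every star in the observable universe lands near Type 4, a bit shy of the "scale five" quip, and humanity today sits around Type 0.7.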
Yeah. Yeah.
Speaking as a scientist, I kind of like what science brings society.
Speaker 3 And shouldn't that be enough?
Speaker 3 Why does everyone go to science fiction?
Speaker 3 Is there some morbid fascination with science gone bad? And
Speaker 3 isn't that a problem with us, not with the storytellers themselves? I mean, I don't think that science fiction is in and of itself the problem, right? Like I'm. That's what I'm getting at.
Speaker 3
Yeah, exactly. Yeah.
No, I'm a huge sci-fi fan. I'm also a scientist by training, at least.
Speaker 3 The reason people find science fiction more compelling than science has a lot to do with the fact that it's not really about the future, that it's, you know, sort of these interesting what-if scenarios that reflect on where we are right now, right?
Speaker 3 If you tried to make a very realistic, you know, TV show about what life could actually be like a hundred years from now and made it as realistic as possible, people probably wouldn't watch it because it would involve so much slang that doesn't make any sense to us right now and like shifts in language.
Speaker 3
OMG. Right.
yeah.
Speaker 3
Right. WTF.
Yeah. But this is like like, and so many little things like that, right? It's not meant to be a realistic depiction of the future.
Speaker 3
So, I mean, part of me wants to say, you know, the problem isn't science fiction. The problem isn't science.
The problem is like critical reading comprehension skills.
Speaker 3 And
Speaker 3
money. Yeah.
Exactly.
Speaker 3 So the accumulation of wealth to a very few is always going to be a very bad thing for any society. But right now, unfortunately, there's a global society of billionaires that has popped up.
Speaker 3 And they're
Speaker 3 Marxist. What's that? When did you become Marxist? I'm not a Marxist, believe me.
Speaker 3
I'm pretty cool with capitalism. I'm just all about guardrails.
And I also believe that $2 billion is all you get to have. Okay.
Speaker 3 Now that was pretty Marxist.
Speaker 3 Yeah, I think that's in Das Kapital.
Speaker 3 We're going to have $2 billion.
Speaker 3 Karl Marx said,
Speaker 3 $2 billion, no more. $2 billion, no more.
Speaker 3 So, no.
Speaker 2 So, basically, you're saying not enough pure science education, but a hell of a lot of money is a bad combination.
Speaker 3
That's a bad combination. Yeah.
No, and I agree completely. So, give us the takeaway thesis of your book.
Speaker 3 Oh, I mean, I actually do end the book saying that we should, you know, limit the amount of money that people should be able to have.
Speaker 3 Okay.
Speaker 2 Did Karl Marx write the playbook for us?
Speaker 3
I don't think you meant that. I don't think you meant that.
What you mean is
Speaker 3
we should limit how much power the people who have money have. Yeah, absolutely.
The problem with that is the more money you have, money is always, and they call it soft power. It's not.
Speaker 3 It is straight hard power because you are able to influence every corridor of power that there is when you have enough money. Oh, no, this is great, man.
Speaker 3 You should follow me around and just say the stuff that I need to say, but better.
Speaker 3 Do, but what you need to do is, this is where progressive tax is a good thing. And we found that out under FDR, back in the day, where basically you got to a certain amount of money and they were like, yeah, we're going to take 90% of that. Okay, and we're going to take it and we're going to do stuff, because you wouldn't have been able to get that much money without all the things that we want to now support with the money that we helped you make. But you get to keep, up until that point, pretty much all your money. But when you get to this level, you're going to give us that money. Give me, give me that money.
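Chuck's FDR point is about marginal rates: only the slice of income above each threshold gets taxed at that bracket's rate, so the 90% never touches anyone's first dollars. A minimal sketch; only the 90% top rate echoes the historical mid-century US figure, and the dollar thresholds below are invented purely for illustration:

```python
# Hypothetical brackets: (lower bound in dollars, marginal rate). Only the 90%
# top rate echoes the real mid-century US figure; the thresholds are invented.
BRACKETS = [(0, 0.10), (50_000, 0.30), (1_000_000, 0.50), (10_000_000, 0.90)]

def progressive_tax(income: float) -> float:
    """Tax only the slice of income that falls inside each bracket."""
    upper_bounds = [lo for lo, _ in BRACKETS[1:]] + [float("inf")]
    tax = 0.0
    for (lo, rate), hi in zip(BRACKETS, upper_bounds):
        if income > lo:
            tax += (min(income, hi) - lo) * rate
    return tax

print(progressive_tax(60_000))       # about $8,000: most income taxed at the low rates
print(progressive_tax(20_000_000))   # only the slice above $10M is taxed at 90%
```

The design point is that a high top rate only bites on the money above the last threshold, which is why "you get to keep, up until that point, pretty much all your money."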
Speaker 3 But yeah, no, I think that
Speaker 3 we as a society, because it's not just the billionaires, it's also that we as a society buy into this idea, right, that the ultra-wealthy know what they're talking about when it comes to something other than the ins and outs of having money.
Speaker 3
Dumbasses. Yeah, exactly. No, you know what they're good at? They're good at rigging the game for them to make more money. Yep.
That's what they're good at.
Speaker 3
And everybody thinks they're going to be rich one day. Yep.
Okay. And so I did this, the calculation for a billionaire, what it takes just to make a billion dollars.
Speaker 3 And I think I used $500 an hour, which is a very good amount of money.
Speaker 3
$500 an hour, 24 hours a day, seven days a week. And I think it came out to, like, you'd have to work 2,300 years.
It's ridiculous. It's a ridiculous amount of money.
Speaker 3
That's what I'm saying. Well, you want to know what makes it even more ridiculous.
Turn it around. You have a billion dollars.
You want to get rid of it. You spend $500 an hour.
Speaker 3
And it takes you many times longer than a human life. What do you need more than a billion dollars for? That's my point.
Yeah. So you get $2 billion.
And that's it.
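A quick back-of-the-envelope check of that calculation, taking only the $500-an-hour rate from the conversation; the two work schedules are the assumptions:

```python
TARGET = 1_000_000_000   # one billion dollars
RATE = 500               # dollars per hour; the rate quoted on the show

hours = TARGET / RATE                  # 2,000,000 hours of paid work
years_nonstop = hours / (24 * 365)     # paid every single hour, never sleeping
years_full_time = hours / (40 * 52)    # a standard 40-hour work week

print(f"{hours:,.0f} hours")                          # 2,000,000 hours
print(f"~{years_nonstop:,.0f} years working 24/7")    # ~228 years
print(f"~{years_full_time:,.0f} years at 40 h/week")  # ~962 years
```

Either way the point stands: at a very good hourly wage, a billion dollars takes multiple human lifetimes of continuous work.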
Speaker 3
I did that calculation with Elon Musk's wealth. Oh, did you? Yeah.
You turn all of his money into $100 bills
Speaker 3 and lay them end to end.
Speaker 3 And you ask, how far does it go? Around the Earth. We can go several times around the Earth with hundred-dollar bills. Okay. And then there's some leftover money. Tape them together into a ribbon, and you'll have enough left over to go to the Moon and back. Yeah. See, that's ridiculous.
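The ribbon image can be sketched the same way. The bill length is the real dimension of US currency; the net-worth figure is an assumed placeholder, since the real number moves daily:

```python
BILL_LEN_KM = 15.5956e-5   # a US $100 bill is about 15.6 cm long
EARTH_CIRC_KM = 40_075     # equatorial circumference of Earth
MOON_DIST_KM = 384_400     # average Earth-Moon distance

net_worth = 400e9          # assumed placeholder in dollars; the real figure moves daily

bills = net_worth / 100                 # number of $100 bills
ribbon_km = bills * BILL_LEN_KM         # end-to-end length of the ribbon
laps, leftover_km = divmod(ribbon_km, EARTH_CIRC_KM)

print(f"{ribbon_km:,.0f} km of bills: {laps:.0f} laps around Earth, "
      f"{leftover_km:,.0f} km left over (the Moon is {MOON_DIST_KM:,} km away)")
```

With this placeholder the ribbon makes about 15 laps around the Earth; whether the leftover reaches the Moon and back depends on which day's wealth figure you plug in.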
Speaker 3 That's freaking ridiculous. Yeah. So now we feel like we have to protect these people. This is what I don't understand. The issue is the outsized power they have over laws, legislation, politicians. Absolutely. And the rest of us. I don't mind rich people, provided they're not trying to control my life. Yep. Okay.
Speaker 3
Yeah. No, I agree.
I'm with you. We got to land this plane.
Okay.
Speaker 3 I think I figured out what's going on here. A lot of smart people, a lot of wealthy people, a lot of people with influence trying to figure out what kind of future we will have,
Speaker 3 what kind of future we should have. And we all know that future will pivot on advances in science and technology, as civilization has always pivoted on science and technology.
Speaker 3 And so, but we're at a point now, and maybe we've been at this point before.
Speaker 3 So, is this really any different? I don't know. But it seems like we have the future in the palm of our hands.
Speaker 3 And in the end, it comes down to not how advanced the science is, not how clever anybody is, not how it's not related to any of that.
Speaker 3 It has to do with how wise we are in the face of our own creations.
Speaker 3 And
Speaker 3 wisdom, I think, is an undervalued
Speaker 3 factor
Speaker 3 in all the brilliance people are exhibiting in their creations, in their discoveries, in the forces operating on what the future of civilization will be.
Speaker 3 So
Speaker 3 if I may appeal
Speaker 3 to
Speaker 3 what it is to not only think about how great your inventions and discoveries are, but think about how you might
Speaker 3
harness it as you harness a horse. An unharnessed horse runs wild.
You don't know what it's going to do next. A harnessed horse is still a horse,
Speaker 3 but it gets to do exactly what you need it to do and what you want it to do.
Speaker 3 And that is a dose of wisdom,
Speaker 3 coupled with our ingenuity.
Speaker 3 I'd like to think there's more of that in our future. Maybe we'll avoid the disasters that the science fiction writers always portray.
Speaker 3 And that is a cosmic perspective.
Speaker 3
Dude, thank you for being on Star Talk. Thanks for having me.
This has been a lot of fun. Good luck with the book.
Thank you. Give me the title of the book again.
Speaker 3
More Everything Forever: AI Overlords, Space Empires, and Silicon Valley's Crusade to Control the Fate of Humanity. You got it right.
He did it right. Well, he did right.
Speaker 3
All right. Back to Berkeley, you go.
Yes. And keep us thinking about the future.
I will. There's not enough of that going on.
Thank you. Yeah.
I'd be happy to come back anytime. All right.
Speaker 3 This has been Star Talk Special Edition.
Speaker 3 We put together another one for you peeps.
Speaker 2 Lane Un's worth as well. Take a large slice of credit.
Speaker 3
There you go. All right, Chuck.
Always a pleasure. We're all good here.
Neil deGrasse Tyson for Star Talk Special Edition.
Speaker 3 Bidding you to keep looking up.