Elon Musk: Grokipedia, OpenAI Betrayal, His Future at Tesla, and the Next Big Thing

1h 33m

(0:00) Disgraziad Corner: The most disgraceful things of the week!

(3:10) Elon on X's new algorithm, why there has been so much Sydney Sweeney content lately

(11:35) Creating Grokipedia: Wikipedia's failures, the future of information on the internet, confirmation bias

(24:52) Three years of X: Looking back on the Twitter acquisition and how it changed free speech on the internet

(42:49) Tesla vote on Elon's compensation, would he leave Tesla if it doesn't pass?

(47:40) OpenAI lawsuit, for-profit conversion, how much Elon should own, OpenAI's great irony

(56:24) AI power efficiency, Robotaxis, future of self-driving

(1:09:34) Bill Gates flips on climate change, solar, energy production

Follow Elon:

https://x.com/elonmusk

Follow the besties:

https://x.com/chamath

https://x.com/Jason

https://x.com/DavidSacks

https://x.com/friedberg

Follow on X:

https://x.com/theallinpod

Follow on Instagram:

https://www.instagram.com/theallinpod

Follow on TikTok:

https://www.tiktok.com/@theallinpod

Follow on LinkedIn:

https://www.linkedin.com/company/allinpod

Intro Music Credit:

https://rb.gy/tppkzl

https://x.com/yung_spielburg

Intro Video Credit:

https://x.com/TheZachEffect


Transcript

Speaker 1 Let's get started. You know, we wanted to try something new this week.

Speaker 1 Every week, you know, I get a little upset. Things perturb me, Sax.

Speaker 1 And when it does, I just yell and scream, Disgrazia. And so I bought the domain name disgraziad.com for no reason other than my own amusement.
But you know what?

Speaker 1 I'm not alone in my absolute disgust at what's going on in the world. So this week, we're going to bring out a new feature here on the All-In Podcast: Disgraziad Corner.

Speaker 2 Disgraziad.

Speaker 1 He was the best guy around.

Speaker 2 What about the people he murdered? What murder? You can act like a man! What about you? He's just kidding.

Speaker 2 What's wrong with you?

Speaker 3 Your hair was in the toilet water.

Speaker 2 Disgusting. I ought to suffocate you, you little.

Speaker 2 It's a fing disgrace.

Speaker 4 Disgraziad.

Speaker 1 Disgraziad.

Speaker 4 This is fantastic.

Speaker 2 This is our new feature.

Speaker 1 Chamath, you look like you're ready to go.

Speaker 2 Why don't you

Speaker 1 tell everybody who gets your Disgraziad this week?

Speaker 4 Wait, we all had to come with a Disgraziad?

Speaker 2 You really were — the first time you started, you missed the memo.
All right, fine. Enough.
I got one. I got one.
Okay, all right.

Speaker 1 Just calm down.

Speaker 4 My Disgraziad Corner goes to Jason Calacanis.

Speaker 2 Okay, here we go. Come on, man.
You can't.

Speaker 4 And Pete Buttigieg, where, in the first 30 seconds of the interview, they compared virtue-signaling points about how each of them worked at various moments at Amnesty International. Absolutely.

Speaker 4 Literally effecting zero change, making no progress in the world, but collecting a badge that they use to hold over other people.

Speaker 2 Disgrazia. He wrote a lot of letters.
We wrote a lot of letters. Disgraziad.

Speaker 1 Which is good. That means it's like a good one because

Speaker 1 behind the scenes. Disgraziad.

Speaker 4 Jason Calacanis and Pete Buttigieg. Disgraziad.
Great.
Great.

Speaker 1 I'm glad that I get the first one. And you can imagine what's coming next week for you.

Speaker 1 I saw the Sydney Sweeney dress today trending on social.

Speaker 1 Disgraziad.

Speaker 2 It's too much. What?

Speaker 4 It's too much. What is it? I didn't even know what this is.
You didn't see it.

Speaker 2 Bring it up in the picture. Okay.

Speaker 3 Bring it up. It's a little floppy.

Speaker 2 How is this disgraceful? Disgusting. What are you talking about?

Speaker 1 Too much. It's disgraceful.
A little bit of, like — look at this. Oh, my God.
Too much. It's elegant.
Too much. In my day, Sacks, a little cleavage, maybe,

Speaker 1 perhaps in the 90s or 2000s, some side view. This is too much.

Speaker 2 Great highbrow subject matter.

Speaker 2 We were discussing their own politics and Sydney Sweeney's dress.

Speaker 2 I don't know.

Speaker 2 Hi, Dad. Put away the phone, Jason.

Speaker 2 Let your winners ride.

Speaker 1 Rain Man David Sacks.

Speaker 1 we open sourced it to the fans and they've just gone crazy with it. Love you, besties.

Speaker 1 Queen of Quinoa.

Speaker 1 What's going on with the algorithm? I'm getting Sydney Sweeney's dress all day. And last week, Sax —

Speaker 2 15 times. And then poor Sax — he got invited to SlutCon for two weeks straight on the algorithm. I'd say the algorithm has become —

Speaker 3 If you demonstrate,

Speaker 2 you can't even tell if that's a joke or a real thing. It's a real thing in San Francisco.
It's all too real. Oh, it's actually real.

Speaker 2 Yeah. But SlutCon is

Speaker 2 for real. But I've noticed, yeah,

Speaker 2 if you demonstrate interest

Speaker 3 in anything on X now, if you click on it, God forbid you like something,

Speaker 2 man. The algorithm just jumps on it.

Speaker 3 It will give you more of that.

Speaker 2 It will give you a lot more.

Speaker 2 Yes, yes. So we did have an issue.

Speaker 2 We still have somewhat of an issue where

Speaker 2 there was an important bug that was figured out and solved over the weekend, which caused

Speaker 2 in-network posts not to be shown.

Speaker 2 So basically, if you followed someone, you wouldn't see their posts.

Speaker 2 Obviously, a big bug.

Speaker 2 But we fixed that bug.

Speaker 2 Then

Speaker 2 the algorithm was not properly taking into account

Speaker 2 if you just

Speaker 2 dwelt on something.

Speaker 2 But

Speaker 2 if you interacted with it, it would go hog wild. As David said, if you were to favorite, reply, or engage with it in some way,

Speaker 2 it is going to get you a torrent of that same thing. Oh, Zach.

Speaker 2 So maybe you

Speaker 2 what was your interaction?

Speaker 2 Did you bookmark SlutCon? I think you bookmarked it. Here's what I thought was good about it, though, is all of a sudden —

Speaker 2 If you had just responded to Sydney Sweeney's boobs,

Speaker 2 you would get a lot more of it.

Speaker 3 Yeah, that thing.

Speaker 3 But what I thought was good about it was that you would see who else had a take on the same subject matter. And that actually has been a useful part of it.

Speaker 3 So you do get more of, like, a 360 view on whatever it is that you've shown interest in.

Speaker 2 Yeah, yeah,

Speaker 2 Yeah, it was just going too far, obviously. It was overcorrecting. It had too much gain.

Speaker 2 It just turned up the gain way too high, so any interaction would then get you a torrent of that. It's like, oh, you had a taste of it.
We're going to give you three helpings. Okay.

Speaker 2 We're going to force-feed you with it.

Speaker 2 We're going to give you the food funnel. And that's all being done.

Speaker 1 I assume it's all being done with Grok now. So it's not like the old hard-coded algorithm, or is it using Grok?

Speaker 2 Well, what's happening is

Speaker 2 we're gradually deleting the legacy Twitter heuristics. Now, the problem is, as you delete these heuristics, it turns out one bug was covering for the other bug.

Speaker 2 And so when you delete one side of the bug — you know, it's like that meme with the internet, where there's this very complicated machine and a tiny little wooden stick that's keeping it going. Which, I guess, Amazon, AWS East or whatever, had something like that.

Speaker 2 You know,

Speaker 2 when somebody pulled out the little stick, it took out half of Earth, you know?

Speaker 1 It would be great if it showed like one person you follow, and then blended the old style — which was just reverse chronological of your friends, the original version — with this new version.

Speaker 1 So you get like a little bit of both.

Speaker 2 Well, everyone still has the Following tab. Yeah.

Speaker 2 Now, something we're going to be adding is the ability to have a curated Following tab. Because the problem is, if you follow some people and they're maybe a little more prolific —

Speaker 2 you know, some people say a lot more than others —

Speaker 2 that makes the Following tab hard to use.

Speaker 2 So we're going to add an option where you can have the Following tab be curated. So

Speaker 2 Grok will say: what are the most interesting things posted by your friends? And we'll show you that in the Following tab. It will also give you the option of literally seeing everything.

Speaker 2 But I think having that option will make the following tab much more useful.

Speaker 2 So it'll be a curated list of people you follow, like ideally the most interesting stuff that they've said, which is kind of what you want to look at.
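The curation scheme being described — surface the highest-interest posts from people you follow, without letting one prolific account drown out the rest, while keeping the see-everything option — can be sketched roughly like this. Everything here is illustrative: the `interest` score stands in for whatever relevance score a model such as Grok might assign, and the per-author cap is an assumed detail, not anything stated in the conversation.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    interest: float  # hypothetical relevance score a model would assign

def curated_following(posts, limit, per_author_cap=3, show_everything=False):
    """Curate a Following feed: keep the highest-interest posts, capping how
    many any single author contributes; or return the full firehose."""
    if show_everything:
        return list(posts)  # the "literally see everything" option
    taken: dict[str, int] = {}
    curated: list[Post] = []
    for post in sorted(posts, key=lambda p: p.interest, reverse=True):
        if taken.get(post.author, 0) < per_author_cap:
            curated.append(post)
            taken[post.author] = taken.get(post.author, 0) + 1
        if len(curated) == limit:
            break
    return curated
```

The cap is what keeps a quiet account's occasional post visible next to a prolific account's stream.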

Speaker 2 And then

Speaker 2 we've mostly fixed the bug, which would

Speaker 2 give you way too much of something if you interacted with a particular subject matter.

Speaker 2 And then

Speaker 2 the

Speaker 2 really big change, which is where Grok literally reads everything that's posted to the platform.

Speaker 2 There's about 100 million posts per day. So it's 100 million pieces of content per day.

Speaker 2 And I think that's actually maybe just in English. It goes beyond that if you include posts outside of English.

Speaker 2 So we're going to start off with Grok reading

Speaker 2 what it thinks are the top 10 million of the 100 million. And it will actually read them and understand them and categorize them and match them to users.

Speaker 2 It's like this is not a job humans could ever do.

Speaker 2 And then once that is scaling reasonably well, we'll add the entire 100 million a day.

Speaker 2 So it's literally going to read through 100 million things

Speaker 2 and show you what it thinks, out of 100 million posts per day, are the most interesting posts to you.
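The pipeline being described — read each post, categorize it, then match posts to users — can be illustrated with a toy sketch. The keyword-based `categorize` below is purely a stand-in for the model-based understanding Elon describes; the category names and keywords are invented for illustration.

```python
# Toy stand-in for the described pipeline: "read" each post, assign it
# categories, then rank posts against a user's interest profile.
# A real system would use an LLM to understand posts, not keyword matching.

CATEGORIES = {
    "rockets": {"launch", "booster", "orbit"},
    "ai":      {"model", "training", "gpu"},
}

def categorize(text: str) -> set[str]:
    """Assign every category whose keywords overlap the post's words."""
    words = set(text.lower().split())
    return {cat for cat, kws in CATEGORIES.items() if words & kws}

def rank_for_user(posts: list[str], interests: set[str]) -> list[str]:
    """Keep posts matching the user's interests, best matches first."""
    scored = [(len(categorize(p) & interests), p) for p in posts]
    return [p for score, p in sorted(scored, key=lambda x: -x[0]) if score > 0]
```

The point of the sketch is the shape of the job — categorize once per post, match per user — which is why it parallelizes across the fleet of GPUs mentioned next.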

Speaker 1 How much of Colossus will that take?

Speaker 2 A lot of work.

Speaker 1 Yeah, that's like, is it tens of thousands of servers to do that every day?

Speaker 2 Yeah, my guess is it's probably on the order of 50k H100s, something like that.

Speaker 1 Wow. And that will replace search.
So you'll be able to actually search on Twitter and find things with plain language.

Speaker 2 We'll have semantic search where you can just ask a question

Speaker 2 and it will show you all content, whether that is text, pictures, or video, that matches your search query semantically.
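Semantic search of this kind is typically built on embeddings plus a similarity measure. Here is a minimal sketch, using a bag-of-words count vector as a crude stand-in for a learned embedding; the system described would embed text, images, and video with a model, but the ranking step looks the same.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy "embedding": a bag-of-words count vector. A production semantic
    # search would use a learned embedding model instead.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def semantic_search(query: str, posts: list[str], k: int = 3) -> list[str]:
    """Return the k posts most similar to the query."""
    q = embed(query)
    return sorted(posts, key=lambda p: cosine(q, embed(p)), reverse=True)[:k]
```

With real embeddings, "matches semantically" means the query and a post land near each other in vector space even when they share no literal words — which is exactly what the word-count stand-in cannot do.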

Speaker 4 How's it been? Three years in. It was the three-year anniversary, like, a couple of days ago.

Speaker 2 This is three years? Yeah. Yeah, remember it was Halloween?

Speaker 2 Yeah, Halloween's back.

Speaker 1 Halloween just passed, but the weekend you took over was Halloween.

Speaker 2 Yeah.

Speaker 2 We had a good time. Yeah.

Speaker 1 Yeah, three years.

Speaker 2 Things three years from now.

Speaker 2 Yeah.

Speaker 1 What's the takeaway?

Speaker 3 Three years later.

Speaker 1 You obviously don't regret buying it. It's saved free speech.
That was good. Seemed to have turned that whole thing around.
And that was, I think, a big part of your mission.

Speaker 1 But then you added it to XAI, which makes it incredibly valuable as a data source.

Speaker 1 So when you look back on it — the reason you bought it was to stop the crazy woke mind virus and make truth exist in the world again. Great.
Mission accomplished.
Mission accomplished.

Speaker 1 And now it has this great future.

Speaker 2 Yeah, we've got community notes. You can also ask Grok about

Speaker 2 anything you see on the platform.

Speaker 2 You know, just press the Grok icon on any X post and it will

Speaker 2 analyze it for you and research it as much as you want. So you can basically have,

Speaker 2 just by tapping the Grok icon, you can assess

Speaker 2 whether that post is the truth, the whole truth, and nothing but the truth, or whether there's something supplemental that needs to be explained. So

Speaker 2 I think it's actually, we've made a lot of progress towards

Speaker 2 freedom of speech and

Speaker 2 people being able to tell whether something is false or not false.

Speaker 2 Propaganda. The recent update to Grok is actually, I think, very good at piercing through propaganda.

Speaker 2 And then we used that latest version of Grok to create Grokipedia,

Speaker 2 which I think is

Speaker 2 much more — it's not just, I think, more neutral

Speaker 2 and more accurate than Wikipedia; it actually has a lot more information than a Wikipedia page.

Speaker 4 Did you seed it with Wikipedia? Actually, take a step back.

Speaker 4 How did you do this?

Speaker 2 Well, we used AI.

Speaker 4 But meaning like totally unsupervised, just a complete training run on its own, totally synthetic data,

Speaker 4 no seeded set, nothing.

Speaker 2 Well,

Speaker 2 it was only just recently possible for us to do this.

Speaker 2 So

Speaker 2 we finished training a maximally truth-seeking

Speaker 2 version of Grok that is good at cogent analysis. So, breaking down

Speaker 2 any given argument into its axiomatic elements,

Speaker 2 assessing whether those axioms pass

Speaker 2 the basic tests for cogency: the axioms are likely to be true, they're not

Speaker 2 contradictory, and the conclusion most likely follows from those axioms.

Speaker 2 So we just trained Grok on a lot of critical thinking.

Speaker 2 So it just got really good at critical thinking, which was quite hard.

Speaker 2 And then we took that version of Grok and said, okay, cycle through the million most popular articles in Wikipedia and add, modify, and delete. So

Speaker 2 that means

Speaker 2 research the rest of the internet,

Speaker 2 whatever's publicly available,

Speaker 2 and

Speaker 2 correct the Wikipedia articles and fix mistakes,

Speaker 2 but also add a lot more context.

Speaker 2 So

Speaker 2 sometimes really the nature of the propaganda is that

Speaker 2 facts are stated that are technically true, but

Speaker 2 do not properly represent a picture of the individual or event.
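The loop just described — cycle through the most popular articles, research publicly available sources, then add, modify, and delete — might be sketched as below. Both `research` and `revise` are hypothetical stubs, not real APIs: in the described system, researching would mean searching the public internet, and revising would be a call to the truth-seeking model.

```python
# Hypothetical sketch of the described revision loop. The two helpers are
# stand-ins so the control flow is runnable; neither is a real API.

def research(title: str) -> list[str]:
    # Stand-in: a real pipeline would gather publicly available sources.
    return [f"public source discussing {title}"]

def revise(body: str, sources: list[str]) -> str:
    # Stand-in: a real pipeline would have the model correct errors and
    # add missing context, using the gathered sources.
    return f"{body} (revised against {len(sources)} sources)"

def rebuild(articles: dict[str, str]) -> dict[str, str]:
    """Cycle through every article: research it, then rewrite it."""
    return {title: revise(body, research(title))
            for title, body in articles.items()}
```

The structure matters more than the stubs: each article is an independent research-then-rewrite task, which is what makes "cycle through the million most popular articles" tractable as a batch job.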

Speaker 1 This is critical.

Speaker 1 Because when you have a bio, as you do — actually, we all do — on Wikipedia, over time it's just edited by the people you fired, or you beat in business, or who have an axe to grind.

Speaker 1 So it just slowly becomes, like, the place where everybody who kind of hates you puts their information. I looked at mine.

Speaker 1 It was so much more representative, and it was five times longer, six times longer. And what it gave weight to

Speaker 1 was much more accurate.
And this opportunity was sitting here, I think, for a long time.

Speaker 1 It's just great that you got to it because

Speaker 1 they only update my page, you know, I don't know, twice a month. And then — who is the secret cabal? There are 50 people who are anonymous who decide what gets put on it.

Speaker 1 It was a much better, much more updated page in version one.

Speaker 2 Yes, this is version 0.1, as we put it, as we show at the top. So

Speaker 2 I do think, actually, by the time we get to version 1.0, it'll be 10 times better. But even at this early stage,

Speaker 2 as you just mentioned,

Speaker 2 it's not just that it's correcting errors, but

Speaker 2 it is creating a more accurate and realistic and fleshed-out

Speaker 2 description of people and events.

Speaker 4 Elon, do you think that —

Speaker 2 And subject matter, too. Like, you can look at articles on physics on Grokipedia. They're much better than Wikipedia by far.

Speaker 4 This is what I was going to ask you. Do you think that you can take this corpus of pages now and

Speaker 4 get Google to deboost Wikipedia or boost Grokipedia in traditional search? Because a lot of people still find this and they believe that it's authoritative because it comes up number one, right?

Speaker 4 So how do we do that? How do you flip Google?

Speaker 2 Yeah, so it really can.

Speaker 2 If Grokipedia is used elsewhere — like if people cite it on their websites or post about it on social media,

Speaker 2 or when they do a search and Grokipedia shows up, if they click on Grokipedia — it will naturally

Speaker 2 rise in Google's

Speaker 2 rankings.

Speaker 2 I did text Sundar, because

Speaker 2 even a day after launch, if you typed in Grokipedia, Google would just say: Did you mean Wikipedia? And it wouldn't even bring Grokipedia up at all.
Yeah, that's true.

Speaker 1 How's the usage been? Have you seen good growth since it launched?

Speaker 2 Yeah,

Speaker 2 it went super viral.

Speaker 2 So

Speaker 2 we're seeing it cited all over the place.

Speaker 2 But yeah,

Speaker 2 and I think we'll see it used more and more

Speaker 2 as people refer to it. And people will judge for themselves.

Speaker 2 When you read a Grokipedia article about a subject or a person that you know a lot about, and you see — wow, this is way better than Wikipedia.

Speaker 2 It's more comprehensive, it's way more accurate,

Speaker 2 it's neutral instead of biased, then

Speaker 2 you're going to pass those links around

Speaker 2 and say that this is actually the better source.

Speaker 2 Grokipedia will

Speaker 2 succeed, I think, very well, because it is fundamentally a superior product to Wikipedia. It is a better source of information.

Speaker 2 And we haven't even added images and video yet.

Speaker 2 So

Speaker 2 we're going to add a lot of video.

Speaker 2 So, using Grok Imagine to create videos.

Speaker 2 And so if you're trying to explain something,

Speaker 2 Grok Imagine can take the text from Grokipedia and then generate a video —

Speaker 2 an explanatory video. So if you're trying to understand anything, from how to tie a bow tie to how certain chemical reactions work, or

Speaker 2 really anything, dietary things, medical things,

Speaker 2 you can just go and see the video of how it works, created by AI.

Speaker 4 When you have this version that's maximally truth-seeking as a model, do you think that there needs to be a better eval or a benchmark that people can point to that shows how off of the truth things are?

Speaker 4 So that if you're going to start a training run with Common Crawl, or if you're going to use Reddit, or if you're going to use,

Speaker 4 is it important to be able to say, hey, hold on a second, this eval just sucks, like you guys suck on this eval. Like it's just, this is crappy data

Speaker 2 Yeah, I guess — I mean, there are a lot of evals out there. I have complete confidence that Grokipedia is going to succeed, because Wikipedia is actually not a very good product. Yeah.

Speaker 2 The information is sparse, wrong, and out of date.

Speaker 2 And it doesn't have, you know — there are very few images, there's basically no video.

Speaker 2 So if you have something which is, you know, accurate, comprehensive,

Speaker 2 has videos,

Speaker 2 where, moreover, if there's any part of it that you're curious about, you can just highlight it and

Speaker 2 ask Grok right there.

Speaker 2 Like if you're trying to learn something, it's just great.

Speaker 2 It's not going to be a little bit better than Wikipedia. It's going to be a hundred times better than Wikipedia.

Speaker 1 Elon, do you think you'll see, like, good uniform usage?

Speaker 1 Like, if you look back on the last three years since you bought Twitter,

Speaker 1 there were a lot of people after you bought Twitter that said: I'm leaving Twitter. Elon's bought it.
I'm going to go to this other — wherever the hell they went.

Speaker 1 And there's all these new Threads,

Speaker 1 and there's all this Bluesky.

Speaker 2 Yeah.

Speaker 2 Yeah, but the Bluesky one is my favorite.

Speaker 1 I guess my question is,

Speaker 1 as you destroy the woke mind virus's kind of

Speaker 1 control of the system, and as you bring truth to the system — whether that's through Grokipedia or through X —

Speaker 1 do people just look for confirmation bias and actually not accept the truth? Or do you think people are actually going to see the truth and change?

Speaker 2 Yeah.

Speaker 2 But I mean, is that like — you thought Sydney Sweeney's boobs were great. I just see mine.

Speaker 2 Looking good. Yeah.
Solid, solid week up there.

Speaker 2 I think we just got flagged on YouTube.

Speaker 2 Yeah, we did. That was definitely going to give us a censorship moment.

Speaker 2 Great A moves.

Speaker 2 No, but, like — do people change their minds? I mean —

Speaker 2 If there's a — actually, there's no such thing as a grade-A move.

Speaker 2 Grok's off the rails already.

Speaker 1 David, you were trying to ask a serious question. Go ahead.

Speaker 2 Well, I just want to know if people change their mind.

Speaker 2 Like, can you actually change people's minds by putting the truth in front of them? Or do people just kind of ignore the truth, because they feel like they're in some sort of camp and they're like, I'm on this side.

Speaker 4 They want the confirmation bias.

Speaker 2 They want the confirmation bias and they want to stay in a camp and they want to be tribal about everything.

Speaker 2 It is remarkable how much people believe things simply because it is

Speaker 2 the belief of their in-group, you know, whatever their sort of political or ideological tribe is.

Speaker 2 So,

Speaker 2 I mean, there are some pretty hilarious videos, you know,

Speaker 2 where there's some guy going around — Tom is like a racist Nazi, or whatever — and he was trying to show them the videos

Speaker 2 of the thing that they are talking about,

Speaker 2 where he is, in fact, condemning the Nazis in the strongest possible terms and condemning racism in the strongest possible terms. And they literally don't even want to watch the videos.

Speaker 2 So, yeah —

Speaker 2 people, or at least some people,

Speaker 2 will stick to whatever their

Speaker 2 ideological views are, whatever their political tribal views are, no matter what.

Speaker 2 The evidence could be staring them in the face, and they're just going to be a flat-earther, you know.

Speaker 2 There is no evidence you can show a flat-earther to convince them the world's round, because everything is just a lie — the world is flat.
Type of thing.

Speaker 1 I think the ability to tag @Grok in a reply and ask it a question in the thread has really made it, like, a truth-seeking missile on the platform.

Speaker 1 So when I put up metrics or something like that, I reply to myself and I say, at Grok, is the information I just shared correct? And can you find any better information?

Speaker 1 And please tell me if my argument is correct or if I'm wrong. And then it goes through — and then it DMs Sacks, and then Sacks gets in my replies and tries to correct me.

Speaker 1 No, but it actually does a really good job — that, combined with Community Notes. Now you've got, like, two swings at the bat: the community's consensus view and then Grok coming in.

Speaker 1 I think it would be like really interesting if Grok on like really powerful threads kind of did like its own version of community notes and had it sitting there ahead of time.

Speaker 1 You know, like you could look at a thread and it just had next to it, you know, or maybe on like the specific statistic, you could click on it and it would show you like, here's where that statistic's from.

Speaker 2 I mean —

Speaker 2 essentially every post on X, unless it's like advertising or something,

Speaker 2 has the Grok symbol on it. Yeah.
And you just tap that symbol and you're one tap away from a Grok analysis, literally just one tap. And we don't want to clutter the interface with it

Speaker 2 providing an explanation, but I'm just saying, if you go on X right now, it's one tap to get Grok's analysis. And Grok will research

Speaker 2 the X post and give you an accurate answer.

Speaker 2 And you can even ask it to do further research and further due diligence. And you can go as far down the rabbit hole as you want to go.
But I do think this is

Speaker 2 consistent with: we want X to be the best source of truth on the planet, by far. And I think it is.

Speaker 2 And where you hear

Speaker 2 any and all points of view.

Speaker 2 but where those points of view are corrected by human editors with Community Notes. And the essence of Community Notes is that

Speaker 2 people who historically disagree agree that this Community Note is correct.

Speaker 2 And all of the Community Notes code is open source, and the data is open, so you can recreate any Community Note from scratch independently.
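The "people who historically disagree agree" idea can be illustrated with a toy check. The real open-source Community Notes algorithm scores notes via matrix factorization over the full rating history; this majority-within-every-group test is only a simplified proxy for that bridging criterion, and the group labels are invented for illustration.

```python
# Toy illustration of "bridging": a note counts as helpful only when raters
# from historically disagreeing viewpoint groups independently support it.
# This is a simplified proxy, not the actual Community Notes algorithm.

def note_is_helpful(ratings_by_group: dict[str, list[bool]]) -> bool:
    """ratings_by_group maps a viewpoint-group label to its helpful votes.
    Require majority support within every group, not just overall."""
    if len(ratings_by_group) < 2:
        return False  # no cross-group agreement to measure
    return all(
        bool(votes) and sum(votes) / len(votes) > 0.5
        for votes in ratings_by_group.values()
    )
```

The design point is that a note popular with only one side scores zero, no matter how many votes it gets — agreement has to bridge the divide.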

Speaker 3 By and large, it's worked very well.

Speaker 2 Yeah. Yeah.

Speaker 3 I think we originally had the idea to have you back on the pod because it was the three-year anniversary of the Twitter acquisition. So

Speaker 3 I just wanted to kind of reminisce a little bit.

Speaker 3 I remember, yeah, I mean, I remember.

Speaker 2 Where's that sink?

Speaker 3 Where's that sink? Well, yeah. So Elon was staying at my house.
We had talked the week before and he told me the deal was going to close. And so I was like, hey, do you need a place to stay?

Speaker 3 And he took me up on it. And the day before he went to the Twitter office, there was a request made to my staff.
Do you happen to have an extra sink?

Speaker 3 And they did not, but they were able to.

Speaker 2 Who has an extra sink, really?

Speaker 3 But they were able to locate one at a nearby hardware store, and I think they paid extra to get it out of the window or something.

Speaker 2 Well, I think the store was confused because my security team was asking for any kind of sink. And like

Speaker 2 normally people wouldn't ask for any kind of sink.

Speaker 2 You need a sink that fits in your bathroom or connects to a certain kind of plumbing. So they were trying to ask, like, well, what kind of faucets do you want?

Speaker 2 And it's like, no, no, I just wanted a sink. Yeah, they thought a mental person had come in.

Speaker 2 The store was confused that we just wanted a sink

Speaker 2 and didn't care what the sink connected to.

Speaker 2 They were almost not letting us buy the sink, because they thought maybe we'd buy the wrong sink, you know.

Speaker 2 It's just rare that somebody wants a sink for the sink's sake.

Speaker 3 For meme purposes.

Speaker 1 One of my favorite memories was Elon said, hey, you know, swing by, check it out. I said, okay, I'll come by.
And I drive up there and I'm looking where to park the car.

Speaker 1 And I realize there's just parking spaces around the entire building. And I'm like, okay, this can't be legal parking. But I park, and it's legal parking.

Speaker 2 Yeah, I mean,

Speaker 2 you are in downtown SF, so you might get your window broken.

Speaker 1 Yeah, I might not be there when I get back. But we get in there and the place is empty.
And then

Speaker 2 It was seriously empty, except for the cafeteria. The Twitter headquarters was two buildings.

Speaker 2 One of the buildings was completely and utterly empty, and the other building had like 5% occupancy.

Speaker 1 And the 5% occupancy, we go to the cafeteria, we all go get something to eat, and we realize there's more people working in the cafeteria

Speaker 1 and at Twitter.

Speaker 2 There were more people making the food than eating the food

Speaker 2 in this giant cafeteria, you know,

Speaker 2 really nice, really nice cafeteria.

Speaker 2 You know,

Speaker 2 this is where we discovered that the actual price of the lunch was $400.

Speaker 2 The original price was $20, but

Speaker 2 it was at 5% occupancy, so it was 20 times higher. And they still kept making pretty much the same amount

Speaker 2 and charging the same amount. So effectively, lunch was $400.

Speaker 2 That's a great meeting.

Speaker 2 Yes,

Speaker 2 and then there was

Speaker 2 where we had the initial meetings — the sort of trying-to-figure-out-what-the-heck's-going-on meetings —

Speaker 2 because, you know, there are the two Twitter buildings, and the one with literally no one in it — that's where we had the initial meetings.

Speaker 2 And then we tried drawing on the whiteboard, and the markers had gone dry.

Speaker 2 Nobody had used the

Speaker 2 whiteboard markers in like two years.

Speaker 2 So sad. None of the markers worked.
So we're like, this is totally bizarre. But it was totally clean, because the cleaning crew had come in and done their job and

Speaker 2 cleaned an already clean place for, I don't know, two, three years straight.

Speaker 2 It was wild.

Speaker 2 I mean, honestly,

Speaker 2 this is crazier than any sort of Mike Judge movie, you know, Silicon Valley or anything like that.

Speaker 2 And then I remember going into the men's bathroom and

Speaker 2 there's a table

Speaker 2 with

Speaker 2 menstrual hygiene products.

Speaker 1 Yeah.

Speaker 1 Refreshed every week.

Speaker 2 Tampons, like a fresh box of tampons.

Speaker 2 And we're like, but there's literally no one in this building.

Speaker 2 So,

Speaker 2 But nope — the "send fresh tampons to the men's bathroom in the empty building" task had not been turned off. No.
So

Speaker 2 every week they would put a fresh box of tampons in an empty building

Speaker 2 for years.

Speaker 2 This happened for years. And it must have been very confusing to the people who were being asked to do this, because they're like,

Speaker 2 I guess they're paying us, so we'll just put the tampons in and throw the old ones away.

Speaker 2 So seriously, consider the string of possibilities necessary for anyone to ever use that tampon in the men's bathroom at the unoccupied second building of Twitter headquarters.

Speaker 2 Because you'd have to be a burglar

Speaker 2 who is a trans man burglar

Speaker 2 who's unwilling to use the women's bathroom that also has tampons.

Speaker 2 Statistically,

Speaker 2 there's no one in the building.

Speaker 2 So you've broken into the building.

Speaker 2 And at that moment, you're on your period. Yes.

Speaker 2 I mean, you are more likely to be struck by a meteor

Speaker 2 than need that tampon. Okay.

Speaker 3 Well, I remember —

Speaker 3 I think it was shortly after that, you discovered an entire room

Speaker 3 at the office that was filled with stay woke t-shirts.

Speaker 2 Do you remember this? An entire pile of merch. Yes.
Hashtag stay woke. Stay woke.
And also big buttons — like those magnetic buttons that you put on your shirt — that said:

Speaker 2 I am an engineer.

Speaker 2 I'm like, look, if you're an engineer, you don't need a button.

Speaker 2 Who's the button for? Who are you telling? You could just ship code.

Speaker 2 We would know. We could check your GitHub.

Speaker 2 But yeah,

Speaker 2 they're like scarves,

Speaker 2 hoodies,

Speaker 2 all kinds of merch that said hashtag stay woke. Yeah.

Speaker 1 A couple of music.

Speaker 3 When we found that, I was like, my God, man, the barbarians are fully within the gates now.

Speaker 2 I mean,

Speaker 2 the barbarians have smashed through the gates and are looting the merch. Yes.

Speaker 3 You are rummaging through their holy relics and defiling them.

Speaker 1 I mean, but when you think about it, David, the amount of waste that we saw there during those first 30 days, just to be serious about it for a second, this was a publicly traded company.

Speaker 1 So if you think about the fiduciary duty of those individuals: there was a list of SaaS software we went through, and none of it was being used.

Speaker 1 Some of it had never even been installed, and they had been paying for it for two years.

Speaker 1 And the one that blew my mind the most, that we canceled, was that they were paying a certain amount of money per desk for desk-seating software in an office where nobody came to work.

Speaker 2 So they were paying

Speaker 2 nobody.

Speaker 2 There were millions of dollars a year being paid for software that used cameras to analyze pedestrian traffic, to figure out where you could alleviate pedestrian traffic jams, in an empty building. Right.

Speaker 2 That's like 11 out of 10 on a Dilbert scale.

Speaker 1 Yeah, it was. Shout out, Scott Adams.

Speaker 2 You've gone off the scale on your Dilbert level at that point.

Speaker 3 Let's talk about the free speech aspect for a second, because I think that is the most important legacy of the Twitter acquisition.

Speaker 3 And I think people have short memories and they forget how bad things were three years ago.

Speaker 3 First of all, you had figures as diverse as President Trump, Jordan Peterson, Jay Bhattacharya, Andrew Tate.

Speaker 2 They were all banned from Twitter.

Speaker 3 And I remember when you opened up the Twitter jails and reinstated their accounts, kind of, you know, freed all the bad boys of free speech.

Speaker 2 Stoned the Bastille.

Speaker 3 Yes. So you basically gave all the bad boys of free speech their accounts back.
But second, beyond just the bannings, there was the shadow bannings.

Speaker 3 And Twitter had claimed for years that they were not shadow banning.

Speaker 2 They said this was a paranoid conservative conspiracy theory.

Speaker 2 There was a very aggressive shadow banning by what was called the Trust and Safety Group, which, of course, naturally would be the one that is doing the nefarious shadow banning.

Speaker 2 And I just think we shouldn't have a group called Trust and Safety.

Speaker 2 This is an Orwellian name if there ever was one.

Speaker 2 "I'm from the Trust Department." "Oh, really?" "We want to talk to you about your tweets." "Okay." "Can we see your DMs?" Say that you're from the Trust Department. It's literally the Ministry of Truth right there. Yeah.

Speaker 3 They had maintained for years that they were not engaged in this practice, including under oath. And then, on the heels of you opening that up and exposing that...

Speaker 3 Because by the way, it wasn't just the fact they were doing it. They had created an elaborate set of tools to do this.

Speaker 2 They had checkboxes

Speaker 2 to

Speaker 2 de-boost accounts. Yes.

Speaker 3 Yes. And subsequently, we found out that other social networking properties have done this as well.

Speaker 2 But you were really the first to expose it. This is still being done at the other social media companies.

Speaker 2 Including Google, by the way.

Speaker 2 So

Speaker 2 for,

Speaker 2 you know,

Speaker 2 I don't want to pick on Google because they're all doing it, but for search results, if you simply push a result pretty far down the page or

Speaker 2 the second page of results,

Speaker 2 The joke used to be, or still is, I think: what's the best place to hide a dead body?

Speaker 2 The second page of Google search results, because nobody ever goes to the second page of Google search results. So you could hide a dead body there, nobody would find it.

Speaker 2 And then it's not like you've made them go away. You've just put them on page two. Yes.

Speaker 3 So shadow banning, I think, was number two. So first was banning, second was shadow banning.
I think third to me was government collusion, government interference. So you released the Twitter files.

Speaker 3 Nothing like that had ever been done before, where you actually let investigative reporters go through Twitter's emails and Slack groups, unfettered.

Speaker 2 I was not looking over their shoulder at all.

Speaker 2 They just had direct access to everything.

Speaker 3 And they found that there was extensive collusion between the FBI and the Twitter trust and safety group, where it turns out the FBI had 80 agents submitting takedown requests and they were very involved in the banning, the shadow banning, the censorship, which I don't think we ever had definitive evidence of that before.

Speaker 3 That was pretty extraordinary.

Speaker 2 Yeah, and the

Speaker 2 U.S. House of Representatives had hearings on the matter

Speaker 2 and a lot of this was unearthed. It's public record.
So a lot of people,

Speaker 2 some people on the left still think this is made up. I'm like, the Twitter Files are literally the files at Twitter.

Speaker 2 I mean, we're literally just talking about these are the emails that were sent internally that confirm this. This is what's on the Slack channels.

Speaker 2 And this is what is shown

Speaker 2 on the Twitter database as where people have made either suspensions or shadow bans.

Speaker 1 Has the government come and asked you to take stuff down since? Or is the policy now, hey, listen, you've got to file a warrant, you've got to come correct, as opposed to just putting pressure on executives?

Speaker 2 Yeah,

Speaker 2 our policy at this point is to follow the law.

Speaker 2 So

Speaker 2 Now, the laws are obviously different in different countries. So sometimes I get criticized: why don't I push free speech in XYZ country that doesn't have free speech laws? I'm like, because that's not the law there.

Speaker 2 And if we don't obey the law, we'll simply be blocked in that country.

Speaker 2 So

Speaker 2 the policy is really just to adhere to the laws in any given country.

Speaker 2 It is not up to us to agree or disagree with those laws. And

Speaker 2 if the people of that country want laws to be different, then they should

Speaker 2 ask their leaders to change the laws.

Speaker 2 As soon as you start going beyond the law, now you're putting your thumb on the scale.

Speaker 2 Yeah,

Speaker 2 I think that's the right policy is just adhere to the laws within any given country.

Speaker 2 Now, sometimes we get you know,

Speaker 2 in a bit of a bind, like we got into with Brazil, where this judge was telling us to break the law in Brazil and ban accounts contrary to the law of Brazil. And now we're somewhat stuck.

Speaker 2 We're like, wait a second, we're reading the law and it says this is not allowed to happen. And he's also giving us a gag order, so we're not allowed to say it's happening. We'd have to break the law, and the judge is telling us to break the law.
That's where things get very difficult.

Speaker 2 And we were actually banned in Brazil for a while because of that.

Speaker 3 I just want to make one final point on the free speech issue and then we can move on. It's just, I think people forget that the censorship wasn't just about COVID.

Speaker 3 There was a growing number of categories of thought and opinion that were being outlawed.

Speaker 3 The quote content moderation, which is another Orwellian euphemism for censorship, was being applied to categories like gender and even climate change.

Speaker 3 The definition of hate speech was constantly growing.

Speaker 2 Yes.

Speaker 3 And more and more people were being banned or shadow banned, and there were more and more things that you couldn't say.
This trend of censorship was growing. It was galloping.

Speaker 3 And it would have continued if it wasn't, I think, for the fact that you decided to buy Twitter and opened it up.

Speaker 3 And it was only on the heels of that that the other social networks were willing to, I think, be a little bit chastened in their policies and start to push back more.

Speaker 2 Yeah, that's right.

Speaker 2 Once Twitter broke ranks, it became very obvious what the others were doing. And so they had to mitigate their censorship substantially because of what Twitter did.

Speaker 2 And I mean, perhaps to give them some credit, they also felt that they had the air cover to

Speaker 2 be more inclined towards free speech.

Speaker 2 They still do a lot of sort of

Speaker 2 shadow banning and whatnot at the other social media companies, but it's much less than it used to be.

Speaker 4 Elon, what have you seen in terms of governments creating new laws? We've seen a lot of this crackdown in the UK on what's being called hateful speech on social media, and folks getting arrested and actually going to prison over it.

Speaker 4 And it seems like when there's more freedom, the side that is threatened by it comes out and creates its own counter; there's a reaction to that. Are you seeing more of these laws around the world in response to your opening up free speech through Twitter, because the governments and the parties that control those governments aren't aligned with it and are stepping in and saying, let's create new ways of maintaining our control through law?

Speaker 2 Yeah, there has been an overall global movement to suppress free speech under the guise of suppressing hate speech.

Speaker 2 But the problem with that is that freedom of speech only matters if people are allowed to say things that you don't like, or even things that you hate.

Speaker 2 Because

Speaker 2 if you're allowed to suppress speech that you don't like, then you don't have freedom of speech. And it's only a matter of time before things switch around, the shoe's on the other foot, and they will suppress you.

Speaker 2 So suppress not lest you be suppressed.

Speaker 2 But there is

Speaker 2 a movement, and

Speaker 2 there was a very strong movement to codify speech suppression into the law throughout the world, including the Western world,

Speaker 2 Europe and Australia.

Speaker 1 The UK and Germany were very

Speaker 1 aggressive in this regard.

Speaker 2 Yes. And my understanding is that in the UK, there's something like 2,000 or 3,000 people in prison for social media posts.

Speaker 2 And many of these are cases where you can't believe that someone would actually be put in prison for this.

Speaker 2 They have, in a lot of cases, released people who have committed violent crimes in order to imprison people who have simply made posts on social media, which is deeply wrong.

Speaker 2 And

Speaker 2 underscores why the founders of this country made the

Speaker 2 First Amendment. The First Amendment was freedom of speech.

Speaker 2 Why did they do that? It's because in the places that they came from, there wasn't freedom of speech, and you could be imprisoned or killed for saying things.

Speaker 4 Can I ask you a question just to maybe move to a different topic? If you came and did this next week, we will be past the Tesla board vote.

Speaker 4 We talked about it last week, and we talked about how crazy ISS and Glass Lewis are. And we used this one insane example where Ira Ehrenpreis didn't get the recommendation from ISS and Glass Lewis because he didn't meet the gender requirements, but then Kathleen

Speaker 2 also didn't.

Speaker 2 It doesn't make any sense. So, the board vote is on the sixth.

Speaker 2 She's African-American. Yeah. They recommended against her, but then also recommended against Ira Ehrenpreis on the grounds that he was insufficiently diverse. So I'm like, these things don't make any sense.

Speaker 2 So I do think we've got a fundamental issue with corporate governance in publicly traded companies where you've got about half of the stock market is controlled by passive index funds.

Speaker 2 And

Speaker 2 most of them outsource their decisions to advisory firms, particularly Glass Lewis and ISS. I call them corporate ISIS.

Speaker 2 Basically, they're just terrorists.

Speaker 2 And they own no stock in any of these companies.

Speaker 2 So I think that there's a fundamental breakdown of fiduciary responsibility here

Speaker 2 where really any company that's managing index funds, even passively, does at the end of the day have a fiduciary duty to vote along the lines of what would maximize shareholder returns. Because people are counting on them. People have all their savings in, say, a 401(k) or something like that, and they're counting on the index funds to vote in company votes in the direction that would ensure their retirement savings do as well as possible. But the problem is that this is then outsourced to ISIS and Glass Lewis, which have been infiltrated by far-left activists,

Speaker 2 because

Speaker 2 political activists go where the power is.

Speaker 2 And so effectively,

Speaker 2 Glass Lewis and ISS

Speaker 2 control the vote of half the stock market.

Speaker 2 Now, if you're a political activist, you know what a great place would be to go work?

Speaker 2 ISS and Glass Lewis. And they do.

Speaker 2 So,

Speaker 2 my concern for the future,

Speaker 2 because

Speaker 2 the Tesla thing is called compensation, but really, it's not about compensation. It's not like I'm going to go out and buy a yacht with it or something.

Speaker 2 It's just that, if I'm going to build up Optimus and have all these robots out there, I need to make sure we do not have a Terminator scenario

Speaker 2 and that I can

Speaker 2 maximize the safety of the robots.

Speaker 2 But I feel like I need to have something like a 25% vote,

Speaker 2 which is enough of a vote to have a strong influence, but not so much of a vote that I can't be fired if I go insane.

Speaker 2 But my concern would be creating this army of robots and then being fired for political reasons, because ISS and Glass Lewis effectively fire me, or the activists at those firms fire me.

Speaker 2 Even though I've done everything right.

Speaker 2 That's my concern.

Speaker 2 And then I cannot ensure the safety of the robots.

Speaker 1 If you don't get that vote, if it doesn't go your way, though it looks like it's going to, would you leave? I mean, is that even in the cards? I heard the board was very concerned about that.

Speaker 2 Let's just say, I'm not going to build a robot army

Speaker 2 if I, if I can be easily kicked out by activist investors. Yeah.
No way.

Speaker 2 No way. Yeah.

Speaker 1 Makes sense.

Speaker 1 And who is capable of running the four or five major product lines at Tesla? I mean, this is the madness of it. It's a very complex business.
People don't understand what's under the hood there.

Speaker 1 It's not just a car company. You got batteries, you got trucks, you got the self-driving group.
I mean, this is a very complex business that you've built over decades now.

Speaker 1 It's not a very simple thing to run. I don't think there's an Elon equivalent out there who can just jump into the cockpit.

Speaker 4 By the way, if we take a full turn around corporate governance corner, also this week: what was interesting about the OpenAI restructuring was that I read the letter, and your lawsuit was excluded from the allowances of the California Attorney General basically saying this thing can go through. Which means that your lawsuit is still out there, right?

Speaker 4 And I think it's going to go to a jury trial.

Speaker 4 So there, that corporate governance thing is still very much in question. Do you have any thoughts on that?

Speaker 2 Yes, I believe that will go to a jury trial in February or March,

Speaker 2 and then we'll see what the results are there. But

Speaker 2 there's

Speaker 2 a mountain of evidence

Speaker 2 that shows that OpenAI was created as

Speaker 2 an open source nonprofit. It's literally, that's the exact description in the incorporation documents.

Speaker 2 And in fact, the incorporation documents explicitly say that no officer or founding member

Speaker 2 will benefit financially from OpenAI.

Speaker 2 And they've completely violated that. And more than that,

Speaker 2 you can just use the Wayback Machine and look at the website of OpenAI. Again: open source nonprofit, open source nonprofit, the whole way, until

Speaker 2 it looked like, wow, there's a lot of money to be gained here. And then suddenly it starts changing, and they try to change the definition of OpenAI to mean "open to everyone" instead of open source, even though it always meant open source.

Speaker 2 I came up with the name.

Speaker 2 Yeah, that's how I know.

Speaker 2 so

Speaker 2 uh

Speaker 1 If they open sourced it, or if they gave you, I mean, you don't need the money, but if they gave you the percentage ownership in it that you would rightfully be owed,

Speaker 1 which, with $50 million into a startup, would be at least half. But they must have made an overture towards you and said, hey, can we just give you 10% of this thing and give us your blessing? Like,

Speaker 1 you obviously have a different goal here, yeah?

Speaker 2 Yeah.

Speaker 2 I mean, essentially, since I came up with the idea for the company, named it,

Speaker 2 provided the A, B, and C rounds of funding, recruited the

Speaker 2 critical personnel, and told them everything I know.

Speaker 2 You know, if that had been a commercial corporation, I'd probably own half the company.

Speaker 2 So,

Speaker 2 but

Speaker 2 I could have chosen to do that.

Speaker 2 It was totally at my discretion, I could have done that.

Speaker 2 But I created it as a nonprofit for the world, an open source nonprofit for the world.

Speaker 4 Do you think the right thing to do is to take those models and just open source them? Today? If you could affect that change, is that the right thing to do?

Speaker 2 Yeah, I think that that is what it was created to do. So it should.
I mean, the best open source models right now, actually, ironically, because FAITE seems to be an irony maximizer,

Speaker 2 the best open source models are generally from China. Yeah.

Speaker 2 Like

Speaker 2 that's bizarre.

Speaker 2 And then I think the second best

Speaker 2 one is, or

Speaker 2 maybe it's better than the second best, but like the

Speaker 2 Grok 2.5

Speaker 2 open source model is actually very good.

Speaker 2 And we'll continue to open source our models. But try using any of the recent so-called OpenAI open source models.
They don't work.

Speaker 2 They basically, they open sourced a broken, non-working version of their models

Speaker 2 as a fig leaf.

Speaker 2 I mean, do you know anyone who's running OpenAI's open source models? Exactly. Yeah, nobody.

Speaker 1 We've had a big debate about jobs here.

Speaker 1 Obviously, there's going to be job displacement. You and I have talked about it for decades.

Speaker 1 What's your take on the pace of it? Because obviously, you're building self-driving software, you're building Optimus.

Speaker 1 And we're seeing Amazon take some steps here where they're like, yeah, we're probably not going to hire these positions in the future.

Speaker 1 And, you know, maybe they're getting rid of people now because they were bloated, but maybe some of it's AI. It's all debatable. What do you think the timeline is?

Speaker 1 And what do you think as a society, we're going to need to do to mitigate it if it goes too fast?

Speaker 2 Well,

Speaker 2 you know, I call AI the supersonic tsunami.

Speaker 2 So

Speaker 2 not the most comforting description in the world.

Speaker 2 It's fast and big. If there was a tsunami, a giant wall of water moving faster than the speed of sound, that's AI.

Speaker 1 When does it land?

Speaker 2 Yeah, exactly.

Speaker 2 So

Speaker 2 now, this is happening whether I want it to or not. I actually tried to slow down AI, by the way.

Speaker 2 And then

Speaker 2 the reason I wanted to create OpenAI was to serve as a counterweight to Google, because at the time, Google essentially had unilateral power in AI. They had all the AI, essentially.

Speaker 2 And,

Speaker 2 you know, Larry Page was not taking AI safety seriously.

Speaker 2 I mean, Jason, were you there when he called me a speciesist?

Speaker 2 Yes, I was there. Yeah.
Okay, so.

Speaker 1 You were more concerned about the human race than you were about the machines. And yeah, you had a clear bias for humanity.

Speaker 2 Yes, yes, exactly. I was like, Larry, we need to make sure that the AI doesn't destroy all the humans.
And then he called me a speciesist,

Speaker 2 like a racist or something, for being pro human intelligence instead of machine intelligence. I'm like, well, Larry, what side are you on?

Speaker 2 I mean,

Speaker 2 you know, that's kind of a concern.

Speaker 2 And then at the time,

Speaker 2 Google had essentially a monopoly on AI.

Speaker 1 Yeah, they bought DeepMind, which you were on the board of, had an investment in. Larry and Sergei had invested in as well.

Speaker 2 And it's really interesting. He found out about it because I told him about it. I showed him some stuff from DeepMind, and I think that's how he found out about it and acquired them, actually. I've got to be careful what I say.

Speaker 2 But

Speaker 2 the point is that it's like, look, Larry's not taking AI safety seriously.

Speaker 2 And Google had essentially all the AI and all the computers and all the money. And I'm like, this is a unipolar world where the guy in charge is not taking things seriously

Speaker 2 and called me a speciesist for being pro-human. What do you do in those circumstances? You build a competitor, yes.

Speaker 2 Um, so OpenAI was created essentially as the opposite, which is an open source nonprofit, the opposite of Google.

Speaker 2 Now, unfortunately, it needs to change its name to Closed for Maximum Profit AI. Yeah,

Speaker 2 for maximum profit, to be clear.

Speaker 2 The company goes for the most amount of profit you could possibly get. I mean, it's comical. You have to ask: what is the most ironic outcome for a company that was created to do open source, nonprofit AI? It's that it's super closed source. OpenAI's source is locked up tighter than Fort Knox. And they are going for maximum profit, like, get the bib and the steak knife.

Speaker 2 They're going for the buffet. They're just diving headfirst into the profit buffet. Or at least, aspirationally, the revenue buffet; profit, we'll see.

Speaker 2 I mean, it's like ravenous wolves for revenue.

Speaker 4 No, no, it's literally like a supervillain. It's a Bond-villain-level flip.

Speaker 1 Like it went from being the United Nations to being Spectre in James Bond land. When you hear Sam say it's going to raise $1.4 trillion to build their data centers...

Speaker 2 Yeah, no, but I think he means it.

Speaker 1 Yeah. I mean, it's, I would say, audacious, but I wouldn't want to

Speaker 1 insult the word.

Speaker 4 Oh, actually,

Speaker 2 I have a question about this. How is that possible?

Speaker 4 In the earnings call, you said something that was insane, and I think the math actually nets out. You said we could connect all the Teslas and allow them, in downtime, to actually offer up

Speaker 4 inference and you can string them all together.

Speaker 4 I think the math is like it could actually be like a hundred gigawatts.

Speaker 4 Is that right?

Speaker 2 If ultimately there's a Tesla fleet that is

Speaker 2 100 million vehicles, which I think we probably will get to at some point, 100 million vehicle fleet.

Speaker 2 And they have

Speaker 2 mostly state-of-the-art inference computers in them

Speaker 2 that each, say, are a kilowatt of inference compute.

Speaker 2 And they have built-in power and cooling and

Speaker 2 connect to the wire. That's the key.

Speaker 4 Yeah, exactly.

Speaker 2 Yeah, exactly. And

Speaker 2 you have 100 gigawatts of inference compute.

Speaker 4 Elon, do you think that the architecture, like there was an attention-free model that came out last week?

Speaker 4 There have been all these papers, all these new models that have been shown to reduce power per token of output by many orders of magnitude, not just one, but maybe three or four.

Speaker 4 What's your view, with all the work you've been doing, on where we're headed in terms of power per unit of compute, or per token of output?

Speaker 2 Well, we have a clear example of power-efficient compute, which is the human brain.

Speaker 2 Our brains use about 20 watts of power, and of that, only about 10 watts is higher brain function.

Speaker 2 Half of it is just housekeeping functions, you know, keeping your heart going and breathing and that kind of thing.

Speaker 2 So you've got maybe 10 watts of higher brain function in a human.

Speaker 2 And we've managed to build civilization with 10 watts of a biological computer.

Speaker 2 And that biological computer has like a 20-year

Speaker 2 boot sequence.

Speaker 2 But it's very power efficient. So given that humans are capable of discovering general relativity and quantum mechanics, inventing aircraft, lasers, and the internet, and discovering physics with a 10-watt meat computer, essentially,

Speaker 2 then

Speaker 2 there's clearly a massive opportunity for improving the efficiency of AI compute.

Speaker 2 Because

Speaker 2 it's currently many orders of magnitude away from that. And it's still the case that

Speaker 2 a 100-megawatt

Speaker 2 or even a gigawatt

Speaker 2 AI supercomputer at this point can't do everything that a human can do.

Speaker 2 It will be able to, but it can't yet.

Speaker 2 So, but

Speaker 2 like I said, we've got this obvious case of

Speaker 2 human brains being very power efficient and building civilization

Speaker 2 with 10 watts of compute.

Speaker 2 And our bandwidth is very low. So the speed at which we communicate information to each other is extremely low.

Speaker 2 We're not communicating at a terabit. We're communicating more at like 10 bits per second.

Speaker 2 So

Speaker 2 that should naturally lead you to the conclusion that there's massive opportunity for being more power efficient with AI. And at Tesla and at XAI,

Speaker 2 we continue to see massive improvements in inference compute efficiency.

Speaker 2 So

Speaker 2 yeah.
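The back-of-envelope numbers in this exchange can be checked with a quick sketch. The fleet size (100 million vehicles) and roughly 1 kW of inference compute per car are the speakers' stated assumptions, not confirmed figures, as are the ~10 watts of higher brain function versus a 1-gigawatt AI supercomputer:

```python
import math

# Speakers' assumptions, not confirmed Tesla figures.
FLEET_SIZE = 100_000_000   # hypothetical future Tesla fleet, vehicles
WATTS_PER_CAR = 1_000      # ~1 kW of inference compute per vehicle

fleet_inference_w = FLEET_SIZE * WATTS_PER_CAR
print(f"Fleet inference compute: {fleet_inference_w / 1e9:.0f} GW")  # 100 GW

# Brain comparison: ~10 W of "higher brain function" versus a
# 1 GW AI supercomputer, expressed as orders of magnitude.
BRAIN_HIGHER_FUNCTION_W = 10
DATACENTER_W = 1e9
gap_orders = math.log10(DATACENTER_W / BRAIN_HIGHER_FUNCTION_W)
print(f"Efficiency gap: {gap_orders:.0f} orders of magnitude")  # 8
```

So the "100 gigawatts" of fleet compute is just 100 million times 1 kW, and the efficiency headroom Musk describes is roughly eight orders of magnitude between a 10-watt brain and a gigawatt-scale datacenter.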

Speaker 4 Do you think that there's a moment where you would

Speaker 4 justify stopping

Speaker 4 all the traditional cars and just going completely all in on cyber cab if you felt like

Speaker 4 the learning was good enough and that the system was safe enough? Is there ever a moment like that, or do you think you'll always kind of dual track and always do both?

Speaker 2 I mean, all of the cars we make right now are capable of being a robo-taxi. So there's a little confusion of the terminology because

Speaker 2 our cars look normal. You know, a Model 3 or Model Y is a good looking car, but it looks normal.

Speaker 2 But it has an advanced AI computer, advanced AI software, and cameras. And we didn't want the cameras to be ugly or stick out, so

Speaker 2 we put them in unobtrusive locations. You know, the forward-looking cameras are in front of the rearview mirror.

Speaker 2 The side view cameras are in the side repeaters.

Speaker 2 The rear camera is just

Speaker 2 above the license plate, actually typically where the rear view camera is in a car.

Speaker 2 And

Speaker 2 the diagonal forward ones are in the B pillars. Like if you look closely, you can see all the cameras, but you have to look closely.
We just didn't want them to stick out like warts or something.

Speaker 2 But actually, all the cars we make are hyper-intelligent and have the cameras in the right places. They just look normal.

Speaker 2 And

Speaker 2 so all of the cars we make are capable of unsupervised full autonomy.

Speaker 2 Now, we have a dedicated part, which is the cyber cab, which has no steering wheel or pedals,

Speaker 2 which are obviously vestigial in an autonomous world. And we saw production of the cyber cab in Q2 next year.
And we'll scale that up to quite high volume.

Speaker 2 I think ultimately we'll make millions of cyber cabs per year.

Speaker 2 But it is important to emphasize that all of our cars are capable of being robotic taxis.

Speaker 1 The cyber cab is gorgeous. I told you I'd buy two of those if you put a steering wheel in them.
And there is a big movement around.

Speaker 1 People are begging for it. Why not? Why not let us buy a couple, you know,

Speaker 1 just the first ones off the line and drive them? I mean, they look great.

Speaker 1 It's like the perfect model. You always had a vision for a Model 2, right? Like, isn't it like the perfect Model 2 in addition to being a cyber cab?

Speaker 2 Look, the reality is people may think they want to drive their car, but the reality is that they don't.

Speaker 2 How many times have you been, say, in an Uber or Lyft and you said, you know what, I wish I could take over from the driver?

Speaker 2 And I wish I could get off my phone and take over from the Uber driver

Speaker 2 and drive to my destination. How many times have you

Speaker 2 thought that to yourself?

Speaker 1 No, it's quite the opposite.

Speaker 2 I have zero times. Okay.

Speaker 1 I have the Model Y, the Juniper, and it just got 14.1. I put it on Mad Max mode the last couple of days.

Speaker 2 That is

Speaker 2 a unique experience.

Speaker 1 I was like, wait a second, this thing is driving in a very unique fashion.

Speaker 1 Yeah.

Speaker 2 Yeah. It assumes you want to get to your destination in a hurry.
Yeah.

Speaker 2 I used to give cab drivers an extra 20 bucks to do that. Medical appointment or something.
I don't know. Yeah.

Speaker 1 But it feels like it's getting very close, but you have to be very careful. You know, Uber had a horrible accident with the safety driver.
Cruise had a terrible accident.

Speaker 1 It wasn't their fault exactly, except, you know, somebody got hit and then

Speaker 1 they hit the person a second time and they got dragged.

Speaker 2 Yeah, yeah.

Speaker 1 You know, there's this pretty high stakes. So you're being extremely cautious.

Speaker 2 The car is actually extremely capable right now. Yeah.
But we are being extremely cautious and we're being paranoid about it because to your point,

Speaker 2 even one accident would be headline news. Well, probably worldwide headline news.

Speaker 1 Especially if it's a Tesla. Like Waymo, I think, gets a bit of a pass.
I think half the country, or at least a number of people, probably would, you know, go extra hard on you.

Speaker 2 Yes, yeah, exactly. Not everyone in the press is my friend.

Speaker 2 Hadn't noticed.

Speaker 2 Some of them are a little antagonistic.

Speaker 1 But people are pressuring you to go fast, and I think everybody's got to just take their time with this thing. It's obviously going to happen, but I get very nervous that the pressure to put these things on the road faster than they're ready is just

Speaker 1 a little crazy. So I applaud you for putting the safety monitor in. No shame in the safety driver game.

Speaker 1 It's so much the right decision, obviously, but people are criticizing you for it. I think it's dumb.
It's the right thing to do.

Speaker 2 Yes. And we do expect to not have any sort of safety occupant. There's not really a driver, just a safety monitor.

Speaker 2 The safety monitor just sits.

Speaker 2 They just sit in the car and don't do anything.

Speaker 2 Safety dude.

Speaker 2 Yeah.

Speaker 2 So, but we do expect that the cars will be driving around without any safety monitor

Speaker 2 before the end of the year. So sometime in December.

Speaker 1 In Austin, yeah. I mean, you've got a number of reps under your belt in Austin, and it feels like

Speaker 1 you guys have done a great job figuring out where the trouble spots are.

Speaker 1 Maybe you could talk a little bit about what you learned in the first, I don't know, three or four months of the Austin experiment.

Speaker 2 Actually, it's gone pretty smoothly.

Speaker 2 A lot of things that we're learning

Speaker 2 are just how to manage a fleet.

Speaker 2 Because you've got to write all the fleet management software, right?

Speaker 2 And you've got to write the ride-hailing software. You've got to write, basically, the software that Uber has.

Speaker 2 It's just summoning a robot car instead of a car with a driver.

Speaker 2 So a lot of the things we're doing, we're scaling up the number of cars to say, like, what happens if you have a thousand cars? Like, so

Speaker 2 we think probably we'll have a thousand cars or more in the Bay Area

Speaker 2 by the end of this year, probably

Speaker 2 500 or more in the Greater

Speaker 2 Austin area.

Speaker 2 And

Speaker 2 if

Speaker 2 you have to,

Speaker 2 you have to make sure the cars don't all, for example, go to the same supercharger

Speaker 2 at the same time.

Speaker 2 Right.

Speaker 2 So,

Speaker 2 or don't all go to the same intersection.

Speaker 2 It's like, what do these cars do? And then, like, sometimes there's high demand and sometimes there's low demand. What do you do during those times? Do you have a car circle the block?

Speaker 2 Do you have a try to find a parking space?

Speaker 2 And then, you know, sometimes, like, say it's a

Speaker 2 disabled parking space or something, but the writing's faded. The car's like, oh, look, a parking space, I'll jump right in there.

Speaker 1 It's like, get a ticket.

Speaker 2 You got to look carefully, make sure it's like, you know, it's not

Speaker 2 an illegal parking space

Speaker 2 or it sees a space to park and it's like ridiculously tight.

Speaker 2 But it's like, oh, I can get in there.

Speaker 2 But with like, you know, three inches on either side type of thing. Bad computer.

Speaker 2 But nobody else will be able to get in the car if you do that.

Speaker 2 So, you know, there's just like all these oddball corner cases.

Speaker 2 And

Speaker 1 regulators, like regulators are all very,

Speaker 1 yeah,

Speaker 1 they have different levels of persicketiness and regulations depending on the city, depending on the airport. I mean, it's just,

Speaker 1 you know, very different everywhere. That's going to just be a lot of blocking and tackling.
And it just takes time.

Speaker 2 Elon, Elon, can I ask another question?

Speaker 2 In order to take people to San Jose airport, you actually have to connect to the San Jose airport servers

Speaker 2 because you have to pay a fee every time you do a drop-off. So the car actually has to do a remote call.
The robot car has to do

Speaker 2 a remote procedure call to the San Jose airport servers to

Speaker 2 say, I'm dropping someone off at the airport, and charge me whatever, five bucks.

Speaker 2 There are all these quirky things like that. Airports are somewhat of a racket.

Speaker 2 So we had to solve that thing. But it's kind of funny, the robot car calling the airport server to, you know, charge its credit card or whatever.

Speaker 1 It's like sending a fax: yeah, we're going to be dropping off at this time.

Speaker 2 But it will soon become extremely normal to see cars going around with no one in them.

Speaker 2 Just before we lose you, I want to ask if you saw the Bill Gates memo that he put out. A lot of people are talking about this memo.

Speaker 2 Billy G is not my lover.

Speaker 2 Oh, man.

Speaker 2 Did climate change become woke, and is it over? Like, what happened with Billy G? I mean,

Speaker 2 you know,

Speaker 2 great question. Great question.
Yeah.

Speaker 2 You know,

Speaker 2 you'd think that someone like Bill Gates, who clearly started a tech, you know, started a technology company that's one of the biggest companies in the world, Microsoft,

Speaker 2 you'd think he'd be really quite

Speaker 2 strong in the sciences.

Speaker 2 But actually, in my direct conversations with him, at least,

Speaker 2 he is not strong in the sciences.

Speaker 2 Yeah, this is really surprising. You know, like he came to visit me at the Tesla Gigafactory in Austin and was telling me that it's impossible to have a long-range semi truck.

Speaker 2 And I was like, well, but we literally have them.

Speaker 2 And you can drive them. And Pepsi is literally using them right now.
You can drive them yourself or send someone. Obviously, Billy G is not going to drive it himself, but you can send a trusted

Speaker 2 person to drive the truck and verify that it can do the things that we say it's doing. And he's like, no, no, it doesn't work.
It doesn't work. And I'm like,

Speaker 2 okay, I'm kind of stuck there.

Speaker 2 Then I was like, well, it must be that

Speaker 2 you disagree with the watt-hours per kilogram of the battery pack.

Speaker 2 You must think that perhaps we can't achieve the energy density of the battery pack, or that the watt-hours per mile of the truck is too high,

Speaker 2 and that when you combine those two numbers, the range is low. And so, which one of those numbers do you think we have wrong? And what numbers do you think are correct?

Speaker 2 And he didn't know any of the numbers.

Speaker 2 And I'm like, well, then doesn't it seem that it's perhaps

Speaker 2 premature to conclude that a long-range semi cannot work if you do not know the energy density of the battery pack or the energy efficiency of the truck chassis?
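
The argument here reduces to two numbers: the pack's watt-hours per kilogram (times pack mass) and the truck's watt-hours per mile. A minimal sketch of that arithmetic, with illustrative figures that are assumptions rather than Tesla specs:

```python
# Back-of-envelope check of the semi-truck range argument.
# All numbers below are illustrative assumptions, not Tesla specs.

def semi_range_miles(pack_wh_per_kg: float, pack_mass_kg: float,
                     truck_wh_per_mile: float) -> float:
    """Range = total pack energy divided by energy consumed per mile."""
    pack_energy_wh = pack_wh_per_kg * pack_mass_kg
    return pack_energy_wh / truck_wh_per_mile

# e.g. a 170 Wh/kg pack massing 5,000 kg in a truck using 1,700 Wh/mile
print(semi_range_miles(170, 5_000, 1_700))  # 500.0 miles
```

Disputing feasibility means disputing one of those two inputs, which is the point being made.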

Speaker 2 Hmm.

Speaker 1 But yeah, he's now done a 180 on climate.

Speaker 2 He's saying maybe this shouldn't be the topic.

Speaker 2 Climate is gay.

Speaker 2 Why would he say climate is gay? That's wrong. It's totally retarded.

Speaker 2 Bill Gay said that climate is gay and retarded. Come on.

Speaker 2 Maybe he's got some data centers he's got to put up.

Speaker 1 Does he have to stand up a data center for Sam Altman or something? I don't know.

Speaker 2 What is Azure?

Speaker 2 I don't know.

Speaker 2 He changed his position.

Speaker 1 I can't figure out why.

Speaker 2 I mean,

Speaker 2 you know, I mean, the reality of the whole climate change thing is that

Speaker 2 you've just had sort of people who say it doesn't exist at all, and then people who are super alarmist, saying, you know, we're all going to be underwater in five years.

Speaker 2 And obviously, neither of those two positions are true.

Speaker 2 The reality is you can measure the carbon concentration in the atmosphere. Again, you could just literally buy a CO2 monitor from Amazon.
It's like 50 bucks. And you can measure it yourself.

Speaker 2 And

Speaker 2 you can say, okay, well, look,

Speaker 2 the parts per million of CO2 in the atmosphere has been increasing steadily at two to three ppm per year.

Speaker 2 At some point, if you continue to take billions, eventually trillions of tons of carbon from deep underground and transfer it to the atmosphere and oceans, so you transfer it from deep underground into the surface cycle, you will change the chemical constitution of the atmosphere and oceans.

Speaker 2 You just literally will.

Speaker 2 Then the question is only to what degree and over what time scale.

Speaker 2 And the reality, in my opinion, is that we've got at least 50 years before it's a serious issue.

Speaker 2 I don't think we've got 500 years,

Speaker 2 but we've probably got 50.

Speaker 2 It's not five years.

Speaker 2 So if you're trying to get to the right order of magnitude of accuracy, I'd say the concern level for climate change is on the order of 50 years. It's definitely not five.

Speaker 2 And I think it probably isn't 500.
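
Taking the quoted growth rate at face value, the time-scale framing can be sanity-checked in a couple of lines. The ~420 ppm starting point is an approximate present-day figure I'm supplying, not a number from the conversation:

```python
# Project atmospheric CO2 forward at the quoted growth rate.
# current_ppm is an approximate present-day value (my assumption);
# the 2-3 ppm/year rate is the figure quoted above.
current_ppm = 420
rate_ppm_per_year = 2.5  # midpoint of "two to three per year"

for years in (5, 50, 500):
    projected = current_ppm + rate_ppm_per_year * years
    print(f"{years} years: {projected} ppm")
```

Over 5 years the change is barely a few percent; over 50 years it's a substantial shift, which is the order-of-magnitude claim being made.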

Speaker 2 So really, the right course of action is actually just the reasonable course of action, which is to lean in the direction of sustainable energy

Speaker 2 and lean in the direction of solar

Speaker 2 and

Speaker 2 of a sort of a solar battery future and

Speaker 2 generally have the rules of the system

Speaker 2 lean in that direction.

Speaker 2 I don't think we need massive subsidies, but then we also shouldn't have massive subsidies for the oil and gas industry.

Speaker 2 Okay.

Speaker 2 So the oil and gas industry has massive tax write-offs that they don't even think of as subsidies

Speaker 2 because these things have been in place for, in some cases,

Speaker 2 80 years.

Speaker 2 But they're not there for other industries. So when you've got special tax conditions that are in one industry and not another industry, I call that a subsidy.
Obviously, it is.

Speaker 2 But they've taken it for granted for so long in oil and gas that they don't think of it as a subsidy.

Speaker 2 So the right course of action, of course, is to remove, in my opinion, to remove subsidies from all industries.

Speaker 2 But the political reality is that the oil and gas industry is very strong in the Republican Party, but not in the Democratic Party.

Speaker 2 So you will not see obviously even the tiniest subsidy being removed from the oil, gas, and coal industry. In fact, there were some that were added to the oil, gas, and coal industry

Speaker 2 in the sort of big bill.

Speaker 2 And

Speaker 2 there were

Speaker 2 a massive number of

Speaker 2 sustainable energy incentives that were removed, some of which I agreed with, by the way.

Speaker 2 Some of the incentives have gone too far.

Speaker 2 But

Speaker 2 anyway,

Speaker 2 the actual, I think,

Speaker 2 the correct scientific conclusion, in my opinion,

Speaker 2 and I think we can back this up with solid reasoning. You can ask Grok, for example,

Speaker 2 is that we should

Speaker 2 we should lean in the direction of moving towards a sustainable energy future.

Speaker 2 We will eventually run out of oil, gas, and coal to burn anyway, because

Speaker 2 there's a finite amount of that stuff.

Speaker 2 And we will eventually have to go to something that lasts a long time that is sustainable.

Speaker 2 But to your point about the irony of things, it seems to be the case that making energy with solar is cheaper than making energy with some of these carbon-based sources today.

Speaker 2 And so the irony is it's already working. I mean, the market is moving in that direction.

Speaker 2 And this notion that we need to kind of force everyone into a model of behavior, it's just naturally going to change because we've got better systems.

Speaker 2 You know, you and others have engineered better systems that make these alternatives cheaper, and therefore they're winning, like they're actually winning in the market, which is great.

Speaker 2 But

Speaker 2 they can't win if there are subsidies to support the old systems, obviously. Yeah, I mean,

Speaker 2 by the way, there are actually massive disincentives for solar because

Speaker 2 China is a massive producer of solar panels.

Speaker 2 China does an incredible job of solar

Speaker 2 solar panel manufacturing. Really incredible.

Speaker 2 They have

Speaker 2 roughly one and a half terawatts of solar production right now.

Speaker 2 And they're only using a terawatt per year. But by the way, that's a gigantic number.

Speaker 2 The average US power consumption is only half a terawatt.

Speaker 2 So just think about that for a second.

Speaker 2 China's maximum solar panel production capacity is one and a half terawatts per year.

Speaker 2 U.S. steady-state power usage is half a terawatt.
Now,

Speaker 2 you do have to derate that. To make use of one and a half terawatts a year of solar, you need to pair it with batteries, taking into account

Speaker 2 the difference between night and day, the fact that the solar panel is not always pointed directly at the sun, that kind of thing. So you can divide by five-ish.

Speaker 2 But that still means that China has the ability to produce solar panels with a steady-state output that is roughly two-thirds that of the entire U.S.

Speaker 2 economy from all sources, which means that just with solar alone, China can,

Speaker 2 in 18 months, produce enough solar panels to power the entirety of the United States.
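
That chain of round numbers checks out arithmetically. A sketch using only the figures quoted in the conversation:

```python
# China-solar arithmetic, using the round numbers quoted above.
panel_output_tw_per_year = 1.5  # peak panel capacity China can manufacture yearly
derate_factor = 5               # night/day, sun angle, etc. ("divide by five-ish")
us_average_draw_tw = 0.5        # steady-state US power usage

steady_tw_per_year = panel_output_tw_per_year / derate_factor  # ~0.3 TW
fraction_of_us = steady_tw_per_year / us_average_draw_tw       # ~0.6, "two-thirds"
months_to_cover_us = us_average_draw_tw / steady_tw_per_year * 12  # ~20 months
print(fraction_of_us, months_to_cover_us)
```

The exact result is 20 months rather than 18; "18 months" is the conversational rounding of the same calculation.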

Speaker 4 What do you think about near-field solar, aka nuclear?

Speaker 2 I'm in favor of, look, make energy from any way

Speaker 2 you want.

Speaker 2 That isn't obviously harmful to the environment.

Speaker 2 Generally, people don't welcome a nuclear reactor in their backyard.

Speaker 2 They're not like championing it. Put it here.
Put it under my bed.

Speaker 2 Put it on my roof.

Speaker 2 If your next-door neighbor said, hey, I'm selling my house and they're putting a reactor there,

Speaker 2 the typical homeowner response would be negative.

Speaker 2 Very few people will embrace a nuclear reactor

Speaker 2 adjacent to their house.

Speaker 2 But nonetheless,

Speaker 2 I do think nuclear is actually very safe.

Speaker 2 There's a lot of

Speaker 2 scaremongering and propaganda around fission, assuming you're talking about fission.

Speaker 2 But fission is actually very safe. The U.S. Navy has it on

Speaker 2 submarines and aircraft carriers, with people walking right by the reactor.
I mean, a submarine is a pretty crowded place, and they have a nuclear-powered submarine. So

Speaker 2 I think fission is fine

Speaker 2 as an option.

Speaker 2 The regulatory environment makes it very difficult to actually get that done.

Speaker 2 And then it is important to appreciate just the sheer magnitude of the power of the sun.

Speaker 2 So this is, here are some just important

Speaker 2 basic facts.

Speaker 2 Even Wikipedia has these facts, right?

Speaker 2 You don't even have to go looking for the best answer. What I'm saying is, even Wikipedia's got these facts right.

Speaker 2 The Sun is about 99.8% of the mass of the solar system.

Speaker 2 Then Jupiter is about 0.1%.

Speaker 2 And everything else is in the remaining 0.1%. And we are much less than 0.1%.

Speaker 2 So

Speaker 2 if you burnt all of the mass of the solar system,

Speaker 2 okay,

Speaker 2 then the total energy produced by the sun would still round up to 100%.

Speaker 2 Like if you just burnt Earth,

Speaker 2 the whole planet, and burnt Jupiter, which is very big and quite challenging to burn,

Speaker 2 if you turned Jupiter into a thermonuclear reactor, it wouldn't matter compared to the sun.
The sun is 99.8% of the mass of the solar system, and everything else is in the miscellaneous category. So

Speaker 2 basically, no matter what you do, total energy produced in our solar system rounds up to 100% from the sun.

Speaker 2 You could even throw another Jupiter in there.

Speaker 2 Say we snag a Jupiter from somewhere else

Speaker 2 and somehow teleport two more Jupiters into our solar system and burn them, and the sun would still round up to 100%.

Speaker 2 You know, so as long as you're at 99.6 percent, you're still rounding up to 100.

Speaker 2 Maybe that gives some perspective on why solar is really the thing that matters. And as soon as you start thinking about things at sort of a grander scale, like Kardashev scale 2 civilizations, it becomes very, very obvious. I'm not saying anything that's new, by the way.

Speaker 2 anyone who studies physics has known this for

Speaker 2 a very long time. In fact, Kardashev, I think, was a Russian physicist who came up with this idea in the 60s,

Speaker 2 just as a way to classify civilizations.

Speaker 2 Kardashev scale one would be,

Speaker 2 you've harnessed most of the energy of the planet. Kardashev scale two, you've harnessed most of the energy of your sun.
Kardashev scale three, you've harnessed most of the energy of a galaxy.

Speaker 2 Now, we're only about, I don't know, 1% or a few percent of Kardashev scale one right now,

Speaker 2 optimistically.

Speaker 2 So,

Speaker 2 but as soon as you go to Kardashev scale two, where you're talking about the power of the sun, then you're really just saying

Speaker 2 everything is solar power

Speaker 2 and

Speaker 2 the rest is in the noise.

Speaker 2 And

Speaker 2 yeah, so like the

Speaker 2 you know, like the sun produces about a billion times, or

Speaker 2 call it well over a billion times more energy than everything on Earth combined.

Speaker 4 It's crazy.

Speaker 1 It's mind-blowing.

Speaker 2 Right.

Speaker 2 Yeah.

Speaker 1 Yeah. Solar is the obvious solution to all this.
And yeah, I mean, short term.

Speaker 1 You'll have to use some of these other sources, but hey, there it is. An hour and a half from

Speaker 2 star-powered. Like, maybe we've got a branding issue here.
Yeah, star-powered. Instead of solar-powered, it's starlight.

Speaker 1 Yeah, starlight.

Speaker 2 Perfect. It's the power of a blazing sun.

Speaker 2 How much energy does an entire star have?

Speaker 2 Yeah. Well, the sun is a star.

Speaker 1 More than enough.

Speaker 2 All right.

Speaker 2 And also, you really need to keep the power local.

Speaker 2 Honestly, I've had these discussions so many times, where people say, well, would you beam the power back to Earth? I'm like, do you want to melt Earth?

Speaker 2 Because you would melt Earth if you did that.

Speaker 2 We'd be vaporized in an instant.

Speaker 2 So you really need to keep the power local.

Speaker 2 You know, basically distributed power.

Speaker 2 And I guess most of it we use for intelligence. So it's like the future is like a whole bunch of solar-powered AI satellites.
But Elon,

Speaker 2 the only thing that makes the star work is it just happens to have a lot of mass. So it has that gravity to ignite the fusion, to ignite the fusion reaction, right?

Speaker 2 But like we could ignite the fusion reaction on Earth now. I don't know if your view has changed.

Speaker 2 I think we talked about this a couple of years ago where you were pretty like, we don't know if or when fusion becomes real here, but theoretically, we could take like 10. No, I want to be clear.

Speaker 2 My opinion on,

Speaker 2 so

Speaker 2 Yeah, I studied physics in college.

Speaker 2 At one point in high school, I was thinking about a career in physics. One of my sons actually

Speaker 2 is doing a career in physics. But the problem is I came to the conclusion that I'd be waiting for a collider

Speaker 2 or a telescope, so I didn't

Speaker 2 end up pursuing that career in physics, but I have a strong interest in the subject.

Speaker 2 So my opinion on, say, creating a fusion reactor on Earth is I think this is actually not a hard problem. Actually, it's a little hard.
I mean, it's not like totally trivial.

Speaker 2 But if you just scale up a Tokamak,

Speaker 2 the bigger you make it, the easier the problem gets. So

Speaker 2 you've got a surface to volume ratio thing where

Speaker 2 you're trying to maintain a really hot core while having a wall that doesn't melt.

Speaker 2 So

Speaker 2 that's a similar problem with rocket engines. You've got a super hot core in the rocket engine, but you don't want the walls, the chamber walls of the rocket engine to melt.

Speaker 2 So you have a temperature gradient where it's very hot in the middle and gradually gets cold enough as you get to the perimeter, as you get to the

Speaker 2 chamber walls in the rocket engine, where it doesn't melt

Speaker 2 because you've lowered the temperature

Speaker 2 and you've got a temperature gradient. So

Speaker 2 if you just scale up

Speaker 2 the donut reactor, Tokamak,

Speaker 2 and

Speaker 2 improve your surface to volume ratio, it becomes much easier. And you can absolutely, in my opinion,

Speaker 2 I think just anyone who looks at the math,

Speaker 2 you can make

Speaker 2 a reactor that generates more energy than it consumes. And the bigger you make it, the easier it is.

Speaker 2 And in the limit, you just have a giant gravitationally contained thermonuclear reactor like the sun.

Speaker 2 which requires no maintenance and it's free.
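
The surface-to-volume point generalizes: fusion power scales roughly with plasma volume (~r³) while wall heat load scales with surface area (~r²), so the ratio of wall to plasma falls as 1/r. A minimal sketch, using a sphere for simplicity rather than the actual torus geometry of a tokamak:

```python
import math

# Surface-to-volume ratio of a sphere of radius r:
# (4*pi*r^2) / ((4/3)*pi*r^3) = 3/r.
# Doubling the radius halves the wall area per unit of plasma volume,
# which is why the confinement problem eases as the reactor grows.
def surface_to_volume(r: float) -> float:
    return (4 * math.pi * r**2) / ((4 / 3) * math.pi * r**3)

for r in (1.0, 2.0, 4.0):
    print(r, surface_to_volume(r))  # ratio halves each time r doubles
```

The same 1/r scaling is the rocket-chamber analogy in the conversation: a hot core, a survivable wall, and more margin as the chamber grows.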

Speaker 2 So this is also why: why would we bother making a little itty-bitty sun, so microscopic you'd barely notice it, on Earth, when we've got the giant free one in the sky?

Speaker 2 Yeah, but we only get a fraction of one percent of that energy on planet Earth. We have to go...

Speaker 2 Yeah, right. So we've got to figure out how to wrap the sun if we're going to harness that energy. That's our long...

Speaker 2 If people want to have fun with reactors, you know, that's fine. Have fun with reactors.

Speaker 2 But it's not a serious endeavor compared to the sun.

Speaker 2 You know,

Speaker 2 it's sort of a fun science project to make a

Speaker 2 nuclear reactor, but

Speaker 2 it's just peanuts compared to the sun.

Speaker 2 And even the solar energy that does reach Earth

Speaker 2 is a gigawatt per square kilometer, or roughly, you know, call it two and a half gigawatts per square mile.

Speaker 2 So that's a lot, you know. And the commercially available panels are around 25, almost 26% efficiency. And then, if you pack them densely, you get an 80%

Speaker 2 packing density, which I think, you know, in a lot of places you could get. You effectively have about,

Speaker 2 you know,

Speaker 2 200 megawatts per square kilometer.

Speaker 2 And

Speaker 2 you need to pair that with batteries so you have continuous power.

Speaker 2 Although our power usage drops considerably at night, so you need fewer batteries than you think.

Speaker 2 And doesn't the question then become...

Speaker 2 Maybe an easy number to remember: a gigawatt-hour per square kilometer per day is a roughly correct number.
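
Those round numbers compose into the "easy number" as follows. The five equivalent full-sun hours per day is my assumption (consistent with the divide-by-five derate mentioned earlier), not a figure from the conversation:

```python
# Solar land-use arithmetic from the figures above.
flux_gw_per_km2 = 1.0      # ~1 gigawatt of sunlight per square kilometer, peak
panel_efficiency = 0.25    # "around 25, almost 26%"
packing_density = 0.80     # fraction of ground actually covered by panels

peak_mw_per_km2 = flux_gw_per_km2 * panel_efficiency * packing_density * 1000
# -> about 200 MW per square kilometer, the number quoted above

full_sun_hours_per_day = 5  # assumed equivalent full-sun hours per day
gwh_per_km2_per_day = peak_mw_per_km2 * full_sun_hours_per_day / 1000
# -> about 1 GWh per square kilometer per day, the "easy number"
print(peak_mw_per_km2, gwh_per_km2_per_day)
```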

Speaker 2 But then doesn't your technical challenge become the scalability of manufacturing of those systems?

Speaker 2 So, you know, accessing the raw materials and getting them out of the ground of planet Earth to make them, to make enough of them to get to that sort of scale and that volume that you're talking about.

Speaker 2 And as you kind of think about what it would take to get to that scale, like, do we have an ability to do that with what we have today? Like, can we pull that much material out of the ground?

Speaker 2 Yes. Solar panels are made of silicon, which is sand, essentially.

Speaker 2 And

Speaker 2 I guess more on the battery side, but. On the battery side, yeah.
So on the battery side,

Speaker 2 you know, the iron phosphate lithium-ion battery cells. I'd like to throw out some interesting factoids here.

Speaker 2 Most people don't know. If you asked,

Speaker 2 as measured by mass, what is the biggest element?

Speaker 2 What is Earth made of

Speaker 2 as measured by mass?

Speaker 2 Actually,

Speaker 2 it's iron.

Speaker 2 Iron. Yeah, we're, I think, 32% iron, 30% oxygen,

Speaker 2 and then everything else is in the remaining percentage. So

Speaker 2 we're basically

Speaker 2 a rusty ball bearing.

Speaker 2 That's Earth.

Speaker 2 And with a lot of silicon at the surface in the form of sand.

Speaker 2 And the iron phosphate, so iron phosphate lithium-ion cells: iron is extremely common, the most common element on Earth by mass, and abundant even in the crust.

Speaker 2 And then phosphorus is also very common.

Speaker 2 And then the anode is carbon, but also very common. And then lithium is also very common.
So

Speaker 2 there's actually, you can do the math. In fact, we did the math and published the math, but nobody looked at it.

Speaker 2 It's on the Tesla website

Speaker 2 that shows that you can completely power Earth with solar panels and batteries.

Speaker 2 And there's no shortage of anything.

Speaker 1 All right.

Speaker 2 So on that note,

Speaker 1 go get to work, Elon, and just power the Earth while you're getting implants into people's brains and satellites and other good fun stuff. Good to see you, buddy.

Speaker 2 Yeah, good to see you guys.
Thanks for joining us. You have the same link.
Stop by anytime.

Speaker 3 Thank you for coming today, and thank you for liberating free speech three years ago.

Speaker 2 You're welcome. Yeah,

Speaker 3 that was a very important milestone.

Speaker 2 And I see all you guys are in just different places. I guess this is a very virtual situation.

Speaker 2 Always been that. I'm at the ranch.

Speaker 2 Sacks is on the bottom. Are you ever in the same room?

Speaker 1 We try not to be.

Speaker 2 Only when we do,

Speaker 2 only when we do that, that summit, but otherwise we avoid each other in the same room. Yeah.

Speaker 2 Your summit is

Speaker 2 pretty fun.

Speaker 2 We had a great time recounting SNL sketches that didn't make it. Oh, God.
There's just so many good ones.

Speaker 2 I mean,

Speaker 1 we didn't even get to the Jeopardy ones.

Speaker 2 Yeah, yeah. No, those are so offensive.
Oh, wait.

Speaker 2 Well, I think we skipped a few that would have dramatically increased our probability of being killed. Let me take this one out.

Speaker 2 Boys, I love you. I love you.
I love you all. I'm going to poker.
Later. Take care.

Speaker 2 Bye-bye. Love you.

Speaker 2 We'll let your winners ride.

Speaker 1 Rain Man David Sacks.

Speaker 1 And instead, we open-sourced it to the fans, and they've just gone crazy with it. Love you besties.
I'm the Queen of Quinoa.

Speaker 1 Besties are God.

Speaker 1 That's my dog taking a notice in your driveway.

Speaker 1 My avatar will meet me at the end. We should all just get a room and just have one big huge orgy because they're all just useless.

Speaker 1 It's like this like sexual tension that they just need to release somehow.

Speaker 2 We need to get merch. Besties are...

Speaker 2 I'm going all in.