#877 - Marc Andreessen - Elon Musk, The Changing World Order & America’s Future

1h 29m
Marc Andreessen is a venture capitalist, entrepreneur, and co-founder of Andreessen Horowitz.
America is entering a new chapter, with the recent election in the past, sweeping changes on the horizon, and a lot of uncertainty. Just how optimistic should we be about the upheaval the world is about to face?
Expect to learn how we ended up on our current timeline, just how big the civil war is within the Democratic party right now, Marc’s thoughts on the motives and response to the Brian Thompson killing, how much government efficiency can be improved upon, Elon Musk's productivity secrets, how much of a political revolution is happening in Silicon Valley and much more...
Sponsors:
See discounts for all the products I use and recommend: https://chriswillx.com/deals
Get the best bloodwork analysis in America and bypass Function’s 400,000-person waitlist at https://functionhealth.com/modernwisdom
Get a 20% discount on Nomatic’s amazing luggage at https://nomatic.com/modernwisdom
Get a $150 discount on Plunge’s amazing sauna or cold plunge at https://plunge.com (use code MW150)
Extra Stuff:
Get my free reading list of 100 books to read before you die: https://chriswillx.com/books
Try my productivity energy drink Neutonic: https://neutonic.com/modernwisdom
Episodes You Might Enjoy:
#577 - David Goggins - This Is How To Master Your Life: https://tinyurl.com/43hv6y59
#712 - Dr Jordan Peterson - How To Destroy Your Negative Beliefs: https://tinyurl.com/2rtz7avf
#700 - Dr Andrew Huberman - The Secret Tools To Hack Your Brain: https://tinyurl.com/3ccn5vkp
-
Get In Touch:
Instagram: https://www.instagram.com/chriswillx
Twitter: https://www.twitter.com/chriswillx
YouTube: https://www.youtube.com/modernwisdompodcast
Email: https://chriswillx.com/contact
-
Learn more about your ad choices. Visit megaphone.fm/adchoices


Runtime: 1h 29m

Transcript

Speaker 1 How much of a new timeline are we on right now?

Speaker 2 We are on a very, very different new timeline. So, yeah, I think the timeline split twice.

Speaker 2 It split once in the second week of July.

Speaker 2 And then it split again on November 6th. And I don't know.
You tell me.

Speaker 2 I mean, can you feel it?

Speaker 1 The sense in the air, the ambience, has certainly taken a hard pivot.

Speaker 1 Lots of people that were happy are now unhappy, and lots of people that were unhappy are now fucking ecstatic.

Speaker 2 So, I think I've actually detected something interesting, and maybe it's just my world or maybe business, but I think it's broader, which is I actually think a fairly large number of people who didn't vote for Trump, people who run organizations who did not vote for Trump, are feeling liberated.

Speaker 2 They're feeling like they can make changes that they have been wanting to make for a long time and they can really dial down a lot of the things that have really been causing the problems.

Speaker 2 So, the blast radius of the good vibes is wider than people anticipated. I've been in some discussions where people are like, yeah, it really feels like the tension is draining out of the system in an interesting way this time, which is of course the exact opposite of how it felt in 2016.

Speaker 2 And so, I don't know, I'm cautiously optimistic. Look, I don't think there's any overnight transformation, and there's going to be continued drama and strife and so forth. But I think a lot of leaders at a lot of institutions, including ones that are left-leaning, have simply had it with a lot of the chaos of the last 10 years and a lot of the drama and the pressure and conflict.

Speaker 2 And I think they're just done with it. And they want their, you know, if it's a company, they want it to get back to business.
If it's a university, they want it to get back to teaching.

Speaker 2 So I'm, you know, knock on wood, cautiously optimistic.

Speaker 1 This pivot from

Speaker 1 saying good or appearing good to, hang on, but what actually happened? It's stress testing the outcomes and people optimizing for actual effectiveness as opposed to like

Speaker 2 optical

Speaker 1 slickness, I guess, or popularity, reputational stuff.

Speaker 1 I think that pivot seems to be

Speaker 1 one that was quite overdue.

Speaker 2 Yeah, and part of it is there were some ideas that had to be stress tested, right, apparently. And I think they were stress tested and maybe found wanting, maybe backfired.

Speaker 2 And then there's this thing in philosophy they call the paradox of tolerance, which is that in order to maximize tolerance, we must not tolerate the intolerant, right? You can only have tolerance if you don't tolerate the intolerant.

Speaker 2 And so, therefore, if you want to maximize tolerance, which is what we've all been told that we need to do, you need to ostracize, cancel, nuke, and get rid of the intolerant.

Speaker 2 And of course, the definition of the intolerant very rapidly goes from the small set of people who are truly antisocial to basically everybody who doesn't agree with every single thing on the thousand-item checklist of what makes a good person.

Speaker 2 Right.

Speaker 2 And so, put it this way, it's hard to win elections when your electoral strategy is to shrink your coalition as much as you possibly can by driving out as many people as possible, by tagging them as racist or sexist or intolerant.

Speaker 2 And like, it just doesn't work. It doesn't win elections.
It doesn't lead to a happy company. It doesn't lead to a well-functioning university.

Speaker 2 It doesn't lead to a well-functioning organization of any kind. And I think a lot of leaders have maybe finally figured that out.

Speaker 1 Yeah, I think any group that's bound together over the mutual distaste of an out-group, not the mutual love of an in-group,

Speaker 1 the only way that you can continue to bind that together is to scapegoat and shave off the people on the outside that are insufficiently pure, continue to point the finger at them and say, Well, at least we're not dot, dot, dot, that thing over there.

Speaker 1 But yeah, I mean, it's the reason I've had a bunch of conversations with people on the left, Ana Kasparian, Cenk Uygur, Dave Smith, you know, people with varied political standings. Ana made the point that you can only move from left to right, you can't go from right to left. So it's a one-way valve.
You know, it's a one-way street. And guess what happens?

Speaker 1 The street that allows you to move in one direction is going to be more welcoming and better populated.

Speaker 2 Yeah. And you might not know this, either because of age or nationality, but this happened before, actually, in my lifetime. The late 60s and the 1970s were in many ways a lot like what we just went through in the last decade.

Speaker 2 The version of Trump at that time was Nixon, and the version of things like the Iraq war was the Vietnam War, and so forth. And so there were energy crises, inflation, the hostage crisis, all these crazy comparisons.

Speaker 2 And what happened was, Jimmy Carter lost in 1980, and then the Democrats didn't win another national election until 1992. It actually took 12 years for the Democratic Party to get back on track. And really what had to happen was the party, and the movement behind it, had to go back to the center. And to your point, it had to become a party of actual inclusivity, of actual tolerance.

Speaker 2 You know, sort of the Democratic Party had to reconstruct its own big tent, precisely for the reason you said, which is you have to be able to attract people back.

Speaker 2 Like you have to get people to be able to come back and feel like they're part of a broad-based coalition and not just part of a very narrow thing. Now, at that time, it took 12 years, right?

Speaker 2 So, you know, they lost in 80, they lost in 84, they lost in 88.

Speaker 2 And then Bill Clinton, and a guy named Al From, who ran this thing called the Democratic Leadership Council, and Al Gore kind of put together this program to bring the Democrats back to the center. And that led to a boom for the Democratic Party, with a much more sensible, centrist set of policies and a much more open-minded and optimistic attitude. By the way: pro-business, pro-patriotism, pro-treating everybody well.

Speaker 2 And so I am cautiously optimistic that the current Democratic Party will be able to find its way back in less than 12 years.

Speaker 2 And I think that would be good for the country, right? I mean, even if you're a Republican, you should want there to be a healthy, vibrant, viable, centrist, sensible, responsible, stable configuration of an opposition party, as opposed to what we've seen recently.

Speaker 2 And so there's a civil war already kind of playing out in the Democratic Party, and there are all these different arguments going back and forth already. But you're starting to see voices like Ritchie Torres and Ro Khanna and others standing up and saying it's kind of time to go back to the center.

Speaker 2 And again, I'm cautiously optimistic that that will happen.

Speaker 1 How big of a civil war do you think is unfolding at the moment?

Speaker 2 So it's really big. So it's really quite something.
So a couple of things.

Speaker 2 So one is there was just a political story that was really good that talked about the first big kind of summit meeting that the Democrats have had, which was the DNC hosted a meeting of the state Democratic Party leaders.

Speaker 2 And it started with a full land acknowledgement, right? Step one. And then the current chairman of the party gave a stem-winding speech about how the party absolutely needs to double down on identity politics. Step two. And then, with this horrific tragedy in New York, this CEO getting shot and killed, there are now loud, prominent voices on the left, including in the press, saying basically, yay, murder. Which is, maybe, maybe not, step three.

Speaker 2 These are not vectoring in the direction of, let's just say, a successful majority electoral path.

Speaker 2 But again, the contrast is clear. That is a conceivable path, and there are some people who want to go on that path. I think there is a much larger number of completely sensible people in the party, and interested in the party, and interested in rejoining the party, who would like to see the exact opposite.

Speaker 2 But yeah, they are going to argue this and litigate this, I think, all the way out. They really feel like they have to. And you mentioned the folks you've talked to; there are now some voices, including some folks that are pretty far left, who are kind of saying, all right, let's pause for a moment and make sure we're not going off the cliff.

Speaker 1 You work with a lot of founders and CEOs, and tons of your friends will be in that position too. What do you make of the response to the Brian Thompson killing and the situation more broadly?

Speaker 2 Well, the killing itself was obviously an enormous shock. And I know people who knew him and had been with him recently.

Speaker 2 He was a well-known figure in the healthcare industry, and very respected in business circles. So yeah, an enormous shock. And then, of course, at first you're just like, well, these things are either random, or, you know, most murders are committed by somebody in the victim's personal life.

Speaker 2 The way this one seems to be unfolding, and I don't know anything that's not in the press, is that it seems to have an ideological motivation behind it. And so then it's like, okay, is that just a one-off, or is this the beginning of a pattern?

Speaker 2 And I would say a lot of people I talk to are very disconcerted by the enthusiasm that a lot of people in the press are showing. There have been a shockingly large number of stories along the lines of, of course, murder is bad, but... and we shouldn't laugh, I shouldn't laugh. Well, I mean, you have to laugh or you cry, right? And then it's like 3,000 words that come after the "but", right? Which is basically a long, long-winded justification for murder.

Speaker 2 And so that is disconcerting. There are some comedians now that have gotten in on the game.

Speaker 2 And so, you know, I would say this is fairly disconcerting. I mean, we'll see what happens.

Speaker 2 The scary scenario: I mentioned the 70s. Domestic terror actually got to be quite a thing in the 70s in the U.S. There's this great book called Days of Rage that chronicles this very widespread pattern of ideologically motivated domestic terror in the 70s by many kind of radical revolutionary groups, the Weather Underground and many others.

Speaker 2 And what a lot of those domestic terrorists had in common was very privileged backgrounds. A lot of these were Ivy League; the Weather Underground came out of Columbia University, and there were domestic terrorists coming out of places like UC Berkeley. Often very privileged, kind of elite upper-middle-class people, who in some cases ended up going on the lam for years, being hunted by the FBI.

Speaker 2 And there were, you know, there were years in the 1970s when there were thousands of terror bombings a year.

Speaker 2 And then there was a run of anti-corporate murder and terrorism in Germany through that period into the 80s, with this thing called the Baader-Meinhof Group and so forth.

Speaker 2 And so there's always been this kind of violent edge to, call it, the anti-corporate, anti-business, anti-industrial movement. And by the way, some of that's on the left, some of that's on the right; you can squint different ways. Ted Kaczynski was on the right. This guy appears to be maybe a little bit more on the left.

Speaker 2 And so, you know, this pattern has unfolded before.

Speaker 2 I don't know that it's ever been effective at causing any of the change that its proponents seem to want. These violent tactics usually backfire in our societies. But I certainly hope that this hasn't become a copycat situation.

Speaker 2 And I do think it's striking. I think there are a lot of voices in American public life who are behaving quite irresponsibly right now.

Speaker 1 What do they think they're doing? That sometimes people who run healthcare organizations don't pay out when people should be paid out, and therefore it's okay to kill the person at the top?

Speaker 2 Yeah, that's the thing. It's "of course, murder is bad, but...", and then the 3,000 words are basically what you said.

Speaker 2 The whole healthcare situation is obviously very complicated and very emotional.

Speaker 2 This can become a very long conversation, but there are two key facts that all Western countries are trying to grapple with, and this is true of the US but also Canada and the UK and the rest of Europe. Key fact number one is that healthcare is at like a fifth of US GDP in terms of spending. So a fifth of all national production per year goes to healthcare.

Speaker 2 And that number is rising. If you just chart it on a graph, it's very clear what's happening: healthcare, left unchecked, is going to go from being a fifth of the economy to a fourth to a third to a half, and then someday in the future, effectively all of it. And nobody wants to pay for it, right? Everybody wants somebody else to pay for it.

Speaker 2 And then just saying the government should pay for it, of course, doesn't answer the question because the government has to be funded by somebody. And so that means taxpayers have to pay for it.

Speaker 2 And so you get in all these questions. And of course, we live in a progressive taxation system in which people who make more money pay a higher percentage of taxes.

Speaker 2 And there's big political fights over that. I'm kind of neutral on that; I don't really care, it's not one of my issues. But it's part of this big fight over who pays for it. So that's issue number one.

Speaker 2 And that's going to bring out a lot of emotion in people, because especially as you get closer to end of life, or you get into these very bad chronic conditions that require a lot of money to take care of people, it's this real challenge of how to pay for it.

Speaker 2 And then the other key fact is that all these countries are also aging very rapidly, right? The demographics are headed in the direction of increased average age.

Speaker 2 And the consequence of that is a sharp reduction in the percentage of working-age people versus older people, retirees. And the whole basis for every social welfare system in the world is that the current workers pay for the current retirees. That is the basis. The Social Security trust fund doesn't really exist; the money that you pay in, you don't get later. The money you pay in today gets paid out to old people today, and then when you're old, there need to be young workers in the system to generate the money to pay you. And that's true for Social Security. It's true for Medicare. It's true for Medicaid. It's true of any socialized health system; it'd be true of government-funded health systems, the same thing, just via taxpayers. And taxpayers are people who work.

Speaker 2 And so if the demographics go upside down and you have a country, and you can see the future in places like Japan, where you just have way more old people than young people, then you have this fundamental mismatch, and you literally, quite simply, can't pay for it. And so these two things put this whole issue in a vice.
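[A minimal illustrative sketch, not from the conversation: the pay-as-you-go arithmetic described above, with made-up numbers purely for illustration. The per-worker burden is just the benefit divided by the number of workers per retiree, so it balloons as that ratio falls.]

    # Illustrative only: hypothetical figures, not real budget data.
    # Pay-as-you-go: current workers fund current retirees, so the per-worker
    # cost rises as the number of workers per retiree shrinks.
    def per_worker_cost(workers_per_retiree: float, annual_benefit: float) -> float:
        return annual_benefit / workers_per_retiree

    benefit = 20_000  # hypothetical average annual benefit per retiree
    for ratio in (5, 3, 2, 1.5):
        print(f"{ratio} workers per retiree -> ${per_worker_cost(ratio, benefit):,.0f} per worker per year")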

Speaker 2 Now, what we try to do in our line of work, which is tech and biotech and VC, is figure out ways to get out of that vice, primarily through new technologies that in some cases will provide new kinds of healthcare, new kinds of drugs, and new kinds of medical devices, but also things that are able to break that price curve and make healthcare much cheaper, right? That's the other way to solve it: you make all this stuff much cheaper.

Speaker 2 And that's where we get excited about things like AI in healthcare, right? Maybe the prospect that instead of people only having human doctors, a lot of routine medical interactions are with AI doctors, and you only consult with a human doctor when something gets really sensitive. I think there's a reasonable chance in the next few years that we can figure out how to do that. So hopefully we can solve this with technology.

Speaker 2 It is weird, I will say, and it's coming out in this current debate and controversy, that the same people who are the maddest about healthcare costs rising and not being able to pay for it are also the ones who hate technology the most. Right? And the same with nuclear: the people who care most about climate and carbon emissions hate nuclear energy the most, right? And it's like, all right, those of us in tech are genuinely trying to help solve these problems.

Speaker 2 What do you think that is?

Speaker 1 Is it just some sort of generalized skepticism, generalized scrutiny over rich people that they don't have the best interests at heart of the common folk?

Speaker 2 Yeah, well, part of it is just, let's say, moral intuitions, right? Moral intuitions presumably evolved to be the way they are according to the historical conditions of how people lived, and historically, people lived in these very small communities.

Speaker 2 There's this great book called The Ancient City that goes back and reconstructs basically prehistoric civilization, what it was like. And basically the conclusion of it is, go back 4,000, 5,000, 6,000 years and even further back, and we all lived in these very small tribes, maxed out at a few hundred people.

Speaker 2 And then you could describe the politics of the tribe as being sort of a hybrid of absolute fascism combined with absolute communism, right? At the same time. It was absolute fascism in that the father of the family or the leader of the tribe had total power of life or death over all the members of the tribe and could kill them at any moment for whatever reason he wanted to. And it was also pure communism in that there was no market economy; everybody shared everything. If you went out and killed an animal and brought it back and didn't share the food, they would kill you. You had to share the food. Everything had to be shared, because both of those forms of governance were basically how the tribe would survive. It needed total discipline and total sharing, right, at the same time.

Speaker 2 And so we evolved these moral intuitions that basically say: maybe we don't want the fascism part, but we kind of want everything to be shared. We want everything to be equal. Having inequality is sort of intrinsically morally offensive to us. And so we naturally drive towards that: who can possibly argue against it, wouldn't it be desirable if everybody had the same, isn't it by default unfair for some people to have more than others?

Speaker 2 And then, of course, profit particularly drives people crazy, right? Because if you have this moral intuition, profit in the capitalist system feels unfair; it feels like money that's been extracted that's not doing anything.

Speaker 2 There was, I won't name names, but there was a congressman this morning who tweeted two back-to-back tweets.

Speaker 2 And in the first he said, it's just completely absurd and outrageous that the American health insurance industry generates $1.4 trillion in profits per year. That's so clearly unfair, because presumptively that's $1.4 trillion being extracted while providing no value at all. And then like four hours later there was a second tweet, in which he said, actually, I've learned that it's $1.4 trillion in revenue, not profit, and total profits are a very small fraction of what I thought. But still, right? Equally absurd and unfair, right?

Speaker 1 Going back to sort of where we're at at the moment, I know that you've been heavily involved in a lot of meetings over the last month or so.

Speaker 1 Having been on every side of the fence, and above it and inside of it, how much governmental efficiency is there to improve on, in your opinion?

Speaker 2 Oh, absolutely enormous. And it's actually interesting that, historically, this is a bipartisan question. And I actually think that Vivek and Elon are looking at it in a very bipartisan way. You may have already seen there are already some Democrats signing up for the Doge Congressional Caucus that's coming up.

Speaker 2 Even Bernie Sanders, to his credit, came out and said he certainly agrees with Elon and Vivek specifically on the topic of military spending, which is, of course, one of the big areas.

Speaker 2 And I think that's significant. So historically it is a bipartisan topic. When I was younger, I mentioned the Clinton-Gore comeback in 92; this was actually a big theme of the Clinton-Gore campaign in 92. And Al Gore, who I've known over the years, actually had a whole program like the Doge at the time, called Rego, R-E-G-O, for Reinventing Government.

Speaker 2 And he said many of the same things that Elon and Vivek are saying today.

Speaker 2 And he made it famous at the time, you can find it on YouTube, this famous thing he did to visualize it: he went on the David Letterman show. The Department of Defense was buying these $600 shatterproof ashtrays, and he took them onto the Letterman show with a chisel and safety glasses and gave one a big whack. And of course, it fractures just like any other ashtray.

Speaker 2 So there was a whole program around that. They made a little bit of progress.

Speaker 2 So anyway, I think everybody effectively agrees. Who doesn't want tax money to be spent efficiently? Who doesn't want to get results? Who doesn't want to eliminate waste and fraud? Those are hard things to argue against.

Speaker 2 In terms of methods, the big change here is that in the past these efforts have been government officials trying to figure this out, or commissions, or whatever.

Speaker 1 Some may say the blind leading the blind.

Speaker 2 Some may say. Participants in the system attempting to reform the system from within, let's say.

Speaker 2 And what's starkly different here is to have people of the caliber of Elon and Vivek, who are tremendously accomplished entrepreneurs and founders, people who run businesses and who understand how to do things. Vivek is obviously very smart on all of this, but Elon is legendary for being able to think, as he says, from first principles on things like costs. If you read the books about him, he's always focused on that. And so he's going to bring a toolkit that has been very successfully deployed at literally some of the world's leading companies, like Tesla and SpaceX, and he's going to bring that to the government for the first time.

Speaker 1 How is Elon so productive? Have you ever deconstructed this?

Speaker 2 So I've actually known Elon for a long time. I didn't work with him for a very long time, just because we were working on different things. But I've been working with him quite closely for the last couple of years, starting with the X acquisition. And then we've also invested in xAI and then in SpaceX. So we now work on three companies together, to different degrees, and now the government stuff.

Speaker 2 Yeah, he has an operating method that he's developed that I would say is very unusual by modern standards. I'm not aware of another current CEO who operates the way that he does. And I think probably the single biggest question in all of business right now is: why don't more CEOs operate the way that he does? It's a complicated question we could talk about.

Speaker 2 But if you go back in history, you find characters more like him.

Speaker 2 And so especially like the industrialists of the late 1800s, early 1900s, you know, people like Henry Ford or Andrew Carnegie or Thomas Watson, who built IBM.

Speaker 2 If you go back and read the biographies of people like that, Andrew Mellon, Cornelius Vanderbilt, those guys ran things very similarly to the way Elon runs things. And the top-line thing is just this incredible devotion from the leader of the company to fully, deeply understand what the company does, to be completely knowledgeable about every aspect of it, to be in the trenches talking directly to the people who do the work, deeply understanding the issues, and to be the lead problem solver in the organization.

Speaker 2 And basically what Elon does is he shows up every week at each of his companies.

Speaker 2 He identifies the biggest problem that the company's having that week and he fixes it. Right.
And then he does that every week for 52 weeks in a row.

Speaker 2 And then each of his companies has solved its 52 biggest problems that year.

Speaker 2 And most other large companies are still having the planning meeting for the pre-planning meeting for the board meeting for the presentation, with the compliance review and the legal review.

Speaker 2 So it's this level of incredible intellectual capability coupled with incredible force of personality, moral authority, execution capability, and focus on fundamentals that is just really amazing to watch. And then, by the way, the side effect of it is he attracts many of the best people in the world to work with him.

Speaker 2 Because if you work with Elon, the expectations are through the roof in terms of your level of performance. He is going to know who you are, he is going to know what you've done, he's going to know what you've done this week, and he's going to know if you're underperforming. He may fire you in the meeting if you're not carrying your weight. But if you are as committed to the company as he is, and working hard, and capable, many people who have worked for him say that they had the best experience of their lives when they were working for him. And so there's this attraction thing, and that's why I think his companies compound the way they do: because of this, they just keep bringing in the best people.

Speaker 1 They're a black hole that sucks in the best talent. If you're the best in the world and you want to work harder than anybody else, come here.

Speaker 2 That's right. And people look at this from the outside and they're like, how can people tolerate all the criticisms, tolerate this guy, and so on.

Speaker 2 And inside the company, people are like, finally, I get to work for somebody who gets it.

Speaker 2 There's a famous line: somebody who used to work at one of the other defense and aerospace companies went to work at SpaceX and was asked what it was like. And he said, it's like being dropped into a shocking zone of competence.

Speaker 2 Right? It's just, to your point, everybody around me is so absolutely competent. And look, most of us never have that experience. Most people are never in an organization where the bar is held that high, and where, as a consequence, the competence level is so high and stays so high and even rises over time. And again, to his massive credit, he has been able to do that repeatedly.

Speaker 1 So detail-oriented, focusing on everything from A to Z in terms of the business, intimately familiar with it, and prepared to get his hands dirty, as in, I will get in there and actually look at solving the problem myself.

Speaker 1 That suggests to me, although this can't be the case, because you couldn't run this many companies if you were doing it this way, that there may be some challenges in learning when to delegate, when to hand off tasks. Because in order to get that level of resolution, to be able to see things with that much finitude, you also need to spend a lot of time on it. And then if you're solving the problems, how do you choose the problems that are yours to solve and the problems that aren't? So on and so forth. That must be a challenge, delineating between what is my problem to solve and what is somebody else's.

Speaker 2 Yeah, so I would say most leaders, most CEOs we work with, have exactly that problem for exactly the reasons you described. I think the Elon method is a little bit different. I don't know if you'd agree with this, but the way I think about it is he actually delegates almost everything. He's not involved in most of the things that his companies are doing. He's involved in the thing that is the biggest problem right now, until that thing is fixed, and then he doesn't have to be involved anymore, and he can go focus on the next thing that's the biggest problem for that company right now.

Speaker 2 So, for example, in manufacturing there's this concept of the bottleneck, right? In any manufacturing chain, there's always some bottleneck, always something that is keeping the line from running the way that it's supposed to. Sometimes the bottleneck is at the beginning of the process: we can't get enough raw material. Sometimes the bottleneck is at the end of the process: we don't have enough warehouses for the finished product. Or the bottleneck might be somewhere in the middle. If you run a manufacturing company, there's always a bottleneck, and whatever the bottleneck is, it's holding everything up. And job number one is to remove that bottleneck and get everything flowing.
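[A minimal illustrative sketch, not from the conversation: in throughput terms the line only runs as fast as its slowest stage, so finding the bottleneck is just finding the minimum-capacity stage. The stage names and numbers below are made up.]

    # Illustrative only: hypothetical stage capacities in units per hour.
    # Overall line throughput is capped by the slowest stage, the bottleneck.
    stages = {
        "raw material intake": 120,
        "stamping": 90,
        "assembly": 60,    # slowest stage: the bottleneck
        "paint": 100,
        "warehouse out": 150,
    }
    bottleneck = min(stages, key=stages.get)
    print(f"Bottleneck: {bottleneck}, line throughput capped at {stages[bottleneck]} units/hour")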

Speaker 2 And I think he basically has universalized that concept.

Speaker 2 And he basically looks at every company like it's some sort of conceptual assembly line, sometimes a literal assembly line making cars and rockets. And in any given week there's guaranteed to be a main bottleneck, one thing that's holding people back. And so the answer to your question, the resolution of the paradox, is: I'm going to micromanage the solution of that. I don't need to manage everything else, because everything else, by definition, is running better than that, right? So I can go focus on that. The other part of it that is so compelling, and this is where I think a lot of especially non-technical CEOs would really struggle to implement the method, is that when he identifies the bottleneck, he goes and talks to the line engineers who understand the technical nature of the bottleneck.

Speaker 2 And if that's people on a manufacturing line, he's talking to people directly on the line. Or if that's people in a software development group, he's talking to the people actually writing the code.

Speaker 2 So he's not asking the VP of engineering to ask the director of engineering to ask the manager to ask the individual contributor to write a report that's to be reviewed in three weeks.

Speaker 2 He doesn't do that; he would throw them all out of the window. There's just no way he would do that. What he does is he goes and personally finds the engineer who actually has the knowledge about the thing, and then he sits in the room with that engineer and fixes the problem with them. And again, this is why he inspires such incredible loyalty, especially from the technical people he works with. They're just like, wow, if I'm up against a problem I don't know how to solve, freaking Elon Musk...

Speaker 1 Call on the guy that owns the game.

Speaker 2 Freaking Elon Musk is going to show up in his Gulfstream and he's going to sit with me overnight in front of the keyboard or in front of the manufacturing line, and he's going to help me figure this out. Wow. Right? And this is the thing: okay, you're a normal CEO running a normal company. How can you possibly compete with that?

Speaker 1 What are the other reasons that CEOs of similar companies or comparable companies don't do that?

Speaker 1 Technical understanding, I get that, that you need to be able to get your hands dirty from a specificity perspective, but what else?

Speaker 2 Yeah, so that is a lot of it. And that's a topic that makes people really mad, because non-technical people really hate being told that they are not qualified to do something because they're not technical. But every now and then, the technical details actually do matter. So there's a whole domain there. Most people in government are lawyers, and most people at senior levels in business are MBAs. Most large companies are not run by engineers, right? They're run by trained business people, and increasingly also by lawyers.

Speaker 2 And so there is a real challenge there. And then I just think, more generally, it's the way that management is taught, most classically in the form of something like Harvard Business School or Stanford Business School. It's basically management as it was developed and implemented in the 1950s, 60s, and 70s, the so-called scientific school of management.

Speaker 2 And so it's basically management as a generic skill that you can apply to any industry. You can manage a soup company or you can manage whatever kind of company, and they're kind of all the same; it kind of doesn't matter what they do. And there's a common set of management practices, and a lot of it is process. How do you manage the balance sheet? How do you set the review schedule for the meetings? How do you do compliance? How do you hire and motivate executives? How do you resolve interpersonal conflicts? All these general business skills. And by the way, those general business skills are very useful in lots of contexts.

Speaker 2 It's just that that training gives you none of what you need to go do what Elon does. And Elon, I would say, pushes it as far as he can in not doing all the stuff that you're classically trained to do, so that he can spend all of his time doing the things that only he can do. Which, it turns out, has this incredible catalytic, multiplicative effect, where his companies are just incredibly amazing. Elon just said this thing at the most recent Tesla event that blew my mind. His companies are famous for not actually having marketing departments. Like, Tesla doesn't spend any money on sales and marketing.

Speaker 2 They don't sell the cars. Every other car company in the world is running all these ad campaigns all the time, TV commercials, newspaper flyers, the quarterly sales events, promotions, all these things. Tesla doesn't do any of that. It's just: it's the best car, and you show up and buy it or not. It's up to you. You're a moron if you don't, right?

Speaker 1 My dad went to test drive a Model S maybe a couple of years ago. And he's sort of an old school car guy. He's used to the banter back and forth, the forecourt, the discussion.

Speaker 2 They say, please sit down.

Speaker 1 Would you like a coffee? Would you like, you know, what is that?

Speaker 1 And he said that he walked in and it felt like he got into an Apple store, and there was some, as he called it, teenager in a hoodie that came out.

Speaker 1 And my dad started to do the whole like classic boomer parent negotiation thing. And this guy was like, the price is the price, and I don't have any fucking coffee for you.

Speaker 1 So he left feeling like this is the new world. And now he has a Range Rover.

Speaker 2 Well, that's the thing. It's like, do you want the best car in the world or not? That's Elon's mentality, right? And it's working very well.

Speaker 2 And then, at this event, he just took it to the next level. And this broke my brain; I'm still thinking about it. He said, if you think about it, the best product in the world shouldn't even need a logo. You shouldn't even have to have your name on the product.

Speaker 1 People can just identify it from how good it is.

Speaker 2 It's just obvious. Everybody knows, because it's the best product in the world. Everybody has it. Everybody uses it. And, of course, you don't need to put the name on it. Everybody knows. And for a minute I was like, all right, is this some kind of Zen thing?

Speaker 2 And then I was like, no, as usual with Elon, he's actually serious.

Speaker 2 The best product in the world would not need your name on it.

Speaker 2 And so, yeah, anyway, it's just this completely different method. And then, on top of that, to have the guy who's on top of his game working on the government challenge, and to have the new president, whatever people think of the new president, pros and cons, fully signed up, embracing Elon and encouraging him to do this: you have to go back, literally the last time something like this happened in the United States government was 1933. The closest analogy to what has at least started to happen is literally what happened under FDR, which was a fundamental reinvention of how government works and a massive influx of talent from the private sector that was able to do things that were previously unimagined. And so it's a once-in-80-years thing. I'm doing everything I can to help, and I hope it goes well.

Speaker 1 Pouring some kerosene onto the fire as you are.

Speaker 1 I saw a clip of Elon speaking. I don't know whether it was a recent clip, but the clip only came up the other day.
He's talking about his thoughts on being a CEO.

Speaker 1 A lot of times people think that creating a company is going to be fun. I would say it's really not that fun.
I mean, there are periods of fun and there are periods where it's just awful.

Speaker 1 And particularly if you're the CEO of the company, you actually have a distillation of all the worst problems in a company.

Speaker 1 There's no point in spending your time on things that are going right, so you only spend your time on things that are going wrong.

Speaker 1 And there are things that are going wrong that other people can't take care of. So you have like the worst.

Speaker 1 You have a filter for the crappiest problems in the company, the most pernicious and painful problems. So I wouldn't say it's fun.

Speaker 1 You have to feel quite compelled to do it and have a fairly high pain threshold. And there's a friend of mine who says, starting companies is like staring into the abyss and eating glass.

Speaker 1 And there's some truth to that. The staring into the abyss part is that you're constantly facing the extermination of the company, because most startups fail. 90%, 99% fail. That's the staring part. You're constantly staring and saying, if I don't get this right, the company will die.

Speaker 2 Quite stressful.

Speaker 1 And the eating glass part is you've got to work on the problems that the company needs you to work on, not the problems you want to work on. So you end up working on that.

Speaker 1 You really wish you weren't working on it. That's the eating glass part.
And it goes on for a long time. What's your assessment of his post-mortem on being a CEO?

Speaker 2 So, number one, I agree with all of that, and I felt that myself. That's one of the reasons I'm happy to be an investor right now and not a CEO of a company. But the original form of the quote, the eating glass quote, was actually Sean Parker's. And the original form of the quote was: starting a company is like eating glass. Eventually, you start to like the taste of your own blood.

Speaker 1 Oh my God.

Speaker 2 And I love using that quote in front of an audience because it always gets the exact reaction you just gave me.

Speaker 2 It always gets that moment of stunned silence, followed by that look on the face, and everybody's like, oh my God, that's horrifying. And I'm like, yes, now you understand. Go forth, run your company. Now you understand what it's going to feel like. And look, the serious thing I would say is: it's a little bit less now, but there have been times when the idea of being an entrepreneur, a tech entrepreneur, has been a romanticized concept.

Speaker 2 There used to be TV shows talking about how much fun it was. And I have people ask questions like, well, how do we encourage more people to be entrepreneurs? And my answer always was: no, we shouldn't do that. People shouldn't be encouraged to do something that painful.

Speaker 2 They should do it because they really want to do it. In fact, they should do it because they can't not do it. I'm not an endurance athlete, but it would be a little bit like encouraging somebody to do a triathlon or an ultramarathon, right? Like, yeah, why don't you go out there and run 120 miles in the heat? No, most people should not do that, but the people driven to do that should do that. And by the way, the people driven to do that will do that, for their own reasons. And so it really is like that.

Speaker 2 And yeah, look, it's tremendously painful. Most of the experience of being in business in a competitive setting, and a startup is sort of the peak version of that, consists of basically being told no by everybody. You need all these things, right? You need employees and you need financing and you need customers. You go around all day long asking people for those things, and they tell you no, or half the time they don't even respond; they just go dark on you and ghost you. And then, just when you think you've got it working, a new competitor emerges and starts punching you square in the face. And the minute you think your product's working, there's a problem in it and you have to do a recall. It's just this constant rolling horror story.

Speaker 1 You're now the agony aunt to the people who are eating glass and learning to try and enjoy the taste of their own blood.

Speaker 1 Do you, do you feel more like a therapist than an investor sometimes?

Speaker 2 So it really depends. I think there are certain fields in which you really get to see somebody's core personality, their core attributes, their core virtues, their core vices, and their core weaknesses. And it's really only in situations of extreme stress that you really see that. So part of it is that we see people when they're at their most stressed and when they're at their worst.

Speaker 2 And then there are many cases where we're one of their only remaining allies at that point, so that's a not uncommon thing. There is a coaching component to it, and a therapeutic component, and just a general being-a-friend, being-an-ally component to it. But I'd also say the other thing you see is that there are very specific personality types. I'll just give you an example: Mark Zuckerberg. A great superpower that Mark Zuckerberg has, which is probably not well understood enough, is that he does not get emotionally upset in stressful situations.

Speaker 2 He is able to maintain an analytical frame of mind even when other people would be literally bursting into tears and hiding under the table. And I've seen that many, many times. The company's been through a tremendous amount of both good and bad over the last, well, I've been involved for almost 20 years, and we've been through almost everything good and bad, and I've literally not once seen him raise his voice. It's not that he's not emotional; he engages emotionally with people, he's very sympathetic, and he's very supportive of people when they have a personal crisis. He feels things very deeply, like everybody does. But he has a level of emotional self-control; psychologists would say 0% neuroticism. He just doesn't respond emotionally, and as a consequence, he can keep his head right when everybody around him is losing theirs. So you've got that.
And so you've got that.

Speaker 2 Having said that, we also work with a lot of people who are, let's say, higher in neuroticism. In particular, people who are super creative are often high in neuroticism. And so sometimes we get the artist type, where they're incredibly creative, they're a fountain of new ideas, and they just have a much more direct engagement with their emotions. They feel things more directly, in a more raw way, and so when things go bad, it really comes down on them emotionally. I was frankly probably more on that side of things myself. And so, yeah, there's a therapeutic aspect to that.

Speaker 2 The good news, I would say, is a couple of things. One is that in our world it's not a 90% failure rate; it's about 50%. And 50% is still high, but you have a reasonably good shot.

Speaker 2 I think most problems in business are fixable as long as you can keep the team together.

Speaker 2 I think usually when companies crack and go down, sometimes it's catalyzed by something that happened from the outside, but generally the thing that actually happens before a company goes down is that the team cracks internally: the founders turn on each other, or the management team dissolves. And so a big part of it is just, can you keep the team together? If you can keep the team together, and we're happy to be a part of it, our job is to be part of that team, then most of these companies can battle through most things. And I've seen just as many incredible last-minute saves and rescues and turnarounds as I have screaming disasters. So there is reason for optimism going into even the dark times.

Speaker 1 Going back to the governmental efficiency thing, how much do you think the huge swaths, the platoons of useless middle management inside of the government are shitting themselves at the moment?

Speaker 2 Yeah, so it's, you know, it's actually an interesting question. I'm actually, it's, it's, it's, it's hard to say from the outside.

Speaker 2 The other thing is you have this, I would say, very interesting thing.

Speaker 2 You don't have a, you know, there's no homogenous, you know, the government employs whatever it is, you know, millions of people, multiple millions of people.

Speaker 2 It's not, it's, you know, it's far from a homogenous group.

Speaker 2 And I'll just give you some examples.

Speaker 2 There's a pretty large contingent,

Speaker 2 which is sort of the baby boom

Speaker 2 component of it, you know, because the government went on a massive hiring boom in the 50s, 60s, 70s, 80s.

Speaker 2 There's a pretty big contingent that's close to retirement

Speaker 2 and was probably going to retire anyway in the near future. And so there's some prospect maybe there for kind of, let's say, mutually agreeable accelerated retirement

Speaker 2 is one possible solution.

Speaker 1 Mutually agreeable, accelerated retirement, the most diplomatic way to say goodbye.

Speaker 2 No, no, this happens... so, in fairness, this happens at companies. And I'm not saying that DOGE is going to do this.

Speaker 2 I'm not in charge of this, but I'll just give you the scenario that happens in companies. It's something called buyouts, as contrasted to layoffs.

Speaker 2 And it sounds like it's a clever way to do the same thing, but it's actually different, which is you basically... you offer retirement packages, right?

Speaker 2 And you let people voluntarily sign up for the retirement package. And they don't have to sign up for it.
It's a voluntary thing.

Speaker 2 But if they choose to, they can get accelerated retirement or an unusual level of retirement compensation.

Speaker 2 A way to think about this is like the budget crisis that a business has when it gets in trouble or that the government has where we're $36 trillion in debt.

Speaker 2 The budget crisis is not like the money that we might have to pay to have people retire this year. Like you can afford almost anything to do that.

Speaker 2 It's the savings that you can have compounded over the next 20 years if you get your fiscal house in order, right?

Speaker 2 And so you can sometimes do things in the near term that cost more money in order to get the long-term savings. And that often makes sense.
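To make that trade-off concrete, here is a minimal sketch, with entirely made-up numbers, of how a one-time buyout cost compares against savings compounded over the following 20 years:

```python
# Hypothetical buyout-vs-savings arithmetic. Every number here is made up
# purely to illustrate the shape of the trade-off.

buyout_cost_per_person = 150_000      # one-time retirement package (assumed)
annual_savings_per_person = 120_000   # salary plus benefits no longer paid (assumed)
discount_rate = 0.03                  # rough discount rate (assumed)
years = 20

# Present value of 20 years of avoided payroll, discounted back to today.
pv_savings = sum(
    annual_savings_per_person / (1 + discount_rate) ** t
    for t in range(1, years + 1)
)

net_benefit = pv_savings - buyout_cost_per_person
print(f"PV of savings over {years} years: ${pv_savings:,.0f}")
print(f"Net benefit per buyout:           ${net_benefit:,.0f}")
# With these assumptions the one-time cost is small next to the compounded
# savings, which is the point being made: near-term spending can buy
# long-term fiscal relief.
```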

Speaker 2 And so there's, so, and again, I'm not speaking for the Doge, so this, you know, this is one option. Another option that they've already talked about in public is the remote work thing.

Speaker 2 You know, most federal workers are not, most federal workers are not actually at work. You know, most federal workplaces are empty today.

Speaker 2 You know, there are agencies where there are formal collective bargaining agreements where the employees are down to a day a month in the office.

Speaker 2 Right.

Speaker 2 And, you know, there's a huge controversy around remote work, you know, about how effective it is. And there are companies that make it work.

Speaker 2 you know, if hypothetically you have a government agency with 10,000 people and nobody's literally coming to work, you might as a taxpayer have some questions about exactly what's going on.

Speaker 2 And then you have this situation where if you say, as they've said, if

Speaker 2 people have to come back to work, like part of the thing of working for the government, working for taxpayers is people have to come to work. Well, there are people who have literally left.

Speaker 2 There are people who have moved to lower cost locations who are not going to come back to work. And so that's another opportunity.
And so there's like, there's, there's a lot of this.

Speaker 2 And then I would say the third thing that they've talked about, which is, I think, probably the part of it that is not well appreciated,

Speaker 2 is that the government that we live under today,

Speaker 2 you know, remember that old... I don't know if you saw this, you probably didn't because you weren't here at the time, but there's this cartoon that every American kid of a certain age saw in elementary school social studies, which was how a bill becomes a law.

Speaker 2 Um, oh, it's this very bouncy, like, 1970s cartoon about the bill. You know, the White House proposes the bill, and then the Congress agrees, and then the Senate, and then they reconcile, and, you know, the bill is this little cartoon character wrapped in a little ribbon, and it bounces up the steps, and the president signs it, and everybody's happy, right?

Speaker 2 And so, it's like you know, legislation, right? The legislative process. Um, most of what the government does and most of the rules that we all live under are not that, right?

Speaker 2 Most of it's not legislation. Most of it is what's called regulation, right? Which is very different.

Speaker 2 And most of the rules that the government sort of exists in order to process and enforce are regulatory,

Speaker 2 not legislative.

Speaker 2 The government that we live under today is the result of basically 60 to 80 years of agencies effectively putting out regulations entirely on their own.

Speaker 2 So executive branch agencies just basically deciding that there are going to be hundreds of thousands of rules that we need to live under and then deciding that they need to staff themselves to be able to process and handle and enforce all those rules.

Speaker 2 The Supreme Court, in a series of judgments a couple of years ago, ruled that regulations that were not authorized by legislation are not constitutional.

Speaker 2 And by the way, if you just read the U.S. Constitution, which is quite short and to the point, it makes it very clear that the legislature, the Congress exists to pass laws.

Speaker 2 The executive branch exists to enforce those laws. The executive branch does not get to issue its own laws, right?

Speaker 2 And so we've lived in a system where the executive branch has been issuing its own laws for a very long time. The Supreme Court now says that all of those regulations are now not constitutional.

Speaker 2 And so basically any activity in the government that relates to

Speaker 2 the staffing and enforcement and all of the processes and all the work around those regulations is no longer constitutional.

Speaker 2 And so there's a lot of government activity that actually is no longer constitutional. And, you know, the previous administration was completely uninterested in doing anything about

Speaker 2 bringing the executive branch into alignment with what the Supreme Court had ruled. But again, if you read the Constitution, the Supreme Court gets to make these decisions.

Speaker 2 They've decided this is not constitutional. This new administration has an opportunity to bring the government more in compliance with the Supreme Court, which means the government will get smaller.

Speaker 1 Yeah. What happens if Trump can't deliver change? He's got all of the things: House, Senate, popular vote, Elon, unlimited funds, blah, blah, blah.

Speaker 1 He doesn't even need to worry about being re-elected. What's the implication for America and the West if he can't make sizable change?

Speaker 2 You know, I don't know. I mean, look, I'd start by saying I'm optimistic.
Like, I think he's determined. I think that Elon's determined.
I think Vivek's determined.

Speaker 2 I think the, you know, the team around the Doge. By the way, a lot of the key staffing positions, you know, there's a lot of controversy in the press, as usual, about the cabinet officials.

Speaker 2 And I think, you know, many of those are also very good. But

Speaker 2 a lot of the operational managers that are being brought in at, like, the deputy secretary level and the department heads are very, very strong people.

Speaker 2 And so I think this is going to be

Speaker 2 one of the strongest, if not the strongest, you know, administrations in terms of execution competence and capability that we've had in a long time. Again, maybe arguably since the,

Speaker 2 I think there's a reasonable case that this will be the best-staffed administration since the 1930s.

Speaker 2 And so, you know, I'm reasonably optimistic. And then look, the president has this, you know, is in this sort of very interesting position where he doesn't ever have to run again, right?

Speaker 2 Because he can't. Right.

Speaker 2 And so

Speaker 2 a lot of what happens in a second term, if presidents choose to take advantage of it, is they can do things that cause, you know, that basically might even put their re-election at peril.

Speaker 2 They can do more forceful things, because they can do the things that they actually think are important, as compared to the things that they think they might need to do to get re-elected. And that's something you can only do when you're going to get termed out. And then, of course, there's everything behind Trump's level of determination this time, which I think is very high. So, you know, I'm quite optimistic. I don't know, I mean, you could paint various cynical pictures of the whole thing getting gummed up. My guess is that if it disappoints, it's going to disappoint in the sense that some of these things are going to get done but not all of them. And frankly, it'd be great to solve all of our problems at once, but if we can't do that, if this team can sort of give us a roadmap on how to do things properly and can kind of show us through examples... you know, let's say there's a thousand versions of this problem that have to get solved and this team only gets to the first hundred,

Speaker 2 then we will have the recipe book right for how to solve the next 900 over the next

Speaker 2 two or three or four administrations that follows that. And so

Speaker 2 that would be my optimistic failure case. And I think we will get at least that, if not a much more sweeping transformation.

Speaker 1 Talk to me about this Willow

Speaker 1 Google quantum computer thing. How big of a deal is it?

Speaker 2 Yeah, so first of all, I don't know.

Speaker 2 I just read the announcement. I haven't been face to face with the technology, so I don't know anything that's not public on that.

Speaker 2 Quantum computing historically has been one of those things that is theoretically very exciting and then just extremely hard to actually get working.

Speaker 2 Quantum computers don't act like regular computers in any way.

Speaker 2 And so, and there's always this kind of thing where you, in theory, can kind of demonstrate something in a lab and that maybe it works and maybe it doesn't.

Speaker 2 It's often not completely clear even whether it worked. And then, you know, could you ever get that to repeat? Could you get that to repeat across, you know, 10 or 100 different

Speaker 2 kind of locations? Could you ever then apply that to these different problems? Like there's just all these like really, really fundamental questions. And so it's one of these really enticing things

Speaker 2 that

Speaker 2 everybody's hoping that we get to. I just, I'm not close enough to know whether they got to the full breakthrough or not.

Speaker 2 The thing that I think would be the full breakthrough would be not only does it work in one lab, this is something where you could now stamp out 10 and then 100 and then 1,000 of these machines, and then you could do it repeatedly.

Speaker 2 The most enticing thing of what they said,

Speaker 2 which is one of these things where you think about it and then you decide you have to stop thinking about it because you'll never think about anything else again,

Speaker 2 is that in theory,

Speaker 2 the lab demonstration that they have is doing a computation

Speaker 2 that the entire universe that we live in, if it was converted entirely to just a giant computer,

Speaker 2 if every atom in our universe

Speaker 2 was turned into a giant computer, that computer could not solve that problem before the universe ultimately dies.
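One way to get an intuition for that kind of claim (this is an illustrative calculation, not Google's actual argument, and the qubit counts below are arbitrary): classically simulating n qubits means tracking on the order of 2^n amplitudes, which outgrows the estimated number of atoms in the observable universe at a few hundred qubits.

```python
import math

ATOMS_IN_OBSERVABLE_UNIVERSE = 1e80   # rough standard estimate

def classical_amplitudes(n_qubits: int) -> float:
    """Number of complex amplitudes a brute-force classical simulation must track."""
    return 2.0 ** n_qubits

for n in (50, 100, 266, 300):          # arbitrary sample sizes
    amps = classical_amplitudes(n)
    print(f"{n:3d} qubits -> ~1e{math.log10(amps):.0f} amplitudes")

# Around 266 qubits, 2**n passes ~1e80, i.e. more amplitudes than atoms in the
# observable universe: the flavor of argument behind the idea that even a
# universe-sized classical computer could not brute-force the computation.
```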

Speaker 2 Right? And

Speaker 2 the implication of that is that the computation, therefore, must be spread out across many other parallel universes, which is therefore proof that there are many other parallel universes.

Speaker 2 Interesting.

Speaker 2 Right. In other words, if you weren't able to spread this computation out across many parallel universes, it could not have been done.
Therefore, there must be many parallel universes.

Speaker 2 Therefore, we do actually live in a multiverse. And there are actually billions or uncountable numbers of parallel realities, including...

Speaker 1 That we're using for our RAM.

Speaker 2 Yes. Yes.
And yeah, we're basically outsourcing to these other universes.
I mean, you know, by the way, are we having any impact on them when we do that?

Speaker 2 I don't know.

Speaker 1 Yeah, there's some very glitchy 360p universe out there. And they're like, God damn it.

Speaker 2 That guy from Google.

Speaker 1 Yeah, that guy from Google's trying to work out if two plus two equals four again.

Speaker 2 Yes, exactly. And then, of course, are any of these other universes going to do it to us?

Speaker 2 So, you know,

Speaker 2 the quantum people really do believe it, you know, that we live in many, many, many different realities.

Speaker 2 So, you know, this may be one of those things from science fiction that we end up figuring out.

Speaker 2 But yeah, this is the kind of thing, you know, it's, it's, but, you know, Einstein's famous reaction to quantum physics was, you know, this, this is impossible because

Speaker 2 the famous quote was, God doesn't play dice with the universe. Right.
And, you know, this result is yet more evidence that, no, actually, God really likes playing dice with the universe.

Speaker 2 Seems like

Speaker 1 he is doing it. Between that, I mean, this, this last period, there's so much going on in tech.
Sora AI as well. What's your read on that?

Speaker 1 I mean, they've closed now sign-ups and stuff because it just went ballistic. So for the people that don't know, this is the video equivalent of OpenAI's

Speaker 1 image creation service. And now you can do video too, although so many people signed up.
It's restricted to particular territories, et cetera, et cetera, et cetera.

Speaker 1 What do you make of the way that that went?

Speaker 2 Yeah, so this is, yeah, this is video. So this is text-to-video generation.
So you put in a text prompt, you get out a video, which sounds easy but turns out to be very hard.

Speaker 2 It's very magical.

Speaker 2 Sora is one of the more impressive ones, at OpenAI.

Speaker 2 There are others, we have a bunch of others, and there's a bunch of others underway. It's a competitive space, but

Speaker 2 the class of technologies is very impressive, and Sora is very impressive.

Speaker 2 I mean, one is just like, how amazing is it to live in a world in which you can literally

Speaker 2 say a text prompt of

Speaker 2 a hobbit living in a hobbit town and then a dragon shows up, and then literally it will render that for you,

Speaker 2 right? And if you want the dragon to be rendered out of used car parts, you just say dragon made out of used car parts and it renders that for you, right?

Speaker 2 And so it's like, I mean, that's just amazing to start with.

Speaker 2 The even more amazing thing about that,

Speaker 2 and OpenAI put out a paper about this. It's very interesting.
So the more amazing thing about that is

Speaker 2 to generate

Speaker 2 video that passes the sniff test, where the human eye looks at it and thinks that it's real,

Speaker 2 you can't just sort of copy from other videos.

Speaker 2 And that's part of what's happening: these systems are trained on millions of hours of video, you know, from all over the world, all this open-source stuff, and old movies and all this stuff.

Speaker 2 And

Speaker 2 then there's copyright disputes over what else is in there and so forth.

Speaker 2 But, you know, so it's trained on lots of video, but it's not sufficient to just train on lots of 2D video, you know, because video is all 2D.

Speaker 2 It's not sufficient to just train on 2D video and then generate a new 2D video that actually looks to the human eye like it's a representation of the real 3D world.

Speaker 2 And if you look at these videos coming out from Sora, if you look at them carefully,

Speaker 2 basically what you see is like multiple sources of lighting in different parts in 3D space coming together. You see reflections coming off of reflective surfaces that are actually correct.

Speaker 2 You see translucency that's correct. And then you get combinations of these factors.

Speaker 2 So like, for example, if you do something where like a man walking through a puddle at night, you'll get the splashing effects of the water.

Speaker 2 The water has to splash in a way that is physically realistic. The light has to come through, refract through the water droplets in the correct way.

Speaker 2 The water droplets have to reflect the image of the man's shoe in the right way. So

Speaker 2 and the AI term for this is world model.

Speaker 2 This is not only a model for video, this is a model that actually understands the real world. It's a model that actually understands 3D reality, right? And it understands light, right?

Speaker 2 And it understands surfaces and textures and materials and motion and gravity.

Speaker 2 And gravity, and shapes, and, right, exactly, all these things. You know, a stiff surface versus a spongy surface.

Speaker 2 Yeah, you know, give me a close-up of

Speaker 2 a baseball bat hitting a baseball. You know, like in slow motion, it has to deform the baseball in the way that in the real world it does, you know, when it hits the bat.

Speaker 2 And, like, for Sora, it has to do all those things. And in fact, it is able to do all those things.
Like if you just look at the results, you can see all that's happening.

Speaker 2 And so what that means, backing up to that, the implication is that model is not just a video model. It's what's called a world model, meaning it actually understands 3D physical reality.

Speaker 2 The implication of that is that

Speaker 2 we may have basically just solved the fundamental challenge of robotics.

Speaker 2 The fundamental challenge of robotics is how do you get a physical robot to navigate the real world without screwing everything up, right?

Speaker 2 So, how do you get a robot waiter to navigate through a busy restaurant without stepping on anybody's foot, right? Without tripping over anything, without spilling water on the table,

Speaker 2 to understand everything that's happening in real time to be able to adapt, you know, very similar to the self-driving car challenge. Like, how do you do that?

Speaker 2 But how do you do that for basically everything? And how do you put machines up close with people in such a way that it's like completely safe?

Speaker 2 And it turns out one of the things you need to do to do that is you you need a world model.

Speaker 2 You need the robot needs to have a comprehensive understanding of physical reality so that it can understand what's happening.

Speaker 2 And so, when things change, because you know, the robot's seeing primarily visual, right?

Speaker 2 It's it's just like you're seeing visual, and so you have to map the visual into an internal representation of the 3D world.

Speaker 2 Up until now, building a world model like that has been difficult or impossible, and it now appears that that's actually starting to work. So,

Speaker 2 Westworld,

Speaker 1 it's happening

Speaker 2 2028, 2030.

Speaker 2 What do you think? I mean,

Speaker 1 you must be knee-deep in

Speaker 1 projections for robotics, for AI, for where we're going to get to.

Speaker 1 What's the meteorological artificial intelligence forecast for the next few years? What do you think people can expect sort of 25, 26, 27, et cetera?

Speaker 2 Yeah, so I'd start by saying it's really hard to forecast this.

Speaker 2 And I'll give you my favorite example of this, which was the world's leading AI researchers in the year 1956 got together and they got a grant from the government to spend 10 weeks in the summer of 1956 at DARPA, or at sorry, at Dartmouth, at Dartmouth University.

Speaker 2 And so they all got together. And in that 10 weeks, they were going to finally get to artificial general intelligence.

Speaker 2 They were so close. Like they were almost to what was called AGI.
They were almost to an AI that can do everything that a person can do. They were only 10 weeks away.

Speaker 2 And, you know, you'll notice that was in 1956. It didn't happen.
We're sitting here in 2024 and we still don't have it.

Speaker 2 So this field, of all the fields in tech... AI is prone to utopianism, it's prone to apocalyptic nightmare scenarios, and it's been very, very hard to forecast progress in, like, extremely difficult to forecast progress.

Speaker 2 Well, another, another great example of that is OpenAI was not formed to make ChatGPT. OpenAI was not created to make large language models.

Speaker 2 OpenAI was created to make an entirely different kind of AI.

Speaker 2 And then it turns out, if you trace back the origin of GPT, there was literally one guy at OpenAI, his name is Alec Radford, and he was, like, sitting in a corner, and he's like, I think all this other stuff is wrong.

Speaker 2 I think we should be doing this other thing instead. And so, like, you know, even the company that brought us ChatGPT, you know, didn't, you know, a few years ago

Speaker 2 didn't know that that's what they were going to do. So forecasting is very hard,

Speaker 2 especially about the future.

Speaker 2 Look, having said that, the progress is staggering. And, you know, one is just the observed progress is staggering.
The plans that people have are incredibly exciting.

Speaker 2 You know, one of the things you always wonder with any field like this is just how many more ideas are there, right?

Speaker 2 Like, arguably, we've run out of ideas for what the smartphone can do. Like, you know, what's new with the smartphone now versus last year versus the year before? The smartphone companies have kind of run out of ideas. With AI, there's ideas all over the place, and so that's very optimistic. And then many of the smartest people in the world are being drawn into the field, right?

Speaker 2 And so every smart college kid, you know, who's considering what to do, a lot of the very smartest are going into the field.

Speaker 2 And then a lot of people are like coming over from other fields like physics to work on this now that it's really started to work.

Speaker 2 So we're also getting this effect of kind of this, you know, reverse brain drain or something where we're pulling in all the smart people. That's another reason for optimism.

Speaker 2 And then obviously the commercial opportunity is very large. So that's another reason for optimism.

Speaker 2 The big focus right now is to get these things to what's called reasoning, general purpose reasoning. So to get them to, you know, like,

Speaker 2 to get them to predictably be able to solve problems in a way that is, you know, fully coherent and leads to good results every time.

Speaker 2 And, you know, these things are very good at solving certain problems most of the time.

Speaker 2 It's a very unusual technology where if you ask a large language model the same question twice, it actually gives you different answers.

Speaker 2 And if it doesn't know the answer to the question, it will sometimes make up an answer. And it is pretty amazing and wild that we have creative computers that will literally

Speaker 2 make things up.
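A minimal, self-contained sketch of the mechanism behind that variability (this is the generic temperature-sampling idea, not any particular vendor's implementation): the model scores candidate next tokens and the decoder samples from those scores, so repeated runs can diverge, and low-probability tokens occasionally get picked.

```python
import math
import random

# Toy "model output": raw scores (logits) for a handful of candidate next tokens.
logits = {"Paris": 3.2, "London": 1.1, "Rome": 0.4, "banana": -2.0}

def sample_next_token(logits: dict, temperature: float = 0.8) -> str:
    # Softmax with temperature turns scores into probabilities.
    scaled = {tok: score / temperature for tok, score in logits.items()}
    max_s = max(scaled.values())
    exps = {tok: math.exp(s - max_s) for tok, s in scaled.items()}
    total = sum(exps.values())
    probs = {tok: e / total for tok, e in exps.items()}
    # Sampling, instead of always taking the single highest-scoring token,
    # is what makes two runs of the same prompt come back different.
    return random.choices(list(probs), weights=list(probs.values()))[0]

print([sample_next_token(logits) for _ in range(5)])  # e.g. mostly "Paris", not always
# Low-probability tokens still get chosen occasionally, which is one ingredient
# in the confident-sounding made-up answers described here.
```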

Speaker 2 But

Speaker 2 we need versions of these that don't do that.

Speaker 2 We need versions of these that are able to reason their way through complicated logic chains that are able to model physical reality like I just described, that are able to think longer and get better

Speaker 2 And so, there's tremendous amounts of work happening on that right now. I'm pretty optimistic on that.
I think by,

Speaker 2 yeah, I think by, I don't know, 2028 or something, I'm just off the top of my head, just with a little bit of margin of safety by 2028, these systems, you'll be able to basically give these systems problems and

Speaker 2 they'll solve the problems. If it's something, if it's something that a human being can do, they'll be able to do it.
And then robotics is basically that.

Speaker 2 And then we now have the interface method for robotics because we have large language models.

Speaker 2 So we can now make robots talk and listen, and we have voices and human English comprehension, all that stuff. And then basically the rest of robotics is mechanical engineering and then some power, basically battery technology. And I think robotics is getting quite close now. And I bring that up because AI is going to be important if it's just disembodied software, but it's going to be really important if it's around us all the time, in the form of actually existing

Speaker 2 yes, in the real world, in the real world,

Speaker 2 and of course that's already happening, right? You know, drones now fly themselves, cars now drive themselves, right?

Speaker 2 So, you know, for people who haven't tried either the Waymo cars in places like San Francisco or the new Tesla, you know, the latest version of the Tesla software, they're both outstanding.

Speaker 2 And so cars now drive themselves. That's a big step forward.
Drones now fly themselves. There are now companies making autonomous submarines

Speaker 2 and

Speaker 2 all kinds of

Speaker 2 fancy things.

Speaker 1 Oddly enough, the autonomous submarine just feels like one that you could have done earlier.

Speaker 1 There's far less traffic. There's far fewer things to crash into.
Yeah,

Speaker 2 probably, probably. There's, yeah, we have a company that

Speaker 2 has a military submarine that is

Speaker 2 literally

Speaker 2 an unpressurized shell sort of platform, and you can basically customize it in many different ways, load many kinds of things on it. But it's like unpressurized form.
It's able to go far deeper than

Speaker 2 a sub that needs to support human beings can go.

Speaker 2 It's able to go down and able to do all kinds of things. So yeah, like you know, the whole, as we're seeing, you know, on the military side, there's a transformation of military,

Speaker 2 of military strategy, you know, that's underway right now, you know, kind of as these technologies hit that we're already seeing in Ukraine.

Speaker 2 So that's going to matter. But, you know, look, a lot of this stuff is just going to be stuff in our personal lives, in our daily lives.

Speaker 1 What do you think of the adoption of self-driving, sort of en masse self-driving? I don't know whether there is a more

Speaker 1 highly kinetic,

Speaker 1 high-risk, and heavily involved area of human life that is ready to be outsourced to AI.

Speaker 1 You know, there's other things we can help you cook, or it'll, you know, keep your food fresh or do whatever, but there doesn't seem to be the same sort of modal consequences.

Speaker 1 You know, people have a very strange relationship between car safety when it's done manually and when it's sort of pivoting to that.

Speaker 1 Is this just some conceptual inertia thing that we're moving through over time, slowly, as usual?

Speaker 2 Yeah, so to start with, cars today are a disaster from a safety standpoint, right? And so cars, the current run rate of road deaths worldwide is at least a million people a year, right?

Speaker 2 And so it's about 40,000 road deaths a year in the U.S. and about a million worldwide.
And by the way, that million may be a low number. It may be much higher worldwide.

Speaker 2 And we just don't know how to count it, but it's at least a million worldwide.

Speaker 2 And so, you know, and it's sort of like, again, you kind of think about this, like, okay, a million people a year dying from cars. Okay, over the course of a decade, that's 10 million people.

Speaker 2 Like when 10 million people die from something in the modern world, we use words for it like, you know, like, you know, apocalypse, right? Genocide.

Speaker 2 like, you know, this is like mass death at a very large scale.

Speaker 2 And so one is just like, to your point, like, there's a psychological thing here, which is we have gotten acclimated. And by the way, for every death, there's many injuries, right?

Speaker 2 Many people crippled and, you know, never recover.

Speaker 2 And so

Speaker 2 there's a psychological thing here, which is we have gotten used to a very large amount of carnage in return for the convenience of modern transportation and logistics.

Speaker 2 By the way, you know, in the fullness of time as a civilization, it seems to have been a good trade-off.

Speaker 2 You know, I don't think, you know, very few of us, not named Greta Thunberg, want to go back to, you know,

Speaker 2 walking everywhere. Right.

Speaker 2 And so, and, you know, you know, you could, you could never go back to riding horses, right? Because, of course, you know, animal rights.

Speaker 2 But,

Speaker 2 you know, so like that, it is amazing that we got there.

Speaker 2 By the way, implication of that being, if the car were invented today, I mean, imagine the conversation that we would have right about the rollout of the automobile, right?

Speaker 2 Which is like, okay, the plan, right, the plan is to strap people into 6,000 pounds of steel and glass.

Speaker 2 And then the plan is to shoot them down, you know, a road that may or may not, you know, be in good shape, at 60 miles an hour. And then they're going to shoot them at each other.

Speaker 2 And then we're going to have the safety measure. The safety measure is going to be we're going to paint a line down the center of the road.
And that's going to keep them from crashing. Right.

Speaker 2 And so like there's no way that you'd be able to like invent this today or launch this today. So maybe it's good that it happened in an earlier era.
So anyway, to your point, we're used to that.

Speaker 2 The self-driving both Waymos and Teslas today are far safer than that, far safer.

Speaker 2 And there's, like, edge questions about, you know, the metric, which is, like, you know, collisions per thousand or hundred thousand miles or something, or deaths per thousand miles.

Speaker 2 And like, it's just no question. They're just like tiny percentages relative.

Speaker 2 And it's, by the way, not that they're necessarily zero, but they're tiny percentages of what we experience every day with,

Speaker 2 you know, with road deaths. By the way, I think as humans, we have a bad intuition on this because we can't see the other drivers most of the time when we're driving.
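As a back-of-the-envelope illustration of the per-mile metric mentioned above, with hypothetical fleet numbers (real figures vary by study and by how incidents are counted):

```python
# Hypothetical numbers, purely to show how a per-mile safety metric works.
human_crashes, human_miles = 6_000_000, 3.2e12   # roughly a year of US driving (assumed)
robot_crashes, robot_miles = 30, 1.0e8           # a hypothetical autonomous fleet (assumed)

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    return crashes / miles * 1_000_000

print(f"human-driven: {crashes_per_million_miles(human_crashes, human_miles):.2f} per million miles")
print(f"autonomous:   {crashes_per_million_miles(robot_crashes, robot_miles):.2f} per million miles")
# The comparison only means much if both fleets are measured on comparable
# roads, weather, and crash definitions, which is where the edge questions live.
```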

Speaker 2 So I find the cure for people who need to think about this harder is to just go spend a day sitting at the DMV

Speaker 2 and just watch the parade of humanity

Speaker 2 and watch the people who have to be physically steered by their relatives in front of the camera to get the new driver's license at age 90. Right.

Speaker 2 Like, so, we just... and, you know, if you had, like, an X-ray machine into the cars that are driving past, you'd just see how many people are just texting, right?

Speaker 2 Or how many people have been hitting the vape for the last three hours, right? So, so, so, like, there's no question that

Speaker 2 the technology we have is much safer. There's a societal question as to whether we want the, where we want the trade-off to be, right?

Speaker 2 Do we want the trade-off to be to stay to the current level of carnage because we're used to it, right?

Speaker 2 Or do we want, you know, to wait until the computers are perfect? Or, you know, is it enough for the computers to be much better, which is what's actually happening?

Speaker 2 Interestingly and optimistically, we as a society have chosen to actually accept

Speaker 2 we have chosen to roll out self-driving without the computers being perfect. And so, you know, there's Teslas and Waymos on the road all over the place.

Speaker 2 And, you know, we're and people are fine with it. And I actually think that's like, it's a very optimistic thing that we were, we were able to do that.

Speaker 2 If you talk to the people who make self-driving cars, what they tell you is the problem is not ever a self-driving car colliding with a self-driving car. The problem is when you have other humans in the mix, right? As you'd expect, right, because humans react in unpredictable ways. And so a lot of the engineering going into these cars is to accommodate the human drivers. There will be some future state, years in the future, where there are no more human drivers, at least on public roads. And when that happens, these things will be basically completely safe, because then you won't have the human element in there. And so that may be one of the things we ultimately want to drive to. There are a lot of questions around that, but that may be where we want to go to.

Speaker 2 I'm pretty optimistic about this. I think that these companies are making excellent progress.
I think that

Speaker 2 this has pros and cons, but the Chinese auto industry is now coming online, and I think they're also going to be quite good at this. And I think they're coming now.

Speaker 2 And so, I think that this whole space is going to develop, I think, quite quickly.

Speaker 1 What about personal autonomous vehicles, like drones?

Speaker 1 I remember I was in Leonardo da Vinci Airport years ago, and there was one

Speaker 1 gyrocopter thing out front. This can fly itself, et cetera.

Speaker 1 That feels like a much bigger problem in that you're not retrofitting a new piece of technology onto an existing infrastructure, that this is basically an entirely new thing that you need to have where they can float around.

Speaker 2 Yeah. So

Speaker 2 the term most common in our world is they call it E-VTOL, E-V-T-O-L. So electric vertical takeoff and landing.
Okay.

Speaker 2 Right. Right.
So VTOL, VTOL, everybody's seen VTOL, VTOL, vertical takeoff and landing. It's a Harrier jet that the military flies.
The movie True Lies, the big scene on the Harrier jet.

Speaker 2 So it's the jet that

Speaker 2 has thrusters that point down. It rises.
And then it has thrusters that carry it forward. And so

Speaker 2 we've had these VTOL things. And by the way, we also have had helicopters which have the same property.
So we've had these things in the past.

Speaker 2 And then the electric part is basically that they'll be electric. And then the other presumptive part of that is that they'll be autonomous.
They'll be self-piloting.

Speaker 2 Because for those things to go mass market, you can't expect everybody to get a pilot's license, right? You're going to want to just get in, want to sit back and let the thing fly itself.

Speaker 2 So autonomous, electric, vertical takeoff and landing. We have all of the elements to do that.
We know how to do all of that.

Speaker 2 All those technologies exist now. And there are companies that have put this together and have these products working.

Speaker 2 The big issue right now,

Speaker 2 I think a couple of big issues. One is just power, which is it's just, it takes a lot of power to get something in the air and keep it in the air.
And like I said, batteries aren't very good right now

Speaker 2 relative to what we need for things like that.

Speaker 2 So we need some breakthroughs in batteries. And then the other is just cost and

Speaker 2 infrastructure, and then the safety regime, safety regulation and all that. You know, over the course of five or 10 years, I think you could imagine that happening.
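To put a rough number on "it takes a lot of power to keep something in the air," here is a momentum-theory (actuator-disk) estimate of ideal hover power, using made-up aircraft figures; real vehicles need considerably more once rotor, motor, and battery losses are included.

```python
import math

# Hypothetical two-seat eVTOL; all figures are illustrative assumptions.
mass_kg = 800.0            # vehicle plus passengers
rotor_disk_area_m2 = 8.0   # total swept area of all rotors
rho = 1.225                # sea-level air density, kg/m^3
g = 9.81

thrust_N = mass_kg * g
# Ideal induced hover power from momentum (actuator-disk) theory:
#   P = T^(3/2) / sqrt(2 * rho * A)
ideal_hover_power_W = thrust_N ** 1.5 / math.sqrt(2 * rho * rotor_disk_area_m2)
print(f"Ideal hover power: ~{ideal_hover_power_W / 1000:.0f} kW")

# A 100 kWh pack (roughly a large EV battery) at that ideal draw:
battery_kwh = 100.0
hover_minutes = battery_kwh / (ideal_hover_power_W / 1000) * 60
print(f"Hover time on 100 kWh, ideal case: ~{hover_minutes:.0f} minutes")
# Real endurance is worse once losses are counted, which is why battery energy
# density is the bottleneck being described here.
```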

Speaker 2 There's also, you know, there are people working on the Ironman suit.

Speaker 2 You know, Elon keeps making references to it. I don't know that he's working on it or not, but he keeps making references to it.

Speaker 2 And then there are, there have been... well, do you remember?

Speaker 2 I don't remember which one it was, one of the old James Bond movies, like Goldfinger, one of those movies, actually had what looked like a stunt where James Bond actually escaped from a house by strapping on a jetpack.

Speaker 2 Okay. And he actually did this thing, a big arc up in the air, and then he landed, you know, like a half mile away.

Speaker 2 And when I was a kid, I watched it like, you know, I just love those, I love those movies. And I watched it and I was like, well, clearly they did that with like a green screen special effects.

Speaker 2 And it's like, no, that was an actual jetpack. No way.
That actually, yes, Yes, that actually existed. And actually, the military has those today.

Speaker 1 I've seen someone boarding

Speaker 1 like a tanker type thing, and they've got sort of the hands in gauntlets. And that's not electric.
That looks like it's jet, some sort of sort of fuel jet-powered type thing.

Speaker 2 Yeah, that's right. So today, today that would be, today that would be, today that would be jet-powered.
And there are jet packs like that. Again, expensive, risky, you know, not meaningful.

Speaker 1 You don't want a bullet in the wrong location if you've got one of those things strapped to you.

Speaker 2 Yes, exactly. That's maybe the second most dangerous hobby after flying in squirrel suits.

Speaker 2 So

Speaker 2 those are the hobby videos where they end, the YouTube videos end two seconds before the final moment.

Speaker 2 So, but like, you know, look, we know how to get things in the air. Like, we know how to do that.

Speaker 2 And so, you know, is there a kind of, you know, is there a kind of propulsion? Is there a kind of, you know, either fuel system, propulsion system? Is there a battery system?

Speaker 2 By the way, you know, there are hobbyists that have literally put together, like, quadcopter drones,

Speaker 2 you know, that you ride in. It's like a, you know, it's like a helicopter, you know, using lots of individual, like, quadcopter rotors.

Speaker 2 And so that could be electric powered and could basically do the same thing.

Speaker 2 And so I, you know, there's lots of lots of hobbyist activity here. And so I think people are going to be working on this a lot over the next few years.
Solving autonomy helps tremendously

Speaker 2 because

Speaker 2 you're able to, you know, then the human doesn't need to be trained.

Speaker 2 You can make the thing super safe.

Speaker 2 Another interesting thing that's happened is if you've seen the Waymo car, like the Waymo cars in the street, they've got this sensor on the top of the car that kind of spins around. And

Speaker 2 that's a system called LiDAR.

Speaker 2 LiDAR, and it's a light-based form of radar, so it's a system that lets it do basically 3D mapping of the environment. So the Tesla is basically optical: the Tesla is using cameras to construct a 3D model of the environment and sort of interpolating distances based on having different camera angles. The Waymo cars actually have these LiDAR sensors that actually do depth sensing.
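A minimal sketch of the camera-only depth idea being described (classic stereo triangulation under a simple pinhole model; the numbers are illustrative assumptions, not any carmaker's actual pipeline): depth falls out of the disparity between two camera views.

```python
# Classic two-camera (stereo) depth from disparity under a pinhole model:
#   depth = focal_length_px * baseline_m / disparity_px
# The specific numbers below are illustrative assumptions.

focal_length_px = 1200.0   # focal length expressed in pixels (assumed)
baseline_m = 0.30          # distance between the two cameras (assumed)

def depth_from_disparity(disparity_px: float) -> float:
    return focal_length_px * baseline_m / disparity_px

for disparity in (60.0, 12.0, 3.0):
    print(f"disparity {disparity:5.1f} px -> depth ~{depth_from_disparity(disparity):6.1f} m")
# Small disparities correspond to large depths, so pixel-level errors blow up
# at range, which is one reason active depth sensors like LiDAR stay attractive
# despite their cost.
```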

Speaker 2 The problem with the depth sensing is LiDAR sensors historically have been extremely expensive, which makes it hard to field these things in products.

Speaker 2 But they are coming down rapidly in price now. And about three months ago... my nine-year-old is my in-house prototyper for all this stuff.

Speaker 2 And I got him a Chinese robot dog for, I think, $1,600.

Speaker 2 And the robot dog's snout,

Speaker 2 and it's a robot dog like the demos you've seen. Like, if you've seen the Boston Dynamics robot dogs, it's like that, but you can buy it and you can actually have one of your own.

Speaker 2 And it does all the things. It's very impressive.

Speaker 2 And it actually has the snout that's the spinning sensor. And so they've somehow gotten LiDAR down to a couple hundred dollars.
Wow.

Speaker 2 Um, and again, like that's very encouraging because now that you've got that, now you could start to think about building all kinds of things that then have depth sensing.

Speaker 2 And so, all the, yeah, so all these pieces are starting to fall into place.

Speaker 2 And with a little bit of luck and some progress in batteries, um, in the next, you know, five years, ten years, yeah, more of the Iron Man stuff, you know, the jetpack stuff, hopefully will start to happen.

Speaker 1 It really is a different timeline. Speaking of which, I went, uh, I was in the UK last week.
I was there for a couple of weeks. I went back home.
I had this live show thing.

Speaker 1 Give me your opinion on the sort of state of free speech and regulation on that side of the world. We're talking about the excesses, the overreaches that hopefully may be on their way out over here. What's your opinion on the other side of the Atlantic?

Speaker 2 Yeah, so I'd say there's, like, three kinds of worlds of speech right now, three, uh, zones in the world. There's, like, the Chinese version, which is: we're going to tell you explicitly what the rules are, and you better not break them. Um, but basically, you know, other than the things we tell you not to talk about, which we're very clear about, you know, you can go crazy and talk about whatever you want.

Speaker 2 So, kind of the top-down authoritarian, you know, kind of model.

Speaker 2 We have the American model, which is we have free speech constitutionally guaranteed, but there are a thousand unwritten rules

Speaker 2 of society, like the world's worst version of Curb Your Enthusiasm, where, you know, there's a thousand different ways to trip the PC police.

Speaker 2 And if you, if you trip any one of them, you are, you know, your life gets vaporized.

Speaker 2 And so you might call that kind of bottoms up authoritarianism, right? Like, you know, the government's not coming to get you. They're not going to jail you for it.

Speaker 2 But like, you know, boy, you know, just it's a shame that you got fired and your family now hates you and your friends have all left you and you can, you can never work again, right?

Speaker 2 So that's like the distributed, that's like the distributed bottoms up authoritarian unwritten version of what China has, right? So top down, bottoms up.

Speaker 2 And then there's Europe, which has decided to do both

Speaker 2 for

Speaker 2 reasons I have to say I don't fully understand.

Speaker 2 But, you know, at least from an American perspective, if you look at the speech laws in

Speaker 2 the UK or Canada, they are horrifying.

Speaker 2 I would like to think that if those speech laws came to the US, we would have a revolution. Like, they're just completely unacceptable.

Speaker 2 Like, of course, you can't send police to somebody's house because they said something wrong on Twitter. Like, of course, you can't do that.

Speaker 2 But yet it happens.

Speaker 2 And then, yeah, the rest of Europe has, you know, they all have variations on that. And, you know, some obviously Germany has like very

Speaker 2 explicit versions of that on a lot of topics and other countries do as well. So, yeah, but then also

Speaker 2 Europe and the Anglosphere have also the, you know, they have the top-down version of the actual laws and hate speech laws and so forth.

Speaker 2 But then they also have the bottoms-up authoritarianism

Speaker 2 of all the implicit codes, which if you're a

Speaker 2 highly educated Ivy League graduate equivalent, your job is to track those codes as they evolve every day by faithfully reading the New York Times cover to cover every day.

Speaker 2 But if you don't and you fall out of step or you're a working class person, you say the wrong thing.

Speaker 2 If you crack your knuckles in the wrong way outside of your truck cab, you get accused of being a white supremacist and you get fired, which is something that actually happened in the U.S.

Speaker 2 during the 2020 craziness.

Speaker 2 So yeah, so yeah, Europe right now is combining the worst of those worlds. And, you know,

Speaker 2 you may know more about it than I do. They seem hell-bent to just get much, much worse.

Speaker 1 I don't know what's going on.

Speaker 1 I really, you know, I went back to the UK and I was quite disheartened by like a lot of this stuff. I was spent my time in Kensington.

Speaker 1 You know, the diversity is our strength thing in the UK, which is a message that you're greeted by as you enter the country.

Speaker 1 I don't know whether there's such a thing as too much diversity,

Speaker 1 but it just didn't particularly feel that much like London. I'm aware that that just sounds like the most Brexit sentence that I could have ever said.

Speaker 1 I was like, huh, like this just doesn't feel very much like London. Maybe it should be a cosmopolitan place, so on and so forth.
So that was kind of a little bit strange.

Speaker 1 It was just something that I hadn't noticed previously, like that level of diversity.

Speaker 1 What else did I notice? The level of service, like every Uber took 20 minutes to get to you. It's like, why is that happening? That seems, I was slap banging in the middle of the city.

Speaker 1 Every bit of service in every restaurant was either a long wait or there was some sort of a problem. Uber Eats seem completely determined to just like destroy whatever food or coffee.

Speaker 1 We actually started ordering twice the number of coffees that we needed, in an attempt to maybe combine them together to end up with the right amount of coffee, and we still didn't get it. And I don't know, you know, I mean, I can just whine on like the fucking opulent returning prodigal son or something, but it just, it really was quite disappointing in many ways.

Speaker 1 And then, on top of this, you know, this upside-down world where Keir Starmer is saying that the immigration is too high and that this is a problem that we need to deal with.

Speaker 1 And at the same time, you've got these still unbelievably bizarre non-crime hate incidents that are being reported.

Speaker 2 It's odd.

Speaker 1 And I feel bad about my old homeland.

Speaker 1 I don't know whether Europe's in the middle of a downfall or what at the moment, but between all of the different parties that are going in this vote of no confidence that just happened recently, it's like whatever the timeline is, maybe this is actually what happens when we get used for the RAM of another fucking universe's computation that they just start to speed up all of the drama.

Speaker 1 That's the side effect. I don't know.
But yeah, there's something happening and it's being expedited, whatever it is.

Speaker 2 Yeah, so say a couple of things. So a friend of mine in private equity went and worked in, grew up in Texas and worked in the U.S.

Speaker 2 and then went over and lived in London and did private equity in Europe for five years. And he came back and I said, you know, what was it like?

Speaker 2 And he said, said Europe, it's like, he said, in Europe, it's like

Speaker 2 there are like five things that are more important than making money. And nobody will tell you what they are.

Speaker 2 Right.

Speaker 2 Like, there are these goals and objectives, and they're loosely around ideas of societal fairness, and they're loosely around ideas of not having to work very hard, and they're loosely around, you know, retirement things and social services, and they're loosely around diversity, and they're loosely around immigration, but they don't seem crisply defined. And so maybe there's a bit of an identity crisis happening there. Another thing is a famous line, that fascism and communism are always looming over the U.S. and landing in Europe, right? Like, there's always this threat that the U.S. is going to go communist in the '20s or is going to go fascist in the '30s or something, and then what actually happens is, you know, France actually goes communist and Germany actually goes fascist, right?

Speaker 2 Like, so, you know, they're actually still, there are actually still communist parties in Europe.

Speaker 2 And so maybe Europe is downstream of American culture in a way that is maybe helpful in some ways, but harmful in others.

Speaker 2 A great example of that with England is, it's like, okay, like, you know, in the U.S., we are all trained from birth, of course, to feel, you know, extremely bad about the fate of Indigenous peoples on the North American continent.

Speaker 2 You know, Chris, pop quiz question: who are the Indigenous people of England?

Speaker 1 I don't know. Who are they? Whoever it was before the fucking Normans came and no, no, no.

Speaker 2 The answer is it's the English. Right, okay.
Okay. Like, it's just the English.
It's just the English. Right.

Speaker 2 Like, the English didn't displace anybody. Yeah, it's like the Normans, it's like the Saxons, like, whatever came together, but there was no displacement, right?

Speaker 2 It's like the Indigenous people of England are just the English. Yet somehow the English feel just as bad about Indigenous peoples as we do.
Right? Like,

Speaker 2 right? Well, it's the other thing is like, you know, as you know, England has like no history, England has no history of like, you know,

Speaker 2 Africans, you know,

Speaker 2 anyway, I'm not going to go into that topic. But yeah, the

Speaker 2 spectacle of BLM in England is really quite something. Yeah.
It's just, it's not very historically grounded, let's say.

Speaker 2 So yeah, so maybe it's just America, maybe America's both functions and dysfunctions kind of ripple out to the rest of the world

Speaker 2 in a way that is helpful in some ways and harmful in others.

Speaker 2 And then, you know, maybe it's a, I mean, from the outside, it seems like it's a continent that is in a series of countries that are having just a massive identity crisis.

Speaker 1 I think that's certainly true.

Speaker 1 Looking for a direction, looking for what your contribution is to the broader world. You know, it very much is

Speaker 1 a little bit of attention is being paid to Russia. Most of the attention is being paid to the US and some attention gets paid to China.

Speaker 1 And everybody else is playing second, third, fourth, fifth string behind that. And I wonder whether the UK is feeling like a jilted lover at the moment where it's just like we used to be important.

Speaker 1 We used to do a thing. I actually want you to bring this up.

Speaker 1 I've been sort of harping on about the

Speaker 1 differential number of entrepreneurs coming out of the UK versus coming out of the US, despite the fact that we have similar higher education levels and the same number of universities in the top 10 worldwide.

Speaker 1 The UK's got the same number as the US do, but we put out only 20%

Speaker 1 of the number of entrepreneurs.

Speaker 1 What, based on your assessment culturally,

Speaker 1 commercially, what do you lay that at the feet of?

Speaker 2 Yeah, look, part of it is some of the great UK entrepreneurs come to the U.S.

Speaker 2 And so we like, and by the way, this is not just true of the UK. It's also true of France and Germany and Sweden and Norway and many other countries, right? And so I just, because the U.S.

Speaker 2 has such a highly evolved, advanced entrepreneurial ecosystem, you know, it may just be that we're drawing a lot of them.

Speaker 2 And so they just, you know, they don't start companies in the UK because they've left. I think that's part of it.

Speaker 2 I think part of it is that, you know, the identity crisis, here I just know what I hear from my friends who are, you know, English or French or German, which is just, you know, they feel like their home countries and cultures are just not very supportive of the entire concept.

Speaker 2 You know, the governments don't necessarily want it.

Speaker 2 You know, the, you know, the legal codes are not, you know, well set up for it.

Speaker 2 You know, look, the UK, you know,

Speaker 2 this is a great example. So this is a great example of the dichotomy, just like the speech thing.

Speaker 2 So, you know, in the U.S., you know, this AI thing is kind of, you know, the big exciting thing in tech, and the U.S. is forming it.

Speaker 2 And, you know, the Biden administration was threatening to do all these horrible regulatory things, but the new administration is certainly not. They're going to do, I think, really smart things.

Speaker 2 And so there's going to be a AI tech boom in the U.S. that's going to be spectacular.
And a lot of that is around startups. Europe, the EU, has chosen to basically make all of that illegal, right?

Speaker 2 And so they passed this thing called the EU AI Act. And, you know, this guy Thierry Breton, it's his kind of crowning achievement, to basically make AI startups in the EU illegal.

Speaker 2 So they've just decided they just don't want them.

Speaker 1 And then you cut it off at the knees just in case we were going to contribute to this new fledgling industry.

Speaker 2 Yeah. Well, the EU, you know, the EU, they have this slogan.
The EU bureaucrats have this slogan. They just say, you know, we're not going to lead the world in innovation,

Speaker 2 but we can lead it in regulation, right?

Speaker 2 It's like, no, right, exactly. That's what your reaction is, the appropriate reaction, which is like, no, that's not actually what we're going to, yeah, that doesn't make any sense.

Speaker 2 Like, if it's not literally happening there, then you're not going to be able to regulate.

Speaker 2 And in fact, what's happening is new leading-edge AI products from American companies now are actually not even being released in Europe, right? So, like, the new Apple AI products, and I actually think the new OpenAI products as well, are just not even, you know, literally being released in Europe as a result.

Speaker 2 The UK, again, has chosen, the UK has chosen this middle form, which is they sort of made it illegal, right?

Speaker 2 And so they did this, the last administration did this disastrous AI safety push, which was basically a massive red light, basically saying, don't even bother to try to start AI startups in the UK.

Speaker 2 But it wasn't like as overt a ban as the EU put on, but it was like this massive signal that basically says this is not a safe place to do it.

Speaker 2 And I think the UK is kind of stuck halfway in the middle right now and kind of has to decide which way it wants to go. Yeah.

Speaker 1 I don't know, man.

Speaker 1 I really hope that we can bring ourselves back around. Maybe not like old school colonialist glory.

Speaker 1 I'm not harkening back to the fucking East India Trading Company, but

Speaker 1 something.

Speaker 1 I don't know. You're right.
The identity crisis and the direction is a big part. But then you layer on top.
I'm sure you've been seeing Dominic Cummings going full fucking scorched earth recently.

Speaker 1 He came on the show about six months ago, and everything that he said, even like Rory Stewart, someone that you think of as, you know, being way less of a firebrand than Dominic Cummings.

Speaker 1 And he has this really, like, visceral description of the type of people that inhabit the halls of Whitehall and Westminster.

Speaker 1 And he describes the way that they smell and the way that they talk and the things that concern them.

Speaker 1 And you realize, you know, the governmental efficiency thing needs to be scaled out beyond just the US. This is the sort of thing that probably needs to be put in place everywhere.

Speaker 2 Yeah, no, I'm glad you brought it up. Dominic is a good friend of mine. So Dominic has this great line.
He says, the people aren't running the system.

Speaker 2 The system is running the people.

Speaker 2 Right.

Speaker 2 And he uses that to describe the UK government, but you could also describe the US government and the EU kind of the same way. Which is, in a sense, what happens when you get face to face with people in government is you realize, you know, there are actually a significant number of very public-spirited, very determined people who actually want to do good things.

Speaker 2 And then there's people who are not like that and they're just there for the job or whatever.

Speaker 2 But like, there are people, you know, who legitimately are working hard and trying to figure things out. And what happens generally, if you know them, is they get ground down.

Speaker 2 They end up disappointed and then they end up either becoming disillusioned or they end up leaving and going into the private sector.

Speaker 2 And so I think Dominic's explanation is a good one, which is the system is running the people, not the people running the system.

Speaker 2 The bad news with that is that

Speaker 2 it'd be easier if the solution was just to swap out the people. In a sense, that would be an easy answer to the question.

Speaker 2 Whether you could do that or not is an open question, but at least you would know what to do to reform the system, as Dominic and others talk about.

Speaker 2 And I actually saw Starmer is now talking about this too, right?

Speaker 2 Which is like, okay, how do we actually redo the system? That, of course, is much harder.

Speaker 2 The other lens on this that I think about a lot is Curtis Yarvin, who's also a good friend of mine. And

Speaker 2 the way he describes the American system that's running the people is

Speaker 2 we are living under FDR's personal monarchy

Speaker 2 80 years later without FDR.

Speaker 2 And the reason he describes it that way, he says, is, look,

Speaker 2 before FDR, the federal government was actually very small. Like, the tax rates were super low.
The federal government didn't do very much.

Speaker 2 FDR dramatically,

Speaker 2 you know, by orders of magnitude, increased the size and scope of the federal government. He did that for two reasons.
One was the New Deal, and then the other was World War II.

Speaker 2 And so, the federal government that Franklin Roosevelt left behind in 1945 when he passed away was the government that he had built, which he had run the entire time from 1933 to 1945 himself, which he had staffed himself and overseen himself and everything.

Speaker 2 And he built basically this giant structure. And as Curtis basically says, as long as you had FDR running that,

Speaker 2 it could run really well. And, you know, we won World War II and saved the free world, and it pulled the U.S. out of depression, and the whole thing worked and it was great.

Speaker 2 But if you let an organization of that size and scope run without its founder-CEO for 80 years, you end up with what we have now, which is just like basically an out-of-control bureaucracy, like an out-of-control system in which people can't even make positive change, even if they want to.

Speaker 2 And again, that's why you could have in the U.S., you could have reason for optimism, which is, okay, what do you need? Well, you need another FDR-like figure, but in reverse, right?

Speaker 2 You need somebody and a team of people around them who's actually willing to come in and like take the thing by the throat and make the changes.

Speaker 2 By the way, make the changes that FDR would probably make if he were here to make them, but he's not, right? And so somebody else has to step up and do that.

Speaker 2 It has to be a president because nobody else conceivably has the power to do that. But, you know, we will see how much this president can do.

Speaker 2 But like, that's a lot of what this administration plans to do. In the UK, you know, look, the UK government maybe grew up in parallel with the U.S. government.

Speaker 2 So maybe FDR is also partially responsible for it by inspiring, you know, the general modern Western style of governance.

Speaker 2 But also, of course, the English system grew up for many centuries before that.

Speaker 2 You know, it may be time for an FDR-style transformational leader to come in and, like, really get after it. Yeah, the path dependence.

Speaker 1 I always, you know, we always talk about the wonderful heritage of the UK, but you don't necessarily think about the Einstein effect that has got you beholden to all of this that came before. I learned last year that the distance between the two front benches in the House of Commons is a broadsword held out at arm's length from either side,

Speaker 1 which just tells you kind of everything that you need to know about what our current government and political institutions in the UK are inheriting. It's like, if you've got a country that's a thousand years old, a couple of thousand years old, you're like, bloody hell, there's a really lovely history there. And you say, yeah, but what else comes along for the ride?

Speaker 1 And sometimes it's stuff you want to get rid of. Marc, let's bring this one home.
Dude, it's been a long time coming. You're so great.
I'd love to bring you back on at some point soon.

Speaker 1 Where should people go if they want to keep up to date with all of the stuff that you're doing?

Speaker 2 Go to X. Go to X.
Pmarca, P-M-A-R-C-A, on X.com.
And also on Substack.

Speaker 1 Marc, I appreciate you.

Speaker 2 Thank you, mate. Good, fantastic.
Thank you, Chris.

Speaker 3 The holidays mean more travel, more shopping, more time online, and more personal info in more places that could expose you more to identity theft.

Speaker 3 But LifeLock monitors millions of data points per second. If your identity is stolen, our US-based restoration specialists will fix it guaranteed or your money back.

Speaker 3 Don't face drained accounts, fraudulent loans, or financial losses alone. Get more holiday fun and less holiday worry with LifeLock.
Save up to 40% off your first year. Visit lifelock.com slash podcast.

Speaker 3 Terms Apply.