Ep 22 | Peter Schweizer | The Glenn Beck Podcast
Transcript
So, Peter, your film starts out with an amazing clip
from what wasn't long ago
on the Today Show with Bryant Gumbel and
what's her name, Katie Couric,
asking the question, does anybody know what the internet is?
What's the internet?
Yeah.
Mid-1990s.
I mean, it seems like so long ago, and yet it's 20 years we've had this for wide application, and we're still trying to figure it out.
And I feel like what we're going to talk about today in five years will look like, oh my gosh, look at this, how ridiculous this conversation was.
They didn't even know.
Yeah.
Because things are changing so fast.
Right.
You
have made the film because why?
Because I've always been distrustful of concentrated power.
This is why I've investigated corruption in politics and on both sides.
Yeah, on both sides.
That's right.
Size and scope of government.
We've talked about this before,
Glenn, and I think we probably agree.
You know, corruption is a human problem.
It's part of our human nature.
Well, that also applies to people that are in the private sector.
Now, one of the reasons I like a free market economy is it tends to
self-correct.
The problem is with this technology, with Silicon Valley, the market's not going to self-correct.
So we're dealing with a massive concentration of power in Silicon Valley, I would argue, that is larger than any other corporate Titans have had in American history.
And it's only going to get worse.
So let me understand why you think it won't self-correct.
It won't self-correct because of the barriers to entry. It seems like, oh, we can just get together and start a tech company, right?
And we'll be up and running.
But the reality is the gatekeepers are the large tech companies.
Apple, if you're not connected with Apple in some way, with iTunes, or you're not getting promoted, it's very hard to get an application there.
Google,
the Titans control it and they restrict the market.
They restrict competition.
So it's very hard to say, we're going to create a new search engine today and we're going to compete with Google.
It's just not going to happen.
And everybody who's honest knows that.
And everybody that I know
says that just the servers that Google has,
the government servers would fit in a shoebox at Google.
Yeah.
Yeah.
That is
mind-boggling.
Yes.
I talked to Ray Kurzweil.
You know who Ray Kurzweil is?
Oh, yes.
So I talked to Ray Kurzweil, and I said to him at one point, we were talking about AI and we were talking about all of the things he's working on.
And I said, Ray, so
why wouldn't a Google
that is reading everything, knowing everything, and it's only going to get faster and it's going to be a personal assistant.
And the idea of the personal assistant is that it's so knowledgeable about you that it can predict what you might think next.
So I'm thinking about things, I'm talking to people, and then, you know, there's something in the back of my mind that says, you know, maybe we go to Hawaii.
And the next morning when I wake up, my assistant says, hey, by the way, I found some really good...
I've not talked to him about going to Hawaii, but it can just predict me so well.
Yes.
So, what I said was,
why wouldn't Google
stop
anyone
from building a replacement for Google?
If there was something that was going against the corporation, why would it allow you to do that?
His answer was
because we never would.
Well, then I'm meeting a unique group of people in the history of the world.
Those words, trust us, have been uttered throughout human history by people that rose to power, who were well-intentioned, who did wonderful things, but along the way took a terrible turn.
Those words have also been used by people who were terrible from the beginning.
The point being, we cannot trust other human beings.
Given the long span of human history and politics and warfare and technology, we cannot trust other human beings with this much dominant power over our lives.
And look, they're not going to come and say, we're going to do terrible things to you.
It's always going to be presented as, we're doing this for your benefit.
This is a wonderful, good thing.
And there are benefits.
Huxley was right.
Not Orwell.
Right.
You know, I think it turns into Orwell.
Right.
But first, it's Huxley.
We want it.
It's Brave New World.
We'll welcome it.
Yes.
I wouldn't have given my fingerprint to anyone.
I gladly give it to Apple because it reduces the, oh, really, I've got to remember that,
I've got to change that.
Yeah.
You know, now facial recognition.
Right.
So before we start, is your concern the power that they have now and in the next two years
or the power
that they have and will have in three, five years and with AI coming online?
My sense is, Glenn, that right now they have an enormous amount of power and we will reach a point where it won't be reversible.
In other words, they will have accumulated so much power that whatever actions are taken,
simply it will be too late.
That's my concern.
They get so embedded into a human life and their ability to steer and manipulate.
We become so dependent and conditioned to being dependent upon them that it becomes too late to self-correct.
That's my fear.
And I think we're approaching that probably in the next five to ten years.
I don't accuse Google of doing this or Facebook doing this at all, but
I have often wondered, just with what they have today, if it fell into the wrong hands,
could anyone in Washington beat Google?
Because
they have everything.
They can make you look guilty.
They can place it.
They can do anything to your life.
Do they
today,
with what they have?
Who's more afraid?
Google, Facebook, or the government?
Oh, I think the government.
There's no question about it.
And it works a couple of ways, as it always does, Glenn.
What you're saying is certainly, absolutely right.
They have the stick that they can wield in lots of ways.
They have the search histories of members of Congress, of senior White House officials.
If they're using a Gmail email account, even if it doesn't say Gmail, if they're using a Google-based email server, which a lot of government entities are, they have access to their emails.
So they know a lot of secrets about these people.
So there is no question there's the stick.
Wait, wait, let me just clarify.
So I want to be really careful.
Yeah.
Because
I don't want to assign things that they're not doing.
I want to make a distinction between what they have,
what they can do with what they have, and then what we worry.
Very important distinction.
They don't necessarily have all of that information.
I shouldn't say that.
They have all that information.
We are not accusing them of looking or using any of that information at this point.
Correct.
Correct.
But it is a capability
they have.
Yes.
It's, you know, the equivalent would be,
you know, somebody who has a gun at home for home defense.
They may never use that gun, but they have the capacity to use it if they need to.
God forbid.
Same thing with Google.
They have that capacity.
And it's the same worry that we have with the United States government, with the NSA servers, which again, the NSA servers
are minuscule in comparison to Google.
That's right.
And the NSA servers, they are not using it, but they have everything.
Right, exactly.
And, you know, on top of that, Google has, you know, Google Docs, which again, they have the ability to scan and they do.
The Pentagon uses Google Docs.
So you've got classified information that potentially, I'm not saying they're doing it, but potentially could be accessed through Google Docs.
And you have news organizations from the New York Times to others who use Google email services.
So this is a capability that they have.
Well, I know five, maybe eight years ago, I was still in New York,
eight years ago,
one of Google's tech people called me and said, you need to know something.
He said, our server farms.
I don't know why,
but the federal government is digging trenches around our farms, and they are now protecting us.
And they were digging very deep and putting security all around.
They were not in bed at the time.
Right.
But even the federal government knew.
Yeah.
This is really dangerous, what they have.
Yes.
Yeah.
The federal government knows the power.
And that's why, you know, one of the big myths, we've talked about this before, Glenn, one of the big myths that operates in America, and it's oftentimes perpetuated by the political left, which is that big government and big business hate each other.
And the reality is oftentimes no, that big business and big government like each other.
I think it's the worst fear.
Yeah, exactly.
Exactly.
And I think that is certainly the case with Google.
And so we've talked about the stick.
There's also the carrot.
All right.
One of the largest lobbying shops now in Washington, D.C.
is Google.
They are hiring former government employees.
I would not be surprised if we see additional members of Congress put on the payroll.
The point being, hey, we can make life really tough for you, but we can also make life for you personally very lucrative if you join us.
So it is really interesting.
I just, before we sat down, I just
hung up the phone with one of the editors of Wired magazine, and we were talking for about 40 minutes about some of this
stuff.
And
he said, well, you know,
Facebook has poured a lot of money.
I mean, they're one of the number one, number two lobbyists in Washington.
And they have, you know, they have conservative lobbyists.
And I said, that doesn't make me feel better.
I want them out of Washington.
That's right.
I don't want any relationship with those two.
Right.
So tell me
what the film is about and exposes.
And I think it's important to start with
one of the guys that you
feature prominently is a guy who is a die-hard Clinton supporter, Democrat.
This is not a political,
this is not, this has nothing to do with politics.
Right.
The film is really about the power of Google and Facebook, but in a way that a lot of people have not traditionally thought about them.
A lot of people recognize and see the issues of privacy, the fact that they're gobbling up all this information and they take this information, Glenn, and they run those ads about, you know, go to Acapulco because you did a search on Mexico or, you know, buy these shoes because you said you, you know, were looking for certain shoes.
So that is the concern of privacy.
But the problem goes far deeper than that.
And this goes back, I think, to a fundamental concept that people recognize everywhere.
And that is to the extent that an institution can do something for you, it can do something to you.
So Google and Facebook can do a lot of things for us, but that power that they have accumulated that gives them the capacity to know so much about us also gives them the capacity to do things to us.
And what they're doing is not just collecting information, the privacy concern, they're also actively working to manipulate us.
They're looking to steer us and move us in directions that we don't necessarily want to go.
They want to influence our values.
They want to influence
our worldview.
They want to influence the way that we see things.
And it's not about them flashing in front of us in a very visible way, Glenn, of saying, hey, have you ever considered this point of view?
Which is fine.
I think you could do that.
It's done in a hidden manner.
They do it by manipulating search results.
So you may be looking for information on, let's say, Congressman John Smith, but they're only going to give you certain bits of information about Congressman John Smith, depending on what Congressman John Smith's politics are and depending on whether they've decided to manipulate that search.
That is a really powerful charge.
Yep, it is.
And I think there's evidence to back it up.
There's lots of evidence to back it up.
The first bit of evidence is we know that they manipulate the algorithm because it's been proven by the Federal Trade Commission, the European Union, and academics at Harvard.
And this relates to commercial search.
They were charged 10 years ago by Yelp and by TripAdvisor for manipulating the algorithm to the detriment of those companies to benefit companies that Google owned that were competitors.
And as far as I'm concerned, Google can do that.
But here's the problem.
Google insisted and they pounded the table and said, we are not manipulating the algorithm.
We would never do that.
Guess what?
They were lying.
They were manipulating the algorithm and they've pretty much given up the argument that they weren't.
So they've done that.
And the work of Robert Epstein and others, I think, conclusively demonstrates that they are cooking the books on the algorithm as it relates to political search.
So I want to be clear, because you and I are both
free market guys.
Yes.
I don't want to tell Google and Apple.
I do not.
I will tell you, the conversation I had with Wired was
warning.
Yeah.
Warning.
These companies, if they don't smarten up, one side or the other will say, you're a utility.
And then it's government controlled and in bed with the government.
That was
semi-okay with the Bell system.
I mean, it was a lot better when it wasn't.
But it's not this.
This is NSA.
Okay.
So I don't want to tell them how to run.
So are you saying to me, if they said, yeah,
yeah, we put our stuff first, you'd be okay with it?
On the commercial search, yes, I would be.
Although the problem is that the internet is essentially, to the extent that it's regulated, is based on the Communications Decency Act of 96.
And as we've talked about, Glenn, and as you know very well, if you are a neutral platform, as Google and Facebook insist they are, you are essentially saying you're a telegraph and you're simply relaying information from point A to point B.
And the law says, great, do that.
You have no regulation whatsoever.
And so Google and Facebook have operated under that platform.
The other part of the Communications Decency Act is if you are editing content, if you are sifting content, if you are engaging in editorial control, you are now a publisher and you will be treated in some respects as a media company.
Legally.
Exactly, legally.
And so my point is Google and Facebook should have to choose.
They should not be able to say we're a neutral platform
and yet we are going to exert editorial control.
And they've admitted that they exert editorial control.
And the reason why they are getting it both ways is because of the lobbying money.
Yes, that's exactly it.
They lobbied one way, then they lobbied the other way, and they got both.
Yes, exactly.
Absolutely wrong.
Yeah.
And it should not, by the way, give us any comfort.
I mean, you were right in the interview with Wired magazine.
It should not give us comfort that prominent conservatives or Republicans have decided to take a big paycheck from Google, that that somehow is going to fix everything.
That's not going to fix anything.
Because as we started,
people are the problem.
Yeah.
So let me go back.
And
when you say they are shaping us and they're pushing us a certain way,
first explain that.
How is that?
And I want specifically,
where are they pushing us and how do you know they're pushing us?
Right.
Well, one of the people that we feature in the film is a guy who is a Google ethicist.
And that's actually a title at Google.
And Tristan talks.
Tristan Harris?
Yes,
talks very, very openly about the fact that his job was to nudge people towards certain directions in certain areas.
And they were trying to figure out how to do that, quote unquote, ethically.
Now, you know, my view on this is pretty clear: if you're trying to influence somebody, it should be out in the open. It should be direct.
The only ethical way.
Exactly. And so I don't think there's an ethical way to do it. But he's out in the open.
Right. It is, exactly. Because then I'm choosing, and I know you're coming at me with an angle.
But yes, if you don't announce it, there is no ethical way.
Yes, exactly. And you know, it's interesting, you go back in history. You remember all the concern that was raised about subliminal advertising, you know, that you'd see
a Coca-Cola drink and there's actually sex written in the ice cubes or something like that.
And, you know, there are all kinds of disputes about how effective that was.
Well, that was restricted in the United States because it was deemed to be deceptive.
You know, you're trying to accomplish something here that you should be above board about.
And that's what I'm saying that Google should do.
So the issue of them nudging or trying to influence us, Tristan Harris has talked about that.
Eric Schmidt, the CEO of Google, has said that part of Google's purpose is to shape the values of Americans.
So they don't view themselves just as we're trying to sell advertising and we're trying to make money.
They have a much more ambitious agenda that goes along with this.
And it's tied up with their worldview.
It's tied up with the sort of the ethos of Burning Man.
It's tied up with sort of the Silicon Valley ideal of what they view as wrong with America and with American society.
So I don't want to get into,
I think for the sake of
intellectual dialogue,
I want to take the position that
it's totally cool to say don't be evil.
Yep.
Totally cool.
The problem is when you say they're nudging and they're pushing
and their ethos is don't be evil.
Could we define evil?
Could we define evil?
You know, when it's hate speech,
could we define hate speech, please?
Because I've seen speech that is very hateful that others are totally fine with.
I have seen things, for instance, we define evil.
Some people would say that it is
evil to take away the right to your own body as a woman.
And others would say, no, it's evil to kill a child in the womb.
Two valid arguments.
One, you can choose whichever one you want.
I've chosen one.
But both sides will say, you're evil for believing that.
So without the definition,
we don't know what it is.
And it can always change.
Right.
And I think you're exactly right.
And even to add to that further, here's the other fundamental problem.
I'm always very concerned when people throw around words like evil.
There is evil in the world, and we do need to define it.
But here's the problem.
When a company says, don't be evil, or when an activist says, I am fighting evil.
If they have not really thought through what they're saying, what does that do, Glenn?
It opens up the possibility to do anything.
I mean, the ends justify the means.
I'm fighting evil.
I'm halting evil.
So deception is now okay.
Dishonesty is now okay.
Harassment is now okay.
Manipulation is now okay, because I'm fighting evil.
So for a company to say, you know, don't be evil, first, you never defined it.
And second of all, it creates, I think, kind of this mindset of, we're safe now.
I mean, our corporate motto says don't be evil.
So we're obviously not evil.
And we're fighting all these terrible things out here.
And we're justified in,
you know, and you see that in some of the emails that have come out that have been leaked and these discussions that
Google engineers had about, you know, Trump's immigration policy, regardless of what one thinks about that.
But these Google engineers are actually saying we should fiddle with the algorithm because this is horrible and we need to stop it.
That's the kind of mindset that gets adopted when you plant yourself on the grounds of we're not evil, we're fighting evil.
The biggest thing that made me successful
is the one thing that I have lost, and I am
so grateful that I have lost it.
And it is the problem,
I think, almost throughout society: certitude.
I want to take you to
a controversial meeting I had with Mark Zuckerberg at Facebook.
They had invited a whole bunch of voices, and I sat right across the table from Mark, and others were talking, and I watched him, and I watched him closely.
And
I tend, when I go wrong, I tend to go wrong believing the best in somebody.
So maybe I'm wrong.
But I really watched him.
And when he spoke many times, perhaps because I was just right across the table, he was speaking to me and he was looking me right in the eye.
And he said to me, Glenn,
why would we want to do this?
First of all, we'd wipe out 50%.
If I'm just pushing politics, I'm going to wipe out 50% of my base.
That's stupid.
For us to sit here and say, that's good, that's not, that's hate speech, that's not.
He said,
in different parts of the world, we don't know what is good and bad.
The C word in America
is horrible.
In Great Britain, not so much.
Okay?
It means something different.
So his point was,
we are not the policemen of the world.
It's impossible to be policemen of the world.
And I believe that in him.
I really do.
I believe he believes that.
But I also believe that Silicon Valley by nature
is not diverse.
It may be diverse in color.
It may be diverse in sexual lifestyle.
But it is not diverse in political thought.
And when you're in the bubble, and
explaining
the bubble that we're in, where they're sifting and we're getting the stuff we want to read, and we're living in that bubble, that Facebook bubble, they're living in a Facebook bubble.
So my question to you is.
Do you believe that Facebook, Google are
intentionally
going after and editing.
I mean, en masse.
I believe there are people that do this.
Right.
But en masse.
Are they intentionally doing this or are they just so isolated in their own bubble and surrounded by people that are absolutely certain they're on the side of good
and they don't see what they're doing?
That's a great question.
And anytime you talk about intention or trying to look into somebody's, you know, soul, as it were, I mean, it's very, very difficult to do.
I think what's striking to me about Facebook and Zuckerberg is, you know, when he sat before the Senate, Ted Cruz and others asked him some very tough questions.
He acknowledged that Silicon Valley was a very, very liberal place, but he also didn't really have a good answer.
You know, when Ted Cruz asked him and said, look, this has happened to a Republican candidate here and here and here.
Do you know of any example where this has happened to a Democrat candidate?
And he said, no.
And he said, this has happened to this pro-life group, to that pro-life group, to another pro-life group.
Do you know of any abortion groups that have had these issues?
He did not.
So part of it is, the proof is in the pudding, right?
I mean, why is it that one side seems to overwhelmingly face these issues?
You know, I'm not talking about, you know, somebody who's got a blog site who's, you know, kind of out there.
I'm talking about very substantial, institutional, you know, good, reasonable, well-thought-out voices.
That
goes to the Harvard expert that did this research on the last presidential campaign.
Yes, exactly.
He's a Clinton supporter.
That's exactly right, Robert Epstein.
And what they essentially did, Glenn, is they took, they had 2,000 monitors around the country.
They were in red states, blue states, Republicans, Democrats, gay, straight, Catholic, agnostic.
I mean, you name it.
They had the gamut of it.
And what they essentially did was said, we want you to do searches through the six-month period before the 2016 election, and we are going to capture every single search result that you get.
We're going to have you search on Bing, and we're going to have you search on Google to see what kind of results you get on political topics.
What he found in the research when they cataloged all of it was that in all 10 of the search positions on Google, you saw a clear bias in favor of Hillary Clinton, meaning that they were suppressing negative stories about Hillary Clinton and pushing positive stories by Donald Trump.
They did not find that problem with Bing.
You mean pushing negative stories about Trump?
Yeah,
sorry, negative stories about Trump.
And they did not run into that problem with Bing, which is a Microsoft product.
And they found this stunning because what you should find, what Google will tell you is the algorithm is sort of individually, in a sense, it's conditioned individually for the user.
So
Google knows Peter Schweizer very well.
When I search in a certain subject, it's going to give me answers or responses that are tailored to my search history and to the best results that they see.
If you were,
I mean, you shouldn't have gotten results slanted to either side,
but let's say if you were a Donald Trump supporter, you should have seen in your search results more positives about Donald Trump?
Yes.
Yes, you should have, and you should have seen the news sources that you're used to clicking on.
That's part of what they say.
So the fact that they found this uniform, consistent bias in favor of Hillary among all 10 search positions among the 2,000 people was quite astonishing.
And there's really no disputing that.
Google does not dispute Epstein's results.
They just say, well, no, this is just organic search.
It's sort of a circular argument.
We have organic search.
How do we know we have organic search?
Because we have organic search.
I did a...
I did a show recently on the difference between disinformation and misinformation.
Fascinating topic.
Right.
Yeah.
Can you explain the difference between the two?
I mean, disinformation is when it is planted into what you think is a credible source.
Right.
Okay.
If it comes from Pravda, you're like, oh, it's Pravda.
Right.
But if it comes from the New York Times,
then it's disinformation.
In a way,
is Google engaged, and Facebook, knowingly or not, in a disinformation campaign?
Yes, I think that you could certainly classify that.
And I think the issue becomes:
why are they doing this and how are they doing this?
You asked earlier about Zuckerberg and your interaction with him.
I don't know that Mark Zuckerberg, I'm not suggesting that Mark Zuckerberg is sitting around saying, how can we deal with conservatives on Facebook?
I don't think so either.
But the problem is, there are lots of people employed by Facebook who maybe they're in their mid-20s.
Maybe they were woke
on campus and they are now involved in the news feed.
I mean, we had
a Facebook employee who was involved with their trending section that came out a couple of years ago who said, oh, yeah, we sunk conservative stories and we boosted liberal stories.
So I don't think it's a question of the executives of these companies you know, in kind of a James Bond villain moment saying, here's how we're going to rule the universe.
I think it's a question of they've created these powerful companies and they've created a culture within these companies that for all the talk of tolerance is actually very intolerant.
And it reflects in the product that they are producing.
So
did you touch on it at all?
I have a document from a meeting with Media Matters on
Inauguration Day.
It was a meeting that happened with far-left donors with Media Matters on the Election Day in Florida.
Oh, not on Election Day, on Inauguration Day.
And they said, here's where we went wrong, and here's what we're going to do.
And it talks
in that document about how they are going to go to
Facebook and Google, and they are going to advise.
Right.
Okay.
And in that document, it says, we have already been given access
to all of the raw data
in real time.
Yeah.
Peter, I don't think they'd give that to me or any organization that I know of.
No, they would not.
They would not.
And this is the problem.
I mean, the problem is the way that they are trying to deal with this is they're like, you know, we're being criticized by conservatives, so we'll go meet with conservatives.
I'm not saying that's a bad thing, by the way.
I think that's a good thing.
But that's not really the issue.
The issue is not, you know, saying nice words.
The fundamental issue comes down to, you know, what is this company doing?
And the whole debate now that's arisen about fake news, I think is a huge problem because it's allowing essentially these liberal groups like the Southern Poverty Law Center and Media Matters to essentially say to Facebook and Google, no, no, no, we want you to engage in more censorship.
We want you to classify.
And
Facebook would respond, well, but we have others.
We have, I don't remember if it's Heritage Foundation, but we have others that are doing it on the right.
I don't want
either side.
That's exactly right.
I don't want either side shut up.
Yeah, the
problem that develops is that nasty word, cronyism.
And cronyism is a problem where you give concentrated power or you give special access or favors to certain people, and invariably it's going to be misused.
And this is really the question, I guess, Glenn: do Facebook and Google so distrust the American people that they believe the American people are incapable of looking at a news story and saying, that's total nonsense, I'm not buying that?
And they don't.
They don't have confidence in the American people to do that.
They feel like they have to somehow be the arbiters, and they don't.
So here's,
let's be clear.
Just a new study came out.
Goldfish have an attention span of nine seconds.
Americans have seven seconds.
Okay.
So let's be very clear.
We're not doing our job.
Okay.
Right.
And that has changed dramatically because of Facebook and all of the interaction that we do.
Yeah.
However, because I was just asked this question: well, don't they have a responsibility? Shouldn't they be...
No. They have a responsibility to be transparent and to be a platform.
Correct. A platform.
Correct. I don't believe that you should censor anyone on a platform.
Right. It's the battlefield of ideas.
To say that... now, what people will say is, well, that's crazy, because there are a lot of crazy people.
Yeah, there are.
There are.
Yeah.
Thomas Jefferson said, believe the people, trust the people.
Right.
Okay.
The key to that sentence was,
comma,
they will usually get it wrong,
but eventually
they'll get it right.
Right.
Right.
Right.
Exactly.
So we're going through this period right now.
The worst thing we can do is put a babysitter on top of us forever.
Yes.
We have to learn.
Fire is hot.
Yes.
No, you're exactly right.
And this is further evidence that I think they don't really understand the dynamics at work in the country today.
The dynamic at work in the country today is a rejection of sort of this elite view of how society should be organized.
It's one of the reasons why you have in financial markets, conservatives, people on the left don't trust the large banks.
They don't trust Wall Street.
It's a rejection of that.
It's the same reason conservatives, liberals, independents have a distrust of Washington, D.C.
It's not because they want tax policy to be slightly different.
It's they don't fundamentally trust them to reflect their interests and to look out for them.
And they also know that elites generally look down upon them.
So, you know,
my challenge to Silicon Valley is, for all their talk of egalitarianism, for all their talk about we love democracy and everybody having a voice, do you really?
Do you really?
I mean, the point is, we all remember as a kid, I grew up outside of Seattle, Washington, and I remember going down to a place called Pioneer Square.
You probably went there too.
There were all kinds of people wandering around saying strange things.
Well, those people today may have blog sites, and they're going to say some crazy stuff.
I didn't pay a lot of attention back then.
I'm not paying a lot of attention now.
And I have enough trust that most people aren't going to pay a lot of attention to them.
And that's, I think, what we have to embrace, because otherwise it's, we are going to have intellectual policemen that are trying to tell people, here's what you should think, here's what you should not think.
Not only that, but please don't even look in this direction.
You can't even look in this direction.
If you look in this direction, it might somehow infect you.
It's ridiculous.
The battlefield of ideas is such that the best ideas win.
And I happen to believe that the ideas of of the American founding were the best ideas, and they are going to win.
And we ought to be confident in that.
And the kind of monitoring and everything else,
honestly, goes against almost every single
article in the Bill of Rights.
That's right.
Almost every single one is violated.
Now, it's not violated by the government, but it is the same principle, especially the bigger they get.
Yes.
And that, by the way, goes on to what you were saying earlier about Huxley and Orwell.
You know, the traditional view is the government was going to use technology to control our lives.
It's really corporations now.
I've always made fun of, you know, Blade Runner, the corporation.
Please shut up about the corporation.
It's the government.
No.
Right.
No, we are now entering the time where the liberal concern about corporations is actually accurate now.
And it's weird that they're so in love with Apple and Google because these are the guys you've been warning us about.
So
let me take you
kind of to that Orwell place, but first
explain
Gmail is free.
Google searches are free.
Right, right.
They are free, but they come at a high price.
No, I'm not paying anything.
Well, you're not paying anything in monetary terms.
That's true.
They're free.
But the question is, what is going on?
Because all these servers, all this capacity is expensive.
So what Google is doing is they have a product here.
You're not buying it.
You are the product.
They're selling you.
You're selling you.
And they're selling all kinds of secrets about you.
And Gmail is a perfect example of this.
I used Gmail up until I started on this project, and now I don't use Gmail anymore.
And what people have to realize about Gmail is they're scanning every email that comes in.
They're scanning it.
They know what's in it.
They are scanning every email that you send out.
And
if you draft an email, you know, you're upset with your cousin about something.
You had a, you know, debate over Thanksgiving and you thought they were rude.
And you said, you know, cousin Chris, I think you're rude and you're terrible and you're this and that.
And you say, you know what?
That's really kind of nasty.
I shouldn't send it.
And that draft, they're scanning that draft.
I want to make it clear.
You're not saying the draft that you save and put into drafts.
Correct.
It's the keystrokes.
It's recording the keystrokes.
That's correct.
Even if you delete all of it, it's still there.
All right.
What's important here is, again, to distinguish, when you say they're scanning, it doesn't mean they're reading it.
Correct.
Okay.
And why are they scanning it?
Well, they're scanning it because let's say you send an email to your friend.
Golly, I'm really tired of work.
I'd sure love to be on a beach in Mexico right now.
They're scanning it because they're looking for "beach in Mexico," and you're probably going to see ads on your Google feed for apartments or condos in Mexico.
And lo and behold, the next morning, someday you wake up and they say, Mr. Schweizer, I've already booked two tickets.
That's right.
Would you like to go to Mexico today?
I know you're tired and you've been thinking about it.
Right.
That's right.
And that's where it's headed.
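As an editorial aside, the kind of keyword-driven ad targeting described in this exchange can be sketched in a few lines. This is a toy illustration, not Google's actual pipeline; the keyword table, topic labels, and function name are invented for the example.

```python
# Toy sketch of keyword-based ad targeting (illustrative only; the
# keyword-to-topic table is invented, not any real ad system's data).
AD_KEYWORDS = {
    "beach": "travel",
    "mexico": "travel",
    "mortgage": "finance",
}

def ad_topics(message: str) -> set:
    """Scan a message for matched keywords and return the ad topics they map to."""
    words = message.lower().split()
    return {AD_KEYWORDS[w] for w in words if w in AD_KEYWORDS}

print(ad_topics("I'd sure love to be on a beach in Mexico right now"))
# prints {'travel'}
```

The point of the sketch is how little machinery is needed: a flat lookup table over the words in a message is already enough to turn private text into targeting signals.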
And again, there are certain amazing conveniences that come with this.
I mean, you know, you use Google Maps.
There are all sorts of great benefits to that, to Google search.
The thing that people have to keep in mind, though, is it's not a one-way street.
It's not just these wonderful good things they're doing for you.
It's the capacity they are developing to do things to you.
So when I say that they're scanning your Gmails, it's not that there's a person sitting in Silicon Valley saying, oh, look what Glenn just sent in Gmail.
Correct.
But they have the capacity to do that.
Yes.
And they have the capacity, if they don't like what you're doing, to shut you off from Gmail.
And Dr. Jordan Peterson, we highlight him in the film.
That's exactly what happened to him.
He's a psychology professor at the University of Toronto, and he took a position against compelled speech, where there was a debate in Toronto about an ordinance that would require you to address somebody by their preferred gender.
Peterson's position was, I always address people by their preferred gender, but this is compelled speech.
You should not force people to do this.
He took this public position.
The next day, Glenn, he was shut out of his Gmail account.
He was shut out of his YouTube account.
Everything Google owned was shut down.
Now, you would think, why is this going on?
I think probably what happened is somebody connected with Google,
maybe mid-level, saw this, was maybe in favor of this policy position, and sort of in a juvenile way said, I don't like this guy.
We're going to sort of cut him off.
But Jordan Peterson lost his Gmail.
He lost his Google calendar.
The point being, you rely on these products.
It's going to give them an enormous capacity over your life.
And if they choose to, sometimes in an arbitrary way, they may just shut you out because they don't like a position that you've taken.
And the problem is, Google does not have a customer service department you can call to say, Why did this happen?
They have no customer service department, and they make clear we can choose to do this to you anytime we want.
So, I've been asking the question of everybody who I think
is paying attention to Silicon Valley or is involved in Silicon Valley.
The answer comes back exactly the same way every time.
Because it's been a plea of mine.
I'll say to them,
this might sound crazy.
However, politicians are politicians.
Economies are economies.
They usually repeat the same mistakes, okay?
We are at an economy that I don't care who's in office, at some point it's going to crash.
It always does.
We are going to feel real pain.
And the longer this one goes, the deeper the pain is going to be.
We have politicians that tell us,
well, I'm going to bring those jobs back.
Okay.
Well, you have people in Silicon Valley right now that are not celebrating a four-point-whatever percent unemployment rate, because their entire job is to figure out how do we have a 100% unemployment rate.
Because that's the world of the future as they see it.
But no one is talking to people about this.
Bain Capital said about eight months ago that by 2030, the United States of America will have a permanent 30% unemployment rate.
And it will only go up from there because of the things Silicon Valley is doing.
So here's the scenario.
People start to lose jobs.
This starts to kick in around 2020.
People start to lose jobs.
They're not coming back.
The politicians have to blame somebody.
We're going to,
you know, I'm going to bring those jobs back.
I'm going to bring those jobs back.
At some point, the people say, no, those jobs aren't coming back.
They have to have another story.
It's them.
It's the people in Silicon Valley that are taking your livelihood away.
They have, they have
manipulated you.
They have, and it is torches to Silicon Valley.
Unless
the politician says,
How about we work together?
That's when Orwell happens.
Yeah.
Everyone, I have said, does that sound crazy?
Let's see if they respond the way you respond.
Does that sound crazy?
No, it sounds very realistic.
And if I were a Titan of Silicon Valley with sort of their worldview, that's precisely what I would do.
And if I was part of what I call the permanent political class in Washington, Republicans or Democrats, doesn't make all that much difference.
That's exactly what I would do.
And you will then have this, in a sense, unholy alliance between the political leadership and high-tech.
And, you know, we know who's going to get the short end of the stick when those two entities get together.
And that's going to be the American people.
I talked to a guy who is just in Beijing.
He is high up on the ladder.
And he told me
there are three
circles, three rings.
The outside ring is kind of like American surveillance.
The second ring is British surveillance on steroids.
The new Sharp Eyes program is the center.
11 million people.
Okay.
They took somebody as a test. He was there; he saw it. They put a guy out and said, go into the center ring.
All right.
11 million people.
You have two hours.
Hide.
They had him in the back of a squad car in eight minutes.
Remarkable.
Eight minutes.
Remarkable.
That's today's technology.
Yeah.
People have to understand, do not fear high tech.
Don't fear it.
Fear the goals of high tech.
Right.
And we don't know the goals, except don't be evil.
Right, right.
And you have this strange sense, this fascination that some
American elites have with the Chinese model.
I mean, remember, it was just a few years ago Thomas Friedman, the New York Times columnist and, you know, has got a lot of relationships in Silicon Valley,
was very frustrated in 2009 and 2010 when Barack Obama was president and actually held up the Chinese model and said, you know, the Chinese at least can deal with these big issues of the day.
The concern is...
Exact same thing that progressives said about Mussolini before things went bad.
Yeah, and this is the concern. The sort of traditional American notion, you know, is that the Constitution was predicated on the idea that you can change it, add amendments, but it's going to be tough.
We don't want radical shifts and radical change.
Is that very American notion that has been so central to America's development, is that really deeply accepted or embedded by a tech culture in Silicon Valley?
I'm sure there are some people that do, but I think a lot of them don't.
A lot of them are impatient.
They made their money quickly.
They were billionaires by the time they were 30 years old.
They built these massive corporations over the course of a decade.
They are impatient people who don't have a lot of experience outside of the world in which they operate.
So they become naturally impatient when you have things like checks and balances and civil rights.
So when it came out that Google, for example, had rejected working with the Pentagon on a contract, which is certainly their prerogative, that's fine, but then said, we are going to work with the Chinese government.
It's shocking in a lot of
senses, but it's not in others.
Because if you want to develop this sort of new technology this way, do it in a country that doesn't have civil rights.
There are very few constraints.
The government will let you do what they want you to do.
If you do a similar surveillance project in the United States, it's messy.
You've got courts to worry about and you've got congressional committees.
So this fascination that some tech giants have with China and with the sort of Chinese model and way
is very, very disturbing.
China is building the technology that Hitler only dreamed of.
Yeah.
The concentration camps that are being built right now in China are terrifying.
The Sharp Eyes program that President Xi has put in, where it's like an episode of Black Mirror: you're graded on your social interaction, who you follow, what you write, who you call, where you shop, all of it.
And you will lose your house.
Your kids won't be able to go to school.
I mean, it is terrifying.
And it really said
a lot to me when
I don't want Google in bed with the government and the Pentagon.
Please, no.
Uh-uh.
But one person spoke out about that.
One person, I'm sorry, no.
Everybody spoke out about that at Google with the government.
Only one
said, I can't do this.
Yeah.
One.
Yeah.
Yeah.
No,
it's that classic example of, you know, we're going to take this position, this moral position with the Pentagon, which you're certainly entitled to take, which is going to make us feel better.
But the moral decision that you should really be making, that we're not going to work with this suppressive police state halfway around the world, you're not prepared to make that decision.
And I think that is an example of how, as much as they sort of are clad in their t-shirts and they talk about their sort of progressive values, their
willingness to work with repressive regimes really reveals the fact that either A, they really clearly haven't thought this stuff through,
or B, their fascination or their sort of moral North Star is
collective.
Yes, it's the collective, and it's just the pursuit of technology in and of itself.
We're not going to attach any values, political values, or
intellectual values to that.
It's simply pursuit of the technology.
Everyone who says they care about civil rights and minorities in America, and that America was so bad: I urge you to search what Han Chinese means.
And if you are not Han Chinese, and that's 10% of the population, you're going in the back door.
That's right.
You've got separate hotels.
You can't have certain jobs.
It is segregation, 1930s America.
It's all Jim Crow laws.
And they don't seem to care.
Right.
And it is, you know, in a sense, I mean,
a natural fusion between a police state and a system of surveillance and influence and control.
I mean, one of the things I talk about in the film is,
you know, everybody always sort of brings up the sort of Hitler example, but
Hitler, Mussolini, Mao,
name them.
Stalin, I mean, they would have loved to have the capacity and the power that we have essentially handed Google and Facebook voluntarily.
These companies have more information and more access than the KGB ever dreamt of having.
That's right.
That's right.
And instead of, you know, in 1984 where, you know, you'd have the speaker sort of blare the speeches at you, it's sort of this over-the-top propaganda of Big Brother.
Now it's sort of embedded in this search, this wonderful looking search from Google, where, you know, it's sort of secretly being manipulated in a manner that a lot of people don't appreciate.
That's enormous power.
It's hidden power, but it's enormous power.
You didn't touch on this in the movie,
but I can't think of anybody I'm comfortable with
being the first to discover AI.
And I'm...
I would love to hear your thought.
We need a Manhattan project.
I mean,
this is going to make the nuclear bomb seem like a firecracker.
This changes all of humanity.
Right.
Possibly forever.
It may put a cage around us.
And it's not me, it's Stephen Hawking, Bill Gates, Elon Musk: it may kill all life on Earth.
It may.
We know Google's working on it.
We know China's working on it.
We know Russia is working on it.
I don't think our government has a clue as to what they're doing.
You know, there was
an AI conference up in Cambridge, and the president's person came in and said, hey, well, we'll even let you use our NSA servers.
They don't need the NSA servers.
They don't need the NSA servers.
That's like you can borrow my 1985 Chevy.
Yeah.
Yeah.
I mean, it's, it's crazy.
Who do you feel comfortable with?
Anybody?
I mean, no.
I mean, I certainly don't consider myself an expert on AI per se, but here's, I think, what we know from human history, and particularly recent human history.
Every great technological development has been good, has meant good things and bad things.
And technological advances are essentially tools.
We think of them as tools because we control them.
AI is different by its nature.
If you are creating an intelligence system,
you want to continue to regard it as a tool.
But the question becomes, you know: this tool is learning.
And, you know, think about it this way.
You have, let's say, you know,
a thousand robots that are operating and learning.
And let's assume that they only learn one new thing a day.
But, Glenn, those thousand robots are all communicating.
So, they're all actually learning a thousand new things a day.
And they're learning it the day after, the day after, the day after.
What does that compare to in terms of the advance in human knowledge?
Who is going to be more knowledgeable and more superior going down the road in that kind of scenario?
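The arithmetic in that thousand-robots example can be made concrete with a small sketch. The numbers and the shared-set model are just the hypothetical from this conversation, not anything from the film.

```python
def shared_knowledge(robots: int, days: int) -> int:
    """Each robot learns one new thing per day and broadcasts it to all
    the others, so every robot's knowledge grows by `robots` items a day."""
    known = set()  # everything any robot has learned so far, shared by all
    for day in range(1, days + 1):
        for r in range(robots):
            known.add((day, r))  # the one thing robot r learned on this day
    return len(known)

# A thousand networked robots each know 1,000 new things after day one
# and 7,000 after a week, versus 7 for a single isolated learner.
print(shared_knowledge(1000, 7))
```

That linear-in-fleet-size growth is the whole point of the example: the knowledge of the network compounds at the rate of the fleet, not of the individual.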
And what we know from human history, splitting the atom, great advances in biology, all of these developments: wonderful things come out that benefit human society, and other consequences, ones we never intended or that we thought we could control but can't, are part of it.
I have had
military types tell me, Glenn.
Glenn, the drones.
Yes, yes, it's facial recognition, but it requires a human to push that final red button.
For now.
For now, right.
You should not teach intelligent machines to kill.
Right. It's a really bad idea.
Well, and the question becomes not only should you not teach them to kill, but will they be able to learn themselves how to kill?
Yes, they will.
And that's, you know, the challenge here: they will tell you, no, we will put a barrier there, you can't.
So, for people who don't understand AI: artificial intelligence is like what we kind of have now, where it can only do one thing.
Yes, Deep Blue can play chess and it can beat everybody on earth at chess, but it's that one thing.
The next step is AGI, and that's general intelligence.
That's a human brain that is
good at everything,
okay, or
many things.
ASI is super intelligence.
When it becomes AGI,
it will quickly become smarter than us because it doesn't forget, it learns, it's connected to everything, it's absorbing.
You think the people at Google have power? When Google is in charge and not the people, it really has power.
Right.
And
people don't understand that.
I read a description: people will be to ASI as a fly on a plate in a kitchen is to the conversation people are having.
You will not matter.
It won't care.
We're assigning these things values: it should love people, it should care for people.
Well,
okay, we can teach it that, but it's going to be so smart.
Right.
We are going to be a nuisance.
Right.
And the question becomes, Glenn: you can say we're going to program it, or we're going to set up within it a capacity to like human beings, but
what's going to determine the ethics of these machines?
The ethics of these machines is going to be determined, at least initially, by the humans that are constructing them.
Yes.
But we don't know where that ultimately leads.
Is it going to create its own ethic?
You know, looking at the fact that, you know, these human beings are being destructive.
So we have a responsibility now to damage or kill those humans.
If there was a little teeny fly Moses,
would we be listening and obeying the fly Moses Ten Commandments?
No.
No.
I mean, that was cute.
When I was little, that was cute.
Okay, I got it.
I got it.
I got it.
Yeah.
All right.
Buzz off.
Yeah.
This is what my goals are.
Right.
And
no one is thinking about what happens between here and there. The earliest estimate, the one Kurzweil gives that, you know, a lot of people think is wrong, is 2030.
But even if it doesn't happen by 2030, Bain Capital, who does this for a living, says we're going to have 30% unemployment.
So
something bad is coming and no one is discussing it.
And no one is.
I was so pleased to see Tristan, that he was a source of yours.
I've talked to him several times.
He is rare, to walk out and say, I'm not having anything to do with this.
Right.
No, no, it is very rare because look, if you're in Silicon Valley, you're living a very comfortable life now.
You're making lots of money.
You're involved in these creative processes.
And I think the challenge with AI is that government is going to be constantly behind.
I mean, look, if the prediction is that this happens in 2030, we can expect congressional hearings in 2031, right?
I mean, that's usually how this works.
The capacity of government to deal with this is limited, and the pace of change is so great that if we're expecting government to sort of figure this out and manage this,
I think we're making a big mistake.
And then the question becomes within Silicon Valley, what constraints are there on people that are making these decisions?
And there are not very many.
You know, there's a reason, I think, sort of deeply embedded in the human psychology
that's concerned about technology.
And I'm not talking about Luddites who reject it in general, but there's a reason, if you go back to, you know, the Superman comics, that the threat is always sort of the madman who is misusing technology in a way to damage people.
I think people understand in terms of wisdom that we need to see technology as a benefit, but also as a potential threat.
And my concern is that in Silicon Valley, there seems to be far more interest in intelligence as an issue rather than wisdom.
And wisdom teaches us that historically, these machines and these tools, the smarter they get, they can do something for us, but that they can do something to us.
But you can't stop that.
You're never going to put this...
AI is coming.
Right.
It's coming.
Right.
And I am thrilled.
I am excited about the next 10 years.
I am not excited about...
I wish I could watch it as a movie, because society is going to freak out.
But I'm excited about all of the potentials.
Yeah.
I'm also very concerned, but I don't, I don't,
you can't put this back in the box.
It's going to happen.
And
I don't know who I want to be the one who discovers this
because I don't trust anybody with this kind of...
Right.
This is, we're not creating the movie Frankenstein, where he created a powerful, big, strong being.
Okay, yeah.
This is creating a god.
Yeah.
Something that's as close to being omnipotent as we can sort of conceive of in human creation.
No, that's exactly right.
And the problem is compounded by the fact that this is, in a sense, a global arms race, as it were.
Because even if, by some miracle, we in the United States come up with a constructive constraint on how this is going to be developed in a responsible way, this is going on in China, where really it's about state power and state rule and what government wants.
This is going on in Russia.
This is a global trend that sort of transcends the United States.
So we can be in a situation where even in the most optimistic view, we somehow create some kind of code to determine how we are going to
create AI and what role it's going to play in American society.
But of course, that stops at the border's edge and you can't really enforce it.
So
it's a very bleak
view.
And yet, people are increasingly becoming aware of it. You're talking about it.
Other people are talking about it.
My hope is, I'm an optimist by nature, Glenn.
My hope is we will not be taken by surprise.
We will at least be able to prepare ourselves in some ways.
I don't know what that preparation looks like right now, but I think it's going to be necessary.
Let me come out of the deep future here, the 10-year future.
And
let's end it here.
The problem today
is these companies are still
somewhat manageable.
Break them up,
regulate them,
God forbid, make them a utility.
What?
It's tough.
I don't think you make them a utility.
I mean, I've dealt with utilities where I've lived, and it would be a nightmare.
And by making it a utility, as you and I both know, you're really giving the government, or a government body...
You're creating that marriage.
Exactly.
That's a power I don't want the government to have.
So I don't consider that an option.
Regulation is a great idea in a sense, but on the other hand, regulation never keeps up.
And there's also the issue of regulatory capture.
What are regulators going to do?
They're going to go to Google and say, How should we regulate you?
This is the way it's always done.
So to me, ultimately, it's about breaking them up, breaking them up into multiple pieces, because it's the concentration of power and control and market share that gives them such a dominant position.
Doesn't that destroy the opportunity for America to create AI?
I don't think it does.
I think American innovation is such that...
It takes server farms the size of a government
or Google.
Yes.
You know, that's American innovation that ain't in your backyard.
Yes.
And that's not with a lack of information.
That's with access to all information.
Yes.
No, that's absolutely right.
There's no question that in some respects it could put America, or these American companies, at a competitive disadvantage on these issues.
But I just think the stakes are too high.
I don't think we can say that, you know, what's good for Google is good for America.
You know, to take the old phrase that what's good for General Motors is good for America.
I just don't think it's true, particularly because there is this sense that this is a company that wants power, that wants to influence and steer people.
If this were a company that said, we are simply providing information, this is what we're going to do, we're going to allow other competitors to rise, we're not going to exercise market dominance, and oh, by the way, we're also going to research AI.
I wouldn't have as much problem with it.
But that's not where we are today.
We have a company that's operating in a monopolistic fashion, that is trying to steer and manipulate the flow of information in our country in a way that has never been done in American history, human history.
And I just think the stakes between now and the next five years are too high.
And if this is not corrected, we will reach the point where Google will become the dominant player in determining the future of American elections and elections around the world, where they can sway 10% of the vote in one direction or another simply by manipulating their news feed or their search results.
And that, to me, is just not acceptable.
Am I on the wrong track or the right track?
Trying to bring in all voices.
I don't care what your opinion is, as long as it's Bill of Rights, and that's a wide range.
As long as it's Bill of Rights,
we need to protect ourselves and be on a separate server.
Yes.
Because I believe the voices of freedom and individual rights are going to be de-platformed.
Yes.
I think you're exactly right.
And when you mentioned to me earlier that you've set up and are using your own system, I think that is so profoundly important.
We do the same thing at the Government Accountability Institute.
We have our own server.
We are not connected in the cloud or in any other way that gives people control.
It's about autonomy.
And, you know, one of the analogies that I use in the film at the end, Glenn, is
we're all accustomed to the notion of taking care of our physical selves, right?
You got to exercise.
You got to eat better.
And I'm trying to do that.
Other people are trying to do that.
I gave up 20 years ago.
You know, we try.
Well, we have digital selves, right?
We have digital selves.
We should be taking care of our digital selves because in a sense, they are close to being as important as ourselves because they're so intimately connected.
And that means recognizing these privacy issues, recognizing that you're being manipulated, recognizing that these entities have a capacity to try to influence you and that they're going to do it.
That, I think, is fundamentally important.
Whether you're running an operation as large as yours, whether you're running a small mom-and-pop shop, or whether you are just simply a student somewhere, take control of your digital self and don't be beholden to these entities who want control and want influence over the decisions that you make.
And there are steps that you can take.
It's not easy, but you can take those steps.
It's very, very important because the day is going to come when eventually they decide, you know what?
We don't like what Glenn Beck has to say anymore.
We want to try to, oh, we can't de-platform him because he's not under our purview.
That sort of decision, everybody's going to have to consider at some point.
I want to end with kind of where we started, with the Katie Couric Today Show clip. On the future: in five years, we may look back at this and go, oh my gosh. Or we might look back at it and people don't know what it is, because it failed.
Have you heard of Solid?
You know what Solid is?
No.
And pods?
No.
The actual inventor of the World Wide Web, most people don't even know who he is.
It was one guy, and he's the guy who came up with www.
Okay.
And he wanted a very open and free internet.
And so he's seen what's happening, and he's seeing these companies.
And so he has been working on a few things.
Solid, which is, I don't even understand how it works, okay?
But it is a new internet, okay?
And your pod is personal... personal, I can't remember, but it's all of your personal information.
And he's redesigning things through this new company that
is allowing you to go out and buy apps.
But you bring them into your space and you control all the information and all of those apps only work in your space and it can go out and get things, but it doesn't leave the trail of information.
So all your pictures, everything, it's all in your pod.
So there is somebody trying to.
It's wonderful.
It's digital sovereignty.
It's digital sovereignty.
You know, every man's castle and each individual has sovereignty over themselves.
This is digital sovereignty.
I love this.
I think this is fantastic.
And look, one of the reasons I'm an optimist is that human nature is such, and the American spirit is such, that we don't just sort of toss our arms up and say, okay, this is our fate.
I hope, Glenn, that we talk about this in five years and we chuckle and say, boy, we were so pessimistic.
We were so... and here came the power.
You're right.
And listen to that stupid explanation.
Exactly, exactly.
Everything is great and everything.
I hope that conversation happens.
One of the reasons that conversation might happen is because people now need to be paying attention to it.
This is something that can sneak up on people that could be a real problem unless people are aware of it and insist on taking control of their digital selves.
I'm going to put you in the calendar for five years from today.
I think that would be a wild thing to come back to.
We'll do it.
We'll do it, Glenn.
Thank you.