News Update: Who Could Tame Facebook?
In this Radio Atlantic news update, Rob shares what he learned from his exclusive interview with Zuckerberg, and from the CEO's testimony before Congress. We discuss with Atlantic senior editor Gillian White whether Facebook can be regulated, and whether it will be.
Links
- “Mark Zuckerberg Says He’s Not Resigning” (Robinson Meyer, April 9, 2018)
- “The 3 Questions Mark Zuckerberg Hasn’t Answered” (Robinson Meyer, April 10, 2018)
- “How Facebook’s Ad Tool Fails to Protect Civil Rights” (Gillian B. White, October 28, 2016)
- “Facebook Lets Advertisers Exclude Users by Race” (Julia Angwin and Terry Parris Jr., ProPublica, October 28, 2016)
- Sarah Jeong on Twitter
- “The Most Important Exchange of the Zuckerberg Hearing” (Alexis C. Madrigal, April 11, 2018)
- “Mark Zuckerberg Is Halfway to Scot-Free” (Alexis C. Madrigal, April 11, 2018)
- “My Facebook Was Breached by Cambridge Analytica. Was Yours?” (Robinson Meyer, April 10, 2018)
- “Can Anyone Unseat Mark Zuckerberg?” (Robinson Meyer, March 22, 2018)
- “The Cambridge Analytica Scandal, in 3 Paragraphs” (Robinson Meyer, March 20, 2018)
Transcript
This episode is brought to you by Progressive Insurance.
Fiscally responsible, financial geniuses, monetary magicians.
These are things people say about drivers who switch their car insurance to Progressive and save hundreds.
Visit progressive.com to see if you could save.
Progressive Casualty Insurance Company and affiliates.
Potential savings will vary, not available in all states or situations.
Hi, I'm Matt Thompson, executive editor of The Atlantic.
I am in the studio today with Gillian White, senior editor at The Atlantic.
Hello, Gillian.
Hi, Matt.
And Robinson Meyer, staff writer at The Atlantic.
Hi, Rob.
Hey, Matt.
Our regular weekly discussion with my esteemed co-hosts, Jeff and Alex, will be in your feeds very soon, so get excited.
This is an extra special bonus episode with Gillian and Rob to talk about the big news that's lighting up Capitol Hill here in D.C. this week.
So Rob, you first.
What exactly is happening on the Hill this week?
So what happened this week is that for the first time in the 14-year history of him running Facebook, Mark Zuckerberg went to Capitol Hill, went to Congress to testify before both the House and the Senate.
So on Tuesday, he testified before a joint session of the Senate Judiciary and Commerce Committees, and on Wednesday, he testified before the House Energy and Commerce Committee.
And I'd say on the whole, if you want the headline, on the whole,
he did fine.
He managed to avoid answering any hard questions.
Side-eyes from Gillian.
Sometimes by filibustering and sometimes by telling congressmen or congresswomen that he didn't know the answer and his staff would get back to them later.
I think one of the key things that his presence revealed is that many members of Congress, particularly in the Senate, don't just misunderstand how Facebook works; they do not even have a basic factual understanding of what Facebook is.
So, Congress brought Mark Zuckerberg to talk about the aftermath of this Cambridge Analytica scandal, which is pretty confusing.
What is this thing?
Yeah, absolutely.
So the Cambridge Analytica scandal is that basically data from 87 million Facebook users made it somehow to the servers of Cambridge Analytica, which is a voter profiling and kind of electioneering company that has a lot of ties both to President Trump's political world and to some dubious, let us say, political activities.
There was a personality quiz, a kind of third-party app, released on Facebook many years ago.
It seemed to be an innocuous personality quiz, but what it was actually doing was collecting data on Facebook users that was then sold to Cambridge Analytica.
I am shocked.
Shocked at the pause.
And I think the other reason Congress wanted to bring in Zuckerberg is because there has been kind of a rolling jubilee of Facebook scandals, I would say, since the 2016 election.
That is such a lovely way of putting it.
There has been shock over the ability of fake news to propagate on the social network.
There has been despair.
I'm struggling to think of negative words here beyond scandal.
But of course, fake news propagated on Facebook during the election.
It seems that networks of Russian bots and also networks of, like, concerted Russian-affiliated actors were involved on Facebook, trying to radicalize readers and possibly move them toward Trump during the primary and toward Sanders during the Democratic primary.
And there has also been, I would say, a sense that Zuckerberg really only recently, kind of, decided in public that this was a problem.
After the election, he said that he thought fake news was crazy.
The idea that fake news spreading on Facebook had had any effect on the election result was crazy.
And recently, kind of his tune about that changed.
And so, in some ways, this wasn't only about the Cambridge Analytica scandal.
This was about
Mark Zuckerberg finally, in some ways, being ready to talk about all the ways that Facebook had perhaps negatively affected the 2016 presidential election.
Gillian, I see you struggling to restrain yourself in the diplomatic way that Rob has just put this.
So, what has stood out to you?
Well, I think the first thing that stood out to me was kind of what Rob was just talking about, which is it seems almost as if Zuckerberg has this pattern of facing Facebook criticism by saying, no, no, no, no, no, everyone calm down.
That's totally not a problem.
Everyone's overexaggerating it.
Dot, dot, dot.
We could have done better.
This is our fault.
You know, it just keeps going on.
And this, of course, has been like the biggest and most concerning example of that.
But, you know, at a certain point,
it seems like you have to ask yourself, why doesn't he just own up to it?
Or why don't they just actually think about these problems before they occur?
Which, I mean, is much easier said than done, of course.
Rob, in a moment, I want to ask you about the exclusive interview that you did with Mark Zuckerberg in recent days.
But right now, tell me about what you heard in his testimony before Congress.
Well, what I heard was
that Facebook believes it's taking, in Mark's words, a broader view of its responsibility for what's on its platform, and that it is very sorry that all these things happened.
And for one of the first times, I heard Mark kind of take personal responsibility for the Cambridge Analytica scandal, saying, this is my company, I'm sorry.
Other than that, I would say I heard often a basic ignorance among legislators about what Facebook is and how it makes money.
You know, Zuckerberg had to tell Senator Orrin Hatch, a Republican of Utah, who in the past has been very knowledgeable about technology.
I mean, I think he's getting kind of made out in the press to seem like he's just totally ignorant about technology, but he chaired the Microsoft antitrust hearings back in the 90s.
Like, he did know something about technology once.
Once upon a time, yeah.
Once upon a time, but Mark Zuckerberg had to tell him that Facebook makes money by running advertisements,
which I think is apparent to anyone who has logged into Facebook for five minutes.
I mean, it's right there on your screen.
Now, Mr. Zuckerberg, I remember well your first visit to Capitol Hill back in 2010.
You spoke to the Senate Republican High-Tech Task Force, which I chair.
You said back then that Facebook would always be free.
Is that still your objective?
Senator, yes.
There will always be a version of Facebook that is free.
It is our mission to try to help connect everyone around the world and to bring the world closer together.
In order to do that, we believe that we need to offer a service that everyone can afford, and we're committed to doing that.
Well, if so, how do you sustain a business model in which users don't pay for your service?
Senator, we run ads.
I see.
That's great.
Gillian, you've listened to a lot of business leaders talk to congresspeople in the chambers of Congress.
What did you hear this week?
Yeah, so I mean, I 100% agree with Rob that one of the big takeaways of this was that congressional hearings are at times ridiculous, especially when what you're trying to get is a better understanding of something that is very technical and nuanced and something that probably not that many members of Congress have technical or nuanced understanding of.
I've told people in recent days that watching Mark Zuckerberg be questioned by some of the senators and House members who clearly don't have a great sense of what Facebook is, how it works, sometimes how the internet seems to work, reminded me a lot of the days after the financial crisis, when they were bringing in big business leaders, you know, the heads of banks like Goldman Sachs, and they were sitting them there for hours and asking them questions that, if you've ever worked in banking or paid close attention to banking or markets, you're just sitting there saying, why on earth are you asking these questions?
They were asking them questions like, do you feel bad?
I mean, sure, but what does that have to do with my actual job or my role in the bank or
front-running clients or anything that is actually a nuanced and potentially illegal or problematic behavior?
They just weren't asking the right questions, and it really betrayed a lack of knowledge about the nuances of how banks worked, what a president or CEO's role in a bank was, and how those things mattered.
And I think what we're seeing again is that same lack of technical knowledge that raises the question of why are we spending this time doing this in the first place?
And why are we giving this many people a chance to just kind of ask the same rote questions over and over?
It's not contributing to anyone's knowledge.
You could have answered some of these questions by simply logging onto Facebook.
So my hope is that this was a hearing that so many people cared about because so many people are on Facebook, so many people, you know, have a stake in this thing, that perhaps, you know, there was a bigger audience for this hearing than there have been for those in the past, especially when people, you know, people don't want to hear boring business leaders talk about
their companies.
But everyone wants to hear Mark Zuckerberg explain why they answered a simple personality quiz and all of a sudden their information is leaked everywhere.
So it sounds like American legislators aren't going to be doing heavy-hitting technologically informed interviews with the CEO of Facebook anytime soon.
But we have a staff writer of technology at the Atlantic who did do a heavy-hitting exclusive interview with the CEO of Facebook recently.
And that is you, Rob Meyer.
It's you, Rob.
What did you want to know from Mark Zuckerberg?
And what did and didn't he tell you in your conversation?
The question I had before the interview is the same question I have now after watching the two days of hearings, which is, how does Mark Zuckerberg actually think through the problem of running Facebook?
How has he processed the amount of power he has?
And how does he even approach that, and approach all these really delicate and difficult questions of, like, weighing one interest against another, and weighing one group against another, and weighing users against advertisers, and weighing the degree to which he calls this social network a community versus the fact that it's a business worth hundreds of billions of dollars? I did not see much of that in the hearings at all.
Yeah.
It really, we really didn't get to that at all.
In fact, many of the questions were about forward-looking things.
The senators' and representatives' time was so short, and Zuckerberg was so able to reply to any technical question by saying that he didn't know the answer and his staff would get back to the member, that we never really got into the nitty-gritty there.
In the interview, I tried to approach this by asking him about what the last 18 months of his life have been like, basically, and what they've been like from the perspective of him as Facebook chief executive.
Obviously, after November 2016, he didn't think fake news was a big deal.
He didn't think, to the degree that we understood Russian interference, he didn't really think that was much of a thing either.
Now he's coming in front of Congress about it.
He's swearing that his company
is being transformed, that it's taking a broader responsibility of how it's going to run its community, so to speak.
He's evolved.
He's evolved, right.
And so I wanted to know, when did this happen?
And what was, how did you decide that this was something that Facebook should do and that Facebook's power kind of warranted this broader responsibility?
And what did he say?
He didn't say anything.
I said, you know, this must be such a surreal moment for you, Mark, to see people discussing what your role has been like, what thoughts and decisions you must have personally made.
And you're describing a larger change, a larger transformation in what Facebook is going to be.
Has that transformation accompanied an emotional reckoning or an emotional turning point for you?
And what are you feeling?
I mean, do you feel guilt or regret or anything?
And he said, well, I certainly feel very bad
about what we did.
He said, the Russian thing was a huge miss.
And I said, okay, well,
what was the darkest moment for you?
Was there a moment where you sat back and realized, wow, this is what's happening?
Or where you'd gotten all the information and you finally realized what it meant?
And he said,
no, there wasn't one.
Actually, he paused eight seconds and then said, no, there wasn't one.
And then I said, was there a moment?
And he said, after another pause, no, there wasn't.
I mean, I was quite struck by that answer, because here, you know, Facebook is a company that's almost unique among American companies in that it's so determined by Mark Zuckerberg's personal sensibility.
And Mark Zuckerberg has an enormous amount of power over this company.
He is not only the CEO, he's the chairman of the board, and he owns a majority of the voting stock.
So unless Facebook were to suddenly just, like, tank, there's no one who has the power to unseat Mark Zuckerberg, because he owns the stock it would take to do it.
So, I want to focus on this question of power and managing Facebook in a moment.
But one last follow-up before we turn to that.
How did this interview come about?
Did you, you know, get a cell phone text message from Sheryl Sandberg saying, hey...
No, I mean, no, Facebook got in touch with The Atlantic, out of regard for the rigor and fairness of our reporting, I think, and said that Facebook would be unveiling, at that point, two different kinds of attempts at self-regulation, which they unveiled on Friday of last week and then Monday of this week.
One was a political ad transparency program and the other one is that they'll give social scientists and a kind of a steering committee of senior academics access to Facebook and they'll let them start to study it without Facebook having a veto over any published studies.
Awesome.
I bet those academics will come back with an answer and, like, a regulatory framework by the midterm.
If there is anyone who you can expect to get an answer on tough questions of power back to you fast, concisely, and in clear, plain language, it is social scientists.
And so I did talk to Mark about both of those efforts, and then we kind of went from there into the larger context of him as CEO.
And the big striking thing to me about it was for all the power that Mark has and the fact that he kind of wields this company almost solely, now that the company is transforming, he couldn't really explain why or how he came to that decision
or anything about why he thought it was morally important personally
as the big cheese.
He just said Facebook's taking a broader responsibility.
Wow.
Gillian, I want to pose this question to you.
And this I have no idea about, so I'm really curious how you think about this.
I want us to talk for a moment about who could regulate Facebook.
Who could tame the complex churning behemoth that is the global behemoth of like, you know, 2 billion.
1 point something billion monthly active users that is Facebook.
In your role covering business for the Atlantic for several years, you've covered a lot of different approaches to regulation of complicated companies and companies with a lot of power.
How do you think about that question?
First of all, let's talk about from a business perspective.
Should Facebook be regulated as a monopoly?
So I think the question of whether or not Facebook is a monopoly was one of the biggest questions and one of the most reported on questions that came out of the entire hearing.
You're trying to say my question is inane, Gillian?
Your questions are fantastic.
But I do think the spectacle of Senator Lindsey Graham trying to grill Mark Zuckerberg on what competitors Facebook has, and Mark Zuckerberg kind of trying to pivot and saying, well, in this sphere, it would be Twitter.
In this sphere, it would be Google.
In this sphere...
Who's your biggest competitor?
Senator, we have a lot of competitors.
Who's your biggest?
I think the categories of...
Do you want just one?
I'm not sure I can give one, but can I give a bunch?
So there are three categories that I would focus on.
One are the other tech platforms, so Google, Apple, Amazon, Microsoft.
We overlap with them in different ways.
Do they provide the same service you provide?
In different ways, different parts of the world.
Let me put it this way.
If I buy a Ford and it doesn't work well and I don't like it, I can buy a Chevy.
If I'm upset with Facebook, what's the equivalent product that I can go sign up for?
Well, the second category that I was going to talk about are...
I'm not talking about categories.
I'm talking about is there real competition you face.
Because car companies face a lot of competition.
If they make a defective car, it gets out in the world.
People stop buying that car, they buy another one.
Is there an alternative to Facebook in the private sector?
The point is, and the point that Senator Graham was trying to get at, was that there is no other entity that does exactly what Facebook does.
And thus, if you wanted to 100% duplicate your Facebook experience elsewhere, you do not have that option.
In broad terms, that could constitute a monopoly.
I think the bigger question is, and the bigger question always when it comes to antitrust regulation has been, is there a problem if there's a monopoly in a particular industry or among a particular company?
So when we think about monopolies, some of the issues that have come up, for instance, when people ask questions about whether or not there's a monopoly in the airline industry, that's a huge problem because the airlines are the way that a lot of people do a lot of their long-distance traveling.
So, if there are only a few companies, they can jack up prices as much as they want.
And people won't have the option of saying, this is absurd, I'm not going to fly with you, because I guess you would be driving or walking or taking a train.
Or something awful could happen, such as the death of several pets in a short period of time.
And you still might not have the option of saying, I am not going to fly with this particular airline because there are only so many.
So I think the question of whether or not Facebook is a monopoly also gets to the heart of what do we think Facebook is.
So there are a lot of people who say, I mean, it's just a social network.
If you don't want to be on Facebook, don't be on Facebook.
It doesn't matter.
But what has become true is that there are 1.5, 1.7, 1.9, almost 2 billion people around the globe who use Facebook as a means of communicating with each other.
There's certainly a ton of companies and advertisers that use Facebook as a means to communicate with consumers and with audiences.
So the question is, has it become almost as necessary as a utility?
And if that's the case, should it be broken up?
So should the social media portion of Facebook live separately from the advertising portion of Facebook, live separately from the data collection portion of Facebook?
And that might be a way that it's regulated.
I think what we know doesn't work, I hate to say it, is the idea of self-regulation.
Because self-regulation is the thing that people do when they know that they're in trouble, but want to have the ability to kind of control the narrative that comes out and also control what is being overseen.
At the point that you've already had as many scandals as Facebook has had, at the point that you have a scandal that's as big as this one is, I think it's just a little bit hard for the public and for lawmakers, which is why we're here, to believe that Facebook is going to be able to or has the will to self-police
and that they can foresee problems or the areas that problems might arise in the first place in order to regulate them, in order to look into them or have other people look into them.
And then you have to trust that they're actually going to report out on them.
Yeah.
So calling Facebook a monopoly and taking that regulatory approach is kind of weird.
It doesn't sound like Congress knows much about Facebook or enough to be a regulator.
But if Congress did decide to do something
about Facebook, what could they do?
What are the pathways that you might expect them to take?
There's lots of things they could start to think about doing.
They could... I mean, one of the great things that Senator Graham got to was that Facebook has partly been so successful in not having a straight competitor by buying anything that could risk competing with it.
And so I don't think there are constitutional limits on Congress stepping in.
Normally, it would be the courts that would do this via a suit from the executive branch, but I don't think there's anything that stops Congress from saying that Facebook can't acquire a social network of a certain size or can't acquire another company of a certain size, really limiting it like that.
They could pass data privacy controls that would limit across the industry what Facebook can tell advertisers about your data and even what kind of information Facebook can collect about you.
They could make Facebook interoperable.
So if you wanted to leave Facebook and go somewhere else, go to a different social network that might not exist right now, you could take all your data, or certainly all your friend networks with you and go kind of connect to them all on that other social network.
And the last thing they could do is they could impose some kind of obligation on Facebook, like a fiduciary obligation that's imposed on a bank or a doctor, where Facebook has to act in your interest, has a legal obligation to act in your interest, and where you have maybe some choices in what kind of news Facebook shows you in its news feed, or you just have more fine-grained control related to that over
what exactly you define your interest as being.
I think one of the things that we're seeing with Facebook right now is the thing that we've seen with a lot of companies that fancy themselves technology companies, but also operate as other things.
For instance, when you talk to Uber about what the company is, they say that they are a technology company.
They are an app company.
All they are doing is linking up drivers with riders.
But in effect, they are also acting as a transportation company.
In a lot of places, they are not regulated as such.
So this is a problem that we see across the tech space, not quite knowing what a company is and thus not quite knowing how to regulate it.
I think also a thing that has happened and that happens a lot with regulation is that regulation almost always follows innovation.
So, there's a lag and it's often not until there's a problem that regulation is thought of.
And then there's kind of this game of catch-up, which requires definition and all of these things that haven't been happening for years.
And I think trying to regulate at that point requires a level of imagination and innovation that often just doesn't exist in the regulatory space.
And I think for Facebook in particular, that's going to be true.
I mean, we've just talked about at least 10 different things that Facebook does and at least 10 different ways that that could be regulated.
All of those things would need to be kind of defined and siloed or reimagined and put together in completely new ways in order to adequately regulate for what Facebook is at this exact moment.
And that's to say nothing of what Facebook would be in a year, in two years, in three years, which is where they're already thinking.
So I think that's part of the huge challenge when we think about the potential for regulating Facebook.
Gillian, if you could magically wave a wand and hypnotize Congress, bring about a new regulatory agency, would there be any ill effects?
I mean, would you just wave the wand and do it?
Or would you worry about anything?
I don't think anyone should be waving wands and doing broad brush regulation.
I think broad brush regulation can in theory be problematic.
I think a big thing that we're seeing that happens with Facebook, that happens with Google, that quite frankly happens even outside of the tech sector is that there's no great way to protect consumer data.
There's no great way to protect data privacy.
We saw it with the Equifax breach.
I mean, basically any adult who has ever applied for or had credit had their information leaked.
And what was the big punishment there?
What was the big takeaway?
Supposedly, there's an investigation happening from the CFPB, but there are also rumors that that's getting tamped down on.
So, I mean, I think part of the issue with regulation is we need to be innovative in the way we think about it.
We need to think about regulations that can grow along with companies and that are actually getting to the real heart of the problem here.
So, if the heart of the problem is thinking about how people's data is used and what privacy security should be put on that data,
then we need to make sure that we're doing that in a way that one protects people and two doesn't just protect people when they are using social media, but also protects them when they're using more official things like a credit report.
And I think that's one of the big things that will need to be thought about when we think about how we're regulating Facebook.
Rob, if Congress did decide to just like go ham on Facebook, how might that change the experience of using Facebook?
Well,
it's hard to say because we don't know what Congress would do.
I think we also don't know how Facebook would respond.
I mean, of course, there are lots of companies that respond to getting regulated by then telling you in big letters that they are getting regulated and they hate it and they're doing this thing that they have to do because the government told them they had to do it and they loathe it.
And there's nothing to stop Facebook from doing any of that.
Maybe in Facebook language where they tell us that, you know, your privacy is really important to us, but we hate this law.
I guess the funny thing is, none of these hindrances, I think, are arguments against regulating Facebook, but they are a testament to how difficult this task is going to be: you have to be really careful when you're designing regulations that you don't actually just entrench Facebook.
So unless we decide that Facebook is a utility, that there will always be something like Facebook and Facebook is basically it, and that therefore we want to just turn it into a public thing and it will be our one social network, then there are a lot of different ways you could set up the regulatory burden where it will either require having really big profit margins and a lot of human staff to actually go over data, or where it would just make growth like Facebook's original growth kind of impossible, because Facebook was able to grow so early in some ways because it abused or misused user data in certain ways.
I mean, this is why the FTC consent decree exists in the first place.
But in the spirit of limiting Facebook, we could actually create a super Facebook company.
Right, exactly.
In the spirit of limiting Facebook, we could prevent any future company from ever acting like Facebook again, which maybe we decide is worth it.
But it could also mean that we're just kind of stuck with Facebook and we make it so that it's prohibitively expensive to compete with Facebook.
It could create the monopoly that people fear now.
Yeah.
I mean, I think the other thing is that I do want to take a much broader view of monopoly than maybe just, like, does it have any competitors?
And one of the big questions with Facebook is, what is Facebook?
And there was even an article in New York magazine last year that talked about how like Facebook sometimes seems like the EU and sometimes it seems like the Catholic Church and sometimes it seems like this like four-dimensional object passing through our three dimensions, and we only see these weird corners and angles.
It makes no sense.
Can I read a clip of that?
So this is a story from Max Read that you're talking about, from New York Magazine last October, called, Does Even Mark Zuckerberg Know What Facebook Is?
And he says, in one context, it looks and acts like a television broadcaster, but in another context, an NGO.
Over the past year, I've heard Facebook compared to a dozen entities and felt like I've caught glimpses of it acting like a dozen more.
I've heard government metaphors, a state, the EU, the Catholic Church, Star Trek's United Federation of Planets, and business ones, a railroad company, a mall, physical metaphors, a town square, an interstate highway, an electrical grid.
And he goes on from there.
This gets precisely at it: because we struggle to articulate all the things that Facebook is, we also then struggle to articulate even what its problem is.
Like what is the central problem of Facebook?
And it seems like there are actually like several overlapping central problems of Facebook.
The one that legislators talked about was user privacy and user data and what rights do individual users have over their data.
There's another one which is what is the power of Facebook's newsfeed, which is like the main stream of content you see when you open the app or go onto the website.
And you mean like what is the psychological power of Facebook?
What is the psychological power of it?
How do things wind up there?
Because of the way the algorithm is kind of set up to maximize engagement for surfacing content to the newsfeed, does that mean that, in fact, it surfaces, like, really radicalizing content?
Does it wind up bringing people to the political extremes or making people kind of hate their political enemies in a certain way?
Or does it lead people who
might not otherwise adopt a hateful view to wind up kind of adopting a bigoted view?
Then there's a third one, which is like that Facebook is just really big.
And it's like, you know, Mark Zuckerberg's here, like, ruling this castle. And he has 25,000 employees, and they all help him, like, protect the castle in some way, or they help, like, farm, you know, they're all involved in castle maintenance and growth. And also, there are 2 billion people who live inside this castle. And there are billions more, you know, who encounter it in some way. And there are, like, an unknowable number of people and institutions that are trying to get over the walls of this castle in some way.
And so, when Mark Zuckerberg talks, for example, about Russian interference, he's like, oh, we were anticipating them to phish, or to steal people's identities, or we were anticipating that they would hack the infrastructure somehow, and we never anticipated an attack like this.
Like, this is just part of the problem of being big, is that you can't anticipate every different way that someone's gonna try to attack your system.
And so we have these like three different overlapping Facebook problems.
And we talked a little bit about user privacy.
And then we talked not at all about the news feed and what algorithmic bias is.
And we kind of talked about scale, but in some ways, Mark Zuckerberg approaches problems of scale like they're feature problems, like they just haven't added the right feature or the right technology yet to deal with the scale. And so we never get to these delicate questions about scale, because he doesn't answer them like that.
Right.
This brings me back to the individual that our conversation began with, Mark Zuckerberg.
As I hear you two talk about this problem, I keep thinking back to this story that Jose Antonio Vargas, who's currently an immigration advocate and was
a journalist, did for the New Yorker back in 2010.
And he recounts this moment when he was talking to Zuckerberg about the choices that the platform, that Facebook put in front of him.
He said that Facebook at one point asked him to choose which gender of person he was more attracted to, men or women.
He said he chose men.
And Zuckerberg started to say, yeah, you know, that's really interesting.
Facebook is, we've noticed, we seem to have noticed that people are coming out a lot on the platform, and this is just a general trend of society being more open.
And then, after a moment, Jose writes that he told Zuckerberg that he actually unchecked his romantic preference because he didn't want to reveal to his potential sources as a reporter that he was gay.
And he says that Mark Zuckerberg responded just by saying, huh.
And it seemed to Jose, it struck him as being the first time that Zuckerberg might have considered the possibility that there are details of someone's private life that they could have rational interests in not being made public or not being shared.
And that is just such a striking exchange.
To move away from Jose's story for a moment, you have a gay, undocumented immigrant who is kind of revealing to Mark Zuckerberg for the first time that there might not be information
that is good for him to make public or to be shared about him.
And in his impression, this was the first time that Zuckerberg had considered that possibility, which is striking.
So I guess that leaves me with this question for you both.
And Gillian, I'll start with you.
We have, in some ways, been here before with Facebook, as you said.
Every few years, there's a Facebook scandal.
Do you expect that this one will play out any differently?
Will Facebook have to adjust?
I don't think they'll have to adjust in the near term.
So
that exchange actually reminded me a little bit of what this big, most recent scandal is about, which is what a lot of them have been about, which isn't Facebook sharing the information that you've already willfully shared or that you've already willfully posted.
It's the idea that Facebook is constantly there in the background trying to piece together who you are, whether or not you've given them the consent to do so or the information to do so.
It reminds me a little bit about the thing that ProPublica dredged up, I think, in 2016, which was the fact that advertisers were targeting Facebook users based on race.
But it wasn't necessarily because Facebook users had checked that they were African American or that they were Hispanic.
It was because of what Facebook deems ethnic affinity.
So if you like a lot of content, say from Black Lives Matter, which a lot of black people might do, or like the overwhelming majority of people who do might be black, Facebook is going to say that you are a person who's most likely black.
And what landlords were able to do was say, great, I don't want to show any black people the advertisements for my housing, which is illegal.
You can't do that.
There's a federal law that prohibits it.
And Facebook had not.
The Fair Housing Act turns 50 this year, happy birthday.
This year.
Um, and Facebook hadn't thought far enough ahead to say, not only is that perhaps morally not okay, but there's a federal law that prohibits that, and we should put a hard stop on it. They had basically what was a self-policing mechanism, where advertisers would say, I am not discriminating, based on federal law. And they've since supposedly changed it, but even after that, ProPublica found the ability to perhaps get around it. So I say all of that to say: we have been here before in so many different ways, and there have been very real concerns, not just about what Facebook is collecting that we give to them, but also what they are putting together on their own, and then how that is being used to hurt people,
to use Rob's example, both inside and outside of the castle.
And I just don't think we're there yet on finding a way to actually cut down on that.
Rob, for you, closing question.
We've talked about a lot of different types of agents that could hold Facebook accountable.
Congress, the executive branch.
How about users?
What do you see as someone who pays close attention to Facebook?
How do you think that users are navigating the new knowledge about this platform?
Well, Mark Zuckerberg said during the testimony a couple of times that users have not left the site in really noticeable numbers, that it's not something they kind of can pull out.
I hate to say it.
I mean,
I also report on climate change, and this is kind of reminding me of the question that I sometimes get asked, which is like, well, what can an individual person do about climate change?
And the answer is, like, the most powerful thing you can do about climate change is at the ballot box and in how you communicate to your legislators.
But I think Sarah Jeong, who's a technology reporter, had a great tweet about the way that a lot of lawmakers were looking at this scandal, the Cambridge Analytica scandal specifically, which is that they seem to view it as kind of like 87 million individual harms, like 87 million possible torts, you know, possible injuries.
And in fact, that's not the best way to view it, because, like, when you look at what Cambridge Analytica actually probably got... I was a victim of Cambridge Analytica.
I'm proud to say it.
And when you look at like what they probably got about me,
there's a lot they could figure out.
They could infer stuff demographically about me.
But you know, it was birthday, it was name, it was city, it was current city.
They might have gotten my hometown.
They probably got page likes.
They could learn a lot about me, but I'm not sure it's stuff that any other advertiser or marketing broker doesn't already have.
It's what you can do when you start to have data about not just one person like that, but 87 million people like that, and all the marginal effects that start to come into play when that happens that seems so core to the question of how we're going to deal with Facebook.
And on top of that, you know,
there was an exchange where Mark Zuckerberg was asked if Facebook kept shadow profiles, that is, profiles or data about users who do not have Facebook profiles themselves, but kind of accumulated data about all their other activities.
They had, like, almost a, not quite a user ID, but they had, like, the sense of who that person was.
And Mark Zuckerberg professed to not know what this is, but there's a lot of reporting that really indicates that Facebook has data stores on people who don't themselves use Facebook.
And so even if you leave Facebook, first of all, you also have to leave Instagram and WhatsApp and all these other apps that Facebook owns.
Second of all,
Facebook may still be collecting data on you because any page on the internet with a like button is collecting data on you in certain ways.
So, in terms of what individual users can do here, individual users can think a lot about what role they think Facebook should play in our democracy, and they should think a lot about the amount of attention they spend on Facebook and its various products every day. And on all the other social media, too.
And then they should either, yes, pursue an advanced degree in social science, or they should call their legislator, or write about this.
I mean, just try to start thinking about these problems because I also think Mark Zuckerberg, I mean, I did get to ask him in the interview, is Facebook too big to be managed?
And he replied like, oh, we just need more transparency.
You know, he replied almost like this was a technical problem.
And I've seen him do this in other interviews.
Mark Zuckerberg isn't going to make these decisions for us.
And nor should Mark Zuckerberg, I think, be making these decisions for us, because we're a democracy.
And we are governed by the people. And so we should think of ourselves as kind of democratic citizens, and not Facebook users, as we start to think about what's next for Facebook.
Yeah.
Robinson Meyer, as a survivor of the Cambridge Analytica scandal,
I want to let you know that you are brave and I stand with you.
Thank you.
Yes.
Thank you very much for joining us.
Rob, Gillian, this has been a pleasure.
I would say check us out at facebook.com slash RadioAtlantic.
Seems hypocritical.
But, you know,
I won't be mad at you.
Thank you so much for having me.
Thanks for having me.
This episode of Radio Atlantic was produced and edited by Kevin Townsend.
Our executive producer of Atlantic podcasts is Catherine Wells.
As I said at the top, we'll have another full episode with my co-hosts later this week.
See you next time.