Frances Haugen at Code 2022
Recorded on September 6 in Los Angeles.
Support for the show comes from Saks Fifth Avenue.
Saks Fifth Avenue makes it easy to shop for your personal style.
Follow us here, and you can invest in some new arrivals that you'll want to wear again and again, like a relaxed product blazer and Gucci loafers, which can take you from work to the weekend.
Shopping from Saks feels totally customized, from the in-store stylist to a visit to Saks.com, where they can show you things that fit your style and taste.
They'll even let you know when arrivals from your favorite designers are in, or when that Brunello Cucinelli sweater you've been eyeing is back in stock.
So, if you're like me and you need shopping to be personalized and easy, head to Saks Fifth Avenue for the best fall arrivals and style inspiration.
Hi, everyone.
This is Kara Swisher, and today we've got a bonus episode for you.
It's my conversation with Facebook whistleblower Frances Haugen from this year's Code Conference.
Casey Newton and I spoke with Frances about the impacts of her disclosure and what changes she'd still like to see.
I interviewed her first on CNN Plus.
Do you know how to show there?
How to show that?
No, I didn't.
I never heard it.
I heard it got canceled.
Anyway, a little unexpected reminder during the conversation about the crucial role of technology.
Everyone's phones went off with an alert about the California power grid potentially going offline.
You'll hear it.
Enjoy.
It's now been basically a year since you made your revelations.
What practical effect do you think came about from your coming forward?
I think one of the biggest things that happened was there is a major generational piece of legislation that passed in Europe called the Digital Services Act, which I think is a landmark bill in terms of being focused on process and on transparency versus, say, having prohibited types of content.
And I think the thing that I'm most grateful for is Europe didn't have a whistleblower law prior to last year.
And they said explicitly that the information in my disclosures showed them the importance, especially in a time when more and more critical parts of our economy are driven by these opaque systems, of having protections for whistleblowers.
So that's probably the thing I'm most happy about.
Yeah.
And what hasn't happened that you wish might have happened in the past year?
Oh,
I think the stories around the mental health impacts on kids in the United States have really landed.
But the core thing that brought me forward originally was we forget here in the United States where we have good sources of independent journalism like The Verge, that in lots of places, particularly on non-English parts of the internet, Facebook is the internet still.
And Facebook's underinvestment in safety and security plays out with really catastrophic consequences in places like African countries or in Southeast Asia.
And I think there's a huge discussion that really hasn't come to fruition yet that we have to keep drawing attention to about that.
Well, one of the interesting things is the way I started to really pay attention, writing about all these problems, sort of screaming about what was happening at Facebook back in 2015, is when Maria Ressa came to me.
She had been trying to get the attention of Sheryl Sandberg and Mark Zuckerberg on these issues.
And she called me and said, they seem to listen to you.
So could you tell them this, this, and this?
And she sent me a bunch of data.
And she said, we're the canary in the coal mine.
And I have to say, it changed.
I made a call right away to Sheryl Sandberg saying, you need to see this woman.
You need to talk about it.
And it was the Philippines, obviously, where she is now on trial,
possibly will be jailed again and again and again.
So when you think about what Casey was asking, do you feel like you made a difference?
It got a lot of attention.
I often feel like I get a lot of attention, and you, of course, did it in a quantum way with all the documents and everything else, but not a lot has changed from my perspective.
How do you look at it?
So the way I look at it is, you know,
when I went to business school, one of the biggest things that changed for me was how I viewed time, right?
So in Silicon Valley, we look at things and say, if something doesn't happen in two years, it won't happen.
If it doesn't happen in maybe four years, it won't happen.
But the reality when it comes to things like regulation is the arc of history is much longer.
And when it comes to social platforms, our entire oversight process has been cut short because the public muscle of accountability never got built.
So the most important thing about the DSA is it demands access to data from the public for the first time.
And we are starting 15 years behind or 20 years behind.
And so it's going to look slow for a little while.
But ideally, you know, starting January 1, when data access begins, we're going to begin being able to have the public have their own theories on how to reform these things, be able to ask our own questions.
And so I think that's the most important catalyst.
Speaking of things that take time, you filed a bunch of documents with the SEC in hopes that they would investigate.
What can you tell us about what you know that the SEC may or may not be doing?
And have they granted you whistleblower protections?
So by the act of filing, I have whistleblower protections.
The SEC is a federal agency.
These are federal investigatory processes.
They have a history of being incredibly opaque until they bring sanctions.
And so I can't comment on my interactions with the government. But consider that it took 18 months from when Cambridge Analytica came out to when they did their first sanction, and that was a lightning-fast process.
So the soonest we could even imagine seeing something would be another six months.
Yeah.
Meaning investigation.
Meaning.
No, I mean like with Cambridge Analytica from the day that the story broke to the SEC announced sanctions, it was 18 months.
So that's like the fastest we can imagine the process going.
So
if we still don't hear anything a year from now, I would still not be surprised.
But six months even would be very, very fast.
What would you expect them to do?
The case that we've made for them is that Facebook's share price was artificially inflated because people used the product who wouldn't have used it had they known the truth.
There were advertisers who would not have advertised had they known the truth.
And that Facebook was able to spend less on safety systems than they would had they told the truth.
When you look back over the five years before my disclosures came out in the Washington Post and the Wall Street Journal, there had only been 27 instances where Facebook's stock price declined more than 5% versus the NASDAQ over those five years.
And something like 60 or 70% of the time, it was because they announced they were going to spend more on safety or their user numbers had gone down.
And so I think there's a case to be made that the share price was artificially inflated, because it's gone down more than any of the other FAANG stocks except for Netflix.
And it went down substantially more than the other social media companies did.
Right.
So assuming that this complaint moves forward, what do you hope comes out of that specifically?
What specific changes do you hope might be made at Facebook or a lot of these other platforms that have all of the same issues?
So my personal fantasy, and I have no idea how realistic this is, is that they would force... the SEC has disgorgement powers.
And right now, Mark Zuckerberg has 56% of all the votes at Facebook, you know, to the point where he actually introduced a whole other class of shares just so he could sell his shares without losing control.
So I think it would be amazing if he was required to sell some of his shares and put them in trust, because then we could actually have the normal corporate oversight processes take effect.
The shareholders have been voting for one share, one vote for years.
This might be a way to do it.
I also hope things like
actual accountability around things like,
you know, having to work with the American Academy of Pediatrics around mental health, like the kinds of things that happened after tobacco.
Right.
Yeah, you mentioned mental health.
Obviously, the regulators seem to seize on that more than I think anything else in your revelations and have proposed a variety of regulations around the world.
In response, the company said, like, this was taken out of context.
This is a very small survey.
What did you make of the company's response?
And how has our understanding of those issues, do you think, advanced over the past year?
What I find fascinating about some of those responses is: in that first round of documents the Wall Street Journal published, they published six of a large number of documents that dealt with teenage mental health.
Facebook knew what those documents were.
They promised the Wall Street Journal they wouldn't try to front run it.
They wouldn't release them on their own.
The day before they released the documents, Facebook put up two of the six documents.
And those were the two with the smallest sample sizes.
The document that had 150,000 participants was not published, right?
So it's not that all the documents are small sample sizes.
It's the ones that Facebook is trying to draw our attention to.
And the focus on that versus the things happening in other countries, you yourself even said, this is an issue, but not like this.
Talk about the rollout and how you looked at how the press covered it, how the politicians reacted.
You were very feted, you got a lot of attention, but a lot of it was on that issue.
Did you mind that being the issue?
So I think the, it's interesting.
I actually care a lot more about the teenage mental health issue now than I did a year ago.
And part of that is I have had the honor of getting to like meet with the parents of kids who've died, things like that.
And I don't personally have children.
I hope to one day.
There is no pain as bad as losing a child.
And I appreciate a lot more now
how severe the harm is to our nation's children.
I think one of the reasons why it has resonated so much is if you talk to a parent of a 13-year-old, almost all of them are deeply, deeply concerned.
And it's one thing to worry about large-scale communal violence thousands of miles away.
It's another thing to watch your niece suffer, watch your daughter suffer, watch your neighbor's child suffer.
And I think that's why people have moved so much quicker on kids in the United States.
And you yourself, when you were talking about it, one of the things they did was attack you.
Either you weren't in the room.
Oh, I was a low-level employee that never was in a C-suite meeting.
Yeah, you weren't in the room.
That's essentially it, which is one of the favorite things of Silicon Valley.
Everybody does that in Silicon Valley.
The other was that it wasn't your area of expertise, which you yourself said several times, and that it was skewed, as he said.
Talk about the control on you in terms of doing this and the impetus.
They also tried to imply you had all kinds of plots.
Who were you working for?
So my partner really likes conspiracy theories.
He more freely forages on the internet than I do.
And he loved, so
one theory was I was a crisis actor.
My favorite one was there's no way I could have been that good.
And for context, so I did National Circuit Debate in high school.
I taught it for four years.
My coach was so good, he became the head of the National Debate Association.
Okay.
And these little details like don't come out
in those contexts.
But I think the larger one was, you know, I never intended to come out.
And that's why I did things like I didn't just photograph the documents, I photographed all the comments on the documents.
Because I wanted Facebook's employees to be the ones saying this is real.
And I wanted to show that contemporaneously, when people were discussing the context of this document, no one questioned that it was true.
You know, you also took another leap that I thought was interesting.
So I was, you know, one of the journalists that was able to get access to these files after the journal reported.
And something that I found remarkable was how little effort you attempted to exert over what journalists did with any of it, right?
It was just sort of like, here are the documents, you know, let us know if you have any questions.
Talk about making that leap and how you felt about the aftermath.
Do you feel like you got what you wanted out of putting that kind of trust in journalism?
I was really, so part of the reason why we made them more accessible.
So I always intended at least to have
non-English journalists get access.
There were certain issues with the rollout in terms of I had a very chaotic summer last year, which one day if you read my memoir, you'll see why.
But
I really believe in journalism as a critical component of democracy.
And if you try to control journalists, you are undercutting that immune system function of journalism in democracy.
And so I was incredibly, incredibly honored at how hard you guys worked.
Like I saw how seriously you took the process.
And I am incredibly grateful for the amount of resources the publications invested.
Let me ask you, in that regard,
Casey likes that.
I wasn't fishing for a compliment.
It's just like, if I were in your position, it would be kind of scary to just, you know, because
theoretically, journalists could reach opposite conclusions and say, I don't know, there's not a lot here, right?
Well, that's part of the reason why I wanted these documents outside of Facebook, though, right?
Which is that
when we talk about, like, we put journalism programs in junior highs because we believe the general public should understand the process of journalism. We don't... like, one of the things that I'm working on now is a simulated social network, because I think every student should have a chance to play with how we structure social media platforms, because little choices radically change how information works.
Right. So in that vein, I want to talk about impact, because, do you think it changed their minds? They just changed their name.
So there's little...
And there's the stock ticker.
Don't forget about the stock ticker.
That's a big thing too.
Has not been doing well.
So there's some things that we've seen from them that do look like meaningful changes.
So they had years to roll out any parental controls on Instagram, right?
They could have done this 10 years ago.
And the first parental controls they rolled out were a couple months after my disclosures.
So there are things where it sounds like they've been listening.
There are other things that are really sad.
Like they've further dissolved various parts of election integrity.
They've invested less and less in
responsible AI.
So I think there are things where they haven't really learned the right lesson, which is, and I know because I've talked to these consultants, people were warning Facebook for years: if you hide the dirty laundry, eventually the dirty laundry will get aired, and it'll be worse than it would be if you just fessed up to it now.
And I think instead of coming back and saying, hey, we're running critical vital infrastructure, like we are the internet for at least a billion people online today.
People who live in societies that make up three or four billion people.
Maybe we should be more participatory.
I don't think they've learned that lesson yet.
And why is that?
I think.
I know when I talk to them, they feel like victims continually.
I'm mean.
I'm done.
I'm finished for that.
But you
were dishonest.
You were taking advantage of them, which, I'm sort of like, how in the world do you take advantage of a multi-billion dollar corporation?
But okay, if you're going to believe that.
But
where is the bottom of that?
Because at some point,
too much beating up.
Absolutely.
No question.
And unfairly.
At the same time, there is a moment where companies, I'm thinking Airbnb, go, oh, I really need to be different.
That should be different.
Right.
So like the thing people talk about hitting bottom is, in fact, you can keep going down.
Right?
Like you think you've hit bottom.
There's actually more.
I think the issue with Facebook is they haven't yet admitted that the way that they were doing business is what caused their problems, right?
That they have this echo chamber internally where because Mark is unaccountable, he can surround himself with people who tell stories like that, who say like, no, no, no, no, you're the victim here.
And reality never has to weigh in with consequences.
And so I think it's one of these things where until the incentives change, we should not expect behavior to change.
How would that change?
Sorry.
Oh, sure.
How would it change?
So there's a bubble around him.
You're saying a bubble.
Everyone who works for him, who is being paid by him, is telling him he's very pretty.
Yeah.
And therefore he's so smart.
He's so smart.
Yeah.
Oops.
Oh.
Is everything okay?
Flash flood warning.
Flash flood.
Electrical.
Uh-oh.
Well, if everything goes off, here we are.
Let's blame Facebook.
It's kind of like how Facebook went down the day after my 60 Minutes interview.
Oh, did it?
So, so who would, how do you get to, you have to get to Mark, right?
Correct?
This is really funny.
So they turn off their phones.
While we're roofing.
Yeah, so when it went down, the day after my 60 Minutes thing,
people
who I will not disclose, all the A records for Facebook disappeared.
Like they disappeared in a way that... your DNS records route you around the internet, they're like the phone book of the internet, and Facebook's entry in that phone book disappeared.
And it's very odd because to do that inside of Facebook, like three different employees would have had to all sign off on the same change.
So either there was a conspiracy inside of Facebook or Facebook brought itself down.
And so I just, I want to know what happened so badly.
Like someone one day is going to leak that one and it'll be delicious.
We'll be back in a moment with more from Frances Haugen at Code.
Avoiding your unfinished home projects because you're not sure where to start?
Thumbtack knows homes, so you don't have to.
Don't know the difference between matte paint finish and satin or what that clunking sound from your dryer is?
With thumbtack, you don't have to be a home pro.
You just have to hire one.
You can hire top-rated pros, see price estimates, and read reviews all on the app.
Download today.
We're back.
Now, more from code.
Let's talk a little bit about what you're doing now.
You mentioned that you're working on a simulated social network, which sounds like a sort of very grand project.
Like, like, what does that mean and what will be done with it?
So, right now, the way we teach data science, I think, is really reductive, right?
So, we teach data science with little toy problems, where I think the biggest problem with the methodology is they all presume there's answers.
So, when we do real industrial machine learning, it's not like there's a clean yes or no on "do we ship?"
It's that there's 20 or 30 stakeholders on every change, and some win and some lose.
And we still have to decide: do we go forward?
And right now, we wait for students to land at Facebook or at Google or Pinterest to learn how to think about those trade-offs.
And so there's a number of academics who have made simple simulations of social media.
We want to build on that history and be able to teach big, sprawling, messy data science classes where we put students in seats where, you know, if we talk about should you have to click on a link before you reshare it, it sounds obvious.
It's like 15% less misinformation right there.
You didn't have to censor anyone.
You just put a little human in the loop.
But Twitter did it and Facebook didn't.
So there must be something more interesting there.
I want to have students sit in those seats and argue about it because we need 100,000 people who understand the physics of social media because that's how we'll design safe social media.
Meaning that they could make different choices: instead of engagement, virality, and speed, you would have context, accuracy.
Or even really simple things, right?
Like if you say, okay, so let's say you write something, it lands in Casey's news feed, he reshares it, it lands in mine, now it lands in some random person's feed.
That person doesn't know you, you don't know them.
Imagine if they could still say whatever they wanted, but they had to copy and paste to do it.
We just put a little bit of a speed bump.
That little bit of friction, that chance to contemplate, has the same impact on misinformation as the entire third-party fact-checking program.
And it's not obvious.
It's not obvious that a change that small would have that big an impact.
But that's why we need more people who understand that.
So do you imagine there will be, I talked to the lawyer for Google.
Oh, I'm totally blank.
You see.
David Drummond?
No, no, no, no.
Kent Walker.
It's a woman.
Okay.
Okay.
Of course it was a woman, because she was smart.
She was talking about doing slow internet.
The idea of slowing everything down.
And her premise, this was 10 years ago, we did a podcast on this, was that engagement, virality, and speed were the design.
It's about design, really.
And if you changed it to context... speed is always there, people want speed... accuracy, context, something else, it changes the entire experience. It's how you can make something, essentially a building, different. Building it different: is there any impetus to make it different? Why would there be, if it's just to do good? I mean, that's not something you hear a lot. It's like, let's sell ads, let's make money, let's do this.
So let's imagine we rolled back in time to the Facebook of 2008. So one of the things that makes me feel kind of old is I go to college campuses now and I'll get up on my soapbox and be like, do you remember the Facebook of 2008? And these 18-year-olds will look at me with big eyes and be like, no, I don't.
But the Facebook of 2008 was about our family and friends.
It was the Facebook that if you stopped a random person on the street and said, what is Facebook?
It was that version of Facebook.
I think Facebook would not be seeing its users bleed away today if it had optimized a little bit more on slower growth, on actually giving people that connection instead of just seeing how many ads they could get them to view each day.
And so I think there's a lot of investors who want to figure out how to help these companies be more successful long-term.
I think there's a lot of litigators that are all kind of, they can smell there's blood in the water.
All those things are incentives that can push companies towards more responsible decisions.
And I think building out that ecosystem of accountability, that's what gives us safe cars.
You know, Facebook loves to say cars kill people, we don't get rid of cars, but we have a really good federal transportation agency that checks those cars, and we have litigators, investors, advocates who understand how to hold them accountable.
Yeah, they love that argument.
And then when you say that, they're like, well, they walk off in some fashion.
Go ahead, Casey.
Another thing that is interesting that you're doing, that I think other whistleblowers have not done: some whistleblowers drop the complaint and then disappear.
This is an active set of campaigns that you're working on, with many projects associated with it.
Is this your life's work?
I think there's this question.
He's also asking, can you ever be hired by a tech company again?
Probably not.
That's a no.
You know, there's a lot of tech companies out there.
And it turns out good data science managers are hard to find.
No, but
I really, I genuinely believe that if we don't figure out how to do this in a responsible, sustainable way... like, we've seen Myanmar, where hundreds of thousands of people died because of a social media-fueled act of communal violence.
We saw Ethiopia over the last couple of years.
We're starting to get glimmers of it in Kenya.
I think there's tens of millions of lives that are genuinely on the line in the next couple decades.
And so, you know, I know there are very few people who understand the human side and understand the algorithms and can organize people.
And so I am going to push on this as long as I think I'm making some difference.
Support for Pivot comes from Gruns.
If you've ever done a deep internet dive trying to discover different nutrition solutions, you've likely had the thought, surely there's a way to improve my skin, gut health, immunity, brain fog without offending my taste buds.
Well there is.
It's called Gruns.
Gruns are a convenient, comprehensive formula packed into a daily snack pack of gummies.
It's not a multivitamin, a greens gummy, or a prebiotic.
It's all of those things and then some for a fraction of the price.
In a Gruns daily snack pack, you get more than 20 vitamins and minerals, six grams of prebiotic fiber, plus more than 60 ingredients.
They include nutrient-dense and whole foods, all of which will help you out in different ways.
For example, Gruns has six times the gut health ingredients compared to the leading greens powders.
It contains biotin and niacinamide, which help with thicker hair, nails, and skin health.
They also contain mushrooms, which can help with brain function.
And of course, you're probably familiar with vitamin C and how great it is for your immune system.
On top of it all, Gruns are vegan and free of dairy, nuts, and gluten.
Get up to 52% off when you go to gruns.co and use the code PIVOT.
That's G-R-U-N-S dot C-O, using the code PIVOT for 52% off.
Running a business comes with a lot of what-ifs.
But luckily, there's a simple answer to them.
Shopify.
It's the commerce platform behind millions of businesses, including Thrive Cosmetics and Momofuku, and it'll help you with everything you need.
From website design and marketing to boosting sales and expanding operations, Shopify can get the job done and make your dream a reality.
Turn those what-ifs into... Sign up for your $1 per month trial at shopify.com/specialoffer.
How do you think the other social media companies behave too?
I mean, right now, there's another whistleblower,
Mudge.
It's Peiter Zatko.
Peiter, yeah.
Zatko.
He's making claims about the platform's security issues.
Speaking of conspiracy theories, the timing is rather interesting.
He's being represented by Whistleblower Aid, same group that represented you.
Talk about that, what's happening there.
So, for those who are not tracking the latest whistleblower hijinks out there,
Mudge came forward and said Twitter is chronically underinvested in very basic safety systems.
It's not even covering
engineering practices 101.
Like you could take out a couple data centers and you might not ever be able to bring Twitter back.
There are known foreign operatives working in the company.
I mean the list is quite long of the problems.
And the thing that really stood out for me is I think most people aren't aware that there are professionals in software who work on safety and integrity.
And this is just like how we have front-end designers.
This is like how we have ML engineers.
This is like a profession.
And right now, the whole industry is dependent on people who are trained in-house to do this job.
And the reason why I want to build a simulated social network is we need to educate 10,000 or 100,000 of these people a year.
Because right now, Twitter can't hire even the skeleton crew to take care of things because there's such a talent war for them.
And so I don't think Twitter's problems are unique.
I think it's a question of there's extreme scarcity and companies that don't have, you know, the balance sheet that Facebook does, they can't even afford this.
Yeah, I think someone said that Facebook was able to hire, when they were starting to see problems, hundreds of people.
And Twitter could hire three people and a cat, essentially.
You know, and not very good.
It's a very cute cat.
Yeah, whatever. I'm not very talented. But when you face a thing like that,
if it's not safe and there's no accountability... and you talked to Senator Klobuchar, this bill might not pass. Many people don't think it will, even though she's absolutely determined. How do you then change that, if there's been no legislation on technology regulation, no really significant regulation, except in Europe?
How do you then, you can do your thing where you've got a ton of attention, but it flames out.
And I think someone at Facebook said, oh, she'll go away.
I remember, and I'm like, again, another person I wanted to punch in the face.
I feel like Will Smith right now.
But I think it was here, actually.
No, it wasn't here.
How do you
deal with that, which is the long game, the rope-a-dope?
They'll go away, et cetera.
Well,
I'm a little anomaly for a tech person in that I have a history minor and I focused on Cold War studies.
And you look at the study of the 20th century and there were a lot of seemingly impossible things that took place because people worked on them for decades.
So if you had asked anyone in the world in 1870, will Britain ever leave India?
No one but like a thousand crazy Indians in India thought that was ever going to happen.
Or the fall of the Soviet Union or the end of apartheid.
The way these things will change is people will realize they could change.
And a year ago, Facebook's narrative was the only narrative.
They had spent hundreds of millions of dollars to say you have to choose between safety and freedom of speech.
That's the only choice.
It's like, oh, you want some safety?
It's going to cost you.
The reality is there are product choices, things like that.
Should you be able to just reshare infinitely?
Should you have to copy and paste at some point?
Should you have to click on a link before you reshare?
Which have nothing to do with freedom of speech.
Which has nothing to do with freedom of speech.
Well, they want it to be about freedom of speech, don't they?
Because they know.
Tom Cotton can yell.
And Facebook, some of the documents in my disclosures were of studies about how angry people get when they get censored.
They knew that as long as we argued about censorship, we would never do anything differently.
And so the reason I believe we will find some kind of new social contract for social media is that people know there's more options now.
And that's the first thing for change.
I have one more that I want to ask, and then should we move to audience questions?
You know, you mentioned changes to the products, and there's this huge one unfolding now at Facebook, which is this shift to this discovery engine of AI-powered feed, basically turning the app into TikTok.
You know, years ago, Mark said that they had found it was really bad when people are just passively consuming video by thumbing, and now the entire product is becoming that.
Based on what you saw in your documents and what you know from your time there, what are the risks of a world moving toward these sort of purely AI-recommended feeds?
I think it is an incredibly important and incredibly pressing question.
So Facebook looks to TikTok and says, oh, TikTok's growing, they have the solution.
The thing that we in this room really need to understand is TikTok can operate the way it does because it is designed to be censored.
That algorithmic feed hyper-concentrates content onto a small enough number of items each day that they literally have humans scan each one, look at it, and approve it before it goes out.
And this has been known for years.
There was a scandal a couple years ago where if you were visibly gay, if you were visibly disabled, they took your content down.
We're in trouble.
We're in trouble.
To protect you from being bullied.
They really love you.
That's why they did it.
But Facebook has internal studies that say if they give you more stuff from your family and your friends, from pages you actually followed, groups you actually joined, you get less violence, less hate speech, less nudity for free.
And so the world that they're choosing to go towards is one where you have to do censorship to be safe.
And I think that's a bad idea.
Right, because they will make choices, which they do now.
And to be fair, they do it behind the scenes.
They just don't talk about it.
They make choices every day with every design they make.
So one of the things, so your point is social media doesn't have to be bad.
It doesn't have to be despair, denial, destruction, dysfunction, everything else.
Do you still own Facebook stock?
You know, I think I do.
Oh, wow.
Because, like, I wasn't supposed to sell it for a while because I was an insider, I guess.
I thought you were too low down for that.
Well, it turns out if you can crash the stock price 50%, maybe you shouldn't sell.
But my lawyers told me, don't sell, and so I didn't sell.
I was good.
SEC, you can hear me.
But I think I do.
I still have like a small amount, like not a huge amount.
It's like a souvenir.
I have one last question.
We talked in 2021.
I asked what you would say to Mark Zuckerberg.
You said a lot.
Has that message changed?
Or, if you want to answer even better: if you could be Mark Zuckerberg for a day, besides wrestling, what would you do?
I think he looked great.
So one of the things that he said recently in one of those interviews where, you know, I've been saying this since the beginning, my heart always goes out to Mark Zuckerberg because like, I think the reason why he thinks we're all going to spend like all day in the metaverse is because like he spends all day in the metaverse, right?
Like when he walks into a restaurant, people glare at him, right?
If he goes to the grocery store, people glare at him.
He said in a recent interview that every day when he opens his phone, it's like getting punched in the gut.
And I think the thing I would say to him is, Mark, you are so smart.
You are so capable.
You have functionally infinite resources.
There's all sorts of things you could do. Do you really want to wake up every morning and feel like you got punched in the gut? Is there something else you could do with your life? Because I think you might be happier.
Or something else. So you really think he can do that?
I think at some point, you know, the body keeps the score.
Like, he can keep being in denial and he can keep hurting himself.
And as long as he keeps hurting himself, he'll keep hurting us.
And so I think the reason why I can be compassionate to him is I don't think people change because we yell at them.
I think they change because the cost of changing is less than the costs of not changing.
On that note, wow, you're a nice person.
Thank you.
Questions from the audience?
Questions, please, for Frances?
Hi, Frances.
Lauren Goode from Wired.
Hi, Casey and Kara.
So I wrote a story for Wired last year that was about our relationship with memories online.
And a lot of it was focused on what was Facebook then, not Meta, and how Facebook was trying to keep us super engaged in its app by constantly resurfacing things from the past that in some cases we don't necessarily want to see.
So in Europe, there's obviously the right to be forgotten.
And I've wondered if there's potential for something around the right to forget and what that looks like.
So I'm wondering if you can put into context what you think Facebook and other social platforms are really doing with basically monetizing our memories, and what a better version of that could look like in the future.
I mean, the reason why they have those features is really simple.
It's retention, right?
That they can remind you, oh, remember that effort you went to in the past of putting stuff on Facebook?
Like we have very complicated ML algorithms to guess what are the things that if we show them to you again, you'll reshare them, right?
And so they can tell, like, what tugs at your heartstrings.
Is it your new baby?
Is it that moment when you were really happy?
And
you know, these systems are very nuanced and they're very good at emotionally manipulating us.
And I think the question is, you know, if we were designed to respect autonomy and dignity, what kinds of different choices would we make?
And I don't know the answer to that question yet.
But I think having some accountability for the public in that conversation, I think, is really important.
So that's not something that you are specifically focused on as part of?
So I am an algorithmic specialist, and the thing that I feel like I can give to the public conversation is this: the physics of social software is very counterintuitive.
It's driven by things like power laws.
Very small changes on the margins can have very different consequences.
And that's the kind of thing that I can make a difference in terms of educating people, like making them realize they have many more options.
And I'm not a social psychologist.
I'm not even a UI designer.
The products I build are invisible, right?
And so that's the place I can help.
Great.
Thank you.
Are there any other?
Okay.
I was going to say I would keep going, but go ahead.
I'm sorry.
Thank you, Lauren.
Frances, thank you for being here.
I'm curious, and not to minimize the importance of design in creating outcomes here, but can any design ever be more important than the business model of the network?
And as long as the business model doesn't change, won't it always force design to the same results?
No.
So there's a couple of different things there.
So unquestionably, if the incentives do not change, we should expect no change from these companies.
But if we were to look at, say, an auto company: some of the tools that are available to us for any physical product are things like we can buy the product, we can buy a car, we can take that apart, we can put sensors on it.
If Volkswagen wants to lie about their emissions, we can challenge those.
And because we can form our own knowledge, we can form our own theories, we have a public muscle of accountability.
We build that.
We have never gotten to do that with social media.
And that's why litigators don't know how to bring product safety or product liability lawsuits.
It's why even regulators don't know how to write effective regulation, right?
The only place you learn these skills, the things that I've learned, is by working at these companies, right down in the trenches.
And so we need to be able to bring way more people to the table so we can begin talking about what are the right incentives to move these companies towards the public.
That's a great answer.
Was it worth it, Frances?
100%.
Like there's nothing, there's nothing in the world that feels as good as giving another person hope.
Right?
And, like, I meet so many people where I'll start the conversation off and they'll be angry and they'll be frustrated, and I'll talk to them for like 20 minutes, like on planes.
I'm a plane talker.
Oh no.
I'm willing to confess that.
Don't sit next to me.
Yeah.
But I'll talk to people and they'll feel hopeless.
Like it's like, oh, like we're just doomed.
They buy into the Facebook narrative of it's freedom of speech or safety.
And
I love that I get to watch people go into the world believing that the world could be different.
And so I don't know.
It was totally worth it.
Yeah.
You're from the Midwest.
Thank you.
We'll have more conversations from Code in the feed.
Stay tuned.