Introducing Crazy/Genius: Why Can't Facebook Tell the Truth?
Learn more about your ad choices. Visit megaphone.fm/adchoices
Listen and follow along
Transcript
This episode is brought to you by Progressive Insurance.
Fiscally responsible, financial geniuses, monetary magicians.
These are things people say about drivers who switch their car insurance to Progressive and save hundreds.
Visit progressive.com to see if you could save.
Progressive Casualty Insurance Company and affiliates.
Potential savings will vary, not available in all states or situations.
Hi folks, Matt here, and this week bringing you something different.
The Atlantic has a brand new show, and today you'll hear the first episode.
It's called Crazy Genius, and it's hosted by the Atlantic staff writer Derek Thompson.
Over eight episodes, Derek will talk to a host of fascinating leaders in different fields, all to answer eight bold questions about how technology is changing the world.
I'll be back next week with my esteemed co-hosts for a rollicking Radio Atlantic conversation.
For now, enjoy Crazy Genius, and if you like it, make sure to subscribe.
Where does the story of Facebook start?
A lot of people might say 2012, the month it became a public corporation.
Others might say 2004, the year Mark Zuckerberg launched the first version of his site at Harvard.
But I want to start the story a little farther back.
There is a really critical moment in the 1830s.
This is Tim Wu.
I'm a professor at Columbia University and author of The Attention Merchants.
The story he told me starts in 1833, New York City, with an ambitious young publisher named Benjamin Day.
He wanted to start a newspaper, so he founded a paper called The New York Sun.
Newspapers in the 1830s were an elite product.
At the time, that meant they sold for all of six cents.
But this guy, Benjamin Day, said, no, I'll sell the Sun for a lot less than that.
He priced it at a penny, which was like a quarter is today, or maybe a dollar.
That price meant he was selling at a loss.
But Benjamin Day had an idea for how to make up that revenue.
Advertising.
Lots and lots of advertising.
He was the first to conceptualize his audience as a product.
He was like, I can resell them to other people as opposed to selling to them.
There is such sneaky genius here.
For centuries, businesses mostly sold things they made.
Day sold other people's attention.
And The Sun quickly became one of the biggest newspapers in the world.
But every invention has a little Frankenstein in it.
Because in order to keep those readers reading, Day started publishing new kinds of stories, like grisly descriptions of murder scenes and dead bodies.
I mean, they were really interesting.
Day's first issue, the headline is a tragic suicide.
So, I mean, that immediately draws you in. So he had interest.
Even in the 1830s, you had 21st-century clichés of "if it bleeds, it leads" and "cherchez la femme." These ideas have been with us for 180 years. Our brains have been with us for even longer. Suicide reports were just the beginning. The Sun published an Edgar Allan Poe story as a news report. They ran a 17,000-word feature claiming astronomers with telescopes discovered mythical creatures having orgies on the moon. Man-bats, that's what he called them.
They had powerful sexual impulses.
They were also kind of unicorn-like creatures wandering around.
I should add, this was never retracted, published as news, never retracted.
Oh my God.
And a sensation.
Sensational stories got readers.
Readers got advertisers.
And advertisers got profits.
Fake news came out of the business model.
Facebook isn't a publisher, but it's in the same business, and it's got a lot of the same problems with fake news that The Sun did.
Its business is maximizing attention; the audience is the product.
It's an astonishingly influential business model, and the moment The Sun started to make money was a little to me like the Wright brothers airplane taking off, in the sense the world was never going to be quite the same again.
I'm Derek Thompson.
This is Crazy Genius.
Eight weeks, eight questions about how technology is changing the world, and eight answers.
Today's question is, can Facebook fix its fake news problem, or is its entire business built around a model designed to sell us lies?
When I first signed up for Facebook, I didn't immediately think, oh yes, here I am, offering myself as a product to be resold, telling them all my secrets and my friends and my interests so that I could be sold off to the highest bidder.
Facebook is an incredible business.
Facebook's valuation is the third highest ever in an IPO.
Investors are definitely liking Facebook.
The company shares soared today after reported record fourth quarter profits.
It is now worth more than almost every media company combined.
But Facebook is also a little Frankenstein.
Mark Zuckerberg, in the days after the U.S. presidential election, said it would be crazy to think that fake news swayed any voters.
He quickly had to change his stance.
Some Facebook users saw the Las Vegas crisis response page with a link to this Gateway Pundit article incorrectly identifying Geary Danley as the shooter.
I think that the management of Facebook right now is committing malpractice.
As a shareholder, I'm terrified by what they're doing.
Facebook's success and scandal are two parts of the same story.
And I think it's a predictable outcome of a pure attention merchant business model, no sense of ethical restraints, a pure quest for profit.
Do you think Facebook, as it's currently constructed, is a negative force for democracy?
Yes, I do.
And why?
I think because it breaks down the barrier between what is news and what is rumor.
If you destroy that
basic line of some sense of what might be true and allow systematic, widespread propagandizing, that is incredibly destructive to democracy.
And Facebook is not pushing it itself, but it has become a vessel and a medium and an instrument of propaganda, private and public and foreign.
And that is the danger.
Right.
I think a defense of Facebook sometimes goes something like this.
You know, rumors, false gossip, they exist everywhere.
They don't exist only on Facebook.
They exist in newspapers, in magazines.
They exist in church.
They exist in book clubs.
Facebook isn't any worse than that.
It's simply a technological mirror held up to the human condition.
And the human condition includes some people sharing stuff that's true and some people sharing stuff that's not true.
What do you say to that?
I think there is no human condition.
It all depends on the mirror and the angles the mirror is held.
I'm serious.
I think people behave differently, you know, depending on the environment they're immersed in.
I mean, it's actually crazy because it is a very important undertaking.
It affects people's lives, happiness, you know, perceptions of themselves, their sense of their place in the world.
But it's not really been engineered carefully to try to make people genuinely feel closer together.
Because I think it's all a means to an end, these other ends.
And that's what I'm saying.
So this is where I have to jump in and ask,
how do you think Facebook should be fixed?
I guess
one way I think about this question is, does it need more government or more capitalism?
I think it needs three things, frankly.
First, it does
certainly need more competition.
So that's more capitalism, I guess.
It needs more government oversight in particular areas where people will never take care of themselves properly.
And then finally, it needs more ethics.
But when it comes to Facebook's ethics, Wu is not optimistic.
And to understand his pessimism, it's worth understanding his experience with Facebook, not as a user, but as a regulator.
Several years ago, Wu was working at the Federal Trade Commission under President Obama.
And in 2011, the FTC caught Facebook in a lie.
Facebook told users their data was private, but they used their information to sell ads anyway.
We said, you know, these privacy settings aren't really working, and you're lying to people about what you're doing.
So Facebook promised the government never again.
The company would shape up.
The buttons for privacy controls would mean what they say they mean.
But data leaks kept happening again and again and again.
Lo and behold, Cambridge Analytica does the same thing.
And lo and behold, they allow almost any advertiser to get at anything they want.
And the buttons did still turn out to be fake.
Wu says we can't assume that deep down Facebook is interested in publishing the truth online because it's not that interested in telling the truth offline.
What Facebook really cares about is keeping its business profitable.
Here's one of the reasons I think Facebook just has a
genetic makeup that is not predisposed towards privacy or ethics.
Companies have a DNA, and I don't think this one was set up with ethics in its DNA.
That part is missing.
According to Wu, Facebook's flaws are in its code.
But Facebook is a technology.
Can't code be rewritten?
Some work has been done.
A lot needs to be done still.
But I'm very wary of painting the whole endeavor as rotten.
How to fix Facebook.
Right after this.
Hey there, I'm Claudina Bade and I lead the audio team here at The Atlantic.
I think a lot about what makes great audio journalism.
It commands your attention but isn't noisy.
It brings you closer to the subject but leaves room for you to make up your own mind.
And when you hear someone tell their story in their own voice, you understand it in a deeper way.
When you subscribe to The Atlantic, you'll be supporting this kind of journalism.
You'll also enjoy new benefits just for Atlantic subscribers on Apple Podcasts.
Think ad-free episodes of our shows and subscriber-only audio articles.
To join us, go to theatlantic.com/listener.
That's theatlantic.com/listener.
If you're already a subscriber, thanks.
You can head to the Atlantic's channel page on Apple Podcasts and start listening right now.
The fun never stops.
The chair now recognizes himself.
This spring was, in some ways, the low point of Facebook's history in the public eye.
The company had suffered scandal after scandal, Trump ads, Russian propaganda, Cambridge Analytica.
Fighting for its global reputation, the company sent Mark Zuckerberg to testify before Congress.
Good morning and welcome, Mr. Zuckerberg, to the Energy and Commerce Committee in the House.
One congressman asked what Facebook planned to do about the flood of false information on its site.
One of the things that we're doing is working with a number of third parties who, so if people flag things as false news or incorrect, we run them by third-party fact-checkers who are all accredited by this Poynter Institute of Journalism.
The commander of this international effort to cleanse Facebook of falsehoods is Alexios Mantzarlis.
So I thought, maybe if I want to understand the future of facts in Facebook, this might be the most important person to talk to.
Alexios, let's start with your job title.
You are lead of the International Fact-Checking Network.
That sounds rather grandiose.
Yeah, and it confused you, it confused my dad, and it confused the Border Patrol.
So you're in good company.
Right after the 2016 presidential election, Facebook reached out to Mantzarlis, who works for the Poynter Institute and is not paid by Facebook.
The social network said it wanted to work with outside fact checkers to find false information on its site.
And Mantzarlis said, yeah, I'll help you.
I'll help you find the best fact-checking organizations in the world.
But what I wanted to know was, why?
Why would an international authority on facts donate his time to a site that doesn't seem to care about them at all?
I'd make the case that around the world, for organizations like my own in Italy, fact-checking organizations that didn't have connections in the mainstream media, Facebook was the only way to reach an audience.
So I'm very wary when people sort of brand the whole effort, the whole platform, as a univocal threat to democracy.
If Facebook were turned off tomorrow, would the average truthfulness of news around the world
significantly rise, significantly fall, or essentially do nothing?
Essentially, do nothing.
Mantzarlis has a defense of Facebook.
It goes like this.
Most news critics compare the world's news to the standards of, say, The New York Times. They're like food critics saying every hamburger needs to taste like a ribeye.
It's a totally unrealistic demand.
Second, most readers don't care about double fact-check news stories.
They like gossip.
That's why they buy tabloids.
And three, fact-checking organizations in the U.S. are really confrontational.
PolitiFact literally calls politicians liars.
Around the world, no other organization does that.
Facebook had very little to do with creating this reality, but Facebook can fix it.
Facebook is currently testing a new fact-checking system in several countries, including the US.
Most of you probably haven't seen it, but here's how it works.
Let's say you're on the newsfeed.
An article comes up claiming that scientists discovered evidence of winged bat creatures having orgies on the moon.
And you think, as much as I'd like that to be true, I'm a little concerned it's been shared by several relatives and 100,000 strangers.
So what would I do?
What you would do, Derek Thompson, if you see this on Facebook, is go to the top right of the post, to the caret, where you can click down and do all kinds of things: report, share, whatever.
And one of the elements, one of the items, is to report it as a fake news story, as a false news story.
If enough users like me identified this lunar bat boy story as false, then a group of Facebook-approved fact-checkers would study it.
If the fact checkers concluded the story was wrong, Facebook would insert a link debunking the story.
And then algorithms would push it way, way down in the newsfeed so that fewer people would see it.
And also, if by any chance you had already shared it, you would get a notification saying, hey, this article that you'd shared has been fact-checked.
You might want to check it out.
And I do want to stress one thing.
This does not remove content from Facebook.
That would not be something that I think fact-checkers would feel comfortable doing.
Facebook's version of fact-checking is really more like fact-labeling than fact-checking.
And I think it's really important to distinguish the difference.
Let's say I've written an article about you, Alexios, and it's going into the next issue of The Atlantic.
And now let's take a falsifiable claim that I might make in that article.
So, here's one.
In college, Alexios Mantzarlis was a member of a secret society that performed ritualistic sacrifices of animals to appease an irritable god.
Now, how did you know?
How did you know?
I made this claim in an article for The Atlantic.
You'd get a call from our fact-checkers.
And we can simulate that experience right here, literally right now.
Alexios, hi.
I'm a fact-checker working on an article by Derek Thompson.
Alexios, did you attend college?
I did, yeah.
Okay, so far so good.
Alexios, were you a member of a secret society of animal slaughtering satanists?
I was not.
You were not.
Okay, so we've established something quite important here, which is that my claim was incredibly wrong.
So now at the Atlantic, we'd remove the sentence.
And the theory would be that the Atlantic is an institution that values the dissemination of true information.
So when that information isn't true, we just don't disseminate it.
And Facebook seems to be trying to have it both ways, in that they say they value facts, and I absolutely believe that its individual employees value facts.
But their solution to verifying facts is insufficiently staffed.
Their method of labeling dubious crap is empirically questionable.
And their algorithm, designed as it is to reward emotional viral sharing, may very well naturally privilege false information.
There's lots to unpack there, Derek.
Starting with my fixation on Satanism, potentially.
I mean, no, all excellent points.
I would note that once we lose these publishers, if we did lose, you know, other voices from Facebook, they're not going to just stop publishing drivel, right?
They'll be publishing drivel somewhere else where we might not be able to detect them.
Facebook, for one positive thing, has the capacity to detect things that are viral.
You can't detect things that are going viral on closed messaging apps like WhatsApp or WeChat, for instance.
And yet, fake news could be and is disseminated broadly outside the United States on those platforms.
And then the final point is about Facebook's structure, whether it is sort of fundamentally flawed.
For sure, its newsfeed, the way things bubble up, is built on an incentive mechanism around emotions.
And emotions are not great friends of facts.
What Facebook does do, which it shouldn't, is allow that tabloid-y, gossipy, un-fact-checked stuff to sort of grow and grow and cover all the rest of the stuff.
And that's where I was going, yeah.
Right?
So that's where the problem comes, right?
You should be able to see all the stuff.
You should be able to pick up the trashy gossip magazine that is made up for the 13th time that Queen Elizabeth has resigned or has stepped down.
But it shouldn't be sort of on each and every shelf.
Maybe Facebook really is a simple newsstand, but isn't that bad enough?
Facebook tells investors, recruits, and politicians that it's going to connect the globe and change the world for the better.
But for now, it's just another place where gossip often triumphs over truth.
It's a tool that needs a lot of work.
There's lots of things that are wrong with newsfeed.
Some work has been done.
A lot needs to be done still.
But I'm very wary of painting the whole endeavor as rotten.
He says Facebook can fix itself, but there's a problem.
I have no idea whether Facebook's fact-checking efforts are working, and neither does Mantzarlis.
We haven't seen enough data from Facebook, or really any data from Facebook yet to evaluate how it's been going.
So we have reassurances that prevalence of flagged fake news goes down, but we obviously still see blockbuster viral hoaxes reaching enormous audiences.
And so this is my biggest point now with Facebook is to continue and ask for transparency from them.
Mantzarlis told me this in a kind of offhand way, but I think it's the whole damn story.
Facebook is famous for giving up information about its consumers to companies and even to political groups, but it won't share data with its own fact-checkers.
Facebook's most important resource is user information.
Advertisers get it, fact-checkers don't.
That's Facebook's business model.
And it's fundamentally incompatible with a business that puts truth over advertising.
In the end, I think Tim Wu is right.
And I very strongly believe the important mission that we need to be thinking about is how we design competitors to Facebook that are engineered from the outset with a different set of goals.
That's Tim Wu again.
The first thing that comes out doesn't need to be the last.
In 1833, Benjamin Day was the first publisher to sell his audience as a product.
He inspired hundreds of copycats.
But the real solution to The Sun's legacy of hoaxes didn't come from The Sun itself.
The fix came from competition.
It came from other 19th-century newspapers: The New York Times, The Wall Street Journal.
And yeah, they had advertising, but they also paid extra for ethical reporters and honest news.
Their secret wasn't some whiz-bang technology.
These companies published the truth because that was their business.
If we want a Facebook built for truth, we should probably build a new social network.
Someone else needs to get in there, do that job with a different business model or a more restrained business model, maybe a nonprofit, maybe something else.
The opportunity is there, and I
will not rest happy until someone comes up with something better.
And in that way, I'm an optimist.
You know, a true technologist believes when something is just broken, sometimes instead of trying to fix it, you build something better.
And that's what I think we need to do with Facebook.
So, in conclusion, move fast and break Facebook?
That's good.
Don't fix it, replace it.
Move fast, break Facebook, and, you know, we can do better.
Crazy Genius was produced by Krista Ripple and Catherine Wells with help from Abdullah Fayad.
David Herman is our engineer.
Breakmaster Cylinder composed our theme song and all the music in this episode.
Special, special thanks to Matt Thompson and Kevin Townsend.
And hey, if you like the show so far, help us out.
Give us a five-star rating on Apple Podcasts.
I'm Derek Thompson.
See you next week.