Introducing Crazy/Genius Season 3
This week’s Radio Atlantic is a preview of the new season of Crazy/Genius, The Atlantic’s podcast about technology and culture. Staff writer Derek Thompson joins Isaac Dovere to discuss Season 3, which kicks off with an episode about privacy.
Subscribe to Crazy/Genius: Apple Podcasts | Spotify | Stitcher | Google Play
Transcript
Hi, Radio Atlantic listeners.
This is Isaac Dovere, staff writer here at The Atlantic.
We're doing something a little different this week, giving you a preview of the season three premiere of one of our other great podcasts here at The Atlantic, Crazy Genius, hosted by my colleague Derek Thompson.
And we've got Derek right here.
So Derek, tell us about the show for those who haven't heard it before.
What is Crazy Genius?
Crazy Genius is a show about technology.
This is our third season.
The first season took on some major tech headlines and questions like, should we break up Amazon?
Can we save local news from the advertising monopolies?
And do aliens exist?
We answered Fermi's paradox.
The second season was about science.
And this, our third season, is all about fixing internet problems.
The theme of this season is unbreak the internet.
And so in each of the eight episodes this season, we're going to examine one specific internet problem, from privacy to extremism on the platforms to AI and criminal-sentencing algorithms, and basically explain what the problem is and then explain how people can think about fixing that problem.
Including a very, to me, freaky doll that is involved in this week's episode.
Tell us about Kayla.
Kayla, my friend Kayla, is a smart doll.
That means this is a doll that can actually talk with children as they talk to the doll.
But the question is,
where are those children's sentences being stored?
According to a Harvard researcher we spoke to, Shoshana Zuboff, those dialogue chunks, as they're called, are being stored in a server owned by Nuance Communications, and Nuance does business with federal surveillance agencies like the CIA, thus suggesting that it is very much possible that dialogue from children in the United States and beyond is going to surveillance agencies to help them understand
how to spy on people, how to essentially interpret information and language.
And so essentially these children are, you could choose your metaphor, either doing labor for the CIA or serving as raw material that my friend Kayla and Nuance Communications are harvesting in order to make money from a government relationship.
Is it possible that you're just being paranoid?
Well, that's the big question that we ask at the top.
I call it the privacy paradox.
On the one hand, a lot of people seem very concerned about what's happening with their data, with their online privacy.
And on the other hand, look at our behavior.
We use Google, we use Facebook, we make great use of these technologies that we say we fear.
One survey asked Americans: would you pay for a replacement for a search engine or a social media network that protected your privacy? Two-thirds of Americans said no.
So it's like we value privacy, we just value it at exactly $0.00.
And much of the episode is looking at this privacy paradox and explaining why Americans have such a fraught relationship with privacy.
And you did other topics, as you said, in other seasons.
Why was surveillance and privacy the right way to go at things for this season in 2019?
I think surveillance and privacy is the single inescapable issue in technology in 2019.
I mean, so much of our conversation about the rise of technology and the power of these tech giants is about whether or not they are somehow stripping away our autonomy.
You know, when we talk about, is Alexa spying on us?
Can we find a way to make Google more responsible?
What was Cambridge Analytica really about?
These are all stories about privacy and our own autonomy.
So I really wanted to kick off the season with a look at what exactly is it that we mean when we talk about privacy.
because the idea itself has really changed a lot over time.
When America was founded in the late 18th century, privacy was not a thing.
No one talked about privacy.
It was not a big public concern.
The very concept of a right to privacy was really coined in the 1890s with the rise of early communications technology like the telephone and the telegraph.
And so the concept of privacy has evolved so much in the last 150 years that I wanted in this episode to do two things.
One, to provide a history to explain how we got to where we are.
And then two, to provide a really clear and hopefully provocative take on the privacy paradox.
Why are we so paranoid about this if, on the other hand, we seem to be choosing every day to engage with the very platforms that seem to be stripping us of our autonomy?
There's a lot in the season, but let's get to that season premiere.
This is Crazy Genius, season three, episode one, and we'll be back next week with a new episode of Radio Atlantic.
The internet is broken.
And nowhere is its brokenness more obvious than in the endless series of scandals in one particularly sensitive subject.
But there's another privacy crisis brewing, and Facebook is reportedly.
Mark Zuckerberg says he's sorry about the privacy data breach.
We might even need a constitutional amendment to protect our privacy.
Cybersecurity experts are calling a popular app on Facebook a privacy nightmare.
More than 17 million people have taken it.
Everybody is talking about privacy these days.
Who has it, who doesn't, which companies are taking it away, and how to get it back.
But I have a confession, and I'm a little nervous to say this out loud because I don't want to sound like a fool in the first episode of this season, but I have no idea what privacy even means anymore.
And I sort of get the feeling a lot of you don't either.
We complain about Alexa listening to our orders, about Instagram targeting us with ads, about smart devices tracking our behavior.
But every year we buy more Alexa products, post more on Instagram, buy more smart devices.
If you ask Americans if online privacy is in a state of crisis, they say yes.
But if you ask them, would you actually pay for devices and apps that would guard your identity?
Two-thirds of them say no.
So we value privacy.
We just value it at exactly $0.00.
This divergence between our complaining and our behavior drives me a little nuts.
Like, come on, people, we can't book rooms at the Panopticon Hotel and then complain about the surveillance.
This discrepancy between attitude and behavior is not a paradox.
I can explain it to you.
Oh, thank God.
Welcome back to Crazy Genius Season 3.
Our theme for the next eight episodes, Unbreak the Internet.
First up, the privacy wars.
When people think about what's wrong with the internet, they think about privacy.
But before we can figure out how to fix it, we need to agree on what it is and what, if anything, is at stake.
When it comes to protecting our private data, is everybody a bunch of paranoid hypocrites?
Or am I the crazy one?
For The Atlantic, I'm Derek Thompson.
This is Crazy Genius.
Why are you interested in privacy?
Isn't everybody interested in privacy?
That's Sarah Igo.
She's a history professor at Vanderbilt University.
She literally wrote the book on the history of privacy in America, The Known Citizen.
I wanted to know, what is privacy?
What does it mean?
And have we always been as worried about it as we are now?
Privacy is one of those interesting values that really doesn't surface until it's violated.
So people don't enunciate it until they think they don't have it.
Igo told me privacy isn't a stable concept.
And it hasn't always been that controversial.
What's really striking when you look at the period before, say, the late 19th century, really before the Civil War, you don't actually find a lot of public debates around privacy.
You don't even find the word privacy in these debates.
Pull up a copy of the U.S. Constitution on your computer.
Control-F search for the word privacy.
Zero results.
The closest you'll get is something like the Third Amendment.
No soldier shall be quartered in any house without the consent of the owner.
But in the 1700s, most Americans didn't even own a house.
Most Americans, in fact, were not particularly entitled to privacy unless they were propertied, unless they were men and heads of households.
Lots of people didn't even own their own labor, for instance, if you think about enslaved people in the United States.
And so we're talking about a pretty small sector of the population that thought of itself as entitled to privacy.
What do you think changed in the late 19th century such that Americans' relationship to privacy really had a turning point in that period?
What comes to the fore in the late 19th century is that all these new technologies kind of make privacy precarious in a way that it hadn't been before.
Photography is one example, instantaneous photography, amateurs taking photographs, being able to publish and trade images of people that were, in a real sense, kind of private images, but also telephone lines, telegraph cables that made communications faster and more convenient and less expensive in certain ways, but also much more porous so that you get worries about wiretapping and listening in and so forth.
So a whole bunch of technologies are one cause.
Telegraphs and photographs brought tech into our personal space, but something else was getting into our personal space.
Other people.
In the 1800s, Americans moved from sparse farms into dense cities.
People didn't realize they valued privacy until it disappeared and they were forced to sleep, eat, work, and live all on top of other people.
Those things kind of collide in a pretty serious fashion in the late 19th century and will give rise to the first modern calls for a right to privacy.
So where does this term right to privacy come from?
Pretty late.
It's in 1890, and it comes from a Harvard Law Review essay written by Louis Brandeis and Samuel Warren.
Louis Brandeis would eventually move to the Supreme Court, and they call for what they called a right to be let alone.
They believe that this is a right that people have against various kinds of invaders of their private affairs and private lives.
And what were they responding to?
They were talking about a right to privacy from an aggressive press, from journalists who were scouring, especially the lives of the elite for scandalous stories about divorce and.
Wait, that's amazing.
So the original right to privacy was a right to privacy from journalists?
Yes.
So from private actors, but also from what Warren and Brandeis called the new devices that allowed things that were meant to be expressed in private to be shouted from the rooftops.
Snooping journalists and newfangled telephones weren't the only things freaking out Americans. There was also the mail.
When postcards were authorized and then went on sale, they were immensely popular, but they immediately also caused a backlash by editorialists and moralists, etiquette writers who believed there was something fundamentally problematic about people sending private matter through the mails without an envelope, right, one that couldn't be sealed.
Americans are often charged with disclosing too much, with being too free and loose with their information.
And so some of the critiques of the postcard read as if they were talking, in the late 19th century, about social media.
I mean, the terms are exactly the same.
The right to privacy has essentially shifted in these 100 years from a right to privacy of property to privacy of communications, where the bad actor isn't so much the government anymore, it's private actors.
It's muckraking journalists.
And it's who else?
Advertisers.
There's this wonderful case, one of the early right to privacy suits from 1902 in New York State, which is about a woman who discovers, much to her shame and humiliation, that her face appears on advertisements for Franklin Mills flour.
And she sues because this seems like a kind of a violation of her privacy rights, not a tangible property right, again, but a right to control her own image.
She does not win her suit, but it does cause an uproar in the press.
It's a nationally followed case, and in fact, is responsible for the first New York state laws regulating the right to privacy.
As American life became more public, more Americans insisted they had a right to be left alone from snooping journalists, from nosy neighbors, from advertisers.
Finally, their fears were heard by the Supreme Court in 1965 with the landmark case, Griswold versus Connecticut.
The 1965 case is a birth control case, the right to privacy of a married couple to discuss and use contraceptives without state interference.
So it struck down an 1873 Connecticut law that had made contraceptive counseling and use illegal.
In Griswold, the Supreme Court said clearly what the Constitution did not.
Yes, America, there is a right to privacy.
It was a long time coming.
It was also perfect timing.
From the middle of the 20th century onward, Americans have faced an onslaught of new privacy-invading measures.
Increased awareness around wiretapping, the Cold War state investigating Americans' private activities and surveilling them in the name of McCarthyism and anti-communism, the increasing use of psychological instruments in all kinds of places, in motivational research and advertising, but also personality tests at the workplace and at schools.
These all came together to create a new privacy consciousness in the United States.
In a weird way that I totally wouldn't have expected before I spoke with you, we have more privacy in our laws, but also less privacy in our lives. There are more laws now with privacy at the heart of them.
And I wonder if you think we are overreacting to the privacy threat today.
Are we going to look back on this the same way we look back at the postcard freak out and say, I can't believe that we were so concerned over nothing?
Are we overreacting?
I actually don't think that we're overreacting.
And I think what concerns me the most is that we have a ton of privacy in our society for the powerful, you know, state secrets and corporate secrets and intellectual property and so forth.
But we seem to feel anyway that we have less and less privacy for the individual citizen.
The reason we have less and less privacy is because we live more and more of our lives online.
50 years ago, it would have been a logistical nightmare to learn every American's favorite book, song, fashion company.
Now we offer up that information over breakfast on our phones.
Companies spy on us, and in exchange, we get their free cool stuff.
That is the calculus of the web, the central equation of life online.
What are we in this equation?
We're the empty pit mine that's been left behind after it's been stripped of whatever is valuable.
We're the carcass of the elephant that's been left behind after the ivory has been poached.
How to solve a privacy crisis: 200 years in the making.
We'll be right back.
In the 1700s, the big threat to privacy was someone breaking into your house.
In the 1800s, it was someone reading your postcards.
In the 1900s, it was someone telling you how to use contraception.
But today's privacy threat is about, well, everything.
Or more specifically, about the massive corporations that seem to control everything.
But how did we lose so much control?
One scholar has spent much of her career trying to figure that out.
I'm Shoshana Zuboff, and I'm the author of The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power.
What is surveillance capitalism?
Ah, funny you should ask.
What it does is it unilaterally claims private human experience as a free source of raw material.
Private human experience as raw material.
That sounds intense, I know, but think of it this way.
In original capitalism, nature is sold as land and work is sold as labor.
In surveillance capitalism, the raw material isn't land or labor.
It's behavior, our clicks, our views, ourselves.
This raw material is converted into behavioral data, which is computed into prediction products, and these are now sold and purchased in a new kind of marketplace that trades exclusively in the future of human behavior.
Privacy under old capitalism is about stealing, stealing land, stealing information.
Privacy under surveillance capitalism isn't about stealing at all.
It's about influence.
The best way to understand what that means is, again, to know the history.
Zuboff says surveillance capitalism came into the world at the dawn of the new millennium.
It was invented almost by mistake.
Let's start with Google's big discovery back in 2000, 2001, when it was getting data about how people search and how they browse.
It discovered that a lot of behavioral data that it had collected almost by accident had tremendous predictive value.
They came out with predictions about our future behavior.
Now, in this case, the predictions were about a very specific form of our future behavior, and that was the kinds of ads that we were likely to click on.
Like a prospector looking for gold and accidentally striking oil, Google was trying to refine its search engines, but hit upon the unimagined world of predictive behavior, and it became one of the world's most valuable companies in the process.
Suddenly, every company wanted in.
This economic logic traveled from Google to Facebook.
It became the default logic in the tech sector.
But now, and perhaps what is less visible to a lot of our listeners, this thing has traveled through what we think of as the normal economy.
And so the CEO of Ford Motor Company is saying, hey, I want revenues like Google and Facebook.
Why can't Ford have those kinds of revenues?
So what he's proposing is that, hey, we've got a hundred million people driving around in Ford cars.
Let's get the data from all of these people.
And so the business of surveillance has spread to smart cars, smart fridges, smart televisions.
Every product that begins with the word smart, every service that begins with the word personalize.
Yes, I mean you, Alexa.
The smart TV that you maybe just put in your family room or the smart dishwasher that you've bought because it's cheaper than the old dishwasher that doesn't have the smarts.
So this is an economic logic that now has become the beacon, the gold standard for virtually every sector of our normal economy.
From the auto sector to the retail sector, the tentacles of these data supply chains reach into every nook and cranny of our lives, even to a child's playtime.
A few years ago, it was discovered that a children's doll, a doll called My Friend Kayla, comes with an app.
So the parents buy the doll.
The parents put the app on their phone.
They give the doll to the child.
And the software in the doll is intelligent enough to actually recognize the child's speech.
The child will say, my name is Shoshana.
What's your name?
And of course the doll will recognize that and say, well, my name is Kayla.
I promise not to tell anyone.
It's just between you and me because we are friends.
In another era, that might have been the end of the story.
In the age of surveillance capitalism, it's the beginning.
The doll is actually recording the child's voice, and that recording becomes another product in the supply chain.
In the voice recognition business, the recordings of voices are called dialogue chunks.
There are markets for dialogue chunks, many markets for dialogue chunks.
So eventually, it turns out, the company that owned the doll is collecting the dialogue chunks from children, and they are selling these dialogue chunks to another company called Nuance Communications. Nuance Communications sells dialogue chunks to other entities, and it became clear that one of the big clients of Nuance Communications was none other than the CIA, which bought dialogue chunks from Nuance Communications in order to extend and develop its own voice recognition software, which it uses to recognize the voice of a terrorist or some profile of a particular kind of voice that it's searching for among millions and millions of voices.
If dialogue chunks are the product and Nuance is the merchant and the CIA is the buyer, then what does that make the children?
I love elephants.
So this is a metaphor that came to mind.
Our behavioral data, that's the ivory.
That's the stuff that has driven the huge revenues of these companies.
What are we in this equation?
We are the carcass that is left behind.
We are not the ivory.
We are not what is poached.
We're the empty pit mine that's been left behind after it's been stripped of whatever is valuable.
The key thing here, Derek, is that these companies do not care about us.
In a world of surveillance capitalism, the concept of privacy is more complicated than it used to be.
These companies don't want our land or our labor.
They want our lives.
They want to harvest our data to influence our future behavior.
Now, I'll admit, that sounds really scary, or at the very least, extremely creepy.
But here's where I get confused.
All this surveilling and harvesting and desiccating,
I don't understand how it will actually hurt me.
I think that people are overwhelmed.
Yes.
There's too much information about how your data is being collected and how you have so little control over it.
Yes.
People are confused because they're like, I get that everyone's taking my data, but I don't know exactly how I'm being harmed.
Exactly.
That's Julia Angwin.
She's an investigative reporter who's spent over a decade covering this stuff.
I think privacy is the wrong way to describe the issue we face in a world of pervasive, unregulated data collection.
The truth is that the harm of living in such a world is a collective harm.
It's something we all suffer together.
If privacy is the wrong word, what's the right word?
Right now I'm leaning towards data pollution.
Data pollution?
Yes.
I kind of like that.
What is data pollution?
Well, I've long felt that the issue we call privacy is very similar to the issue we call environmentalism.
It's pervasive, it's invisible, attribution is hard.
Even if you get cancer, you don't know if it's because of that chemical plant down the road.
You will never know, actually.
So I feel like these issues are very similar.
Living in a world where all of your data is collected and swept up in these dragnets all the time and will be used against you in a way that you will probably never be able to trace and you will never know about,
it feels like that same type of collective harm.
Think about what Edward Snowden revealed, a phone database of every single call every American had made for seven years.
It is not clear who was harmed by that, right?
There could have been investigations prompted into people as counterterrorism suspects that we will never know about.
What was truly harmed was actually our idea of being an American.
One of the amazing things about this country is that we have some of the strongest rules against domestic surveillance of citizens of any country.
And so that was violated.
Three years ago, another crisis.
A voter profiling firm called Cambridge Analytica is now at the center of a massive alleged data breach at Facebook.
Reportedly.
Cambridge Analytica was a firm working with the Trump campaign in 2016.
They invited Facebook users to fill out a personality quiz.
Then they took the data from that quiz to build psychological profiles of voters who they targeted with Facebook ads.
These ads were designed to energize Trump supporters and discourage Clinton voters.
You could make an argument: oh, these people's quiz data was taken and then they were vulnerable to all this propaganda that was fed to them, and then they voted a certain way.
But if you talk to those particular people, they're probably fine with how they voted, right?
So the harm was not individual.
The harm was actually to our understanding of what are the rules of fair play in our elections.
With the steady increase in data emissions, there will be more Cambridge Analyticas, and worse ones.
As it stands, we may not know what data we've given up until it's used against us to manipulate our voting, to raise our insurance premiums, or to show up on our credit scores when we're trying to buy a house.
The reason I've been so confused when we talk about privacy is because I've thought about it as an individual threat, like robbery, when it's really a collective and diffuse problem, like climate change.
But what's the alternative?
I need to use Google Maps.
My family's on Facebook.
I have to browse the web for work.
If I give up those things in the name of privacy, I'll be cut off from my world, my life.
At the top of the show, Shoshana Zuboff said, there is no privacy paradox.
And she's right.
There is no paradox because there is no choice.
We can't choose to use these tools.
They're not optional.
They are the utilities of modern life.
We are forced to get online, to use the various devices, the products and services that are also the interfaces for surveillance capitalism supply chains.
And I regard this as an intolerable situation, a choice that 21st-century citizens of democratic societies should never have to make.
Zuboff says that surveillance capitalism doesn't have an individual solution, the same way that my recycling a few plastic bottles won't stop climate change.
Above all, collective action is going to be essential.
You know, a century ago, we had a raw, destructive, violent industrial capitalism.
And a century ago, most people thought that the case was hopeless.
How would we ever triumph over these incredibly wealthy industrial capitalists who were employing child labor, who could force us to work 70 hours a week?
But what happened?
We discovered that we weren't just individual workers and we came together and we forced our democracies to grant us the rights we needed to institutionalize new forms of collective action, collective bargaining, to withhold our labor, to create an equilibrium between the power of capital and what ultimately became the power of labor.
We face a similar kind of challenge today.
If reducing our carbon footprint is the key to fighting climate change, perhaps it's time to talk about reducing our data footprint.
And that's not going to happen with each person individually tweaking their Facebook privacy settings or individually deactivating Instagram.
Doing this in a meaningful way will require that we collectively advocate for a new suite of privacy laws.
It should be harder for people to give up their data, harder for companies to sell our data, and harder for groups like political campaigns to buy it.
The motives of surveillance capitalists are mostly invisible to us.
Imagine how much better it would be if your behavior or mine were just as invisible to them.
Throughout this episode, I've been hung up on the questions, how does surveillance hurt me?
Why should I care?
What have I got to hide?
But those are the right questions for the wrong century.
American privacy used to be about individual rights.
Keep your soldiers out of my house.
Keep your eyes off my postcards.
But this is a new age of privacy.
We are all blind participants in a grand experiment to engineer our future behavior without any knowledge or say over what that future will be.
And to fix this problem, we can't think of it as one we can face alone.
We've been sold a bill of goods by the surveillance capitalists.
They say to us,
if you have nothing to hide, you have nothing to worry about.
That is a profound abuse of the truth.
The truth is this, Derek.
If you have nothing to hide, then you are nothing.
Because everything that makes you Derek and that makes me Shoshana, without this private inner resource, we are simply creatures of stimulus response, reduced to an animal condition in which they would tune us and herd us toward their behavioral outcomes.
Is that the world we want to create for our children?
My answer is no.
Emphatically, no.
Crazy Genius was produced by Patricia Yacob and Jesse Brenneman, with help from Kevin Townsend.
David Herman is our engineer.
Breakmaster Cylinder composed our theme song and all the music in this episode.
Katherine Wells is the executive producer of Atlantic Podcasts.
Adrienne LaFrance is our executive editor.
I'm Derek Thompson.
See you next week.