Should this creepy search engine exist?
Support the show: searchengine.show
To learn more about listener data and our privacy practices visit: https://www.audacyinc.com/privacy-policy
Learn more about your ad choices. Visit https://podcastchoices.com/adchoices
Transcript
This episode of Search Engine is brought to you in part by Perfectly Snug.
I am a hot sleeper.
I do all the right things and still wake up at 3 a.m., wide awake and sweaty.
Perfectly Snug is a fix for that.
It's a two-inch mattress topper with whisper-quiet fans that actively move heat and humidity away from your body.
The sensors actually work.
You don't have to fiddle with settings in the middle of the night.
And if you, like me, need quick relief, burst mode cools things down in about 10 minutes.
I recommend their dual zone setups so that one side of the bed can run cooler than the other side of the bed.
And there's no water to refill and nothing snaking off the bed.
It's just a slim topper that sips power.
Setup is a few minutes.
They offer a 30-night risk-free trial with free shipping and returns.
If you, like me, are tired of sweating through the night, try Perfectly Snug.
Not all search engines are good.
Some are morally dubious.
A couple months ago, I found myself using a morally dubious search engine.
This search engine lives on a website.
I'm not going to tell you the name of it for reasons that'll become clear, but I am going to describe to you how it works.
So you open the site, you upload a photo of a person's face.
I can upload a photo of myself right now.
Wait about 30 seconds for the search to run.
And then I get taken to this page, which includes all these different photos of my face from all these different places on the internet.
I can click on any of the photos and it'll take me to the site where the photo lives.
This is Shazam for faces.
If I put in a stranger's face, I'll almost always get their real name because it'll take me to one of their social media profiles.
From there, I can typically use a different search engine to find a physical address, often a phone number.
On the site itself, I also usually see stills from any videos the person has appeared in.
When I first learned about this site, I did what you do when you get Google for the first time.
I looked myself up.
And then, I started looking at my friends.
And it took about 30 seconds before I saw things that made me pretty uncomfortable.
I was just seeing stuff I should not be seeing.
I don't know the most delicate way to say this, except
people I knew had compromising stuff on the internet.
Stuff they had put there, but not under their real names.
And I don't think they knew, I certainly hadn't known, that the technology to search someone's face was right around the corner.
I decided to stop using the search engine.
The line between general internet curiosity and stalking, this felt like the wrong side of it.
It felt seedy.
But now, even just knowing this tool existed changed how I thought in the real world.
I found myself trying to reach for it, the way any digital tool that works begins to feel like another limb.
I found a driver's license on the floor of a dance club.
The person had a name too common to Google, like Jane Smith, but I realized you could just find their face with the search engine.
Another night, two people at a restaurant were talking.
One of them, the guy, was telling what sounded like a very personal story about the vice president of America.
Who was this guy?
I realized if I snapped a photo of him, I now had the ability to know.
We take for granted the idea that we have a degree of privacy in public, that we are mostly anonymous to the strangers we pass.
I realized this just wasn't true anymore.
Right now, there are a lot of discussions about AI chatbots, about the ethics and problems of a very powerful new technology.
I feel like we should also be talking about this technology, these search engines, because my feeling using one was we are not at all ready for this.
This thing that is already here.
And I wanted to know, is it too late?
Is there a way to stop these tools or limit them?
And I especially wanted to know, who unleashed this on us?
So I called the person you call when you have questions like this.
Can you introduce yourself?
Sure.
I'm Kashmir Hill.
I am a technology reporter at the New York Times, and I've been writing about privacy for more than a decade now.
Kashmir is one of the best privacy and technology reporters in America.
She published a book a few months ago about these search engines and about the very strange story of how she discovered that they even existed.
It's called, appropriately, Your Face Belongs to Us.
Her reporting follows a company called Clearview AI, which is not the search engine I was referencing before.
Clearview AI is actually much more powerful and not available to the public.
But in many ways, Clearview created the blueprint for copycats, like the one I had found.
Kashmir told me the story of when she learned of Clearview AI's existence, back when the company was still in deep stealth mode.
So I heard about Clearview AI.
It was November 2019.
I was in Switzerland doing this kind of fellowship there.
And I got an email from a guy named Freddie Martinez who worked for a nonprofit called Open the Government.
And he does public records research and he's obsessed with privacy and security as I am.
And I'd known him for years.
And he sent me this email saying, I found out about this company that's crossed the Rubicon on facial recognition.
That's how he put it.
He said he'd gotten this public records response from the Atlanta Police Department describing this facial recognition technology they were using.
And he said it's not like anything I've seen before.
They're selling our Facebook photos to the cops.
And he had attached the PDF he got from the Atlanta Police Department.
It was 26 pages.
And when I opened it up, the first page was labeled privileged and confidential.
And it was this memo written by Paul Clement, whose name I recognize because he's kind of a big deal lawyer, was solicitor general under George Bush, now in private practice.
And he was talking about the legal implications of Clearview AI.
And he's describing it as this company that has scraped billions of photos from the public web, including social media sites, in order to produce this facial recognition app where you take a photo of somebody and it returns all the other places on the internet where their photo appears.
And he said, we've used it at our firm.
It returns fast and reliable results.
It works with something like 99% accuracy.
There's hundreds of law enforcement agencies that are already using it.
And he had written this memo to reassure any police who wanted to use it that they wouldn't be breaking federal or state privacy laws by doing so.
And then there was a brochure for Clearview that said, stop searching, start solving, and that it was a Google for faces.
And as I'm reading it, I'm just like,
wow, how have I never heard of this company before?
Why is this company doing this and not Google or Facebook?
And does this actually work?
Is this real?
Because it is violating things that I have been hearing from the tech industry for years now about what should be done about facial recognition technology.
I flash back to this workshop I'd gone to in D.C., organized by the Federal Trade Commission, which is kind of our de facto privacy regulator in the United States.
And they had a bunch of what we call stakeholders there.
Google was there, Facebook was there, little startups, privacy advocates, civil society organizations, academics.
Good morning, and I want to welcome all of you, both here in Washington, D.C.
and those watching online to today's workshop on facial recognition technology.
This workshop that Kashmir remembers, it happened in 2011.
It was called Face Facts.
The video of the workshop on the FTC's website shows a string of speakers presenting at a podium in front of a limp-looking American flag.
We will focus on the commercial use, that is, on the possibilities that these technologies open up for consumers, as well as their potential threats to privacy.
Most of you know this, but the mission of the FTC, well, you're going to need a club before you get folks through.
They're talking about the nitty-gritty of facial recognition technology.
What safeguards need to be put in place around this technology that's rapidly becoming more powerful?
And everyone in the room had different ideas about what we should be doing.
You know, Google and Facebook at that point were just tagging friends in photos.
And there are some people there saying we need to kind of ban this.
But there was one thing that everybody in the room agreed on.
And that was that nobody should build a facial recognition app that you could use on strangers and identify them.
Since day one, we asked ourselves, how do we avoid the one use case that everybody fears, which is to de-anonymize people?
That's the CEO of a facial recognition company that would soon be acquired by Facebook.
He was saying they had to prevent the use case no one wanted.
Shazam for faces.
So the input into our system is both the photos and the people that you want to have identified, and that will give you back the answer.
So in fact, you can never identify people you do not know.
That's our mantra, right?
This is the one thing that we wanted to make sure doesn't happen.
And so now I'm looking at this memo that says that has happened.
Right.
And so yeah, I was very shocked.
And I told Freddie, I'm definitely going to look into this like as soon as I fly back to the United States.
And that's what I did.
So at this point, in late 2019, here's what Kashmir knows about Clearview AI.
It's supposedly a very powerful technology that has scraped billions of photos from the public web.
And it's being used by the Atlanta Police Department.
She doesn't know who's behind the company.
but she has ideas about how to find them.
She starts calling their clients.
And so I reached out to the Atlanta Police Department.
They never responded.
Other FOIAs were starting to come in that showed other departments using Clearview.
And I just did a kind of Google dorking thing where I searched for Clearview and then site:.gov to see if it showed up on budgets.
Oh, that's really smart.
Yeah.
And so I started seeing Clearview and it was really tiny amounts, like $2,000, $6,000, but it was appearing on budgets around the country.
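(A quick aside for the curious: "dorking" just means using a search engine's advanced operators. The exact query Kashmir used isn't in the transcript, but a search of that general shape would look something like this:

    "Clearview AI" site:.gov
    "Clearview" site:.gov filetype:pdf

The site: operator limits results to government domains, where public budgets and purchase orders tend to live, and filetype: narrows things further to posted documents.)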
And so I would reach out to those police departments and say, hey, I'm looking into Clearview AI.
I saw that you're paying for it.
Would you talk to me?
And eventually, the first people to call me back were the Gainesville Police Department, a detective there named Nick Ferrara.
He's a financial crimes detective.
And he calls me up on my phone.
He said, oh, hey,
I heard that you're working on a story about Clearview AI.
I'd be happy to talk to you about it.
It's a great tool.
It's amazing.
And he said he would be the spokesperson for the company.
So he told me.
He's just like, this is great.
He loved it.
He said he had a stack of unsolved cases on his desk where he had a photo of the person he was looking for, like a fraudster.
And he'd run it through the state facial recognition system, not gotten anything.
And he said he ran it through Clearview AI and he got hit after hit.
And he just said it was this really powerful tool.
It worked like no facial recognition he'd used before.
The person could be wearing a hat, glasses, looking away from the camera, and he was still getting these results.
And this is sort of the positive case for any of this, which is that if a dangerous person who like has committed violent crimes is out in the world and there's some photo of them where maybe they were like robbing a bank and their mouth was covered and there's a hat low over their head.
And if a cop can take that surveillance still, plug it into a big machine and find this person's name, we live in a safer world.
Right.
This is the ideal use case.
Solving crimes, finding people who committed crimes, bringing them to justice.
Yeah.
And so Nick Ferrara, this detective said, yeah, it works incredibly well.
And I said, well, I'd love to see what the results look like.
I've never kind of seen a search like this before.
And he said, well, I can't send you something from one of my investigations, but why don't you send me your photo?
And I'll run you through Clearview and I'll send you the results.
So I do that.
I send some photos of myself.
How do you pick the photos?
I tried to choose hard photos.
So I had one where like my eyes were closed, one where I was wearing a hat and sunglasses, and another that was kind of like an easy photo in case those other two didn't work.
And then I waited to hear how it went and see for myself how well the software works.
And Nick Ferrara ghosts me.
He just totally disappears.
Disappears.
Won't pick up when I call him, doesn't respond to my email.
Kashmir says she tried this again with a different police officer in a different department, and the same thing happened.
They were friendly at first.
Kashmir asked them to run a search on her face.
They agreed.
And then
they were gone.
And so eventually I kind of recruited a detective in Texas, a police detective, who was
kind of a friend of a friend at the Times and said, oh, you're looking into this company.
I'm happy to download the tool, tell you what it's like.
And so he requests a trial of Clearview.
And at this point, Clearview was just giving out free trials to any police officer as long as they had an email address associated with the department.
So, like what Facebook did when they first opened, but with college campuses.
Yeah, exactly.
It was exclusive just for government workers.
And so he goes to their website where he can request a trial.
Within 30 minutes, he's got Clearview on his phone and he starts testing it, running it on some suspects whose identity he knows, and it works.
He tried it on himself, and he kind of had purposely not put a lot of photos of himself online because he was worried about exposure and people coming after him who he had been involved in catching, sending to jail.
And it worked for him.
It found this photo of him on Twitter where he was in the background of someone else's photo.
And he had been on patrol, so it actually had his name tag on it.
So it would have been a way to get from his face to his name.
And he immediately thought, wow, this is so powerful for investigators, but it's going to be a huge problem for undercover officers.
If they have any photos online, it's going to be a way to figure out who they are.
Yeah.
And so I told him about my experience with other officers running my photo.
And he ran my photo and there weren't any results, which was weird because I have a lot of photos online.
Like it just came up like nothing.
Nothing.
And then
within minutes, he gets a call from
an unknown number.
And when he picks up, the person says, this is Marco with Clearview AI tech support.
And we have some questions about a search that you just did.
Oh, my God.
And he says, why are you running photos of this lady from the New York Times?
And the detective kind of plays it cool.
And he's like, oh, I'm just testing out the software.
How would I know somebody in New York?
I'm in Texas.
And anyways, his account gets suspended.
Oh, wow.
And this is how I realize that even though Clearview is not talking to me, they have put an alert on my face.
And every time an officer has run my photo, they've gotten a call from Clearview telling them not to talk to me.
Just to spell out what Kashmir believed was going on here, these police officers may have thought they were using a normal search engine like Google, but what they hadn't counted on was that someone on the other end of that search engine seemed to be watching their searches, surveilling the cops who were using the surveillance technology.
It was a moment where Kashmir saw clearly how this mysterious company, by being the first to build this tool no one else would, had granted itself immense power to monitor Kashmir, to monitor these cops.
This company whose product would reduce the average American's privacy was keeping quite a lot of privacy for itself.
Of course, Kashmir is, fortunately for us, a nosy reporter, so all this cloak and dagger behavior just made her more curious.
She tries to crack into the company a bunch of different ways.
She's reaching out to anybody online who might have links to the company.
She finds an address listed on Clearview AI's website.
It's in Manhattan.
But when she goes there in person,
there's no such address.
The building itself does not exist.
It's a real Harry Potter moment.
Finally, she tries something that does work.
On the website PitchBook, she can see two of Clearview AI's investors: Peter Thiel, no luck there, but also an investment firm based in New York.
They were north of the city, and they weren't responding to emails or phone calls.
So I got on the Metro North and went up to their office to see if they had a real office.
And it was kind of an adventure being there.
The office was empty.
All their neighbors said they never came in.
I kind of hung out in the hallway for about an hour.
A FedEx guy came.
He dropped off a box.
He says, oh, they're never here.
And I thought, oh my gosh, this is a waste of a trip.
But then I'm walking out of the building and it was on the second floor.
And I'm coming down the stairs.
And these two guys walk in.
And they just, they were wearing like lavender and pink.
And they just looked like moneyed.
They stood out and I said, oh, are you with Kirenaga Partners, which is the name of this investment firm?
And they look up and they smile at me and they say, yeah, we are.
Who are you?
And I said, I'm Kashmir Hill.
I'm the New York Times reporter who's been trying to get in touch with you.
And their faces just fall.
I said, I want to talk to you about Clearview AI.
And they said, well, Clearview AI's lawyer said that we're not supposed to talk to you.
And I was around seven months pregnant at this time.
And so I kind of like opened my jacket and just clearly displayed my enormous belly.
And I was like, oh, I've come so far.
It was cold.
It was raining out.
And David Scalzo, who's the main guy, main investor in Clearview at Kirenaga, he says, okay.
So Kashmir and the two investors go inside the office.
Kashmir tells them, all this not talking, it's making Clearview AI look pretty nefarious.
She has a point.
And so one of them agrees to go on the record and starts talking about his vision for the company that he has invested in.
David Scalzo said, right now they're just selling this to kind of like retailers and police departments.
But our hope is that one day everybody has access to Clearview.
And the same way that you Google someone, you'll clearview their face and be able to see all the photos of them online.
He says, yeah, I think, we think this company is going to be huge.
And now they give Kashmir the information that she'd really been looking for.
The names of the people who are actually responsible for this tool.
And they said, oh yeah, we're really excited about the founders.
And they say it's this guy, Richard Schwartz, who's kind of a media politics guy, worked for Rudy Giuliani when he was mayor.
And then there's this tech genius, real mastermind, young guy, and his name's Hoan Ton-That.
And I, we were in a conference room.
So I'm like, can you write that up on a whiteboard for me?
How do you spell Hoan Ton-That?
And so he writes it out.
And this is the first time I figure out who the people are behind this.
After the break, the story of how Hoan Ton-That and his engineers got your face and my face and 3 billion photos' worth of faces into this powerful new search engine.
This episode of Search Engine is brought to you in part by Bombas.
Fall's here, kids are back in school, vacations are over.
It is officially the start of cozy season, which means it's time to slide into some Bombas.
You know Bombas, the most comfortable socks, slippers, tees, and underwear out there, made from premium materials that actually make sense for this time of year.
The season's softest materials, think merino wool that keeps you warm when it's chilly, but cool when it's hot.
Supima cotton that's softer, stronger, and more breathable than regular cotton.
And even rag wool, the thick, durable, classic cozy sock you'll want all fall.
The best part, for every item you buy, Bombas donates one to someone experiencing homelessness.
Over 150 million items have been donated thanks to customers.
I mostly just wear Bombas socks.
I've been only wearing red socks because it's harder for people to steal them from me.
But I've decided this year, I'm switching to purple.
You can head over to bombas.com/engine and use code ENGINE for 20% off your first purchase.
That's bombas.com/engine.
Code ENGINE at checkout.
This episode is brought to you in part by Ollie.
I have a dog who I love very much named Ralphie.
Ralphie is a very picky eater, and I worry a lot about his health.
And so I'm always looking for great dog food for him.
And if my dog could talk, he would probably beg for Ollie.
Ollie delivers clean, fresh nutrition in five drool-worthy flavors, even for the pickiest eaters.
Made in U.S.
kitchens with the highest quality human-grade ingredients, Ollie's food contains no fillers, no preservatives, just real food.
The way it works, you fill out Ollie's 30-second quiz, and they create a customized meal plan based on your pup's weight, activity level, and other health information.
For first-timers, Ollie will send out your pup's first box with two weeks' worth of meals, a free storage container for mess-free serving, and a guide for how to gradually switch them over to their new diet.
One to two months on Ollie can help your dog reach a healthy weight.
Head to ollie.com/search, tell them all about your dog, and use code SEARCH to get 60% off your welcome kit when you subscribe today.
Plus, they offer a happiness guarantee on the first box.
So if you're not completely satisfied, you'll get your money back.
That's O-L-L-I-E dot com slash search, and enter code SEARCH to get 60% off your first box.
Now that Kashmir had a name to search, Hoan Ton-That, she learned Hoan had an internet trail.
This guitar you're hearing, part of the trail.
Hoan had a habit in one chapter of his life of posting YouTube videos of himself pretty capably playing guitar solos.
In the videos, he doesn't speak, but you see him, tall and slender with long black hair, fashionable.
Hoan has Vietnamese roots and was raised in Australia.
He moved to San Francisco in 2007.
His internet breadcrumbs suggest a strange collage of a person, a Bay Area tech guy who presents in a slightly gender fluid way, has photos from Burning Man, but then also seems like a bit of a troll.
In a Twitter bio, he claims to be a, quote, anarcho-transsexual, Afro-Chicano American feminist studies major.
What is clear is that Hoan had come to America with big dreams of getting rich on the internet.
He started in the Farmville era of Facebook apps when you could make money building the right stupid thing online.
Nothing he tried really took off, though.
Not apps like Friend Quiz or Romantic Gifts, not later efforts like an app that took an image of you and Photoshopped Trump's hair on your head.
In 2016, Hoan would move to New York.
At some point, he'd delete most of his social media.
But Kashmir found an old artifact of who he was on the internet back then.
I found an archived page of his from Twitter on the Wayback Machine.
And it was mostly him kind of retweeting like Breitbart reporters and kind of saying, why are all the big cities so liberal?
Yeah.
He doesn't have a Twitter account.
He doesn't have a Facebook account.
It seemed like, wow, this is weird.
Like this guy is in his 20s, I think, but he doesn't have a social media presence beyond like a Spotify account with some songs that he apparently had done.
It was a strange portrait, but I came away thinking, wow, this person is a character.
Yeah.
And I really want to meet him.
At this point, it seemed like the company understood that Kashmir Hill was not going to go away.
A crisis communications consultant reached out and eventually offered to arrange an interview with Hoan Ton-That.
When I met him, he was not what I expected him to be, which was he still had the long black hair, but now he had these glasses that felt very like office worker glasses.
And he was wearing this blue suit and he just looked like security startup CEO.
Okay.
Which just, again, wasn't what I expected based on everything else I saw about him.
We met at a WeWork because they didn't have an office.
I would find out that he mostly kind of worked remotely, did a lot of the building of Clearview AI.
At the time, he lived in the East Village, and he kind of just did it in cafes, like places with free Wi-Fi.
So they booked a room at WeWork for our meeting.
The crisis communications consultant was there.
She'd brought cookies.
What type of cookies?
Chocolate chip.
And I feel like they were Nantucket or like Sausalito cookies.
I can't remember the brand.
Okay.
But yeah, and we had lattes at the WeWork cafe and we sat down and I just started asking my questions.
And for the most part, he's answering them.
And we had a couple of hours to talk.
And he really was telling me a lot.
And so it was this complete 180.
In person, he's very charismatic, very open,
and would be evasive about some things, wouldn't describe anyone else involved with the company besides Richard Schwartz, his co-founder.
But yeah, I mean, he was open.
And I was like, you have built this astounding technology.
Like, how did you do this?
How did you go from what you're telling me about Facebook apps and iPhone games to building this?
And he said, well,
I was standing on the shoulders of giants.
And he said, there's been this real revolution in AI and neural networks and a lot of research that kind of the most brilliant minds in the world have done.
They've open sourced.
Oh.
They've put it on the internet.
Hoan told Kashmir that in 2016, in the early days of building what would become the Clearview AI facial search engine, he'd taught himself the rudiments of AI-assisted facial recognition by just, essentially, Googling them.
He'd gone on GitHub and typed in face recognition.
He'd read papers by experts in the field.
He told her, quote, it's going to sound like I Googled flying car and then found instructions on it, which wasn't too far off.
Until pretty recently, facial recognition existed, but was somewhat crude.
What Juan was learning on the internet was that machine learning, neural networks, had just changed all that.
Now computers could teach themselves to recognize a face, even at an odd angle, even with a beard, provided that the computer was given enough images of faces, training data to learn on.
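To make that concrete, here is a minimal sketch of the general technique being described: face search as nearest-neighbor lookup over "embeddings." This is illustrative only, not Clearview's actual code, and embed() is a hypothetical stand-in for whatever published face-embedding network a system like this might use.

    import numpy as np

    def embed(photo) -> np.ndarray:
        # Hypothetical stand-in for a face-embedding neural network:
        # maps a cropped face image to a unit-length vector, so that
        # two photos of the same person land close together.
        raise NotImplementedError

    def build_index(photos_by_url):
        # Scrape-time step: embed every face photo collected from the web.
        return {url: embed(photo) for url, photo in photos_by_url.items()}

    def search(index, query_photo, top_k=10):
        # Query-time step: embed the probe face, then rank every stored
        # face by cosine similarity (a dot product of unit vectors).
        q = embed(query_photo)
        scored = sorted(
            ((float(vec @ q), url) for url, vec in index.items()),
            reverse=True,
        )
        return scored[:top_k]  # ranked (similarity, url) candidates

The point of the sketch is where the leverage sits: the matching logic is a few lines, and the embedding models were published research. What separates systems like this is the index, how many scraped faces there are to search over.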
We reached out to Clearview AI for this story.
We didn't get a response.
But in the years since his interview with Kashmir, Juan has done plenty of interviews with the press.
One thing I do respect is the fact that you decided to come here live for an interview, so I appreciate you for taking the time.
And thanks, Pat, for having me on.
What an interesting thing.
Here's one with the YouTube show Valuetainment.
Hoan's dressed as he was with Kashmir, in a suit, looking, again, like a standard tech exec, just with unusually long hair.
Here he describes what his research process for this search engine was like.
I was looking at the advances in AI.
So I saw ImageNet, which is a competition for recognizing things in images.
Is this a computer?
Is this a plant?
Is this a dog or a cat?
And the results got really good.
And then I looked at facial recognition, and I would read papers.
So Google and Facebook both had deep learning papers on facial recognition. I said, hey, can I get this working on my computer? And we ended up getting it working. And what we realized was, getting more data to train the algorithm, to make it accurate across all ethnicities, Iranian people, Black people, white people, brown people, that was really key to improving performance.
This would be Hoan's real innovation, a somewhat dark one. His advantage was how he would find the training data he needed. He built a scraper, a tool that would take, without asking, photos of human faces from pretty much anywhere on the public internet they could be nabbed.
He also hired people who'd built their own scrapers to hoover up even more photos.
He said, part of our magic here is that we collected so many photos.
And he built the scraper.
He hired people from around the world to help him collect photos.
And so it's similar to like when people talk about like large language models right now and companies like OpenAI,
some of what they're doing is tuning their neural networks, but a lot of what they're doing is like feeding their neural networks.
It's like they have to find every text that's ever been published in every library and then they run out of all the library texts and they have to find like
transcripts of YouTube videos, which like maybe they shouldn't be loading in there.
It's like part of what he had done correctly to get his product ahead of where the other ones were.
It's just like he was not a genius at making the underlying AI.
That was mostly open source.
He was passionate about finding faces on the internet to put into it.
Yes.
And so where was he looking?
Oh, man.
So the first place he got faces was Venmo.
This is funny to me because as a privacy journalist, I remember people being upset at Venmo's approach to privacy, which at the time was if you signed up for Venmo, your account was public by default.
Yeah.
And your profile photo was public by default.
And so he built this scraper, you know, this bot that would just visit Venmo.com every few seconds.
And Venmo at the time had a real-time feed of all the transactions that were happening on the network.
And so he would just hit Venmo every few seconds and download the profile photo, the link to the Venmo profile.
And he got just millions of photos this way from Venmo alone.
And this is essentially what he was doing.
But with, I mean,
thousands, millions of sites on the internet, Facebook, LinkedIn, Instagram, employment sites.
Yeah, just anywhere they could think of where there might be photos.
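Mechanically, what she's describing is not exotic. Here's a hedged sketch of a polling scraper of this general shape, where the endpoint, the field names, and the save() helper are all made up for illustration:

    import time
    import requests

    FEED_URL = "https://example.com/api/public-feed"  # hypothetical endpoint

    def save(image_bytes, source):
        # Hypothetical storage step: a real pipeline would write the image
        # and its source URL somewhere for later embedding and indexing.
        ...

    def poll_forever(seconds_between_hits=5):
        seen = set()
        while True:
            # Fetch whatever items the site currently exposes publicly.
            for item in requests.get(FEED_URL, timeout=10).json():
                photo_url = item.get("profile_picture_url")
                if photo_url and photo_url not in seen:
                    seen.add(photo_url)
                    image_bytes = requests.get(photo_url, timeout=10).content
                    save(image_bytes, source=item.get("profile_url"))
            time.sleep(seconds_between_hits)

Run against one site, that's a trickle. Run against thousands of sites for years, and it's a database of billions of faces.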
I just want to butt in here to say all of this is completely astonishing.
I know people at the dawn of social media who just didn't want to join Facebook or didn't understand why you would voluntarily offer your private life to the public.
But I don't think anyone, or at least anyone I knew, had an imagination sufficiently tuned to dystopia to know that if you had the brazenness to upload your photo to Venmo or to LinkedIn, you could one day be feeding your face into a machine.
A machine that today, if you go to a protest, is capable of using a photo of your face to find your name, your email, your employer, even your physical address.
Who knew this was the future we were fumbling our way towards?
I asked Kashmir about all this.
I use Venmo.
I use Facebook.
I'm fairly sure that when I signed up, I signed to terms of service.
I did not read it carefully.
I don't think there was a section in there like okaying that my face could be used in photoscraping software.
Is what he did legal?
So Venmo and Facebook both sent Clearview AI cease and desist letters saying, stop scraping our sites and erase the photos that you took, delete the photos that you took, but they never sued.
So this hasn't been tested in court, whether it's illegal or not.
So it's still a bit of a gray area and it hasn't been tested with Clearview because none of these companies have sued them.
In one interview, shot just a month after his conversation with Kashmir, Hoan sat down with a CNN reporter who asked about this, the legality of his project.
Is everything you're doing legal?
Yes, it is.
So we've gone through and have some of the best legal counsel from Paul Clement, who used to be the Solicitor General of the United States.
He's done over 100 cases in front of the Supreme Court.
And he did a study independently saying this is not, the way it's used is in compliance with the Fourth Amendment.
All the information we get is publicly available, and we have a First Amendment right to have public information on the Internet.
And you have to understand what it's also being used for.
We're not just taking your information and selling ads with it or trying to get more.
We're actually helping solve crimes with this.
So your counsel is making the argument that there's a First Amendment right to information that is publicly on the Internet?
Yes, and so if you take something like Google, Google, you know, crawls the internet, collects all these web pages, and you search it with keyword terms.
We're the same thing.
We take all this information on public web pages, but search it with the face.
Hoan's making a pretty radical argument here, even though his tone doesn't suggest it.
He's saying that someone being able to take your face and use it to make a search that will pull up your name, possibly your address, and more is nothing new.
It's just like Google.
His point is that Google collects every instance of your name on the internet.
Clearview AI is just doing that, but with your face.
And you know, attaching it to your name.
Whether you agree with this idea or not, it has happened and it has fundamentally changed how privacy works.
Kashmir says that most of us are just not prepared for this brave new world.
I just don't think that most people
anticipated that the whole internet was going to be reorganized around your face.
And so a lot of people haven't been that careful about the kind of photos they're in or the kind of photos they've put up of themselves or the kind of photos they've allowed to be put up of themselves.
And Hoan actually did a Clearview search of me there.
And I said, oh, well, last time this happened, there were no results for me.
And he said, oh, there must have been some kind of bug.
Sure.
He wouldn't admit that they had put this alert on my face, that they had changed my results.
But he ran my face and there were just tons of photos, like lots of photos I knew about.
But in one case, there was a photo of me at an event with a source.
And I was like, wow, I didn't realize.
I hadn't thought that through now, that if I'm out in public with a sensitive source and somebody takes a photo and posts that on the internet, that could be a way of exposing who my sources are.
And it was really stunning how powerful it was.
I mean, for me, there were dozens, if not hundreds of photos, me kind of in the background of other people's photos.
I remember there were, like, I used to live in Washington, D.C., and there were photos of me at the Black Cat, which is a concert venue, just in the crowd at a show.
It was incredibly powerful.
And I remember asking him, I was like, you've taken this software somewhere no one has before.
Like you have created this really powerful tool
that can identify anybody, you know, find all these photos of them.
You're just selling it to law enforcement.
But now that you've built this and you've described to me the accessibility of building this, there's going to be copycats.
And this is going to change anonymity, privacy, as we know it.
What do you think about that?
And I remember he kind of was silent for a little bit and he said, that's a really good question.
I'll have to think about it.
And it was just this stunning moment of seeing in action people that are making these really powerful technologies who really just are not thinking about the implications, who are just thinking, how do I build this?
How do I sell this?
How do I make this a success?
Since Hoan's interview with Kashmir, it seems like maybe he's had more time to think through better answers to hard questions.
We've watched a lot of these subsequent interviews.
What you notice is that now he'll say that as long as he's CEO, he'll make sure his tool is only ever in the hands of law enforcement and in some cases banks.
And he'll point again and again to the one strong reason why Clearview AI does need to exist.
Without Clearview AI, there are so many cases of child molesters that would have never been caught or children who wouldn't have been safe.
Child predator will be extorting your children online.
You don't even know about it.
Sextortion.
Child abuse.
Child abuser.
Child crimes.
Crimes against children.
Dark web.
Troves and troves of children's faces.
These are kids that wouldn't have been identified.
This is why our customers are very passionate about keeping the technology and making sure it's used properly.
It's hard to take the other side of that argument, but of course, Clearview AI is not just being marketed as an anti-child predator tool.
A Clearview AI investor told Kashmir he hoped one day it would be in the hands of regular people.
And potential investors in the company were given the tool to use on their phones, like just to use as a party trick.
Will Clearview AI actually ultimately roll this tool out for wide use?
Well, it sort of doesn't matter whether they do or not.
Because remember, the copycats already have.
After the break, how people are using and abusing this technology.
Right now.
This episode of Search Engine is brought to you in part by ChiliPad.
Will my kids sleep tonight?
Will I wake up at 3 a.m.
again?
Am I going to wake up hot and sweaty because my partner leaves the heat on?
Those are the thoughts that bounce around my head when I can't sleep too.
And let's face it, sleep slips away when you're too hot, uncomfortable, or caught in a loop of racing thoughts.
But cool sleep helps reset the body and calm the mind.
That's where ChiliPad by SleepMe comes in.
It's a bed cooling system that personalizes your sleep environment so you'll fall asleep faster, stay asleep longer, and actually wake up refreshed.
I struggle with sleep constantly and I have found that having a bed that is cool and temperature controlled actually really does make a huge difference.
ChiliPad works with your current mattress and uses water to regulate the temperature.
Visit www.sleep.me/search to get your ChiliPad and save 20% with code SEARCH.
This limited offer is available for search engine listeners and only for a limited time.
Order it today with free shipping and try it out for 30 days.
You can even return it for free if you don't like it with their sleep trial.
Visit www.sleep.me/search, that's S-L-E-E-P dot me slash search, and see why cold sleep is your ultimate ally in performance and recovery.
This episode of Search Engine is brought to you in part by LinkedIn.
As a small business owner, you don't have the luxury of clocking out early.
Your business is on your mind 24-7.
When you clock out, LinkedIn clocks in.
LinkedIn makes it easy to post your job for free, share it with your network, and get qualified candidates that you can manage all in one place.
I have actually tried posting a job to LinkedIn jobs.
It is exactly as easy as they advertise.
LinkedIn's new feature can actually help you write job descriptions and then quickly get your job in front of the right people with deep candidate insights.
Either post your job for free or pay to promote.
Promoted jobs get three times more qualified applicants.
At the end of the day, the most important thing to your small business is the quality of candidates.
And with LinkedIn, you can feel confident that you're getting the best.
Based on LinkedIn data, 72% of small businesses using LinkedIn say that LinkedIn helps them find high-quality candidates.
Find out why more than two and a half million small businesses use LinkedIn for hiring today.
Find your next great hire on LinkedIn.
Post your job for free at linkedin.com/pjsearch.
That's linkedin.com/pjsearch to post your job for free.
Terms and conditions apply.
So you leave that conversation.
At that point, do you write your story?
Yeah, I think the story came out about a week after that interview.
And what was the response to the story?
Did people understand the size of the thing?
Yeah, so it was a front-page Sunday story, and it was a big deal.
I remember it landed, and my phone was just blowing up because I was being tagged.
This was back when Twitter was still a healthy space for conversation.
So my Twitter's blowing up.
I'm getting all these emails.
People want to book me to talk about it on TV, on the radio.
Like it was just this huge deal.
People were stunned that it existed, that it was using their photos without consent.
Just the way Clearview had gone about it, the fact that they had surveilled me as a journalist, tried to prevent the reporting on the story.
It was a huge deal.
I thought this is going to be one of the biggest stories of the year.
This was January 2020.
I see.
Shit.
So the pandemic happens and does it just kind of like.
Yeah.
Like, I was hearing that there were going to be hearings in D.C.
They start getting cease and desist letters, lawsuits happen. But then March 2020, COVID hits, and it just instantly changed the conversation in the U.S. and around the world to health concerns, safety concerns.
And then I started seeing people talking about: can we use facial recognition technology to fight COVID? Can we start tracking people's faces, see where people were, with other people, if there's a known exposure, can we track people? And there was this talk about, yeah, using facial recognition technology.
So it's like we almost skipped the scared outrage phase of the technology because
the needle kind of juddered on everything with COVID.
Yeah, we did a little bit.
I mean, for certain groups, like European privacy regulators, they all launched investigations into Clearview and they essentially kicked Clearview out of their countries.
They said it was illegal.
I mean, there were repercussions for Clearview, but I feel like the bigger conversation, what do we do about this right now,
it just, it got pushed aside by that larger concern around COVID.
Somehow, our debate about these search engines was just one of the infinite strange casualties of COVID, a conversation we never quite got to have.
In the meantime, Clearview AI's copycats have continued to go further than the company itself, offering their search engines online for the public to use at a small cost.
None of these search engines is as powerful as Clearview, but all of them are powerful enough to do what privacy advocates were worried about back in 2011.
The tool I would end up finding online was one of those copycats.
I have been noticing more and more people using them mainly to settle scores with strangers on social media.
I asked Kashmir where she has noticed people using these search engines in the wild since she published her book.
I've seen news organizations using it.
One of the more controversial uses was a news organization that used it after October 7th to try to identify the people who were involved in the attacks on Israel.
Oh, wow.
And I was a little surprised to see it used that way.
Why were you surprised?
I was surprised just because
it's still controversial whether we should be using facial recognition this way.
And the same site that was using it had published stories about how controversial it is.
Oh, and that these search engines have scraped the public web and that they invade privacy.
Yeah, so I think it's still complicated.
It was a news outlet that had done these "maybe we shouldn't have this" stories.
And then they were also using the tech.
Yeah.
Like maybe this technology shouldn't exist, but also it's there.
So we're going to use it.
Which sort of feels like the story of every piece of technology we've ever had a stomachache about, which is we say we don't want it to exist.
And then some contingent circumstance arises in which at least some of us feel like, well, it's okay for me to use this here, even if I don't think it should exist.
Right.
Case-by-case basis.
I did ask Kashmir whether she'd seen these search engines used in a clearly positive way.
I've heard of people using it on dating sites to figure out if the person they're talking to is legit, make sure they're not being catfished, make sure this person is who they say they are.
I've heard about it being used
by people who have compromising material on the internet.
Say they have an OnlyFans or just something they don't want to exist on the internet.
And they've used these search engines to figure out how exposed they are.
And some of the search engines do let you remove results.
And so they've done that.
They've gotten rid of the links to stuff they don't want the world to know about.
I've talked to parents who have used these search engines to figure out if there's photos of their kids on the internet that they don't want to be out there.
Oh, wow.
Also, one woman I talked to, she's an influencer, she gets a lot of attention, and she didn't want it kind of blowing back on her kids.
So she stopped featuring them in any of her videos.
And she searched for them with one of these search engines and found out that there was a news photo of one of her kids.
A summer camp, I think, that one of the kids had gone to had posted photos publicly.
And so she asked them to take it down.
But yeah, I mean, there are some positive use cases for these engines.
So what Kashmir is saying is that the most positive use cases for these search engines might just be finding compromising content on the internet about yourself first before someone else using one of these search engines does, which seems like a questionable upside.
Kashmir has also seen facial search engines used in a way that I have to say was just breathtaking in its pettiness.
She recently reported on how the owner of Madison Square Garden was using facial recognition and surveillance to ban from the venue anyone who worked at a law firm his venue was in litigation with.
Kashmir even tested this.
She tried to go see a hockey game with a personal injury lawyer, something one used to be able to do freely.
So I bought tickets to a Rangers game and brought along this personal injury attorney whose firm was on the ban list just because I wanted to see this for myself.
And yeah, so I met her.
I was meeting her for the first time that night.
We stood in line.
Is this the ticket?
Yeah.
Just need to see the seat.
There we go.
We were walking in.
We put our bags down on the conveyor belt and just thousands of people streaming into Madison Square Garden.
But by the time we picked them up, a security guard walked over to us.
He said, I need to see some ID from her.
She shows her driver's license and he said, you're going to have to wait here.
Excuse me.
One moment.
So I just have to ask you to stand by. A manager just has to come speak with you, okay? And, um, we appreciate your patience. Just hang on for me for a couple minutes here, we'll get someone down to talk to you.
And a manager came over and gave her this note and told her she wasn't allowed to come in.
Wow.
She had to leave.
Her firm was involved with legal action against the venue.
The whole time, it was insane to see just how well this works on people, just in the real world, walking around.
Yeah.
It was so fast.
God, that's crazy.
When Kashmir reported this story, she actually heard from facial recognition companies who said they were upset that Madison Square Garden was doing this.
It was making their tools look bad.
It was not how they said they were supposed to be used.
But misuse of any technology, it's almost a given.
And facial recognition is being misused not just by corporations, but also by individuals.
So there was a TikTok account where the person who ran it, if somebody kind of went a little viral on TikTok, he would find out who they were and expose them.
The one video that really struck me is during the Canadian wildfires when New York City kind of turned to orange.
Yeah.
Somebody had done a TikTok that just showed Brooklyn and what it looked like, that it looked like something from Blade Runner.
And this guy walks by and he became the hot Brooklyn dad.
And so the TikTok account found out who hot Brooklyn dad was and then found out who his son was and said, if you want to date somebody who's going to look like Hot Brooklyn Dad one day, here's his son's Instagram account.
That is wildly bad behavior.
That's crazy.
Because that person didn't even consent to being in the first video.
But I'm sure people sent it to him and were like, hey, the internet thinks you're hot.
Don't worry.
They don't know who you are.
And they not only invaded his privacy further, but invaded his kids' privacy.
Yeah, just for fun.
And so that account was doing a lot of that.
And 404 Media wrote about it.
And eventually TikTok took the account down.
I mean, the thing that sort of hovers around all of this is that prior to the invention of these things, it was like the internet had taken a lot of everyone's privacy.
But the one thing we had was the idea that if people didn't know your name or if you did something under a pseudonym, there's a degree of privacy.
And now it's like your face follows you in a way it wasn't supposed to.
Or the internet follows your face is how I think about it.
It feels like there's a world in which technology like this would just be like fingerprint databases, where law enforcement would have it, the general public wouldn't have access to it.
Isn't that one way this could be going instead of the way it's happening?
Yeah, that is definitely a possible future outcome where we decide, okay, facial recognition technology is incredibly invasive, kind of in the same way that wiretapping is.
So let's only let the government and law enforcement have access to this technology legally.
They're allowed to get a court order or a warrant and run a face search in the same way that they can tap your phone line with judicial approval.
And the rest of us shouldn't have the right to do it.
I think that's one way this could go.
That seems preferable to me.
It seems good, but then you also think about governments that can abuse that power.
So recently here in the U.S., Gaza has been such a controversial issue.
And you have people out doing protests and there was a lot of talk about, well, if you are on this side of it, then you're aligned with terrorists and you are not going to get a job.
We're going to rescind job offers to college students who are on this side of the issue.
And it's very easy to act on that information.
Now, you can take a photo of that crowd of protesters and you can identify every single person involved in a protest and then you can take their job away.
Or if you're police and there's a Black Lives Matter protest against police brutality, you can take a photo and you can know who all those people are.
But I think you notice now when you see photos from the protest, all these students are wearing masks.
They're wearing COVID masks or they're wearing something covering their face and it's because they're worried about this.
They're aware of how easily they can be identified.
And the thing is, it might work, but I have tested some of these search engines.
And if the image is high resolution enough, even wearing a mask, somebody can be identified.
Really?
So just from like nose and eyes and forehead?
Yes.
I did this consensually with a photo of my colleague, Cecilia Kang, who covers tech and policy in D.C.
She sent me a photo of herself with a medical mask on.
I ran it through one of the search engines and it found a bunch of photos of her.
There's a world, you can imagine it, where someone passes a law and these tools are no longer offered to the public.
They become like wiretaps, something only the police are allowed to use.
We would get some of our privacy back.
But, and this might not come as a surprise, there have been problems when the police use these tools as well.
These search engines sometimes surface doppelgangers, images of people who look like you, but who are not you, which can have real consequences.
Kashmir reported the story of a man who was arrested for a crime he was completely innocent of.
The crime had taken place in Louisiana.
The man lives in Atlanta.
The police department had a $25,000 contract with Clearview AI, though the cops wouldn't confirm or deny that they'd used Clearview AI to misidentify him.
How do these search engines deal with errors?
Do they like correct things?
If they make a mistake, is there a process?
So in the minds of the creators of these systems, they don't make mistakes.
They aren't definitively identifying somebody.
They are ranking candidates in order of confidence.
And so when Clearview AI talks about their technology, they don't say they're identifying anyone.
They say that they are surfacing candidates and that ultimately it's a human being who's deciding which is a match.
It's the human making the mistake, not the system.
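That framing maps directly onto how a similarity search behaves. Reusing the hypothetical sketch from earlier in this piece (so index and surveillance_still are assumed, not real variables from any actual product): the system always returns its nearest neighbors, whether or not the actual person is in the database, which is exactly how a doppelganger ends up at the top of the list.

    # Continuing the hypothetical sketch: `index` is the embedded face
    # database and `surveillance_still` is a probe photo.
    candidates = search(index, query_photo=surveillance_still, top_k=5)
    for similarity, url in candidates:
        # The number is a confidence signal, not an identification.
        # Deciding whether 0.91 means "same person" or "look-alike"
        # is left to a human, which is where the liability gets placed.
        print(f"{similarity:.2f}  {url}")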
So if I were running for local office somewhere and there was a video of someone who looks like me doing something compromising and someone wrote a news story being like, hey, we put his face in the thing and this is what we found.
And I went, hey, you're smearing me.
They'd be like, we're not smearing you.
We're just pointing out that you look like this guy doing something he's not supposed to do in a video.
Right.
It's the news service that covered it that smeared you, not the facial recognition engine.
But for the person in jail, they know that they would not have been in jail if this technology didn't exist.
Yes, exactly.
So there's this power of the government, right?
Power of corporations.
And then just as individuals, I think about this basically every time I'm at dinner now at a restaurant and there's people sitting around me and I start having a juicy conversation, whether it's personal or about work.
And I think, wow, I really need to be careful here because anybody sitting around me could, if they got interested in the conversation, snap a photo of your face.
And with these kinds of tools, find out who you are.
That's what I always think about.
I was at a restaurant recently and like, it was like outdoor dining and I was with a friend and like in the next sort of like closed booth, there was this person and they were, they took a phone call and they were like, one sec, this is Kamala Harris.
And I think they were joking, but I could like hear them.
And I was like, oh, I could just have their face.
I could kind of figure this out.
I might be able to find out privileged stuff about like a conversation with a very high ranking member of the U.S.
government.
I was like, this is, I felt real nausea.
I felt nausea at the possibilities.
Yeah.
I mean, I think there's just so many moments in our daily lives where we just rely on the fact that we're anonymous.
Like, you know, you're at dinner, you're having this private conversation and then creepy PJ is going to be sitting there and looking up your connection to the vice president.
Has it made you more, are you different in public now?
Yeah, I mean, I just think that is the risk of facial recognition technology.
The same way that we feel this concern about what we put on the internet, like the tweets you write, the emails you write, the texts you send, just thinking, am I okay with this existing and possibly being tied back to me, being seen in a different context?
That is going to be our real-world experience.
You have to think all the time, is something that I'm saying right now that could be overheard by a stranger something that could get me in trouble or something that I would regret.
And I don't know, that just terrifies me.
I don't want to be on the record all the time, every minute, anywhere I am in public.
You just kind of assume that these things that you're doing aren't going to haunt you for the rest of your life or follow you for the rest of your life or be tied to you.
Yeah.
Unless you're a celebrity or a very famous face.
And it's been funny, because I've talked with various people who do have famous faces, and I talk about this dystopia where it's like everything you do in public will come back to haunt you.
And usually after the interview, they'll say, that's my life.
I'm like, yes, what this technology does is it makes us all
like celebrities, like famous people.
Minus the upsides.
Minus the upsides.
What do you do if you don't want to be in these databases?
Don't have photos of yourself on the public internet.
It's hard not to get into these databases.
These companies are scraping the public web.
So we can't get out of Clearview's database.
And there's no federal law yet that gives us the right to do that.
European privacy regulators have said that what Clearview AI did was illegal and that Clearview needed to delete their citizens from its database.
And Clearview basically said we can't tell who lives in Italy or who lives in the UK or who lives in Greece.
So there's not really much we can do.
It's funny though, because I'm not a technology CEO.
And if you asked me to actually fix that problem, I actually could fix that problem.
Like you could say anybody can email us and ask to be taken out if they prove that they live in Greece.
You would think they could actually do something about that.
Yeah.
This is where it gets so complicated.
For a while, Clearview AI was honoring requests from Europeans who wanted to be deleted from the database.
Yeah.
But then at some point, they just stopped and said, actually, we don't feel like we need to comply with European privacy laws because we don't do business in Europe anymore.
God.
Yeah.
They're like ungovernable.
Yeah.
In some jurisdictions, you can get the company to delete you.
In the U.S., there are a few states that have laws that say you have the right to access and delete information that a company has about you.
California is one of those states.
If you live in California, you can go to Clearview AI and give them your driver's license and a photo of you, and they'll show you what they have of you in the database.
And if you don't like it, you can say delete me.
But there are only a few states that have such a law.
For most of us, like here in New York, we don't have that protection.
So we can't get out of Clearview's database.
Facial recognition is hard because these companies are based in places that don't have great privacy laws like the United States.
And they're making people around the world searchable.
It really is a hard problem.
And on a larger sense, as a country, society, world, if we were like, we just don't want this technology to exist.
I know this is kind of like a child's question, but what would it look like to put the genie in the bottle?
I mean, make it illegal, force all companies to delete the algorithms, and you'd have to decide: are we talking about all facial recognition, your iPhone opening when you look at it?
Right.
Or are we talking about just these big databases that are searching for your face among millions or billions of other faces?
I don't think that's going to happen.
I don't think it's going away.
But I do think we have this kind of central question about facial recognition.
Should these companies have the right to gather all these faces from the public internet and make them searchable?
And I think that is something that could be shut down if we wanted it to be.
Kashmir Hill, she's a reporter at the New York Times and author of the very excellent book, Your Face Belongs to Us.
Go check it out.
Stick around after the break.
We have some show news.
This episode is brought to you in part by Odoo.
Running a business is hard enough.
So, why make it harder with a dozen different apps that don't talk to each other?
One for sales, another for inventory, a separate one for accounting.
Before you know it, you're drowning in software instead of growing your business.
That's where Odoo comes in.
Odoo is the only business software that you'll ever need.
It's an all-in-one, fully integrated platform that handles everything: CRM, accounting, inventory, e-commerce, HR, and more.
No more app overload, no more juggling logins, just one seamless system that makes work easier.
And the best part?
Odoo replaces multiple expensive platforms for a fraction of the cost.
It's built to grow with your business, whether you're just starting out or already scaling up.
Plus, it's easy to use, customizable, and designed to streamline every process.
So you can focus on what really matters: running your business.
Thousands of businesses have already made the switch.
Why not you?
Try Odoo for free at odoo.com.
That's odoo.com.
Will my kids sleep tonight?
Will I wake up at 3 a.m.
again?
Am I going to wake up hot and sweaty because my partner leaves the heat on?
Those are the thoughts that bounce around my head when I can't sleep too.
And let's face it, sleep slips away when you're too hot, uncomfortable, or caught in a loop of racing thoughts.
But cool sleep helps reset the body and calm the mind.
That's where ChiliPad by SleepMe comes in.
It's a bed cooling system that personalizes your sleep environment.
So you'll fall asleep faster, stay asleep longer, and actually wake up refreshed.
I struggle with sleep constantly, and I have found that having a bed that is cool and temperature controlled actually really does make a huge difference.
ChiliPad works with your current mattress and uses water to regulate the temperature.
Visit www.sleep.me/search to get your ChiliPad and save 20% with code SEARCH.
This limited offer is available for search engine listeners and only for a limited time.
Order it today with free shipping and try it out for 30 days.
You return it for free if you don't like it with their sleep trial.
Visit www.sleep.me/search, that's S-L-E-E-P dot me slash search, and see why cold sleep is your ultimate ally in performance and recovery.
Welcome back.
So, quickly before we go this week, we are heading towards the end of season one of Search Engine.
Is there going to be a season two of Search Engine?
How has season one gone?
Great questions.
We will be answering them, all of them, and whatever other questions you have about Search Engine's present and future in questionably transparent detail at our upcoming board meeting.
The date is Friday, May 31st.
We will be sending out the details with the time and a Zoom link to join.
This is only for our paid subscribers, people who are members of Incognito Mode.
If you are not signed up, but you want to join this meeting, you've got to sign up.
You can do so at searchengine.show.
You get a lot of other stuff too.
You can read about all the benefits on the website.
Again, that URL is searchengine.show.
If you're a paid subscriber, look out for an email from us next week and mark your calendar, May 31st, 2024.
Search Engine is a presentation of Odyssey and Jigsaw Productions.
Search Engine was created by me, PJ Vogt, and Sruthi Pinnamaneni, and is produced by Garrett Graham and Noah John.
Fact-checking this week by Holly Patton.
Theme, original composition, and mixing by Armin Bazarian.
Our executive producers are Jenna Weiss-Berman and Leah Reese Dennis.
Thanks to the team at Jigsaw, Alex Gibney, Rich Perello, and John Schmidt.
And to the team at Odyssey, J.D.
Crowley, Rob Morandi, Craig Cox, Eric Donnelly, Kate Hutchison, Matt Casey, Maura Curran, Josephina Francis, Kurt Courtney, and Hilary Schuff.
Our agent is Oren Rosenbaum at UTA.
Follow and listen to Search Engine with PJ Vogt now for free on the Odyssey app or wherever you get your podcasts.
Thank you for listening.
We will see you in two weeks when
we'll have a double episode for you.
It's our version of The Wall.