Sextortion: The Darkest Deepfake Scams, How to Protect Yourself and Laurie Segall's Hunt for “Mr. Deepfakes”
Nicole and Laurie zoom out to ask even bigger questions: What does this mean for women, for democracy, and for the future of AI? Spoiler alert: we are still very much in the Wild West.
Follow Laurie’s work, and her investigation into Mr. Deepfakes here.
Press play and read along
Transcript
Speaker 1
Here's one piece of advice that I've given for years. Build an emergency fund.
Aim to stash away enough to cover at least three months of expenses in case your income suddenly drops.
Speaker 1 Sounds simple, right? But let's be honest, it's not. Saving even one month's worth of living costs can feel impossible.
Speaker 1
Just when you're making progress, that check engine light blinks on and derails your plans. Life already throws enough curveballs.
You don't need your bank adding to the chaos.
Speaker 1 That's why it's so important to choose one that makes saving easy and doesn't nibble away at your hard-earned money with ridiculous fees. Chime understands that every dollar counts.
Speaker 1 That's why when you set up direct deposit through Chime, you get access to fee-free features like free overdraft coverage, getting paid up to two days early with direct deposit, and more.
Speaker 1 With qualifying direct deposits, you're eligible for free overdraft up to $200 on debit card purchases and cash withdrawals. To date, Chime has spotted members over $30 billion.
Speaker 1
Work on your financial goals through Chime today. Open an account in just two minutes at chime.com/MNN.
That's chime.com/MNN. Chime feels like progress.
Speaker 2
Chime is a financial technology company, not a bank. Banking services and debit card provided by The Bancorp Bank, N.A. or Stride Bank, N.A.
Members FDIC.
Speaker 2
SpotMe eligibility requirements and overdraft limits apply. Timing depends on submission of payment file.
Fees apply at out-of-network ATMs. Bank ranking and number of ATMs according to U.S.
Speaker 2 News and World Report 2023. Chime checking account required.
Speaker 3 I live in LA now, but lately I have been craving the seasons. Snow, hot cocoa, the whole thing.
Speaker 3 I don't even ski, but I have been daydreaming about working remotely from somewhere really cozy on the East Coast, like a cute little ski town for a little bit.
Speaker 3 And whenever I know I'm going to be gone for a while, I always remind myself that my home can actually be working for me while I'm away because I host my space on Airbnb.
Speaker 3 It is one of the easiest ways to earn passive income from something you already have, and that extra income feels particularly helpful this time of year as we approach the holidays.
Speaker 3 A lot of my friends say, that sounds amazing, but where do you find the time to manage guests and bookings? And that's when I tell them about Airbnb's co-host network.
Speaker 3 Through Airbnb, you can find a local co-host who can help you set up your listing, handle reservations, communicate with guests, provide on-site support, even help with design and styling.
Speaker 3 I like to give a personal touch when I'm hosting on Airbnb. So I make a list of my favorite restaurants in the area and I handwrite a note welcoming my guests to the property.
Speaker 3 My guests love it, but I also know that some of those little personal touches can take a lot of extra time. So this is the exact kind of thing that you would want your co-host to help you with.
Speaker 3 Whether you're traveling for work or chasing the snow or escaping it, or you've got a second place that just sits there empty more often than you'd like, your home doesn't have to just sit there.
Speaker 3 You can make extra money from it without taking on extra work. Find a co-host at airbnb.com/host.
Speaker 4 I'm Nicole Lapin, the only financial expert you don't need a dictionary to understand.
Speaker 5 It's time for some money rehab.
Speaker 3 Today I'm joined by one of the bravest voices in tech journalism, Laurie Segall. And as you're about to hear, I've known her for about 100 years, but you know her too.
Speaker 3 You've seen her on CNN and 60 Minutes. And if you've been following her reporting over the last few years, you've probably found yourself both captivated and terrified.
Speaker 3 Laurie's latest work uncovers one of the darkest corners of the internet: deepfakes, specifically the dangers of AI-generated images of real people in fake sexual acts.
Speaker 3 In our conversation, she explains the common deepfake crimes and scams and how to protect yourself. We also talk about her totally insane investigation into Mr.
Speaker 3 Deepfakes, a man that she calls one of the most dangerous people online, and what happened when she actually tracked him down and confronted him.
Speaker 3 And finally, we talked about the bigger picture, what this means for women, for our democracy, and the future of AI. Honestly, my takeaway, it's definitely still the Wild West.
Speaker 4
So happy to be here. So happy to say welcome to Money Rehab.
Yeah, thank you. We've known each other for 100,000 years.
Correct. We worked together at CNN 50,000 years ago.
Correct.
Speaker 4
But when I saw online that you were sextorted, first of all, I wanted to kill that person. Yeah.
And then I was like, what is sextortion? And can I be sextorted?
Speaker 5
Yeah. The answer is unfortunately like any of us can, which is terrifying.
That's like the reality of the world we're entering.
Speaker 4 So what is sextortion?
Speaker 5
This would be like if you're a child and someone reaches out on like Instagram and pretends to be your friend. And let's say you have a teenage boy.
By the way, all this is going to seem really dark.
Speaker 5 So I just, sorry. Let's say if you're like a teenage boy, pretty girl reaches out, starts trying to get them to send some kind of provocative image.
Speaker 5 And the next thing you know, they say, if you don't pay me X amount of money, I'm going to send this to every single person you know. But your child never even took the image, right?
Speaker 5
They were never even tricked into it. It's a deep fake of them.
And it doesn't matter that it's not real because it looks real.
Speaker 5
And these types of sextortion campaigns are so horrific, and children are ending their lives. And this isn't just children.
This is happening to adults.
Speaker 5 This has been an ongoing thing for a while.
Speaker 5 So what's happening with the rise of artificial intelligence and deepfakes is basically the democratization of these types of scams, and none of us are safe, right?
Speaker 5 So someone could say, I have a nude image of you and I'm going to pass it around to everyone. And I always look at this from a victim standpoint.
Speaker 5
You can't just say, oh, it's just not me, because you'll see that image and it looks like you.
To an untrained eye, it's you.
It could tarnish your reputation.
Speaker 5
People might not know it's not real. So these images aren't real, but the impact is real.
So we always want to like shout this from like the mountaintops. This is what's coming.
Speaker 5 And we need tech companies to do a much better job.
Speaker 4
So let's talk about it. So somebody reached out to you.
And then what happened?
Speaker 5 Apparently, this is like a very common scam. And I just happened to know a lot of good guy hackers from my days going to hacker conferences as a tech journalist.
Speaker 5
So I was able to forward this and be like, what is happening? But someone reached out and said, I have images of you.
I've been able to hack into your device.
Speaker 5 If you don't give me X amount of money, implying they had intimate images of me, I will put this out there.
Speaker 5 And what they do is, I think, they put some identifiable information about you, like your home address or something like that. So for a second, you thought maybe somebody really had something.
Speaker 5
And this is what happens. And I'm like a long time tech journalist, right? Like I am like, I got this.
Like I don't have to freak out. And even me, I was like, oh my God, do they have some?
Speaker 5
What do they have? And you start questioning. Your breath gets short.
I literally had a security guy come onto my computer. I like had him remote into my computer and look for any malware.
Speaker 5
I wanted to be completely sure. And he was like, no, these scams are actually going around.
So I posted about it.
Speaker 5
And the next thing you know, I get all these people messaging me privately being like, this is happening to me. I was so scared.
And one of the things they say, this is so dark.
Speaker 5
I love that we're really starting out strong on this. They'll be like, we saw you on a porn site or something.
And now we've remoted in. People are embarrassed to talk about it.
Speaker 5 So it's just, I would say, wild.
Speaker 5 It is a wild west right now. And there are so many of these different types of scams going around.
Speaker 5 Like we are in the wild west of scams that are only made so much more believable by artificial intelligence, right?
Speaker 5 Like parents getting calls from what they believe are their children because their voice has been imitated using artificial intelligence because it takes 30 seconds of a voice sample to be able to mimic that voice.
Speaker 5 This is, I think, the world we're entering at all these different levels where our identities are up for grabs and AI can just mimic our most intimate features, our face, our bodies, our voice.
Speaker 5 And so it's a bit of a wild west. And I think we have a long way to go with educating people on it.
Speaker 4
For sure. My husband and I even had this conversation recently where we said we needed a safe word.
Right.
Speaker 4
So if somebody gets called saying that they were kidnapped or I don't know, I can't even imagine. We probably don't even know what could happen or will happen.
Like say
Speaker 4 strawberry. That's not our safe word, but say something like that.
Speaker 5
100%. Hilariously, we all need human passwords.
This is literally what one of the top security guys said to me. I was like, how do people protect themselves? He's like, human passwords, safe words.
Speaker 5 I called my mom and I was like, if you get something like this from me, which it seems crazy to have to call our parents and say, like, if you hear an AI generated voice or you're not sure if it's me, this is the word you need to say.
Speaker 5 This is our human password. In an interesting way, it's almost like our humanity is the thing that we're hoping will help us pull through in this weird time.
Speaker 4 The analog. I know.
Speaker 4
Yeah, I guess. In this brave new world.
So then what happened? You brought in the security experts. You are used to tracking down criminals.
Speaker 5 So this was all kind of a precursor to a larger investigation that we've been working on. When I say I get obsessed with topics, this is for better and for worse.
Speaker 5 And I think three years ago, I became obsessed with this idea. Someone had mentioned to me there was this really shady site online and it's a deep fake pornography site.
Speaker 5 So literally it looked like, and I mean, this is dark, but it's like you were watching sex tapes, essentially, of many women in the public eye, even though they never made them, even though they never would ever consent to something like this.
Speaker 5 But you were looking at hyper-realistic deepfake pornography. If you are even kind of a public figure, there's a chance you were on this site.
Speaker 5
And I remember going to this being like, wait, this is insane. And then I started looking into it. At the peak of it,
18 million people were going to this site on a monthly basis.
Speaker 5 So I'm like, none of these women consented to having their image and likeness used.
Speaker 5 And like, you couldn't tell if it was real or fake, although like, we know it's not real, but that harm is very real. And I just remember being like, why does this exist?
Speaker 5 Like, why are 18 million people allowed to go and see this? And these women have no control. And ironically, it was like a lot of women in power.
Speaker 4 So, like ScarJo and Taylor Swift?
Speaker 5
Yeah. Taylor Swift was on.
There were so many people that were like, their likeness was taken and used on the site.
Speaker 5 And this site, I became obsessed with it because I was like, okay, it's not just like about this shady site on the internet, but it was a platform, right?
Speaker 5 So it's not like people just went and saw these horrific videos. They could also like create them or pay people to create them.
Speaker 5 And so it became a whole platform and an ecosystem built around the idea of sexually explicit deepfakes: saying, oh, I like that woman. I want her doing this with this person.
Speaker 5
I don't care what she says. I'm just going to use AI to make my dreams come true.
Like your wish is AI's command. That was what this site was.
And it was called Mr. Deepfakes.
Speaker 5 And I remember they also had like training manuals. So it wasn't just about these public figures, these women.
Speaker 5 It was about training young men how to do this and take it into their schools, into their workplaces.
Speaker 5
So you look on the message boards and it'd be like, oh, I want to do this to my sister-in-law. I love tech.
I love artificial intelligence. I think it's going to do incredible things.
Speaker 5 But this is ground zero for what happens when it is misused and it's used as a weapon against women and girls and eventually all of us, right? So I became very obsessed with Mr.
Speaker 5 Deepfakes and tracking him.
Speaker 5 And it took us all on an investigation that was very wild and never a dull moment. Spoiler alert: we found Mr. Deepfakes. Yay. Yes. It was probably a couple years ago, and I'm like, we should just start talking about this on the internet and explaining why people need to care about this shady site. So we said, okay, I believe this is one of the most dangerous men on the internet, the person behind this, and we need to know his name before it's too late. Because why should this person, who has harmed so many women, be afforded anonymity?
Speaker 5
This site had been up and running for seven or eight years and he was anonymous. So you had no idea who was doing this.
And I just thought, let's find him. Let's just try.
On my
Speaker 5
team, we have some incredible investigative journalists that came with me from my 60 Minutes days. One of my colleagues, Nicole, she could be an FBI agent if she wanted.
She's wonderful.
Speaker 5 We started talking about it, and I remember I started talking about it at a moms' conference, and all these moms got behind us with this idea that this might be about a shady deepfake porn site, but actually this is about the future of bullying.
Speaker 5 This is about what could happen in your schools with young men doing this to women, thinking it's okay. Like this is normalizing a new type of abuse.
Speaker 5 And so I think a lot of people really resonated with that message. And I remember I was getting my nails done.
Speaker 5 And all of a sudden, I didn't even know I had another inbox on TikTok, but I was on TikTok looking at the other inbox, which is messages that sometimes they filter.
Speaker 5 And this security company, security legal company called Sedenti had reached out and a guy named Jordi was like, we have a tip. We believe we have found him.
Speaker 5 And so I'm like, okay, this feels, I'm not sure if this is real, like 100%, but I'm like, obviously we're vetting it.
Speaker 5 And we ended up getting a dossier that had, I want to say, 17, 18, 19 different data points.
Speaker 5 I brought in another security firm, and we all basically tracked him down via social media, via the names we were given.
Speaker 5 And there were so many connections because anytime you do something on the internet, like you're just not hidden. This is what I've learned through all my years in investigative reporting.
Speaker 5
Like covering your traces is actually very difficult and you will make mistakes. And, you know, he made mistakes years ago.
There was an 8chan post from him.
Speaker 5 8chan is like 4chan, a message board where people put crazy theories and memes and cultural things.
Speaker 5
And it's a place where a lot of, you know, internet lovers for better and for worse go and say some of the weirdest stuff, and great stuff too, but it's a weird place.
He had an 8chan post.
Speaker 5
We had him talking about a car, like a red Mitsubishi. We were able to track it, and we ended up in front of his parents' home, with the red Mitsubishi there.
Like all sorts of crazy investigation went into it.
Speaker 5
And we tried to reach out to him many, many times. He wouldn't answer.
He took down all his social media. We reached out to friends and family.
And then finally, we said, let's go.
Speaker 5 Let's try to find him and talk to him in person.
Speaker 4 Hold on to your wallets.
Speaker 5 Money Rehab will be right back.
Speaker 1 Hey, Money Rehabbers, aren't you crushing it with your savings goals? Of course you are. But did you know that you can make your money work as hard as you do with U.S.
Speaker 1 Bank Smartly checking and savings?
Speaker 1 From tracking your spending to growing your savings, Bank Smartly can help you reach your goals faster with higher savings rates and waived monthly fees on eligible accounts.
Speaker 1
Because when your bank believes in your potential as much as you do, that's when real progress happens. That's the power of us.
Visit usbank.com today.
Speaker 1 Interest rates and annual percentage yields for variable accounts are determined at the bank's discretion and can change at any time. Deposit products are offered by U.S.
Speaker 1 Bank National Association, member FDIC.
Speaker 4 And now for some more Money Rehab.
Speaker 5 I found out he worked as a pharmacist in a hospital, like helping people.
Speaker 5 I found out that the man who had really helped create this site, which enabled so much, I would say, digital abuse against women, had a wife. He had a new baby.
Speaker 5
Like he was really living a double life. And we showed up outside the hospital.
We were able to call the floor he worked on, figure out exactly when his shift was starting.
Speaker 5 We were there the next day and we confronted him. And so it's been a pretty wild journey just to say we shouldn't live in a world where this type of thing is enabled.
Speaker 5 And it's interesting because when we confronted him, I knew we would have 30 seconds.
Speaker 5 I knew that he wasn't going to want to speak to us and I knew he would know exactly who I am because I had been reaching out to him for months before.
Speaker 5
And he saw me and he just started walking incredibly quickly towards the door. And I just remember asking for comment.
Legally, I want to ask for comment, right? We have all this evidence.
Speaker 5 I asked him, I said, I want to understand how someone who's a father and a son can create this type of thing that perpetuates this type of abuse.
Speaker 5 And as the doors were closing, I said, the harm is real. And did he say something? He wouldn't say a word.
Speaker 5 And I've interviewed like some categorically sketchy folks in my career, but I was really shaken by how he looked at me. And that was just part of our investigation.
Speaker 5 We did so many things to be able to really fan out. And we presented our findings to lawmakers around the world, started talking about why this mattered.
Speaker 5 And I think when this happened to Taylor Swift, I want to say like January or something of 2024. And I hated this thought, but I thought, well, maybe now people will pay attention.
Speaker 5 Like it's happened to one of the most powerful women in the world, which is horrific. And it shouldn't take this type of abuse happening to Taylor Swift for people and lawmakers to pay attention.
Speaker 5
But it did help, I would say, people be like, oh, this is the language behind it. This is why it's bad.
And I think we were able to speed up our investigation. And so it's been never a dull moment.
Speaker 5 And then I got pregnant and had a child in the process.
Speaker 4 But did that change how you viewed this and bringing a child into this world?
Speaker 5
It's a really good question. I think.
When we were initially out, I was thinking about it because we just had Mother's Day and I was thinking about having a child. And I remember thinking, like,
Speaker 5 if we are not careful, it's not just about the victims, right?
Speaker 5 We are going to train a whole new generation of abusers, of young men who grow up and think that I can nudify this girl from class in a couple clicks using artificial intelligence.
Speaker 5 And I think that actually was very much as I was thinking about wanting to have a child.
Speaker 5 Like, God, I remember, and this is probably way too much information, but when we were in the hotel room the day before tracking him, I was literally tracking ovulation.
Speaker 5 Like I was like, it was so top of my mind of thinking, what happens for our children? I just feel like we have to do better for them. And so it was wild.
Speaker 5 We went out there, and a couple months later, I found out I was pregnant. And this felt so personal to me.
Speaker 5
I just don't want my child to grow up in a world where people think they can control women and girls. It spreads out.
And we had a team of women in the field, which is pretty incredible.
Speaker 5
And the producer I was working with who worked on Mostly Human, which was my show at CNN, she was six months pregnant in the field. And we had this moment.
She still wanted to come.
Speaker 5
I was like, are you sure you want to come? She's like, 100%. I'm like, okay.
We're doing like car stakeouts. And she's literally six months pregnant at the time.
Speaker 5
And I remember we had confronted him at the hospital and he left through another door. Like he was able to get out.
He took some kind of car out because we were right near where he had parked.
Speaker 5
We didn't know where his home address was. I remember it feeling like a little bit of a dead end.
We came all the way out here. We wanted to get some answers.
Speaker 5 We wanted to ask for some kind of comment and understanding of how you could have created this thing that became so big without any accountability. I'll never forget.
Speaker 5 We were in the car and Roxy, who's the producer I was working with, she was like,
Speaker 5
because we had just figured out he was a dad because we had gone to his parents' home and we saw a baby seat, like a car seat in the car. And I was like, is Mr.
Deepfakes a dad?
Speaker 5
And she was like, let's call a local toy store and see if he's registered. Like pregnant Roxy is saying this.
And I'm like, oh, that's actually probably not a bad idea. And we ended up calling.
Mr.
Speaker 5 Deepfakes was registered there, I guess, for his child.
Speaker 5 And we were able to somehow get his home address from that. It took a lot of women thinking in the only way, I feel, a pregnant
Speaker 5
person would think: we're trying to figure out a better future for our children. And the reason I focus so heavily on Mr.
Deepfakes is because it's not just about Mr. Deepfakes.
Speaker 5 It's about the future of consent and bullying and being able to like create a better world for our children.
Speaker 5 And I think that was really personal to me because I was thinking about having a child during this investigation. Then I got pregnant and then I had a child.
Speaker 5 It's been a wild journey, but it makes it, I think, really meaningful that the site is now down. As of the last couple of weeks, the site was down.
Speaker 5 And I would say part of it was probably us showing up at his door, and other people beginning to understand who he was. It took people creating friction, Google deranking the site.
Speaker 5
So it took all this friction, but it was such a win, because I think so many people sometimes say, oh, it's a game of whack-a-mole. You take one down,
there's going to be so many others.
Speaker 5 And I just don't buy it. Do you know how many women are going to sleep better tonight because of this? And if it's like a game of whack-a-mole, we just whacked like a giant one.
Speaker 5 So that makes me sleep better. Yes, me too.
Speaker 4 Do you know if he had a boy or a girl? I don't know. It's messed up in both ways.
Speaker 5
I think he might have had a boy, but I'm not positive, from the little investigating we did. Which is just crazy to me.
And I might sound like a total crazy person now.
Speaker 5 I always try to understand the why. I think it's too simple to be like, you're just this terrible person and you've done this thing.
Speaker 5 I think it's actually in trying to understand the why that you find the more interesting reasons.
Speaker 5 He reminded me a little bit of Ross Ulbricht from Silk Road, the guy who created one of the largest sites on the dark web where illegal things were bought and sold.
Speaker 5
Ross very much had this libertarian ethos of this is kind of the future of the internet and all these things. I can't speak for David.
That's the name of one of the creators of Mr.
Speaker 5
Deepfakes, according to all of our evidence. I can't speak for the why, but I do think that it started as more of a hobby and an interest.
Deepfakes, and also porn, and all this stuff.
Speaker 5
And I don't know if there was just a lack of empathy, if maybe he didn't believe that the harm was real. I think that those walls closed in on him.
I think the stakes got higher.
Speaker 5 As the site got bigger, as people started talking about it more, and as more people started saying, this is really harmful. He never shut it down until a couple of weeks ago, when it was shut down.
Speaker 5 So I have no idea where he is now.
Speaker 4 Did he get fired?
Speaker 5
I don't know if they fired him. There was a report that he could potentially be overseas.
I don't know.
Speaker 4 Does his wife know?
Speaker 5
I had that question too. I mean, so I reached out to her after all of this happened and his name is out there and the site is down.
She hasn't responded.
Speaker 5 I did at one point show up at his parents' home. And I always think it's important for it not to feel like, oh, I got you,
Speaker 5 I'm going to get the bad guy. It felt sad. He grew up in a beautiful neighborhood where kids are playing on the street. I didn't get the sense his parents knew, but I don't know. You talked to them? I spoke to his father very briefly before he went inside, and I didn't want to, how do I say this, I didn't want to stay for too long and be harassing at all. There's always this fine line with reporters, but I never wanted to be that.
Speaker 5 We've seen that done without any empathy. And I'm not saying I need to have empathy for this, but I think empathy is the thing that we lack.
Speaker 5 In so many instances. It's the whole reason I think we're seeing a problem with sexually explicit deepfakes. People don't realize that there's real harm here.
Speaker 4 And he's still a person and a family.
Speaker 5
Yeah. I like to think that we showed up with a certain amount of empathy and being inquisitive without harassing his family.
I think I walked away feeling really sad.
Speaker 4 How did it affect you?
Speaker 5 I think I get frustrated sometimes because it's like,
Speaker 5 I thought for so long, this is why it's so big, right? It starts here, then it goes to schools, then it goes to democracy where we can make anyone say anything, and then it goes to conspiracy.
Speaker 5 So I always think, how do I frame this to different people? And I think sometimes it can get frustrating to have to say, no, it's not about the shady site.
Speaker 5
It's actually about safety and consent. And it's about a tech threat that you don't realize.
It's not what all the tech bros are talking about, which is AI becoming conscious and like Terminator.
Speaker 5 I'm like, no, no, no, this threat is already here and it's impacting your children.
Speaker 5 And I think sometimes that can be frustrating to me because I'm sometimes a couple of years ahead on this and I feel like I talk about it and people are like, huh?
Speaker 5
But I do think people are starting to understand, and I don't blame them. It's a weird one to wrap your head around.
But yeah, I think you have to divide it up in certain ways.
Speaker 5 I said this to my colleague this morning because we were speaking to a woman whose son ended his life after an AI sextortion. Someone using AI did exactly what I explained at the beginning of this.
Speaker 5
And he ended his life. And how old? I think he was like 14.
He was a teenage boy. And I said to Nicole, I'm like, because we're going into turbo mode, we're like, go, go, go.
Speaker 5 And I think sometimes if I sit, I'm like, man,
Speaker 5 like, I have a boy, right? That's so messed up. And I can't almost, it sounds whatever to say, like, I can't sit in it for too long, but I think feeling it is probably the most important thing.
Speaker 5 And how do we, as a tech, like for my company and like trying to like tell stories about technology, like how do we produce humanity and how do we produce empathy and just use tech as our way to do that?
Speaker 5 I think part of that is you have to feel it and you have to like not just be outraged, but be organized about that outrage and be able to tell that story and let other people tell their stories and see them.
Speaker 5
So I don't know. That's a roundabout answer to say, I think I do okay with it.
Good days, bad days.
Speaker 5 And I think it's weird when you have a child and you just look at a child who's so innocent and amazing, and you're just obsessed with your kid, and you're like, I don't want you to see this world. I want you to have the best world.
Speaker 5 Hold on to your wallets. Money Rehab will be right back.
Speaker 4 And now for some more Money Rehab.
Speaker 4
It's a reality, I think, that we're going to see in a few years, because you're always ahead on these trends. Like when you and I were growing up, guys still looked at Playboy.
Totally.
Speaker 4 And then they moved into porn
Speaker 4
online. And we've seen how that's affected men.
And so is the next generation going to be involved with user-generated AI porn?
Speaker 5
I think that's the thing that's so scary, which is... at least with Playboy, they consented.
There are all these issues that we think about when it comes to this.
Speaker 5 But now it's like the big thing. And one of the biggest questions about the future of artificial intelligence, and we're seeing this play out in Hollywood.
Speaker 5
We're seeing this play out like literally with writers. We're seeing this play out everywhere is consent.
Did you consent to have your materials uploaded? Did you consent to all of these things?
Speaker 5 And I think, so when we look at this through the lens of consent, it's should
Speaker 5 anyone be able to have the power to make anyone do anything without their consent? And I feel like this is like a no-brainer. The answer is no, but it's a wild west.
Speaker 5 And oftentimes by the time we're having the conversation, it's too late to have the conversation because in the time that Mr.
Speaker 5 Deepfakes has risen and also fallen, there are all these nudifying apps, right? There are all of these apps that have been popping up that allow people to do this with so low friction.
Speaker 5
You don't have to be high-tech to do this. It's just a couple clicks.
And now we're seeing the conversation around that. And thankfully, the laws are catching up.
Speaker 5 But the genie is certainly out of the bottle at this point.
Speaker 4
In that time, it sounds like this horrific story of a young boy killing himself came from another site. So you're playing this game of whack-a-mole.
There are obviously other moles.
Speaker 5 Yeah. And I think it's how do we educate parents now to say, okay, what are the conversations we need to have with our children so we can keep a really open environment?
Speaker 5
If something like this happens to them, they don't feel embarrassed. They don't feel like ashamed to come to us and say, hey, I received this photo and I didn't take it or I did.
Who cares?
Speaker 5
Being able to even be prepared for these types of things so we can get in front of what's going to be inevitable. You have these groups online that are now targeting folks.
And so it's like Mr.
Speaker 5 Deepfakes was just our way into talking about a deepfake world where we can't really believe anything we see, where our likeness is weaponized against us, where our most intimate qualities are mimicked by artificial intelligence.
Speaker 5
And that can seem scary, but the biggest thing for me, honestly, is giving people agency. There's also stuff we can do to get in front of it.
The idea that Mr.
Speaker 5 Deepfakes is down, there was so much friction created that like legally made it very difficult for this site to operate the way it was. That's agency.
Speaker 5
That's like saying, we're just not going to live in that world. We can actually make changes and AI can work for us.
It doesn't have to work against us. It's a tool.
Speaker 4
There's so many amazing things that AI can do. You're so into the tech world.
You've covered it for a couple of decades.
You know all the major tech founders. What about Meta, TikTok? What did they say?
Speaker 5
Some of these companies have done better than others, right? It's also like a closed model. So it's harder to have AI generate these types of images.
They have worked very hard against this, right?
Speaker 5 But some of these other open source models make it easier for this type of thing to happen. The thing that I've been obsessed with, I feel like this is my next thing, is as I started digging into Mr.
Speaker 5 Deepfakes and I was like, I want to talk to other women who have been victims of this and survivors of this. And I spoke to a woman recently.
Speaker 5 Her name is Bree, and she's hard to describe other than as joy in the morning. She is a local meteorologist outside of Nashville. She is loved in her community. I feel like she's a person who walks down the street and people hug her, because in news right now, the meteorologist is the least controversial figure ever, right? And they are in your basement with you when there's a tornado, telling you what to do. So she's really loved, and I got in touch with her after seeing her.
Speaker 5 She was trying to get a law passed in Tennessee after, all of a sudden on Facebook, on Meta, she would post something and then a fake Bree, a fake profile who seemed like her, would respond to her fans and say, hey, reach out to me on Telegram, more soon.
Speaker 5 And she had someone message her and say, I think your husband has been sending nude photos of you out. And she was like, no, he hasn't.
Speaker 5
And they were using deepfakes of her to make it look like she was nude.
And they would get her fans to go on a Telegram account and they would send this.
Speaker 5
And one of these scammers said, meet me at a hotel in Nashville, pay me like X amount of money. And here's a taste.
And sent these images of her. And she was literally shocked by this.
Speaker 5 And then another one reached out to one of her fans, got them on Telegram. They think it's her and said, join my VIP fan experience.
Speaker 5 There's another one that said, I'm in a terrible relationship and I can't get out. There's abuse, like lies.
Speaker 5 All of these are lies, but preying on her fans, utilizing AI and sexually explicit deepfakes. And they used an AI-generated video of her to say, no, it's really me.
Speaker 5 And all of a sudden, we started looking into it.
Speaker 5 And we worked with a security company called Remilio, and they did an analysis of how many fake profiles of her were out there: 5,000 and counting.
Speaker 5
So she was living this whole fake life on the internet where people were profiting off of her likeness. They were sexualizing her.
They were doing all this stuff.
Speaker 5
And she reached out to Meta many times with the profiles. And she told me the woman at Meta said to her, I don't know what Telegram is.
I was like, you know what Telegram is.
Speaker 4 Oh, like, you know what? I would love to look in your Telegram, by the way. Oh, my God.
Speaker 5
I've been talking to one of her scammers for weeks now. It's wild.
As yourself? As a fan, to try to understand.
Speaker 5
But I think the biggest thing is, we don't even realize it, but our identity has been taken. And AI is front row center here.
And we could be living these fake lives on the internet.
Speaker 5
We don't even realize it. Selling crypto,
selling sexually explicit deepfakes, all of these things, because it wasn't just her.
Speaker 5 She started talking about this and all these other meteorologists came out and said, This is happening to me. I realized it was happening to me.
Speaker 5 There are multiple fake Lauries out there selling crypto scams.
Speaker 4 Can we check if it's happening to me?
Speaker 5 Yes, 100%.
Speaker 5
This is like my latest obsession. I think we are living these fake lives out there, and I am sure the tech companies know.
They are, people are reporting it. I think they know about this.
Speaker 5
And I think this is the tip of the iceberg. Deepfakes are a way into talking about a whole deepfake reality where none of us are immune.
And we're just beginning to see that.
Speaker 4
Thank you so much for the work you do. I am officially scared.
And a lot of this extortion or sextortion is around money. They want money in crypto.
Speaker 4 And so we end our episodes by asking all of our guests for a tip that listeners can take straight to the bank. So how can you protect yourself?
Speaker 5 I would go back and say what we were talking about at the beginning, because it's a real tangible thing, this idea of a human password and also monitoring your accounts and making sure there aren't those small charges, right?
Speaker 5 If there's a small charge on your account, you're not sure where it comes from, like oftentimes this is what scammers will do.
Speaker 5 They'll try to see if they can get away with a little and then they'll go and charge a lot. So that's definitely one thing.
Speaker 5 And I think really trying to talk to people you love, tell your parents, tell your friends: these links that are coming up, these text messages that you are getting, these emails, you have to be 1000% sure before clicking and sending your information, because now they are personalized.
Speaker 5
These scammers are getting better and better. They make it seem high stakes.
And I hate to say this because I don't want to end it in a sad way, but like question everything.
Speaker 5 If you need to, if you're getting some stuff from the bank, call the bank.
Speaker 5 Actually, not the number from the text message they send, but look up your bank's number online and call it, or go in person, right?
Speaker 5
And triple check because these scams are getting really sophisticated. They feel very personal.
And they're coming from all directions.
Speaker 5 And I think being able to understand that is going to be really important for the future.
Speaker 4 Listen, I just got identity thefted, and so I'll just tease this: we're going after you, Mr. Identity Theft. We are coming for you. That's our next episode. Laurie's gonna find you. I'm going to your rack.
Speaker 4 I know. Look in your car. Yes, yes. So you're going down. Yeah.
Speaker 4
Money Rehab is a production of Money News Network. I'm your host, Nicole Lapin.
Money Rehab's executive producer is Morgan Lavoie. Our researcher is Emily Holmes.
Do you need some money rehab?
Speaker 4 And let's be honest, we all do.
Speaker 4 So email us your money questions, moneyrehab@moneynewsnetwork.com, to potentially have your questions answered on the show or even have a one-on-one intervention with me.
Speaker 4
And follow us on Instagram @MoneyNews and TikTok @MoneyNewsNetwork for exclusive video content. And lastly, thank you.
No, seriously, thank you.
Speaker 4 Thank you for listening and for investing in yourself, which is the most important investment you can make.