The Tea Breach Just Keeps Getting Worse
YouTube version: https://youtu.be/q17GRPq7K3o
Women Dating Safety App 'Tea' Breached, Users' IDs Posted to 4chan
A Second Tea Breach Reveals Users’ DMs About Abortions and Cheating
UK Users Need to Post Selfie or Photo ID to View Reddit's r/IsraelCrimes, r/UkraineWarFootage
Credit Card Companies Are Hurting the Future of Video Games
Steam Bends to Payment Processors on Porn Games
LeBron James' Lawyers Send Cease-and-Desist to AI Company Making Pregnant Videos of Him
Subscribe at 404media.co for bonus content.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Listen and follow along
Transcript
YubiKeys are made by Yubico, a company with Swedish roots and HQs and manufacturing centers in both Sweden and the United States.
YubiKeys come in a variety of small form factors with Lightning and USB-C connectors.
They're small, sturdy, and easy to use.
They stop phishing attacks and account takeovers before they start, using modern phishing-resistant multi-factor authentication, or MFA, which is just a fancy way of saying they're a proven security solution that cannot be bypassed by hackers or other malicious actors.
Unlike basic forms of MFA such as SMS, which is text messages, one-time passcodes, and mobile authenticator apps, YubiKeys help businesses of all sizes stay ahead of evolving cyber threats and regulatory requirements, protecting some of the world's largest banks, telcos and tech companies, critical manufacturers and energy concerns, and government agencies and non-governmental organizations.
YubiKeys are also great for regular people and everyday users.
YubiKeys help people protect their email, financial and social media accounts, password managers, productivity tools, developer tools, and so much more.
Learn more about how YubiKeys secure the applications, services, and accounts that people and businesses of all sizes rely on every day at yubico.com.
That's y-u-b-i-c-o.com.
Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds, both online and IRL.
404 Media is a journalist-founded company and needs your support. To subscribe, go to 404media.co, where you'll also get bonus content every single week.
Subscribers also get access to additional episodes where we respond to their best comments.
Gain access to that content at 404media.co.
I'm your host, Joseph, and with me are 404 Media co-founders Sam Cole.
Yep.
Emanuel Maiberg.
Hey.
And Jason Koebler.
Hello, hello.
Okay,
final reminder.
Last chance.
Literally, because if, I mean, if you're a subscriber, you get this early.
I think you get this on Tuesdays.
If you're a free listener, you get this the day after.
That's another benefit of subscribing.
You actually get it earlier than everybody else.
July 30th, Wednesday, 6 p.m.
at Rip Space in Los Angeles.
We are having a live event.
It's going to be
me, Sam.
Jason, and then we're also going to be joined, I believe, by Dexter Thomas.
Right, Jason?
Do you just want to explain what we're going to be talking about and sort of what we're doing briefly?
Yeah, so we're going to be talking about our reporting on the technology that powers ICE.
We felt that was appropriate to do in Los Angeles because a lot of this technology is being deployed in LA.
And so our friend Dexter Thomas, who
used to work with us at Vice and now has a
podcast called Kill Switch, but also is an independent journalist like us,
will be with us and talking about his reporting on the ground because he went to a lot of the protests and filmed at a lot of the protests in LA.
So we're going to do that for like an hour-ish,
and then the rest is going to be a party with the DJ and beer and wine and just hanging out.
And
it should be fun, should be a good time.
So if you're on the fence, please come.
It will be fun.
You can find tickets at bit.ly slash 404 rip space
or on our website.
And if again, if you're a subscriber, you get free tickets.
So on our website, there's a code.
Just scroll back to where we posted about the event.
You can find the code.
Yes.
All right.
That all sounds good.
Looking forward to seeing everybody there.
Let's get straight into this week's stories.
The first one, Emmanuel's the first byline, and I helped out on it as well.
But Emmanuel, the headline is women dating service app Tea breached, users' IDs posted to 4chan.
I guess, first of all, Emmanuel, how did you get this tip?
This was on Friday, I believe, and it was pretty fast-moving.
Can you walk us through when and how you got that tip and what happened next?
Yeah, sorry, I have to correct you.
You call it a dating service app, I think.
It's a dating safety app, which is
an important distinction.
It's like an app where women are invited to,
they thought, safely exchange information about
men
that they want to date or are dating.
And the way this happens, I believe this was Friday morning.
And
I just get a call on my phone, and I can see that it's to my Google voice number, which is kind of the number that I've shared previously for people who want to send us tips.
And I don't always pick that up, but I picked it up for whatever reason that morning.
And it was like a good Samaritan, I would say, somebody whose day job is, like, IT-adjacent.
And he sounded pretty frazzled and panicked.
And he was like, hey, there's something going on with 4chan.
You have to see it.
I'm sending you some links.
And I was like, I couldn't really understand.
But then I went to 4chan.
And by the time I got there,
it was obvious that this app called Tea had a major breach, and that people could get users' images. The app asks people to upload selfies, or previously photos of their ID, in order to prove that they're women, because it's an app for women.
And people could get their hands on thousands and thousands of those images,
some messages, and some other data.
We can get into all of that.
And not only was it clear that it was available, people were already making it available outside the app.
So it's like, when I got there, anybody could go there and get into the cloud storage.
It was a Google service that the app used to deploy the app.
People could, like, rifle through that.
Because it was exposed.
It was fully exposed, yeah.
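As a rough illustration of what "fully exposed" means here, this is a minimal sketch, with a made-up project name, of the kind of check a developer or researcher might run against their own Firebase project to see whether it answers unauthenticated requests at all. None of the URLs below are Tea's.

```python
# Minimal sketch: does a Firebase project answer reads with no credentials at all?
# "example-app" is a hypothetical project/bucket name, not the real one.
import requests

# Realtime Database: a world-readable database returns data (or null) instead of an auth error.
db_url = "https://example-app.firebaseio.com/.json?shallow=true"
resp = requests.get(db_url, timeout=10)
print("database:", resp.status_code)  # 200 with no credentials suggests public read access

# Cloud Storage for Firebase: a misconfigured bucket lets anyone list its files.
bucket_url = "https://firebasestorage.googleapis.com/v0/b/example-app.appspot.com/o"
resp = requests.get(bucket_url, timeout=10)
print("storage listing:", resp.status_code)  # 200 here means the object list is world-readable
```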
But people were already, like, archiving it and using the images and
mocking these women.
It was like
the vibe was like, oh my God, this is really bad.
And it's already like way too late.
It's like, as you know, Joe, you can discover a hack at several stages.
You know, it's like it could be a researcher and they disclose it responsibly and then the company closes it.
Or you can find it as a journalist because of a tip and then you tell the company or it's like it could be like a little known vulnerability or one that only certain hackers are exploiting.
This was like open season.
Everybody was in there
and kind of gleefully taking advantage of it and making fun of the users.
Yeah, I mean, one of the quotes, maybe from the original poster, I believe, yeah, I think it was them.
I mean, their direct quote was in all caps saying,
drivers' licenses and face pics, get the F in here before they shut it down.
It was, as you say, open season.
And it wasn't just a
quietly posted link.
It was people explaining to one another, hey, here's a script to rifle through the metadata of the files.
It's just a series of attachments and that sort of thing.
And then people were using scripts to download those images in bulk and then making them available.
And of course, I ended up downloading some of these, or I think all of them.
I downloaded the entire dump once it was made available so I could verify.
And
I mean, it's a lot of data.
It's a lot of images.
And Tea later confirmed it was tens of thousands of people's selfies and their identity documents.
Just to back up a little bit, Emmanuel explained what the app is, but Sam, you've covered these Are We Dating the Same Man Facebook groups?
And this essentially
is the appified version of this, or it came from this.
Is that fair?
And can you tell us sort of what those groups are a little bit more broadly?
Because it's kind of the same thing, right?
Yeah, I mean, I didn't get to use Tea before it went down.
So I don't know exactly what, like the, if it's exactly one-to-one, the same thing, but it's the same idea.
It's definitely like,
so the are we dating the same guy groups, women would post a picture of a guy usually, or maybe a description of a guy, but usually a picture and say, hey, I'm going on a date.
For example, I'm going on a date with this guy tonight.
Does anybody have any red flags?
And it's like red flags are like the code for like, don't go or like, we have information for you.
On Facebook, this was happening with full names, obviously, people attached to it.
So
more risky, but the groups were closed.
So ideally, you wouldn't be able to get in if you were a bad actor.
But
yeah, it's like a vetting thing where like, you know, you could say, oh, yeah, this, this guy I've been on four dates with.
Does anybody else know him?
And someone's like, that's my husband.
You know, it's like, that's an extreme case, but like, that does happen plenty.
And that's the idea with the are we dating the same guy ones.
And then, and men get super, super pissed about the existence of those
groups themselves and just are enraged that they're on them.
And then obviously, I think a lot of that rage is what we saw happen with the Tea hack.
Yeah, I mean,
can you just briefly elaborate on that?
And I know I'm not really asking you to put yourself in the mind of a 4chan user necessarily, but you see, I mean, it's an obvious question, but I'm going to ask it anyway.
You see a correlation there between some men getting very, very angry when...
say their face is posted into one of these Facebook groups and
4chan users rifling through this database.
Is that sort of like one and the same thing almost?
Obviously, one's a lot more extreme than the other, but is it sort of the same sort of behavior?
What do you think?
Yeah, and I don't, I mean, it's just different.
It's not really more extreme.
I mean, what's extreme is, like, guys suing these groups and, like, the administrators of these groups on Facebook and saying, you know, I'm suing you for libel or whatever, like defamation of my image, which they've done,
which is obviously
a pretty serious reaction to having your picture put on Facebook by someone else.
But yeah, I mean, we saw it like with Tea in general, people were like, Tea went a little bit viral earlier in the week, last week, and people were talking about it, even though it's been around for a while.
And then a lot of people were posting these, like, satire, or maybe they were real, I don't know, fake apps that were like the flip side, really being derogatory toward women and saying, we're gonna put, you know, your faces in here.
And after the Tea hack we saw, like, Tea Spill, which is like a Hot or Not game based on the hacked images.
It's all, um, proving the point a little bit. It's like, yeah, you know, women were warning each other about you, and this is your response, to be, you know, horrible back at them.
It's just like, that's, it's kind of like, okay, no wonder.
But that was, I do feel like it's, it's all kind of connected.
And it's all coming from like this online dating world of like, you know, guys are
dangerous to go on dates with very often.
Women are, you know, the victims of violence on online dates quite often.
So
it's understandable that this exists, but the hack is definitely, I feel,
kind of a response to that.
Yeah.
And that website you mentioned, it was, somebody had taken those exposed tens of thousands of images, and you're shown two images, I think, and you have to select one based on perceived attractiveness, like
as you sort of hinted at, almost like the very early Facebook site from Zuckerberg, right?
And that's got tens of thousands of rankings or something like that.
So already the information is being abused in various ways.
So that all happens.
There's the media firestorm.
We reveal and first report this data exposure.
Then it gets worse.
And the headline of this one is a second Tea breach reveals users' DMs about abortions and cheating.
Jason, it was actually you that got this tip.
How did that come about?
Yeah, so a researcher who we had done a story with back at Motherboard, like all the way back in 2016,
reached back out to me and was like, it's not just this initial hack
or this initial like exposed database.
The actual like main database of T was exposed as well, which included,
you know, like
presumably or seemingly all of the DMs, including things like usernames.
You know, it was searchable to some extent.
So they were able to show us there was like women talking about abortions that they had sought or that they had had,
talking about, you know, cheating situations, like really intense
situations that they had with their partners.
Like it was very, very, very sensitive information.
And so,
you know, since you and Emmanuel had been reporting on this for a while, I passed it off to you to actually do the reporting and confirm the story because you're already very deep in it.
But basically, this person is really good at decompiling information and sort of found that there was an exposed database, like a further exposed database that made the hack like way worse than we initially thought.
Yeah.
Whereas the first one was a Firebase instance.
Again, as Emmanuel was saying, Firebase is this, like, app development platform by Google.
And it seems that you didn't need any sort of real authentication to go in and get those selfies and ID photos, that sort of thing.
The way it was described to us by this researcher for this second data exposure was that any user's API key, you know, you make an account with your username and password or whatever, and you're given an API key.
That's just how apps generally work, right?
The way it was described was that any user's API key could query the entire database.
So basically you had almost like admin level access, even though you're a random person who just downloaded the app.
And that's not good, obviously.
That's really, really not ideal when anybody can access all of that sort of information.
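To make that concrete, here is a toy sketch of the difference between an endpoint that checks whether the requesting user actually owns a record and one that returns whatever it is asked for. This is not Tea's code; every name and record here is invented for illustration.

```python
# Toy illustration of the access-control flaw described above; all data and names are invented.

MESSAGES = {
    "msg1": {"owner": "alice", "text": "hello"},
    "msg2": {"owner": "bob", "text": "private stuff"},
}
API_KEYS = {"key-alice": "alice", "key-bob": "bob"}  # each user gets a key at signup

def get_message_broken(api_key: str, message_id: str):
    # The key only proves "you are some user" -- any valid key can read any record.
    if api_key not in API_KEYS:
        raise PermissionError("invalid key")
    return MESSAGES[message_id]

def get_message_fixed(api_key: str, message_id: str):
    # The server also checks that the record belongs to the requesting user.
    user = API_KEYS.get(api_key)
    if user is None:
        raise PermissionError("invalid key")
    record = MESSAGES[message_id]
    if record["owner"] != user:
        raise PermissionError("not your message")
    return record

print(get_message_broken("key-alice", "msg2"))   # Alice reads Bob's DM: the flaw
# get_message_fixed("key-alice", "msg2")         # would raise PermissionError
```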
So the researcher sent us over all these screenshots, and they were very interesting.
But we need the data, you know, to verify what is going on here.
Of their own accord, they had already downloaded this information,
sent it over to us.
And
kind of selfishly, I very much enjoy these like reporting puzzles of we have this data and we have to figure out and verify and prove it came from a certain service.
For the first one with the
driver license photos, I downloaded the APK, I decompiled the Android app, and I found that, oh yeah, that exposed database in there, the same URL is in the app.
So that was pretty good verification.
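For anyone curious how that kind of check works in practice: an APK is just a zip archive, so even before running a full decompiler you can scan its contents for embedded URLs. This is a generic sketch with a placeholder filename, not the exact steps used for this story; a proper decompiler such as jadx or apktool gives much cleaner results.

```python
# Rough sketch: scan an Android APK (a zip file) for embedded Firebase-style URLs.
# "app.apk" is a placeholder path, not any specific app.
import re
import zipfile

URL_PATTERN = re.compile(rb"https://[\w.-]+\.(?:firebaseio\.com|appspot\.com)[\w/.-]*")

found = set()
with zipfile.ZipFile("app.apk") as apk:
    for name in apk.namelist():
        data = apk.read(name)
        for match in URL_PATTERN.finditer(data):
            found.add(match.group().decode("utf-8", errors="replace"))

for url in sorted(found):
    print(url)  # compare against the URL of the exposed database
```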
For this one, with the more than 1 million messages, we went through,
did get some phone numbers,
texted some alleged users in there.
One eventually got back to me and confirmed, yes, I am a user of Tea, but that was actually after we published.
How we verified this one was that we took the usernames
from the million messages, not all of them, just a random snapshot.
And then I tried to make accounts on Tea with those usernames.
And in every single case, that was not possible because that username was already in use, indicating that, yes, these million messages have actually come from the Tea app.
So whenever we get data like that, and whenever we can verify it like that, I'm always supremely confident
in the veracity of what we've got.
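The logic of that check can be sketched roughly like this. The endpoint, parameter names, and response format are hypothetical stand-ins; the point is just sampling usernames from the leaked data and seeing whether the service reports them as already taken.

```python
# Hypothetical sketch of the verification logic described above.
# The endpoint and response shape are invented placeholders, not Tea's real API.
import random
import requests

def username_taken(username: str) -> bool:
    # Placeholder availability check; a real check would use whatever signup API the app exposes.
    resp = requests.get("https://api.example-app.com/username-available",
                        params={"username": username}, timeout=10)
    resp.raise_for_status()
    return not resp.json().get("available", True)

def verify_sample(leaked_usernames: list[str], sample_size: int = 20) -> float:
    # Take a random snapshot of usernames from the dump and see how many already exist.
    sample = random.sample(leaked_usernames, min(sample_size, len(leaked_usernames)))
    taken = sum(username_taken(name) for name in sample)
    return taken / len(sample)  # a ratio near 1.0 suggests the leaked data matches real accounts
```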
Jason touched on this, but Emmanuel, you also went through
some of the messages and we didn't quote really any
directly.
Can you just explain a little bit more about why we didn't do that and maybe just a bit more on the sensitivity of these messages that you saw when you were scrolling through?
So we didn't quote anything because
it is possible to word-search the data that we got.
And
I was trying to explain this to my wife because I was talking to her while we were reporting the story, and she was like, Who hacked into Tea?
And that's an understandable question, and I guess the answer is like, nominally, I don't know, some bad people who have 4chan accounts, but that is not the question, or like that is not the problem.
It's not that somebody, the story is not that somebody broke into Tea, it's that, I don't know, if Tea was a bank, they just, like, left the vault door wide open.
And that's the real
offense here.
And because of that, we don't know who else got their hands on this information.
And we don't want to give specifics because we don't want to make that stuff easy to find.
And we don't want to have that stuff easy to find because
my poking around the messages, it took me like two minutes to identify someone because people are DMing each other.
Like that's the kind of conversation that's in the data.
And they're being
very real and they're sharing real names and phone numbers and social media handles.
So somebody is talking about the person they're dating being someone else's husband and them cheating.
And some people were talking about abortions.
And it just, it was
incredibly easy to
find those people in the real world, just given the context of the conversation.
Yeah.
And
so we do that.
I contact Tea for comment about the exposed direct messages on Saturday.
I tell them explicitly, hey, this research has found this.
Also found apparently the ability to send push notifications to more than a million people,
which is kind of crazy.
That's like, we don't even, we barely ever mention that, but that's also wild.
And I contact Tea, as I said, on Saturday.
They don't comment specifically, just, like, we're continuing to investigate, and we're not going to share more information at this time.
We then publish on Monday, and then very soon after,
they make a post on Instagram saying, Oh, we've just learned actually the direct messages were exposed, so we're turning off DMs now, and then sent that statement to CBS News or various other people as well.
To be clear, they had known since at least Saturday.
And
also to clarify, the researcher said their access to that database was cut off sometime late last week.
If the access was still live, at least to our understanding,
you know, we may not have published at that time.
We don't just like find a vulnerability or get told about a vulnerability and then go, okay, cool.
And then just publish an article because that's going to potentially actually lead to more data exposure.
To the best of our knowledge, it was closed.
And I think Tea just turned off DMs as an extra
precaution as well.
Very briefly, just last thing, Emmanuel, just before we were recording, I think we just published about a class action lawsuit.
Do you just want to mention what that is briefly?
I feel like that just happens now, right?
That's just normal.
Yeah, unsurprising, I think, but
a law firm that specializes in data breaches has reached out and told us that they filed a class action against Tea.
And
yeah, I think, that doesn't guarantee that it will go anywhere or that they'll be successful.
But I'm not at all surprised that the complaint has been filed.
They're expecting other complaints to be filed, and they hope to kind of take the lead on that and have all those other lawsuits join them in the class action.
Yeah, very, very standard.
All right, we'll leave that there.
I'm sure we'll continue to cover Tea,
even though none of us had heard of this app until last week.
It's a really significant data breach.
So we'll definitely be following that.
We'll leave that there.
And we'll be right back after this
to talk about.
I mean, it's complicated.
You're just going to see.
Okay, we'll be right back after this.
We all spent years working for a big company, where we had no control over the business, its priorities, or whether we'd have a job the next week.
We all took a big step by striking out on our own, and now we couldn't be happier.
We thought starting our own business would be overwhelming and confusing, but we found smart tools like Shopify, which have made things easy along the way.
Shopify is the commerce platform behind millions of businesses around the world and 10% of all e-commerce in the U.S., from household names like Mattel and Gymshark to brands that want to be household names like 404 Media.
Shopify has got you from the get-go with beautiful ready-to-go templates to match your brand style.
Their easy-to-use backend helps you manage your store's inventory and makes creating an attractive shop for your customers really simple.
They also help you find your customers with easy-to-run email and social media campaigns.
And if you get stuck, Shopify is always around to share advice with their award-winning 24/7 customer support.
Turn those dreams into reality and give them the best shot at success with Shopify.
Sign up for your $1-per-month trial and start selling today at shopify.com slash media. Go to shopify.com slash media. That's shopify.com slash media.
This limited edition Inbound, September 3rd through 5th, brings attendees to San Francisco for a one-time-only West Coast event with insights they won't find anywhere else.
Just revealed: the Inbound 2025 agenda is now live, from the agent.ai workshop, From Idea to Agent, to Dwarkesh on AI's Future, research-backed, bold predictions with Dwarkesh Patel. Explore 200-plus sessions, all created for your growth.
Get fresh perspectives on innovation from a dynamic lineup including Sean Evans, the host of Hot Ones, creative force Amy Poehler, tech reviewer Marques Brownlee, and AI pioneer Dario Amodei.
They'll bring their unique approaches and expertise to Inbound 2025.
Cut through the noise with focused, actionable takeaways on the latest marketing, sales, and AI trends that give businesses a competitive edge in today's rapidly changing landscape.
Network with decision makers in San Francisco's AI-powered ecosystem, where innovative technologies are creating entirely new approaches to business.
Experience firsthand how San Francisco's technology ecosystem is revolutionizing content creation, distribution, and monetization through AI and innovative tech solutions.
Secure your spot at inbound.com/register.
And if you want the VIP experience at Inbound, don't wait.
VIP tickets are almost sold out.
Get exclusive perks that help you make the most of every moment, including a welcome party, exclusive networking opportunities, and early registration access to limited capacity sessions.
Registration for limited capacity sessions begins in August.
Don't miss this opportunity to secure your ticket in advance, start favoriting your must-sees, and be ready when reservations open.
Session reservations for VIP ticket holders begin August 5th, and GA opens August 12th.
All right, and we're back.
Honestly, okay, I'm going to read out the first headline, then I'm going to go to Sam and ask her about the UK age verification law.
And then I think Emmanuel has some sort of weird diagram that he's going to like describe or something.
And maybe we'll upload it on the show
or something.
But okay, so the headline is: UK users need to post selfie or photo ID to view Reddit's r/IsraelCrimes, r/UkraineWarFootage.
This is about the UK's new age verification law and some unintended but maybe foreseen consequences of that.
Sam, what is this law that just passed in the UK about age verification?
Yeah, so this passed or it went into effect last week, which is why we're talking about it and everyone's talking about it this week and last week.
It's called the Online Safety Act.
It's really similar to a lot of the age verification laws that we've talked about on this podcast a thousand times and written about quite a bit in the US.
Basically, it's like a protect the children type law.
It requires, so it does a handful of things, including, like, adjusting algorithms so that kids can't see things organically in their feed that would be harmful, or, like, considered harmful, and a bunch of different, like, regulation requirements for platforms.
But the big one is that it's requiring platforms, to keep operating in the UK, to implement age verification to check whether users are 18 and over.
So far, we've seen that mostly look like selfies or IDs, which is very coincidental,
I guess, considering what we just talked about, that all these platforms will be holding IDs or, you know, I think in most cases, third parties will be handling the verification.
So on Reddit, it's like something called
Persona, I think.
And there are a bunch of different
third-party age verification platforms that will do this.
But like at the end of the day, you're going to have to show, and you currently, if you're in the UK and not using a VPN, you have to show that you're 18 using
a valid driver's license or some kind of like a government issued ID
or like biometric data.
So it would like scan your face and determine whether or not you're 18 or like use these things in conjunction with one another.
And not complying is like an £18 million fine or something.
It's huge to not comply with this.
It's not a risk that platforms are going to take.
So already we're seeing, like, lots of different subreddits, which we'll get into, going, um, behind, like, an age verification wall.
Um, porn sites that want to comply with the law are doing it this way.
Um, just certain Discord communities are requiring age verification.
I think Xbox just announced today that they're gonna
start
doing this too.
It's just like a little sky.
Yeah, the gamers, yeah.
Yeah, which actually is, I would assume, probably
quite a bit of harmful content.
Roblox.
Minecraft, not the best places all the time.
Yeah, exactly.
So, yeah, that's in a nutshell what it is.
And obviously, it has these like massive repercussions that I'm sure we'll get into.
Yeah, so
it's mostly, I mean, in my eyes, it's mostly about porn.
It's mostly about online pornography, sites like Pornhub, that sort of thing.
But then as Emmanuel's story gets into, it is impacting all these other
websites.
Emmanuel, how do you want to do this?
Do you want to talk about this Reddit one first, then get into the payment processors?
Is that what you want to do?
Here's what we'll do.
Good, because I don't want to.
Yeah, no, I don't either.
It's really
complicated.
I asked, I don't usually feel very strongly about what we talk about in the podcast, but I really wanted to talk about this because
I don't know.
I wanted everyone to check in and I want to see how
everyone else is thinking about this.
I find this to be one of the most complicated
subjects that we cover and I kind of switch how I feel all the time.
And like surprise, surprise, censorship and platform governance is like a very complicated subject.
We all know that.
It is changing and evolving now in a way that we talked about for years but is now actually happening. And it's just a mess. It's a huge mess.
So, Sam, I don't know if you saw in the podcast room,
I just like posted this word cloud of like all the different entities that are involved in this.
So,
well, I guess I'll back up.
It's just, like, whiteboard craziness.
Yeah, so it's like to back up, there's like just to like run through a few things that have happened in the past like month or so.
We talked about Civitai, this AI model sharing platform that was used to create non-consensual pornography.
They got pressured by
credit card companies to change their policy in a way that completely changed the nature of the platform and remove a bunch of those models that we found were really harmful.
Then a few weeks later, Steam, which is like the
default way of buying PC games online, they changed their policy.
They said explicitly it was to come in line with what credit card companies want, and removed a bunch of sex games on Steam.
Steam, in case you didn't know, for years has allowed like sexually themed games.
And there's been a lot of spam and like low-quality games flooding the platform every day since they've done that.
And they didn't remove all of it, but they removed a bunch of like incest-related games
and like very violent, very graphic
games.
And that happened.
And then I think later that week or the week after, itch.io, which is this huge platform mainly for, like, independent game developers and students to share their work, because it's easier to upload your games there, and you can also be more flexible about how you charge for the game.
So, you can charge nothing, and you can decide what the split is between what you make and what itch.io makes as a platform.
And they just like took this really radical action, probably because the credit card companies were threatening to shut them down any minute and just like de-indexed all their not safe for work games, all their sex games, made a few of them unavailable in a way that people found really shocking.
Like if you're in this indie dev game
community, a lot of your favorite games, award-winning games, fall under those not safe for work categories and they were like disappeared from the platform.
And that really rocked people.
And now
that this law came into effect, Reddit is also forced to use age verification because of this law in the UK.
And
as you said, Joe, mostly people think of it in terms of pornography and like not letting kids access that type of content.
And the way they do it is, it's like, because of this law, it's Reddit's responsibility as a platform to verify that every user who accesses that type of content is an adult.
And in order to do that, they have like an age check, which, much like the other story we just talked about, you upload a picture of yourself or a picture of your ID.
And this company called Persona verifies that you're of age.
But it doesn't only do that for
subreddits with sexual content, but like anything that is mature, which can be any subreddit
that is about the news, but in a very graphic and like immediate way.
So, as you said, Israel Crimes,
which is a community that mostly focuses on, you know, war crimes that Israel is committing in Gaza, that has an age check.
And that subreddit is filled with, like, very graphic, horrible content, and, like, videos and images of real people dying.
But it also has like normal discussion about the politics of this and why it's wrong, and people organizing and just like sharing their opinions about
why they think this is wrong.
And also, like, normal news articles, right?
It's just a community that has this perspective that is willing to show that type of content.
That now, in order to see it in the UK, you have to jump through all these hoops and potentially jeopardize your privacy in order to participate.
That's kind of like a few things that have happened recently.
And
as you can see, like
I oscillate between,
you know, we're in the media,
we're our company is very focused on getting impact.
So when we expose that Civitai is enabling this like really bad behavior, and credit card companies respond by saying, Hey, you have to change your policies, or we're going to not work with you in a way that will completely end your business, I would consider that like a positive impact or like a good result.
But when the same exact mechanism is used to nuke,
you know, thousands of
games that are people's personal art and expression of who they are and, you know, things that I enjoy that don't intend, or I think you could even reasonably argue cause harm to anyone, right?
Those are nuked by the exact same mechanism and sometimes by the same interest groups.
I think that is awful.
And it's just like all these platforms are forced to make all these decisions right now.
And
I think some of them are probably positive.
A lot of them are horrible.
And it's just like it's just a very complicated
landscape.
I think it's a very complicated landscape, but I think that
this like legislation in the UK and the age verification laws that we've seen in states in the United States about porn are
pretty like definitively censorship
and
are not the type of intervention that you want to see from the government.
I think
Sam has written a lot about this.
We've all touched it in some way.
I think Mike Masnick at Techdirt has done, like, really good work on this, but it's like using a hammer to fix something where you would prefer a finer-tooth comb, to mix my metaphors there.
But it's just like, it's a, it's a super messy thing and it fundamentally like undermines the idea of having a free and open internet.
And then, as you said, you have like payment processor and credit card companies putting pressure and stress on the entire situation.
And you also have a lot of these
like anti-porn non-profit type groups that are putting pressure on the credit card companies and that are lobbying for a lot of these laws.
And especially like in the U.S., in some of these states, these states are like pretty captured by one political party.
And therefore, it's like,
it's pretty easy to push through some of these laws.
And it's like you have states that are
essentially adding this censorship layer to the entire internet without like understanding what they're doing.
And then, or maybe they do understand what they're doing, but they don't care.
And
then on top of it, it's like you have tons of like VPN companies.
It's like VPN downloads in the UK are through the roof.
So people like do find ways around this, but it's very similar to what you'd expect from like authoritarian regimes.
It is very similar to like,
I went to Indonesia last year
and there were many, many, many websites that I could not access without a VPN.
And there was almost like no rhyme or reason to which websites were blocked and which were not.
And it's like, I believe it was an anti-porn law, but
Reddit was blocked, 404 Media was blocked, like random things were blocked, and it was very hard to tell what was blocked and for what reasons.
And then
we haven't talked about this yet, but it's like
the sites that are complying with this
are
adding a layer, like an ID verification layer that
Sam has written about, I've written about.
There's like a bunch of different companies that are offering these services where you have to upload your ID to access these different websites.
And it's like, there's a variety of different ways that this is being implemented.
And so
like many of the services that are offering age verification services say that they are deleting your IDs after like a certain amount of time, or they say that they're encrypted, or they say that they maybe delete them immediately after verifying who you are and your age, and that sort of thing.
But it's like, we don't know.
Like, there's so many different
services that do this, and different websites are picking which services they're going to use.
And it's like, we just spent a half hour talking about Tea and people's IDs being leaked on the internet.
And it's very easy to imagine a future where one of these ID verification companies gets hacked or where their security isn't perfect.
And, you know, pretty sensitive data ends up on the internet.
So
I think that the problem that they're trying to solve is a very difficult one.
And one of the reasons that it's gone unsolved for so long is because
the like quote-unquote solutions to it are often worse than the problem itself or like create more complicated
situations that undermine the idea of like having a free internet.
I think,
first of all, Persona, which does this for Reddit, says they keep your images for seven days,
which I guess is better than keeping them forever, but it's not as if nothing can happen to that data in seven days.
And it presumably could be millions of images.
So, another way in which, like,
I think this is very complicated is: I wrote this story about Reddit, and I was like, hey, I don't even vouch for these subreddits, and I don't think that they are necessarily, like, the most productive places in the world or anything, or at least I can't say that they are.
It just like it did not seem positive or good to me that now, in order to see that stuff, whether you're
a minor or an adult,
you have to show your face to Reddit or show your ID to Reddit.
That seems like a hurdle
that overall has the effect
of making
the news
more cleaned up than it is in reality.
And people were responding to me on Blue Sky being like, What do you think?
You think kids have to watch other dead kids?
And it's like, obviously, no.
Obviously, I don't think you have to force kids to watch like
frontline footage from the war in Ukraine.
That's crazy.
And I'm not saying that in the article, but that stuff is going to be harder for anyone else to see.
And sometimes that is the stuff that, like, radicalizes people or makes them change their position, or, like, that historically, you know, how Americans felt about Vietnam, how Americans felt about the Iraq war, how Americans felt about the Holocaust.
A lot of that had to do with what kind of images were in their heads.
And, like,
policymakers in the UK just decided for their citizens that, like, that stuff is going to be harder for them to access.
I don't know how
you
police that.
I don't know how Reddit should
manage that, but it seems clear to me that while the problem is real,
this kind of policy
is not the solution.
We're trying to solve a problem
with the wrong tools.
Also, sorry, I wanted to like, the whole thing reminds me of our journey with
Traffickinghub and Exodus Cry.
Sam, I don't know if you wanted to talk about that, where it's like,
we were reporting about Pornhub for so long that I think they
and
organizations like NCOSE thought that we were, like, allies or that we had the same interests, but then our reporting shifted from reporting on Pornhub to reporting on those organizations.
Did you want to talk about like
well, maybe who, maybe who those organizations are?
Because I don't think everybody is.
Which are involved in the Steam thing, by the way.
It's like, Exodus Cry is, like, behind some of the activism that led Steam and itch.io to change their policies.
Yeah.
Yeah.
I mean, these are religiously affiliated, current or past
conservative, I would say, groups that hold up anti-trafficking as their, like, mission, and that's how they get nonprofit status, is they want to,
you know, save trafficking victims.
Um, but the problem is they define trafficking as porn, all porn, all sex work.
Um, anyone in the adult industry is like a victim of self-trafficking.
Um,
and they're the ones behind a ton of this pressure that gets put on payment processors.
They're the ones behind a lot of the pressure that gets put on politicians, and the pressure needed there is, like, a pinky-finger push.
It's like all you have to do is slide a bill in front of a Republican politician that says save the children and they'll sign it immediately and not read it.
That's something that we know for a fact.
A lot of these bills don't get read before they're voted on.
So yeah, they're the ones that kind of are a
force for a lot of the changes that you see.
And now at this point, it's like the administration is like welcoming this type of rhetoric in very actively in the US.
I guess in the UK too, I don't know a ton of UK politics, but it's a very much one-to-one
comparison there.
What happens there and what happens here are kind of in tandem.
So if you consider all that as being like, this is like, you know, extremely well-funded lobbying groups whose full-time job is to moralize what we do on the internet.
I don't know if, like, I don't, I don't particularly know if like the CEO of MasterCard
really gives that much of a fuck, but like the lobbying groups do, and they have a lot of money and a lot of pressure.
And
people behind these campaigns,
I think, like, I guess it's funny that everybody's talking about this right now because it's something that sex workers have been talking about for seven years
to 30
if you take the long range.
I think the way you know that this isn't necessarily like we talk about unintended consequences, it's like,
I don't know, it's like, whether they intended these consequences or not, I think, is beside the point.
I think
we know that
the
actual, like, meaning and, like, purpose of a lot of these bills is not necessarily to protect kids, because we know for a fact it doesn't work.
Like, there are studies on this.
There's research behind this.
It doesn't work.
We see it happen every time.
People just use VPNs.
People go to worse and worse, more dangerous sites.
What we know works, governments refuse to actually put any power behind it.
It's like, we know that device-based age verification, like parental controls, works.
We know that kids probably shouldn't be handed an iPhone completely without restrictions,
you know, at age three or whatever.
That's probably a recipe for disaster.
And yet we do it every day.
We know that, like, sex education and, like, age-appropriate discussions about consent work, as far as media literacy and also
just like understanding what you're consuming on the internet, what you're seeing.
But there's no push for any of that in any level of government.
It's just these
conversations about platform regulation, which I've always thought is
the wrong way to go about it.
Platforms are motivated by profit and engagement.
It's not their duty to parent your kids.
But now the government's involved and now they're going to crack down on content that like legal
adults should be allowed to access without a problem.
But because
there are a lot of kids in these countries without any supervision online,
this is what we have to deal with, which I think sucks.
I think the stuff going on with
Steam and
itch and
that you have to show your ID to be on Bluesky in the UK is probably not good.
Yeah, I think to expand on that a little bit, it's like how these things play out in practice is, as Sam said, it's like if you are in a state that has an age verification law right now, it's like Pornhub is blocked because Pornhub has decided to block itself rather than comply with these laws.
And so, you can't access Pornhub in a lot of states in the U.S.
And so, people in these states either use a VPN or they use other websites that simply don't comply with the age verification law.
And like a lot of those sites are based in places that have like
very poor laws around things like copyright.
Like a lot of it ends up being like pirated content, like quasi-legal content.
Like who knows what's going on?
And we know that this is happening because like
If you look,
like, I've seen conversations on Reddit, on other places, like, hey, can you share this without a Pornhub link?
For those of us in Texas, like, can you give us a link to XVideos or, like, a different website that, you know, is not complying with the law?
And so that's going to happen in the UK if it hasn't happened already.
And then
even, even in places that have like really authoritarian governments like China, China has a law or had a law.
I don't know the current state of it and like the specifics here might be slightly wrong.
But basically, they were trying to limit how much children were gaming.
And so I think they had like a one hour a week gaming limit for kids.
And what ended up happening was kids were taking their grandparents' IDs and they were just using them to log into, you know, the game server or whatever.
And so you had these like 80, 90 year old people playing like dozens of hours a week of different video games.
It's like, that is probably going to happen.
It kind of rocks.
It's like people are going to find ways around it, first of all.
Second of all,
you know, this is sort of, I just want to stress again, it's like, it's not just blocking
like
distasteful, violent, whatever, like news because the world is a bad place and there's bad news, but it's also blocking like consensual porn that adults want to access.
And
Like, there are reasons why an adult who wants to look at porn might not want to upload their ID to a random company, so that random company can tell Reddit or Pornhub or whatever that this person is an adult.
Like
there are many, many, many people who just like don't want to do that.
And
it's a similar problem to the ones with like Facebook's real name policy and things like this, where it's just like
anonymity is important on the internet.
It's been important on the internet since the beginning of the internet.
And we are, like, kind of just throwing that away because people have gotten better at lobbying, essentially.
I think
to crystallize what makes me feel like yucky about the whole situation is like our journey.
It's just as a dad,
as a father.
No, not even close.
Okay.
We spent so long reporting on how bad Pornhub is.
And at some point around 2020, I guess, a bunch of politicians and interest groups were like, you're right, Pornhub is bad.
Let's ban porn, right?
And we're like, no, like that, that wasn't the point at all.
And now that we spend a bunch of time talking about, like, non-consensual AI content on the internet, you know, the UK is like, so let's age-verify the whole internet. Or we spend a bunch of time talking about people livestreaming mass shootings on Twitch, so it's like, oh, age-verify that, or don't allow violent content on Facebook. It's like, no, that's not what we're saying. So it's the way that our reporting is being leveraged to justify these, like, puritanical, censorious politicians and interest groups that really, like, rubs me the wrong way.
And I guess all we can do is just like keep reporting what is actually happening.
I don't know.
And like, we never have, but, like, never advocate for these types of solutions, like these overbearing and terrible solutions.
I think a lot of the time people assume, you know, we hear this a lot about like our AI reporting.
People assume that that is what you want.
You know, it's like, oh, you want to like censor Twitter or you want to censor like social media.
It's like, no, not at all.
Yeah.
I think that's a really, really good clarification.
You're right.
Okay.
We will leave that there.
If you're listening to the free version of the podcast, I'll now play us out.
But if you are a paying 404 Media subscriber, we are going to talk about LeBron James and how he is not pregnant, as far as I know.
You can subscribe and gain access to that content at 404media.co.
As a reminder, 404 Media is journalist founded and supported by subscribers.
If you do wish to subscribe to 404 Media and directly support our work, please go to 404media.co.
You'll get unlimited access to our articles and an ad-free version of this podcast.
You'll also get to listen to the subscribers only section where we talk about a bonus story each week.
This podcast is made in partnership with Kaleidoscope.
Another way to support us is by leaving a five-star rating and review for the podcast.
That stuff does really genuinely help us out if you could do that.
This has been 404 Media.
We'll see you again next week.