Pokémon Go to The Military Industrial Complex

51m
This week we start with Emanuel's couple of stories about Niantic, the company that makes Pokémon Go, and its plan to build an AI model based on data collected by its users. After the break, Jason and Emanuel talk about their big investigation into the rise of "AI pimping." In the subscribers-only section, Joseph explains why he doesn't use a mobile phone and how he uses an iPad Mini instead.

YouTube version: https://youtu.be/9NhJGqEqx-U

Pokémon Go Players Have Unwittingly Trained AI to Navigate the World

Pokémon Go Data ‘Adding Amplitude to War Is Obviously an Issue,’ Niantic Exec Says

Inside the Booming 'AI Pimping' Industry

I Don't Own a Cellphone. Can This Privacy-Focused Network Change That?

Subscribe at 404media.co for bonus content.
Learn more about your ad choices. Visit megaphone.fm/adchoices


Transcript

Mike and Alyssa are always trying to outdo each other.

When Alyssa got a small water bottle, Mike showed up with a four-litre jug.

When Mike started gardening, Alyssa started beekeeping.

Oh, come on.

They called a truce for their holiday and used Expedia Trip Planner to collaborate on all the details of their trip.

Once there, Mike still did more laps around the pool.

Whatever.

You were made to outdo your holidays.

We were made to help organize the competition.

Expedia made to travel.

Hello, and welcome to the 404 Media Podcast, where we bring you unparalleled access to hidden worlds, both online and IRL.

404 Media is a journalist-founded company and needs your support.

To subscribe and get bonus content every single week, go to 404media.co.

Subscribers also get access to additional episodes where we respond to their best comments.

Gain access to that content at 404media.co.

I'm your host, Joseph, and with me are two of the 404 Media co-founders.

The first being Emanuel Maiberg.

Hey.

And Jason Koebler.

Hello, hello.

Yes, Sam is not here this week.

I presume she'll be here next week, but we have basically

the Emanuel show this week, or at least in the free section.

I mean, Jason worked heavily on the second story as well, but the first section.

Emanuel, you've published a couple of stories about Pokemon Go and data collection and AI.

The first one is

Pokemon Go players have unwittingly trained AI to navigate the world.

So the company behind Pokemon Go is Niantic.

Is that correct?

So what did Niantic announce exactly?

And then we'll get into the more of the specifics.

Niantic announced something called a large geospatial model.

This is a term that they coined.

They explain in their blog post announcing this that that name is in direct reference to a large language model, which is the type of AI model that we talked about endlessly here over the past

more than a year now.

And

what is the reference they're making there?

A large language model.

How does that work?

You scrape tons and tons of text from the internet, usually without permission.

And based on that data, you train an AI to kind of statistically

assume what is the most likely next word in a sentence.

And with enough data, that sounds like coherent English that can answer you

on a whole range of subjects.

A large geospatial model is basically trying to do the same thing for the physical world.

Joe, sorry to put you on the spot, but there's maybe like a short paragraph in the story that you can read where like they give an analogy where he talks about

a church.

Yeah, I've just found it and I agree.

I think this would be useful because I'm almost having trouble visualizing even what that would actually do, but I'll read out the section here.

And it says, imagine yourself standing behind a church.

Let us assume the closest local model has seen only the front entrance of that church, and thus it will not be able to tell you where you are.

The model has never seen the back of that building.

But on a global scale, we have a lot of churches, thousands of them, all captured by their respective local models at other places worldwide.

No church is the same, but many share common characteristics, and an LGM is a way to access that distributed knowledge.

So basically, it's like predicting what an area is going to be.

How would you characterize it?

Yeah, exactly that.

Like, there is enough data about physical spaces in the model that the model would be able to predict how to navigate the space, right?

It's like the thing that it made me think about is like when you're at a restaurant that you've never been to before and you need to go to the bathroom, right?

It's like you don't know where the bathroom is, but you've been to enough restaurants to know that the bathroom is like not in the middle of the restaurant.

It's somewhere off to the corner, to the edge of the restaurant, right?

And it's like you're kind of training an AI model to do the same thing.

So it hasn't seen this specific church, but it has seen enough churches to know that this is where the entrance would be or something like this.

It's funny, the restaurant example, because

I always ask the staff.

Like, I also always ask, but you know where it's not going to be.

You know what I mean?

Yes, that's fair.

That's fair.

So for those who have lived under a rock for, what, five, six something years, let's back up a little bit and can you just describe what Pokemon Go is

specifically?

It's funny because while researching these stories, I learned something that I thought was apocryphal, but is apparently true, which is

on an April Fool's Day,

Google put out this video where they

kind of announced a fake integration of Google Street View and Google Maps with Pokemon.

And it was just a joke about like, oh, you'll be able to see Pokemon

in Google Maps.

And it was such a viral video that the Pokemon company and Nintendo got in touch with Google and were like, hey, can we make something like this?

And they did.

And that game is Pokemon Go.

It's a mixed reality or augmented reality game that really blew up in 2016,

where you kind of go around the real world with your phone and like real locations in the real world have Pokemon or gyms there that you can interact with and

have kind of like a real world Pokemon game where you're capturing them, doing battle with other Pokemons and stuff like this.

That's all based on like Google Maps data.

Yeah, I mean, it was obviously a phenomenon.

I imagine all of us played it.

I didn't play it all that much, just a little bit, but I have a very vivid memory of the first weekend it came out, and it was a Friday night in London, and

everybody in the street was playing it.

And, you know, like lads who were out at the pub were screaming in the roads because they were trying to capture Pokemon.

It was insane.

That was, uh, I like played it in Central Park in New York and just being there.

Actually, I don't think I was playing it, but I saw people

like I could tell the groups of people that were playing it.

Like, you could tell by their behavior that they were just like groups of people walking into the woods and stuff like that.

I feel like there were also some stories soon after it launched about like children wandering off into the woods and things like that.

Like, that was a bit of a panic for a while.

Walking into the road and stuff.

I seem to remember that and like putting themselves in danger or something.

So, you're playing this game, and the way you do that is you point your

phone's camera somewhere in the real world.

Like maybe there's a Pikachu or whatever outside a landmark and it's on the pavement and you have to film it and point your phone at it like that.

I mean, what data is being collected there?

And I appreciate we probably didn't really know this at the time we were playing Pokemon Go.

It's sort of we kind of learned this.

a little bit later through reporting and through this announcement as well.

But what is happening there?

Is the phone collecting information about surroundings or something?

So Niantic collects a ton of data from Pokemon Go, from Ingress, from a bunch of other games it has launched since the success of Pokemon Go.

And that data is

used and collected in a bunch of ways that I don't want to get into here

because Niantic did talk to me and they were very careful about saying what kind of data goes into this LGM product that they're working on.

And this data is when one of those games asks you

to

take a picture or scan a real world location.

So, for example, they recently introduced this product, this feature in Pokemon Go called Pokemon Playgrounds.

And that is where

you can go to a physical location and pin your Pokemon to that location.

You're like, on this bench in this park, I'm putting my

Pikachu.

And then somebody else can come there, and that location is saved in the game so they can take out their phone and see it.

And like, that data feeds into this product.

They also had some other features that asked you to like scan real locations and monuments and stuff like this.

It is unclear to me

whether that stuff feeds into the LGM or not.

It's a little vague, but it's like that kind of stuff that is

informing the LGM, like pictures of real places and things in the world that are also attached to location data.

So Niantic knows where that thing is.

Yeah.

So I edited this piece and I was like looking into this, and it's not super clear which specific things

you know Niantic is using for this specific product, but I will say that over the years they have

I guess I just want to highlight that they like incentivize players to do this.

It's like you get little rewards if you do it.

Right.

It's like gamifying that type of collection.

Yeah.

And it's, I mean, okay,

this is not a real example, but in my mind, I'm just like, please go scan the nearest like nuclear silo.

There's, we put a bulbasaur there so that you can like get images of this.

And to be clear, that's not a real example, but it like, that's kind of the vibe where it's like, please go to your nearest monument or church and like hold your phone up to it and walk around it, get it from all angles, and like, we will give you some Pokeballs for doing so.

Yeah.

It reminds me of another story from a couple of years ago in the Wall Street Journal.

Byron Tau wrote it, and the headline was "Gig App Gathering Data for U.S. Military, Others Prompts Safety Concerns."

And it was this app called Premise where users were told, hey, go here and take some photographs and we will give you money or whatever.

I'm obviously oversimplifying the piece, but basically a gig platform for going and doing OSINT for people.

I'm not saying it's one and the same here.

They're building an AI model here, but it is still the outsourcing of gathering data about certain landmarks.

I don't know.

I find that pretty interesting.

Was this in the aftermath of Russia's invasion into Ukraine?

Because I remember there were a couple of stories immediately after Russia invaded Ukraine where there was like a panic over gig workers scanning specific things and then concern that it was being used for targeting in some way.

It was like in the immediate days after the invasion.

And I don't remember, like there was definitely some

over panic there.

Like, I don't know how it actually shook out, but I do remember that there were definitely

concerns that it was like

not Uber drivers, but you know, like gig workers, as you say, like taking pictures of plotting like different things that people were worried were being used for targeting.

Yeah, definitely around that time, and definitely those concerns as well.

Byron found it was being,

as well as that, this was being used by the US military, which was, that's like almost even crazier to me in a way.

I was going to ask, what does Niantic say it's going to do with the data, but I feel like we touched on that, and actually, that'll be better for the second story.

So before we get to that,

in the headline, we put that Pokemon Go players have unwittingly trained AI.

We got some pushback against that.

I think Jason or Emmanuel elaborated in Behind the Blog, you know, sort of our behind-the-scenes paying subscribers article that we publish every Friday.

But just briefly, I mean, a very simple question.

Do we think that the English lads who were playing Pokemon Go on Friday night when it launched and getting all rowdy, do we think they were aware they were contributing to a, you know, a mass data collection model being used to power,

you know, the generation of a new piece of AI technology?

Can I give my version?

And then, Jason, you can give the spicier version.

So, I think there are

two types of people

who played the game in regards to this question.

And

one category of people,

they think it's just a game.

They had no idea at all that any of this is happening, that they're collecting data of any kind.

These are children that are playing this game.

It's like, it's really ridiculous to assume that they would know that this is, that they're generating data for this company to use.

And then there's this other crowd that

maybe are people who read our website and are tech savvy and are concerned about data collection and privacy and security and all this stuff.

And they rightfully assumed that

the players are generating very valuable data and that Niantic will leverage it in some way.

And I think that's a fair assumption, obviously.

But even in that case, no one could have predicted, I think, that they were doing

this to create something called an LGM, a concept that didn't exist back when

the game first blew up.

So

some people may have had suspicions about Niantic using the data, but I don't think we could have assumed that this is what it would be used for.

And it surely, they will continue to leverage the data in

new ways as time goes on.

Yeah, I mean, this was my behind the blog, and it was probably one of the spicier ones that I've done.

My post about it like went relatively viral.

So a lot of the people complaining about that unwitting part were like in my mentions, but Emanuel pretty much nailed it.

I think all I would add is that

there have been studies that show that no one reads the terms of service for things.

Like we know this, and it's not like you can negotiate terms of service, but studies have shown that like 99.8% of people do not read terms of service.

And then of the people that read terms of service, more than 99% of people do not understand what they mean.

So there's that.

There's the fact that

like Niantic was part of Google, and then I believe it was sort of spun off.

I don't know if there's like any relation anymore, but it has always been something of a mapping company.

And so there are people who said, Oh, well, that's like what Niantic does.

But Ingress, their first game was released in like 2014, I believe.

And then Pokemon Go came out in like 2016.

And to Emanuel's point, like a lot of the people playing this are children.

It's like these, these kids and their parents don't know the history of Niantic as a company or even what Niantic is.

It's like, I don't think that a lot of people are researching sort of like the background and funding of the mobile game that they play.

Like it's, it's just, I play video games all the time and I couldn't tell you who developed half the games that I played for like hundreds of hours, even if their name flashes every time I start it up, just because it's like not something that I pay attention to.

And then the other thing is that you could assume, well, so I just went through the signup process recently on Pokemon Go before we published this, and it asks for all these permissions.

It's not like it's saying, hey, we're building a model, an AI model based on this stuff.

It's like, hey, scan this thing where we put a Bulbasaur, and we'll give you some Pokeballs.

And if you don't want to do that, you don't have to do it.

It incentivizes people to do this.

And anyone who's actually playing the game and wants to be successful at it probably is doing these quests to some degree, unless you're highly informed.

I think even if you're highly informed, like what is happening with AI now is just new and I think unexpected.

So like

when you use Facebook, when you use YouTube, you

grant the company all these rights.

And we all kind of know that because we know how these companies work.

But I don't think that people who post their drone videography to YouTube could have assumed at any point

that YouTube or some other company would then like process all those videos to create an AI video generator that would eventually put them out of a job, because video is much cheaper to generate this way than to pay them to make it.

It just,

it's, it's the way that the data is being leveraged that is totally new and unexpected, I think.

Right.

And the very last point I was going to make is people said, oh, like, what did you think their business model was?

And it's like, well, if you're even thinking about that, first of all, it's a free-to-play game, but it has in-app purchases and they make billions of dollars from those in-app purchases.

Like, that's a pretty common one.

You also might assume, because as Emanuel said, it's like you sort of know that you're giving them information.

And you might think like, hey, there's a business model that I'm familiar with.

And one of those business models is targeted advertising based on your location.

And Pokemon Go has that also.

And so, you know, that's not something that I like.

Joseph, I know that's not something that you like.

But that is like a type of thing that an informed person might think, okay, they're using my location to target ads at me.

And you can make that calculation, like, I'm okay with that or I'm not okay with that.

But then what they've done is recontextualize that same information that they're using to deliver targeted advertising to you to then build like this big mapping platform that can be used for all sorts of other things, which probably leads us into the second story.

Yeah.

And I would just add that even if you did assume it and, you know, you thought that, oh, my data is going to be used in some sort of way.

Okay.

But now they admit it.

It's news.

They are coming out and saying they're building a model.

So I don't know.

That's new information so people can make informed decisions, right?

But you're right.

It does go into the next story, which is: Pokémon Go Data "Adding Amplitude to War Is Obviously an Issue," Niantic Exec Says. I kind of butchered that headline, but basically there was a conference run by Bellingcat, the OSINT organization. The person on stage was Brian McClendon, Niantic senior vice president of engineering and, as you alluded to, Jason, formerly the co-creator of Google Earth, Street View, and Google Maps.

Emanuel, what did Brian say on stage that made you think I should pull this out and, you know, do a second article on these comments?

So, I should say that the reason I blogged this in the first place was

Garbage Day had a short item about Niantic's blog post announcing this,

and

it was referencing a tweet by an OSINT researcher who was saying, haha, isn't it funny that all our Pokemon Go gaming is gonna fuel killer robots? And that's a very provocative idea, but I didn't say anything like that in the first story because there wasn't any information indicating that it would be used that way.

And I asked Niantic and they didn't address the question.

Well, what did you ask them?

I asked them, will you have any restrictions on who you sell this data to?

Specifically, will you sell it to militaries and governments?

Will you have any limitations about it being used in some kind of lethal force?

And they just did not address the question either way.

And I didn't think it was fair to include because that would be pure speculation.

And all the original blog post said is that they wanted to use the LGM as a critical component for AR, which is augmented reality, and for robotics, content creation, and autonomous systems.

And that could be like anything, right?

Jason has covered these

food delivery robots in LA.

So you can imagine it informing something like that.

It would be very useful in that scenario.

But then I found out about this Bellingcat conference where he was, and it's a fascinating talk, which I recommend everyone

watch.

And

Bellingcat asked the obvious question to which he responded that

basically,

if a military government uses it in any way that a regular consumer would use it, that's okay.

But if they use it in a way that adds amplitude to war, that's definitely an issue.

Him saying that's definitely an issue is not a no, by the way.

And I asked Niantic as well, does that mean you won't do it?

And they did not say that.

They said, this is very early on.

We just announced this.

This is months or years away from being deployed.

And we're exploring all options, which is definitively not a no.

It's a very long no.

Yeah, it's a very long

or it's not, but yeah,

it's a very long, like, probably.

Right, right.

And I mean, that's already incredibly interesting in those comments and sort of that dancing around the issue and him saying that, you know, adding sort of force to a war could be an issue for them.

We can't really go beyond that at the moment because, as you say, it's still being developed.

But like, what do we make of this?

Is there a future in which this AI model could be used for military things?

It could be used for more innocuous things.

Is it like basically a we-wait-and-see, you know? Because I think the one thing we do know is that this data is probably quite valuable.

I think during his talk, he shows this clip where they take all the scans of this courtyard with the fountain in the middle, and with the photos of that location plus

the like geolocation, right?

This like knowing where the phone is physically in the world, they were able to create like a 3D model of that space, like a video game, right?

So, they take a bunch of 2D images, they take a bunch of location data, and they create what looks like a 3D level from a video game.

Now,

with a game as popular as Pokemon Go, if you could do that to like major cities all over the world, and then one of those cities becomes like a war zone, right?

And you have like what they call centimeter-level precision and mapping of those locations, then you don't have the problem that Jason's delivery robots deal with, right?

Because it's like you have, you're navigating those spaces like it is a 3D level of a video game.

Like you know what the space looks like.

So

if you have like a, you know, like a spot robot mounted with a machine gun, it's much easier to navigate that space if you have that kind of data as opposed to doing things like we currently do, which is like you have a robot with LiDAR cameras and they just go down the street and they try to figure out what is happening around them as it's happening and parse it out.

And that's how you get them like falling off the curb and you get like self-driving accidents and stuff like that.

You don't, you're not trying to parse out the environment as it's coming at you.

It's like you literally know every inch of the environment.

Or you can, the Boston Dynamics, whatever sort of dog robot, can predict that in this sort of European-style city or this American-style city there are certain characteristics, and when I turn around this corner there is a high probability that the curb is going to be this sort of height. You know, the curb in London is going to be of a certain texture or a certain height or a certain angle or whatever, and it could maybe make, you know, predictions and movements based on that.

So yeah, you can see the benefit of it.

Yeah.

And most, I should also say, and Niantic says this as well, most of this type of data that exists at the moment does come from like dash cameras and self-driving cars.

And

the

killer app aspect that Pokemon Go has is that it's a bunch of data where only pedestrians go.

And different levels.

Like if it's a dash cam, it's just one stationary sort of thing.

And if you're moving a phone around, it's going to be all over the place.

That's right.

Yeah.

Jason?

I was just going to say that they're not my delivery robots, but

I should start a delivery robot corporation, open source it.

Yeah, I mean, why not?

404.

Well, actually, tell you what, you can take that branding yourself.

Maybe, maybe that can be a side project for you.

All right, we'll take a break there.

When we come back, we're going to talk about something completely different.

We're going to talk about the rise of the AI pimping industry.

That is a word or a series of words I didn't think I would ever say.

We'll be right back after this.

For a great first minute or last minute gift and a gift that lasts a lifetime, there's Masterclass.

With Masterclass, your loved ones can learn from the best to become their best.

Masterclass is the only streaming platform where you can learn and grow with over 200 of the world's best.

That's why Wirecutter calls it, quote, an invaluable gift.

I'm going to be doing a lot of cooking this holiday season, so I'm learning the art of home cooking from Alice Waters, French pastry fundamentals from Dominique Ansel,

and how to talk about wine without being a snob with Emily Wines.

My favorite thing about Masterclass is it works in whatever format is convenient for me.

I can watch it on my computer, phone, or TV.

And when I take my dog for a walk or drive, I can listen to the audio versions.

There's no risk.

Every new membership comes with a 30-day money-back guarantee.

Give your loved ones a year of learning with Masterclass.

Masterclass always has great offers during the holidays, sometimes up to as much as 50% off.

Head to masterclass.com/404pod for the current offer.

That's up to 50% off at masterclass.com/404pod.

Masterclass.com/404pod.

It's the most wonderful time of the year for deals and also a good time to think about making some changes.

Speaking of changes, how about changing your mobile provider to Mint Mobile with their 15 bucks a month deal with the purchase of a three-month plan?

Unlike other life changes, this one doesn't take too long.

The longest part of the whole process is the time you'll spend waiting to break up with your old provider.

And Mint Mobile's website makes it very easy for you to port your device and your phone number over to Mint Mobile without changing your wireless experience.

To get started, go to mintmobile.com/404media.

There you'll see that right now, all three-month plans are only 15 bucks a month, including the unlimited plan.

All plans come with high-speed data and unlimited talk and text delivered on the nation's largest 5G network.

You can use your own phone with any Mint Mobile plan and bring your own phone number along with all your existing contacts.

Find out how easy it is to switch to Mint Mobile and get three months of premium wireless service for 15 bucks a month.

To get this new customer offer and your new three-month premium wireless plan for just $15 a month, go to mintmobile.com/404media.

That's mintmobile.com/404media.

Cut your wireless bill to 15 bucks a month at mintmobile.com/404media.

$45 upfront payment required, equivalent to $15 a month.

New customers on first three-month plan only.

Speed slower above 40 gigabytes on unlimited plan.

Additional taxes, fees, and restrictions apply.

See Mint Mobile for details.

Hackers and cyber criminals have always held this kind of special fascination.

Obviously, I can't tell you too much about what I do.

It's a game.

Who's the best hacker?

And I was like, well, this is child's play.

I'm Dina Temple-Raston.

And on the Click Here podcast, you'll meet them and the people trying to stop them.

We're not afraid of the attack.

We're afraid of the creativity and the intelligence of the human being behind it.

Click here: stories about the people making and breaking our digital world.

AI machines, satellites, and telegraphs.

Click here.

And listen.

Click here.

Every Tuesday and Friday, wherever you get your podcasts.

All right, and we are back.

This is one that both Emanuel and Jason wrote: Inside the Booming "AI Pimping" Industry.

So, how does this story start?

And I think it might be good to give people a concrete example of what someone might see when they sort of scroll through social media and they come across one of these posts that you're talking about in the piece.

I don't mean the educational ones and how people learn to do AI pimping, but sort of what does the end product

look like?

Yeah, so this is actually a follow-up.

So we did this story in partnership with Wired.

It was the second story we published with Wired, and Emanuel and I worked on it for several months, sort of like in the background for a while.

And it grew out of reporting that I did and I believe published back in February about the rise of AI influencers on Instagram.

And essentially, if you are scrolling through Instagram and let's say that you

generally

like a lot of Instagram influencers or models, I mean, these are like usually like almost always women, although there are sometimes men.

Like we saw some AI-generated men as well, but it was mostly women.

And it's mostly them in bikinis, like at the beach or by the pool or in front of a mirror, that sort of thing.

And it's like modeling images more or less.

And, you know, it's just like lifestyle content, for lack of a better term, like aspirational travel content.

And, you know, you'll either see a grid post where it's just like a photo.

or you might see a reel.

And that's a critical distinction that we can talk about.

But basically, it's like both.

We saw both types of things.

So, yes, you have that lifestyle content, the sort of stuff you see on Instagram all of the time.

Emanuel, what does it look like?

Is it photo?

Is it video?

And crucially, what is different about this?

Because we're not writing about normal Instagram influencers.

What is different about this content?

Yeah, so first I would say it's very possible that you've seen this content and you don't realize it because

it looks real.

It looks convincingly

real.

You would not be able to tell that there's anything not right about it.

But yeah, it's either still images,

models, influencers, beautiful people that you can aspire to be,

or reels which are really being promoted by Instagram right now.

It's trying to put them in front of you in many ways.

And those look like real video.

And then, what's different?

Is it that the face has been swapped?

Like, what differentiates this from a normal piece of Instagram content?

What all this content that we highlight has in common is that it is originally content from a real human creator who posted it to their social media.

And

someone else just took that video and used AI to make a deep fake video.

But usually, when we think about deep fakes,

someone takes a porn video and puts someone else's face into it.

And in this case, they take

a PG-13 rated video and put an AI-generated face onto that video in order to make it seem like original content.

And the face is consistent across the account.

So it seems like you're following a real person, but you're not.

You're following an AI generated person who is stealing videos from all these other, usually women,

in order to create like a viral popular account.

Yeah.

I mean, really, really crazy stuff.

And it's especially wild that people may have seen this and not even realized it because they are super convincing.

You scroll through them and the faces match.

And it's the continuity, as you say, which has always been a big problem for AI, right?

But it's like a consistent fake character across the social media posts.

That's the craziest thing to me.

So you have all of this, but you didn't just talk about that.

You actually went sort of a layer lower than that, because it turns out there's an entire industry about monetizing these fake creators now.

So who is making

these deep fake social media pieces of content?

And like, you know, what's happening there?

So when Jason covered this

back in February, he was just covering the fact that it was happening.

And it wasn't clear if it's just a few people doing it for fun or to fuck with people

or they just found a way to monetize it.

The reason we came back to it is that the entire space has exploded and has become totally professionalized.

So I couldn't tell you who most of the people who run the accounts are, but there are many of them.

And also, there's this other class of people now

who are

kind of like many other Instagram hustlers.

They say, I manage all these accounts.

I make

this amount of money.

Do you want to make money like this as well?

You should follow me, buy my guides on how to do this, watch my videos.

There's like an entire industry of people teaching you how to do this and you paying them for that privilege.

And

yeah.

Yeah.

So we found a couple different communities of people doing this.

You know, one is called the quote-unquote digital divas.

And it's like,

I don't know who runs it.

I mean, that's kind of one of the frustrating things is that

the quote unquote people running it are just AI avatars themselves, but it's like three

AI influencers who are teaching other people how to do this.

And it's like a $50 guide.

They have a Discord and they have like coaching on that Discord.

So you can like join the Discord and they will teach you how to generate these models, how to get the face similar,

so on and so forth.

I think one interesting thing about the digital divas is that they

claim that they're an anti-deep fake server.

And so they're like, don't do the face swaps onto the real bodies of real women, um,

which

makes them feel like they are doing something that's like a little bit more moral or ethical.

Um, seems like splitting hairs when it comes to the ethics of it.

Like, so what are they doing?

Are they generating a whole person?

They're generating a person. They're basically not doing body swaps, they're not doing face swaps onto real videos of other women, but they're using those videos as inspiration and, in some cases, as training data.

Like they're putting them into the different LoRAs and

meaning sort of like the instructions for generating

these models and these images.

And then they're also saying, like,

rather than just stealing like Ariana Grande, why not take Ariana Grande and Sabrina Carpenter and like turn them into a hybrid person?

And so, therefore, it's like not stealing a copyrighted image, you're like remixing it in a way that we find to be more ethical. And so I think it's important to remember that all of this AI-generated art, regardless of whether it's a one-to-one theft of another

person's image or body or likeness: they're all trained on images of real people.

They're all trained on videos of real people.

And that's like what underpins this entire industry.

But it's interesting that they took this idea and they're like, oh, we're the ethical ones because we're not stealing content.

Yeah, because using it to train data is still stealing.

It is just obfuscating that fact.

You're stealing, but it's like in the background, basically.

It's effectively the same sort of thing.

And you're just less likely to get caught.

So, how are these AI pimps?

And I should say, that's not our term.

That is what some of the people say themselves, right?

We have marketing material that we got and pulled into the article.

How are they monetizing this material they're making?

Yeah.

So the digital divas just stand in contrast to one of the other people that we found who ran this account called Emily Pellegrini, who is an AI influencer.

And it's like, I'm just going to say she, but again, not a real person.

She was written up by like the Daily Mail.

Like all sorts of people covered her as like one of the biggest and first AI influencers and had like hundreds of thousands of followers on Instagram.

And now the person behind that account, who again, is using like an AI persona of a man,

uh, is teaching people how to do this.

And I would say that that guide is

like much more nefarious than the digital divas.

Like, I just want to be clear that it's a spectrum.

You have like the people who are like, oh, don't steal content.

And then this other one is like, here's how to steal content.

And it's like a real like wink, wink, nod, nod.

Be careful what you do.

But it really runs the gamut between like, hey, this is

like, we're creating a new thing, a new industry.

And like, here's how to steal money, steal women's bodies and make money off of it.

And the way that they are monetizing it is they're basically linking to this account, this website called Fanvue, or a few other websites that are OnlyFans competitors and knockoffs, more or less,

that explicitly allow people to sell AI-generated nude images or nude videos.

And those

accounts also use stolen content

often from adult actresses.

Right, right.

There's the stealing from the Instagram influencers to make the, for lack of a better term, safe-for-work content that's on Instagram.

And that acts as the funnel to get people to these OnlyFans-like services,

which is where they've then got AI-generated stuff, which is presumably ripped off, either deepfaked or used as training data, from adult performer content, and you sort of have those two sets.

So,

what

is Instagram's role in this and sort of their response?

I'm not sure which one of you pinged Instagram for comment, but sort of what did they say?

Well,

it's funny because

before we get to what Instagram said,

I talked to Elena St. James, who is an adult content creator, who is very aware of this problem.

This has happened to her content.

This has happened to the content of other adult content creators that she knows.

And

the

problem here is not just stealing content from other people.

It is then also creating a situation on Instagram where she is competing with like an infinitely generating number of accounts that can produce content

much more easily than her because she has to take photos and make videos.

And these people are just like stealing or generating content.

And it's just drowning her out.

And in case you don't know, obviously Instagram has a lot of restrictions about posting adult content or nudity of any type.

But the way that

people who make money on OnlyFans and other similar platforms gain an audience and monetize their content is by essentially advertising on Instagram.

And she says that ever since this started, she's been having a harder time picking people up and converting them and so on.

And

when I was talking to her, she was speculating that

Instagram

is not only aware, and it's not only that it doesn't care, which I think is something we can assume based on our past reporting and Jason's past reporting on AI

slop on Facebook and stuff like that.

They might actually be in favor of this because it juices the numbers, right?

It's just like more people posting, more people engaging.

More likes, more likes.

It just inflates the numbers for Instagram.

And I think, Jason,

didn't Zuckerberg essentially confirm this in a way on a quarterly earnings call?

Yeah, I think we, I don't know if we talked about it on the podcast, but on the last quarterly earnings call, Zuckerberg essentially said that AI-generated content is driving more engagement on both Facebook and Instagram to the point where they are thinking about creating a whole additional feed just for AI-generated content, which

in the past, when Facebook has made new feeds, they like start as new feeds and they just like integrate them into the main news feeds.

And so they think that this is

good, like by and large,

the idea of AI-generated content.

I will say that Instagram did take down a couple of the accounts that we flagged and that they have over time because they have an anti-impersonation policy.

And

I mean,

it's a very small number of accounts that they've taken down.

And there are many, many, many, many accounts doing this.

And then I think Elena St. James raised a really interesting point, which is that when she has reported her own content as being stolen, that often brings scrutiny to her account in general.

Like her account

is no longer able to fly under the radar, which is not to say that her account is even doing anything wrong.

It's just that Instagram's enforcement around adult content is all over the place.

And so you'd probably rather,

you know, a moderator not look at your account than look at it.

Is that fair to say, Emmanuel?

Yes.

You spoke to her.

Yeah, yeah.

So just to answer more directly to Joe's question, we flagged a bunch of accounts to Instagram and we're like, here's the AI content.

Here's the original content that it's stealing from.

Is this a problem?

And they're like, maybe it's a problem.

Maybe it's not.

We can't say, based on you flagging this, the person whose content was stolen has to report this directly, prove it's their original content, and then we maybe do something.

So we flagged a bunch of stuff.

They took some down.

Most of it is not.

They generally like don't care.

The person who's being stolen from has to report it and then maybe they will do something.

To Jason's point, there's a bunch of stuff in how

sex workers and adult content creators have to operate on Instagram that makes it easier to steal from them.

For example, so

Elena has her account and in the bio of her account,

she names another account that people should follow in case her primary account is shut down, right?

Because Instagram will randomly, without explanation, shut down

accounts for adult content creators.

And rather than them having to build their following from scratch, they're trying to have like an emergency lifeboat that people can follow.

And this is the same kind of behavior that you're seeing with the AI accounts, right?

So it's like there's two Elena St. James accounts. They're both posting Elena St. James content, and she owns both of them. But now there is a third, a fourth, a fifth that are not hers and are not authorized and are just, like, monetizing her videos without her permission. So that makes it harder for Instagram to, like, tell which is which, right?

Yeah, and maybe this is a naive, silly point from me, because I didn't know this until you both reported it, but the fact that Instagram might take it down for impersonation is tricky in a way.

I mean, you would hope they would take it down because it is lifting hers or other people's content, but it almost brings in too complicated or philosophical a discussion, where you can imagine Instagram would, like, start stroking their chin and be like, well, is it really impersonation if they've generated a new thing off this person's body?

And it's like, dude, we're not here for the Philosophy 101 class, because you should just stop people getting ripped off.

Um, but yeah, I guess we'll see.

We will leave that there.

If you're listening to the free version of the podcast, I will now play this out.

But if you are a paying 404 Media subscriber, we're going to talk about something that's been requested by a lot of people, but I just wanted to put it kind of on the pod: how and why I haven't owned a phone since around 2017, and how I live my nightmare life due to that.

You can subscribe and gain access to that content at 404media.co.

You'll get unlimited access to our articles and an ad-free version of this podcast.

You'll also get to listen to the subscribers only section where we talk about a bonus story each week.

This podcast is made in partnership with Kaleidoscope.

Another way to support us is by leaving a five-star rating and review for the podcast.

That stuff really does help us out.

Please do that if you haven't already.

This has been 404 Media.

We will see you again next week.