Luxury Surveillance (With Chris Gilliard)
YouTube Version: https://youtu.be/4kBgnjn5cC0
Subscribe at 404media.co for bonus content.
Learn more about your ad choices. Visit megaphone.fm/adchoices
Press play and read along
Transcript
Speaker 2 Hello, and welcome to the 404 Media podcast, where we bring you unparalleled access to hidden worlds, both online and IRL. 404 Media is a journalist-founded company and needs your support.
Speaker 2 To subscribe, go to 404media.co.
Speaker 2 You'll get bonus content every week, and subscribers also get access to additional episodes where we respond to their best comments.
Speaker 2 And they also get early access to our interview series, which you are listening to right now. Gain access to that content at 404media.co.
Speaker 2 This week, I'm joined by Chris Gilliard, a surveillance justice expert and Just Tech Fellow at the MacArthur Foundation. I've talked to Chris for so many stories over the years.
Speaker 2 He's one of the smartest researchers studying surveillance, privacy, and how they affect us and the people around us. His book, Luxury Surveillance, is coming out early next year.
Speaker 2 Today, we're talking about luxury surveillance or how big tech companies get us to opt ourselves and the people around us into surveillance. Here's my conversation with Chris.
Speaker 2 Chris, how are you doing?
Speaker 1 I'm doing great. Thanks for having me.
Speaker 2 Yeah, no, thank you so much for coming on. When we started this interview series all of like one week ago,
Speaker 2 I knew I wanted to have you on really early because I feel like our interests and careers and paths have intertwined a lot. I feel like often whenever I'm writing about some dystopian, horrible thing,
Speaker 2 you're like one of the first people that I want to call.
Speaker 2 And not because I want to make you sad, but because I feel like you always have like very good context for why a specific technology is working in a specific way or how surveillance is affecting people and sort of spreading to the masses.
Speaker 2 And so one, thank you so much for being that person that I can call. And always have, I feel like you always have the right perspective and a very smart perspective.
Speaker 2 But, two, I feel like this conversation will probably go all over the place for that reason because we've talked about so much over the years.
Speaker 2 I guess I want to start with luxury surveillance, because I believe this is sort of what you're most known for in the public sphere,
Speaker 2 like, spreading awareness of this idea of luxury surveillance, which is, well, I'll let you define it, but basically it's people who opt into surveillance because it's easy and makes their lives better, or so they think.
Speaker 2 And then that's sort of distributed across the population in all sorts of nefarious ways. But
Speaker 2 how do you like think about luxury surveillance? I know this is like a project that you've been working on for quite some time. There's a book coming out, et cetera.
Speaker 2 But what is luxury surveillance for people who don't know?
Speaker 1 No, that's exactly it.
Speaker 1 I think that definition holds up.
Speaker 1 I started thinking about it mostly because
Speaker 1 most of the models that I saw for surveillance, and I think this is important and it's true and it matters,
Speaker 1 talked about surveillance that was kind of deployed against people, like, involuntarily,
Speaker 1 often against kind of the most marginalized or
Speaker 1 most vulnerable populations. And that's true.
Speaker 1 But as I was writing about surveillance and talking to people and things like that, like a couple of things kept coming up.
Speaker 1 And one is that people would always, they would regurgitate some version of the nothing to hide argument,
Speaker 1 you know, or they would kind of privilege their own needs or desires in front of everything else.
Speaker 1 You know, so they would say, if I talked about video doorbells, they'd say, well, I still need to get my packages and things like that.
Speaker 1 And so I started trying to think of kind of how to leverage that in terms of,
Speaker 1 in terms that I thought would help people get a better grasp of sort of what some of the problems were with that.
Speaker 1 I mean, as you know, like, I wish it moved the needle to just tell people that, oh, this is going to be used to deport your neighbors, you know, but unfortunately, sometimes that doesn't move the needle.
Speaker 1 And I think if people understand the degree to which some of these systems come for everyone, that sometimes kind of helps a little bit.
Speaker 1 And so the initial comparison I came up with was talking about
Speaker 1 Fitbits and Apple Watches as analogous to ankle monitors.
Speaker 1 And that's where that came from.
Speaker 2 Right. So
Speaker 2 I want to tell you an experience that I had recently, and you can tell me if this is luxury surveillance. I feel like maybe it helps frame the conversation a little bit.
Speaker 2 Are you familiar with the Intuit Dome in Los Angeles?
Speaker 1 No, who plays there?
Speaker 2 So the Clippers play at the Intuit Dome, and it's a new, like, several-billion-dollar
Speaker 2 arena that opened last year.
Speaker 2 And a couple of friends and I wanted to go to a basketball game, watch the Clippers. So I bought tickets on StubHub.
Speaker 2 And to get the tickets, I had to download the Intuit Dome app.
Speaker 2 Like, there is no way of transferring the tickets through Ticketmaster, and I already have tons of problems with Ticketmaster, but, you know, such is life if you want to go to a concert or sports game or whatever.
Speaker 2 So I had to download that. I had to give them tons of information just to sort of get the tickets.
Speaker 2 And not only that, everyone in my party also had to download the app.
Speaker 2 So I couldn't get all four tickets.
Speaker 2 You know, I had to get one, and then someone else
Speaker 2 in my group had to get one.
Speaker 2 And they have this entry system where you scan your face in the app, and then you enter with facial recognition. And that face is connected to your wallet.
Speaker 2 And then the entire interior of the arena,
Speaker 2 they have tried to get rid of humans at every possible angle. So it's like they have stores in there, but they're just walk out stores where you just grab something and you leave.
Speaker 2 You have to scan into these areas with your face.
Speaker 2 And they claim that you can use your credit card without this, but those like are always broken.
Speaker 2 And then of course like once you're in this area, you don't like interact with any humans like basically at any level.
Speaker 2 And if you want to opt out of this system, they like make you walk around the entire arena to go to one single gate where you can like use the old barcode system.
Speaker 2 And so the first time I went, that's what I did because I was like, I don't want to do this. Second time, we were running late and it was a playoff game.
Speaker 2 And so I was just like, you know what? Fuck it. I'm going to do it. And I did it.
Speaker 2 I had already had a few beers, so I, like, opted in.
Speaker 2 I feel like that is my life in many ways, where it's like, you can often try to resist these things, but it's often very inconvenient. They make things worse.
Speaker 2 But also like opting into these systems in the first place, like download the app, give them all your information, scan your face. Like
Speaker 2 that was super inconvenient as well.
Speaker 2 And I don't have a question here, but to me, it's just like I thought of you immediately.
Speaker 2 I thought, I was like, this is luxury surveillance, or at least it's trying to give this like seamless experience where you give them your face.
Speaker 2 You just grab a beer, grab chicken tenders, you don't interact with any human beings, and like all the data goes like God knows where. Like, who knows what happens to it?
Speaker 2 And I don't know, I thought of you, and I'm curious what you think of that experience.
Speaker 1 Yeah, I mean, it absolutely is, you know. I mean, it kind of engages another thing that I've talked about in the past, which is what I call friction-free racism, right? That the,
Speaker 1 first of all, like, who has the capacity to do this?
Speaker 1 Like, who's got the phone, the credit card, all these things? But also, who fantasizes about this experience where you don't interact with anyone else?
Speaker 1 You know, I mean,
Speaker 1 I've worked at stadiums and at the vendors and things like that. So
Speaker 1 I don't fantasize about the experience where I go get a beer and a popcorn and don't talk to the person behind the counter, you know, but I know that people do.
Speaker 1 And it is, I think, a mark of privilege.
Speaker 1 But it also, you know, there are so many ways that it ties into so many other things, which is like going to a game is supposed to be a communal experience.
Speaker 1 The people who work there are part of that community,
Speaker 1 or they should be. Like, I think of them as that.
Speaker 1 And so to pitch it as a privilege or a bonus or benefit that you can kind of slide through the experience
Speaker 1 and all you have to do is pay for it with your face, you know, yeah, I think it's a prime example of that. And the other part is that
Speaker 1 you are incentivized to believe that it's to your benefit, and that the only way to opt out, you know, I mean, we've seen it in the airport, for instance,
Speaker 1 is to undergo a series of sort of onerous steps in order to not do this thing, which is supposed to be good for you or, you know, be better for you.
Speaker 2 Yeah, yeah.
Speaker 2 And I think that, you know, it's not a far jump from there, but I think a lot about the delivery robots that are driving around Los Angeles and college campuses and other places all over the country where,
Speaker 2 again, you've taken something that's already kind of abstracted, like,
Speaker 2 you have traditional delivery, and then you have things like Uber Eats and Grubhub and
Speaker 2
the associated algorithmic management of those workers and the depersonalization of the whole situation where it's like, just drop it off. Like, don't, don't talk to me.
Don't look at me.
Speaker 2 Like, food arrives.
Speaker 2 And then you just literally get rid of the human, or, well, you put the human maybe in Nebraska or something with an Xbox controller to drive around
Speaker 2 the food in these little robots.
Speaker 2 But then also you attach cameras and microphones to the robots, because these are the property of these delivery companies, and, you know, people don't like these robots, so they're attacking them all the time.
Speaker 2 And sort of the side effect of all of this is that there are more cameras that can be subpoenaed by cops, and often don't even have to be subpoenaed; the footage is sort of given willingly.
Speaker 2 And
Speaker 2 for this, it's like something where a rich, privileged person
Speaker 2 abstracts themselves from the entire like food delivery experience, but then they're also opting in all of their neighbors to this type of surveillance.
Speaker 2 And I was wondering if you could talk a little bit about that, because you talk a lot about that in your work where
Speaker 2 it's like
Speaker 2 you are opting other people into surveillance via your decisions.
Speaker 1 Yeah, I mean,
Speaker 1 you know, frankly, if these decisions only hurt the rich people or the people who,
Speaker 1 you know, signed up for it, like, I kind of wouldn't care.
Speaker 1 You know, but that's sort of the unspoken part, is that
Speaker 1 we're all sucked up into these systems when people engage in them, not only in the way that you mentioned, but, like, it normalizes it.
Speaker 1 When
Speaker 1 these things are seen as kind of luxury or aspirational or making your life easier,
Speaker 1 it normalizes it in a way that I think is really
Speaker 1 unhelpful, like, harmful.
Speaker 1 That when everyone has a video doorbell, when everyone is carrying around a microphone, when people are wearing glasses
Speaker 1 with cameras, you know. And we've seen it. I mean, we've seen it
Speaker 1 with ICE and CBP, you know, we've seen it with, you know, states and municipalities requesting footage of
Speaker 1 protesters. You know, we've seen it with...
Speaker 1 like Flock and automated license plate readers. These things are promoted as some kind of benefit or, you know, safety for people, but, and we've seen it in ed tech, these systems wind up making lots of people less safe.
Speaker 2 I remember a past, a bad past, where whenever I needed glasses, I had to go to multiple places that all felt like stuffy doctor's offices.
Speaker 2 And I had to spend hundreds and hundreds of dollars on glasses I didn't really like and had to wait weeks and weeks to get them. Warby Parker changed all that.
Speaker 2 Warby Parker uses nothing but premium materials in each frame.
Speaker 2 They design the frames in-house and their collection includes different silhouettes, colors, and fits to suit every face and every fashion style. I love that it's a one-stop shop.
Speaker 2 They offer everything you need for happier eyes: eyeglasses, sunglasses, contact lenses, and eye exams.
Speaker 2 You can shop with them online and in stores.
Speaker 2 Warby Parker has over 300 retail locations across the US and Canada. And their glasses start at just $95 and include prescription lenses with anti-reflective, scratch-resistant coatings.
Speaker 2 Add a pair and save 15% when you purchase two or more prescription pairs of glasses or sunglasses. This offer is available both online and in stores.
Speaker 2 Free shipping and free 30-day returns.
Speaker 2 I've been going to Warby Parker for years now once I learned that they had figured out the glasses game. And I haven't really looked anywhere else since.
Speaker 2 And if you have a big head like mine, you'll love their virtual try-on system that lets you figure out which frames and what size you need.
Speaker 2 Gone are the days where you'll be disappointed with your glasses. Warby Parker has over 300 locations to help you find your next pair of glasses.
Speaker 2 You can also head over to warbyparker.com/404media right now to try on any pair virtually. That's warbyparker.com/404media.
Speaker 2 Do you think that there's a way to design some of these systems in a way that they are safer?
Speaker 1 Um,
Speaker 2 Or, like, what is your sort of perspective on,
Speaker 2 I mean, it's such a broad class of
Speaker 2 things, but like, what is your like perspective on something like, I don't know, like an Uber, a food delivery robot.
Speaker 2 And then I think we can probably get into, like, Flock and Ring doorbells and things like that. But like,
Speaker 2 these, and Amazon, like, I want my stuff in one day.
Speaker 2 I'm going to be targeted with advertising that, you know, is going to hit me with
Speaker 2 really
Speaker 2 convenient things because they know so much about me and they know that I'm out of cereal or whatever. Like, I guess, do you think that there is a way to design these,
Speaker 2 design technology so that it is more
Speaker 2 respectful of users while also providing this convenience? Or do we need to sacrifice some of that convenience for our own privacy and the privacy of others?
Speaker 1 Yeah. Well,
Speaker 1 this is an interesting question for me, right? Like, I think some of these things shouldn't exist, you know. Like, it's a sort of "you don't got to hand it to them" kind of question.
Speaker 1 Like, I think it's understated, the degree to which some of these things are really corrosive and harmful to society. You know, the idea that
Speaker 1 I should be able to order non-essential things and, you know, a drone drops them off at my house an hour later, things like that. I mean, there are
Speaker 1 use cases, you know, but
Speaker 1 mostly no.
Speaker 1 And so I think, so my kind of short answer is the only way to do that would be to divorce it from capitalism.
Speaker 1 You know, I mean, I think about the push towards smart glasses. And there are some cases where people talk about them,
Speaker 1 you know, for instance, like translation.
Speaker 1 And so would it be cool to have a thing that immediately translates something, that worked and wasn't wonky? You know,
Speaker 1 but, like, I don't think those need to be glasses, for instance. Those could be headphones. But
Speaker 1 I mean, this is a sort of "capitalism kind of ruins everything" thing, in that
Speaker 1 mostly companies aren't incentivized in any way to produce these things at scale in a way that would protect
Speaker 1 people's privacy or wouldn't be invasive or wouldn't be corrosive.
Speaker 1 There are some design decisions, I think, that make these things worse, you know, because
Speaker 1 companies sort of try to maximize their intrusiveness. But a lot of these things,
Speaker 1 you know, like, I don't really think should exist. I think that the idea that convenience kind of trumps all
Speaker 1 really has been,
Speaker 1 yeah, I think it's a very harmful way to go about things.
Speaker 2 Well, it's this idea that everything should be frictionless. And if you don't submit to these systems, we are going to add a lot of friction, like an unnecessary amount of friction.
Speaker 2 Something you said made me think about like my reporting with AI slop,
Speaker 2 which is just like,
Speaker 2 you know, we can talk about how AI has changed the calculus and supercharged a lot of these things that you're talking about, and sort of the incentives of these AI companies. But I think a lot about how
Speaker 2 you'll talk to, like, someone who's really excited about AI, generative AI, which, I feel like, is fewer and further between for me.
Speaker 2 But like every once in a while, I'll like run into someone who, who, like a friend of a friend, who's like, yeah, I like really love how I can storyboard with runway ML or something.
Speaker 2 And it can help me write my scripts and things like that. And I think often of, like,
Speaker 2 the best-intentioned people using AI to make their little projects and things like that. I'm like, okay, that's cool.
Speaker 2 But like the vast majority of what we see on the internet using these systems are like highly corrosive, highly invasive, used to like further inflame tensions and
Speaker 2 they're just like polluting the internet.
Speaker 2 And I think about that
Speaker 2 like when you mentioned smart glasses and things like that as well, where it's like, yes, maybe they can be used for translation and there should be like better access, better accessibility tech.
Speaker 2 I also think about, like, I would love to use Meta glasses to film myself cooking, for myself.
Speaker 2 So I'd be like, oh, that's, like, a cool camera on my face for use in my own home.
Speaker 2 But that's not how people are using it. People are using it to go film viral TikToks while they're harassing people in public and things like that.
Speaker 2 And I guess it's just like,
Speaker 2 I personally feel like we can't be trusted with this technology as a species, really.
Speaker 2 And the incentives of these big tech companies are not, like, responsible use of these technologies. They're like, you know,
Speaker 2 use them as much as possible, use them to replace human contact, use them to spy on your neighbors, use them to inflame tensions and keep yourself safe, and that sort of thing.
Speaker 2 And I guess it does kind of go back to, like, the incentives of capitalism, where it's always, like, maximalist, maximalist, maximalist. Do you agree with that, or what?
Speaker 1 Yeah, I mean, and, you know, I hate it because it sounds reductive, but I think for a lot of these things you have to ask why they exist. You know, why are
Speaker 1 companies pouring billions and, in some cases, trillions of dollars into these things?
Speaker 1 And also, when, you know, sometimes even their own research says, like, this is the effect, or this is how people are going to use it. Or now that people are using it, this is what it's doing.
Speaker 1 Even with those things, like, you know,
Speaker 1 when we think about like the chat bots
Speaker 1 and how they've been linked to suicide, particularly of young people.
Speaker 1 And it's like,
Speaker 1 like, one of those should cause the people at those companies to, you know, go pull some plugs at the data centers. But there's been far more than one, you know.
Speaker 1 And then, you know, companies will come out and say, well, we've addressed that. Or, you know, for instance, Character.AI, I think, is now
Speaker 1 going to start putting in kind of age restrictions or something like that.
Speaker 1 And, you know, like it really seems insufficient,
Speaker 1 particularly given the harms, right? I mean, like, throughout even recent history, like we've seen examples of when
Speaker 1 children are harmed, when someone tried to do, you know, commit a terrorist act or anything like that,
Speaker 1 that there have been
Speaker 1 pretty extreme reactions in terms of trying to, like, protect the public.
Speaker 1 But that seems like, you know, sort of a bygone era now.
Speaker 1 But, like, the thing is, for something like Sora, you know, I was just kind of rereading your piece on Sora and the Facebook slop.
Speaker 1 And it's like, why?
Speaker 1 So, you know, a really important question is, why does Sora exist? Like, what is it for? You know, a friend of mine, and actually how I met you, was David Golumbia. And I think in 2022,
Speaker 1 before he passed, he wrote a piece called "ChatGPT Should Not Exist."
Speaker 1 You know, David was always kind of
Speaker 2 way out in front on stuff, I always thought. I mean, so for people who don't know, David Golumbia was a researcher who wrote quite a lot about,
Speaker 2 like, techno-libertarians and anarcho-capitalism and things like this. I think you could explain it better, but I remember him as always being far ahead of the curve on things like this.
Speaker 2 Like some of his early stuff on like Bitcoin libertarianism, I feel like that was like
Speaker 2 before I had even come close to connecting the dots there.
Speaker 2 If you want to talk like a little bit about David and some of his work, I would really appreciate that because it's so sad that he passed and he did such great, great work.
Speaker 1 Yeah. So,
Speaker 1 you know, full disclosure, he was a really good friend of mine and, like, a mentor. We wrote some pieces together.
Speaker 1 But he was like really,
Speaker 1 I think one of the things he's known for is his work on Bitcoin and
Speaker 1 on cyber libertarianism.
Speaker 1 And
Speaker 1 I'm not really going to do it great justice, but
Speaker 1 what I would point out is the connections he made between these technologies and
Speaker 1 the far right and between some of these technologies and some of these technologists or, you know, venture capitalists and things like that.
Speaker 1 And this understanding or this belief that
Speaker 1 technology should sort of trump democracy.
Speaker 1 you know, that innovation, the idea that software or algorithms or what have you, that companies and platforms and investors should have free rein
Speaker 1 in terms of putting technologies out into the world that are harmful because the most important thing is innovation. He called that cyber libertarianism, you know, like that
Speaker 1 There's this still, I think, prevailing narrative, I mean, it's disappeared somewhat, that Silicon Valley had long been sort of liberal or left.
Speaker 1 In his book that was published posthumously, he does a really deep dive into why that really was never true.
Speaker 1 And yeah, I think that in terms of
Speaker 1 the essay on ChatGPT,
Speaker 1 I think one of the things he located really early on,
Speaker 1 which, you know, I agreed with them on, but which we're really seeing come true, is the way
Speaker 1 that chatbots writ large are, like, an anti-democratic force.
Speaker 1 And also the glee with which
Speaker 1 the people, you know, kind of chatbot fetishists and Gen AI fetishists, like the glee with which they think about putting artists and creatives and professors, you know,
Speaker 1 out of work
Speaker 1 is tied to their distaste
Speaker 1 and animus towards democracy. And so
Speaker 1 I think about them a lot whenever some of this stuff comes up, because
Speaker 1 when that piece came out, you know, he received a significant amount of pushback.
Speaker 1 You know,
Speaker 1 I think it holds up really well. I mean, it was only three years ago, but like
Speaker 1 most of it turns out to have been pretty accurate.
Speaker 2 Yeah. I mean, a lot has changed and a lot hasn't since then, I guess.
Speaker 2 And, you know,
Speaker 2 this is not a surprise to anyone, but it's like, I have a lot of problems with AI and how it's used and how it's affecting people.
Speaker 2 And I feel like a lot of readers will email me and sort of say, like, well, there's this, and you don't talk as much about the environmental impacts as you should.
Speaker 2 That's, like, one line of criticism, which we don't need to get into now. I agree,
Speaker 2 and I want to do more on that. And then the other is
Speaker 2 that we're not talking enough about the potential privacy and surveillance implications of chatbots.
Speaker 2 And to be honest, one of the reasons that I feel like I haven't done that much work on that is because my mind has not expanded large enough to understand the ways in which this data is so sensitive.
Speaker 2 You know, there's the obvious things like sometimes it leaks its training data. Sometimes, you know, there's been multiple instances of chats being leaked, things like that.
Speaker 2 But I honestly feel like the evil-o-meter has not been ratcheted to 10 yet in terms of what is possible. And we've started to see a little bit of this with
Speaker 2 ChatGPT will call the cops on you if you start showing suicidal ideation or something like this. And it's like, rather than fix things, they're just like, oh, we're just going to call the cops.
Speaker 2 Or rather than pull it out, there's that aspect. But I'm curious, I feel like you have definitely thought about this more than me.
Speaker 2 What are some of the privacy concerns, surveillance concerns of
Speaker 2 like chatbots and our
Speaker 2 relationships with AI. I hate when people say "our," but people are talking to these things and telling them a lot about themselves, and it does feel like a pretty big privacy risk. But I don't even fully understand the implications of it.
Speaker 1 Yeah, I mean, I think some of it,
Speaker 1 and I've been working and thinking a lot about this for the book, and I don't even know how much this works, how much it's going to work, you know, things like that.
Speaker 1 But the big push towards agentic, you know, in quotation marks, AI.
Speaker 1 The thing is that,
Speaker 1 in terms of making AI an operating system or giving it access to everything you do, like your calendar, your contacts, your emails, your doctor's notes, all these things,
Speaker 1 in order for these systems to work,
Speaker 1 not even on a sort of technical scale, but just in order for it to do the things companies claim that it will do, it will need kind of unprecedented access, and kind of unsiloed access, to your entire life.
Speaker 1 And, you know, because privacy is relational, not only yours, but everybody you know.
Speaker 1 And
Speaker 1 it would need to link all of those things together, you know?
Speaker 1 So if I wake up in the morning and, say, you know, check my chatbot to see what my schedule is, and ask it to schedule an appointment with my doctor, and write a love note to my partner, and tell my kid happy birthday, like, all these things, right? That kind of access is, like, unprecedented.
Speaker 1 And the other thing is, like,
Speaker 1 I think when the companies are saying chatbots can be your coach, can be your partner,
Speaker 1 really,
Speaker 1 I don't know what their model for making money is.
Speaker 1 Like, I honestly don't, right? Except to get people to pay for this,
Speaker 1 you know. But I mean, they've suggested things like putting ads in it.
Speaker 1 And so,
Speaker 1 how will we know, for instance,
Speaker 1 what is going to be the level of influence that these things will have in terms of, like, nudging people in certain ways and things like that?
Speaker 1 And I think that thing that you pointed out, that, like, you know, ChatGPT will call the cops on you, right?
Speaker 1 I think that's just sort of a small glimpse into it, because I understand these companies as being very closely aligned with an authoritarian government.
Speaker 1 Not only in terms of kind of what
Speaker 1 kinds of ideologies these chatbots are allowed to have, right? But in the ways that it will report on you if you don't have a matching ideology. I mean, we're seeing it in...
Speaker 1 we're seeing that push sort of in universities right now. We're seeing it in the president's
Speaker 1 edict against woke AI and things like that.
Speaker 1 And so those are, like, a couple examples. But I think of, like, the information silos, for instance, and things like that. They exist for a reason.
Speaker 1 Linking all these things together with a company that's also inclined
Speaker 1 or allied with a particular ideology and is willing to function as sort of a carceral system, I think is really dangerous.
Speaker 1 And I don't think people have like even begun to kind of unravel that. Like I don't, like some people, but I don't think a lot of people.
Speaker 1 Like people are pouring their hearts into these systems and they don't know where that information is going.
Speaker 2 Right. That's, like, where my brain goes, is the advertising bit.
Speaker 2 I mean, also all the things you said right now, like,
Speaker 2 you made many more connections than I had previously.
Speaker 2 But for me, it's like, these companies are not making money. In fact, they are burning money.
Speaker 2 And one of the most reliable ways to make money online is with targeted advertising.
Speaker 2 And it's like, you know, you see Facebook and Google spending just
Speaker 2 an insane amount of money on data centers and things like that.
Speaker 2 And they're not having to take on funding in the same way that OpenAI has needed to take like tons and tons of funding because they're losing money.
Speaker 2 But like Facebook and Google are propped up by their ad ecosystem and they're just printing money with ads.
Speaker 2 And it's so scary to think about ChatGPT starting to nudge you towards certain products when you're talking about a fight you had with your partner or something.
Speaker 2 It's like, well, have you tried roses from this place? I bought them for you already. The possibilities there are very scary to me.
Speaker 1 And what would that mean for politics, for instance, right? I mean, like, think about the way that we're beset by ads during election seasons.
Speaker 1 And when people are walking around with ChatGPT in their ear all the time, through their glasses or their headphones or some other wearable, what would a targeted political campaign look like through ChatGPT? It's so nightmarish.
Speaker 2 If you're a security or IT professional, you've got a mountain of assets to protect, devices, identities, and applications.
Speaker 2 It's a lot, and it can create a mountain of security risks. Fortunately, you can conquer that mountain of security risks with 1Password Extended Access Management. Over half of IT pros say securing SaaS apps is their biggest challenge, and with the growing problems of SaaS sprawl and shadow IT, it's not hard to see why. Thankfully, Trelica by 1Password can discover and secure access to all your apps, managed or not. Trelica inventories every app your team uses and uses pre-populated app profiles to assess risks, manage access, optimize spend, and enforce security best practices.
Speaker 2 You can handle shadow IT, securely onboard and offboard employees, and meet compliance goals, all in one place. Over the years, I've come to trust 1Password.
Speaker 2 I use it for all my accounts, which actually wasn't always the case.
Speaker 2 I used to use a different password manager, which I was constantly having problems with, but that took me forever to switch away from due to pure inertia.
Speaker 2 When I finally made the switch to 1Password, I couldn't believe I hadn't done it sooner. I was also shocked by how easy it was.
Speaker 2 Migrating my passwords and getting myself back up and running took only a few minutes. 1Password is trusted by millions of users and over 150,000 businesses, from IBM to Slack.
Speaker 2 It's ISO 27001 certified, regularly audited, and even has the industry's largest bug bounty.
Speaker 2 Take the first step to better security for your team by securing credentials and protecting every application, even unmanaged shadow IT. Learn more at 1password.com slash 404.
Speaker 2 That's 1password.com slash 404, all lowercase. The holidays come fast.
Speaker 2 You're decorating, shopping, and trying to keep up, and somewhere in the middle of it, you realize those blinds could really use an upgrade. But who has time to deal with all that?
Speaker 2 With Blinds.com, you don't have to. Blinds.com is a site that helps you bring your home to life, your way, and you can refresh your space online easily and affordably before the holidays are in full swing.
Speaker 2 Whether you want to DIY, have it all done for you, or you're somewhere in between, choose what works for you and rest easy knowing the fit is guaranteed.
Speaker 2 There are hundreds of options, textures, colors, patterns, so you can build your perfect view without ever leaving the couch.
Speaker 2 And at Blinds.com, there are no pushy salespeople, no awkward in-home consultations. They can even send a licensed pro to measure your windows at no cost and install everything for one low price.
Speaker 2 No hidden fees, no surprises. Just beautiful blinds guaranteed to fit, whether you want blackout or motorized options or classic shutters.
Speaker 2 So, whether you're DIYing it or want the white glove treatment, Blinds.com has you covered, because the only thing they treat better than windows is you.
Speaker 2 The Black Friday deals are going strong all month long, so don't miss your chance to save big. And as always, our listeners get $50 off when you spend $500 or more. Just use code 404Media at checkout.
Speaker 2 Limited time offer, rules, and restrictions apply. See blinds.com for details.
Speaker 2 I want to pivot back to cheerier topics, such as Flock and Ring.
Speaker 2 I feel like they're close and I feel like you have had some of the best perspective on how to think about these companies.
Speaker 2 And we've written so much about Flock. I've been very, very interested in it as a company, primarily in how it has spread so quickly and become such an omnipresent thing, a go-to for cops. The first thing they type in when they're investigating a crime is a license plate, et cetera.
Speaker 2 And this company began by selling automated license plate readers to private citizens, more or less. They were selling them to homeowners associations.
Speaker 2 And that is an example of rich people opting others into surveillance for their own safety. We've seen the same with Ring and its associated, I hesitate to call it a social network, but Neighbors, the app that comes with Ring. This technology is disproportionately used against Black and brown people, against undocumented immigrants, against delivery people, working class people who are just doing their jobs, and who some rich person thinks is suspicious for whatever reason.
Speaker 2 And I'm wondering if you can talk a little bit about that phenomenon, because it has repeated itself over and over again: the toys of the rich are used against people who are disproportionately criminalized by the cops, disproportionately marginalized by society, et cetera.
Speaker 2 I know I just talked a lot there, but I feel like you have a better perspective on the history of it all, and how this technology is used against Black and brown people, delivery workers, and working class people.
Speaker 1 Yeah, I mean, there's so much there.
Speaker 1 The first thing I would say about Flock is, I'm strangely kind of obsessed with these guys' origin stories. This guy Langley, who's the CEO of Flock, claims he started it because his car got broken into and the person wasn't caught, or something like that. But their tagline is that they plan to eliminate crime.
Speaker 1 And I think, I mean, not white-collar crime, of course, right? Not wage theft.
Speaker 1 But on the statistics, not only with Flock but with Ring, there have been various deep dives over the years. Cyrus Farivar did a really interesting one. Alfred Ng has talked about it several times. Caroline Haskins has done great work on it.
Speaker 1 The effect these devices and technologies have on decreasing crime is widely overstated by the companies.
Speaker 1 But it does prey on people's fears.
Speaker 1 That, in conjunction with Neighbors, or Nextdoor. One of the functions of these devices is to help exacerbate people's fears about who does and doesn't belong in their neighborhoods, to police workers and delivery drivers.
Speaker 1 I think, again, that's going to be exacerbated by this move by Amazon to have workers wear smart glasses, and the move towards having retail workers and other workers wear body cams now.
Speaker 1 So they're all linked in these ways, but a big part of it is to impose a system of watching on people who are service workers, people who certain people don't think belong in their neighborhood, and things like that.
Speaker 1 And you asked before whether it would be possible to have these things in a way that wasn't so destructive to society. I think security cams are an example. There are times when people would want security cameras, or might need them.
Speaker 1 But look at the mechanism and the landscape now. Ring and Flock are forming a partnership, and people can have their Ring footage, or their video doorbell footage, not just Ring, sent to fusion centers, so that cops can have constant access or on-demand access to these systems.
Speaker 1 I'm kind of rambling, but these things are all connected, right? You all just reported on the ruling in Washington about Flock cameras.
Speaker 2 Yeah, Flock camera footage, or, well, the snapshot images taken from the cameras, are public records in Washington. Sort of. And anyone can request them.
Speaker 2 I'm kind of a transparency maximalist when it comes to the systems being used by governments. So I want to know what they are, but I can also see how this is quite, quite concerning.
Speaker 1 Yeah, the implications for this are wild. I'm not an attorney, so I don't know how that will hold up, or if it will be used as precedent or anything like that. But the implications are absolutely wild.
Speaker 2 Yeah. I knew when I had you on, there's not one thing I want to talk to you about, because it feels like all of this stuff is very connected.
Speaker 2 It's the same thing happening like over and over and over again. And then these different systems are like, they're all hooking together.
Speaker 2 And I think that's like the big thing for me where it's like, we talk about data silos, where it's like information used for one purpose should not be used for another purpose.
Speaker 2 And that's how you would like, say, government information to work. Your tax information should not be used to arrest you; no one should be able to find your address just because they asked the IRS, or whatever.
Speaker 2 And, like, all it feels like all of those walls in society are breaking down, like, all at the same time.
Speaker 2 And I don't know what we're supposed to do about it.
Speaker 1 Yeah.
Speaker 2 Please, please fix it, Chris. Please fix it.
Speaker 1 Yeah. I mean, people are actively breaking them down, and companies are too.
Speaker 1 A thing I think is really important is the supposed positive effects of a lot of these things. It's generally true that because companies have a thing to sell, they're going to wildly overstate the positive effects. Whether that's how good an Apple Watch is for your health, or how much a doorbell camera increases your safety, generally the answer is kind of not much. It's a little more complicated than that, but the answer is basically not much. And these companies have done such a great job of convincing everyone, not only that these things are going to greatly help them, but of occluding the harmful effects to society.
Speaker 1 I mean, it's so corrosive now that if someone has a pair of thick-framed glasses, and you're in a locker room or you're having a private conversation or whatever, you have to wonder, is this person filming me?
Speaker 1 You always have to wonder, are you being recorded? People can't put pictures of their kids' faces on the internet anymore, because it's being scraped and winds up contributing to the production of CSAM.
Speaker 1 I keep using these words over and over, but all of these things are so harmful and toxic and corrosive to society. But here's the thing, and I'm not an optimist: I think there's more and more awareness of this. In a lot of ways, these companies have begun to overplay their hands, not only in their explicit political alliances, but in what they sell. The way I've put it before is that they used to sell you something and say you paid for it with surveillance.
Speaker 1 And now they're just selling you surveillance. That's particularly true of wearables, but it holds for a lot of things. With so-called agentic AI, it's, look, if we can ingest everything, we'll help you optimize your life. That's the pitch now, as opposed to, well, you get to be connected with your friends overseas and we'll show you a few ads. That's a major change.
Speaker 2 And I think that's a really good point. Sorry, sorry.
Speaker 1 No, no, no.
Speaker 1 I think they're overplaying their hand a bit.
Speaker 2 Yeah.
Speaker 2 You mentioned not putting your kids' faces on the internet. We're talking audio only; this will be audio only when it's online.
Speaker 2 And you've made the conscious decision to try to keep your face off the internet as much as possible. Given this conversation, the reasons for that are obvious. But I'm curious if you want to talk a little bit about not just why that is, but whether it has been difficult to do. Or is it just, no, I'm just not posting pictures of myself on social media?
Speaker 1 I mean, it's not difficult for me. There are pictures of me on the internet; I didn't put them there, is the way I'll put it.
Speaker 1 You know, I try whenever whenever possible to not feed the machine.
Speaker 1 You know, and I know like my state sells pictures of my driver's license to the feds and things like that. There are ways we cannot avoid that.
Speaker 1 But I really started thinking about this like during the pandemic and when everyone was zooming.
Speaker 1 And, you know, it became the norm to sort of critique people's backgrounds and the books in their background. And everyone was supposed to get a ring light and things like that.
Speaker 1 And, you know, ed tech companies were doing room scans to make sure, in quotation marks, that students weren't cheating. There was this easy normalization of people's image, and all the ways it has been used to harm them.
Speaker 1 To bring it full circle again, all the way up to and including deepfakes, and other kinds of sexual assault imagery that can be made now with some of these systems. And so, yeah, I do my best, whenever possible, to not feed the machine.
Speaker 1 It hasn't been super hard once I explained it. Initially, people would give me a hard time, because I've done some podcasts, and for the Washington Post profile I had to fight pretty hard to not have my face in it. I think there's a different standard for famous and non-famous people in terms of how much you're allowed to control your image.
Speaker 2 I'm very much, I think you're famous.
Speaker 1 But yeah, part of the reason I do it is that I have gained enough credibility, at least, to say I don't want to do this, and people will still have me on their show or whatever. And so I want to create a space where other people could do that too.
Speaker 2 Yeah. This idea of not feeding the machine, that's what I take into my day-to-day life outside of journalism. I try to buy stuff from small businesses rather than Amazon, rather than Walmart.
Speaker 2 I stopped using Amazon when we were reporting on their labor abuses, like eight years ago, and I didn't use them for like five years. And I didn't find it very difficult to stop.
Speaker 2 And I feel like once you start cutting things out,
Speaker 2 you realize, as you said, like they don't necessarily add that much to your life. And like, maybe that is not going to stop these companies.
Speaker 2 Maybe it's not some huge act of resistance to use my feet to walk to the store, or to bike or drive to the store, rather than click around, but it made me feel better at least.
Speaker 2 And I guess, you know, I have opted into tons of tech companies and stuff. I'm recording this on Riverside, which is some AI podcast company that makes this stuff easier. And it's not that I gave up; it's just that I've started to pick and choose which services I'll use and which I won't.
Speaker 2 And I'm wondering what you think. Is that good? Should people be doing this? Does it matter? I feel like it matters, but yeah.
Speaker 1 So I think it matters, but not in the exact way people often think about it. The way I try to parse it is: Amazon, the company, is not affected if I don't use Amazon. And I don't use Amazon. They're not affected.
Speaker 1 But because of how VC money works, and how monopolies work, and things like that, this idea that as consumers we're going to make some small choice that forces these companies to change, I don't think it works like that anymore. And there are a lot of reasons for that that I don't have time to go into.
Speaker 1 Here's how I think about it, though: it acclimates us into a mode of thinking, and a form of community, that gives us access to larger movements and actions that will actually move the needle.
Speaker 1 The example I've used forever: I live in Michigan, and there are oil pipelines running underneath the Great Lakes. Those companies are way bigger polluters than I'll ever be. But I still don't throw my garbage out the window. I still recycle. I understand that me doing these things as an individual matters, not in the sense that I'm changing the world by doing or not doing them, but by activating me, my consciousness, toward larger actions that might mount some kind of meaningful resistance. I mean, I don't really know people who use ChatGPT or other chatbots.
Speaker 1 Most people I know don't have Ring doorbells and things like that.
Speaker 1 I don't think that's incidental, or accidental. I think I'm part of a community that understands a lot of these things as harmful to society.
Speaker 2 Right. I think the last thing I want to talk about is how we fight back, or raise awareness, that sort of thing. You said you're not an optimist. I'm not really an optimist either.
Speaker 2 However, I have seen, when we write about these technologies, there is always someone in a town or city somewhere saying, I brought this up at our city council meeting and we are reconsidering our contract with Flock, or we are reconsidering whether we want to buy a Ring doorbell, that sort of thing.
Speaker 2 And I do think that people are beginning to get it. It's not just a niche concern anymore.
It's not like a niche concern of privacy activists and journalists.
Speaker 2 People I'm randomly meeting seem to understand that this is an overreach, that these companies are overplaying their hand. And yes, they're continuing to expand, but they're also losing contracts.
Speaker 2 These surveillance companies are losing contracts at the state and local level all the time for privacy overreaches and that sort of thing. And I guess that gives me hope.
Speaker 2 Is that the best we can hope for at this moment? Piecemeal, local-level resistance that hopefully bubbles up at some point into a broader, better system? Or how do you see our way out of this?
Speaker 1 I mean, I think what you all do. You all are crushing it.
Speaker 1 Something you said that I want to latch onto is that people don't know, right? I spend all of my time thinking and writing and reading about this stuff. You're a journalist, obviously, and your organization is doing scoop after scoop. But the average person doesn't know a lot of this stuff. They're not deeply embedded. They don't know about Flock. They don't know the extent to which AI slop is being monetized on Facebook. All these things.
Speaker 1 Like, I think that that awareness is crucial because
Speaker 1 I think there are so many people who don't know that these systems, like the extent to the harm of the harm of these systems, right? Like, I think a lot about
Speaker 1 the story that these companies tell about what their privacy invasions are. You know, like Facebook's like, well, you know, there's a little bit of surveillance and you get like meaningful ads.
Speaker 1 And like, meanwhile, there's like pixels of, you know, Facebook, like when you are making an appointment with your therapist or like, you know, whatever.
Speaker 1 So I think that is really important.
Speaker 1 I think it's kind of a grim moment right now
Speaker 1 in terms of a lot of this stuff.
Speaker 1 But, you know, this is a point I've made before: I think it's now pretty explicit where these companies stand, and what kinds of things they're willing to do. Erase trans people. Target immigrants. Get rid of apps on their platforms that help keep people safe. All these things. They've been very explicit about who they are.
Speaker 2 It does feel like a mask-off moment society-wide, one we didn't really have after 2016, when a lot of these tech companies were like, no, we care about this stuff.
Speaker 1 Kind of, kind of.
Speaker 1 Right. And I think people will remember this. When the wind changes, they won't be able to come back and say, oh, actually we do care about equity, or actually we are pro-immigrant. We know who you are now.
Speaker 1 For people who have the advantage of always paying attention to these companies, yeah, they're capitalist companies, and they change their message to align with power. But it's pretty extreme right now. There's no pretending now who or what they are, and I think that's really important in terms of how we move forward.
Speaker 2 Chris, where can where should people find you?
Speaker 1 I mean, I kind of am on Bluesky, as Hypervisible. My book, Luxury Surveillance, is going to come out sooner or later from MIT Press. I'm still doing some writing here and there. And that's mainly it.
Speaker 2 As always, thank you so much for taking the time to talk to me. I feel like we could talk for like three hours, but
Speaker 2 Joseph needs to record a podcast in a few minutes.
Speaker 2 But I really appreciate the time. We'll have you back on soon.
Speaker 2 And thank you so much for all the work you do, and for always making time for us, to help us contextualize some of these horrors so that we can spread the word and eventually fight back. We are fighting back.
Speaker 1 And I'm a huge fan of your team, of you, and, you know, Samantha and Emanuel and Joseph. Again, you all are crushing it. I don't get to talk to you that much, so I just want to point out that the stuff you do is so valuable to the rest of us.
Speaker 2 We do leave the compliments in the podcast. No, I really appreciate it. It means so, so much. We have so much respect for your work. That's very nice, thank you.
Speaker 1 I'm not paid to say that, by the way. It's very genuine.
Speaker 2 As a reminder, 404 Media is journalist founded, journalist owned, and supported by our subscribers. If you want to subscribe to 404 Media and directly support our work, you can go to 404media.co.
Speaker 2 You'll get unlimited access to our articles and an ad-free version of this podcast.
Speaker 2 You'll also get to listen to the subscribers-only section of every episode, where we talk about a bonus story each week. This podcast is made in partnership with Kaleidoscope.
Speaker 2 Another way to support us is by leaving a five-star rating and review for the podcast, which really helps us out. This has been 404 Media. We will see you next time.