Decoder with Nilay Patel: What's Next for the Controversial 'Child Safety' Internet Bill

There’s a major internet speech regulation currently making its way through Congress, and it has a really good chance of becoming law. It’s called KOSPA: the Kids Online Safety and Privacy Act, which passed in the Senate with overwhelming bipartisan support late last month. At a high level, KOSPA could radically change how tech platforms handle speech in an effort to try and make the internet safer for minors.
Nilay Patel talks with Verge senior policy reporter Lauren Feiner, who’s been covering these bills for months now, to explain what’s happening, what these bills actually do, and what the path forward for this legislation looks like.
Listen to more from Decoder from Nilay Patel here.
Learn more about your ad choices. Visit podcastchoices.com/adchoices


Transcript

Support for the show comes from Saks Fifth Avenue.

Saks Fifth Avenue makes it easy to shop for your personal style.

Fall is here, and you can invest in some new arrivals that you'll want to wear again and again, like a relaxed blazer and Gucci loafers, which can take you from work to the weekend.

Shopping from Saks feels totally customized, from the in-store stylist to a visit to Saks.com, where they can show you things that fit your style and taste.

They'll even let you know when arrivals from your favorite designers are in, or when that Brunello Cucinelli sweater you've been eyeing is back in stock.

So, if you're like me and you need shopping to be personalized and easy, head to Saks Fifth Avenue for the best fall arrivals and style inspiration.

Commercial payments at Fifth Third Bank are experienced and reliable, but they're also constantly innovating.

It might seem contradictory, but Fifth Third does just that.

They handle over $17 trillion in payments smoothly and effectively every year.

And they were also named one of America's most innovative companies by Fortune magazine.

After all, that's what commercial payments are all about.

Steady, reliable expertise that keeps money flowing in and out like clockwork.

But commercial payments are also about building new and disruptive solutions.

So Fifth Third does that too.

That's commercial payments, a Fifth Third better.

Hi, it's Kara.

We're off for the holiday today, but we have an episode for you from Decoder with Nilay Patel.

In this episode, Nilay talks to Verge senior policy reporter Lauren Feiner to break down the Kids Online Safety and Privacy Act, which is controversial and which recently passed in the Senate.

Enjoy and we'll see you Friday.

Hello and welcome to Decoder.

I'm Nilay Patel, editor-in-chief of The Verge, and Decoder is my show about big ideas and other problems.

We've talked a lot on this show about the various attempts to regulate the internet in the United States and how almost all of them run into the very simple fact that the internet is mostly made up of speech.

The First Amendment prohibits most speech regulations in this country.

Literally, it says Congress shall make no law abridging the freedom of speech.

That's why we don't have a lot of laws about the internet.

But there's a major internet speech regulation currently making its way through Congress right now, and it has a really good chance of becoming law.

It's called KOSPA, the Kids Online Safety and Privacy Act, which passed in the Senate late last month with overwhelming bipartisan support.

You've probably heard of KOSPA's predecessor, KOSA, the Kids Online Safety Act.

We've been talking about KOSA for a while now.

And in fact, we discussed it here on Decoder earlier this year with Senator Brian Schatz, the Democratic senator from Hawaii who was one of KOSA's co-sponsors.

That got bundled up with another bill called COPPA 2.0, an update to the Children's Online Privacy Protection Act, and that's how you get KOSPA.

At a broad level, KOSPA is supposed to accomplish two big goals: better protecting the privacy of minors online, and making tech platforms more responsible for what those minors see and do.

The first part, COPPA 2.0, is basically a spec bump.

The first COPPA, passed in 1998, made it so websites and now social media apps can't knowingly have users under the age of 13 on their platforms without their parents' consent.

Of course, that hasn't stopped any kids from using any of these things, and there's been a host of research and experiences with kids on the internet since.

So COPPA 2.0 bumps the age limit to 17 and bans things like showing targeted ads to minors.

This feels relatively straightforward.

It's the second part, the KOSA part, that's been controversial for some time, and it remains controversial even as the bill gathers momentum.

KOSA creates what's called a duty of care for platforms like Meta, Google, TikTok, and others, effectively making them liable for showing harmful content to kids.

That's a speech regulation, through and through.

It dictates what the platforms can show and what their users can see.

And like every speech regulation, that means KOSPA has to get over First Amendment objections.

KOSPA certainly has opponents who are making First Amendment arguments.

But there's a strong argument on the other side that the government's interest in protecting children is enough to overcome those problems, and that the political power of parents worried about the effects of the internet on their children will push KOSPA through.

But KOSPA is far from a done deal.

It hasn't passed the House of Representatives, which is now in recess until September, and House leadership has indicated they may not even consider the bill in its current form.

On top of that, once the bill passes the Senate and the House, it ends up on the president's desk, and we're in the middle of a very contentious presidential election.

So there's a lot to talk about and a lot that might change.

To help break it all down, I'm talking to Verge senior policy reporter Lauren Feiner, who's been covering these bills for months now.

She's going to help explain what's going on, what these bills actually do, and what the path forward for KOSPA looks like.

Okay, KOSPA and the child safety debate on the internet.

Here we go.

Lauren Feiner, welcome to Decoder.

Thanks for having me.

I would say I'm excited to talk to you, but it's about the First Amendment and regulating speech on the internet.

So always dicey, but I think we're going to have fun.

My plan is to have fun.

Let's start with the bills in Congress right now.

There's one called KOSA and there's one called KOSPA.

And they are meant, I think somewhat sincerely, to improve the lives of children on the internet.

What are KOSA and KOSPA?

KOSA is the Kids Online Safety Act.

And that's a bill that was introduced by Senators Richard Blumenthal and Marsha Blackburn.

It's a bipartisan bill that's received an overwhelming amount of support in the Senate.

It's basically a bill that seeks to create a duty of care for online platforms that kids use so that basically the platforms have to be responsible for making their products safe for kids.

It's meant to safeguard them from things like eating disorder content or other mental health harms that they could come across on the internet.

So, KOSPA is basically a mashup of KOSA and this other kid safety bill called the Children's Online Privacy Protection Act, COPPA 2.0.

They actually added teens into the title this time.

The original bill was meant to protect kids under 13 years of age.

This one is under 17.

So it includes a greater group of kids in this bill.

And it also does some additional things adding on to the existing protections for kids online, like banning targeted advertising to that group.

There have been a lot of attempts to save the children online.

What about these two bills made them the ones that got mashed up and advanced?

KOSA was really taking a different approach than we've seen before and was trying to do something really overarching.

And I think that was really attractive to lawmakers, who see something like protecting kids on the internet as a really politically good issue.

So something interesting KOSA did was just create this duty of care for these platforms.

So, you know, that's something that even if the platforms change the way that they do business in the next few years or they change something fundamental about their platform, there's still this underlying responsibility, which I think is really attractive to lawmakers who want some sort of lasting way to protect kids on the internet.

And COPPA 2.0, I think, is, you know, a little bit more of an update bill, but something that is maybe easy to get through because everyone wants to protect kids on the internet, wants to protect their privacy.

And I think there's been kind of this updated understanding among lawmakers that maybe we do need to protect a greater swath of kids online than just under 13.

There have been a lot of organizations advocating for this bill.

We should talk about them for a second.

There are a lot of organizations advocating against it.

Who are they and what are their specific concerns?

The people advocating against this bill are particularly concerned about KOSA and the way that it might impact free expression on the internet more broadly.

And this is not just for kids, it's for adults as well.

But there's also concerns about how kids may or may not have access to potentially really helpful information on the internet.

These are organizations like the Electronic Frontier Foundation, the ACLU, Fight for the Future.

And these groups are concerned on the one hand that kids maybe won't be able to access important resources, especially if they're from marginalized groups, you know, trans youth or kids who are in difficult home situations, being able to access potentially life-saving or affirming information on the internet if platforms decide to take a really broad, censorious approach to taking content off of their platforms to avoid liability.

There's also kind of this greater fear that, you know, the bill doesn't necessarily mandate age verification, but if platforms feel like they have to in some way have a better understanding of how old the people are using their platforms, they might have to implement some ways to do that more that could be privacy invasive or maybe just kind of cleanse the kinds of content that they have on their platforms in general so that it can't be seen as unsafe for kids.

It feels like that argument boils down to if you force the platforms to make the internet safe for kids, they will just make it safe for kids and all the things that adults do will go away.

That's definitely a big fear here: that the internet is just going to become kind of this place that is free of these more controversial discussions or more controversial topics that maybe do need to be aired in some kind of capacity to be hashed out.

But is that forum going to be the internet moving forward?

If it is not the internet, we would just go back to what, like the 1980s?

It feels like you could still make platforms in the internet that don't have kids on them to talk about these things.

You would just have to do age verification, right?

You would have to guarantee that kids couldn't access them.

I think it's unclear.

We'd have to see how it plays out in the courts, whether you have to really guarantee that kids aren't accessing them, or you have to be pretty certain that it's just adults using them.

I think that kind of remains to be seen exactly how that might work.

But I think that is part of the fear here that it's, you know, we're creating an age-gated internet.

And then that fear is basically a First Amendment fear, right?

We're going to prevent everyone from speaking because we don't want them to speak to children.

Right.

Because if you take something like age verification, it's not just that you have kids doing age verification to make sure they're not getting onto sites they shouldn't be.

You're having everyone do age verification because how could a website know you're not a kid until you do that?

Are there any mechanisms to do age verification that work?

From what I've seen, there aren't really any completely foolproof ones.

I think there's some that are better than others, and the ones that are better tend to be a little bit more invasive than maybe people are used to at this point in time.

And, you know, I think maybe it's something where we see what the standard is that people are okay with shifting in the future.

You know, are you okay with having your face scanned to guess your age if that's something that happens on device and is deleted?

Or are you okay with a digital ID that is issued by the government or some central entity and only gives away up-or-down information about how old you are based on the birth date on that ID?

There are solutions out there, but they could be considered more privacy invasive than people are used to.

And so I think we're going to see what are people okay with in the kind of reality where these laws exist.

And then the other argument is there are a group of children who might be exploring their gender or their sexuality in some way or some other topic that society says is only for adults, and they won't have any access and they will be blocked off from their communities.

That's a much squishier one, right, than the straightforward, if you make the internet for children, you will be chilling the speech of adults as well.

It seems like there's a lot of conversation about what kinds of access children should have to these communities.

And there's at least some people who say that access should be protected because it helps those children grow into themselves.

I think a lot of people would say there probably is some content that we wouldn't really want kids to access at a certain age.

Think about maybe at the most extreme, like pornography.

I think most people would say they don't think kids should be accessing porn on the internet.

But you know, I think we're talking about something different here when it comes to resources that could be actively helpful for some kids.

And when we look at that, I think the fear here isn't that the bill says that this information needs to be taken down or kept away from kids, because it actually says that if a kid is searching out certain information, the platform is allowed to serve them that.

And it says that information that could actually mitigate some of the mental health harms that the bill contemplates, that's okay.

But I think the concern here is that the platforms might go a step further than the bill actually prescribes in order to avoid liability, because it could just be such a risk to have something that might be politically charged in this moment be served to a kid and then have, you know, a conservative AG or a conservative FTC come after them and say, this is something that we don't think is appropriate for kids.

We need to take a quick break.

We'll be right back.

Every day, millions of customers engage with AI agents like me.

We resolve queries fast.

We work 24-7 and we're helpful, knowledgeable, and empathetic.

We're built to be the voice of the brands we serve.

Sierra is the platform for building better, more human customer experiences with AI.

No hold music, no generic answers, no frustration.

Visit sierra.ai to learn more.

Support for Pivot comes from Gruns.

If you've ever done a deep internet dive trying to discover different nutrition solutions, you've likely had the thought, surely there's a way to improve my skin, gut health, immunity, brain fog without offending my taste buds.

Well, there is.

It's called Gruns.

Gruns are a convenient, comprehensive formula packed into a daily snack pack of gummies.

It's not a multivitamin, a greens gummy, or a prebiotic.

It's all of those things and then some for a fraction of the price.

In a Gruns daily snack pack, you get more than 20 vitamins and minerals, 6 grams of prebiotic fiber, plus more than 60 ingredients.

They include nutrient-dense and whole foods, all of which will help you out in different ways.

For example, Gruns has six times the gut health ingredients compared to the leading greens powders.

It contains biotin and niacinamide, which helps with thicker hair, nails, and skin health.

They also contain mushrooms, which can help with brain function.

And of course, you're probably familiar with vitamin C and how great it is for your immune system.

On top of it all, Gruns are vegan and free of dairy, nuts, and gluten.

Get up to 52% off when you go to gruns.co and use the code PIVOT.

That's G-R-U-N-S dot C-O, using the code PIVOT for 52% off.

Support for this show comes from IcyHot.

There's no match for that feeling you get from a good comeback, especially after being out of commission for a few days.

And let's be honest, you were mentally ready to get back on the grind on day two.

Well, with the right post-workout treatment, you actually can be ready to get back in the game.

All you need is a mixture of cooling and warming sensations to relieve the pain, and you'll be able to come back strong.

Icy Hot Nighttime Recovery relieves pain at nighttime while your body recovers and repairs.

And like regular Icy Hot, it's fast-acting with a cooling sensation to instantly ice out the pain, and then a warming sensation to keep it away.

Its crafted fragrance blend includes lavender and eucalyptus essential oils that'll help relax you for bedtime.

It's a no-mess, quick-dry formula that makes for a comfortable application, and it contains two times the pain-relieving ingredients of the leading competitor's nighttime no-mess product.

Tomorrow's comeback starts tonight.

Ice works fast, and heat makes it last, so you can wake up thinking, I'm so back.

Buy Icy Hot Nighttime Recovery No Mess Now.

We're back with Verge Senior Policy Reporter Lauren Feiner.

Before the break, we discussed why some critics have expressed concern about KOSPA, but there's major bipartisan support for the package, and that momentum is what helped it secure passage in the Senate.

So those are the big arguments against KOSPA, and it seems like those arguments haven't been really persuasive yet.

This bill is moving along.

What are the arguments in support of KOSPA, and who has really been making those arguments?

The arguments in support of KOSPA are really that this is a bill that can fundamentally shift the burden of protecting kids on the internet onto the platforms where kids are spending their time.

I think there's been this sense from a lot of parents that this is just too much to handle.

There's so many different platforms.

There's so much to keep track of, so many different threats on the internet that maybe kids in the past faced in a certain way, but the scale of this and the speed at which it moves is just so vast.

So I think parents have really been looking for ways to protect their kids, something that's more all-encompassing.

And I think that's what a lot of parents feel like they found in KOSPA.

And I think that's why you're seeing parents really leading the charge here.

You know, we've seen kind of this persistent group of parents who have, in many cases, lost their children to suicide after cyberbullying or other kinds of harms that they experienced on the internet, really coming out and being the lead voices for this bill and saying, you know, this is something that would have protected my child and that I wanted to protect someone else's child.

There are some organizations that had strong objections to KOSA that seem to have died down over time.

Who are they and what has been done to ameliorate their concerns?

There was a prominent set of LGBTQ+ organizations, groups like GLAAD and the Human Rights Campaign.

These groups used to oppose KOSA, but they've come out and said, we think after the most recent revision earlier this year, this bill doesn't really put the groups that we are meant to protect in such grave harm.

And they didn't go as far as to endorse the bill, but they said, you know, the risks have been mitigated enough that we won't stand in the way of it passing.

And I think that was really significant.

These concerns, you know, while maybe they do exist on a certain level, may be somewhat speculative, or maybe they wouldn't come to pass in the way that some of the groups still opposing the bill fear.

That said, I think no one really knows what will happen with these bills until they take effect and until they're challenged in the courts, because I think no platform is going to say, you know, here's exactly how we're going to comply until they see what the bounds are.

And was there something specific in a bill that was changed to reduce those concerns, or did they just get over it?

There were a few different things that were changed.

Things like limiting the ability of state attorneys general to enforce certain aspects of the bill, like the duty of care, which kind of ameliorated some fears that maybe a particularly conservative or politicized AG might use this against content for trans youth, for example.

And then there were things like specifying that this is really talking about like design features and not content.

I think those were things that just like gave these groups a little bit more of a sense of ease.

The Senate voted 91-3 in favor of KOSPA.

We're obviously waiting on what will happen in the House.

And it's an interesting vote because it doesn't feel partisan, right?

91, that's a lot.

And then the three who voted against it are all over the place.

It's Ron Wyden and Rand Paul and Mike Lee.

What's going on there?

I think the reasons that these three voted against the bill are somewhat varied, somewhat overlap, I guess you could say.

You know, Ron Wyden was concerned with, you know, the potential impact on privacy-enhancing technologies like encryption, he said, and anonymity.

And he also said he took seriously the concerns from the LGBTQ plus groups that are still concerned that the bill could be used to censor content that could help trans or gay youth.

On the other hand, I think that's not quite the concern for someone like Mike Lee, who worries that the bill is something of an overreach.

He worries about how the bill uses a definition of mental health or mental disorders that relies on a psychologist group that he says is politicized.

And then Rand Paul is, I think, making a free speech argument that, you know, who's to say what constitutes mental health harms?

And, you know, neither side should be happy with how this plays out because, you know, it could be politicized on either end.

So you've got the classic, I have some concerns about the law, all the way to we shouldn't have laws from Rand Paul.

Is there mostly just consensus from everyone else then?

I think so.

I mean, this bill had like 70 co-sponsors.

So that's a huge amount.

And that was even before it got this vote on the floor of the Senate.

So I think there's just this huge appetite to do something to protect kids on the internet.

And there's just been this consensus around this bill that really built consistently over maybe the past year.

It's a weird time in American political history, I would say.

It feels like an understatement, where in a furious sprint of a sort of reshuffled presidential campaign, it doesn't seem like anyone knows what's going on.

Is this going to make it to the House?

Is it going to get a vote in the House?

What happens next?

Now that it's been voted on in the Senate, it's really on the House to figure out what to do next.

And I think Speaker Mike Johnson has been open to it.

He's said in public statements that he's kind of vaguely interested in this bill and wants to take a look at it.

But I don't think we really know for sure yet.

I think the House is kind of a different beast than the Senate.

You have so many more voices.

You have, you know, a lot of maybe different coalitions that will make up the House that might have more varied feelings on this bill than we saw in the Senate.

But at the same time, when you see something get out of the Senate so overwhelmingly, I think you really have to take it seriously in the House.

If the House does vote to pass the bill, is President Biden going to sign it inside of this last little bit of his term?

Yeah, Biden is encouraging the House to take up this bill, and he said that he wants the House to send it to his desk and that he would sign it.

Do we know how a potential President Trump or a potential President Harris would respond to this bill?

I don't know that Trump has spoken specifically on this bill, but we did see Vice President Harris come out in support of KOSPA.

So that's the sort of procedural situation.

Here's the bill.

Here's where it's at.

There's a bigger question going on with all of this, which is that it feels like everyone thinks it's time to regulate the internet.

We've had several generations of people grow up in the free-floating chaos of the internet we have now.

The issue that has always come up is the internet is mostly speech.

The First Amendment is very protective of speech in this country, I think for good reason.

And there's not a lot of ways around it.

The two ways historically have been copyright law, which works to whatever extent.

And then it seems like now the answer is we're going to say we have to protect the kids, and that will overcome the strict scrutiny tests that courts apply when it comes to First Amendment issues.

Is there a compelling government interest and does this narrowly achieve that interest?

I'm assuming everyone in Congress thinks KOSPA narrowly achieves a compelling government interest here.

I think that's correct.

The Senate seems convinced the interest of America's children is compelling enough to overcome any potential First Amendment issues that this bill might face.

And, you know, obviously there's plenty of tech groups and you know, these other groups like the ACLU that seem to disagree with that.

But I think the Senate seems pretty convinced that it can overcome those challenges.

The other mechanisms for regulating speech on the internet don't really come up and face those challenges.

Copyright law, we've just decided that Disney's interests overcome the speech interests of a lot of people on the internet.

When it comes to other approaches, there's Section 230, which protects the platforms from the actions of their users.

And one way you can affect speech regulation on the Internet is by making companies like Facebook and Google and TikTok responsible for what their users do.

And that's what a Section 230 carve-out is, right?

You're saying Facebook is responsible for a user posting X, and that gets them to moderate X.

Is there a way to achieve the goals of KOSPA using those other mechanisms?

I'm assuming not copyright law, but using something like a Section 230 carve-out?

There would be interest from plenty of factions of Congress to use Section 230 in that way.

I think the problem we've seen with Section 230 reform is that it's really hard to get even the most strident advocates in favor of Section 230 reform to agree with each other on what that should look like.

So I think KOSPA kind of got around that issue by creating a new bill that could approach this in a different way, one that I think was able to rally the kind of political support needed to do that.

That said, yeah, I think in theory it's possible to add carve-outs to Section 230 for certain kinds of harms.

I just think politically that seems like a more difficult path.

Are there other approaches to regulating the internet that seem like they might survive First Amendment scrutiny?

Will someone think of the children?

That seems good.

Like they got that one.

This is actually the argument that Hawaii Senator Brian Schatz made when we had him on the show back in January.

Compelling government interest and all the rest of it, but it's also the public policy argument, which is like, can we please argue about everything else, but agree that an 11-year-old shouldn't have their brainstem melted?

Is there any other hook that might survive the courts?

Yeah, I think policies that really narrowly target certain design features or, you know, things that are not about the ordering of content or things that could be argued as editorial discretion of the platforms.

I think it might be possible to regulate those kinds of things.

I think, you know, with the Supreme Court weighing in on Texas and Florida's social media laws recently in the NetChoice cases, you know, that made clear that content moderation and curation is First Amendment protected speech.

That's at least what it seems like the majority of the justices believe.

So I think it's going to be hard to uphold laws that do seem to touch on those things.

But when we're talking about design features or, you know, maybe there's other ways to get at how platforms go about their business that doesn't touch on their ordering or surfacing of content, that might still be ripe for policymaking.

We need to take another quick break.

We'll be right back.

Support for this show is brought to you by CVS CareMark.

CVS CareMark plays an important role in the healthcare ecosystem and provides unmatched value to those they serve.

They do this by effectively managing costs and providing the right access and personalized support.

The care, empathy, and knowledge that CVS CareMark provides its customers is proven time and time again with their 94% customer satisfaction rating.

Go to cmk.co slash stories to learn how we help you provide the affordability, support and access your members need.

I'm William Goodge, a Vuori collaborator and professional ultra-runner from the UK.

I love to tackle endurance runs around the world, including a 55-day, 3,064-mile run across the US.

So I know a thing or two about performance wear.

When it comes to relaxing, I look for something ultra-versatile and comfy.

The Ponto Performance Jogger from Vuori is perfect for all of those things.

It's the comfiest jogger I've ever worn.

And the Dreamknit fabric is why I'll always reach for them over other joggers.

Check them out in the Dreamknit collection by going to vuori.com/william.

That's V-U-O-R-I dot com slash William, where new customers can receive 20% off their first order.

Plus, enjoy free shipping in the US on orders over $75 and free returns.

Exclusions apply.

Visit the website for full terms and conditions.

Thumbtack presents Project Paralysis.

I was cornered.

Sweat gathered above my furrowed brow, and my mind was racing.

I wondered who would be left standing when the droplets fell, me or the clogged sink.

Drain cleaner and pipe snake clenched in my weary fist.

I stepped toward the sink and then...

Wait, why am I stressing?

I have thumbtack.

I can easily search for a top-rated plumber in the Bay Area, read reviews, and compare prices, all on the app.

Thumbtack knows homes.

Download the app today.

We're back with Verge senior policy reporter Lauren Feiner to discuss how attempts to rein in some forms of harmful speech have run into similar issues when facing the First Amendment.

There are other big problems on the platforms right now, particularly as they relate to AI.

There's a lot of conversation and heat around the notion that we should regulate deep fakes in some way, particularly intimate deep fakes.

You have Google and Microsoft asking for the regulation.

You have some bills like the No Fakes Act introduced in Congress.

There are other bills.

How do those add up?

Are they using the same kind of mechanism?

The problem there is that a deepfake of Donald Trump, say, or Joe Biden or Kamala Harris is probably protected free speech.

So you can't just go regulate that speech.

But then you definitely don't want intimate deepfakes.

You definitely don't want teenagers being bullied by deepfakes at their school.

What are the approaches to regulating that kind of speech here?

Is it the same as KOSPA?

Are we going to find a First Amendment hook?

Are we trying a different approach?

Well, I think an interesting bill that we saw recently on the intimate deepfakes issue is the Intimate Privacy Protection Act, which basically would create a Section 230 carve-out for intimate AI deepfakes.

So essentially, that would mean platforms wouldn't be immune for hosting that sort of content.

And it's interesting that that bill actually does include the term duty of care, which is the same kind of term we saw in KOSA.

And, you know, maybe that signals that this is something that's popular in Congress right now.

Congress is really looking to take kind of a sweeping initiative on kids' safety.

They don't want something that's going to be knocked down the next time that a business decides to change how their platform operates.

So I think, you know, creating some kind of immunity carve out seems to be another potential way that lawmakers might look at attacking some of these issues.

And then it seems like the other approach is creating a larger intellectual property framework for people's likenesses.

That's mostly in state law right now.

And the idea that we would have a federal likeness law, so somebody clones my face, I could sue them.

This has already happened, by the way.

That's why I'm laughing.

But the idea that we should have a federal law and a federal framework, that seems to be the other approach to deepfakes in particular.

Yeah, I think so.

We're still in really early phases of any kinds of AI legislation.

And so I think we haven't yet seen all of the ways that that's going to play out.

But I think that likeness is definitely something that lawmakers seem interested in.

The reason I'm asking is it just seems like there's two ways of getting around the First Amendment.

One is creating these intellectual property regimes and the other one is saying you will be liable if you do a bad thing and we'll let the user sue you for doing this bad thing or we'll let people sue you for what your users do.

And KOSPA is right down the center of that.

And it's saying, actually, because the welfare of kids is so important, we can just do this directly.

And that seems unusual to me.

I think this gets to a fundamental question of the internet, one that comes up especially in conversations around Section 230, which is: who is responsible for the harmful content on the internet?

Whether that's people selling drugs online or AI deepfakes being shared on the internet.

Does it come down to the user who created and distributed that content or is it on the platforms themselves?

I think another question here is: what is the role of both parents and other technologies that could aid parents in protecting their kids from the existing content on the internet?

What is their role as well?

And that's a difficult thing to approach because no one wants to tell parents you have to do more.

And it is really hard to stay on top of all of these platforms.

But I think that is something that comes up in these conversations is, you know, whose responsibility is it to keep kids safe on the internet?

There are some famous carve-outs like FOSTA and SESTA, which said you can't have sex workers on your platform, basically.

Did that accomplish anything?

That was a really controversial carve-out and it also was a really popular one in Congress.

But there's been some evidence that that carve-out hasn't even been used that much in the courts.

And there's also been a lot of pushback from the sex worker community that believes that that carve-out actually made their work less safe because when they were using online platforms, it was easier to vet clients or know who might be more or less safe or communicate with other sex workers.

There is this idea that maybe a piece of legislation meant to protect one group could cause real harm to another group.

So I think it's really tricky with all of these issues.

I think everyone's heart is in the right place here, but it's a matter of what really will work in reality.

FOSTA is an example of an unintended consequence.

We had this great intention, we passed a law, and it turns out it didn't work and actually might have made a problem significantly worse.

There are some similar potential consequences of this bill, and maybe they're actually intended consequences.

Marsha Blackburn, who's one of the lead sponsors of KOSA, is out there saying we just don't want our kids exposed to trans material, and this bill will help do that.

Is that being taken into account by other people who are supporting the bill and pushing it forward?

There was a comment that she made kind of early on during the KOSA advocacy, and that was before a lot of the more recent changes to the bill.

So I think it's important to add that context there.

But certainly, people who oppose this bill, and who think that it'll be used by politicized enforcers to go after trans youth content or just to scare platforms into not hosting that kind of content, point to comments like that to say, you know, this is really the intent of the bill.

Now, I think the authors of that bill would deny that that is the objective of it.

But, you know, I think it is something that you have to consider when we're in a politicized environment.

So that's the whole bill.

Let's talk about what happens now.

So if the House doesn't take up the bill before the end of the year, will it come up again in the next Congress?

It's hard to say.

You know, I think in Congress, you have to kind of work off of momentum.

And this bill has a ton of momentum right now.

And, you know, we saw that that helped kind of propel it out of the Senate in this huge way.

If it doesn't pass out of the House this year, could it come back?

Yeah, I think it could.

It has a ton of supporters and it has, you know, a really passionate base of parent advocates who I don't think will just put down the fight if it doesn't go through this year.

But that said, you know, Congress has a lot of different priorities.

And will this rise to the top once again if it failed to go through the first time?

It's hard to say, but I'd say the possibility still remains.

Both sides seem interested in regulating what happens on the internet.

I would say the conservatives are much more interested in directly regulating the platforms in various ways.

If Trump wins or there are more conservatives in the House or Senate, do you think it's more likely or less likely?

I think this is an issue that really falls outside of the left-right binary.

I don't know if it's something that we would see more of or less of under a conservative or liberal administration.

I think maybe there would be different approaches.

Maybe the kinds of policies that surface change a bit.

But something like KOSPA has received such a huge array of support that I don't know that it's necessarily going to change under one administration or the other, but maybe we'll see tweaks in different ways.

Honestly, that makes it seem like the most unusual legislation we've had in years.

Definitely.

I mean, it's not normal to have a bill receive 91 votes in favor of it in the Senate, especially something that does come with some controversy, some groups that are really strongly standing against it.

But at the same time, I think you're seeing a ton of support outside of Congress for it as well.

KOSPA has transcended the chaos of Congress.

Do we think it can transcend the chaos of our judicial system?

Someone's going to sue, right?

It seems inevitable.

How do you think the courts are going to handle it?

Yeah, I would say it seems quite likely that someone would sue to block this law.

And I think it's not yet entirely clear how the courts will consider this.

I think, you know, certainly the groups that oppose KOSPA are probably taking some solace in the fact that NetChoice, which has sued to block several bills throughout the country that deal with kids' online safety or age verification, has successfully received preliminary injunctions in many of those cases on First Amendment grounds. That's basically the court saying, we think that on the merits, this case will be decided in favor of NetChoice, because we think the law will harm the First Amendment in an unacceptable way.

Will that be the case with something like KOSA?

I think it has kind of a different approach than some of the bills that we've seen.

And I think, you know, protecting kids' safety on the internet is a compelling interest.

It's just, is it something that the courts will say is compelling enough to justify any potential diminishing of free speech on the internet?

And what have the big platform companies said?

Obviously, NetChoice has an opinion, but have Google or Meta or any other ones said anything?

We haven't really seen the big tech companies come out with very clear statements on this.

You know, NetChoice, which is funded by many of those big tech companies, is opposed.

We've seen a handful of smaller tech companies come out in support of KOSA, like Pinterest, for example.

But I think, you know, the platforms that people probably will really be concerned about, or want to see how they handle this, are Google with YouTube, and Facebook.

I'm assuming TikTok as well, if TikTok is still around.

Yes, definitely.

If it's still around is the key question.

All right.

Well, Lauren, I imagine we'll be tracking this for the next year, if not more.

We'll have to have you back soon.

Awesome.

Yes, I would be happy to be back.

Thanks again to Lauren for joining us on the show.

I hope you enjoyed it.

If you have thoughts about this episode or anything you'd like to hear more of, you can email us at decoder@theverge.com.

We really do read all the emails.

You can also hit me up directly on threads.

I'm at Reckless1280.

And we have a TikTok for as long as there's a TikTok.

Check it out.

It's at DecoderPod.

It's a lot of fun.

If you like Decoder, please share it with your friends.

Subscribe wherever you get your podcasts.

And if you really like the show, hit us with that five-star review.

Decoder is a production of The Verge and part of the Vox Media Podcast Network.

Our producers are Kate Cox and Nick Statt.

Our editor is Callie Wright.

Our supervising producer is Liam James.

The Decoder music is by Breakmaster Cylinder.

We'll see you next time.

This month on Explain It to Me, we're talking about all things wellness.

We spend nearly $2 trillion on things that are supposed to make us well.

Collagen smoothies and cold plunges, Pilates classes and fitness trackers.

But what does it actually mean to be well?

Why do we want that so badly?

And is all this money really making us healthier and happier?

That's this month on Explain It to Me, presented by Pure Leaf.

AI agents are getting pretty impressive.

You might not even realize you're listening to one right now.

We work 24-7 to resolve customer inquiries.

No hold music, no canned answers, no frustration.

Visit sierra.ai to learn more.