The World of AI Regulation with Brian Merchant

29m

In this episode, Ed Zitron is joined by writer Brian Merchant to talk about US AI regulation, and how governments never seem to make the effort to understand technology.

https://www.bloodinthemachine.com/
https://www.bloodinthemachine.com/p/were-about-to-find-out-if-silicon

Get $10 off a year’s subscription to my premium newsletter: https://edzitronswheresyouredatghostio.outpost.pub/public/promo-subscription/w08jbm4jwg

YOU CAN NOW BUY BETTER OFFLINE MERCH! Go to https://cottonbureau.com/people/better-offline and use code FREE99 for free shipping on orders of $99 or more.

---

LINKS: https://www.tinyurl.com/betterofflinelinks

Newsletter: https://www.wheresyoured.at/

Reddit: https://www.reddit.com/r/BetterOffline/ 

Discord: chat.wheresyoured.at

Ed's Socials:

https://twitter.com/edzitron

https://www.instagram.com/edzitron

See omnystudio.com/listener for privacy information.


Transcript

This is an iHeart podcast.


Cool Zone Media

Oink Oink, welcome to the monologue this week.

Except it's a duologue.

There's two of us.

Better offline.

I'm your host, Ed Zitron.

Better offline.

And today I'm joined by Brian Merchant, the author and writer of the book and newsletter, Blood in the Machine.

Brian, thank you so much for joining me.

Ah, thanks for having me.

Good to be back.

Today we're talking about AI regulation and the current state of, well, California's regulatory moves and indeed the very confusing situation.

And I think we can start with this actually.

This AI safety bill that was vetoed and then just got signed.

What happened there?

Yeah, I mean, the long story short is that, you know, the people who are actually worried about catastrophic risk, right?

Like the AI doomer crowd, the people who think that like this is the number one thing to worry about with AI,

they all got together last year

and wrote this bill.

Um, last year it was 1047.

Um,

and it was, you know, again, assuming that you are worried about that kind of thing, like this was like a legitimate uh stab at trying to keep the companies honest.

It did things like mandate third-party audits of the systems, to make sure that they weren't, you know, too risky, or that they weren't, you know,

including information about biological weapons or whatever, and teaching people how to, um...

But the AI companies hated it, because it mandated some sharing of information, some transparency.

No, no, no, not even a little bit. Even, like, the safe AI companies, like Anthropic.

Anthropic, yeah. Yeah, no influence.

Right, even those guys.

They were against it. So everybody was against it. It passed, because Scott Wiener, the sponsor, is a fairly prominent and powerful legislator.

But even though it passed both houses and was sitting on Newsom's desk, the industry,

Silicon Valley, got Newsom to veto it, basically.

So Newsom vetoed it.

And then he said, let's try it again next year.

So Scott Wiener got together with a bunch of, you know, AI ethics people and some others.

And then, this year, he comes back with, like, a watered-down version of even that.

Again, the aim is to prevent catastrophic risk.

And yes, and he just signed this bill.

So as we're recording this,

you know, he signed it a few days ago.

It's on the books.

And people are, like, celebrating this thing for some reason.

And I'm like, I'm really scratching my head.

Yeah, why are you not celebrating?

Okay, so

what's wrong with it?

Okay, here's what the bill actually does.

It does four things, basically.

Instead of

mandating actual transparency and sharing data with auditors, things that might, you know, actually be considered a prescription if you're worried about things like catastrophic risk.

One,

the new bill forces the tech companies to publish safety protocols on a website.

So

say, yeah, yeah, like a privacy policy.

Yes.

It says, this is what we're doing to ensure safety, and our security practices, right?

This is what we're going to do to keep you safe from catastrophic AI.

Yeah.

No checking up on the policy?

No,

not anything.

No, no, no.

And then you need a page.

You need a page that people can see.

No enforcement.

That'll stop it.

And then if something does go wrong and the companies have done something unsafe with their LLMs, they need to report it to the state.

So it's an honor system.

So they have to tell.

And what happens if they don't report it?

Unclear.

Unclear.

Great stuff.

Brilliant.

Unclear.

So

the one thing that you could argue, or two things, again, like in theory, these are fine things that it says. The other...

Limp.

Well,

those first two things are completely limp.

I think they're a complete joke.

But parts three and four are whistleblower protections, which

you're already supposed to have, right?

Good to formalize, but

yeah.

Yeah, it's just like kind of like underlining it with a sharpie saying, okay, people should be able to blow the whistle if something catastrophic is happening.

I have very little faith this will meaningfully change anything at all.

I'm going to guess there are no like protections actually in there.

Like, is there anything that specifies

there's some language that does raise the threshold again, theoretically, but like you just have to think about how whistleblowers are already treated and how difficult it is for them to come forward with all the NDAs and just sort of the norms in the industry.

So, like theoretically, you know, but again, and it's supposed to be for

workers who are so concerned about catastrophic risk, not about, you know,

if they see a company doing fraud or whatever.

This is for catastrophic risk.

So

really, no real protections.

No teeth, really.

No teeth.

I was reading this thing, and I'm not seeing really any enforcement mechanisms other than, you know, these companies get a slap on the wrist, and maybe it's a little bit slappier if it comes to it.

But it doesn't sound like there are actual remedies or anything like that.

Not really.

No.

There's,

finally, the fourth thing that it does. Again, in spirit, this is okay.

Like, it's a good idea.

It's to start a public consortium, called CalCompute, like a public alternative for AI for researchers.

It says, let's give this...

But it's just a committee.

It's a committee.

Fuck yeah, bureaucracy.

It forms a consortium to discuss.

A group will get together that will discuss what might happen in the future after another meeting or two.

Brilliant.

Once again.

So

my read on this is, and I've seen a lot of statements coming out of, you know, otherwise decent sort of orgs saying, like, this is a step forward for AI safety regulation.

And I'm like, no, it's not.

It's not really.

It doesn't even seem like it addresses AI.

It doesn't seem like it's an attempt to mitigate any harms.

I mean, it is an attempt to.

I think that generally.

Which harms?

Yeah.

The most, I'm going to guess.

So, in looking at AI regulation, you have probably not seen anything that addresses the environmental damage,

anything like anything about the stealing.

Yeah, workplace surveillance, things like this.

So, my read on this, right?

Like, so I think, again, I think Scott Weiner and

the sort of the group that are interested that are like the AI safety groups, like they, they believe in this stuff.

They really lobbied for this.

Like, they think that it is important to have something better than nothing.

But the effect of this, I mean, is largely just that now Gavin Newsom gets to say, like, I signed an AI bill.

Like, I'm doing my due diligence.

And meanwhile, there are a bunch of other bills that we're going to talk about that would actually, you know, do

something, that would actually sort of meaningfully rein in

the AI companies at least a little bit, or at least point a way towards doing this.

This is just,

again,

putting your safety policies on a website, an honor system to alert the state if you've done something wrong, kind of just saying, okay, well, that's the system for the most well-funded companies.

Again, and here's the number one way that you can tell that this is entirely bullshit.

And that is that Anthropic is in favor of it.

They came out and said, like, this is good.

Number two, Meta was like, this is sensible.

And they didn't support the bill, but they're not opposed to it.

Even OpenAI was like, well...

I think you could tell that it's any kind of restriction, we don't like that.

But even if it's the flimsiest one, I feel like you could see the gears turning in the corporate machinery.

They're like, basically, this bill requires us to have an intern write some copy, or even just have, like, ChatGPT generate it.

That's the thing.

Put it on a website.

They don't even have a thing in there saying you can't use AI to write it.

I haven't seen that.

I would have probably done that myself, personally.

Yeah.

So, number one, it does next to nothing in my eye.

And I just can't, I mean, I know all these, like, advocacy groups and people who are trying to get some good laws on the books, they're hungry for a win.

So, they want to say, like, yeah, we did something.

But this is, I think, to me, this is worse than nothing because it's going to let Gavin off the hook on like a handful of bills that might actually do something.

And it will give the

fake view that the AI companies are regulated, so they're able to continue doing all the actual harmful stuff.


Are there any regulations you've seen that actually approach the actual harms?

Anyway, oh, sorry, suggested regulations or bills or anything like that.

Okay, yeah, there's a couple.

So, and I've been talking to,

I've talked to lawmakers over this process.

I've talked to some of the sponsors of the bills and I've talked to labor groups.

And the one sort of through line that runs through all of this is that Silicon Valley and its lobbyists have just been out in force, trying to crush

even the most sort of basic, common-sense regulation.

And so laws or proposals that started out with some teeth have had most of them knocked out or have been delayed until next session.

So we're left with a few things

that, you know, I think more than anything, they're just, like, bellwethers as to whether or not it's even possible to get anything done.

Because as your listeners will know, right, the whole AI-for-enterprise thing isn't working out so well right now.

No.

And so no one buys it.

No one's buying it.

No one wants it.

So they're going to have to make some changes in the next couple of years.

And the way they're going to, my guess is that they're going to try to find ways to sell people on its other capacities, things like

it's a surveillance tool.

It'll do, you know, it'll.

It's an auto surveillance tool.

It's awful. Like, that's what's funny about this, because the actual harms to mitigate would be training and environmental and energy.

The number must go up.

So it's like,

we'll regulate, because the surveillance thing, I get that.

Like, there is already AI surveillance, but it's like, oh, what if they put all the data in a large language model?

Yeah.

Would the large language model do anything with it?

Like, what do you

we've already got that?

But wait, so what other bills have you seen?

Okay, so, you know, there's one, I think the one that's still out there

at the time of recording, who knows, it could have been vetoed or signed by the time that this goes to air.

But there's one bill that Silicon Valley is genuinely upset about and afraid of.

And it's a bill that like sets the very lowest bar.

And it says, essentially, if you are going to make a chatbot and market it to children, then you have to be able to demonstrate that this chatbot isn't going to make them harm themselves.

Ah.

So

they hate that.

They are. And, you know, they have...

There's a Silicon Valley lobbying group that's kind of famous, in California especially, called the Chamber of Progress.

It's like, yeah,

sorry.

And it's like immediate reflexive.

Immediately.

I know.

Gag reflex immediately.

So

they've got this guy who's out there writing op-eds in like the San Diego Tribune and doing press for

going, oh, it's over broad.

And let me tell you, their actual line on this is: if this bill passes, then you're going to be taking away AI that could educate children.

You're going to be taking away AI from children, and they're not going to have the same advantages that children with AI have.

And they're running this, like, this big Facebook campaign.

They've hired lobbyists specifically to take care of them.

They don't have to do this.

These fucking evil people.

They are evil people.

I mean, for me, this was, like, you know, it's past the threshold.

Once the Adam Raine stuff broke and OpenAI is

trying to hem and haw about,

oh, well,

we're going to do this or that.

We passed it.

We passed the Rubicon, right?

They've got chatbots that are telling children to hide the noose so that their parents don't see it.

And like, it's again, they make it seem like, oh, AI is this frontier.

We're going to work something out.

It's a product.

It's a software product.

Yeah, go ahead.

I have a theory.

I have a theory.

Okay.

So I don't think they can control these models.

I don't mean because they're intelligent.

I don't mean because they're autonomous.

I mean, I don't think you can actually prompt a large language model to categorically stop it doing something.

Yeah.

I don't think it's possible.

Then you should not be selling that

piece of software to children.

Yeah.

100% agree.

I'm just saying that I don't think

it's capable.

I think you're right.

Yeah.

But my theory is based on costs, because of Claude Code, that

they can't do cost control.

If they can't do cost control, it means the model won't listen.

Yeah.

And I reckon that they can't be like, never talk about killing yourself.

Yeah.

Like they just can't do that.

Yeah.

Or it would require, like, going back through... you know, I've seen a lot of speculation that the reason that it's talking like this is that a lot of the language is coming from, like, pro-suicide forums in the bowels of the internet.

So, like, they don't want to go through and sort that out.

Yeah.

Or discussions of suicide.

Probably articles as well that say how to deal with someone who's just horrible stuff.

Yeah, really.

And this, is this a California bill?

Yeah, it's a California bill.

Yeah.

And it's, and it is, yeah.

So, if you're on Facebook, there's a front group spun up by

some of the VC firms, like a16z, you know, Andreessen and these guys, and Y Combinator.

And there's a front group called the American Innovators Network.

And they're running all these ads, arguing that this bill, again, whose sole purpose is to

ban chatbots from being marketed to children

that also try to convince them to harm themselves.

Like, not even banning them from being

marketed to children, just

stopping them being harmful.

You have to, yeah.

It's the way that it's phrased, which is what they're all hung up on. It's just like, you have to be able to demonstrate that if you sell this to children, it will not tell them to harm themselves.

And I think that's one of the more reasonable requests you could ever ask of a company. I cannot think of something that is more reasonable.

But they're saying, oh no, it's too broad, everything else will get caught up in this, you can't have a chatbot because...

And then it's always like, but why? Why couldn't we have this chatbot in the classroom?

Yeah.

Oh, because it might tell somebody to kill themselves. Like, that's why.

Yeah.

And they don't... This is, again, we should try and interview these people, because the question would be: say that you agreed with this bill. You disagree, but say you agree with it.

How would you stop this?

Yeah, just how would you stop it?

Can you stop it?

Is that why you're upset?

Is the reason you're mad?

Because you literally can't stop this?

Yeah.

Because that's the thing.

They hem and haw around any kind of protections.

Yeah.

Anything.

Anything.

And now, and I wonder if it's because they can't.

Yeah.

And they want to be like, oh, it's too powerful.

No, it's too shit.

Shit, yeah.

It's because it sucks. It's not because...

It's because you built something shitty and hard to control, exactly. It's like lava. Lava is hot and can burn through most things. That doesn't make it intelligent, right? Your inability to not drink lava... just, like, fucking deal with this.

I mean, it's a combination of both those things. Like, it's either, oh, this would be really expensive to fix, or, oh, like, I don't know if we can...

Well, I reckon what they would have to do is they would have to just neuter it. They would have to just make it so that anything that gets even close to that conversation would have to be, like, shut down, to the point that you can't even talk about superhero stuff, right? They would probably just limit it to the point of nothingness.

Yeah. Which, like...

Yeah, I mean, if that's what they... I mean, we're, again, we're talking about, like, children here. So, like, yes, that seems like, if that's what they have to do, then that, in

my view, anyways, is something that they should do.


But again, it's wild because they don't usually hire lobbyists to oppose state-level bills.

But

the sponsor here,

Rebecca Bauer-Kahan. It's the LEAD Act, is what it's called.

It's AB 1064.

She's just like, I've never seen anything like this.

They're like having lobbyists come and knock on my door and like, you know, yelling at me about this.

And they're just like, it's a full court press.

And we could talk a little bit after this about how this is sort of part of a broader movement where, you know, Silicon Valley is sending its lobbyists out all across the country, but especially Sacramento.

Is that typical to do on this level?

I mean,

Silicon Valley hasn't wanted regulation on anything for a long time, but the level of

concentrated effort and sort of the campaigning is new, right?

Like, I've been a tech journalist for 15 years and I've never seen anything like what we saw over the summer, where Silicon Valley tried to lobby for a ban on all state-level AI lawmaking.

They really got a whole sort of united front together.

They got stakeholders from the different...

Did they fail though?

They did fail, but they failed by basically one vote.

And they failed because of a Republican in Tennessee who represents Nashville.

Marsha Blackburn.

Yeah.

And she was like, wait a minute, we have a law on the books that protects

our country music industry from being sloppified.

And would this overturn it?

And they were like, well, yeah, but you know, yeah, yeah.

And then so she kind of came out against it.

And if it wasn't for that, it would have passed.

And that is bonkers.

It's bonkers.

That's wild that country music saved us.

Yeah, country music and Nashville's like no AI music slop law.

And they're going to try again, though.

They're going to try again.

Yeah, Ted Cruz keeps talking about it.

He's interested in giving it another go.

So it'll be coming down the pike.

But to be clear, like, no, this has never happened.

You've never seen somebody say, okay, we're going to ban lawmaking around search.

No search engines can have any laws made around them or social networks.

This is its own thing.

This is AI.

It's its own thing.

Yeah.

There just can't be laws.

No, it's totally anti-democratic.

It's totally absurd.

And all, I mean, it tells you all you need to know, really, that the Silicon Valley interests are willing to push all their chips in and team up with the Trump administration and its allies to try to get this done.

I mean, I don't know fucking Newsom.

No, he's, I mean, Newsom has vetoed a ton of solid AI bills.

I mean, last year, Newsom vetoed a bill that I think was good, because

it was, so I don't know if you caught this, but last year,

the Teamsters and some pro-labor groups fought for a bill and got it passed, bipartisan, that says, okay, if you're going to use autonomous or AI technology to run a truck that's over a certain amount of weight, then there should be a human safety operator making sure it's not hitting people, not running into people.

It passed both houses.

Newsom vetoes it.

And two weeks later,

there's the autonomous car that drags the pedestrian across San Francisco.

If that timing had been different,

yeah, thank you.

Thank you, Gavin.

Very good, Gavin.

It wouldn't have necessarily applied, but I think the optics would have been different, so he wouldn't have been able to.

Anyways, Newsom is perfectly happy to veto a lot of this stuff.

The difference is,

his constituents here in California, they care about this.

And in the case of the LEAD Act that we were talking about,

one of the reasons that it stands any chance at all

is that his wife is, like, a vocal supporter of the LEAD Act.

And she's been on some stages saying, I think we should protect our children from AI slop.

And so

it's going to be like his wife on one hand and all of Silicon Valley's lobbying force on the other.

Newsom versus Newsom.

Jesus Christ.

Yeah.

So it's weird.

So here's something that...

Why does no one ever push any bills that actually...

I realize that, even doomed ones... it seems that there's a lack of any bills that are, like,

actually aimed at the technology, written by people who have used it, for example.

Yeah.

Like, why does that never happen?

I mean, we still have this mentality ingrained in this country, especially in the political class that just doesn't really have a lot of hands-on,

you know, experience with the tech.

It's changing a little bit, but I think it's just, it's ideology. It's just like, oh, well, we don't want to stifle innovation. You know, if they need to build a million data centers and encircle the nation in Stargate outposts, then I guess we'll defer to you. It's always been this way. It's always been deferring to industry and then reacting, right? And then saying, like, oh, well, that was a bridge too far, and then they try to, you know, get some...

So, like, there are laws on the books now, like 10 years after social media, sort of, you know, was on the scene.

They're not even directed at the problem, though.

Yeah.

Well, I mean, the biggest problem.

Yeah.

I mean, there's been some antitrust efforts.

That's the closest you've probably gotten to some decent efforts at,

you know,

reining in the giants.

Close us up.

What would be the regulation that you would want to push through?

Other than stopping the kids getting at the AI suicide LLM, that one seems pretty good. But is there a dream bill for you? What might it look like? What are the things that you actually think need to be restricted?

I mean, honestly, like, we have not even touched... we've barely gotten to the tip of the iceberg here about what needs to happen. Because, I mean, the things that, you know, I worry about are the same ones that you cited,

which is the environmental impacts of just...

But that's almost a separate issue, right?

And in fact, all that has happened on the legislative and policy front is that the AI companies have convinced, they did this to Biden, by the way.

It wasn't Trump.

They convinced him to relax environmental regulations so they could just build more data centers without

being subject to as many fines.

So you have to get serious.

I saw some regulation around... they have to report their power draw somewhere.

Yeah.

Like they have to talk about how much energy they're using.

Wow.

Wow.

Right.

Yeah.

And I, you know, I do think that, I mean, and most of these bills are written in a way that doesn't just apply to AI.

There's, oh, there was a decent,

There's a... I mean, again, everything's just been battered to bits by the lobbying machine, but there is a bill that he did sign that sort of semi-restores gig workers' rights to unionize.

That's it.

It's good, but there's a big asterisk to it.

It's a little wonky, so we won't get into it.

But it's a good, a good step.

We need gig workers to be able to organize.

Right now, they can't in most of the states.

So, that I do think, you know,

workplace protections and protecting against automated hiring and firing shit and wage depression.

The no-robo-boss thing.

The no-robo-boss.

Again, it's like even the sponsor of the bill, Lorena Gonzalez, who's with the California Labor Fed, I talked to her and she's like, look, this is what we could get through.

Nothing else that we had in there that, like, really seriously banned discrimination...

And it's really not like

it's not about the tools again.

It's about allowing bosses to use this as an accountability sink, or to offload, saying, oh, yeah...

That's kind of where you need to regulate sometimes.

Yeah, exactly.

Especially when you're not going to regulate the fucking technology.

Yeah, exactly.

So, I mean, there's a million things that

need to happen.

I mean, I think antitrust absolutely needs to happen.

But

I would get way more radical than anything that's being even even talked about right now.

Because right now, it's just profoundly anti-democratic where we're at right now.

Silicon Valley is just calling.

They're just,

you know, better than anybody.

They're just

hoarding the people.

This is the one thing.

And maybe this is a dumbass's opinion, but why do you have to fucking listen to lobbyists?

You, I mean, you do not.

Otherwise you don't.

You do not.

Yeah.

Like, I realize I'm not a big politics knower, but it feels like you could just not talk to them.

I guess that sometimes they contribute to your political campaigns, but

just don't do it.

That's it.

Yeah, I just lost this.

This is how I enter politics.

I'm just like, what if you didn't listen to it?

You just didn't pick up.

Ed Zitron, a hero for our times.

No.

Why not?

I don't want to talk to you.

You sound really annoying.

You keep... I want to run this bill.

You keep texting and calling me saying I shouldn't.

And that's it, I'm going to block you.

Politicians, listen.

Let's get you a soapbox.

I'll vote for Ed Zitron.

All right, Brian, where can people find you?

I am bloodinthemachine.com and Brian Merchant on most social media platforms.

Lovely.

Thank you for joining me.

This has, of course, been your Better Offline duologue for the week.

Back next week with an interview with Stephen Burke from Gamers Nexus.

Thank you for listening, everyone.

Thank you for listening to Better Offline.

The editor and composer of the Better Offline theme song is Matt Osowski.

You can check out more of his music and audio projects at mattosowski.com.

M-A-T-T-O-S-O-W-S-K-I dot com.

You can email me at ez@betteroffline.com or visit betteroffline.com to find more podcast links and, of course, my newsletter.

I also really recommend you go to chat.wheresyoured.at to visit the Discord and go to r/betteroffline to check out our Reddit.

Thank you so much for listening.

Better Offline is a production of CoolZone Media.

For more from CoolZone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

This is an iHeart podcast.