Content Warning

Over the past five years TikTok has radically changed the online world. But trust us when we say, it’s not how you’d expect.

Today we continue our yearslong exploration of what you can and can’t post online. We look at how Facebook’s approach to free speech has evolved since Trump’s victory. How TikTok upended everything we see. And what all this means for the future of our political and digital lives.

Special thanks to Kate Klonick

EPISODE CREDITS:
Reported by - Simon Adler
Produced by - Simon Adler
Original music from - Simon Adler
with mixing help from - Jeremy Bloom
Fact-checking by - Anna Pujol-Mazzini

Lateral Cuts:
The Trust Engineers
Facebook’s Supreme Court

Sign up for our newsletter! It includes short essays, recommendations, and details about other ways to interact with the show. Sign up (https://radiolab.org/newsletter)!

Radiolab is supported by listeners like you. Support Radiolab by becoming a member of The Lab (https://members.radiolab.org/) today.

Follow our show on Instagram, Twitter and Facebook @radiolab, and share your thoughts with us by emailing radiolab@wnyc.org.

Leadership support for Radiolab’s science programming is provided by the Simons Foundation and the John Templeton Foundation. Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.


Runtime: 29m

Transcript


Oh, wait, you're listening. Okay.

All right.

Okay.

All right.

You're listening to Radiolab. Radiolab. From WNYC.

You're right in here. Awesome.
You're going to be speaking in that microphone.

Nope, the one closer. Here.

Hey, I'm Simon Adler. This is Radiolab.
1, 2, 3, 4, 5. Can you hear me, Kate? Yep.
And that

is Kate. Yeah, Kate Klonick.
I'm a professor at St. John's Law School.
I've talked to her a bunch over the years.

We did a couple different stories that felt like news at the time about Facebook's rules for what we can and can't post on their platform.

Don't get me saying the F word again, because last time my parents yelled at me.

Did they? Yeah, they were like, Kate, you're an adult now.

Oh, come on. You're a serious person.
I prefer to swear on the radio as much as possible.

We covered the origins of these rules and just how complicated they can become.

But beyond the specifics, what we were really exploring was how the ideal of free speech plays out in different spaces in our society.

You know, from a good old public square where anyone can say anything they want to lightly regulated broadcast TV to straight-up private spaces.

And we were asking, like, where does social media fit into all that?

And you know, I kind of thought we were done talking about all this.

But then.

I'm happy we still have a show, too, I guess.

This past month. Jimmy Kimmel can't say that anymore.
The late night host taken off air indefinitely. As we all know,

free speech was in the news again. I mean, look, we can do this the easy way or the hard way.
That's censorship. That's state speech control.
And these questions of who can say what, where,

and how much pressure the government can or can't exert just felt fresh and vital all over again.

And so I called Kate

to see how this is all playing out online. Yeah.

And now it is a problem of, okay, how do we stop billionaires and authoritarian governments from twisting these platforms into censorship machines or political propaganda?

Okay.

F.

I know. That's kind of how I feel too.

Well,

I guess before

we get into all of that, let's build a bit of a foundation first. Sure.
So I guess, how has the actual practice of keeping stuff up and taking stuff down changed, and why?

Sure.

So the main thing that has really, truly changed from the last time we talked, from like 2020 to 2025, is the rise of TikTok.

I mean, if you will remember, like in two short years, it had basically caught up with 12 years of Facebook's growth. And I mean, TikTok has a different way that they run their content moderation.

Okay, how so?

Well, when we spoke in these past episodes, one of the assumptions of content moderation when it was getting off the ground, be it Facebook or Instagram or YouTube, was that we don't want to censor people unnecessarily.

Yep. And so you would keep content up until it was reported as being harmful.
And then you would make rules that would limit and try to preserve voice as much as possible, as they put it.

That was like the industry term for free speech, voice. There were limits to that, obviously.
But generally, like it was a keep it up unless we have to take it down type of thing.

But that's not TikTok. TikTok comes from, obviously, China, and it comes from a censorship kind of authoritarian CCP culture.

And I mean, I believe the Chinese kind of approach to speech is very much reflected in the algorithm that TikTok uses.
It is not a default, everyone should see everything.

This is a free world and people have a right to say whatever they want, even if it's a private platform. It is a we get to determine what people see and say.

And

that's it. So they're just taking tons and tons and tons of stuff down.

Oh, I mean, no. TikTok pre-screens such a volume of content, promoting only what it determines to be within certain political parameters.

And so they're less likely to cause negative interaction effects to put kind of an economic term on it.

If I can put a stupid man's term on it, it's like they are choosing to push things up instead of pull things down. That's a perfect way of thinking about it.

And they push things up that are very milquetoast, very like happy, make you feel good, very apolitical. And so this is basically downranking or shadow banning.

The idea that you're going to manipulate the algorithm to not delete the content, but not promote it.

And in addition to that, the algorithm is constantly improving and iterating on all the behavioral signals that you give it. And so it's able to provide a very addictive and

expectation-meeting

product. Yeah, product.
I mean, it's almost an experience. It's, I don't know what it is.

I have a confession, which is that I've maybe spent five minutes on TikTok in my life.
I don't have TikTok. You don't either.

Well, I have like rules for some of these things.

But, you know, I study online speech for a living. So it seems kind of crazy, but I like, I don't need to actually be on TikTok for TikTok to be all over my life.
I see TikTok videos

constantly. Sure.
They're cross-posted. I don't need to actually be on TikTok.

Well, and on that, it is interesting that TikTok figured out how to make banal stuff compelling, because we were certainly told that, well, the reason Facebook wants to leave some of this stuff up is because

it's the highly emotive, highly reactive stuff that keeps people around. So

what did we have wrong there? Was this just like an adjacent path to the same outcome, which is keeping people on a platform? Oh, I mean, I think that it's actually fascinating.

You know, what they figured out is it is a format of video that people are hooked by.

And so it does not really matter.

You will find yourself often watching things that you didn't know you were interested in, but like you're just compelled by certain types of couples that like look very different from each other doing any type of like interaction.

Fascinating. So it's like Facebook figured out the sort of information that would keep you there.
TikTok figured out how to package any information to keep you there.

Yes, that's like one way of thinking about it. Oh my God.
Yeah. I mean, you know, but this is not new.
I mean, like advertisers have been doing this forever. Sure.
Right. Like this is,

you know, it's just a very different business model. It is a very different product model.

And it seems to then be a very different informational ecosystem you're creating, because if you're pushing up everything that falls within certain bounds, and you're deciding what those bounds are, it becomes far more, is "controlled" the right word? What's the word?

Yeah, it's controlled, but it also, in a certain way, is even more dangerous, because, like, the ultimate in censorship in American First Amendment law is really prior restraint. Right, sorry.

Sorry, excuse me. What is prior restraint? Prior restraint is censorship before something goes up or is ever published.
Oh, so it's not redacted. It's that it was never printed.
Exactly.

That is the exact distinction. And it's important because the existence of this redaction, the proof that it was removed from Facebook is actually evidence that censorship has happened, right?

Right, right, right, right, right. Whereas with TikTok, you never even know what you missed.
You never even know what you were kept from seeing.

And that is really unfortunately what we're staring down at this moment because in the last five years,

American social media has moved towards TikTok's approach to content moderation.

Wow. Okay, I didn't expect us to be talking about TikTok so much, but I'm glad we have.

So, if I'm telling the story of this, it's like, once upon a time, Facebook creates content moderation for everything, all these policies, all these rules.

Meanwhile, TikTok is sort of lurking across the Pacific, eventually jumps over, and Zuckerberg and the Silicon Valley folks see they're doing it this very different way.

When does that actually start to shift not just the way Facebook is thinking about its content moderation, but also maybe the way people are experiencing Facebook as a result?

That is

not as clear, but the biggest sea change is the one that you're thinking of. Hey, everyone.
I want to talk about something important today.

Because it's time to get back to our roots around free expression on Facebook and Instagram.

Which is the one that happened on January 7th of this year, 2025, when Mark Zuckerberg announced the end of the fact-checking program.

We've reached a point where it's just too many mistakes and too much censorship. And that he was going to try to move towards a community notes-based system of content moderation.

So we're going to get back to our roots and focus on reducing mistakes, simplifying our policies, and restoring free expression on our platforms.

And I mean, I think that like it was, and it wasn't a sea change.

Okay, well, and talk to me. Like, when we say Facebook got rid of its fact-checking, at its sort of height, what was Facebook's fact-checking?

Okay, so, not much. Which is why this was such a frustrating announcement. And it was frustrating that the media focused on it so much. The fact-checking was like a commitment to fact-checking, because there had been so much clamor about mis- or disinformation. But they were removing posts days after they were flagged, and it was very small. And so to watch it go on the chopping block was really more of a signal to a very particular person, and to a very particular party, that felt like big tech censorship was coming for them. And, like, you know, we can get into a whole kind of conversation about whether or not that was reality-based, but that was kind of the complaint, right?

And if I'm going to mount the best defense for conservatives about censorship by big tech, it would be

that during the pandemic, there was sort of a party line as to what was an acceptable way to talk about the origins of the pandemic, right? Yep. And you can even go before the pandemic.

Okay, you could take it before. You can, there's a few things.
And one of them was...

There are serious questions for Joe Biden this evening following the publication of emails allegedly belonging to his son Hunter

the Hunter Biden laptop scandal. Reporting lays out purported emails between Hunter Biden and a Ukrainian businessman.

New York Post. They broke the story and links to that were taken off Facebook and Twitter.
That was absolutely censored. And what was the justification by Facebook?

Well, that was happening a couple weeks before the 2020 election. And so what had been the huge concern for Facebook and all these other companies was how social media impacted the 2016 election.

And so they made a lot of big changes.

And one of them was just kind of like, we're not going to allow things that could possibly be foreign influence to stay up, because this is exactly what we got yelled at about in 2016.

And so they kind of overcorrected. And I think in hindsight, it was a really hard call

and maybe probably the wrong one. And then you extend that to the Wuhan lab leak.
Now, those were just insane, insane issues. And look at us, we're still talking about them today.

It's not like they were that censored. Unlike going to, say, China, where it's like, you're like, oh, you know, Tank Man.
And they're like, who? Yeah. Right.
Because there are no photos of Tank Man.

Right. They are not published.
Right. And so it's not like, I just also,

points taken.

Okay. Well, so then like, what has changed then? If yes, there was some censoring going on and censoring of things in these sort of critical moments,

like,

would that not happen now? Is that the difference? I mean,

my honest belief, I can't predict the future, but my honest belief is this administration would very quickly put the platforms in line.

Yeah, I think that there would be no hesitation to do this because I don't think that this was ever about free speech. It was about their speech.
And

that is really what you're unfortunately seeing right now. There is no recognizable free speech notions coming out of this current administration.

And with the TikTokification of social media, people have seen the vector for power that is in content moderation.




Okay, so Kate, you were saying that TikTok has this fundamentally different approach to content moderation.

That instead of reactively taking stuff down, they are proactively flooding the zone with happy-making stuff.

That Facebook and X and others have taken notice and started adopting this approach. And that all this has happened as folks have begun to see that content moderation itself is,

I think you said, a vector for power.

Yeah, I think that basically what you're seeing is the power over what appears in your feed or doesn't appear in your feed or the types of new content that you're recommended or the first commenters that you see on a video that you just watched.

That type of control is

an ability that we've never seen before.

I remember when I was first writing about this in like 2017, 2018, presenting my research, one of the things that people were so concerned with was filter bubbles.

Well, we're going to be in these filter bubbles fed to us by the algorithm. And as it turns out, that was, one, very true, that that would happen.

But also, even maybe more disturbingly, we don't even need filter bubbles anymore. People are just choosing platforms based on the types of content that they expect to find there.

And in that way, we've gone from filter bubbles to

platform islands, where the owners of the platform get to push up whatever it is that fits whatever their ideological ends are.

China and TikTok, it seems to be like milquetoast stuff that's not going to rile you up, but it's going to keep your eyeballs on here. It feels a little bit like

X, formerly Twitter, is the mirror image where it's like, we're just going to rile you up all the time.

Is that right? And is that what we're going to just see more of, which is come to this platform island for emotion A, come to that platform island for emotion B.

I think that that's exactly right. I mean,

yeah, I mean, that's what we go to the movies for. That's what we tune into certain types of things for, right? It's, I'm not in the mood for, you know, a horror film.

So I don't go to a horror film. This, um,

this kind of approach is much easier to moderate. People get much less upset.
Yeah. Yeah.
And it's much cheaper, because there is not as much reactive content moderation to do.

You don't have to employ hundreds of people in call centers to review every report of something that's been flagged. And so this has kind of become the new standard.

I remember one of the big questions,

probably in the first piece we did, was this question of like what kind of a space to consider Facebook? Because the First Amendment treats private spaces differently than public spaces.

So, it matters whether or not Facebook is more like a mall or a public square.

And so given all these changes you've just mentioned, like what is the metaphor now? I have one based on what you've said, but I'm curious what yours is.

No, I mean, I've always liked the mall metaphor and it has a weird, squirrely little place in First Amendment law in a bunch of cases.

But I want to hear what you're, I kind of want to hear what yours is.

Well, to me, it's now, or certainly the direction things seem headed based on what you've said, is that it's now just, it's just broadcast again. Yeah.

And with broadcast, there is no free speech, right? No. Like ABC, NBC, they can cancel a show at any time.
They get to decide exactly what the evening lineup is.

But with this, with social media,

it's like a broadcast camouflaged as an organically generated thing. 100%.

You know, you can shadow ban or take down or limit the reach, but it doesn't even have to be that subtle.

Like Elon Musk always showing up in my feed, even though I don't follow Elon Musk, is like having Rupert Murdoch in like the interstitial spaces before every commercial break at Fox News, you know, like directly telling me what I should think.

That isn't subtle. Like that is the other thing about this that is maybe the scariest part of the last couple of months is that none of it even is super pretextual.

Like there isn't a lot of like excuses. We're not even hiding behind algorithms anymore.

It is just the owner of the platform saying the thing out loud and forcing everyone to see it if they're on his platform.

You know, I think that if you're going to all of these different platform islands, the other thing is like, how do we change those?

To use regulatory regimes to try to control how they speak is obviously a problematic thing by any type of measure.

We don't want governments controlling speech for the exact reason of all of the authoritarianism we've just discussed. And so I think that there's, it's very hard.

Sorry, if I can jump in there though, but it does feel like, yeah, I'm not for and have never been for the federal government coming in and molding Facebook's content moderation policy. Of course not.

But

if something no longer resembles a public square at all and instead has become

to keep reusing my label, like a camouflaged broadcasting network, where it's like, yeah, these are individuals saying something that they believe in, but then that is being

collated, amassed, and pushed out as an opinion-changing product by someone on high.

I am okay at that point with there being some sort of regulation. It's not regulating maybe what people are allowed to post, but maybe how it's being aggregated.
I don't know.

There has to be somebody clever, somebody smarter than me, who could come up with these sorts of rules.

No, I mean, like, every Western state has some type of media regulator specifically to avoid maybe like two or three people controlling all of media. Right.
Right.

But all of a sudden we're like on the internet. And yes, there is an infinite amount of content on the internet.
But is it so infinite?

Like if there are, if we're talking about like the same three main places that people are going to for their news, people are going to for like their, for their daily interactions, people are going to to feel like they're part of a conversation, their water cooler, their public square, whatever it is.

If that is like three people and they're all friends of the president, like that's

a problem. And maybe even more importantly, journalists, they go to X, they go to Bluesky, they go to YouTube, they go to TikTok.

And they report things that are happening in those places as if they're real places, that things are happening. But they're also controlled by these individuals.

And so they're not reflective necessarily of real world, yet they are being reported on as if they were reflective of real world. Right.
And so

I just think that what you've seen in the last five years is an industry coming to understand

the power that it holds in content moderation, that it's so not a customer service issue, that it is actually like a huge, huge force for shaping public opinion, and

that that has exponential value to political parties and governments. It's like as valuable as oil and guns because how you push things, what you keep up, what you take down.

I mean, this is how you can basically create, you know, the rise and fall of presidencies if you want to or political parties. And they know how to market them to you, no matter how niche you are.

And that's scalable.

And so like, it's a way to make a lot of money. And then it's a way to control a lot of minds.

You know, I think one of the reasons you and I have gotten along so well over the years and have worked so well together in this now trilogy of stories is that we both had sort of an unorthodox approach to this.

I mean, most people were saying that these Facebook guys were idiots, that they're bad, that they're causing lots of trouble, that we should just, like, cast scorn upon them.
Yeah.

And you, and then me sort of following your lead were more like, what if we actually try to understand

this

problem?

And I guess now, with hindsight, I'm wondering, like,

did we miss something here where

we sort of played the fool?

You know, it wouldn't be the first time that someone has told me that in some way I'm a useful idiot to Facebook or in some type of capacity.

I didn't say you, I would say we would be useful idiots. And I didn't tell you.
I'm asking if we are, is the question.

I feel as if a lot of people and a lot of what we've said today, people will be like, of course, this is what happened. This is what we were saying would happen.

But it wasn't a fait accompli when we talked about it. It wasn't.
Every single one of these solutions has the same

flaw at the end of the day to it, which is that these are for-profit companies that do what they want to do. And things change

as things settle. So I don't know.
Okay. Well, so then, like, is content moderation sort of dead?

I just,

yeah, this is like a, this is like a very controversial thing. Um, it really depends on what you mean by that question.

There has been a lot of controversy around, like, are they going to invest in these huge cost centers of trust and safety? Are they going to care about this type of issue

if they can TikTokify everything and just send you down these rabbit holes of endlessly drooly, eyes-glazed-over, like WALL-E

kind of scenes where you're on the couch with your Slurpee, in like a Barcalounger or whatever, like, watching things. Is that what they're basically going to do?

And are they going to have to keep moderating? And I mean, I think that, like, the answer is that we're going to increasingly see an automated content moderation system.

It's going to increasingly not embody

the edges of society and the

range of voice that we had at the beginnings of the internet, and that we are going to kind of see a productification of speech.

I'd love to give you one more idea that I've been playing around with for a couple of years. Yeah.

If I was ever going to write a short sci-fi story, it would be about the quote-unquote perfect piece of art.

You step in front of it, it does a quick facial scan of you, pulls everything about you that it knows from the internet, and then it puts forward an image perfectly generated for you that will evoke a feeling.

On Tuesdays, it's happiness, on Wednesdays, it's sadness, and so it's this visual tableau personalized to every person that evokes the same emotion.

And once you have that, once you can control the emotions of people with the flip of a dial by putting something in front of them that's going to only pique that feeling for them, then you could just control everybody.

Wow,

I love that. Sounds like a Ted Chiang story, honestly.

But that's, that's, you know, you should write that. Maybe you can ask AI to do it for you if you're really busy.

This story was reported and produced by me, Simon Adler, with some original music and sound design by me, mixing done by Jeremy Bloom. Of course, huge, huge thank you to Kate Klonick, as always.
And

yeah,

we will be back next week

saying some more things. Until then, thanks for listening.

I think we're using this one.

Hello, Hello, Lo. Oh, I can hear myself.

Kid podcast crossover special.

Hi, I'm from... Wait, hi, I'm Noir Sultan and I'm from New York.
And here are the staff credits. Radiolab was created by Jad Abumrad and is edited by Soren Wheeler.

Dylan Keefe is our director of sound design. Lulu Miller and Latif Nasser are our co-hosts.
Our staff includes Simon Adler, Jeremy Bloom, W. Harry Fortuna,

and David Gable.

Oh, so I just have to read that one name? Okay.

Fool?

Oh my god. So you have to like tap it out.

Annie McEwen, Alex Neason, and Sarah Qari. Oh, Sarah Sandbach, Anisa Vietze, Arianne Wack, Pat Walters, Molly Webster, and Jessica Yung.
Yeah, yeah, I see it. Do I sound like happy?

With help from Rebecca Rand. Our fact-checkers are Diane Kelly, Emily Krieger, Anna Pujol-Mazzini, and Natalie Middleton.

Guys, I know, Fammy, shouldn't gotta clap. What's up?

Hi, this is Laura calling from Cleveland, Ohio. Leadership support for Radiolab science programming is provided by the Simons Foundation and the John Templeton Foundation.

Foundational support for Radiolab was provided by the Alfred P. Sloan Foundation.
