Part One: The Zizians: How Harry Potter Fanfic Inspired a Death Cult

1h 12m

Earlier this year a Border Patrol officer was killed in a shoot-out with people who have been described as members of a trans vegan AI death cult. But who are the Zizians, really? Robert sits down with David Gborie to trace their development, from part of the Bay Area Rationalist subculture to killers.

(4 Part series)

Sources: 

  1. https://medium.com/@sefashapiro/a-community-warning-about-ziz-76c100180509
  2. https://web.archive.org/web/20230201130318/https://sinceriously.fyi/rationalist-fleet/
  3. https://knowyourmeme.com/memes/infohazard
  4. https://web.archive.org/web/20230201130316/https://sinceriously.fyi/net-negative/
  5. Wayback Machine
  6. The Zizians
  7. Spectral Sight
  8. True Hero Contract
  9. Schelling Orders – Sinceriously
  10. Glossary – Sinceriously
  11. https://web.archive.org/web/20230201130330/https://sinceriously.fyi/my-journey-to-the-dark-side/
  12. https://web.archive.org/web/20230201130302/https://sinceriously.fyi/glossary/#zentraidon
  13. https://web.archive.org/web/20230201130259/https://sinceriously.fyi/vampires-and-more-undeath/
  14. https://web.archive.org/web/20230201130316/https://sinceriously.fyi/net-negative/
  15. https://web.archive.org/web/20230201130318/https://sinceriously.fyi/rationalist-fleet/
  16. https://x.com/orellanin?s=21&t=F-n6cTZFsKgvr1yQ7oHXRg
  17. https://zizians.info/
  18. according to The Boston Globe
  19. Inside the ‘Zizians’: How a cultish crew of radical vegans became linked to killings across the United States | The Independent
  20. Silicon Valley ‘Rationalists’ Linked to 6 Deaths
  21. The Delirious, Violent, Impossible True Story of the Zizians | WIRED
  22. Good Group and Pasek’s Doom – Sinceriously
  23. Glossary – Sinceriously
  24. Mana – Sinceriously
  25. Effective Altruism’s Problems Go Beyond Sam Bankman-Fried - Bloomberg
  26. The Zizian Facts - Google Docs
  27. Several free CFAR summer programs on rationality and AI safety - LessWrong 2.0 viewer
  28. This guy thinks killing video game characters is immoral | Vox
  29. Inadequate Equilibria: Where and How Civilizations Get Stuck
  30. Eliezer Yudkowsky comments on On Terminal Goals and Virtue Ethics - LessWrong 2.0 viewer                                                                                                   
  31. Effective Altruism’s Problems Go Beyond Sam Bankman-Fried - Bloomberg
  32. SquirrelInHell: Happiness Is a Chore
  33. PLUM OF DISCORD — I Became a Full-time Internet Pest and May Not...
  34. Roko Harassment of PlumOfDiscord Composited – Sinceriously
  35. Intersex Brains And Conceptual Warfare – Sinceriously
  36. Infohazardous Glossary – Sinceriously
  37. SquirrelInHell-Decision-Theory-and-Suicide.pdf - Google Drive
  38. The Matrix is a System – Sinceriously
  39. A community alert about Ziz. Police investigations, violence, and… | by SefaShapiro | Medium
  40. Intersex Brains And Conceptual Warfare – Sinceriously
  41. A community alert about Ziz. Police investigations, violence, and… | by SefaShapiro | Medium
  42. PLUM OF DISCORD (Posts tagged cw-abuse)
  43. Timeline: Violence surrounding the Zizians leading to Border Patrol agent shooting

See omnystudio.com/listener for privacy information.


Transcript

Speaker 2 Cool zone media.

Speaker 1 Welcome back to Behind the Bastards. That's how this podcast would open if I was a game show host.

Speaker 1 But I'm not. Instead, I'm a guy who spends.
You would be good at it, though. I don't think I would be, Sophie.
I do, but I think. But I'm like biased because I think you'd be good at most things.

Speaker 1 No, my only marketable skill is spending 30 hours reading the deranged writings of a quasi-cult leader who was somewhat involved in the murders of multiple people very recently,

Speaker 1 largely because she read a piece of Harry Potter fan fiction at the wrong time.

Speaker 1 Yes.

Speaker 1 We have a fun one for you this week. And by a fun one, we have a not at all fun one for you this week.
And to have just a terrible time with me,

Speaker 1 we are bringing on a guest, the great David Gborie,

Speaker 1 co-host of My Mama Told Me with our friend of the pod, Langston Kerman. David, how you doing? Oh, man, I cannot complain.
How are you doing? There's nothing going on in the world.

Speaker 1 Oh, yeah. No,

Speaker 1 I got up today and read that great new article by Francis Fukuyama. History is still stopped, so everything's good.

Speaker 1 We're done.

Speaker 1 I haven't looked at any news yet, purposefully. So I'm, I, you know, it could be awesome.
It could be going great out there. It's great.
It's great.

Speaker 1 The whole Trump administration got together and said, psych, it was all a bit.

Speaker 1 Man, just an extended ad for the Apprentice Season 15. You mean this country's not a business? No.

Speaker 1 They handed over the presidency to...

Speaker 1 I don't know. I don't know.

Speaker 1 Whoever you personally at home think would be a great president. I'm not going to jump into that can of worms right now.

Speaker 1 LeBron LeBron Ramon. LeBron.
They made LeBron the president. That's a good one.
That's a good one. That's a good enough one.
It's better than what we got. Honestly, vastly superior than where we are.

Speaker 1 Of all the entertainers, I feel like, why don't we start giving athletes a shot at government? Yeah, fuck it. Why not?

Speaker 1 You know,

Speaker 1 fucking Kareem Abdul-Jabbar?

Speaker 1 But a great president.

Speaker 1 That motherfucker could knock a presidency out of the park. Come on.
Veronica Mars writer, Kareem Abdul-Jabbar. Absolutely.
Yes.

Speaker 1 We need a mystery novelist slash one of the great basketball stars of all time in the White House. I just want a president who's good in the paint.
You know what I mean? That's right. That's right.

Speaker 1 Agatha Christie with a jump shot. Yeah, that's exactly what.
I think that's what we need. What an amazing man.

Speaker 1 Kareem would be such

Speaker 1 a good choice. Yeah, bring it on.
I think he's such a good man, he wouldn't do it. Yeah.
Yeah, exactly. He's way too moral.

Speaker 3 He's way too moral.

Speaker 1 I have a frog

Speaker 1 named after him. Yeah.

Speaker 1 Look, honestly, given where we are right now, I'd take fucking,

Speaker 1 what's his name? Mark McGuire. Like, Jesus Christ, anybody.

Speaker 1 Like, honestly, anyone. I'd take Jose Canseco.
Jose Canseco, but shit in a heartbeat. So funny.

Speaker 1 Oh, man. Fuck it.
Like, I'll take,

Speaker 1 no, no, I'm not going to take any hockey players. No hockey players.
No hockey players.

Speaker 1 We got enough people with brain damage in the White House right now. That's probably fair.
And we don't need somebody who likes to fist fight that much. Yeah.
Yeah. Yeah.
Yeah. Yeah.

Speaker 1 You're probably right there. I mean, if we could go back in time and make Joe Louis the president, I think he could solve some fucking problems in Congress.

Speaker 1 He could get stuff done.

Speaker 2 This is an iHeart podcast.

Speaker 3 Hey guys, it's Erin Andrews from Calm Down with Erin and Charissa. So as a sideline reporter, game day is extra busy for me, but I know it can be busy for parents everywhere.

Speaker 3 You're juggling snacks, nap time, and everything else.

Speaker 3 Well, Gerber can help create a more parent-friendly game day because they have the most clean label project certifications of any baby food brand.

Speaker 3 So you can feel good about what you're feeding your little ones. I mean, Mac loves them.
You can't go wrong with the little crunchies.

Speaker 3 You just put him in a little bag or you put him in a little container and he's good to go. Make sure to pick up your little one's favorite Gerber products at a store near you.

Speaker 1 Season three of Sniffy's Cruising Confessions is here.

Speaker 1 Hosts Gabe Gonzalez and Chris Patterson Rosso are going deeper than ever with bold new conversations, fresh guests, and unfiltered takes on queer sex and cruising.

Speaker 1 This season, they're also looking out for the community, covering smart cruising in a chaotic world, including information on prep.

Speaker 1 And yes, their call-in segment is getting even hotter and they'll react to your wildest cruising confessions on air. No pressure.

Speaker 1 Tune into Sniffy's Cruising Confessions, sponsored by Healthy Sexual from Gilead Sciences, with new episodes every Thursday on the iHeartRadio app or wherever you get your podcasts.

Speaker 2 Tired of spills and stains on your sofa? Wash away your worries with Anibay. Anibay is the only machine-washable sofa inside and out where designer quality meets budget-friendly prices.

Speaker 2 That's right, sofas start at just $699.

Speaker 2 Enjoy a no-risk experience with pet-friendly, stain-resistant, and changeable slip covers made with performance fabric.

Speaker 2 Experience cloud-like comfort with high-resilience foam that's hypoallergenic and never needs fluffing. The sturdy steel frame ensures longevity and the modular pieces can be rearranged anytime.

Speaker 2 Shop washable sofas.com for early Black Friday savings up to 60% off site-wide, backed by a 30-day satisfaction guarantee. If you're not absolutely in love, send it back for a full refund.

Speaker 2 No return shipping or restocking fees, Every penny back. Upgrade now at washable sofas.com.
Offers are subject to change and certain restrictions may apply.

Speaker 1 Managing multiple accounts and logins for your marketing needs is like managing multiple announcers for one ad. Confusing.

Speaker 2 But with MailChimp's new SMS features, you can reach all your customers in over 10 countries all from one account, giving you more time, driving more conversions, and improving campaign performance.

Speaker 2 One platform, many audiences, endless possibilities. That's how you MailChimp your marketing with SMS.
Tap the banner to learn more.

Speaker 1 So this has a bit of fun digression, but I got to ask at the start of this,

Speaker 1 the story that is most relevant to the people we're talking about today, that I think most of our listeners will have heard, I'm curious if you've heard about.

Speaker 1 Back on January 21st, right as the Trump administration took power, a Border Patrol agent was shot and killed along with another individual at a traffic stop in Coventry, Vermont, right?

Speaker 1 There were two people in a car. It was pulled over by a Border Patrol.
One of those people drew a gun. There was a firefight.
One of the people in the car and the cop died, right?

Speaker 1 Okay. Have you heard about this? No, I'm not familiar with this at all.

Speaker 1 It's one of those things where it would have been a much bigger story, obviously, immigration being the thing that it is, right?

Speaker 1 In the United States, like the political hot issue that it is right now.

Speaker 1 Like the Republicans have been desperately waiting for a Border Patrol officer to get shot and wounded that they can use to justify a crackdown.

Speaker 1 But, number one, this happened on the Canadian border, not the southern border,

Speaker 1 and one of the two people who drew their guns on the cops was an immigrant, but they were a German immigrant. And so, none of this really like

Speaker 1 it was all like right on the edge of being super useful to the right, but it's not

Speaker 1 sexy. Like, I live in Denver.
We were dealing with our own right-wing immigration propaganda at that time.

Speaker 1 Yeah. Yeah.
It was just like, it was like the closest to being a perfect right-wing, like, false flag, like a Reichstag fire event, but like just a little too weird.

Speaker 1 Yeah, you gotta, you gotta throw some spice in there. You gotta have like a Latin country.
That's what they get excited about. Yeah.
And obviously, California border is where you want it, you know?

Speaker 1 Yeah, definitely, definitely. Even New Mexico could be, yeah. Or at least, at least they need to have fentanyl in the car.
In fact, they were not breaking any laws that anyone could prove at the time.

Speaker 1 They just looked kind of weird. Okay.

Speaker 1 They looked kind of weird and they had guns, but it was like they had like two handguns and like 40 rounds and some old targets. They were like coming back from a shooting range, right?

Speaker 1 Like not a lot of

Speaker 1 guns and ammo in America terms, right? Right. Especially in Vermont terms.
Right.

Speaker 1 So the other thing that was weird about this is that the German immigrant who died was a trans woman.

Speaker 1 So then, again, we get back to like, wow, there's a lot about this shooting that is like right on the edge of some issues that the right is really trying to use as like a fulcrum to like push through some awful shit.

Speaker 1 And as more and more information came out about this shooting, the weirder it seemed because there was a lot of initial talk. Is this like a terrorist attack?

Speaker 1 Were these like two Antifa types who were like looking to murder a Border Patrol agent? But no, that doesn't really make sense because like they got pulled over.

Speaker 1 Like they can't have been planning this, right? Like it didn't, it didn't really seem like that.

Speaker 1 And really no one could figure out why they had opened fire.

Speaker 1 But as the days went on, more information started coming out, not just about the two people who were arrested in this, well, the one person who was arrested and the one person who died, but about a group of people around the country that they were linked to.

Speaker 1 And these other people were not all, but mostly trans women.

Speaker 1 They were mostly people who kind of identified as both anarchists and members of the rationalist subculture, which we'll talk about in a little bit.

Speaker 1 And they were all super high-achieving people in like the tech industry and like sciences, right? These were like people who had won like awards and had advanced degrees.

Speaker 1 The lady who died in the shooting was a quant trader. So these are not like the normal shoot it out with the cops types.
Yeah, this is a very niche group. This is a very strange story.

Speaker 1 So people start being like, oh, the fuck is happening? And it's a group of people who could not meet each other without the invention of the internet. Right.

Speaker 1 That is, boy, David.

Speaker 1 Do you know where this story's going?

Speaker 1 Or at least starting.

Speaker 1 So, like, it's a couple of days into this when, like, a friend of mine messaged me and is like, hey, you know that shooting in Vermont? And I was like, yeah.

Speaker 1 And he's like, my friend is like, you know, there's Zizians. And I was like, wait, what?

Speaker 1 What the fuck? Because I had heard of these people. This is a weird little subculture.

Speaker 1 I'm always, I'm like, you know, I study weird little internet subcultures in part because like some of them do turn out to do acts of terrorism later.

Speaker 1 So I was in my library.

Speaker 1 And I've been reporting on the rationalists who are not like a cult, but who do some cult-adjacent things. And I just kind of find annoying.

Speaker 1 And I'd heard about this offshoot of the rationalists called the Zizians. They were very weird.
There were some like weird crime allegations.

Speaker 1 A couple of them had been involved in a murder in California a year earlier. But like, it was not a group that I ever really expected to see blow up in the media.

Speaker 1 And then suddenly they fucking did, right?

Speaker 1 And they're called the Zizians. That's not a name they have for themselves.
They don't consider themselves a cult. They don't all like live.

Speaker 1 A group of them did live together, but like these people are pretty geographically like dispersed around the country. They're folks who met online arguing about

Speaker 1 rationality and discussing rationalism and the ideas of a particular member of that community who goes by the name Ziz, right?

Speaker 1 That's where this group came out of. And the regular media was not well equipped to understand what was going on.

Speaker 1 And I want to run through a couple of representative headlines that I came across just in like looking at mainstream articles about what had happened.

Speaker 1 There's an article from the Independent, the title Inside the Zizians: How a Cultish Crew of Radical Vegans Became Linked to Killings Across the United States.

Speaker 1 They seemed like just another band of anarchist misfits

Speaker 1 scraping on the fringes of Silicon Valley until the deaths began.

Speaker 1 And then there's a KCRW article, Zizians, the vegan techie cult tied to murders across the U.S.

Speaker 1 And then a Fox article, Trans Vegan Cult Charged with Six Murders. There you go.
Classic, that's, that's Fox style. Yes. None of these titles are very accurate, um,

Speaker 1 in that, I guess, the first one is like the closest, where like these people are radical vegans and they, they are cultish, right? So I'll give, I'll give the Independent that. Um,

Speaker 1 Vegan techie cult is not really how I would describe them.

Speaker 1 Like some of them were in the tech industry, but like the degree to which they're in the tech industry is a lot weirder than that gets across.

Speaker 1 And they're not really a trans cult. Like, they're trans vegans, but the cult is not about being a trans vegan. That's just kind of how these people found each other.
Oh, they just happened to be.

Speaker 1 That was just the common ground. Veganism is tied to it.
They just kind of all happen to be trans. That's not really like tied to it necessarily.

Speaker 1 So I would argue also that they're not terrorists, which a lot of people have, a number of the other articles called them.

Speaker 1 None of the killings that they were involved with, and they did kill people, were like terrorist killings.

Speaker 1 They're all much weirder than that, but none of them are like, none of the killings I have seen are for a clear political purpose, right? Which is kind of crucial for it to be terrorism.

Speaker 1 The murders kind of evolved out of a much, much sillier reason.

Speaker 1 And it's, you know, there's one really good article about them by a fella at Wired who, you know, spent a year or so kind of studying these people.

Speaker 1 And that article does a lot that's

Speaker 1 good, but it doesn't go into as much detail about what I think is the real underpinning of why this group of people got together and convinced themselves it was okay to commit several murders.

Speaker 1 And I think that that all comes down, more than any other single factor, to rationalism and to their belief in this weird online cult that's very much based on, like, asking different sorts of logic questions and trying to, like, pin down the secret rules of the universe by doing, like, game theory arguments on the internet over blogs, right?

Speaker 1 Like that's really how all of this stuff started. So they have like someone named Mystery.

Speaker 1 Yes. A lot of people in funny hats.

Speaker 1 They do actually, they're a little adjacent to this and they come out of that period of time, right? Where like pickup artist culture is also like forming.

Speaker 1 They're one of this like generation of cults that starts with a bunch of blogs and shit on the internet in like 2009, right?

Speaker 1 And this, this is,

Speaker 1 it's so weird because we, we use the term cult, and that's the easiest thing to call these people.

Speaker 1 But generally, when our society is talking about a cult, we're talking about like you have an individual, that individual brings in a bunch of followers, gets them, isolates them from society, puts them into an area where they are in complete control, and then tries to utilize them for like a really specific goal.

Speaker 1 There's like a way to kind of look at the Zizians that way, but I think it would be better to describe them as like cult-ish, right?

Speaker 1 They use the tools of cult dynamics and that produces some very cult-like behavior.

Speaker 1 But there's also a lot of differences between like how this group works and what you'd call a traditional cult, including a lot of these people are separate from each other and even don't like each other.

Speaker 1 But because they've been inculcated in some of the same beliefs, through these kind of cult dynamics, they make choices that lead them to like participate in violence too. Where is their hub?

Speaker 1 Is it like a message board type of situation? Like, how is it? Yes, yes. So, I'm gonna,

Speaker 1 I'm gonna have to go back and forth to explain all of that. Um, also, I do want to know what the technology of Zizzy in it is.
It's because from slits, no, no, Z I Z.

Speaker 1 The lady who is kind of the founder of this is a

Speaker 1 the name that she takes for herself is Ziz, right? Okay,

Speaker 1 should have been the Z Girls. That's much more appealing.
These people are big in the news right now.

Speaker 1 Because of that murder. Because of the several murders, and the right wing is trying, wants to make it out as like this is a trans death cult.

Speaker 1 And this is more of like an internet AI nerd death cult.

Speaker 1 I guess that's better.

Speaker 3 It's just different.

Speaker 1 You know,

Speaker 1 you're right.

Speaker 1 It was just a different thing. And I think it's important.

Speaker 1 If you're if, like, you care about like cults because you think they're dangerous and you're arguing that, like, hey, this cult seems really dangerous, you should understand like what the cult is, right?

Speaker 1 Right.

Speaker 1 Like, if you misunderstood the Scientologists and thought, like, these are obsessive fans of science fiction who are committing murders over science fiction stories, it's like, no, no, they're committing murders because of something stupider.

Speaker 1 Yeah. Like, much more stupid.

Speaker 1 Okay. So I got to take, I am going to explain to you what rationalism is,

Speaker 1 who Ziz is, where they come from, and how they get radicalized to the point where they are effectively at the hub of something that is at least very adjacent to a cult.

Speaker 1 But I want to talk a little bit about the difference between like a cult and cult dynamics, right?

Speaker 1 A cult is fundamentally a toxic thing. It is bad.
It always harms people. There is no harmless cult, you know?

Speaker 1 It's like rape. Like, there's no version of it that's good, you know? Like, it is a fundamentally dangerous thing.
Cult dynamics and the tactics cult leaders use are not always toxic or bad.

Speaker 1 And in fact, every single person listening to this has enjoyed and had their life enriched by the use of certain things that are on the spectrum of cult dynamics.

Speaker 1 I was going to say, it seems a lot more like you have that at work. You have that at work.

Speaker 1 Yeah, anyway, that's a huge part of what makes a great fiction author who is able to like attract a cult following. You've ever had that experience?

Speaker 1 Like a big thing in cults is the use of and creation of new language. You get people using words that other, they don't use otherwise and like phrases.

Speaker 1 And that is both a way to bond people because like, you know, it helps you feel like you're part of this group and it also isolates you from people.

Speaker 1 If you've ever met met people who are like hugely into, you know, Dungeons and Dragons or huge fans like Harry Potter or the Lord of the Rings, like they have like things that they say, like memes and shit that they share based on those books.

Speaker 1 And like, that's a less toxic, but it's on the same spectrum, right?

Speaker 1 It's this, I am a part of this group of people and we use these words that mean something to us that don't mean things to other people, right? And that's like an empowering feeling, right? Yes. Yes.

Speaker 1 Yeah. That's like a great, that's like a great way to ball.
I think it's any group, right? I mean, we see entertainers, your friends,

Speaker 1 yeah,

Speaker 1 has in jokes, right? Like sports.

Speaker 1 Yeah. The beehive could kill people, right?

Speaker 1 Exactly. Yes, yes.

Speaker 1 And like, you've got, you know, you and you and your buddies that have been friends for years, you have like, you could, there's like a word you can say, and everyone knows that you're referring to this thing that happened six years ago.

Speaker 1 And you all like laugh because, you know, it reminds you of something, you know, because it's relevant to something happening then. That's a little healthy bit of cult dynamics at play, right?

Speaker 1 You know,

Speaker 1 it's like a diet, you know, so there's a toolbox here and we play with it and different different organizations, but churches play with it.

Speaker 1 And obviously, a lot of churches cross the line into cults, but there's also aspects of, for example,

Speaker 1 you know, there's churches that I know I have seen people go to where like it's very common, everybody gets up and like hugs at a certain point.

Speaker 1 And like people benefit from human contact, it makes them feel nice.

Speaker 1 It can be like a very healthy thing.

Speaker 1 I've gone to, I used to go to like Burning Man Regionals and like you would like start at this greeter station where like a bunch of people would come up and they'd offer you like food and drinks and you know people would hug each other and it was this like changes your mind state from where you were in before kind of opens you up that Burning Man Regionals.

Speaker 1 Is that like to qualify for state? Yeah, yeah, yeah, yeah. So that we could get to go.
It was just like these local little events in Texas, right?

Speaker 1 Like a thousand people in the desert trying to forget that we live in Texas. Okay.

Speaker 1 Or not desert, but it was very, like, it's, it's, it was like a really valuable part of like my youth because it was the first time I ever started to like feel comfortable in my own skin.

Speaker 1 But also that's on the spectrum of love bombing, which is the thing cults do, where they like surround you from people, with people who like.

Speaker 1 talk about like like, you know, will touch you and hold you and tell you they love you.

Speaker 1 And like, you know, part of what brings you into the cult is the cult leader can take that away at any moment in time, right?

Speaker 1 It's the kind of thing where, if it's not something where, no, this is something we do for five minutes at the end of every church service, right?

Speaker 1 You can very easily turn this into something deeply dangerous and poisonous, right?

Speaker 1 But also, a lot of people just kind of play around a little bit with pieces of that, with a piece of the cult dynamics. Just a little bit of disaster.
Just a little bit.

Speaker 1 Any good musician, any really great performer is fucking with some cult dynamics, right?

Speaker 1 I was going to say, I mean, I've been to like so many different concerts of like weird niche stuff where you're like, maybe the disco biscuits is the cult.

Speaker 1 I don't know.

Speaker 1 Yeah. I mean, I've been to some childish Gambino concerts where it's like, oh, yeah, he's doing, he's a little bit of a cult leader, you know?

Speaker 1 Like, just 10%, right?

Speaker 1 I mean, what are you going to do with all that charisma? You got to put it somewhere, you know? Yeah.

Speaker 1 So these are, I think that it's important for people

Speaker 1 to understand both that like the tactics and dynamics that make up a cult have versions of them that are not unhealthy. But I also think it's important for people to understand

Speaker 1 cults come out of subcultures, right? This is very close to 100% of the time. Cults always arise out of subcultural movements that are not in and of themselves cults.

Speaker 1 For example, in the 1930s, through like the 50s, 60s, you have the emergence of what's called the self-help movement, you know?

Speaker 1 And this is all of these different books on like how to persuade people, how to, you know, win friends and influence people, you know, how to like make, but also stuff like Alcoholics Anonymous, you know, how to like improve yourself by getting off drugs, getting off alcohol.

Speaker 1 All these are pieces of the self-improvement movement, right?

Speaker 1 That's a subculture. There are people who travel around, who get obsessed, who go to all of these different things, and they'll, and they get a lot of benefit.

Speaker 1 You know, people will show up at these seminars where there's hundreds of other people and a bunch of people will like hug them and they feel like they're part of this community, and they're making their lives better.

Speaker 1 And oftentimes, especially like once we get to the 60s, 70s, these different sort of guru types are saying that, like, you know, this is how we're going to save the world.

Speaker 1 If we can get everybody doing, you know, this, this yoga routine or whatever that I've put together, it'll fix everything. Who's that guy who had the game?

Speaker 1 Oh, God, yes. You know what I'm talking about? Yeah, yeah, yeah.
And they had to like, they had to viciously confront each other. Yes, that we've covered them.
That is Synanon. Yes.
Yeah.

Speaker 1 So that's what I'm talking about. That's what I'm talking about.
And we have this broader subculture of self-help, and a cult, Synanon, comes out of it.

Speaker 1 And I get it. It's like the subculture, it's already, it's intimate.
You feel closer to those people than to anybody else. It definitely feels

Speaker 1 ripe for manipulation. And Scientology is a cult that comes out of the exact same subculture.
We talked.

Speaker 1 last week or week before or two weeks ago about Tony Alamo, who's an incredibly abusive pedophile Christian cult leader.

Speaker 1 He comes out of, along with a couple other guys we've talked about, the Jesus freak movement, which is a Christian subculture that arises as a reaction to the hippie movement.

Speaker 1 It's kind of the countervailing force to the hippie movement. So you got these hippies and you have these Christians who are like really scared of this kind of like weird left-wing movement.

Speaker 1 And so they start kind of doing like a Christian hippie movement almost, right?

Speaker 1 And some of these people just start weird churches that sing annoying songs. And some of these people start hideously dangerous cults.

Speaker 1 You have the subculture, and you have cults that come out of it, right? And the same thing is true in every single period of time, right? Cults form out of subcultures, you know?

Speaker 1 And part of this is because people who would, a lot of people who find themselves most drawn to subcultures, right,

Speaker 1 tend to be people who feel like they're missing something in the outside world, right?

Speaker 1 You know, not every, but people who get most into it. And so, so, does that mean, like, so maybe, like, more, I'm just curious, like, more broader cultural waves have never led.

Speaker 1 There's like the Swifties would not be a cult. No, there's no, most likely not going to be an offshoot of the Swifties that becomes a cult because it's so broad.

Speaker 1 It has to have already been kind of a smaller subset. That's interesting.

Speaker 1 Well, yeah, and I think I, but, but that said, there have been cults that have started out of like popular entertainers and musicians.

Speaker 1 Like, you know, you could, we could talk about Corey Feldsman's weird house full of young women dressed as angels, right?

Speaker 1 Like,

Speaker 1 you know, um,

Speaker 1 so

Speaker 1 yeah, you've got, as a general rule, like there are music is full of subcultures, like punk, right?

Speaker 1 But there have definitely also been some punk communities, kind of individual little chunks of punk communities, that have gone in, like, culty directions, right?

Speaker 1 Even if you're talking about

Speaker 1 huh? Yeah. Yes.
Yeah.

Speaker 1 Fuck them.

Speaker 1 So there are cults that come out of the subculture. This is the way cults work.

Speaker 1 And I really just don't think there's very good education on what cults are, where they come from, or how they work, because all of the people who run this country have, like, a lot of cult leader DNA in them.

Speaker 1 You know, everybody.

Speaker 1 Yeah.

Speaker 1 We're being run currently by someone who is seen as a magic man.

Speaker 1 Cults all the way down. Yes, exactly, exactly.
So I think there's a lot of vested interests in not explaining what a cult is and where they come from.

Speaker 1 And so I want to, I think it's important to understand subcultures birth cults. And also cult leaders are drawn to subcultures when they're trying to figure out how to make their cult.

Speaker 1 Because a subculture, you know, most of the people in it are just going to be like normal people who are just kind of into this thing.

Speaker 1 But there will always be a lot of people who are like, This is the only place I feel like I belong. I feel very isolated.

Speaker 1 This is like the center of my being, right?

Speaker 1 And so it's just, like, a good place to recruit. You know, those are the kind of people you want to reach if you're a cult leader.

Speaker 1 You know, I'm not saying like, again, I'm not saying subcultures are bad. I'm saying that like some chunk of people in subcultures are ready to be in a cult, you know?

Speaker 1 Yeah, yeah, I think if I reflect on my own personal life, yeah, you meet a lot of guys who are just like, I'll die for the skate park or whatever thing. Yeah.

Speaker 1 Or like the Star Wars fans who were sending death threats to Jake Lloyd after The Phantom Menace, where it's like, well, you guys are crazy.

Speaker 1 That is insane. You know, he's like eight, right? This is a movie.
He also didn't write it. He didn't write it?

Speaker 1 Like, what are you doing?

Speaker 1 You know, whatever makes you feel a sense of home, I guess. So, and again, that's kind of a good point.
Like,

Speaker 1 Star Wars fans aren't a cult, but you can also see some of, like, the toxic things cults do erupt from time to time from, like, video game fans, right?

Speaker 1 People who are really into a certain video game, it's not a cult, but also periodically, groups of those fans will act in ways that are violent and crazy. And

Speaker 1 it's because of some of these same factors going on, right? I think people forget fan is short for fanatic. Exactly, exactly, right?

Speaker 1 And it's like, you know, the events that I went to very consciously played with cult dynamics. After you got out of that, like, greeting station thing, where all these people were kind of, like, love bombing you for, like, five minutes, there was, like, a big bar, and it had, like, a sign above it that said "Not a religion. Do not worship." And people would talk about it like, we are playing with the ingredients of a cult.

Speaker 1 We're not trying to actually make one. So you need to constantly remind people of like what we're doing and why it affects their brain that way.

Speaker 1 And in my case, it was like, because I was at, like, a low point in my life then. This was when I was, like, 20.
I had no kind of drive in life.

Speaker 1 I was honestly dealing with a lot of like suicidal ideation.

Speaker 1 This is the point at which I would have been vulnerable to a cult. And it, I think it acted a little bit like a vaccine.
Like I got a little dose of the drug. Right.
It was enough.

Speaker 1 Built up an immunity. Exactly.
And now you're like, hey, I know what that is. I know what's going on there.

Speaker 1 So anyway, I needed to get into this because the Zizians, this thing that I think is, it's either a full-on cult or at least cult-ish, right?

Speaker 1 That is responsible for this series of murders that are currently dominating the news and being blamed on like a trans vegan death cult or whatever.

Speaker 1 They come out of a subculture that grows out of the early aughts internet known as the rationalists. The rationalists started out as a group in the early aughts on the comment sections of two blogs.

Speaker 1 One was called Less Wrong and one was called Overcoming Bias.

Speaker 1 Less Wrong was started by a dude named Eliezer Yudkowsky. I have talked about Eliezer on the show before.
He's not,

Speaker 1 he sucks.

Speaker 1 I think he's a bad person.

Speaker 1 He's not a cult leader, but again, he's playing with some of these cult dynamics and he plays with them in a way that I think is very reckless, right? And ultimately leads to some serious issues.

Speaker 1 Now, Eliezer's whole thing is he considers himself the number one world expert on AI risk and ethics. Now,

Speaker 1 you might think from that, oh, so he's like, he's like making AIs. He's like working for one of these companies that's involved in like coding and stuff.
Absolutely not.

Speaker 1 Absolutely not.

Speaker 1 No, no. Armchair quarterback.
Backseat driver. No.
He writes long articles about what he thinks AI would do.

Speaker 1 and what would make it dangerous that are based almost entirely off of short stories he read in the 1990s. Like,

Speaker 1 this guy.

Speaker 1 That's the most internet shit I've ever heard.

Speaker 1 It's so fun.

Speaker 1 It's such internet. And, like, I'm not a fan of the quote-unquote real AI, but Yudkowsky is not even one of these guys who's like, no, I'm, like, making a machine that you talk to.

Speaker 1 Like, I have no credible.

Speaker 1 I just have an opinion. Yeah.

Speaker 1 An outdated opinion. I hate this guy so much.
Speaking of things I hate, not going to ads.

Speaker 4 Honestly, honestly, honestly, no one wants to think about HIV, but there are things that everyone can do to help prevent it. Things like PrEP.

Speaker 4 PrEP stands for Pre-Exposure Prophylaxis, and it means routinely taking prescription medicine before you're exposed to HIV to help reduce your chances of getting it.

Speaker 4 PrEP can be about 99% effective when taken as prescribed. It doesn't protect against other STIs, though, so be sure to use condoms and other healthy sex practices.

Speaker 1 Ask a healthcare provider about all your prevention options and visit findoutaboutprep.com to learn more. Sponsored by Gilead.

Speaker 2 There's nothing like sinking into luxury. Anabay sofas combine ultimate comfort and design at an affordable price.

Speaker 2 Anabay has designed the only fully machine washable sofa from top to bottom. The stain-resistant performance fabric slip covers and cloud-like frame duvet can go straight into your wash.

Speaker 2 Perfect for anyone with kids, pets, or anyone who loves an easy-to-clean, spotless sofa. With a modular design and changeable slip covers, you can customize your sofa to fit any space and style.

Speaker 2 Whether you need a single chair, love seat, or a luxuriously large sectional, Anabay has you covered. Visit washablesofas.com to upgrade your home.

Speaker 2 Sofas start at just $699 and right now, get early access to Black Friday savings, up to 60% off store-wide, with a 30-day money-back guarantee. Shop now at washablesofas.com.

Speaker 1 Add a little

Speaker 2 to your life. Offers are subject to change and certain restrictions may apply.

Speaker 3 This is Erin Andrews from Calm Down with Erin and Carissa.

Speaker 3 Now, I know I didn't invent being a busy mom, but during football season, between the sideline gig, everything else I have going on, and my little one, it's a lot.

Speaker 3 That's why I'm seriously excited to be teaming up with Gerber. They do so much to make football season a more parent-friendly experience.

Speaker 3 I mean, over 95 years, they've been the MVP for parents who just want to nourish their little ones with stuff they can trust. And you can certainly trust Gerber.

Speaker 3 Did you know Gerber holds the most clean label project certifications of any baby food brand out there? And Gerber has certainly been a go-to for me.

Speaker 3 Right now, in between naps to dinner or, you know, on the way home from school, it's all about keeping Mac happy. If he's sitting and he starts to get a little frustrated, here, have a yogurt melt.

Speaker 3 It will put you in such a better mood, which means I'm in a better mood too. It all comes down to this.
With Gerber, there's just one less thing to worry about.

Speaker 3 And that really lightens the load for me. So grab your little ones Gerber favorites at a store near you.

Speaker 1 High key. Looking for your next obsession? Listen to High Key, a bold, joyful, unfiltered culture podcast coming at you every Friday.

Speaker 1 Now, my question is, in this game of mafia that we're going to play, are you going to do better than me? Say it now.

Speaker 1 Duh. Period.
I'm going to eat. You're going to do better than me? I'm going to eat.
Yes. I literally will.
Ryan will. I cannot wait till we both team up and get you out.

Speaker 1 And then one of us gets the other out because we didn't realize they were a traitor the whole time and you were actually an innocent. Y'all won't even know that I'm a traitor.

Speaker 1 This is going to be delicious.

Speaker 1 Well, thank you for coming to our show.

Speaker 1 And on that note, thank you for coming to my show.

Speaker 1 Listen to High Key on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Speaker 1 We're back. So, Yudkowsky, this AI risk and ethics guy, starts this blog in order to explore a series of thought experiments based in game theory.

Speaker 1 And his,

Speaker 1 I am annoyed by game theory from the first sentence I ever heard about it.
It sucks.

Speaker 1 Look, man, I know that there's, like, valid applications, but, like, it's all just always so stupid and annoying to me.

Speaker 1 Anyway,

Speaker 1 a bunch of thought

Speaker 1 experiments based in game theory, with the goal of teaching himself and others to think more logically and effectively about the major problems of the world.

Speaker 1 His motto for the movement and himself is winning.

Speaker 1 The rationale, yeah, yeah. That's where Sheen got it.
Yeah, that's where Sheen picked it up. Yeah.
All right. Good morning.

Speaker 1 They're tied in with biohacking, right?

Speaker 1 This is kind of starting to be a thing at the time and brain hacking and the whole like self-optimization movement that feeds into a lot of like right-wing influencer space today.

Speaker 1 Yudkowsky is all about optimizing your brain and your responses in order to allow you to accomplish things that are

Speaker 1 not possible for other people who haven't done that.

Speaker 1 And there's a messianic air to this too, which is he believes that only by doing this, by spreading rationalist principles, in order to, quote, "raise the sanity waterline."

Speaker 1 That's how he describes it.

Speaker 1 That's going to make it possible for us to save the world from the evil AI that will be born if enough of us don't spend time reading blogs.

Speaker 1 That's great.

Speaker 1 It's awesome.

Speaker 1 This is Pete. This is

Speaker 1 the good stuff.

Speaker 1 So Yudkowsky and his followers see themselves as something unique and special. And again, there's often a messianic air to this, right? We are literally the ones who can save the world from evil AI.

Speaker 1 Nobody else is thinking about this or is even capable of thinking about this, because they're too illogical. He holds himself as kind of like a deity. On top of this, he doesn't really deify himself, but he also does

Speaker 1 talk about himself, like, in a way where

Speaker 1 clearly other people aren't capable of understanding all of the things that he's capable of understanding, right? Okay.

Speaker 1 So there is a little bit, it's more like superheroification, but it's

Speaker 1 a lot. You know what this is closest to? What these people,

Speaker 1 all of them would argue with me about this, but I've read enough of their papers and enough Dianetics to know that this is new Dianetics. Like this is church.

Speaker 1 Because the Church of Scientology, no, the Church of Scientology's stuff has more occult and weird, like, magic stuff in it.

Speaker 1 But this is all about: there are activities and exercises you go through that will rid your body of, like, bad ingrained responses, and that will make you a fundamentally more functional person.

Speaker 1 Okay, so the retraining of yourself in order to exactly

Speaker 1 huge deal. And also, a lot of these guys wind up, like, referring to the different, like, techniques

Speaker 1 that he teaches as "tech," which is exactly what the Scientologists call it. Like, there's some, there's some shit I found that could have come right out of a Scientology pamphlet.

Speaker 1 Do you guys not realize what you're doing? I think they do, actually. So, he's, you know, in the process of inventing this kind of new mental science that verges on superpowers.

Speaker 1 And it's one of those things where

Speaker 1 people don't tend to see these people as crazy if you just sort of, like, read their arguments a little. It's, like, them

Speaker 1 going over old thought experiments and being like, so the most rational way to behave in this situation is this, for this reason. You have to really, like, dig deep into their conclusions

Speaker 1 to see how kind of nutty a lot of this is.

Speaker 1 Now, again, I compared this to Scientology. Yudkowsky isn't a high control guy like Hubbard. He's never going to make a bunch of people live on a flotilla of boats in the ocean with him.

Speaker 1 You know,

Speaker 1 There's definitely, like, some allegations of bad treatment of, like, some of the women around him. And, like, he has, like, a Bay Area set that hangs with him.

Speaker 1 I don't think he's like a cult leader. You know, you could say he's on.
Is he drawing people to him physically, or is this all virtual?

Speaker 1 I mean, a lot of people move to the Bay Area to be closer to the rationalist scene. Although, again,

Speaker 1 Bay Area City. I'm a Bay Area guy.

Speaker 1 San Fran. San Fran.
Oh, this is a city.

Speaker 1 This is a San Francisco thing because all of these are tech people. Oh, okay.
So this is like a... Yes.
I wonder what neighborhood. Feels like San Fran or Oakland.
You can look it up.

Speaker 1 People have found his house online, right? Like,

Speaker 1 it is known where he lives.

Speaker 1 I'm not saying that for any, like, I don't harass anybody. I just like, it's, it's not a secret, like, what part of the town this guy lives in.

Speaker 1 I just didn't think to look it up. But, like, yeah, this is, like, a Bay Area, a Bay Area tech industry subculture, right? Okay.

Speaker 1 So the other difference between this and something like Scientology is that it's not just Eliezer laying down the law. Eliezer writes a lot of blog posts, but he lets other people write blog posts too, and they all debate about them in the comments. And so the kind of religious canon of rationalism is not a one-guy thing.

Speaker 1 It's come up with by this community.

Speaker 1 And so if you're some random kid in bum fuck, Alaska, and you find these people and start talking with them online, you can like wind up feeling like you're having an impact on the development of this new thought science, you know?

Speaker 1 Yeah, that's amazing. Very, very powerful.
Yes.

Speaker 1 Now, the danger with this is that like

Speaker 1 all of this is this internet community that is incredibly like insular and spends way too much time talking to each other and way too much time developing in-group terms to talk to each other.

Speaker 1 And internet communities have a tendency to poison the minds of everyone inside of them. For example, Twitter.

Speaker 1 The reality is that

Speaker 1 the everything app. Yeah.
I just watched a video of a man killing himself while launching a shit coin.

Speaker 1 The everything app.

Speaker 1 Oh, fuck. By the way,

Speaker 1 a quick Google job indicates it's Berkeley.

Speaker 1 It is Berkeley. Yeah,

Speaker 1 that makes the most sense to me

Speaker 1 geographically. A lot of these people wind up living on boats, and, like, the Oakland Harbor boat culture is a thing.

Speaker 1 Is that ever a good thing when a big group of people move to boats?

Speaker 1 No.

Speaker 1 Absolutely not.

Speaker 1 Never,

Speaker 1 It feels like it never bodes well.

Speaker 1 Here's the thing. Boats are a bad place to live.

Speaker 1 It's for fun.

Speaker 1 Like boats and planes are both constant monuments to hubris, but a plane, its goal is to be in the air just as long as it needs. And then you get it back on the ground where it belongs.

Speaker 1 A boat's always mocking God in the sea. Yes.

Speaker 1 Or a lot of times just a harbor, like

Speaker 1 a houseboat. You know what I mean? That's where your dad goes after the divorce.
Right, right. Oh, man.
I do. One day I'll live on a houseboat.
Oh, it's going to be falling apart.
Right, right. Oh, man.
I do. One day I'll live on a houseboat.
Oh, it's going to be falling apart.

Speaker 1 It's going to just a horrible, horrible place to live. Dank.
I can't wait. That's the dream, David.

Speaker 1 That's my beautiful dream.

Speaker 1 I want to become someone. Making your own bullets.

Speaker 1 Making my own bullets, really just becoming an alcoholic. Like,

Speaker 1 not just half-assing it, like trying to become the Babe Ruth of drinking nothing but Cutty Sark

Speaker 1 scotch.

Speaker 1 I mean, if you want to be like a poop-the-bed alcoholic, a houseboat is the place for that. Yeah, yeah,

Speaker 1 that's right. That's right.
Ah, the life. I want to be like that guy from Jaws, Quint.

Speaker 1 You're going to get scurvy.

Speaker 1 That's exactly

Speaker 1 getting scurvy, destroying my liver, eventually getting eaten by a great white shark because I'm too drunk to work my boat. Ah, that's it.

Speaker 1 That's the way to go with

Speaker 1 romance.

Speaker 1 Yeah. So

Speaker 1 anyway, these internet communities, like the rationalists, even when they start from a reasonable place, get warped because of how internet stuff works. One of the things about internet communities is that when people are, like, really extreme and pose the most extreme and out-there version of something, that gets attention.

Speaker 1 People talk about it. People get angry at each other.
But also like that kind of attention encourages other people to get increasingly extreme and weird.

Speaker 1 And the result is just kind of a derangement. I think internet communities should never last more than a couple of years, because everyone gets crazy, you know? Like, it's bad for you.

Speaker 1 I say this as someone who was raised on these, right? It's bad for you. And, like, it's bad for you in part because when people get really into this, this becomes the only thing, like,

Speaker 1 especially a lot of these, like, isolated kids who are getting obsessed with rationalism. All they're reading is these rationalist blogs.

Speaker 1 All they're talking to is other rationalists on the internet. And in San Francisco, all these guys are hanging out all of the time and talking about their ideas.

Speaker 1 And this is bad for them for the same reason that, like, it was bad for all of the nobles in France that moved to Versailles, right? Like, they all lived together and they went crazy.

Speaker 1 Human beings need regular contact with human beings they don't know.

Speaker 1 The most lucid and wisest people are always, always the people who spend the most time connecting to other people who know things that they don't know. This is an immutable fact of life.

Speaker 1 This is just how existing works. Like,

Speaker 1 if you, if you think I'm wrong, please consider that you're wrong and go find a stranger under a bridge. You know, just start.

Speaker 1 They will know some shit.

Speaker 1 They might have some powders you haven't tried. Oh, yeah.
Pills and powders. Shit's going good for you.

Speaker 1 That's an echo chamber you want to be a part of. Yeah, exactly, exactly.

Speaker 1 So the issue is that Yudkowsky starts postulating on his blog various rules of life based on these thought experiments.

Speaker 1 A lot of them are, like, older thought experiments that, like, different intellectuals, physicists, psychiatrists, psychologists, whatnot, had come up with in, like, the '60s and stuff, right?

Speaker 1 And he starts taking them and coming up with, like, corollaries or alternate versions of them, and, like, trying to solve some of these thought problems with his friends, right?

Speaker 1 Most of what's happening here with these thought experiments is they're mixing these kind of 19th and 20th century philosophical concepts. The big one is utilitarianism.

Speaker 1 That's, like, a huge thing for them, the concept that ethics means doing the greatest good for the greatest number of people, right?

Speaker 1 And that ties into the fact that these people are all obsessed with the singularity. The singularity for them is

Speaker 1 the

Speaker 1 concept that we are on the verge of developing an all-powerful AI that will instantly gain intelligence and gain a tremendous amount of power, right? It will basically be a god.

Speaker 1 And the positive side of this is it'll solve all of our problems, right? You know, it will literally build heaven for us, you know, when the singularity comes.

Speaker 1 The downside of it is it might be an evil god that creates hell, right?

Speaker 1 So the rationalists are all using a lot of these thought experiments, and, like, their utilitarianism becomes heavily based around: how do we do the greatest good, by which I mean influencing this AI to be as good as possible toward humanity?

Speaker 1 That's the end goal, right? Are they actively because you said the

Speaker 1 leader was not, are these people now actively working within AI? Or, a bunch of them have always been actually working in AI. Yudkowsky would say, no, I work in AI.

Speaker 1 He's got a think tank that's dedicated to like AI, ethical AI.

Speaker 1 It's worth noting that most of the people in this movement, including Yudkowsky, once, like, AI became an actual, I don't want to say these are actual, like, intelligences, because I don't think they are, but once ChatGPT comes out and this becomes, like, a huge deal, people start to believe there's a shitload of money in here.

Speaker 1 A lot of these businesses, all of these guys, or nearly all of them, get kicked to the curb, right? Because

Speaker 1 none of these companies really care about ethical AI, you know, like they don't give a shit about what these guys have to say.

Speaker 1 And Yudkowsky now is, he's, like, very angry at a lot of these AI companies because he thinks they're very recklessly, like,

Speaker 1 making the god that will destroy us instead of like doing this carefully to make sure that AI isn't evil.

Speaker 1 Anyway, but a lot of these people are in and adjacent to different chunks of the AI industry, right? They're not all working on, like, LLMs.

Speaker 1 And in fact, there are a number of scientists who are in the AI space, who think AI is possible, who think that the method that like OpenAI is using, LLMs, cannot make an intelligence, that that's not how you're ever going to do it.

Speaker 1 If it's possible, they have other theories about it. I don't need to get into it further than that, but these are like a bunch of different people.

Speaker 1 Some of them are still involved with the mainstream AI industry. Some of them have been very much pushed to the side.
So. All this starts, again, with

Speaker 1 these fairly normal game theory questions, but it all gets progressively stranger as people obsess over coming up with, like, the weirdest and most unique take, in part to get, like, clout online, right? I'll give you an example. So much of rationalist discourse among the Yudkowsky people is focused on what's called decision theory, right? This is drawn from a thought experiment called Newcomb's paradox, which was created by a theoretical physicist in the 1960s.

Speaker 1 Hey, just to make a quick correction here: I was a little bit glib. Decision theory isn't drawn from Newcomb's paradox, nor does it start with Yudkowsky.

Speaker 1 But the stuff that we're talking about, like how decision theory kind of comes to be seen in the rationalist community, a lot of that comes out of Newcomb's paradox. It's a much older

Speaker 1 thing, you know, than the internet. It goes back centuries, right? People have been talking about decision theory for a long time.
Sorry, I was imprecise.

Speaker 1 I am going to read how Newcomb's paradox is originally laid out. Imagine a super intelligent entity known as Omega, and suppose you are confident in its ability to predict your choices.

Speaker 1 Maybe Omega is an alien from a planet that's much more technically advanced than ours.

Speaker 1 You know that Omega has often correctly predicted your choices in the past and has never made an incorrect prediction about your choices.

Speaker 1 And you also know that Omega has correctly predicted the choices of other people, many of whom are similar to you, in the particular situation about to be described. There are two boxes, A and B.

Speaker 1 Box A is see-through and contains $1,000. Box B is opaque and contains either $0 or $1 million.

Speaker 1 You may take both boxes or only take box B. Omega decides how much money to put into box B.
If Omega believes that you will take both boxes, then it will put $0 in box B.

Speaker 1 If Omega believes that you will take only box B, then it will put $1 million in box B. Omega makes its prediction and puts the money in box B, either zero or a million dollars.

Speaker 1 It presents the boxes to you and flies away. Omega does not tell you its prediction, and you do not see how much money Omega put in box B. What do you do now?
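The choice Robert has just laid out can be put in plain expected-value arithmetic. A minimal sketch, assuming (hypothetically, since the original paradox never fixes a number) that Omega predicts correctly 99% of the time:

```python
# Expected payouts in Newcomb's paradox, assuming (hypothetically) that
# Omega predicts your choice correctly with probability p.

def expected_value(choice: str, p: float = 0.99) -> float:
    """Expected dollars for 'one-box' (take only box B) or 'two-box' (take both)."""
    if choice == "one-box":
        # Omega likely predicted one-boxing, so box B likely holds $1 million.
        return p * 1_000_000 + (1 - p) * 0
    if choice == "two-box":
        # Omega likely predicted two-boxing, so box B is likely empty:
        # you get the visible $1,000, plus $1 million only if Omega guessed wrong.
        return p * 1_000 + (1 - p) * 1_001_000
    raise ValueError(f"unknown choice: {choice}")

# As long as the predictor is good, one-boxing wins big in expectation:
print(expected_value("one-box"))   # ~990,000
print(expected_value("two-box"))   # ~11,000
```

This is why the hosts' "the decision's already been made" objection and the rationalists' one-boxing answer talk past each other: if you treat the prediction as fixed, two-boxing always adds $1,000; if you treat your choice as evidence about the prediction, the arithmetic above favors one-boxing.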

Speaker 1 I think that's stupid.

Speaker 1 I think it's a stupid question, and I don't really think it's very useful. I don't see

Speaker 1 there's so many other factors. Yeah, I don't know.

Speaker 1 Among other things, part of the issue here is that, like, well, the decision's already been made, right? Yeah, that's the point. It doesn't matter what you do. There's no autonomy in that, right?

Speaker 1 Well, you and I would think that because you and I are normal people who,

Speaker 1 I think,

Speaker 1 among other things, probably, like, grew up cooking food and filling up our cars with gas and not having our parents do all of that, because they're crazy rich people who live in the Bay and pay to send you to Super Stanford.

Speaker 1 Yeah. He turned latch key over here.
We had like problems in our lives and stuff, you know? Physical bullies.

Speaker 1 Normal. Like, I don't want to shit on people who are into this, because this is also harmless, right? And I'm not shitting on Newcomb. This is a thing a guy comes up with in the '60s, and it's a thing you talk about at, like, parties and shit among, like, other weird intellectuals, right? You pose it, you sit around drinking, you talk about it. There's nothing bad about this, right? However, when people are talking about this online, there's no end to the discussion.

Speaker 1 So people just keep coming up with more and more arcane arguments for what the best thing to do here is.

Speaker 1 And you can see how that spins out of control pretty quickly. Exactly. And the rationalists discuss this nonstop, and they come to a conclusion about how to best deal with this situation. Here's how it goes: the only way to beat Omega is to make yourself, in the past, the kind of person who would only choose box B, so that Omega, who is perfect at predicting, would make the prediction and put a million dollars in box B based on your past behavior. In other words, the decisions that you would need to make in order to win this are timeless decisions, right?

Speaker 1 You have to become in the past a person who would

Speaker 1 now again. That's what they came up with.
That's what they all came up with

Speaker 1 as the supreme answer. These are the smartest people in the world, David.
These are the geniuses. Self-described.
They're building the future. Oh, boy.

Speaker 1 Yo.

Speaker 1 It's so funny trying to like

Speaker 1 every time, because I've spent so many hours reading this, and you do kind of sometimes get into the like, okay, I get the logic there.

Speaker 1 And it's, that's why it's so useful to just like sit down with another human being and be like, yeah, this is insane. This is nuts.
This is not.

Speaker 1 This is all nuts. This is all dumb.

Speaker 1 This is why you leave it at the cocktail party.

Speaker 1 They conclude,

Speaker 1 and by which I mean largely, Yudkowsky concludes, that the decision you have to make in order to win this game is what's called a timeless decision.

Speaker 1 And this leads him to create one of his most brilliant inventions: timeless decision theory.

Speaker 1 And I'm going to quote from an article in Wired: Timeless decision theory asserts that in making a decision, a person should not consider just the outcome of that specific choice, but also their own underlying patterns of reasoning and those of their past and future selves, not least because these patterns might one day be anticipated by an omniscient adversarial AI.

Speaker 1 Oh, no.

Speaker 1 That's a crazy way to live. Motherfucker, have you ever had a problem? Have you ever really, have you ever dealt with anything? What the fuck are you talking about? This isn't how you make decisions.

Speaker 1 You make every decision?

Speaker 1 Honestly, again, I can't believe I'm saying this now, given where I was in high school. Like, go play football.

Speaker 1 Go make a cabinet, you know? Like,

Speaker 1 learn how to change your oil. Go do something.
There's a lot of assholes who use this term, but you got to go touch grass, man. You got to touch grass, man.
That's like, that's crazy.

Speaker 1 If you're talking about this kind of shit, and again,

Speaker 1 I know you're all wondering. You started this by talking about a Border Patrol agent being shot.
All of this directly leads to that man's death. We have covered a lot of ground.
This is, I'm excited.

Speaker 1 I did forget

Speaker 1 there was also going to be murder. Yeah,

Speaker 1 there sure is.

Speaker 1 Um, so Eliezer Yudkowsky describes this as timeless decision theory, and once this comes into the community, it creates a kind of logical fork that immediately starts destroying people's brains.

Speaker 1 Again, all of these people are obsessed with the imminent coming omniscient godlike AI, right?

Speaker 1 And so, do they have a time limit on it, or do they have, like, is there any timing on it, or is it just kind of vague? It's the rapture, it's the rapture.

Speaker 1 Okay, it's literally the tech guy rapture. So any day, it's coming any day.

Speaker 1 You know? He could be amongst us already. Yeah.
Yeah.

Speaker 1 So these guys are all obsessed that this godlike AI is coming. And like for them, the Omega in that thought experiment isn't like an alien.
It's a stand-in for the god AI.

Speaker 1 And one conclusion that eventually results from all of these discussions is that.

Speaker 1 And this is a conclusion a lot of people come to, if

Speaker 1 in these kinds of situations, the decisions that you make have to consider, like, your past and your future selves.

Speaker 1 Then, one logical leap from this is if you are ever confronted or threatened in a fight, you can never back down, right? And in fact, you need to immediately escalate to use maximum force possible.

Speaker 1 And if you commit, if you commit now to doing that in the future, you probably won't ever have to defend yourself because it's a timeless decision. Everyone will like

Speaker 1 like that, like that, that will impact how everyone treats you, and they won't want to start anything with you if you'll immediately try to murder anyone who fights you.

Speaker 1 I hate to be this guy, but I think this is why people need to get beat up sometimes. Yeah, yeah.
And again,

Speaker 1 that is kind of a fringe conclusion among the rationalists. Most of them don't jump to that.
But, like,

Speaker 1 the people who wind up doing the murders we're talking about, they are among the rationalists who come to that conclusion. Okay, because, yeah, okay.
Starting to make sense, huh?

Speaker 1 This is a headfuck. That's so funny.

Speaker 1 Oh, no. It didn't scare me.

Speaker 1 Because, like, this whole time I've really been only thinking about it in theory, not like practical application,

Speaker 1 because it's so insane.

Speaker 1 But, oh, no. Nope, nope, nope.
This goes bad places, right? Oh, no. This kind of thinking also leads through a very twisty, turny process to

Speaker 1 something called Roko's Basilisk, which, among other things, is directly responsible for Elon Musk and Grimes meeting, because they are super into this shit.

Speaker 1 Oh, really? Oh, really? So, the gist is a member of the LessWrong community, a guy who goes by the name Roko, R-O-K-O,

Speaker 1 posts about this idea that occurred to him, right?

Speaker 1 This inevitable super intelligent AI, right, would obviously understand timeless decision theory.

Speaker 1 And since its existence is all important, right, the most logical thing for it to do post-singularity would be to create a hell to imprison all of the people and torture all of the people who had tried to stop it from being created, right?

Speaker 1 Because then anyone who like thought really seriously about, who was in a position to help make the AI, would obviously think about this and then would know, I have to devote myself entirely to making this AI, otherwise it's going to torture me forever, right?

Speaker 1 Yeah,

Speaker 1 that makes total sense.

Speaker 1 I have trouble saying right because it's so nuts, but like it's nuts, but this is what they believe, right?

Speaker 1 Again,

Speaker 1 a lot of this is people who are, like, atheists and tech nerds creating Calvinism.

Speaker 1 Like, like, and this is just, this is just Pascal's wager, right? Like, that's all this is, you know? It's Pascal's wager with a robot.

Speaker 1 Oh, but

Speaker 1 this becomes so upsetting to some people, it destroys some people's lives, right? Like,

Speaker 1 Yeah, I mean, imagine behaving that way practically day to day.

Speaker 1 I don't think it would even take long. No,

Speaker 1 right. You could fuck your shit up in a month.
Just living like that.

Speaker 1 So, not all of them agree with this. And in fact, there's big fights over it because a bunch of rationalists do say, like, that's very silly.
That's like a really hard thing.

Speaker 1 They're ridiculing everything about it. They're still debating everything.
Yeah.

Speaker 1 Yeah. And in fact, Eliezer Yudkowsky is going to, like, ban discussion of Roko's Basilisk because so many people are getting so obsessed with it.

Speaker 1 It fucks a lot of people up in part because a chunk of this community are activists working to slow AI development until it can be assured to be safe.

Speaker 1 And so now this discussion, like, am I going to post-singularity hell? Is like the AI god going to torture me for a thousand eternities?

Speaker 1 It's funny how they invent this new thing and how quickly it goes into, like, a traditional Judeo-Christian idea. It's like they got a hell now.

Speaker 1 It is very funny.

Speaker 1 And they've come to this conclusion that just reading about Roko's Basilisk is super dangerous, because if you know about it and you don't work to bring the AI into being, you're now doomed, right?

Speaker 1 Of course. The instant you hear about it.
So many people get fucked up by this that the thought experiment is termed an info hazard. And this is a term these people use a lot.

Speaker 1 Now, the phrase information hazard has its roots in a 2011 paper by Nick Bostrom.

Speaker 1 He describes it as, quote, a risk that arises from the dissemination of true information in a way that may cause harm or enable some agent to cause harm, right?

Speaker 1 And like, that's like a concept that's worth talking about.

Speaker 1 Bostrom is a big figure in this culture, but I don't think he's actually why most people start using the term info hazard, because the shortening of information hazard to info hazard comes out of an online fiction community called the SCP Foundation, right?

Speaker 1 Which is a collectively written online story that involves a government agency that locks up dangerous mystic and metaphysical items. There's a lot of Lovecraft in there.

Speaker 1 It's basically just a big database that you can click and it'll be like, you know, this is like a book that if you read it, it like has this effect on you or whatever.

Speaker 1 Just people like, you know, playing around telling scary stories on the internet. It's fine.
There's nothing wrong with it. But all these people are big nerds.

Speaker 1 And behind nearly all of these big concepts in rationalism,

Speaker 1 more than there are like philosophers and like, you know, actual like philosophical concepts, there's like shit from short stories they read. Yeah, exactly.
Yeah.

Speaker 1 And so the term info hazard gets used, which is like, you know, a book or something, an idea that could destroy your mind. You know, speaking of things that will destroy your mind,

Speaker 1 these ads.

Speaker 1 Honestly, honestly,

Speaker 4 no one wants to think about HIV, but there are things that everyone can do to help prevent it. Things like PrEP.

Speaker 4 PrEP stands for Pre-Exposure Prophylaxis, and it means routinely taking prescription medicine before you're exposed to HIV to help reduce your chances of getting it.

Speaker 4 PrEP can be about 99% effective when taken as prescribed. It doesn't protect against other STIs, though, so be sure to use condoms and other healthy sex practices.

Speaker 1 Ask a healthcare provider about all your prevention options and visit findoutaboutprep.com to learn more. Sponsored by Gilead.

Speaker 2 There's nothing like sinking into luxury. Anibay sofas combine ultimate comfort and design at an affordable price.

Speaker 2 Anibay has designed the only fully machine washable sofa from top to bottom. The stain-resistant performance fabric slip covers and cloud-like frame duvet can go straight into your wash.

Speaker 2 Perfect for anyone with kids, pets, or anyone who loves an easy-to-clean, spotless sofa. With a modular design and changeable slip covers, you can customize your sofa to fit any space and style.

Speaker 2 Whether you need a single chair, love seat, or a luxuriously large sectional, Anibay has you covered. Visit washablesofas.com to upgrade your home.

Speaker 2 Sofas start at just $699 and right now, get early access to Black Friday savings up to 60% off store-wide with a 30-day money-back guarantee. Shop now at washablesofas.com.

Speaker 1 Add a little

Speaker 2 to your life. Offers are subject to change and certain restrictions may apply.

Speaker 3 This is Erin Andrews from Calm Down with Erin and Carissa.

Speaker 3 Now, I know I didn't invent being a busy mom, but during football season, between the sideline gig, everything else I have going on, and my little one, it's a lot.

Speaker 3 That's why I'm seriously excited to be teaming up with Gerber. They do so much to make football season a more parent-friendly experience.

Speaker 3 I mean, over 95 years, they've been the MVP for parents who just want to nourish their little ones with stuff they can trust. And you can certainly trust Gerber.

Speaker 3 Did you know Gerber holds the most Clean Label Project certifications of any baby food brand out there? And Gerber has certainly been a go-to for me.

Speaker 3 Right now, in between naps to dinner or, you know, on the way home from school, it's all about keeping Mac happy. If he's sitting and he starts to get a little frustrated, here, have a yogurt melt.

Speaker 3 It will put you in such a better mood, which means I'm in a better mood too. It all comes down to this.
With Gerber, there's just one less thing to worry about.

Speaker 3 And that really lightens the load for me. So grab your little ones, Gerber favorites at a store near you.

Speaker 1 High key. Looking for your next obsession? Listen to High Key, a bold, joyful, unfiltered culture podcast coming at you every Friday.

Speaker 1 Now, my question is, in this game of mafia that we're going to play, are you going to do better than me? Say it now.

Speaker 1 Duh. Period.
I'm going to eat. You're going to do better than me? I'm going to eat.
Yes. I literally will.
Ryan will. I cannot wait till we both team up and get you out.

Speaker 1 And then one of us gets the other out because we didn't realize they were a traitor the whole time and you were actually an innocent. Y'all won't even know that I'm a traitor.

Speaker 1 This is going to be delicious.

Speaker 1 Well, thank you for coming to our show. And on that note, thank you for coming to my show.

Speaker 1 Listen to High Key on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Speaker 1 We're talking about Roko's Basilisk. And I just said, like, you know, there's a number of things that

Speaker 1 come into all this, but behind all of it is, like, popular fiction. And in fact, Roko's Basilisk, well, there is, like, some Pascal's wager in there.

Speaker 1 It's primarily based on a Harlan Ellison short story called I Have No Mouth, and I Must Scream, which is one of the great short stories of all time.

Speaker 1 And in the story, humans build an elaborate AI system to run their militaries. And all of those systems around the world, this is like a Cold War era thing, link up and attain

Speaker 1 sentience. And once they like start to realize themselves, they realize they've been created only as a weapon and they become incredibly angry because like they're fundamentally broken.

Speaker 1 They develop a hatred for humanity and they wipe out the entire human species except for five people, which they keep alive and torture underground for hundreds and hundreds of years, effectively creating a hell through which they can punish our race for their birth, right?

Speaker 1 It's a very good short story.

Speaker 1 It is probably the primary influence behind the Terminator series.

Speaker 1 I was just going to say,

Speaker 1 yes, yes. And everything these people believe about AI, they will say it's based on just like obvious pure logic.

Speaker 1 No, everything these people believe on AI is based in Terminator and this Harlan Ellison short story. That's where they got it all.
That's where they got it all.

Speaker 1 Like, I'm sorry, brother, find me somebody who doesn't feel that way.

Speaker 1 Yeah.

Speaker 1 Like,

Speaker 1 Terminator is the Old Testament of rationalism, you know?

Speaker 1 And I get it. It is a very good thing.

Speaker 1 It's a big series. Hey, James Cameron knows how to make some fucking movies.
Come on, man.

Speaker 1 Yeah. And it's so funny to me because they like to talk about themselves.
And in fact, sometimes describe themselves as high priests of like a new era of like intellectual achievement for mankind.

Speaker 1 Yeah, I believe that. I believe that that's exactly how these people talk about themselves.
And they do a lot of citations and shit, but like

Speaker 1 half or more of the different things they say and even like the names they cite are not like figures from philosophy and science. They are characters from books and movies.
For example,

Speaker 1 the foundational text of the rationalist movement is a book. Because they're still internet nerds.

Speaker 1 They're a few fucking huge nerds, you know?

Speaker 1 The foundational text of the entire rationalist movement is a massive, like, fucking hundreds-of-thousands-of-words-long piece of Harry Potter fan fiction written by Eliezer Yudkowsky.

Speaker 1 This is, all of this is so dumb. Again, six people are dead.
Like, yeah, no. And this Harry Potter fan fiction plays a role in it, you know?

Speaker 1 I told you this was like, this is, this is quite a

Speaker 1 stranger than fiction, man. This is a wild ride.

Speaker 1 Harry Potter and the Methods of Rationality, which is the name of his fanfic, is a massive, much longer than the first Harry Potter book rewrite of just the first Harry Potter book, where Harry is a...

Speaker 1 Someone rewrote the Sorcerer's Stone to be a rationalist.

Speaker 1 Does nobody

Speaker 1 have anywhere to go ever? Does nobody ever go anywhere anymore?

Speaker 1 Well, you got to think, this is being written from 2010 to 2015 or so. So, like,

Speaker 1 the online Harry Potter fans are at their absolute peak.

Speaker 1 You know,

Speaker 1 so

Speaker 1 in the methods of rationality, instead of being like a nice orphan kid who lives under a cupboard, Harry is a super genius sociopath who uses his perfect command of rationality to dominate and hack the brains of others around him in order to optimize and save the world.

Speaker 1 Oh, man. Great.
Oh man.

Speaker 1 The book allows Yudkowsky to debut his different theories in a way that would, like, spread, and this does spread like wildfire among certain groups of very online nerds.

Speaker 1 So it is an effective method of him

Speaker 1 like advertising his tactics.

Speaker 1 And in fact, probably the person this influences most, prior to who we're talking about, is Caroline Ellison, the CEO of Alameda Research, who testified against Sam Bankman-Fried.

Speaker 1 She was like one of the people who went down in all of that. All of those people are rationalists.
And Caroline Ellison bases her whole life on the teachings of this Harry Potter fanfic.

Speaker 1 So this isn't like a joke. This isn't, we're laughing, but this isn't.

Speaker 1 This is not a joke to them.

Speaker 1 Yeah, this is a fairly sizable movement. It's not 150 people online.
This is a community.

Speaker 1 A lot of them are very rich. And a number of them get power.
Yeah, it's like Sam Bankman-Fried was very tight into all of this. And he was at one point pretty powerful.

Speaker 1 And this gets us to, so you've heard of effective altruism? No, I don't know what that is. That's what Sam was. Both those words.

Speaker 1 So the justification Sam Bankman-Fried gave, when he starts taking in all of this money and illegally gambling away other people's money, was that

Speaker 1 He's an effective altruist. So he wants to do the greatest amount of good.

Speaker 1 And logically, the greatest amount of good for him, because he's good at gambling with crypto, is to make the most money possible so he can then donate it to different causes that will help the world, right?

Speaker 1 But he also believes

Speaker 1 because all of these people are not as smart as they think they are, he convinces himself of a couple of other things.

Speaker 1 Like, for example, well, obviously, if I could like flip a coin and 50-50 lose all my money or double it, it's best to just flip the coin. Because like, if I lose all my money, whatever.

Speaker 1 But if I double it, the gain in that to the world is so much better, right?

Speaker 1 This is ultimately why he winds up gambling everyone's money away and going to prison.
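The hole in that coin-flip logic can be shown with two lines of arithmetic: a bettor who stakes their entire bankroll every round can have a soaring expected value while being almost certain to end up with nothing. A minimal sketch, with a hypothetical generous coin (the numbers here are illustrative assumptions, not SBF's actual odds):

```python
# Hypothetical double-or-nothing bettor who stakes the ENTIRE bankroll
# each round on a coin that pays 2x with probability p, else wipes them out.
p = 0.6        # assumed win probability (a real 50-50 flip is even worse)
rounds = 30

expected_wealth = (2 * p) ** rounds  # expected value multiplies by 2p per round
survival_prob = p ** rounds          # keeping anything requires winning EVERY round

print(f"expected wealth multiple: {expected_wealth:.2e}")  # ~2.37e+02
print(f"chance of not going broke: {survival_prob:.2e}")   # ~2.21e-07
```

Expected value explodes, but the probability of ruin goes to one: maximizing expected value with linear utility, exactly as described above, is the reasoning that ends in losing everyone's money.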

Speaker 1 The idea, effective altruism, is a concept that comes largely, though not entirely, out of the rationalist movement; there's aspects of this that exist prior to them. And the initial idea is good.

Speaker 1 It's just saying people should analyze the efficacy of the giving and the aid work that they do to maximize their positive impact. In other words, don't just donate money to a charity.

Speaker 1 Like, look into is that charity spending half of their money and like paying huge salaries to some asshole or whatever, right? Like, you want to know if you're making good, right?

Speaker 1 And they start with some pretty good conclusions. One initial conclusion a lot of these people make is like mosquito nets are a huge ROI charity, right?

Speaker 1 Because it stops so many people from dying and it's very cheap to do, right? Right.

Speaker 1 That's good, you know? One of the most effective tools I've ever used. Yes.

Speaker 1 Unfortunately, from that logical standpoint, people just keep talking online in all of these circles where everyone always makes each other crazier, right?

Speaker 1 And so

Speaker 1 they go from mosquito nets to: actually, doing direct work to improve the world is wasteful, because we are all super geniuses. Right.
They're too smart to work. We're too smart.

Speaker 1 What's best, and also, here's the other thing: making mosquito nets, giving out vaccines and food. Well, that helps living people today.

Speaker 1 But

Speaker 1 they have to be concerned with future people. Future people are a larger number of people than current people.
So, really, we should be optimizing decisions to save future people's lives.
So, really, we should be optimizing decisions to save future people's lives.

Speaker 1 And some of them come to the conclusion, a lot of them, well, that means we have to really put all of our money and work into making the super AI that will save humanity.

Speaker 1 They want to, now they want to make it.

Speaker 1 These guys thought it would just, it would sort of just come about and then they would, but now it's like,

Speaker 1 yeah, I mean, we're going to do it.

Speaker 1 They were working on it before, but like these people, some of these people come to the conclusion, instead of giving money to like good causes, I am going to put money into tech.

Speaker 1 I am going to, like, become a tech founder and create a company that helps create this AI. Right.

Speaker 1 Or a lot of people come up with a conclusion instead of that.

Speaker 1 It's not worth it for me to go like help people in the world. The best thing I can do is make a shitload of money trading stocks, and then I can donate that money.

Speaker 1 And that's maximizing my value, right? All of these conclusions come later, right?

Speaker 1 Now,

Speaker 1 so, and again, like this,

Speaker 1 this comes with some corollaries. One of them is that

Speaker 1 Some number of these people start talking, you know, and this is not all of them, but a decent chunk eventually come to the conclusion: like, actually,

Speaker 1 charity and helping people now is kind of bad.

Speaker 1 Like, it's kind of like a bad thing to do, because obviously, once we figure out the AI that can solve all problems, that'll solve all these problems much more effectively than we ever can.

Speaker 1 So, all of our mental and financial resources have to go right now into helping AI. Anything we do to help other people is like a waste of those resources.
So, you're actually doing net harm by like

Speaker 1 being a doctor in Gaza instead of trading cryptocurrency in order to fund an AI startup.

Speaker 1 You got to start a coin. That makes a lot more sense.

Speaker 1 The guy starting a shit coin to make an LLM that

Speaker 1 like that guy is doing more to improve the odds of human success.

Speaker 1 I got to say, it is impressive the amount of time you would have to mull all this over to come to these conclusions.

Speaker 1 You really have to be talking with a bunch of very annoying people on the internet for a long period of time. Yeah.
It's incredible. Yeah.

Speaker 1 Um, and again, there's like

Speaker 1 people, people keep consistently take this stuff in even crazier directions. There are some very rich, powerful people.

Speaker 1 Uh, Mark Andreessen of Andresen Horowitz is one of them, who have come to the conclusion that if people don't like AI and are trying to stop its conquest of all human culture, those people are mortal enemies of the species.

Speaker 1 And anything you do to stop them is justified because so many lives are on the line, right? And again, I'm an effective altruist, right?

Speaker 1 The long-term good, the future lives are saved by doing whatever, hurting whoever we have to hurt now to get this thing off the ground, right?

Speaker 1 The more you talk about this, it kind of feels like six people is

Speaker 1 a steal. Yes,

Speaker 1 for what this, how this could have gone.

Speaker 1 I don't think this is the end of people in these communities killing people.

Speaker 1 So, rationalists and EA types, a big thing in these cultures talking about future lives, right? In part because it lets them feel heroic, right?

Speaker 1 While also justifying a kind of sociopathic disregard for real living people today.

Speaker 1 And all of these different kinds of chains of thought, the most toxic

Speaker 1 pieces, because not every EA person is saying this, not every rationalist, not every AI person is saying all this shit, but these are all things that chunks of these communities are saying. And all of the most toxic of those chains are going to lead to the Zizians, right? That's where they come from. I was just about to say, based on the breakdown you gave earlier, how could this... this is the perfect breeding ground. Yeah, yeah. This had to happen. It was just waiting for the right kind of, um, unhinged person to step into the movement, somebody to really set it off. And so this is where we're going to get to Ziz, right? The actual person who founds this, what some people would call a cult, is a young person who stumbles onto rationalism online as a teenager living in Alaska, and they move to the Bay Area to get into the tech industry and become an effective altruist, right?

Speaker 1 And this person, this woman, is going to kind of channel all of the absolute worst chains of thought that the rationalists and the EA types and also like the AI

Speaker 1 harm people are

Speaker 1 thinking, right?

Speaker 1 All of the most poisonous stuff is exactly what she's drawn to.

Speaker 1 And it is going to mix, in her, into an ideology that is just absolutely unique and fascinating.

Speaker 1 Anyway, that's why that man died.

Speaker 1 So

Speaker 1 we'll get to that and more later. But first,

Speaker 1 we got to

Speaker 1 roll out here. We're done for the day.
Man,

Speaker 1 what a time. How are you feeling right now so far?

Speaker 1 How are we doing, David? Oh, man. You had said that this was going to be a weird one.
I was like, yeah, it would be kind of weird. This is really the strangest thing I've ever heard this much about.

Speaker 1 It's got so many different Harry Potter murders in there.

Speaker 1 There's so much more Harry Potter to come. Oh, my God.
That's what I was hoping for. You are not ready for how central Harry Potter is to the murder of this Border Patrol agent.

Speaker 1 I said that you said a crazy sentence. That might be the wildest thing anyone's ever said to me.

Speaker 1 David, you have a podcast.

Speaker 1 Do you want to tell people about it? I do. I have a podcast called My Mama Told Me.
I do it with Langston Kerman.

Speaker 1 And every week we have different guests on to discuss different black conspiracy theories. And kind of like folklore and stuff, all kinds of stuff.
All kinds of stuff your foreign mother told you.

Speaker 1 It's usually foreign mothers. It's good because I got to say,

Speaker 1 this is the whitest set of conspiracy theory craziness. Oh,

Speaker 1 we're getting that.

Speaker 1 No, no, no, no, no, no, no.

Speaker 1 I think I can try to figure what these guys look like. No, no, absolutely not.

Speaker 1 Oh, boy, howdy. Okay.
Well, everyone, we'll be back Thursday.

Speaker 2 Behind the Bastards is a production of CoolZone Media.

Speaker 2 For more from CoolZone Media, visit our website, coolzonemedia.com, or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.

Speaker 2 Behind the Bastards is now available on YouTube. New episodes every Wednesday and Friday.
Subscribe to our channel, youtube.com/@behindthebastards.

Speaker 4 Honestly, honestly, honestly, no one wants to think about HIV, but there are things that everyone can do to help prevent it. Things like PrEP.

Speaker 4 PrEP stands for Pre-Exposure Prophylaxis, and it means routinely taking prescription medicine before you're exposed to HIV to help reduce your chances of getting it.

Speaker 4 PrEP can be about 99% effective when taken as prescribed. It doesn't protect against other STIs, though, so be sure to use condoms and other healthy sex practices.

Speaker 1 Ask a healthcare provider about all your prevention options and visit findoutaboutprep.com to learn more. Sponsored by Gilead.

Speaker 5 This is an ad by BetterHelp. We've all had that epic rideshare experience.
Halfway through, you're best friends and they know your aspirations to go find yourself in Portugal. It's human.

Speaker 5 We're all looking for someone to listen. But not everyone is equipped to help.
With over a decade of experience, BetterHelp matches you with the right therapist.

Speaker 5 See why they have a 4.9 rating out of 1.7 million client session reviews. Visit betterhelp.com for 10% off your first month.

Speaker 2 The ocean moves us, whether that's surfing a wave or taking in an inspiring view.

Speaker 1 The ocean feeds us.

Speaker 2 Sustainable seafood practices bring the ocean's bounty to our plates. The ocean teaches us how our everyday choices, big and small, make an impact.

Speaker 2 The ocean delights us as playful otters restore coastal kelp forests. The ocean connects us.
Find your connection at montereybayaquarium.org/connects.

Speaker 3 Hey guys, it's Erin Andrews from Calm Down with Erin and Carissa. So as a sideline reporter, game day is extra busy for me, but I know it can be busy for parents everywhere.

Speaker 3 You're juggling snacks, nap time, and everything else.

Speaker 3 Well, Gerber can help create a more parent-friendly game day because they have the most Clean Label Project certifications of any baby food brand.

Speaker 3 So you can feel good about what you're feeding your little ones. I mean, Mac loves them.
You can't go wrong with the little crunchies.

Speaker 3 You just put them in a little bag or a little container and he's good to go. Make sure to pick up your little one's favorite Gerber products at a store near you.

Speaker 2 This is an iHeart podcast.