
Part Three: How The Zizians Went Full On Death Cult
Robert tells David about Ziz's glorious plan to take to the sea and sever the right and left brains of her followers in order to make them psychopaths. God, that sentence was weird to write. Trust us, the episode is weirder.
See omnystudio.com/listener for privacy information.
Full Transcript
Cool Zone Media. ...later ones where he's a one-man army. I'm doing that, and I'm doing it against Microsoft, because I fucking hate Copilot. With me to talk about how much we hate Microsoft Copilot: my producer Sophie Lichterman and our wonderful guest, David Gborie.
David, how do you feel about Microsoft Copilot? Rambo 3, let's go. Okay, okay, okay.
Let's kill a ton of brown people. No, I mean, in this one, it's just, like, Microsoft Copilots we're killing.
Okay, all of them. They're not really people.
It's so bad. Outlook is terrible.
Microsoft has really gone all in on making a lot of products that people hate to use. Speaking of products people hate to use, David said before we started recording that he was excited to hear this story.
I just want you to know where we're starting on this story is page 21 of the script. And where we end the script is page 49.
Whoa. I made a mistake in doing this.
I'm going to admit that right now. Before we get further, I'm going to say I erred in this.
And it's, you know, I've made peace with the inevitability of fucking stuff up, especially when every week you're doing a different chunk of history, and we're veering from, like, we're talking about fucking 17th and 18th century France, and then now we're talking about a fucking guy who did a genocide in Darfur or whatever, right? These are all important topics, but you simply can't cover the breadth of stuff that we do every single week and not misspeak, not make errors and stuff.
And when it comes to, like, I'm talking about Hitler, obviously those are important, but, you know, if I fuck up some fact about early 1900s Germany, I'm not going to be too bent out of shape, because there's no perfection in this. But in this case, it's this tiny little community, and nearly all of the reporting on it has been deeply incomplete.
And I feel like the stress over, like, what do I include in here... And the other problem is that none of these people have editors. And so all of them, everybody in this story has a blog, and every blog post is like 40,000 words.
So it's just like... Yeah, I was gonna say, what media are you able to get? You're getting this all straight from the source, right? A lot of it? A lot of it. I mean, I've read most of Ziz's blog entries, and I've at least done, like, little surveys of the blogs of everybody else involved in this. There were also a couple of very helpful compilations that people... there's, like, one by a former... sometimes it's former members of the community.
Sometimes it's folks who are like rationalists that were trying to warn other rationalists about Zizians. But like people in and around the community have put together compilations where they'll like clip mixes of news stories and like conversations online.
And obviously these folks like- A blog mixtape is nasty work. Yes, yes.
And I'm deeply grateful. We'll have source links and everything in here.
I note when I'm kind of like pulling something from something directly.
But like I'm very grateful to the maniacs who put together these like documents that have helped me piece together what's happening.
Because really, if you're coming in as an outsider, if you weren't like embedded in this community while all this crazy shit was going on, it's a little, it's kind of impossible to like get everything you need to get. You have to refer to these interior sources.
Um, it's just the only way to actually understand stuff. No.
Yeah. I, as an outsider, I don't know what's going on.
I don't know where it's going.
I for sure don't know where it's going.
It's going.
We know where it ends.
We know where it ends. A member of Congress shows up at the library in Vermont that the U.S. and Canada share, because a Border Patrol agent was murdered there, and, like, threatens to take over Canada. And there's a degree to which you can kind of tie heightened tensions between the U.S. and Canada to the murder of this Border Patrol agent, which itself is directly tied to the fact that Eliezer Yudkowsky wrote a piece of Harry Potter fan fiction. I love that it all goes back to that.
Yes, yes. It all comes back to bad Harry Potter fan fiction.
Hey, all you Women's Hoops fans and folks who just don't know yet that they're women's hoops fans. We've got a big week over at Good Game with Sarah Spain as we near the end of one of the most exciting women's college basketball seasons ever.
The most parity we've seen in years, with games coming down to the wire and everyone wondering which team will be crowned national champions this weekend in Tampa. Listen to Good Game with Sarah Spain on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
My husband cheated on me with two women. He wants to stay together because he has cancer.
Should I stay? Okay, Sam, that has to be the craziest story in OK Storytime podcast history. Well, John, that's because it's dump of the week, and this user writes: last week we had an attempted break-in.
I asked my husband, who was supposed to be at his mom's, to come over and change the locks, but his mom told me he wasn't with her. And it took me less than an hour to find the first two women he was cheating on me with.
Did you leave him? Well, to find out how this story ends, follow the OK Storytime podcast on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Imagine you're scrolling through TikTok.
You come across a video of a teenage girl and then a photo of the person suspected of killing her. It was shocking.
It was very shocking. Like, that could have been my daughter.
Like, you never know. I'm Jen Swan.
I'm the host of a new podcast called My Friend Daisy. It's the story of how and why a group of teenagers turned to social media to help track down their friend's killer.
Listen to My Friend Daisy on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. In 2020, a group of young women found themselves in an AI-fueled nightmare.
Someone was posting photos. It was just me naked.
Well, not me, but me with someone else's body parts. This is Levittown, a new podcast from iHeart Podcasts, Bloomberg, and Kaleidoscope about the rise of deepfake pornography and the battle to stop it.
Listen to Levittown on Bloomberg's Big Take podcast. Find it on the iHeart Radio app, Apple Podcasts, or wherever you get your podcasts.
So, part three. We spent last episode talking about Ziz moving to the Bay and their first interactions with the rationalist community. That big CFAR conference they went to had a lot of exercises reminiscent of, like, Synanon shit, right? Right, right.
Very, very... a lot of talk of murder. These people love theorizing about when it's okay to kill people, a constant factor in all of this, which can't be a step in a good direction. Yeah. You know, you should be aware: if your community is talking about, like, the ethics of escalating to murder in random arguments too much, maybe be a little worried. If someone sits down next to you and says, how would you murder me, or whatever they said, right? You always got to get out of that room. Yeah, you want to leave immediately. And furthermore, if they're like, yeah, that's the right way, even worse sign. And then if they're like, yeah, would you perform necrophilia in order to, in the past, scare people away from attacking you? Like, get out of that room.
Leave. This is not a crew you want to be a part of.
Yeah, maybe just take up pickleball or something. Pickleball.
People never talk about necrophilia playing pickleball. I don't think one time.
I don't think one time. No, they all talk about how they're getting knee replacements.
And that's the beauty of pickleball. Exactly.
So in spite of how obviously bad this community is, Ziz desperately wants to be in the center of the rationalist subculture. And that means being in the Bay.
Unfortunately, the Bay is a nearly impossible place to survive in if you don't have shitloads of money. And one of the only ways to make it in the Bay, if you're not rich, is to wind up in deeply abusive and illegal rental situations.
You know this, David. Come on.
I'm not spreading any news to you. No, shout out to my landlord, Mr.
Lou. So Ziz winds up in a horrible sublet with a person she describes as an abusive alcoholic.
I wasn't there. I don't know if she was the problem.
Obviously, I've got one side of this story, but her claim is that it ends in physical violence. Ziz claims he was to blame, but she also describes a situation where, after a big argument, they bump into each other and he calls the cops on her for assault.
I wouldn't put it past Ziz to be leaving some parts out of this. But also I know a bunch of people who wound up in horrible sublets with abusive alcoholics who assaulted them in the Bay Area and in L.A.
Yeah, no. Craigslist is a crapshoot, you know? Craigslist is a crapshoot, yeah.
Every time. I always feel the need to, like, qualify with, like, this is just Ziz's account, but also this sounds like a lot of stories I know people have had.
Yeah, it's tough to get by there. Yeah.
So she calls... or he calls the cops on her, and then, yeah, they do nothing. And he attacks her in her bedroom that night, he's, like, throwing a chair at her and shit. So she decides, I got to get out of this terrible fucking sublet.
And unfortunately, her next best option, a very common thing in the rationalist community, is to have whole houses rented out that you fill with rationalists who don't have a lot of money. It never ends badly.
Yeah, kind of like artist houses or, like, content-producer houses. It never explodes.
People never have horrible times in these. This particular rationalist house is called Liminal, because, you know, Gen Z loves talking about their liminal spaces on the Internet. One resident of the house reacts very negatively when Ziz identifies herself as a non-transitioning trans woman and basically asks, like, when are you going to leave? So, you know, she says that as soon as she arrives, one of the other residents is a transphobe.
She can't stay there very long. Again, all sounds like a very familiar Bay Area housing situation story.
She bounces around some short-term solutions, Airbnbs, moving constantly while trying to find work. She gets an interview with Google, but the hiring process there is slow.
There's a lot of different stages to it, and it doesn't offer immediate relief from her financial issues. Other potential offers fall through as she conflicts with the fundamental snake oiliness of this era of Silicon Valley development.
Ziz blames it on the fact that she couldn't feign enthusiasm for companies she didn't believe in. Quote, I was inexperienced with convincing body language inclusive lies like this.
I did not have the right false face, but very quick to think up words to say. So like, I'm not good enough at lying that I'm excited about working for an app to, you know, help you do your laundry better, which is like a third of the bay.
Yeah. Yeah.
And once again, she has these flashes of, like, oh, wow, you really have strong morals and all that. You know what I mean? Yeah.
She has a strong resume, right? She does. She won, like, an award as a NASA intern, right? Yeah.
Yeah. She really is good at a lot of this stuff.
And all of these Zizians, as silly as their beliefs about philosophy and, like, cognitive science are, nearly all of them are extremely accomplished in their fields. It's good evidence of the fact that it's always a mistake to think of intelligence as an absolute characteristic.
Like I am a genius software engineer. Therefore I am smart.
It's like, no, no, no. You're dumb at plenty of things, Mr.
Software Engineer. Yeah.
Don't sell yourself short. Yeah.
So she does start to transition during this period of time. She goes on finasteride, which helps to avoid male pattern baldness.
And she starts experimenting with estrogen and antiandrogens. She wanted to avoid this for, I'm sure she had a variety of reasons.
But as soon as she starts taking hormones, they have such a positive effect. She describes it as a hard to describe felt sense of cognitive benefits, and she decides to stay on them.
By October, she'd committed to start writing a blog about her own feelings and theories on rationalism, and her model here was Yudkowsky. She names this blog Sinceriously, and it was her attempt to convince other rationalists to adopt her beliefs about, like, veganism and such.
Her first articles are, like, pretty bland. It's these scattered concepts and thought experiments, very basic stuff.
Like, can God create a rock so big God couldn't move it? And then throwing a rationalist spin on that. So it's, you know, a lot of this is like, oh, maybe in an era in which college didn't cost 200 grand, you could have just gotten a philosophy degree and that would have made you happy.
You just wanted to spend a couple of years talking through silly ideas based on dead Greek guys. Well, you know, the Bay is the place to do that.
Yeah, well, unfortunately so. She starts to really show an interest early on, though, and this is where things get unsettling, in enforcement mechanisms, which are methods by which individuals can blackmail themselves into accomplishing difficult tasks for personal betterment.
She writes about an app called Beeminder, which lets you set goals and punish yourself with a financial penalty if you don't make regular progress. And she's really obsessed with just the concept of using enforcement mechanisms to make people better, writing, often you have to break things to make them better.
So, not a great path to go down here. Is she following this herself? Like, yes, she's trying to use some of these tactics on herself, to deal with what she sees as her flaws that are stopping her from, you know, saving the cosmos. Okay, great stuff. A lot of good pressure to put on yourself. Yeah. Poor woman has been under the highest stakes this whole time.
Well, and that's, again, that comes... That's not Ziz.
That's the entire rationalist subculture. The stakes are immediately we have to save the world from the evil AI that will create hell to punish everybody who doesn't build it.
And that actually we'll talk about this later. That breaks a ton of people in this.
She is not the only one kind of fracturing her psyche in this community. So right around this time, as she's bouncing around short-term rentals and desperately trying to get work, she meets a person named Jasper Gwynne, who at that point identified as a trans woman, and who now goes by Gwen Danielson and uses they/them pronouns.
That's how I'm going to refer to them. But for clarity's sake, I'm going to call them Gwen or Danielson, even though they went by a different name at this time, because that's what they're called now.
Gwen was a fan of Ziz's blog and had some complex rationalist theories of their own. They came to believe that each person had multiple personalities stored inside their brain, a sort of mutation of the left-brain/right-brain hypothesis.
And each of these sides of your brain was like a whole like intact person. Right.
Like, great. Yeah.
Cool. No, you guys are going to be fucking with your heads real hard.
Great. Oh, man, these poor people.
Yeah. So this falls in love with Gwen's ideas and she starts bringing them up in rationalist events, trying to brute force them into going mainstream among the community.
But people are like, this is a little weird, even for us. And she does not succeed in this.
And as a result, she and Danielson and a couple of other friends start talking and theorizing together separately from the bulk of the community. So now, again, they're starting to calve off from the broader subculture, and they're starting to really dig ruts for themselves in a specific direction that's leading away from the rest of the rationalists.
Literally all that cult stuff, huh?
All that cult stuff.
All that cult stuff.
Now, Gwen and Ziz largely bonded over their struggle paying Bay Area rents.
And together they stumbled upon a solution beloved by generations of punks and artists in Northern California: taking to the sea. Specifically...
It's great. It's great. I mean, I've known, like, three separate people who lived on boats in the Oakland Harbor because it was like, this is the only way I can afford to live in the Bay.
My little brother went to school right outside of San Francisco and his principal lived on a boat. Right.
Just like a mile away from the school and everybody loved it. Yeah, yeah, everybody loved it.
I mean, I got to say everyone I know who lived on a boat lived on a shitty boat, but I'm also not convinced there are boats that, any boats that stay nice for very long. Yeah.
It feels like it would be dank, I guess, is the word. Dank is a good description of boat life, I think.
Yeah. So Gwen's boat was anchored off the Encinal Basin, and Ziz found this a pretty sweet solution.
She goes over to stay over one night, and while they're hanging out, staying up, probably taking drugs. They don't usually write about it, but from other community conversations, I think we have to assume that an awful lot of the time when these people are staying up all night and talking, there's a lot of ketamine and stuff being used too that isn't written into the narrative.
That also goes along with the Bay Area.
That also goes along with the Bay Area. Pills and powders are bigger.
Yeah. Quote, they talked about how when they were a child, their friend who was a cat had died and they had, to use their own retroactive paraphrasing, sworn an oath of vengeance against death.
Fuck!
These are just people doing great, very healthy. It's like the opposite of what you want a kid to learn when their pet dies, which is like, yeah, you know, death is inevitable, it happens to everything, it'll happen to you one day, and it's sad, but it's just something we have to accept. No, no, no. War against death. They were like, nope, nope, nope, I can fix this. Okay, I, as a parent, have failed in this situation.
This was an unsuccessful step in my child's development. Maybe no more pets for a while.
Maybe no more pets. Gwen also spent way too much time online, which is how they wound up reading hundreds of theoretical articles about how AGI, artificial general intelligence, would destroy the world.
And again, AGI is, like, a mainstream term now because fucking ChatGPT came out a couple of years ago and everyone started talking about it. At this point, 2016, 2017, it's only people who are really into the industry in a nerdy way who are using that frame.
Like regular people on the street don't know what you fucking mean when you're talking about this stuff. But this is a term that is in use among them.
And like Ziz, Gwen moved to the Bay Area to get involved in fixing the problem. They were an otherkin.
Are you familiar with this online community? Which one? Otherkin. Otherkin? No, I have no, I've never heard of that.
It's like the Mormonism of furrydom, almost. Like...
That's insane, what you said. I don't want to be... like, it's harmless, right? These are people who, there's a mix of beliefs. Some of them literally believe they're, like, fantasy creatures. Some of them just, like, half identify as a non-human creature, right? Oh, like their furry persona is their true...
Yeah, yeah, kinda. That's close enough for government work. And in Gwen's case, it's even different, where I don't think they believe they are literally a dragon, but they believe that when there's a singularity and the robot god creates heaven, they'll be given the body of a dragon, because the robot god will be able to do that, if it's a good singularity, at least. That's why this is all so important to them, making sure it's, like, a nice AI, so they'll be able to get their animal friends back and get their dragon body.
Tale as old as time, you know. Tale as old as time. Again, a lot of this could be avoided by just processing death and stuff like that a little better, but we don't do that very well in our society anyway. We've got a lot of people who are committed to denying that, so I'm not surprised this happens at the corners, right? Like, this is just a little downstream from that Bryan Johnson guy tracking his erections at night and trying to get the penis of a 19-year-old. Yes.
There's not, like, a massive sanity gap between these two things. No, no, it's... I think...
I think we're drinking from the same well. Yeah, yeah.
So, as a result, Ziz commits herself to turning Gwen to the dark side, which is a term she started to use.
Obviously, it's a Star Wars term. And it comes out as a result of her obsession with what's called akrasia.
Akrasia is an actual Greek term for a lack of willpower that leads someone to act in ways that take them further from their goals in life. I think akrasia was often, like, an early term for what we call ADHD, right? Like, people who have difficulty focusing on tasks that they need to complete.
One of the promises of rationalism was to arm a person with tools to escape this state of being and act more powerfully and effectively in the world. Ziz adds to this some ideas cribbed from Star Wars.
She decides that the quote-unquote way of the Jedi, which is, like, accepting moral restrictions, you know, about not murdering people and the like, is a prison for someone who's truly great and has the opportunity to accomplish important goals, right? If you're that kind of person, you can't afford to be limited by moral beliefs. So in order to achieve the kind of vegan singularity that she thinks is critical to save the cosmos, she and her fellow rationalists need to free themselves from the restrictions of the Jedi and become vegan Sith.
That's more or less where things are going here. So I should note that while Gwen and Ziz are spinning out on their own, everything that you're seeing from them, these feelings of grandiosity and cosmic significance, but also paranoid obsession, are the norm in rationalist and effective altruist circles.
There's a great article in Bloomberg News by Ellen Huet. It discusses how many in the EA set would suffer paralyzing panic attacks over things like spending money on a nice dinner or buying ice cream, obsessing over how many people they'd killed by not better optimizing their expenses.
And quote, in extreme pockets of the rationality community, AI researchers believed their apocalypse-related stress was contributing to psychotic breaks. MIRI employee (and MIRI is one of these organizations created by the people around Yudkowsky) Jessica Taylor had a job that sometimes involved imagining extreme AI torture scenarios, as she described it in a post on Less Wrong, the worst possible suffering an AI might be able to inflict on people. At work, she says, she and a small team of researchers believed we might make God, but we might mess up and destroy everything. In 2017, she was hospitalized for three weeks with delusions that she was intrinsically evil and had destroyed significant parts of the world with my demonic powers, she wrote in her post.
Although she acknowledged taking psychedelics for therapeutic reasons, she also attributed the delusions to her job's blurring of nightmare scenarios in real life. In an ordinary patient, having fantasies about being the devil is considered megalomania, she wrote.
Here the idea naturally followed from my day-to-day social environment and was central to my psychotic breakdown. Oh, man.
Just taking ketamine and convincing yourself you're the devil. Normal rationalist stuff.
Yeah, and I mean, hey, we've all been there, right? We've all been there. No, in fact, I don't think we have.
No, this is the least relatable group of people I've ever heard of. No, no, exactly.
Because it's this like grandiosity. It's this absolute need to whatever else is going on.
Even if you're, like, the bad guy, feel like what you're doing is of central cosmic significance. It's this fundamental fear that is integral to all of these tech guys.
It's at the core of Elon Musk too, that like one of these days you're not going to exist. And very few of the things that you valued in your life are going to exist.
And there's still going to be a world because that's life. That's just, yeah, that it's so crazy how it boils down to just like, yeah, man.
Well, I don't know what you thought was going to happen.
Yeah, bro.
Sorry.
Yeah, that's just how it goes.
You know, we've got, like, 10,000 years of philosophy and thinking and writing on the subject of dealing with this, but you didn't take any humanities in your STEM classes.
So you don't know any of that. You're just trying to bootstrap it.
You just watched Star Wars again and decided you got it figured out. Yeah, you watched Star Wars 137 times and figured that was going to replace reading a little bit of fucking Plato or something.
I don't know, man. Maybe it didn't work.
Also, again, the ketamine's not helping. No, no, no, no.
God, to be a fly on that wall. Oh, God.
Yeah, the rationalist therapists are raking it in. Oh, man.
Honestly, well-deserved. Yeah.
Talk about info hazards. Jesus.
Oh, man. So I have to emphasize here, again, that I want to keep going back to the broader rationalist community because I felt like a risk of this is that I would just be talking about how crazy this one lady and her friends were.
And it's like, no, no, no. Everything they're doing, even the stuff that has split off and is different and more extreme than mainstream rationalism, is directly related to shit going on in the mainstream rationalist community, which is deeply tied into big tech, which is deeply tied into, like, the Peter Thiel circle, a lot of these folks are close to, in, and around the government right now, right? So, like, Ziz is not nearly as much of an outlier as a lot of rationalists want people to think. Right, yeah. Anyway, at rationalist meetups, Ziz began pushing this whole vegan Sith thing hard, and again meets with little success, but she and Gwen gradually start to expand the circle of people around them.
Meanwhile, in her professional life, that Google interview process moves forward.
Ziz says that she passed every stage of the process, but that it kept getting dragged out, forcing her to ask her parents for more help. In November, around the time her blog started to get a following, she says Google said she'd passed the committee and would be hired once she got picked for a team.
Now, I don't know what happens after this. She says Google asked for proof of address, which she doesn't have.
She's just turned 26, and she's not on her parents' health insurance either. She spends pages describing what is a very familiar nightmare scenario to me, of trying to get proof of address so you can get a job and, like, you know, get on Medi-Cal and stuff.
And I do think it's probably worth acknowledging that like as her brain is starting to break and she's having, she's getting further and further into all these delusional ideas. She's also struggling with being off of her parents' health insurance and like trying to find stable housing in the Bay.
And like that, that influences the situation. And still in the process of transitioning, right? Yes.
Yes, exactly. And still in the process of transitioning.
Yes. It's a heavy workload.
You're doing too much to your brain right now. Yes.
So, and then she makes the worst possible decision, which is to live with her friend Gwen in their tiny sailboat, which is now anchored by the Berkeley Marina. Again, this is not like a houseboat.
This is like a sailboat with one small room. Right.
It's got a core. Yeah.
What is there? There's, like, a bed, a table, and a sink. Right.
Like a little bathroom, probably maybe a kitchenette, but it's not livable for two people. Anybody who's ever lived in too small of a space with a roommate knows that, no matter where you're at, it's horrible. Bad idea. And now imagine if that shitty tiny apartment that you remember from your past was a boat.
Just, just disastrous.
And this is not a good situation. Ziz would later write: I couldn't use my computer as well. I couldn't set up my three monitors. There was no room. Couldn't have a programming flow state for nine hours. I had trouble sleeping. The slightest noise in my mind kept alerting me to the possibility that someone, like my roommate from several months ago, was going to attack me in my sleep.
So this is not a healthy situation. And both Gwen and Ziz have endured some specific traumas, and both are also prone to flights of grandiosity and delusion. And now they are trapped all day, every day, together in a single room where their various neuroses are clashing with each other, and their only relief is talking for hours about how to save the world. Oh, my God.
This is a real villain story. It couldn't get any worse than that.
It couldn't. And it's like, at this point, I don't think either of them is intentionally doing anything bad.
You've just, you've kind of created a cult where, like, you're trading off on being the cult leader and cult member for each other. Like, you've isolated each other away from the world, and you're spending time brainwashing each other together in your little boats.
Yeah, how often do you think they were leaving that boat?
Not nearly often enough.
And Gwen is on what Ziz describes as a cocktail of stimulants, and has, quote, mapped out the cognitive effects of each hour they were on them. They get very angry if Ziz interrupts their thoughts at the wrong time.
And also, like, Ziz isn't really sleeping. So they're just talking for hours and getting on each other's nerves at the same time, but also, like, building these increasingly elaborate fantasies about how they're going to save the cosmos.
And it's, you know, it's not great. Through these conversations, they do develop Gwen's multiple personalities theory, mixing in some of Ziz's own beliefs about good and evil.
And I want to quote another passage from that Wired article that summarizes what they come to believe about this. A person's core consisted of two hemispheres, each one intrinsically good or non-good.
In extremely rare cases, they could be double good, a condition that, as it so happened, LaSota identified in herself. And Ziz is consistently going to identify herself as intrinsically good.
So both sides of her personality are only good. But most people are at best single good, which means part of them is non-good or basically evil.
And they're at war with this other half of their brain that's a whole person that's evil, which is why other people can't be trusted to make decisions. You know, increasingly, this attitude is going to be, like, only intrinsically good people can be trusted to make good decisions.
Only the double goods.
Only the double goods. It's such a, you know, you're making your own, like, Orwell speech, Ziz.
This is a bad sign. So Ziz's Google ambitions fall apart at this time.
They don't really give us a good explanation as to why. I kind of think she started bombarding her contact at Google with, like, requests about why the process wasn't going faster. And maybe Google was like, ah, maybe we don't need this person. Ziz concludes failing at Google was good because she'd gotten ten thousand dollars from unemployment at this point.
Quote, this means I had some time. If they hired me soon, it would deprive me of at least several months of freedom.
During which, of course, she is continuing to work out her theories with Gwen on the sailboat. Also, I don't know if that's freedom.
It's really not freedom. I mean, maybe work.
I hear the Google campus has a lot of things to do. It's kind of the what if.
I think maybe at this point she still could have pulled out of this tailspin if she'd gotten a job and worked around other people and socialized, not on the sailboat. But also, a real consistent thing with Ziz is that at this point she has no willingness to do the kind of compromise, and I'm not just talking about the moral compromise, but like, if you're going to work a job for a company, you're going to spend a large part of your day doing a thing that you wouldn't be doing otherwise, right? Because that's what a job generally is. It's just work.
That's just work. And Ziz feels like she can't handle the idea of doing anything but reading fan fiction and theorizing about how to give herself superpowers, right? That's the most important thing in the world because the stakes are so high.
So she like ethically can't square herself with doing anything she needs to succeed in this industry where she has the skill to succeed. And this is another trait she's got in common with the rest of the rationalist EA subculture.
That Bloomberg article interviewed a guy named Qiaochu Yuan, a former rationalist and PhD candidate who dropped out of his PhD program in order to work in AI risk. He stopped saving for retirement and cut off his friends so he could donate all of his money to, you know, EA causes, and because his friends were distracting him from saving the world. And this is all the old cult stuff, right? Cults want you to cut off from your friends.
They want you to give them all your money. But he's doing it, like, independently.
Like he's not there's not like a single leader. He's not like living on a compound with them.
It's just once you kind of take these beliefs seriously, the things that you, that you will do to yourself are the things people in cults have done to them. Right.
In an interview with Business Insider, Yuan said, you can really manipulate people into doing all kinds of crazy stuff if you can convince them this is how you can prevent the end of the world.
Once you get into that frame, it really distorts your ability to care about anything else. Man.
Yeah. That's kind of the thing.
It's harder to talk about this than, like... people could talk about Ziz as, like, oh, she's a cult leader and she had her, you know, vegan trans AI death cult or something. And, you know, I feel like that's not close enough to the truth to get how this happened.
Right. Because what happens with Ziz is very cultish.
But Ziz is one of a number of different people who have calved off of the rationalism community and had disastrous impacts. But it happens constantly with these people because like- It's got such an engine for it, right? Yes, it's an engine for making cults.
This is a cult factory for sure. Yeah, we created a cult factory.
Oh no, cults. They give you the base ideas and then you can just kind of franchise it how you'd like.
Yeah.
And a lot of prominent rationalists who knew Ziz at the time have since gone out of their way to describe her as, like, you know, someone on the fringes.
Anna Salamon of CFAR described her as a young person who was hanging around and who I suspect wanted to be important.
And Anna claims... Is there anyone here who doesn't want that within this group?
They're all... No, that's all of them. Right.
That's the whole community. And, like, Anna was emailing directly, gave Ziz some of the advice that Ziz considered key to her moving to the Bay Area and stuff.
Right. Like, these people, the rationalists, really, really want you to think that this was just, like, some fringe person.
But she's very much tied in to all of this stuff, right? So for her part, Ziz doesn't deny that failing to convince other rationalists was part of why she pulled away from mainstream rationalism. But she's also going to claim that a big reason for her break is sexual abuse among people leading in the rationalist community.
And there's a specific case that she'll cite later that doesn't happen until 2018. But this is a problem people were discussing in 2017 when she's living on that boat.
The representative story is the case of Sonia Joseph, who was the basis of that Bloomberg news piece I've quoted from a couple of times. And it's a bummer of a story.
Sonia was 14 when she first read Yudkowsky's Harry Potter and the Methods of Rationality, which set her on the path that led her to moving to the Bay Area in order to get involved in the rationalist EA set. And she's focused on the field of AI risk.
And I'm going to read a quote. This week has been so long that I completely erased the Harry Potter part of the story from my brain.
It's... it never drops too far below the surface. I cannot overemphasize how important this Harry Potter fan fiction is to all these murders.
I mean, this is like primary text, right? And this woman getting abused. Yes, yes, it's a primary text of the movement.
Wow. I'm gonna read a quote from that Bloomberg article.
Sonia was encouraged when she was 22 to have dinner with a 40-ish startup founder in the rationalist sphere because he had a close connection to Peter Thiel. At dinner, the man bragged that Yudkowsky had modeled a core Harry Potter professor in that fanfic on him.
Joseph says that he also argued that it was normal for a 12-year-old girl to have sexual relationships with adult men and that such relationships were a noble way of transferring knowledge to a younger generation. Then, she says, he followed her home and insisted on staying over.
Oh, Jesus.
So great. You know, bragging about your Harry Potter... how you helped inspire the Harry Potter fanfic, and then explaining how 12-year-old girls should have sex with adult men. Good stuff.
I gotta say. Rational.
I gotta say, that's a crazy brag to get chicks.
Yeah, to get chicks. You know, one of those characters. I'm the Snape.
Yeah, I'm the Snape of this. By the way, what do you think about 12-year-olds? Also, I have a close connection to Peter Thiel.
Yeah. Cool. Oh, man.
As that Bloomberg article makes clear, this is not an isolated issue within rationalism. Quote, sexual harassment and abuse are distressingly common, according to interviews with eight women at all levels of the community.
Many young, ambitious women described a similar trajectory. They were initially drawn in by the ideas, then became immersed in the social scene.
Often that meant attending parties at EA or rationalist group houses or getting added to jargon-filled Facebook Messenger chat groups with hundreds of like-minded people. The eight women say casual misogyny threaded through the scene.
On the low end, Brick, the rationalist-adjacent writer, says a prominent rationalist once told her condescendingly that she was a five-year-old in a hot 20-year-old's body. Relationships with much older men were common, as was polyamory.
Neither was inherently harmful, but several women say those norms became tools to help influential older men get more partners. And this is also, this isn't just rationalism, that is the California ideology.
That is the Bay Area tech set, right? Yeah, that feels very techie. Yes.
Oh, man. And it's all super fucking gross.
The whole you're a five-year-old in a hot 20-year-old's body thing. What the fuck, man? No, to say that.
How do you say that and not hurl yourself off the San Francisco Bay Bridge? That shit's vile. That's fucked up, dude.
That's truly vile. That's bad.
Speaking of bad to the bone: our sponsors. This show is sponsored by BetterHelp.
Let's talk numbers. Traditional in-person therapy can cost anywhere from $100 to $250 a session, which adds up fast.
But with BetterHelp online therapy, you can save on average up to 50% per session. With BetterHelp, you pay a flat fee for weekly sessions, saving you big on cost and on time. Therapy should feel accessible, not like a luxury. With online therapy, you get quality care at a price that makes sense and can help you with anything from anxiety to everyday stress. Your mental health is worth it, and the price is within reach.
With over 30,000 therapists, BetterHelp is the world's largest online therapy platform, having served over 5 million people globally. It's convenient, too.
You can join a session with the click of a button, helping you fit therapy into your busy life. Plus, switch therapists at any time.
Your well-being is worth it. Visit betterhelp.com slash behind to get 10% off your first month.
That's betterhelp.com slash behind. It's tax season, and by now, I know we're all a bit tired of numbers, but here's an important one you need to hear.
$16.5 billion. That's how much money in refunds the IRS flagged for possible identity fraud last year.
Here's another. 20%.
That's the overall increase in identity theft related to tax fraud in 2024 alone. But it's not all grim news.
Here's a good number. 100 million.
That's how many data points LifeLock monitors every second. If your identity is stolen, LifeLock's U.S.-based restoration specialists will fix it, backed by another good number, the million-dollar protection plan.
In fact, restoration is guaranteed or your money back. Don't face identity theft and financial losses alone.
There's strength in numbers with LifeLock Identity Theft Protection for tax season and beyond. Join now and save up to 40% your first year.
Call 1-800-LIFELOCK and use promo code IHEART or go to lifelock.com slash IHEART for 40% off. Terms apply.
A lot of people tolerate ordinary. Ordinary bathrooms, kitchens, entryways. Well, not on your watch. If you're a pro, you've got a new partner in town: Floor & Decor. From tile to wood to stone, Floor & Decor has more styles and job lot quantities of Schluter, Mapei, Laticrete, and other brands pros trust. Come see a whole new way to wow with Floor & Decor. Now open in Gilroy.
Have you ever wondered if your pet is lying to you? Why is my cat not here? And I go in and she's eating my lunch. Or if hypnotism is real? You will use the suggestion in order to enhance your cognitive control.
Or what's inside a black hole? Black holes could be a consequence of the way that we understand the universe. Well, we have answers for you in the new iHeart original podcast, Science Stuff.
Join me, Jorge Cham, as we tackle questions you've always wanted to know the answer to about animals, space, our brains, and our bodies. Questions like, can you survive being cryogenically frozen? This is experimental.
This may never work for you. What's a quantum computer? It's not just a faster computer. It performs in a fundamentally different way. Do you really have to wait 30 minutes after eating before you can go swimming? It's not really a safety issue. It's more of a comfort issue. We'll talk to experts, break it down, and give you easy-to-understand explanations to fascinating scientific questions.
So give yourself permission to be a science geek and listen to Science Stuff on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts. Ah, we're back.
So this is important to understand in a series about this very strange person and the strange beliefs that she developed that influenced several murders. Ziz had many of the traits of a cult leader, but again, she's also a victim first of the cult dynamics inherent to rationalism.
And what she's doing next is she breaks away with a small loyal group of friends, and she does create a physical situation that much more resembles the kind of cults we're used to dealing with, particularly Scientology. Because next she's going to take this, oh wow, me and Gwen living alone on this boat, we kind of hate each other and neither of us is sleeping and our emotional health is terrible, but we've made so much progress on our ideas.
Maybe we should make this a bigger thing. Right.
Maybe we should get a bunch of rationalists all living together on boats.
Oh, she needs a work-life balance.
Yeah, no, no. What she thinks she needs is, she calls it, the rationalist fleet, which is, she wants to get a bunch of community members to buy several boats and live anchored in the bay to avoid high Bay Area rent, so they can spend all their time talking and plotting out ideas for saving the cosmos.
Oh, man.
Great.
And I get it, right? No, of course. It's expensive here.
I want to get some boats with my friends. It does sound cool.
We won't go insane together, obviously. She buys a 24-foot boat for $600 off of Craigslist.
And I don't know much about boats, but I know you're not getting a good one for $600. No, no.
Not a good living boat. Like a full boat? Like a 24-foot boat.
Yes, a full boat. Oh, man, that had to be a piece of shit.
That had to be a shitty, shitty boat. Just a colossal piece of shit.
Yeah. She names it the Black Cygnet, and she starts trying to convince some of these people who have gathered around her to get in on the project.
Eventually she, Danielson, and a third person put together the money to buy a boat that's going to be, like, the center of their fleet: a 70-year-old Navy tugboat named the Caleb, which was anchored in Alaska.
This is like a 94-foot boat. It's a sizable boat.
And it is also very old and in terrible shape. And that's the center.
That's the crown jewel of the fleet. Right, right.
That's our flagship. Oh, man.
So she and Danielson, they buy this thing with this third guy, Dan Powell, who's at least a Navy veteran. So, like, you know, that's a good call.
He's boat-adjacent. But, I get the feeling, and nobody says this outright: Dan Powell says that he put tens of thousands of dollars into buying the Caleb.
Nobody says this. Dan Powell says that he put tens of thousands of dollars into buying the Caleb.
And I just know from what Danielson and Ziz wrote about their finances, neither of them had nearly that much money. So I think by far he invests the most in this project.
And I don't want to insult the guy, but he says he did it because he, quote, considered buying the boat to be a good investment. Which boats aren't. Boats are never an investment.
Like, comically so. Like, known to be a terrible investment. Nothing depreciates like... fucking raw salmon depreciates slower than a boat.
I think his attitude is I'm going to become like the slumlord of a bunch of, or at least landlord to a bunch of boat rationalists. I think slumlord was correct.
I don't know how you expect this to pay off.
Buying a 70-year-old tugboat for a bunch of, like, poor, rationalist punk kids to live in. How is that ever supposed to work? What's the P&L statement you put together here? Oh, man.
Oh, man. What was the timeline on him getting his money back, he thought? Oh, God.
I have no idea. He absolutely takes a bath on this shit, right? Yeah.
No, he has to. And I believe him that Ziz lied to him about the whole scenario to get his money.
I do think this was essentially a con from her. He says, quote, Ziz led me to believe that she had established contacts in the Bay and that it would be easy for us to at least get a slip, if not one that was approved for overnight use.
And as it turns out, when we were coming through the Inside Passage from Alaska, it was revealed that we did not have a place to arrive.
Wait. Oh, I didn't realize he sailed it down from Alaska.
Yeah, they all sail it together. Them and a couple other rationalists that they pick up.
They like make a post on the Internet being like, hey, any rationalists want to sail a boat down from Alaska? Talk about our ideas while we live on a boat. Oh, man.
These people need space. Yes.
Just get a warehouse. It's Oakland.
Just get a warehouse. The Ghost Ship fire had happened by that point.
So I don't think warehouse space was easy to get. Fair, yeah. But I think you're right, in an earlier era they would have just wound up living in, like, a warehouse, and maybe all died in a horrible fire, because there were issues with that kind of life too. But they would have had an option besides the boat thing. Anyway, the Caleb is not in good shape. Again, this boat is 70-plus years old.
It is only livable by punk standards. And while it was large enough, it is a 94-foot boat.
You can keep some people on there. It's also way too big to anchor in most municipal marinas, especially since the boat has 3,000 gallons of incredibly toxic diesel fuel.
And it's not really seaworthy, which means there's this constant risk of poisoning the waters it sits in that the authorities are just going to be consistently like, guys, you can't have this here. Guys, you just, you simply can't have this here.
So they just got to operate out in international waters like a cruise ship? No, they're just kind of illegally anchoring places and hoping that it's fine and periodically getting boarded over it.
Another crew member on the ride down from Alaska was just kind of there. They're just there, you know, for the adventure. So they leave and don't come back after they get to the Bay. But this person expressed an opinion that Ziz consistently came off as creepy, but not scary.
At one point, he says that she confronted him and told him he was transgender. And when he's like, no, I'm really not.
She told him he was. Yes.
She does this a lot, tells people, I know that you're this. And it works, like, that's how a number of her followers come to her. But also, it doesn't work a lot of the time.
A lot of people are like, no, I'm not, you know, whatever it is you're saying. She does this to Gwen too.
So I don't doubt his story. Like, she just kind of decides things about people and then tries to brute force them into accepting that about themselves.
And when there are people who are both desperate for approval and affection and also housing insecure and need the boat, or wherever, to live with her, a number of them feel a significant pull to just kind of accept whatever Ziz is saying about them. Yeah.
I mean, when you're desperate in that way, you kind of definitely find yourself bending things to have a roof over your head. Right.
Yep. And it's a very normal cult thing, right? Like this is an aspect of all of that kind of behavior.
Now, by this point, a few other people have come to live in the rationalist fleet. Among them are Emma Borhanian, a former Google engineer, and Alex Lethem, a budding mathematician. The flotilla became a sort of marooned aquatic salon. Wired quotes Ziz as emailing to a friend at the time: We've been somewhat isolated from the rationalist community for a while, and in the course developed a significant chunk of unique art of rationality and theories of psychology aimed at solving our problems.
I'm excited for this psychology you built on the boat.
Yeah. Wired continues: As LaSota articulated, their goals had moved beyond real estate into a more grandiose realm. We are trying to build a cabal, she wrote. The aim was to find abnormally intrinsically good people and turn them all into Gervais sociopaths, creating a fundamentally different type of group than I have heard of existing before. Sociopathy, LaSota wrote, would allow the group members to operate unpwned by the external world.
Yeah, that is, because you had said that before, right? That's sort of what they're looking to be.
Yeah, they're obsessed with this idea, which is initially, like, kind of a joke about The Office. But they're like, no, no, no. It actually is really good to have this sociopath at the top who moves and manipulates these lesser fools and whatnot and puts them into positions below them. Like, that's what we need to be in order to gain control of the levers of power.
Oh, man.
We have to make ourselves into Ricky Gervais sociopaths. Yeah.
Great. What a good ideology.
I love that they still love pop culture, though, you know? They're obsessed with it. And again, this is, you can't talk about this kind of shit if you're regularly having conversations with people outside of your bubble.
Exactly. It's the thing.
Yeah, if you have somewhere to go. if you have anywhere to go, this can't fly.
Yes. Yes.
If you've got a friend who's like a nurse or a contractor you have drinks with once a week and you just talk about your ideas once, they're going to be like, hey, this is bad. You need to stop.
You're going down a bad road. Do you need to stay with me? Are you okay? Yeah.
This is clearly like a cousin. Yes, someone.
This would be so upsetting for someone to just casually talk about at, like, a paint and sip. Like Ricky Gervais? Yeah, Ricky.
So by this point, their break with mainstream rationalism had gone terminal. Gwen criticized the central rationalist community for, quote, not taking heroic responsibility for the outcome of this world.
In addition to the definitely accurate claims of sexual abuse within rationalism, they alleged organizations like CFAR were actively transphobic. I don't know how true that is.
Some of the articles I've read, there's a lot of trans rationalists who will be like, no, there's a very high population of trans people within the rationalist community. So people disagree about this.
It's not my place to come to a conclusion. But this is one of the things that Ziz says about the central rationalist community.
Ziz had concluded that transgender people were the best people to build a cabal around because they, quote, from Ziz's blog, had unusually high life force.
Ziz believed that the mental powers locked within the small community of simpatico rationalists they'd gathered together were enough to alter the fate of the cosmos if everyone could be jailbroken into sociopaths.
And these are all double goods as well.
Well, no, she's the only double good, actually. She becomes increasingly convinced that they're all just single good, right? And this is like her beliefs about heroism from the last episode.
If you've got the community and the hero, the community's job is to support the hero, right? Like, no matter what. It was, like, blind support, right? Blind support, no matter what. And a lot of the language Ziz is using here, in addition to being, you know, rationalist language, is all, like, Scientology mixed with gaming and fantasy media.
She talks about the need to install new mental tech on her and her friends. Tech is, like, a Scientology term, right? Like, that's a big thing that they say. She and her circle start dressing differently.
Ziz starts wearing like all black robes and stuff to make her look like a Sith or some sort of wizard. Her community adopts the name vegan anarcho transhumanism and starts unironically referring to themselves as vegan Sith.
Imagine being in the boat community when they move in. Yeah.
It's like, what the fuck is going on? I just wanted... I'm just an alcoholic. What's happening? I just wanted to be like Quint from Jaws.
Oh, no. I'm just here because my wife left me.
I think I might die a different way than a great white attack now. This is looking bad.
Yikes. Oh, man.
So around this time, Gwen claims she came up with a tactic for successfully separating and harnessing the power of different hemispheres of someone's brain. The tactic was unihemispheric sleep.
And this is a process by which only one half of your brain sleeps at a time. In a critical write-up published as a warning before the killings that are to come, a rationalist named Apollo Mojave writes: Normally, it is not possible for human beings to sleep with only one hemisphere. However, a weak form of UHS can be achieved by stimulating one half of the body and resting the other. Like hypnosis or fasting, this is a vulnerable psychological state for a person. Entering UHS requires the sleeper to be exhausted. It also has disorienting effects, so they are not quite themselves.
And I disagree with him there; they're not actually sleeping with only one hemisphere. And in fact, I think they may have taken this idea from Warhammer 40,000.
That's so funny. Because it's something space marines do.
Because, yeah, what are you talking about? But yeah, that's not a thing. Like, you know, yes, if you don't let yourself sleep for long periods of time and kind of let yourself zone into a meditative state, you'll get a trippy effect.
Like you will become altered. You're altering your state.
And you can, this is why cults deprive people of sleep. You can fuck with people's heads a lot when they're in that space.
But this isn't what's happening. I like to think of them on the boat, just only using one half of their body.
Right, right. Like one eye open.
Watching the office. Furiously taking notes.
So this is how that write-up describes the process of uni-hemispheric sleep. One, you need to be tired.
Two, you need to be laying down or sitting up. It is important that you stay in a comfortable position that won't require you to move very much.
In either case, you want to close one eye and keep the other open. Distract the open eye with some kind of engagement.
Eventually, you should feel yourself begin to fall asleep on one side. That side will also become numb.
The degree of numbness is a good way to track how deep into sleep the side is.
Once into UHS, it is supposed to be possible to infer which aspects of your personality are associated with which side of the brain. And the goal of unihemispheric sleep is to jailbreak the mind into psychopathy fully, right? And that's how Ziz describes it. That's the goal.
That's the goal.
That's their goal.
Got to make ourselves into psychopaths so we can save the world. But it also gets used for, like, I have this thing I don't like, that I react this way in this situation, so get me into this sleep pattern and you talk me through it and we'll figure out why I'm doing it. They describe it as using tech to upgrade their mental capabilities, right?
So they're just kind of brainwashing each other.
They're like fucking around with some pretty potentially dangerous stuff. And again, drugs are definitely involved in a lot of aspects of this, which is not usually written up, but you just have to infer.
There's some disagreement around all this, but it seems accurate to say that Gwen is the one who came up with the unihemispheric sleep idea. But a lot of the language around how this tactic was used and what it was supposed to do came from Ziz.
And again, the process is just sleep deprivation, right? This is cult stuff. It's part of how cults brainwash people, but it also wouldn't have seemed inherently suspicious to rationalists because being part of that subculture and going to those events had already normalized a slightly less radical version of this behavior, as this piece in Bloomberg explains.
At house parties, rationalists spent time debugging each other, engaging in a confrontational style of interrogation that would supposedly yield more rational thoughts.
Sometimes, to probe further, they experimented with psychedelics and tried jailbreaking their minds to crack open their consciousness and make them more influential or agentic. Several people in Taylor's sphere, and Taylor is one of the sources, had similar psychotic episodes.
One died by suicide in 2018 and another in 2021. So in the mainstream rationalist subculture, they are also trying to consciously hack their brains using a mix of drugs and meditation and, like, social abuse, and people kill themselves as a result of the outcomes of this. This is already a problem in the mainstream subculture.
Yeah. Let alone this extremist offshoot, right?
Yep. In her own writings at the time, Ziz describes hideous fights with Gwen in which Gwen tries to mentally dominate and mind control Ziz. They both become believers in a new theory Ziz has. She uses the term mana, which she describes as, like, your ability to persuade people: if you can convince someone of something, it's evidence that you have an inherent level of magical power.
And someone with naturally high mana, like Ziz, can literally mind control people with low mana. That's what she believes she's doing whenever she tries to talk someone into something about themselves.
She's mind controlling them. And she and Gwen have mind control battles.
At one point, they start having one of these arguments where basically Gwen threatens to mind control Ziz, and Ziz threatens Gwen back, and this starts a verbal escalation.
And the way Ziz describes this escalation, which is, again, these are two sleep-deprived, traumatized people fucking with each other's heads on a boat.
But the way that Ziz describes the escalation cycle is going to be important because this has a lot to do with the logic of the murders that are to come. I said that if they were going to defend a right to be attacking me on some level and treat fighting back as a new aggression and cause to escalate, I would not at any point back down.
And if our conflicting definitions of the ground state, where no further retaliation was necessary, meant that we were consigned to a runaway positive feedback loop of revenge, so be it. And if that was true, we might as well try to kill each other right then and there, in the darkness of the Caleb's bridge at night, where we were both sitting, lying under things in a cramped space. I became intensely worried they could stand up faster. Consider the idea from World War One: mobilization is tantamount to a declaration of war. I stood up, still, silent, waiting.
Oh...
So you see, first off. And there's other people there as well.
Yes, yes. And just like the logic of, well, obviously, if you attack me, then I'm going to counterattack you and then you're going to counterattack me, which means eventually we'll kill each other.
So we should just kill each other now. Like when you are taking your advice on how to handle social conflict from the warring European powers that got into World War I, maybe not a good, like positive example.
It's just, even in understanding how they got there, it still is such a stretch. Like, even having all this background, it's still really taking some leaps. Yeah, I mean, just having a fight with your friend and then opening your locket, which has, like, Kaiser Wilhelm and the Czar in it, and going, what would you guys do here? Ancestors, guide me.
And again, you know, part of what's going on here is this timeless decision theory bullshit, right? Ziz believes that if she makes it clear, at the point when they start having a conflict, that the stakes will immediately escalate to life or death, then
Gwen won't risk fucking with her, right? But by doing this, she also immediately creates a situation where she feels unsafe. However, in that conflict, Gwen yields, and Ziz concludes that the technique works, right? So then her mana must be what she thinks it is. Her mana is strong, and this is a good idea for handling all conflicts, right? So I'm going to increasingly teach all these people who are listening to me that this is the, like, escalation loop that you handle every conflict with. Right.
Great stuff.
One of the young people who got drawn to Ziz at this time was Maya Pasek, who blogged under the name Squirrel in Hell.
She wrote about mainstream rationalist stuff, citing Yudkowsky and Elon Musk.
But in her blog, there's like a pattern of depressive thought.
In one 2016 post, she mused about whether or not experiencing joy and awe might be bad because it biases your perception. So this is a young person who I think is dealing with a lot of depressive issues.
Yeah, that's classic stinking thinking, as they say. Right.
And maybe the community is not super helpful to her. She was working to create a rationalist community in the Canary Islands.
She's kind of trying to do the same thing Ziz did, but like in an island where it's cheaper to live. Is this a thing that can exist a lot of places? Sure.
Yeah. I mean, yeah.
If you've got cheap rent, you can get a bunch of weirdos who work online to move into a house with you. Right, fair.
Yeah, like that's always possible.
She found Ziz's blog and she starts commenting on it.
She's particularly drawn to Ziz's theories on mana and Gwen's theory about hemispheric personalities. In one of her most direct cult leader moments, Ziz reaches out directly to Maya as she's, like, posting on her blog, and emails her saying, I see you liked some of my blog posts. Truly a sinister opening.
Yeah. No, that's a bad guy.
My true companion, Gwen, and I are taking a somewhat different, than MIRI, that's the organization, one of the rationalist organizations, approach, which is to say... Is that what they call each other? That's how they refer to it? Yeah, they're true companions. True companions, okay.
At this point. Or taking a somewhat different approach than the MIRI approach to saving the world.
Without much specific technical disagreements, we are running on somewhat pointed to by the approach. As long as you expect the world to burn, then change course, right? So basically, we still expect the world to burn so we can't keep doing what the other rationalists are doing.
And she lays out to this girl she meets through a blog post her plan to find abnormally, intrinsically good people and jailbreak them into Gervais sociopaths.
She invites Maya to come out, and I don't think that happened, but they do start separately journeying into debucketing, and Maya gets really into this unihemispheric sleep thing. And Ziz is kind of like coaching her through the process.
She tells Maya that one of her hemispheres is female because Maya is a trans woman. And Ziz tells her one of your brain hemispheres, each of which is a separate person, is female, but the other is male and quote, mostly dead.
And your suicidal impulses are caused by both the pain of being trans and also the fact that there's this dead man living in your head that's, like, taking up half of your brain's space. And so you really need to debucket in order to have a chance of surviving, right? Okay.
So she needs to be jailbroken to be free. To be free.
And Maya will basically replace her sleep entirely with this unihemispheric sleep crap, and not sleeping exacerbates your depressive swings and leads to deeper and deeper spirals of suicidal ideation. She is believed to have died by suicide in February of 2018.
She posts what is essentially a suicide note that is very rationalist in its verbiage, literally titled Decision Theory and Suicide. And this is the first death directly related to Ziz and Gwen's ideas, but I think it's important to note the role mainstream rationalism plays in all of this. Suicide is a common topic at CFAR events, and people will argue constantly about whether or not it's better for a low-value individual to kill themselves, right? Is that of higher net value to the world? And it was also used as a threat to stop women who were abused by figures in the community from speaking up.
And this is from that Bloomberg article. One woman in the community who asked not to be identified for fear of reprisals says she was sexually abused by a prominent AI researcher.
After she confronted him, she says, she had job offers rescinded and conference speaking gigs canceled and was disinvited from AI events. She said others in the community told her allegations of misconduct harmed the advancement of AI safety.
And one person suggested an agentic option would be to kill herself. So there is just within rationalism, this discussion of like, it can be agentic as in like you are taking high agency to kill yourself if you're going to be a net harm to the cause of AI safety, which you will be by reporting this AI researcher who molested you, right? Yeah, because you're taking them, man.
Yeah. Shit.
These people, like, this whole community is playing with a lot of deeply dangerous stuff. And a bunch of people are going to either kill themselves or suffer severe trauma as a result of all of this.
Yeah. Escaping this, even putting yourself back together after living this way, seems like it would be such a task.
And like any cult, part of the difficulty is, like, teaching yourself how to speak normally again, how to not talk about all this stuff, right?
Yeah, not identify as a vegan Sith.
Right, right. Because, like, I gotta say, people who are really in the community will note, like, a dozen different other concepts and terms, in addition to vegan Sith and Gervais sociopaths and shit, that I'm not talking about that are important to Ziz's ideology.
But, like, you just can't. Like, I had to basically learn a different language to do these episodes. And I'm not fluent in it.
Right. Like, you have to triage, like, what shit do you need to know? You know? Yeah.
It's so deep. It's so deep, deep and silly.
Let's do an ad break and then we'll be done. And we're back.
So I'm just going to conclude this little story and then we'll end the episode for the day. So this person, Maya, has likely killed themselves at the start of 2018.
And Ziz reacts to the suicide in her usual manner. She blogs about it.
She took from what had happened, not that like debucketing might be dangerous and unihemispheric sleep might be dangerous, but that explaining hemispheric consciousness to people was an info hazard. She believed that people who were single good, like Maya, were at elevated risk because learning that one of the whole persons inside them was evil or mostly dead could create irreconcilable conflict leading to depression and suicide.
And she comes up with a name for this. She calls this Pasek's Doom.
That's what she, like, names the info hazard that kills her friend, whose head she's been fucking with. So that's nice. Yeah, as nice as anything else in this story. I think you might have been the doom here. Yeah.
I think you were the whole problem. But now it's an info hazard to explain a person's like.
To explain your theories. Yeah.
To a person who can't handle it, I guess. Yeah.
And she comes to the conclusion it's a particular danger to explain it to single good trans women, who are the primary group of people that she is trying to recruit. So she, like, admits her belief is that this thought thing I've come up with is particularly dangerous to the community I'm recruiting from, but it's essential.
This information is absolutely essential to saving the world. So you just have to roll the dice.
Yeah.
It isolates her within her own group that she's created.
Well, yes.
And it also, she is then consciously making the choice: I know this is likely to kill or destroy a lot of the people I reach out to, but I think it's so important that it's, like, worth taking that risk with their lives.
Yep.
Good stuff. Yeah.
Anyway, how are you feeling? Got to give a plug? I am okay. I'm, you know what? I'm deeply sad for these people who are so lost.
And I'm also pretty interested because this is crazy, but I'm okay. I'll be back.
Great. Happy to see that.
Well, everybody, this has been Behind the Bastards, a podcast about things that you maybe didn't think you needed to know, about how the internet breaks people's brains. But also, a lot of people surprisingly close to this community are running the government now.
So maybe you do need to know about it. Sorry about that info hazard.
Behind the Bastards is a production of Cool Zone Media. For more from Cool Zone Media, visit our website, coolzonemedia.com or check us out on the iHeartRadio app, Apple Podcasts, or wherever you get your podcasts.
Behind the Bastards is now available on YouTube.
New episodes every Wednesday and Friday.
Subscribe to our channel, youtube.com slash at Behind the Bastards.