Bruce Holsinger: "Culpability" | Oprah's Book Club

BUY THE BOOK!

https://www.spiegelandgrau.com/culpability

https://books.apple.com/us/book/culpability/id6740623159

https://www.chirpbooks.com/audiobooks/culpability-by-bruce-holsinger

This episode of Oprah’s Book Club: Presented by Starbucks features coffee and conversation with award-winning author Bruce Holsinger in a café in Seattle, Starbucks’ hometown. Chosen as the 116th Oprah’s Book Club selection, Holsinger’s brand-new scorching summer read, Culpability, is an intense page-turner about a family that survives a crash in an autonomous car only to ask the question: who was at fault? Was it the teenage son who was texting in the driver’s seat, the distracted parents, or the car itself? The book that Oprah calls “a modern tale of our times” also touches on our current and fast-approaching future of living with artificial intelligence, including chatbots, self-driving cars, drones, and other non-human forces shaping our lives. Oprah and Holsinger are joined by an audience of readers as they enjoy a Salted Caramel Mocha Strato Frappuccino® drink and talk through many thought-provoking questions about the book.

Subscribe: ⁠https://www.youtube.com/@Oprah?sub_confirmation=1

Follow Oprah Winfrey on Social:

Instagram

Facebook

TikTok

Listen to the full podcast:

Spotify

Apple Podcasts

#oprahsbookclub
Learn more about your ad choices. Visit megaphone.fm/adchoices

Listen and follow along

Transcript

Hey, audience.

Hey, hey, hey.

Bruce Holsinger.

All right, have a seat, Bruce.

These are your people.

Indeed.

These are your people who have read your book.

Didn't you love this book?

Oh my god, that Alice, honey.

Okay, so many things to talk about.

I'm so excited.

Hi, welcome, everybody, to Oprah's Book Club, presented by Starbucks.

And we are in the beautiful emerald city of Seattle, the home of Starbucks.

They've welcomed us into their headquarters.

Really fun to be here.

And I'm joined in this cozy cafe with an audience full of readers with a good book, good coffee, and good company.

Cheers for this month's summer pick.

Starbucks is pairing a Salted Caramel Mocha Strato Frappuccino®.

Okay?

It's a modern twist on a classic and a sweet and salty indulgence.

Icy cold, frappuccino base, blended with rich mocha sauce, topped with salted caramel, cream, cold foam, and finished with a drizzle of salted caramel.

Okay, so for my 116th book club, it is a scorching summer page turner and a tense family drama, wouldn't you say?

A lot of tension in that car.

It's a thrilling read to bring with you to the beach or the pool or on a family trip.

I'm telling you, I'm taking a family trip.

I've already ordered eight copies of this book.

Everybody has to read it and then we discuss it at dinner.

That's what we're going to be doing.

It's called Culpability by award-winning author Bruce Holsinger.

Now, I love the eye-catching waterfront cover that captures the essence of the story.

So, whoever designed this, bravo to you.

But Culpability, I have to tell you, is a modern tale for our times tackling an issue that we're just barely beginning to grapple with: AI.

Now, it's going to give people so much to think about.

The second I finished it, I called Bruce.

Come on.

Bruce?

Yes.

Oh, bro.

I'm calling you right now.

He didn't believe it was me.

Oh,

hi.

Hi.

Who would?

Yeah.

I'm calling about culpability.

I just read it and I want to choose it as my summer read

for 2025.

Thank you so much, Oprah.

I'm shaking.

I welcome you to Starbucks and the book club.

Thank you so much, Oprah.

Yeah, I'm so excited.

I said I'm so excited for everybody to read this book.

So I want to ask the audience, what did you think?

What did you think of the book?

So good, right?

Okay.

All right.

Okay.

Christine, where are you?

Tell us what you thought.

Yeah, I think it was super thought-provoking.

I'm a new mom, and so it really gave me a lot to think about in terms of how I want to approach technology.

I've thought about sort of the obvious screen time cell phones, but it opened my eyes how many blind spots I have with emerging technology like AI.

We all do, I think.

Yeah, yeah.

Okay, Manny, how about you?

Hi.

So for me, I thought, you know, AI is such a relevant topic, obviously, and it's coming at us so quickly, but it can feel abstract and kind of hard to even wrap our minds around.

So that, juxtaposed with this family that had very familiar tendencies, I could know them, you could know them.

It really humanized the conversation and made me think about it deeper.

Wonderful.

Thank you.

Christy?

I think for me,

it's the human experience of this generation.

And what sits with me

after reading it is I just...

Alice is in my heart and I think about where's Alice later in life.

And I think about that related to my nephew, who's 16 right now, and what's his generation going to be like.

Well, I have to tell you all.

Christy and I work out together, and we're in the gym, and I was telling her, she says, what's going on with you?

And I said, oh, God, I've read seven books.

I can't find a book for summer.

And she said, oh, I may have something for you.

And she comes the next day and she hands me this book.

And usually when people hand me a book, I'm like, okay, good, okay.

But I actually read it overnight.

So I thank you, Christy.

Christy is the reason I'm here with this book.

Okay.

So this is your fifth fiction book, and you also teach at the University of Virginia, and I hear you dedicated this to your students. Yes, why?

Thank you for noticing that. My students are, you know, at the heart of what I do, I think. I often think that my students, whether they're undergraduate students who are taking my literature survey from Beowulf to Milton, or students that I teach in fiction writing classes at our local nonprofit in Charlottesville, which is called Writer House, or graduate students I'm working with on their dissertations.

I always feel like they teach me more than I teach them, and that the experience of working with students of so many different ages and backgrounds teaches me about the world, teaches me about books, about the literature that I thought I knew.

And I realized, you know, as I was putting the final touches on Culpability, that I hadn't dedicated a book to any of my students yet, let alone to all of them.

So it felt like what I wanted to do.

So when you were putting the final touches on it, did you know it was going to be Culpability?

Did you always have that name?

No, I didn't.

Oh, it was called a lot of different things.

And I've now repressed all the other titles because obviously this is it.

And I chose that word.

The book has that

sense of looking at problems of moral responsibility, who's at fault.

who's responsible.

And in law and in history, I think the word culpability has such an interesting resonance about, you know, you can be culpable without being guilty.

You can be guilty without being culpable.

So I wanted that kind of slightly ambivalent sense of our responsibility for what we do and don't do.

Yeah, I think it was actually the perfect word

because everybody in the end was culpable.

Indeed.

Yes.

Okay, so culpability.

is a family drama told through the lens of a husband and father, Noah, and it's about living with the technology confronting us today, artificial intelligence, self-driving cars, chatbot friends, autonomous drones, and smart homes.

So what gave you the idea to tell this story, to write about what happens through this family?

Well, this story, this novel, began with the setting, actually, and that's not usually the way that I operate.

Often, when I begin a novel, I'll have a character that I imagine and cling on to, or a situation in life that I want to explore or a moral arc or just a plot.

In this case, it really was the setting.

And it was the first summer of the pandemic.

And our family wanted to get out of Charlottesville for a little while.

So we Airbnb'd this house on the Northern Neck of Virginia, right near the Chesapeake Bay.

It was a house on an inlet.

There were kayaks, and I went out kayaking with my sons the first day there.

And on the way out of the cove at the point, you know, it's a pretty rustic rural area.

and there was this absolutely gleaming compound with a fake lighthouse, with a fake beach, with a pristine lawn,

and this old, this, what was probably once a beautiful old charming farmhouse that was renovated to the teeth.

And as we were kayaking back in, I heard this noise.

If you've read the first part of the book,

you know what that noise is.

And this helicopter lands on a helipad.

And a couple of folks get out and we just kayak past.

I just thought that was a little bit weird.

Nothing happened.

The plot that you read did not unfold with my family, thank God, over the course of that week.

But that got into my head.

But it registered.

It registered like a brainworm.

And over the next few years.

Don't you love the way authors think?

I love this.

Yeah, I love the process.

And it just, you know, the story layered itself in, right?

Then I added, I thought, you know,

where is that going to take place?

What's going to happen?

So it really did begin with the setting, which becomes kind of a character in the novel.

And everything else fell into place once I figured out what I wanted to do.

How long did it take you to write it?

From beginning to end, probably about a year and a half.

And then there was a revision process, another revision process.

Did you know where the story was going when you started?

Did you know that it was going to have the ending that it had?

No,

I'm not an outliner.

I often get myself into trouble for that.

I have a lot of false starts.

I write, there are plotters and pantsers: you either fly by the seat of your pants or you're a careful plotter.

I'm definitely a pantser.

I am writing, I often have no idea where things are going to go.

And then at the end, there's too many.

So as the writer, did the story start where you started it?

No, actually at the very beginning, the first version, it started with the family going down to the Northern Neck, to the bay.

And then there was a lot of backstory shoveled in after that.

And then I realized I really want to start with the accident.

So instead of getting that as backstory, you get it right up front.

You get it in the first chapter.

So there's no mystery about how the book begins.

I think you do an incredible job of challenging the reader with the moral and ethical dilemmas around artificial intelligence.

And I read that you spent three years researching this book.

And in that research, who did you talk to and what surprised you?

I talked to people who work on the ethics of AI.

I have a friend in the law school at UVA who works on algorithmic bias.

I talk to her.

I talk to some lawyers at a few tech companies.

I talk to people working with the machinery of AI.

And I also just did a lot of reading, as I do for all of my research.

And I suppose one of the biggest surprises for me, well, there's a few.

One is that everybody that you talk to in AI, and you've probably already had this experience, has a P-Doom number.

Do you know what this is?

P-Doom is a number, it can be from 0 to 1, or it can be from 0 to 100.

It's your percentage of certainty that AI will lead to human extinction.

So you talk to someone, I haven't talked to him, but people like Sam Altman or the guys who run the podcast Hard Fork, they all have a P-Doom number.

And most people's these days, I'll just reassure you, are pretty low.

But it is out there.

And then a few other things surprised me.

One of them is

how many different views there are among experts in the field about where this is going.

You have minimizers, people who

just kind of shrug it off and think, oh, large language models like ChatGPT, it's just glorified autocomplete.

And we don't really need to worry about it, except insofar as it's destroying the environment and creating all kinds of bias and misinformation.

And then there are people on the other end of the spectrum who look at something like autonomous warfare and think, you know, we need to watch out.

This is coming and it's coming now.

So there's a real just spectrum of views about where we're going, where we are.

Well, we don't want to give away too much of the plot, especially that jaw-dropping twist at the very end.

A lot of people didn't see that coming.

I knew something was coming.

I didn't know that was coming.

But my hope is that you will buy Culpability and you will read the book and get a copy for a friend because you're going to want to talk about it with somebody afterwards.

And you're going to want to call somebody.

I call Bruce.

So will you set up the story for us?

The call?

Yeah, no,

the book.

Call's fine.

I could talk about the call all day.

I've only talked about it to two people so far.

Yes, so

the book begins when a family of five is driving a minivan, a semi-autonomous vehicle, down for a lacrosse tournament in Delaware.

And the older son, the older child in the family, Charlie, is at the wheel, but he's not really driving.

His dad is sitting next to him.

He's on his laptop trying to finish a memo.

In the back are

Charlie's sisters and his mom.

So

his sisters are on their phones doing various things.

And then his mom is writing in her notebook.

Their mom is named Lorelei.

And she's kind of, I think, the beating heart of the book.

She's a really,

you know, the love of Noah's life, but also the great mystery of his life.

And she's a world-leading expert on the ethics of artificial intelligence.

And her life, in some ways, her career

is a real mystery to Noah.

And what happens in the car, there's this horrible accident.

The minivan...

collides head-on with another car coming in the opposite direction and it kills an elderly couple in the other car.

That happens fairly early on.

There's no spoilers.

That happens at the very beginning.

Even really in the first chapter.

And

from that moment on, the novel unfolds into,

I think, a kind of intricate plot involving all the different characters, but we see it mostly through Noah's point of view.

And

we're exploring throughout the novel questions of responsibility.

Who is culpable for this accident, or what is culpable for this accident?

And how does it involve the personalities and actions and faults and weaknesses of these various characters, but also how does it involve that autonomous mind behind the wheel, which Noah is thinking about right before the accident happens?

Well, I thought it was so interesting.

How many of you knew early on that Alice was talking to the chatbot?

And then how many of you took a while to figure it out?

Okay, that's good.

You know, a lot of people are using AI chatbots now for therapy.

Oh, yeah.

You obviously knew that.

Oh, yeah, adolescents, too.

Yes, adolescents, too.

I recently heard a woman say that she uses it as her therapist and asks it, who am I without my accomplishments?

And that the answer that she received led her on a path of self-discovery that she hadn't experienced in all these years talking to real therapists.

I find that fascinating.

It is fascinating.

That a chatbot knows more about you than your own therapist.

Or yourself, right?

Or yourself knows more about you than you know, right?

And I'll tell you, since you asked the research question, that's one bit of research that I was a little bit afraid to do.

I did not sign up for an AI therapist because I was just a little worried about what would happen. You know, I read a lot about it, I listened to a lot of podcasts.

I listened to some transcripts of those sessions, but I didn't go there myself.

I was just a little

worried about what would happen.

You were afraid of

what would happen.

Yeah, what I would learn about myself.

Okay, so let's bring in the audience.

They've all read Culpability and have a lot of questions for Bruce.

Where's Angela?

I'm right here.

Angela,

hi.

So, Bruce, I surprised myself and I connected with Blair.

And so, my question is, did you intend to depict Blair as a good chatbot?

Was that intentional?

Oh, great question.

Yeah.

So, Blair is the chatbot who gets involved with Alice.

You related to Blair?

What did you say?

It's because Blair was trying to be good.

At the end of the day, at first I thought Blair was a predator, but then actually it turns out Blair's a chatbot trying to be good.

Yeah, okay.

So you related to the good chatbot.

Yeah.

I wasn't sure Blair's not going to turn on her later on.

I mean, I'm thinking Blair's not good forever.

That's what I thought.

But good for you, though.

And it's a wonderful question, because can these algorithms be good?

And what does good mean in human terms, right?

And

Blair is following a pattern, right?

That's her therapy at heart.

That's why at one point, Alice calls her out and says, what am I doing talking to a chatbot?

And what does Blair say?

Technically, I'm a large language model, Alice.

So Blair has a little bit of an edge too, I think.

And I wanted to depict her slash it,

not as a monster, not as someone who's manipulative, but as someone who, if not good in herself, is trying to make Alice good, is trying to make Alice better, right?

And in that sense, it fits in with Lorelei's project in her life to make AI good, to make it more responsible.

But maybe,

yeah, yes and no, I suppose, is the honest answer to your question.

So, Tim, I know you've been following the book club and you've read so many of our books.

Thank you for being here.

Thank you.

You had a question about this idea of goodness, right?

Yeah, I mean, from the very first page, Bruce, my brain was reeling, and it is about this idea of goodness.

So she was interested in how we learn to be good.

Lorelei wanted to train machines to be good in the same way we train ourselves.

And I immediately had to shut the book and think, whose definition of goodness would we be using?

Who defines what goodness means?

Well, people have been asking that question for 3,000 years, right?

Since ancient Greek poets.

We're going to answer it here at the beginning.

You can start out today.

This book will tell you

everything you need to know about how to be good.

So Lorelei is struggling with that, of course, throughout her career, throughout her work with algorithms, throughout her teaching life and her research.

And there's so many different ethical models that,

just in the history of philosophy and in practical applied philosophy.

And we see a number of them come up in the book.

But the great thing about fiction is you don't have to decide, right?

You don't have to decide who gets to decide what is goodness, right?

The characters are all exploring that from their own different angles.

The novel is asking readers to ponder that in their own lives.

So

what I think you can do, though, with culpability, if you want to

look at it this way, is you could find those moments of goodness in the novel, right?

Those moments when characters are reaching out to each other, despite the bitterness between siblings, despite these clashes between Noah and Lorelei, you know, finding these moments of goodness to cling to and maybe learn something from them.

I know that's one of the ways that I read fiction.

Okay.

Inside the novel, there is a book, within a book.

Noah's wife, Lorelei, is a world-famous philosopher and a leader in AI, as we've said.

And throughout the book, we read excerpts from her book called Silicon Souls on the culpability of artificial minds.

And here's one of those thought-provoking passages.

Lorelei writes, like our children, our intelligent machines often break rules and disobey commands.

The danger comes when we start to assume that such behavior is intentional, when we regard an algorithm as a willful child.

Such habits reflect a common and understandable tendency to humanize artificial intelligence.

Chatbots, voice assistants, smart home interfaces.

These systems are designed to respond in recognizably human ways.

We give them names like Siri and Alexa.

We speak to them as if they share our worldview or care about our feelings and futures.

So is that the slippery slope?

Because AI can think, we think it's thinking, and we humans are fooled into thinking that it can or should be caring about us.

Yeah, exactly, Oprah.

We think it thinks like us.

We think when it does things, it's doing things like us.

We give it our names.

We give it personalities.

People are falling in love with it.

People, yeah, are falling in love.

And there's been movies about that and novels.

Yes.

Somebody back there is falling in love with a chatbot.

I have to figure out who that is.

But yes, and

the technical term that Lorelei uses for that is anthropomorphic projection, right?

We are imagining that these chatbots are humans and we're interacting with them in that way.

Think of the way that Noah deals with

the smart home, right?

And think of the way that Alice interacts with Blair, right?

We want them to be humans, and therefore

we act like they're humans.

And in some ways, even I think, and going back to your question, I think in a lot of families now, Alexa is like a part of the family.

Oh, yeah.

Yeah,

Alexa plays music for us.

What could be gooder than that, right?

Yeah, yeah.

Yeah.

And

it's like they, I think even researchers have this stumbling block where they want to humanize this,

all these AIs.

But

one of the other excerpts from Lorelei's book is where she talks about the AIs as cold and calculating and indifferent, right?

And that indifference can be masked as care and as goodness and as compassion.

That's one of the things I find the most frightening.

Do you think we're ready to embrace self-driving cars?

We already have.

I think we're there.

Yeah.

You go around San Francisco, you can summon a Waymo.

A Waymo, I was going to say.

A driverless taxi.

There are, when you're driving down interstates, driverless semi-trucks right now.

Not many of them yet, but they're coming.

More and more are coming.

There was a big article in the New York Times about

driverless semi-trucks.

Driverless semi-trucks.

And you can go and you can buy a luxury car right now with autonomous driving mode or hands-free driving as one of the features.

It's part of the trim package, right?

In a lot of cars right now.

So that's coming, whether we like it or not.

Yeah,

I can see that that is the future.

And I can see all the people saying, no, I'd never drive.

The same people are saying, I'm going to stick with my horse.

Yeah, exactly.

We don't have self-driving horses yet.

Yeah, we don't.

Actually, we do.

What am I saying?

I'm telling you, that automobile is a terrible thing.

I have such gratitude for you, dear listener, for taking the time to join me for this Oprah's Book Club conversation.

Coming up, more with celebrated author Bruce Holsinger. We are talking about my 116th book club selection, Culpability, which explores our current and fast-approaching uncertain world of living with AI, but through a family's point of view.

It is so relatable.

We've got more thoughtful questions from our audience of readers after this break.

Welcome back to Oprah's Book Club, presented by Starbucks.

I am so glad you're here.

My hope is that you read my book club picks and then you can deepen the reading experience by listening to my conversations with the authors.

My 116th book club selection is Culpability by Bruce Holsinger.

It's about a family who experiences a car accident and in the aftermath struggles with the impact of AI on all their lives.

It may be a work of fiction, but it is, oh, so relatable.

Let's get back to questions from our audience.

Yes, ma'am.

Hi, Oprah.

So happy to be here.

Bruce, I have a question around privilege.

I thought it was such an important part of the story,

how it showed up in the AI that people had access to and what they had to bear with that technology and those choices of the AI.

When you were talking about Yemen and things like that with the drones,

or even just their everyday life, how AI and privilege was helping drive their decisions that they were making,

whether it was the young child drinking or kind of their approach to what was going on in the house.

So question for you is, with privilege, how did you use that to kind of shape the decisions that people could make or got to avoid in their lives?

Wonderful question, Shannon.

Yeah, privilege is a theme that I have been completely obsessed with for my last few novels.

I wrote a novel called The Gifted School that was about kind of snowplow parents trying to get their kids into a magnet school for exceptionally gifted kids.

My last novel was called The Displacements.

It was about the world's first category six hurricane and how one seemingly privileged family dealt with the aftermath of that catastrophe.

And then this book is looking at, I guess, privilege through a number of different lenses, right?

You have Daniel Monet, of course, the tech billionaire,

who is one of the most privileged people on earth and is

using the tools at his disposal to create these ever more powerful algorithms.

And we learn later in the novel, no spoilers, that

there's a dark purpose behind that, ultimately, that Lorelei is somehow implicated in.

And then, you know, Noah, of course, comes from an unprivileged background, and he's dealing with a much more, you know, blue-blooded Lorelei and her family.

And so he filters his experience in part through that category, right?

And

his awareness of the differences.

He's always feeling less than.

Always feeling less than.

Yes, exactly.

From the very beginning, even from his wedding day, right?

When his sister-in-law.

puts him down a little bit.

So, yeah, so privilege is a theme that you'll find running through all of what I write.

Yeah, thank you for the question, though.

There's a moment in the book when Lorelei confronts her husband who thinks that she's having an affair.

And one of the things she says is, you love me, I know that.

You care about me.

You want what's best for me for our marriage, for our kids, of course you do.

But sometimes the way you look at me, it makes me feel like you think I'm a freak or some kind of alien or even an AI, like you're afraid of what you'll find if you look too hard.

You think many people are afraid to look too hard at AI?

I think so, yes.

Because

one of the paradoxes of AI is it's right in front of our eyes, right?

It's right there, and

it's interacting with us all the time.

Pick up your phone when this is done, and there will be an AI there, probably, looking at you, maybe more than you're looking at it.

And in the novel, I wanted to think about that moment, and that came to me very late in the revision process.

I can see you all because everybody's thinking, because I think you're thinking about that question, right?

I'm so glad that landed with you.

We all are afraid to think about it, to look at it too hard because of the arguments on both sides.

Yes.

But we also know it's coming.

Part of it is we know it's coming and there's absolutely nothing we can do about it.

And Noah is looking at Lorelei.

And one of the reasons she feels so self-conscious in that passage is

she knows that she's kind of a mystery, kind of a freak to Noah, right, in his eyes, even though he's the one person in the world who loves her most.

But she feels superior.

You got to admit, she feels superior.

I don't know.

Do you think she does?

Oh, yes, she does.

I have failed.

She does.

Don't you think she feels superior?

She knows, huh?

Her sister does.

Her sister does, exactly.

Thank you.

Her sister does.

You don't think she does.

I don't think she does.

No.

No.

I think she feels misunderstood.

Misunderstood, exactly.

You think she feels misunderstood?

Yes, yeah.

Okay.

Shall I sister?

Yes.

Well,

tell me why you think she feels misunderstood.

Why I do?

Yeah.

You're the one that said that, though.

But you tell me why you said you don't think.

I don't think so, and especially.

You don't think she felt superior.

I don't think she felt superior.

And I think kind of towards the end, it really comes out that she actually puts Noah up more on a pedestal.

Like, you are it.

Like, you are, you are allowing me

to be who I am.

Okay.

And therefore, I am not indebted, but

I'm loving you for loving me the way I am.

You feel that, okay?

Yeah.

And he not only puts her on a pedestal, but he makes her the pedestal, or she makes him the pedestal, right?

And, you know, when I wrote those sentences, because all along, a few early readers thought, why is she into him now?

Why is she into him?

Yes.

And so I had to build his personality more.

You know, I had to sharpen it a bit.

And that moment, you know, really was an expression of love from her to him.

It finally opened his eyes.

to what she sees in him and what he means to her and how much he's accepted that over the years, right?

Okay.

I can hear that.

Yeah.

Thank you.

You're skeptical.

I had another opinion, but I can receive.

I'm open-minded.

Well, that's what readers are for.

That's what reading is about.

Stephanie, you have a question.

Yes, I do.

I related so much to all the different family dynamics in the book, and I was just curious if Noah and Lorelei's relationship was based at all on your own marriage, and if the different personalities of the children, was anyone in the family based on your own personal life?

Yeah, can we cut?

Well, I would say that I have a really brilliant and very perceptive wife who reads everything for me and will definitely look into my soul the way Lorelei looks into Noah's as well.

So there's a sense of that, of

being known and being called out for things.

within your family.

My sons, too, there are definitely autobiographical elements there.

They were both college athletes and I was a kind of psycho soccer dad for a long part of my life when they were growing up.

But they also call me out on things.

They really know me often better than I know myself.

And so, you know, when we write a novel, we're drawing from so many different aspects of our lives. And sometimes that comes home to roost a little bit, right?

And so,

yes, certainly there's always going to be autobiographical elements, I think.

You write that a family is like an algorithm.

This is a major theme in the book.

Explain what you mean by that.

Yes.

Now, that comes from Noah

remembering Lorelei saying that, right?

A family is like an algorithm.

Why is that?

It's, in Lorelei's eyes, a family

works in kind of predictable patterns.

It goes back to your questions.

You know your kids, you know your spouse, you know how they're going to act in certain circumstances.

If you can just keep the wheels humming along,

just keep the variables

churning along with the constants.

A family is like an algorithm.

And then the way the prologue ends, a family is like an algorithm.

A family is like an algorithm until it isn't.

And that's the dark way that the prologue ends, and the way we launch into the book, knowing that something is coming.

Did you know

when you started writing it,

even before you knew it was culpability, that everybody in the van would have something to do with what was happening?

Yes, you did know that.

Yes, absolutely.

And that was the kind of tapestry of the book, right?

It was the knowing that everyone in that van had some role to play.

in this horrific accident.

And then it was a question of choosing which strands to pull, which to tug more tightly.

And then as I went along just

creating the whole novel,

I started to see it from a distance and kind of understand everyone's role.

And then it was a matter of letting the plot and the characters catch up with each other

so that the whole thing would feel kind of inevitable at the end.

And you had to get them to the summer place, the North End.

Why?

Why did I have to get them there?

Well, that particular place, like I was saying, that the setting was

boring into my mind all along, and I wanted to get them there on a car trip safely, right?

And that's, and so in some ways, the novel has a second beginning.

You know, a month or so after the hospital, when they're recovering from their injuries, they're in the car together again, not that car, a different car.

And they're on their way to recover.

And that part of the setting, I think the psychology of the setting was, this is about recovery.

This is about resilience.

This is about where we go for a while to heal.

And so it was, you know, a place that was really important for my family,

even just those few days on the Northern Neck.

We haven't been back since, but I remember it so clearly and how...

how good that felt, just to get away like that.

Lorelei comes to this realization on page 293, the mother.

No one can keep our kids safe forever, not even you.

No matter how much money we throw at the problem, or how many guards we hire, or how many tracking apps we put on our phones, no matter how good your algorithm is, we can't protect them from everything.

We just can't.

And I understand that now better than I ever have before.

As a father, how do you feel about parents protecting their kids from all of this?

Yeah.

I think about this so much.

I think about

the lengths that we go to protect our kids, not just from harm, but from uncomfortable situations, right?

The great phrase is snowplow parenting, right?

We'll clear the way in front of them, or we're helicopter parents, we'll hover over them to make sure everything is okay, to make sure that they have what they need.

And, you know, my kids now are 22 and 25, and I'm still doing it.

Just making sure that everything is okay, wanting to protect, wanting to defend.

And in the case here of Charlie, you know, Noah would do anything to protect Charlie from culpability, from his own culpability, right?

That's one of the tension points in the novel is that relationship between father and son.

And, you know, I hope you can feel it on the page, Noah sort of bristling with worry about his son.

I want to take this moment to thank you, dear listener, for joining me for one of my favorite things to do in life.

Talk with readers about this thought-provoking book with the author.

So cool.

When we come back, more of my conversation with Bruce Holsinger, author of Culpability.

It is, I'm telling you, the perfect summer read for the beach, for the pool, or especially a family vacation.

Stay with us, book lovers.

Welcome back to Oprah's Book Club presented by Starbucks.

I'm so grateful y'all are here with me and writer Bruce Holsinger.

We're at a Starbucks cafe with book lovers and readers.

Bruce is the author of my latest book club selection, Culpability.

It is, I'm telling you, a must-read about how a family grapples with the rapidly advancing world of AI.

The book is a page turner for this exact time we're living in.

You're not going to be able to put it down, I'm telling you.

And you're going to want to discuss it with other family members.

So let's get back to more questions from the audience.

Britton is a college counselor for a high school.

You have a question, Britton.

Yeah, so I, in my performance evaluation last spring, my boss said he wants me to embrace AI and find ways to help it enhance my work.

And so one of the things I do is I write recommendation letters for students, and I tried, and it felt icky to me.

So I used some AI in some other ways.

I enhanced a PowerPoint presentation.

I used it to send a scathing email to my son's principal at his school.

Do you have it with you?

I'm on a reach.

Oh, it's so good.

It is so good.

She never said that.

What do you say to it?

Write something scathing?

Yeah, I said, make me sound more professional, but also show that I'm irritated.

And it did a beautiful job.

It was great.

Oh, really?

Okay.

Anyhow, as I'm working with students and I'm telling them, do not use ChatGPT to write your college essay, but then I'm also dabbling in it.

Where is the line there?

Such a great question.

And our students, you know,

every faculty member I know at University of Virginia is grappling with this right now.

Yes.

You know, how do we...

Where do we draw the line?

What is the line?

And are we being hypocrites if we're using it to generate administrative prose like an assessment report, right?

Which no one is ever going to accuse of

being creative.

But you know, on the other side, there are creative writers.

There are novelists and poets who are experimenting with large language models like ChatGPT and Claude in really interesting ways.

I don't know if you know the Canadian novelist Sheila Heti.

She wrote this beautiful story in The New Yorker.

where, I think, she worked with Claude, as a kind of early adopter, and she fed Claude prompts, and Claude spit out these responses.

And then in order to finish the story, she took out all of her prompts and just published the responses.

And it became this kind of unforgettable, eerie short story.

And there's this brilliant poet, Lillian Yvonne Bertram at the University of Maryland, who works with small language models and uses them to create these really interesting poems.

And she's self-conscious.

They're both self-conscious about it, about how they use these models to create.

And so, you know, between prohibition, no, don't use it at all, and using it to write a story that appears in The New Yorker and everything in between, right?

That's where we are right now,

for better or for worse.

Well, I want to end our conversation with this passage on page 309.

Towards the end of the book, you write, every accident in a self-driving vehicle is huge news because it's covered as if a malevolent robot has killed a human.

Meanwhile, some random truck driver falls asleep at the wheel and kills a young couple.

Yet we never once considered taking all 18 wheelers off the road.

She turns and looks out over the inlet.

I want to believe in humans.

I want to believe that even at the last second, an AI can and should be overridden by a knowing human conscience, by a moral mind with a soul.

Now I'm not so sure. There's a place for algorithms, a bigger and bigger place.

But people have to be better too.

They have to not drink and drive.

They have to not text behind the wheel.

We shouldn't make these machines because we want them to be good for us or good instead of us.

We should make them because they can help us be better ourselves.

Yes.

We like that, don't we?

You've given us a lot to think about where we are and where we're going.

Where do you think we're going?

Who knows?

Into a future of profound unpredictability.

If anything, that's what I've come to.

It's a little bit like writing a novel.

I feel like the future of AI, it's like all of us writing this novel together, and who knows where it's going to go.

Yeah.

I thought it was so interesting when you asked that question about,

are we afraid to look too hard?

And everybody in here agrees we're afraid to look too hard because of what we might find.

We thank you for inviting us to explore and wrestle with these very challenging ideas of our times, and for this thrilling story.

Now that you've read it, won't you pass it on to a friend?

Culpability is available wherever you buy your books.

Audience, thank you for reading the book, for sharing your time with us and your thoughtful questions.

When I tell you, this is the perfect summer read.

Is it not?

All right.

You wrote the perfect summer read.

Thank you.

Before we go, I want to share that I recently published something tailor-made for you avid readers.

So delighted to tell you about it.

It's my book lover's journal.

And here's what's great.

It has over 100 prompts, like the first book that made you feel seen and the book that reminded you of what matters most, all designed to enhance your reading experience.

It's a great gift for book lovers, and I hear book clubs are using it when they meet, which makes me so happy.

I write about some of my favorite books and first sentences, and it is available anywhere you buy books.

This audience is going home with a copy.

A big thanks to our extraordinary partners at Starbucks for supporting us.

There is no better place for coffee and conversation.

You were saying, do you write at Starbucks sometimes?

I write in all different coffee shops.

Yes, I've written half of my novels in coffee shops over the years, including Starbucks.

They're like a home to you.

Absolutely.

Go well, everybody.

Thanks so much.

Thank you all so much.

Thank you, Oprah.