Dr. Sandra Matz on The Silent War Hijacking Your Free Will | EP 578
Transcript
Speaker 2 Coming up next on Passion Struck, companies now actually have to step it up, right?
Speaker 2 So they have to say, Well, if I want to get your data, I better make sure that I show you how my product is much, much better when you're giving me access to your personal information than when you're not.
Speaker 2 And you can think of it as like YouTube, right? So YouTube has this option where they're not using your history, what you've searched for and watched before.
Speaker 2 And you can actually see the value decrease. You can see how, every time, you have to find something new, because it doesn't remember anything.
Speaker 2 But if your privacy is protected by design, now companies really have to live up to that expectation.
Speaker 2 So collectively, you can actually ask for a lot more when it comes to the design of products and services.
Speaker 3 Welcome to Passion Struck. Hi, I'm your host, John R.
Speaker 3 Miles, and on the show, we decipher the secrets, tips, and guidance of the world's most inspiring people and turn their wisdom into practical advice for you and those around you.
Speaker 3 Our mission is to help you unlock the power of intentionality so that you can become the best version of yourself. If you're new to the show, I offer advice and answer listener questions on Fridays.
Speaker 3 We have long-form interviews the rest of the week with guests ranging from astronauts to authors, CEOs, creators, innovators, scientists, military leaders, visionaries, and athletes.
Speaker 3
Now, let's go out there and become Passion Struck. Hey, Passion Struck fam.
Welcome to episode 578. I want to take a moment to thank you for being part of this incredible community.
Speaker 3 Your energy, passion, and commitment to living more intentionally inspires me every day. Whether you're a longtime listener or joining us for the first time, welcome.
Speaker 3 You've joined a global movement dedicated to igniting purpose and living boldly with intention and I couldn't be happier to have you here.
Speaker 3 Let me ask you, what if the most powerful force shaping our thoughts, decisions, and even our emotions isn't our own free will, but an invisible web of algorithms that know us better than we know ourselves?
Speaker 3 What if the digital world we've built to connect us is actually driving us apart? Today's guest, Dr. Sandra Matz, is here to explain these urgent issues.
Speaker 3 A professor at Columbia University and the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior, Sandra has spent years studying how our digital footprints, everything from the music we listen to, the places we go, and even the words we use, are being used to shape our reality.
Speaker 3 In today's episode, we explore how algorithms have become the architects of our choices, shaping what we see, what we buy, and even how we feel. The concept of the digital village.
Speaker 3 How today's technology recreates the tight-knit scrutiny of small communities, but on a global scale.
Speaker 3 The hidden way psychological targeting influences everything from elections to relationships to self-perception.
Speaker 3 How we can reclaim our autonomy and create a healthier, more intentional relationship with the technology we use, and the role of human connection in an era of digital disconnection, and why our sense of mattering is at stake.
Speaker 3 If you've ever questioned whether your thoughts and decisions are truly your own, or wondered how we can take back control in a world increasingly shaped by artificial intelligence and predictive analytics, this episode is for you.
Speaker 3 Sandra's research challenges us to rethink the trade-offs of the digital age and discover ways to reclaim our own sense of agency.
Speaker 3 Before we dive in, let's take a moment to reflect on the powerful conversation I had on Tuesday with Edward Fishman, a former top U.S. State Department sanctions official.
Speaker 3 He joined me to discuss how financial leverage, supply chains, and economic choke points have replaced traditional warfare.
Speaker 3 His book, Chokepoints: American Power in the Age of Economic Warfare, is a must-read for understanding the future of global power.
Speaker 3 And in last week's solo episode, I explored the foundational role of mattering in our relationships, why we accept less than the love we deserve, how modern dating feels like a competition, and what it takes to build lasting, meaningful connections.
Speaker 3 Part 2 drops tomorrow, where I'll dive even deeper into how to recognize the right relationships, make love last, and navigate heartbreak with intention. Want to dive deeper?
Speaker 3 Check out our episode Starter Packs, curated playlists on topics like leadership, mental health, and personal growth at passionstruck.com slash starterpacks or Spotify.
Speaker 3 Sign up for my Live Intentionally newsletter for exclusive weekly insights, tools, challenges, and strategies to live with greater intention. Prefer a video? Join our growing community on the John R.
Speaker 3 Miles YouTube channel where you can watch this episode and share it with someone who could benefit from Sandra's insights.
Speaker 3
Now, get ready for a conversation that will change the way you see technology, autonomy, and the digital forces shaping our lives. Let's dive in with Dr.
Sandra Matz.
Speaker 3 Thank you for choosing Passionstruck and choosing me to be your host and guide on your journey to creating an intentional life. Now, let that journey begin.
Speaker 1
I am absolutely thrilled today to welcome Dr. Sandra Matz to Passionstruck.
Welcome, Sandra.
Speaker 2 Thank you so much, John, for having me.
Speaker 1 Today, we are going to be exploring your brand new book, Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior.
Speaker 1 Congratulations on the book and it being featured as a must-read by the Next Big Idea Club. Fantastic recognition.
Speaker 2
Yeah, I was super happy. I love them.
So it was almost a dream come true.
Speaker 1 I felt the same way. I wrote a book last year, it came out in February, and when I got that recognition, it was, for me, a higher level of acknowledgement than any bestseller list could have possibly been.
Speaker 2 I was going to say, it feels like the bestseller list that matters, in a way. So I feel you.
Speaker 1 I'm going to start back in your childhood because this is where you start off your book. You grew up in a small German village.
Speaker 1
And in this village, as I understand it, there were only two restaurants. There weren't any shops.
What was it like growing up in that village? And I was hoping you might be able to introduce this by using the story of the missing rabbit.
Speaker 2 The missing rabbit, yes. I am happy to.
Speaker 2 So yes, I grew up in this, it was really a tiny village, 500 people, even though my parents keep reminding me that since I left, it's grown to a thousand people, which I can assure you does not make much of a difference.
Speaker 2 But so my experience there was really the fact that it was like a very tight-knit community, right? Everybody knew everything about everybody else's lives.
Speaker 2 You see kids running to the bus and maybe they're just not the most organized ones, or you see who's dating whom, what people are doing on the weekend.
Speaker 2 And that just means, on the one hand, that there's this sense of community, connection, and security that I've never experienced anywhere else, right?
Speaker 2 Those neighbors who know you that well, they're the ones who can give you the best advice possible, they can connect you to opportunities, they can set you up with people that they think you might like or jobs that they think you might be interested in.
Speaker 2 But it's also
Speaker 2 very intrusive in a way, right?
Speaker 2 Because they, again, observe everything that you do and they poke around in your private life, sometimes manipulating you behind the scenes in ways that I at least didn't really appreciate.
Speaker 2 The story with the rabbit was essentially, I think, a nice example of how people in the village, by observing everything that's going on, can come together and almost create value.
Speaker 2 I don't know how old I was, I was still pretty young, but I remember it was the weekend, we were still sleeping, and at some point one of our neighbors knocks on the door telling us that my pet rabbit had apparently escaped.
Speaker 2 So they were living outside.
Speaker 2 We had built them this amazing cage together with my dad, but apparently we left it open overnight, so the rabbit escaped and was just hanging out in their garden eating the salad. They had already tried to catch it, which is difficult. So now suddenly it's my neighbors trying to chase the rabbit, my entire family gets up trying to chase the rabbit, and rabbits are really fast, so it zigzags around like you see in the cartoon movies, and it was just impossible to catch. And the word gets out, word spreads pretty quickly, so I think maybe half an hour in, the entire street is involved. We now have someone managing the traffic because the rabbit keeps running back and forth across the street.
Speaker 2 We have like people planning how to catch him. There was like this entire, like the entire community came together to help us solve the problem.
Speaker 2 We eventually caught him when someone leaped onto the rabbit, which was a traumatic moment in and of itself.
Speaker 2 But again, it's just like there was someone who knew before we even did that the rabbit had escaped and everybody came together to catch it. Yeah, it was like a tight-knit community.
Speaker 1 Yeah, and I think you found that to be even more the case as you grew into a teenager and experienced its compounding effects when you were involved in a motorcycle accident at 15.
Speaker 2
Yeah, you're really hitting all the pain points here. But that's true.
This is one of my low moments in my village career, I would say.
Speaker 2 So I had a boyfriend at the time, who was a bit older, and he had a motorcycle.
Speaker 2 And we loved riding the motorcycle around, and it was usually him driving because he was the only one who had a license. I was still 15, and at some point I was like, well, I want to at least try, I don't want to wait another three years until I can get my license, so why don't we just go to an even more remote place than the village? So we found an airfield that was abandoned by that time, and I was just like, let me try, you sit on the back.
Speaker 2
And we'll take it step by step. You walk me through how stuff works because I have no idea, which was a really nice idea in theory. It did not pan out in practice.
Speaker 2 So I don't know exactly what happened, but I think I just pulled the bike back, turned on the gas, and let the clutch snap.
Speaker 2 So it rears up like a horse, the front goes up, my boyfriend falls off the back, and I just keep driving away with no idea what I'm doing.
Speaker 2 At some point, I crashed on the side, luckily, and nothing really happened because I was still going relatively slow.
Speaker 2 But you can imagine that the moment that we come back, because my dad had to pick us up, the motorcycle just wouldn't start anymore.
Speaker 2 The moment we come back and drop it at the shop to get repaired, everybody knew about it, right?
Speaker 2 So I think for the next couple of weeks, I just constantly got caught in conversations about how I could be so stupid to do this, and then someone else telling me about their own childhood offenses.
Speaker 2 Yeah, as you can tell, the news spread really fast in that village.
Speaker 1 So I didn't grow up in a small village like you, but I grew up in a tight-knit community around the Catholic Church.
Speaker 1 And my upbringing from the time I was in kindergarten through high school, I always went to parochial schools.
Speaker 1 And so I felt like I was in this small village where everyone knew my hopes, my fears, my weaknesses, my dreams. And it gave you a sense of being seen.
Speaker 1 But in that small village, that was a blessing and a curse because it shapes how you view yourself and your sense of mattering.
Speaker 1 How did growing up in a village shape your understanding of mattering for you?
Speaker 2 Very much. And I think it shapes both the sense of mattering and the choices that you have available, right? And the decisions that you make for yourself and others.
Speaker 2 So I was the daughter of a local police officer, which didn't help with the motorcycle accident, but also meant, for example, that I wouldn't get invited to some of the cool kids' parties, but I would then get invited to all the community stuff where you could help out and organize things.
Speaker 2 So that what people think about you, right, who they think you are, really matters in terms of how they interact with you, the opportunities that they create.
Speaker 2 So I think for me it was both the sense of who am I and what are some of the opportunities available to me, and then also, on the flip side, just getting the support. There are certain things that are extremely hard, right? Like when I tried to figure out what I wanted to do after college, there are so many things that you could do, and you've not been there, right? You just got out of high school, you don't have that much experience. So you need people who've seen more of the world to guide you in some way. And obviously, the more that people know about, again, your dreams, your hopes, your ambitions, the more they can really customize the advice that they give.
Speaker 2 So I think the mattering expressed itself in the fact that there was someone who understood what I wanted.
Speaker 2 And then in those moments when they used it to my advantage and in my best interest, that felt like a really strong support system.
Speaker 1 I think the important thing here is this whole
Speaker 1 contrast that we've gone from people living in these small villages where we had these communities that people existed in for really centuries.
Speaker 1 And now we've gone to a global village where algorithms know even more about us than our neighbors ever could. And I love what you write in the book.
Speaker 1 You write, but it turns out you don't have to live in a small rural community to have someone watch and influence every step you take and choice you make, which you were just describing.
Speaker 1 That's because we have digital neighbors. In the same way, my neighbors became expert snoopers and puppeteers over time.
Speaker 1 Computers can translate seemingly mundane, innocuous information about what we do into highly intimate insights about who we are and ultimately prescriptions of what we should do.
Speaker 1 And as I read that, it really got me thinking about how complex this becomes when these things that we don't even realize are around us are influencing our mind on what we should do, who we should be, what we should become.
Speaker 1 How does that really impact our ability to connect and feel
Speaker 1 that we have autonomy over our choices?
Speaker 2 I think it's such a great question because so for me, if you go back to the analogy of the village, right, there were essentially two components.
Speaker 2 One is that someone was snooping around in my life and they understood who I was, what I wanted, and so on. But for me, the more important part was always the second part: they use that knowledge to then potentially influence my behavior, for better or worse. And I think about this the same way online. There's now all of the data that we generate, anything from what you post on social media to your credit card spending, the fact that your phone tracks your whereabouts pretty much 24/7, knows who you connect with, knows what conversations you have.
Speaker 2 And those are all very intimate insights into your psychology.
Speaker 2 But then, the second step of, well, once companies and third parties have access to those insights, they can essentially, again, shape your behavior in certain directions that sometimes might be extremely helpful.
Speaker 2 Right? If I can figure out, again, what jobs you might be interested in, can I understand whether you might be struggling with mental health issues and support you?
Speaker 2 So, there's like a lot of helpful use cases, same way as in the village.
Speaker 2 But then there are obviously also these other use cases where I get you to do something that you don't want to do. Maybe the least problematic one is I'm just going to sell you stuff that you don't need. But once we go into I'm going to get you to vote for someone you otherwise wouldn't vote for, or not vote at all, there's this entire second world where suddenly your choices are no longer your own. And because you mentioned connection, I think where we see this materialize in the context of the social fabric is that the more and more we personalize, the less and less connected we are, right?
Speaker 2 The less and less we actually have this sense of shared reality where we're all seeing the same thing.
Speaker 2 So I think a lot of people are talking about this in the context of echo chambers or filter bubbles.
Speaker 2 But the convenience that we sometimes get by seeing the stuff that's relevant, that's really tailored to us and our psychology, also means that what I see might not be at all related to what you see.
Speaker 2 We don't even have the same conversations anymore, right? And there are also no checks and balances in terms of what news is out there. Is it actually factually correct?
Speaker 2 Is it potentially slanted? Or even these cultural references that we used to have, like these fun conversations, right?
Speaker 2 We talk about a movie or an ad that we've seen together on TV and everybody sees the same thing. So it creates this bonding.
Speaker 2 And I think the more we move to an online world that's trying to emulate and simulate some of the ways in which we've traditionally customized content and communication, it just means that we don't see the same thing anymore.
Speaker 2 And that kind of then erodes some of the connection, I think, that you were referring to.
Speaker 3 It certainly does.
Speaker 1
And I'm going to give you a little bit of background on me that you might not know about. For years, I was a senior executive in technology.
I was a CIO at Dell,
Speaker 1 where I oversaw their data strategy. And before that, I was the chief data officer at Lowe's, where I oversaw their enterprise information systems strategy.
Speaker 1 And I remember at Lowe's, as we're looking at the strategy, we were trying to compete based on data. And one of the biggest things we were looking at was consumer data.
Speaker 1 And we developed this program called the Single View of the Customer. This was back in 2007, 2008. We could match 92% of Americans and knew exactly what their shopping habits were, where they were living, when they were on vacation, when they were at a second home.
It was overwhelming how much data we had.
Speaker 1 And what we were trying to do is to learn everything we could about them because
Speaker 1
in a shopper's lifetime, there are only so many home remodel projects that they're going to do. And I know you're moving right now.
So when someone moves,
Speaker 1 that was a key sign to us that they're likely going to be spending on this new house. And so we would use this to target all kinds of different areas about them.
Speaker 1
You write about this in the first chapter of your book. You say that there's now an estimated 149 zettabytes of data.
And to put that into perspective, it reaches as far as the moon.
Speaker 1 And you write that some suggest that there are as many digital pieces of data as there are stars in the vast universe. But what
Speaker 1 really got me is when I read that these pieces
Speaker 1
separated aren't that meaningful. It's just like a puzzle.
It's disconnected chaos. But once you put the pieces together, you begin to see the full picture and understand its meaning.
Speaker 1 And
Speaker 1 these digital traces provide a rich picture of our personal habits, preferences, needs, and motivations.
Speaker 1 In other words, you write, it's psychology.
Speaker 1 So it's been interesting to me, as I've begun to interview more and more data scientists, how many of them now reside in the largest corporations.
Speaker 1 I remember at Lowe's, we had three or four, but Google has an army of them. Facebook has an army of them.
Speaker 1 How much now in these technology corporations are they using the science of human behavior to not only analyze us, but in many ways to control us?
Speaker 2 I love that. So the puzzle metaphor is, I think, one of my favorite ones because it really tells you like how this world works, right?
Speaker 2 If I ask people, like, how worried are you about your smartphone sensing data being out there, about your social media posts being out there, about your credit card spending being out there?
Speaker 2 Oftentimes people are not even that worried because it doesn't seem super intimate, right?
Speaker 2 If I think, okay, maybe someone knows where I work, maybe someone knows where I live, maybe someone knows that I got a Starbucks coffee, maybe someone knows that I went to a vacation because I post about it on Instagram.
Speaker 2 In isolation, those traces don't even seem that intimate and intrusive, right? Because it's like this one insight into what we do.
Speaker 2 But then when you put them together, that's really when this kind of holistic picture, you mentioned this in the context of Lowe's, emerges, right? You almost see the life of the person play out.
Speaker 2 You see their routines, you see their habits, and the stuff that I've been working on for the last 10 years is that you can also zoom into their psychology, and that becomes really intimate, right?
Speaker 2 So I think the reason for why, for example, Cambridge Analytica blew up so big in 2016 was that suddenly people understood it's not just data, it's not just individual data points, but they can predict whether I might be impulsive or they can predict whether I might be neurotic.
Speaker 2 And so even though political campaigns and companies had been using data for many years before, and oftentimes were celebrated for helping understand consumers, helping understand voters, and talking to them about the things that matter,
Speaker 2 I think suddenly people realized that, well, wait a minute, if you can take all of this data and translate it into something that I think is really intimate and that I might not necessarily want to reveal to outsiders who I have no connection with, that might be a bit of a red flag.
Speaker 2 And exactly for the reason that you mentioned, right?
Speaker 2 It's not just that we're poking around in your private life, but we can also use it to in a way control your choices or at least nudge them in a certain direction, right?
Speaker 2 I always talk about psychological targeting, that's the term that I use for it: we predict your psychology from data and then we use it to influence your behavior.
Speaker 2 It's not a wonder weapon, right? So we're not going to completely flip your identity just by understanding that you're extroverted and showing you a few more extroverted ads, for example.
Speaker 2 But still, most choices that we make in life don't require an identity change, right? It's okay, should I go on this vacation or this vacation?
Speaker 2 Now, I can probably shift you to the one that I want you to go on.
Speaker 2 Or if you're an undecided voter and you don't know if you want to go out on election day and cast your vote, well, maybe then understanding your motivations and dreams and fears actually just gets you across that line to actually go and cast your vote.
Speaker 2 So, for me, that's the crux: that kind of controlling behavior doesn't require this deep identity change, it just means making some options more likely than others.
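[Editor's note: to make the psychological targeting Sandra describes a bit more concrete, here is a minimal, purely hypothetical Python sketch. The features, labels, and ad copy are invented for illustration and are not her models or any company's system; the idea is simply that a classifier predicts a trait from digital-footprint data, and the prediction then selects which message a person sees.]

```python
# Illustrative sketch of psychological targeting (hypothetical data, not a real model).
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical digital-footprint features per person:
# [posts_per_week, nightlife_checkins, avg_people_per_photo]
X = np.array([
    [25, 8, 6.0],  # very socially active online
    [22, 6, 5.0],
    [18, 5, 4.0],
    [3,  0, 1.5],  # quiet online presence
    [2,  1, 2.0],
    [4,  0, 1.0],
])
# Hypothetical labels from a survey sample: 1 = extroverted, 0 = introverted
y = np.array([1, 1, 1, 0, 0, 0])

# Step 1: predict the trait from the footprint.
model = LogisticRegression().fit(X, y)

# Step 2: use the prediction to pick which message variant gets shown.
def pick_ad(footprint):
    p_extrovert = model.predict_proba([footprint])[0, 1]
    if p_extrovert > 0.5:
        return "Join the party everyone's talking about!"    # extroverted framing
    return "Unwind on your own terms, away from the crowd."  # introverted framing

print(pick_ad([20, 7, 5.5]))  # likely gets the extroverted framing
print(pick_ad([3, 0, 1.0]))   # likely gets the introverted framing
```

[As Sandra emphasizes, the effect is a probabilistic nudge on individual choices, not an identity change.]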
Speaker 1 Something I've heard you talk about in the past is self-determination theory, and I have had Richard Ryan on this podcast, who's one of the inventors of it.
Speaker 1 But for those who aren't familiar, it consists of three basic needs: autonomy, competence, and relatedness.
Speaker 1 And oftentimes, I think
Speaker 1 our autonomy and how we relate to each other go hand in hand, meaning I feel if you don't feel like you matter as an individual, it's hard to feel like other people matter, or it's hard to feel like you matter to other people, or that you can help other people matter.
Speaker 1 So how do these
Speaker 1 algorithms impact that sense of autonomy that we have? Something that Angela Duckworth would call self-control.
Speaker 2 Yeah.
Speaker 2 It's such an interesting question because I do think that there's a difference now coming back to the old ways of the village and the new ways of the village.
Speaker 2 Because back in the day in the village, there was essentially a deal: you give up part of your autonomy, right?
Speaker 2 So you buy into this idea that we're a collective community, you give up some of the agency, you give up some of the ability to just do whatever you want.
Speaker 2 That created a lot of sense of connectedness, right? Because suddenly we're in it together. And suddenly it's not just my decisions and my choices, but it's the collective decisions of the community.
Speaker 2 So I think back in the day, there was like almost like this back and forth and negotiation of like, how much am I willing to give in response to how much I'm getting?
Speaker 2 And I think that kind of trade-off, or that negotiation, has been broken a little bit in the digital world, because right now, the way that it typically works, it's not a mutual conversation that we're having.
Speaker 2 So in the village, I knew something about my neighbors. I could connect with them, I could also influence them and vice versa.
Speaker 2 But now it's essentially big companies, or it doesn't even have to be a big company, but anyone who can collect data about us, which, as you said, is a lot easier than it should be.
Speaker 2 It's a one-way street, right? So someone is collecting data, trying to passively influence us. That means they're in a way taking our autonomy without our control or consent.
Speaker 2 But it also means that by doing so, they potentially disconnect us from the people around us. Because now, again, we're not seeing the same thing anymore.
Speaker 2 So I think back in the village, this kind of trade-off between autonomy and connectedness was actually something that we benefited from.
Speaker 2 I think in the digital world, it's like the same infrastructure, the same system is in a way taking both, right? It's taking our autonomy.
Speaker 2 And at the same time, because we don't see what other people see and because we're kept in our little echo chambers, it also takes away some of that connection and meaning.
Speaker 3 I've recently been reading Malcolm Gladwell's newest book, Revenge of the Tipping Point.
Speaker 1 And when I think of this concept of a tipping point, it's this critical juncture. He describes it as getting to around 33%,
Speaker 1 where a series of small actions or behaviors lead to a significant and often irreversible shift.
Speaker 1 And when I think about this shift, and especially what these algorithms and digital technologies are doing to us, we exceeded this tipping point a ways back.
Speaker 1 And so, this tipping point now
Speaker 1 has influenced psychological targeting, our digital footprints, and again, what I call this disease of disconnect. It provides a lens to examine when and how,
Speaker 1 at the societal level and the individual level, technology creates a profound transformation. I want to take this tipping point to this example of going from your small village to the digital village.
Speaker 1 I think this tipping point is really shaping
Speaker 1 how we view each other, how we view our emotions.
Speaker 1 And
Speaker 1 what I wanted to understand from you is
Speaker 1 now that we have gone so far in one direction,
Speaker 1 what are some of the next steps that we can take to either reclaim control or harness this technology for good?
Speaker 2 It's a great question, because I think the way that I've spoken about it so far probably sounded relatively negative overall, but I do think that there are incredible opportunities, right?
Speaker 2 So like we've talked about like this understanding in the village of, well, they could give me the best advice ever because they knew who I was.
Speaker 2 One of my favorite examples in this context is mental health, right?
Speaker 2 So mental health is still one of these areas where it's incredibly difficult, especially because we're now a lot more disconnected, right?
Speaker 2 In the village, if someone was struggling, the chances that someone from the community picked up on it were actually much, much higher than, let's say, where I live in New York right now.
Speaker 2 Nobody would notice. Like I know my neighbors a little bit, but they certainly wouldn't necessarily figure out that I might be having a hard time emotionally.
Speaker 2 So here, this is something that technology can actually help us with, right?
Speaker 2 If technology can understand, for example, from your smartphone sensing data that you're not leaving the house as much anymore, there's much less physical activity, you're not making and taking as many calls anymore.
Speaker 2 Maybe it's nothing, right? Maybe you're just on vacation and you're having a great time, but it could be an early warning sign that maybe something is off, right?
Speaker 2 So your behavior starts to deviate from your typical routine.
Speaker 2 Maybe now is a good time to use the powers that we have to first of all help you realize that something might be going on and then also give you the resources that you need to get better before you even enter this valley of depression.
Speaker 2 So I think there's a lot of incredible opportunities that technology has.
Speaker 2 to amplify some of the positive sides that we experienced in the village, using the understanding of who you are to help you accomplish the things that you want. Savings is another example. It's a behavior that's extremely difficult for the human brain, because you have to give up something in the here and now for maybe a potential benefit in the future. Now, again, back in the village, there was probably someone who could try to help you by motivating you in a certain way. And the same is true for technology.
Speaker 2 So if I can understand your preferences and your, again, motivations and needs, yeah, I can use it to get you to spend more, but I can also use it to get you to save more, for example.
Speaker 2 And then the last one, which I think actually ties very nicely into this topic of connection that I've been thinking about a lot, is: could we also use technology to accomplish the opposite of keeping you in your echo chamber and having us become increasingly disconnected, as a way to actually experience the world from different viewpoints, right?
Speaker 2 So
Speaker 2 the example that I always give is: if I wanted to understand what the reality of, let's say, a 50-year-old Republican farmer in Ohio looks like, it's almost impossible for me to do right now, because I only have one view on the world, right?
Speaker 2 And one perspective, and that's my own. It's based on my experiences, everything that I've seen, everything that I've experienced.
Speaker 2 And it would be really hard for me to understand what that reality looks like.
Speaker 2 I would have to go there. I'd have to talk to a lot of people, maybe shadow someone and see what their day looks like, ideally get a sense of how they process.
Speaker 2 Now, with technology, we could have something, I call it a perspective exchange or an echo chamber swap, where the big companies know exactly what that reality looks like, right? Google knows what that person sees when they search for something on Google, and Facebook knows what their news feed looks like, what they talk about, who they're connected with. So there is a way in which those tech companies could actually give us access to the reality of someone else, so that instead of being isolated and kept in our own echo chambers, we could actually use it as a tool to expand our view on the world and become potentially connected to people who look very different to us, who have totally different life experiences, and whom we would otherwise never get to see.
Speaker 2 And I'm talking about it in the context of the US right now.
Speaker 2 You could imagine I could step into the shoes of someone who lives in Thailand, someone who lives in Chile, and see, okay, what are the topics that they actually care about?
Speaker 2 What does their daily routines look like? What are some of the things that they might be worried about? What are some of the things that they dream of?
Speaker 2 And I think if we talk about connection right now, because we're so focused on personalization and optimizing for profits, yeah, we're stuck in our little echo chamber, but the same technology has the potential to actually broaden our view on the world in a way that we've never seen before.
Speaker 1 Yeah, as you were just describing that, it really made me think of the late Emile Bruneau.
Speaker 1 I'm not sure if you know who Emile was, but he was a professor at Wharton, and he was really trying to look at this whole factor of dehumanization and how so much of the conflicts that we're in worldwide are because we fail to see the other side as who they are.
Speaker 1 So it got me thinking that a way you could use this is to really see,
Speaker 1 just take a look at Russians versus Ukrainians. If you could see who the other side was and see them as whole people,
Speaker 1 could that change the whole perspective, so that we see people as having more in common instead of as enemies? What are your thoughts on that?
Speaker 2 And it's a topic that I've been thinking about so much because it's not trivial.
Speaker 2 So the risk that you always have is if, let's say, I get to see what the news feed or the Google searches of someone who has totally different political views than I have look like,
Speaker 2 I might actually become so appalled that I dig in my heels even deeper, right?
Speaker 2 So there's always the challenge, when you expose someone to a different worldview, that you just elicit reactance. But I actually think that's something that we could also work on with technology, because, as you said, what we want is to get a holistic understanding of the person. So I remember, I think it was in the context of the 2016 election or 2020, I can't remember, but the Wall Street Journal had this red feed, blue feed website where they just showed side by side: here's what a news story on the same topic, let's pick immigration or gun laws, looks like from a more Republican versus a more Democratic stance.
Speaker 2 And I think, first of all, not that many people were using it. And I think what it somewhat led to was that people were like, just, oh my God, look at how crazy the other side is.
Speaker 2 And the reason for why I think people were not as receptive is exactly as you mentioned. This is like a super abstract, like dehumanized way of showing it, right?
Speaker 2 So it's, here's a group of people, and this is what they see. Tells you nothing about the people behind it, right? Like, why is it that a Republican, for example, is more worried about immigration?
Speaker 2 Maybe they just have something that's happening in their life that makes them think that way. And so
Speaker 2 what I had in mind when I was thinking about this echo chamber swap or perspective exchange is you really want to see the entire life of someone.
Speaker 2 And maybe we could even complement it with some kind of AI assistant, right? One that tries to help you figure out, as you walk in someone else's shoes, here's how this might all come together.
Speaker 2 Here's why they might have a certain view on immigration.
Speaker 2 And what we know, for example, from research on what's called moral reframing is essentially that if I can talk about an issue through your own moral lens of what is right or wrong in the world, I can actually get you to be a lot more open to an argument.
Speaker 2
So there are these five moral dimensions: fairness, care, loyalty, authority, and purity.
And we know that everybody has their own profile, right?
Speaker 2 There's a bit of a distinction between the two sides: Republicans, for example, or more conservatives,
Speaker 2 care more about authority, purity, and loyalty. So those are the driving forces of how they make decisions about what's right or wrong.
Speaker 2 And liberals are more focused on care and fairness. Now, typically, when we make arguments as humans, we just make an argument from our own perspective, right? Because that's what we know.
Speaker 2 But you can imagine if I can try and say, look, here's someone who's more conservative, and maybe I can craft an argument around immigration that's focused on loyalty more than care and fairness, because they might not care about this as much.
Speaker 2 That's one way in which you can actually get them engaged in the conversation. And I think this like the same principle could apply to these echo chamber swaps, right?
Speaker 2 If I can, first of all, see what the entire life of that person looks like, not just what they see or what news they read, and I can also have someone, in this case probably an AI, help me understand, okay, here's where they're coming from, and here's maybe one way you could think about this from your own point of view, from your own moral compass.
Speaker 2 I think that's a way in which we could hopefully avoid some of the reactance and really get people to humanize the people behind these digital footprints.
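[Editor's note: the moral reframing Sandra mentions can be sketched in a few lines of Python. The profiles and message variants below are hypothetical illustrations, not validated instruments; the idea is simply to lead with the moral foundation a person weighs most heavily.]

```python
# Hypothetical sketch of moral reframing (invented profiles and messages).

# Message variants for one issue, each leading with a different moral foundation.
MESSAGES = {
    "care":      "Welcoming newcomers protects vulnerable families from harm.",
    "fairness":  "Everyone deserves a fair shot, no matter where they were born.",
    "loyalty":   "Immigrants who commit to this country strengthen our community.",
    "authority": "A well-managed, lawful process upholds order and the rule of law.",
    "purity":    "An orderly, transparent system keeps the process clean and trustworthy.",
}

def reframe(moral_profile):
    """Return the message variant matched to the person's strongest foundation."""
    strongest = max(moral_profile, key=moral_profile.get)
    return strongest, MESSAGES[strongest]

# Hypothetical moral-foundations profiles (scores from 0 to 1).
liberal_leaning = {"care": 0.9, "fairness": 0.8, "loyalty": 0.3, "authority": 0.2, "purity": 0.2}
conservative_leaning = {"care": 0.4, "fairness": 0.5, "loyalty": 0.8, "authority": 0.7, "purity": 0.6}

print(reframe(liberal_leaning))       # leads with care
print(reframe(conservative_leaning))  # leads with loyalty
```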
Speaker 1 I want to explore that a little bit more. So what you just were referring to is that psychological targeting and data collection raise profound moral questions.
Speaker 1
Through one lens, they can empower individuals and improve lives. Through another, they can manipulate and exploit vulnerabilities, especially at a macro level.
Speaker 1 How do you think we decide as a society where to draw the ethical line and who should hold the moral responsibility for these decisions? Should it be corporations, governments, individuals?
Speaker 1 Where does this responsibility lie?
Speaker 2 It's a great question. Probably from everything that we've talked about so far, you can probably already tell that my point of view is like it should probably not just be left to companies, right?
Speaker 2 Because the incentives are just terrible. So I teach in a business school and it's all about what are companies incentivized by.
Speaker 2 And right now, the incentive is personalize as much as you can, show the content that's what's most engaging, right? Because it's all about attention.
Speaker 2 So, if those are the market incentives, companies are going to do that because it's really difficult to move away from that and say, We're just going to do the ethical thing, right?
Speaker 2 Everybody else is not doing that. And I even hear that from friends within those companies, right? I have a lot of friends in companies like Apple, Meta, Google, and so on.
Speaker 2 And I think they oftentimes complain and say, We would love to have stricter regulation that allows us to do the ethical thing, but also makes sure that everybody else is doing that.
Speaker 2 So I think that just leaving it to companies in the current incentive structure is really difficult.
Speaker 2 Now, the second part that I was for a long time very much in favor of and still am, but in a slightly different way, is, well, why don't we just give control to users and consumers, right?
Speaker 2 So if you look to Europe, there's the General Data Protection Regulation, which is one of the strictest data protection regulations that we have, similar to what California is doing.
Speaker 2 And its foundation is transparency and control. So if we want to manage that landscape and we want to have people decide for themselves, right, we're not all the same.
Speaker 2 You might be willing to give up your data for a certain service, but I might not. So why don't we just explain to people what's happening and then let them decide?
Speaker 2 And this solution has a lot of intuitive appeal, right? Again, we're not all the same. We're empowering consumers to make their own choices.
Speaker 2 But the problem with these approaches in the absence of, I think, better regulation and some technological solutions that I can talk about in a second and changes in how we govern data is that it's just an impossible task to do, right?
Speaker 2 If you think about it, if you wanted to manage your data properly across all of the products and services that you're using, that would be a full-time job.
Speaker 2 So first of all, it would mean that you have to read all of the terms and conditions constantly. You have to be constantly in the loop of how could your data be used today, tomorrow?
Speaker 2 What are some of the new technologies? What are some of the inferences? How could this kind of infringe on your autonomy and self-determination?
Speaker 2 Even if you knew everything about technology and could keep up with the speed at which it evolves, it would still be like 24-7, right? So nobody has the time to do that.
Speaker 2 I hope that people have better things to do than just reading through all of the terms and conditions. Share a meal with your family rather than doing that.
Speaker 2 So I think just pushing the responsibility to users is like, it's much more of a burden than it actually is a right.
Speaker 2 However, I do think it's the foundation of something that we can use to make it better, right?
Speaker 2 Because at the end of the day, it would be great if you could have control, if you also have the mastery to exercise that control. And there's a couple of things that I think we can do there.
Speaker 2
Some of it falls under regulation, and some of it, I think, falls under new forms of data governance.
Speaker 2 So the most basic thing is: make it easier for people to do the right thing, right?
Speaker 2 If data tracking is the norm, is the default, and I now have to go through all of my terms and conditions, all of the permissions that I have, the settings for all of the apps that I'm using, all of the websites that I'm browsing, and constantly have to opt out, nobody's going to do that.
Speaker 2 Again, nobody has the time and energy to do that. If regulation said that the default is an opt-in, no data is being tracked unless you say so, that would change the entire system, right?
Speaker 2 Because now companies really have to convince me that they can create value by having my data and my data would be protected by default and then I can change it. So I think that is one thing.
Speaker 2 But then the other thing: when you think of regulation, it's all risk mitigation, right?
Speaker 2 So like regulation tries to protect you from the most egregious abuses, but it's very much average and it doesn't help you maximize the utility that you can get from your data.
Speaker 2 There, I think what we need, again, coming back to this notion of control, is to have a support system that allows you to exercise your control wisely.
Speaker 2 And what I mean by that specifically is that there's these forms of governing your data that's in the context of like data co-ops and data trusts.
Speaker 2 So the idea is that instead of having to manage it all by yourself, you actually get together, and now we're coming back to this notion of community and connection, you get together a group of people who are willing to share their data and who come together to manage it for their own benefit. Just to give you an example that I really like, healthcare is one of these, right? There are a lot of rare diseases, for example, that are poorly understood, because to understand them you need data from a lot of people, all pooled in the same place, ideally as much as you can in terms of your genetics, your medical history, your lifestyle, your nutritional choices. If we had all of this from everybody in the world in one place,
Speaker 2 we could understand disease much better, we could understand treatment much better, because we can now see, okay, here's a treatment that works for person X but not for person Y. Why is that?
Speaker 2 What else could we do for them?
Speaker 2 But it's a pretty big ask to have all of that sitting, let's say, on a central server, without just giving it to pharma companies, right?
Speaker 2 But if you could have a community of people, let's say people suffering from MS, which is actually a real example, a data co-op in Switzerland that exists.
Speaker 2 They get people suffering from MS to share all of their data.
Speaker 2 Now they can generate insights that are much, much better than anything that you could do by yourself, right? If you're only one person, there's only so much you can learn from your genetic data. So they generate insights generally about the disease, but they also use it to benefit you directly. If you're part of the data co-op, it's called MIDATA, and you share your genetic data, healthcare data, and so on, you get personalized recommendations that are sent to your doctor in the hospital: here are treatments that might work particularly well for you, here's something that you could do to help with your symptoms. And that's a totally different way of giving control to people, right? Because it's really empowering. It's not just saying you're in charge, it's saying we're going to give you a support system of people who have the same interests and also expertise.
Speaker 2 So with data co-ops, the real benefit here is that once you have many people with the same interests, you can actually hire experts and management who know what they're doing and can say, okay, this might be a great company to collaborate with because they have really promising insights.
Speaker 2
They're very much research focused. And here are some of the data structures that we can put in place to make sure that your data is being protected.
And it's not just you anymore.
Speaker 2 It's you and an entire support system. So I think if we think about
Speaker 2 how we change that and make data work for us, I wouldn't leave it to companies.
Speaker 2 I think we need regulation, but I think it's not the only part because it's slow and it just protects us from the basic abuses.
Speaker 2 I think what we need is like these more community style forms of data governance that are resting on this notion of control.
Speaker 1 Sounds like what you're saying, and thank you for explaining all that, is as individuals, we don't have the power to change much.
Speaker 1 But if we start showing up as a collective with more might, then we have more influence.
Speaker 1 And this leads me into something further on in your book where you're talking about solutions, and you say that the shift to privacy by design could also be a significant solution.
Speaker 1 To me, this whole privacy by design would fit with this model of the collective, because it would give you the leverage that you're going to need to get companies to fulfill their promises once they've acquired your data.
Speaker 1 Is that a good way to think about it?
Speaker 2 I think, so for me, privacy by design just means make it easy for people to do the right thing, right?
Speaker 2 So if the, again, the default is set to relatively high standards of data protection you're not just giving away your data every time you sign up for a service or a product that just means that first of all it's it kind of it leverages our laziness right so it's essentially most of the time we're not going to change the default because we have other stuff to do so that means our data is most of the time protected but it also could mean that companies now actually have to step it up right so they have to say well if i want to get your data i better make sure that i show you how my product is much, much better when you're giving me access to your personal information than when you're not.
Speaker 2 And you can think of it as like YouTube, right? So, YouTube has this option where they're not using your history and your like what you've looked for, searched for before, what you've watched before.
Speaker 2 And you can actually see the value decrease. So, there's you can see how it's like every time you have to find something new, you don't remember anything.
Speaker 2 But if your privacy is protected by design, now companies really have to live up to that expectation.
Speaker 2 So, collectively, you can actually ask for a lot more when it comes to the design of products and services.
Speaker 1 So Sandra, I've been recently re-watching that short documentary that Netflix did on Chernobyl.
Speaker 1 And as I was reading your book, I came across this reference that you made to a Guardian article where journalist Cory Doctorow offered a different analogy, and he compared personal data to nuclear waste, which got me thinking about Chernobyl.
Speaker 1 We should treat personal electronic data the same way that we treat
Speaker 1
radiation or plutonium because it's dangerous. I liked how you frame this.
It's long-lasting and once it's leaked, you never get it back.
Speaker 1 Just like radioactive waste, our personal data can be deadly is what you write, literally. Why is it so important for people to think about it in such concrete terms?
Speaker 2 It's funny, because I think nuclear waste is a good analogy. You could also think about it as nuclear power, right?
Speaker 2 Because nuclear power has the ability to destroy, but also to actually create, give energy to humanity. So I think there's actually those two sides.
Speaker 2 But one of the things that I like about this analogy, as you said, is that once data is out there, you're never going to get it back. And I think people oftentimes
Speaker 2 forget that. I teach this class on the ethics of data, and there's always people who say, well, I don't care about my personal data being out there because I have nothing to hide.
Speaker 2 And I in some way understand that sentiment, right? Because it feels like it's, first of all, an impossible task. There's not much you can do.
Speaker 2 And it might be true that people in that moment really don't worry too much of their data being out there, but it's a very privileged position to be in, right?
Speaker 2 If you don't have to worry about your data being out there, that means just that you're currently in a really good spot. And it's certainly not true for everybody, right?
Speaker 2 Think of the fact that we can predict your sexual orientation just from your online digital footprints. That still comes with a death penalty in many countries.
Speaker 2 Think of mental health, still stigmatized in many parts of the world.
Speaker 2 And I think the fact that even if you don't worry about your data being out there here and now, doesn't mean that couldn't change tomorrow.
Speaker 2 Like, I think the Supreme Court decision to overturn Roe versus Wade made that somewhat painfully real for a lot of women in the US.
Speaker 2 Because suddenly, the fact that we can track your Google searches, we can track your GPS location, so we know potentially where you go. And there are period-tracking apps.
Speaker 2 So all of these data traces that we never really had to worry about, suddenly it was like, well, now I don't necessarily want someone else poking around in my private data, because it could have implications that are very detrimental.
Speaker 2 So this is what I mean by it's like nuclear waste:
Speaker 2 it's long-lasting and you're never going to get it back. The way that I like to frame it is, essentially, data is permanent, but leadership isn't.
Speaker 2 You just don't know what's going to happen tomorrow. And you mentioned that the deadly piece.
Speaker 2 So the example that drives this point home for me, and it gives me goosebumps every time, is actually the history of the country that I grew up in.
Speaker 2 So if you think back to Germany in 1939, it essentially wasn't a democracy anymore.
Speaker 2 And one of the things that we know is that a lot of the atrocities against the Jewish community within Europe depended on what data was available.
Speaker 2 So some of the European countries had religious affiliation as part of the census data.
Speaker 2 So it was sitting there in City Hall and it was extremely easy for the Germans to come in, find the records and go find the members of the Jewish community.
Speaker 2 And what we know, again, is that the rates of atrocities across European countries were very much associated with that information being available.
Speaker 2 Now, fast forward to now, you don't need City Hall anymore, right?
Speaker 2 You can just go into someone's online profile and you can, with really high levels of accuracy, predict their religious orientation, their religious affiliation.
Speaker 2 And let's say, again, we suddenly discriminate against a different group of people; whatever defines that group is almost certainly going to be predictable from the data that is out there.
Speaker 2 So, this notion of I don't have anything to worry about today might not be true tomorrow. And for me, that's a really important point.
Speaker 3 So, Sandra, thank you for going into that.
Speaker 1 The last thing I wanted to talk to you about is the work of Esther Dyson. Do you happen to know who Esther is?
Speaker 2 No.
Speaker 1 No. So Esther happens to live in New York, but she is very well known in the startup world
Speaker 1 because she was one of the first investors in Square, Facebook, 23andMe; the list goes on and on.
Speaker 1 And I was just talking to her, and she is very
Speaker 1 concerned about the poor use of this data and the implications it could have for society.
Speaker 1 And what she is really working on right now and loves to talk about is systems change, how it's the systems, not just individual actions that shape outcomes, whether it's in health, which she's trying to do a lot of work in.
Speaker 1 You would actually love her work. She's trying to go into small communities and reimagine how we could do healthcare differently, starting with a small village and then trying to expand it out.
Speaker 1 And so it's like the idea of redesigning the village or, the way you think about it, redesigning the digital village.
Speaker 1 So if you think about, and we've been talking about this, the systemic changes that are necessary to create a healthier system,
Speaker 1 one that really balances personal agency and algorithmic influence, what would some of those initial systemic changes need to be
Speaker 1 around how a person feels they matter?
Speaker 2 I couldn't agree more. I think it's actually coming back to some of the things that we talked about earlier, right?
Speaker 2 So because most of the time when we think about data, how do people take control of their data, it's very much focused on the individual, right? Well, you have to manage it.
Speaker 2 When I think about systemic changes, I think about something like privacy by design. That's what we talked about.
Speaker 2 I think about something like, can we create these data co-ops that again, create a system that supports you where you can come together and it's not just you.
Speaker 2 And there are also technologies that I think will drive some of the systemic change, because right now you oftentimes have this trade-off between, well, I do want personalization and I do want the value that comes with data, especially in certain contexts, right?
Speaker 2 Some of it might just be, well, I need to find a movie on Netflix and I don't have
Speaker 2 my entire life to find the one that I want, or I have to buy headphones on Amazon and I can't spend the next 10 weeks trying to find the ones that are most relevant.
Speaker 2 But it can also be something in the context like healthcare.
Speaker 2 It's like, yeah, I do want the insights into disease and I do want some personalized recommendations for how to be healthier and happier, but I don't necessarily want my data to be out there with the big pharma companies all in one place where it's at risk of security breaches and abuse when it comes to manipulating us in the future or exploiting us for profits.
Speaker 2 So there is now technology
Speaker 2 that essentially is trying to overcome this trade-off by saying, well, how do we protect your data but still give you the insights, the value, the convenience, and the service?
Speaker 2 It's called federated learning.
Speaker 2 And the idea is essentially: instead of you having to send your data to Netflix, for example, to train their models, or send your data, your medical history, your genetic data, to a pharma company
Speaker 2 and then just trust them that they're going to use it for a good purpose, you can ask Netflix to send their models to your phone locally.
Speaker 2 And instead of you sending all of your viewing data to Netflix, they just send their model to you, and it updates locally based on the movies that you've watched.
Speaker 2 Then you just send the intelligence back. So instead of the original approach, where we just send our data to a central server, you never send your data.
Speaker 2 The data never leaves its safe harbor. What you exchange with the companies is essentially intelligence.
Speaker 2 So you can have Netflix ask questions of the data, and then you send back the intelligence.
Speaker 2 Or you can have a pharma company ask questions of your medical data, and then you get personalized advice based on the models that they create. And so for me, this technology is really
Speaker 2 one that creates systemic change, because it eliminates this trade-off that we had to grapple with all this time.
Speaker 2 And I think, again, if the trade-off is I can get great service and convenience
Speaker 2 here and now, or maybe I can protect my data, my privacy, and so on here and in the future, most people will gravitate towards the immediate one, right? So I think that's another good example of where systemic change is necessary just to make us live up to the expectations that we have and become the best versions of ourselves.
Speaker 1 Yep, as you write, Netflix benefits and all other users benefit because your personal data never leaves its safe harbor.
Speaker 1 You don't need to trust a third party to securely store your data and use it for only the purposes it was intended. I think that's a great way to end today's discussion.
Speaker 1 Sandra, congratulations again on your book. And if a person wants to know about you and your research, where's the best place for them to go?
Speaker 2 Well, so the book has a website, mindmasters.ai. I also have a personal website, sandramatz.com.
And we recently, together with my husband, actually set up a center.
Speaker 2 It's called the Center for Advanced Technology and Human Performance. I think it's human-performance.ai.
Speaker 2 But essentially, it's trying to integrate everything, see how we can empower people with technology.
Speaker 1 Well, Sandra, thank you so much again for joining us here today. It was such an honor to have you.
Speaker 2 Thank you so much.
Speaker 3 And that's a wrap on an enlightening conversation with Dr. Sandra Matz.
Speaker 3 Her insights challenge us to rethink the balance between connection and autonomy in the digital world, how algorithms shape our thoughts and behaviors, and what it truly means to reclaim our sense of self in an era of constant data tracking.
Speaker 3 As we close out today's episode, take a moment to reflect: how much of your daily decision-making is truly your own, and how much is being influenced by unseen forces?
Speaker 3 Are the digital tools really helping us foster deeper connection or are they pulling us further apart?
Speaker 3 What intentional steps can you take to reclaim your autonomy and cultivate more meaningful interactions in your life?
Speaker 3 The conversations we have on Passion Struck are meant to challenge perspectives and help you live with greater intention, whether it's in your personal growth, relationships, or the way you navigate the ever-evolving digital landscape.
Speaker 3 If today's episode resonated with you, please take a moment to leave a five-star rating and review. Your support fuels these impactful discussions and helps more people discover the show.
Speaker 3 And if someone in your life is struggling with digital overwhelm or questioning their relationship with technology, share this episode with them. It might just change the way they see their world.
Speaker 3 All resources discussed today, including Sandra's book, Mind Masters, are linked in the show notes at passionstruck.com. If you'd like to watch the full video version, head over to the John R.
Speaker 3 Miles YouTube channel. and don't forget to subscribe and share with others who are passionate about learning and personal growth.
Speaker 3 I love bringing these insights into organizations and teams through my speaking engagements.
Speaker 3 If today's episode sparked new thoughts on leadership, behavioral science, and digital resilience, visit johnrmiles.com slash speaking to learn more about how we can work together.
Speaker 3 Coming up next on Passion Struck, we continue our journey into the psychology of self-discovery and personal growth with a conversation that explores self-worth, purpose, and what it truly means to live in alignment with who you are.
Speaker 3 In our next episode, I sit down with Natalie Namaste, a thought leader in personal transformation, to explore how to break free from perfectionism and self-doubt, the key to embracing authenticity and rewriting limiting beliefs, and practical tools to cultivate inner confidence and a life filled with meaning.
Speaker 3 This is a conversation that will inspire you to let go of what's holding you back and step boldly into the life you were meant to live. You won't want to miss it.
Speaker 4 Conflict arises deep within the solar plexus chakra, so connected with the stomach, like there's a fiery energy there; it's the element of fire, and one can hold opinions and really believe in their opinion, and it's their identity, it's who they are.
Speaker 4 When one comes into contact with someone else who holds a different opinion, it creates such a trigger.
Speaker 4 They feel like it's poking at them and who they are and what they believe and you could physically feel it.
Speaker 3 Thank you for being part of the Passion Struck community. Your dedication to intentional living and making an impact inspires me every day.
And remember, the fee for the show is simple.
Speaker 3 If you found value in today's episode, share it with someone who needs to hear it. Most importantly, apply what you've learned so that you can live what you listen.
Speaker 3 Until next time, live life Passion Struck.