
Dr. Sandra Matz on The Silent War Hijacking Your Free Will | EP 578
Full Transcript
Hey friends, ever been stuck waiting weeks for a healthcare appointment or hours on a pharmacy line? It's the worst, right? Well, here's the good news. Amazon is now in healthcare.
It's called Amazon One Medical. And trust me, this is a total game changer.
You know how hard it is to quickly see a medical provider when you really need one. With Amazon One Medical, you can access 24/7 virtual care and talk to a provider in minutes, right from your couch.
Feeling too sick to leave your bed? No problem. Stay wrapped in your favorite blanket while getting the care you need.
And with Amazon Pharmacy, you don't even have to leave the house to pick up prescriptions. Amazon delivers them right to your door.
No more waiting in pharmacy lines surrounded by sneezes and coughs. Thanks to Amazon Pharmacy and Amazon One Medical, healthcare just got less painful.
Learn more at health.amazon.com. That's health.amazon.com.
As a loyal listener of this show, you're always challenging yourself to grow, to be better, to keep learning, and Southern New Hampshire University can help. Southern New Hampshire University offers over 200 career-focused degree programs online, plus Southern New Hampshire University has some of the lowest online tuition rates in the United States.
So balancing school, work, and life actually feels achievable and affordable too. Find your degree at snhu.edu slash passion.
That's snhu.edu slash passion. Coming up next on PassionStruck.
Hi, I'm your host, John R. Miles.
And on the show, we decipher the secrets, tips, and guidance
of the world's most inspiring people
and turn their wisdom into practical advice
for you and those around you.
Our mission is to help you unlock the power of intentionality
so that you can become the best version of yourself.
If you're new to the show, I offer advice and answer listener questions on Fridays. We have long
form interviews the rest of the week with guests ranging from astronauts to authors, CEOs,
creators, innovators, scientists, military leaders, visionaries, and athletes. Now let's
go out there and become PassionStruck. Hey, PassionStruck fam.
Welcome to episode 578. I want to take a moment to thank you for being part of this incredible community.
Your energy, passion, and commitment to living more intentionally inspires me every day. Whether you're a longtime listener or joining us for the first time, welcome.
You've joined a global movement dedicated to igniting purpose and living boldly with intention, and I couldn't be happier to have you here. Let me ask you, what if the most powerful force shaping our thoughts, decisions, and even our emotions isn't our own free will, but an invisible web of algorithms that know us better than we know ourselves?
What if the digital world we've built to connect us is actually driving us apart? Today's guest, Dr. Sandra Matz, is here to explain these urgent issues.
A professor at Columbia University and the author of Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior, Sandra has spent years studying how our digital footprints, everything from the music we listen to, the places we go, and even the words we use, are being used to shape our reality. In today's episode, we explore how algorithms have become the architects of our choices, shaping what we see, what we buy, and even how we feel.
The concept of the digital village, and how today's technology mirrors the tight-knit scrutiny of small communities, but on a global scale.
The hidden way psychological targeting influences everything from elections to relationships to self-perception. How we can reclaim our autonomy and create a healthier, more intentional relationship with the technology we use and the role of human connection in an era of digital disconnection and why our sense of mattering is at stake.
If you've ever questioned whether your thoughts and decisions are truly your own, or wondered how we can take back control in a world increasingly shaped by artificial intelligence and predictive analytics, this episode is for you. Sandra's research challenges us to rethink the trade-offs of the digital age and discover ways to reclaim our own sense of agency.
Before we dive in, let's take a moment to reflect on the powerful conversation I had on Tuesday with Edward Fishman, a former top U.S. State Department sanctions official.
He joined me to discuss how financial leverage, supply chains, and economic choke points have replaced traditional warfare. His book, Chokepoints: American Power in the Age of Economic Warfare, is a must-read for understanding the future of global power.
And in last week's solo episode, I explored the foundational role of mattering in our relationships, why we accept less than the love we deserve, how modern dating feels like a competition, and what it takes to build lasting, meaningful connections. Part two drops tomorrow, where I'll dive even deeper into how to recognize the right relationships, make love last, and navigate heartbreak with intention.
Want to dive deeper? Check out our episode starter packs, curated playlists on topics like leadership, mental health, and personal growth at passionstruck.com slash starter packs or Spotify. Sign up for my Live Intentionally newsletter for exclusive weekly insights, tools, challenges, and strategies to live with greater intention.
Prefer video? Join our growing community on the John R. Miles YouTube channel, where you can watch this episode and share it with someone who could benefit from Sandra's insights.
Now, get ready for a conversation that will change the way you see technology, autonomy, and the digital forces shaping our lives. Let's dive in with Dr.
Sandra Matz. Thank you for choosing PassionStruck and choosing me to be your host and guide on your journey to creating an intentional life.
Now, let that journey begin. Hey, PassionStruck fam.
The perfect vacation includes a lot of adventure and even more R&R. And let me tell you, Texas has it all.
Whether you're wanting to experience the natural beauty of an iconic state park or relax on the beautiful beaches of the coastline, the Lone Star State welcomes you to enjoy the unique experiences you can only find in Texas. When hunger strikes, savor some world-famous barbecue or treat yourself to exceptional fine dining across the state.
No matter your craving, it's waiting for you in Texas. There's always a dance floor or live music venue just ready to be discovered.
The nightlife in Texas is always an exciting time. And what's a trip to Texas without taking in the vibrant art scene or horseback riding across sprawling ranches to bring out your inner cowboy?
Texas isn't just a destination, it's a one-of-a-kind experience, and it's calling you. So, let's pack our bags and get going.
Visit TravelTexas.com and start planning your trip today. Let's Texas.
Introducing the new Volvo XC90 with seven-seat versatility, Google built-in, and advanced safety features for all your precious cargo. The new Volvo XC90, designed for life.
Visit volvocars.com slash US to learn more. If you love a Carl's Jr.
Western Bacon Cheeseburger, if you're obsessed with onion rings and barbecue sauce, next time, tell them to triple it.
If you need that El Diablo heat, heat, heat,
and more meat, meat, meat, triple it.
If you're gaga for house-made guacamole,
bacon, and spicy Santa Fe sauce,
you already know it.
Introducing the new Triple Burgers.
Only at Carl's Jr.
Get a one-time free Triple Burger
when you download the app and join my rewards. Minimum purchase required.
New members only within 14 days. I am absolutely thrilled today to welcome Dr.
Sandra Matz to PassionStruck. Welcome, Sandra.
Thank you so much, John, for having me. Today, we are going to be exploring your brand new book, Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior. Congratulations on the book, and on it being featured as a must-read by the Next Big Idea Club.
Fantastic recognition. Yeah, I was super happy.
I love them. So it was almost a dream come true.
I felt the same way. I wrote a book last year, it came out in February.
And when I got that recognition, it was, for me, a higher level of acknowledgement than any bestseller list could possibly have been. I was going to say, it feels like the bestseller list that matters in a way.
So I feel you. I'm going to start back in your childhood, because this is where you start off your book.
You grew up in a small German village. And in this village, as I understand it, there were only two restaurants.
There weren't any shops. What was it like growing up in that village? And I was hoping you might be able to introduce this by using the story of the missing rabbit.
The missing rabbit, yes. I'm happy to.
So, yes, I grew up in this, it was really a tiny village.
500 people, even though my parents keep reminding me that since I left, it's grown to a thousand people, which I can assure you does not make much of a difference. But my experience there was really the fact that it was like a very tight knit community.
But everybody knew everything about everybody else's lives: kids running to the bus, and maybe they're just not the most organized ones, or you see who's dating whom, what people are doing on the weekend. And that just means, on the one hand, there's this sense of community, connection, and security that I've never experienced anywhere else, right? Those neighbors who know you that well, they're the ones who can give you the best advice possible.
They can connect you to opportunities. They can set you up with people that they think you might like or jobs that they think you might be interested in.
But it's also, it feels very intrusive in a way, right? Because they, again, observe everything that you do, and they poke around in your private life, sometimes manipulating you behind the scenes in ways that I, at least, didn't really appreciate. Now, the story with the rabbit was, I think, a nice example of how people in the village, by observing everything that's going on, can come together and almost create value. I was still pretty young, but I remember it was the weekend and we were still sleeping when, at some point, one of our neighbors knocks on the door, telling us that my pet rabbit had apparently escaped. The rabbits were living outside; we had built them this amazing cage together with my dad, but apparently we left it open overnight. So the rabbit escaped, and it was just hanging out in their garden, eating the salad. They had already tried to catch it, which was so difficult. So now suddenly it's my neighbors trying to chase the rabbit. My entire family gets up trying to chase the rabbit, and rabbits are really fast.
It zigzags around like you see in the cartoon movies, and it was just impossible to catch. And word gets out; word spreads pretty quickly.
So I think maybe half an hour in, the entire street is involved. We now have someone managing the traffic, because the rabbit keeps running back and forth across the street. And we have people planning how to catch him. The entire community came together to help us solve the problem.
We eventually caught it by someone leaping onto the rabbit, which was a traumatic moment in and of itself. But again, there was someone who knew before we even did that the rabbit had escaped, and everybody came together to catch it.
Yeah, it was like a tight-knit community. Yeah, and I think you found that to be even more the case as you grew into a teenager and experienced this in compounding effects when you were involved in a motorcycle accident when you were 15.
Yeah, you're really hitting all the pain points here, but that's true. This is one of my low moments in my village career, I would say.
So I had a boyfriend at the time who was a bit older, and he had a motorcycle, and we loved riding the motorcycle around. It was usually him driving, because he was the only one who had a license; I was still 15. And at some point I was like, well, I want to at least try. I don't want to wait another three years until I can get my license. So why don't we just go to an even more remote place than the village? We found an airfield that was abandoned by that time, and I was just like, let me try. You sit in the back and we'll take it step by step; you walk me through how stuff works, because I have no idea. Which was a really nice idea in theory. It did not pan out in practice. I don't know exactly what happened, but I think I just pulled the bike back and turned on the gas, let the clutch snap, so it just rears up like a horse. The front goes up, my boyfriend falls off the back, and I just keep driving away. No idea.
At some point, I crash on the side, luckily, and nothing really happened, because I was still going relatively slow. But you can imagine that the moment we come back, because my dad had to pick us up, the motorcycle just wouldn't start anymore.
The moment we come back and drop it at the shop to get repaired, everybody knew about it, right? So I think for the next couple of weeks, I just constantly got caught in conversations about how I'd been so stupid to do this, and then someone else telling me about their own childhood offenses. Yeah, the news spread really fast in that village.
So I didn't grow up in a small village like you, but I grew up in a tight-knit community around the Catholic Church. And in my upbringing, from the time I was in kindergarten through high school, I always went to parochial schools.
And so I felt like I was in this small village where everyone knew my hopes, my fears, my weaknesses, my dreams. And it gave you a sense of being seen.
But in that small village, that was a blessing and a curse, because it shapes how you view yourself and your sense of mattering. How did growing up in a village shape your understanding of mattering? Very much.
And I think it shapes both the sense of mattering, but also the choices that you have available, right? And the decisions that you make for yourself and others. So I was the daughter of the local police officer, which didn't help with the motorcycle accident, but it also meant, for example, that I wouldn't get invited to some of the cool kids' parties, but I would get invited to all the community stuff where you could help out and organize things. So what people think about you, who they think you are, really matters in terms of how they interact with you and the opportunities that they create.
So I think for me, it was both the sense of who am I and what are some of the opportunities that are available for me. And then also on the flip side, in terms of just getting the support.
There are certain things that are extremely hard, right? Like when I tried to figure out what I want to do after college, there are so many things that you could do. And you've not been there, right? Because you've just gotten out of high school.
You don't have that much experience. So you need people who've seen more of the world to guide you in some way.
And obviously, the more that people know about, again, your dreams, your hopes, your ambitions, and so on, the more they can really customize the advice that they give. So I think the mattering expressed itself in that it was nice that there was someone who understood what I wanted.
And then in those moments when they used it to my advantage and in my best interest, that felt like a really strong support system.
I think the important thing here is this whole contrast: we've gone from people living in these small villages, in communities that existed for centuries, to a global village where algorithms know even more about us than our neighbors ever could.
And I love what you write in the book. You write, but it turns out you don't have to live in a small rural community to have someone watch and influence every step you take and choice you make, which you were just describing.
That's because we have digital neighbors. In the same way my neighbors became expert snoopers and puppeteers over time, computers can translate seemingly mundane, innocuous information about what we do into highly intimate insights about who we are and ultimately prescriptions of what we should do.
And as I read that, it really got me thinking about how complex this becomes when these things that we don't even realize are around us are influencing our minds on what we should do, who we should be, what we should become. How does that really impact our ability to connect and to feel that we have autonomy over our choices? I think it's such a great question.
So for me, if you go back to the analogy of the village, there were essentially two components. One is that someone was snooping around in my life and they understood who I was, what I wanted, and so on.
But for me, the more important part was always the second part, is they use that knowledge to then potentially influence my behavior for better or worse. And so the same way I think about this online, right, so there's now with all of the data that we generate, anything from what you post on social media, your credit card spending, the fact that your phone tracks your whereabouts pretty much 24-7, knows who you connect with, knows what conversations you have.
And those are all very intimate insights into your psychology. But then the second step of, well, once companies and third parties have access to those insights, they can essentially, again, shape your behavior in certain directions that sometimes might be extremely helpful.
If I can figure out, again, what might be jobs that you're interested in, or understand whether you might be struggling with mental health issues and support you? So there's a lot of helpful use cases, same way as in the village. But then there's obviously also these other use cases where I get you to do something that you don't want to do.
Maybe the least problematic one is that I'm just going to sell you stuff that you don't need. But then we get into: I'm going to get you to vote for someone that you otherwise wouldn't vote for, or to not vote at all.
It's this entire second world where suddenly your choices are no longer your own. And because you mentioned connection, where we see this materialize in the context of the social fabric is that the more and more we personalize, the less and less connected we are, right? The less and less we actually have this sense of shared reality that comes from all of us seeing the same thing.
So I think a lot of people are talking about this in the context of echo chambers or filter bubbles, but the convenience that we sometimes get by seeing the stuff that's relevant, that's really tailored to us and our psychology, also means that what I see might not be at all related to what you see. Now we don't even have the same conversations anymore, right? And there's also no checks and balances in terms of what news is out there.
Is it actually factually correct? Is it potentially slanted? Or even these cultural references that we used to have, these fun conversations, right? We talk about a movie or an ad that we've seen together on TV, and everybody sees the same thing. So it creates this bonding. And I think the more we move to an online world that's trying to emulate and simulate some of the ways in which we've traditionally customized content and communication, it just means that we don't see the same thing anymore.
And that kind of then erodes some of the connection, I think, that you were referring to. It certainly does.
And I'm going to give you a little bit of background on me that you might not know about. For years, I was a senior executive in technology.
I was a CIO at Dell, where I oversaw their data strategy. And before that, I was the chief data officer at Lowe's, where I oversaw their enterprise information system strategy.
And I remember at Lowe's, as we're looking at the strategy, we were trying to compete based on data. And one of the biggest things we were looking at was consumer data.
And we developed this program called the single view of the customer. This was back in 2007, 2008. We could match 92% of Americans and knew exactly what their shopping habits were, where they were living, when they were on vacation, when they were at a second home.
It was overwhelming how much data we had. And what we were trying to do is to learn everything we could about them because in a shopper's lifetime, there are only so many home remodel projects that they're going to do.
And I know you're moving right now. So when someone moved, that was a key sign to us that they're likely going to be spending on this new house.
And so we would use this to target all kinds of different areas about them. You write about this in the first chapter of your book.
You say that there's now an estimated 149 zettabytes of data.
And to put that into perspective, it reaches as far as the moon.
And you write that some suggest there are as many digital pieces of data as there are stars in the vast universe.
But what really got me is when I read that these pieces separated aren't that meaningful. It's just like a puzzle.
It's disconnected chaos. But once you put the pieces together, you begin to see the full picture and understand its meaning.
And these digital traces provide a rich picture of our personal habits, preferences, needs, and motivations. In other words, you write, it's psychology.
So it's been interesting to me, as I've begun to interview more and more data scientists, how many of them now reside in the largest of corporations. I remember at Lowe's, we had three or four, but Google has an army of them. Facebook has an army of them.
How much are these technology corporations now using the science of human behavior to not only analyze us, but in many ways control us? I love that. So the puzzle metaphor is, I think, one of my favorite ones, because it really tells you how this world works.
If I ask people, how worried are you about your smartphone sensing data being out there, about your social media posts being out there, about your credit card spending being out there? Oftentimes, people are not even that worried, because it doesn't seem super intimate, right? If I think, okay, maybe someone knows where I work, maybe someone knows where I live, maybe someone knows that I got a Starbucks coffee, maybe someone knows that I went on a vacation because I post about it on Instagram. In isolation, those traces don't even seem that intimate and intrusive, right? Because each is just one insight into what we do.
But then when you put them together, that's really when this kind of holistic picture, you mentioned this in the context of Lowe's, emerges, right? You almost see the life of the person play out. You see their routines, you see their habits.
And the stuff that I've been working on for the last 10 years shows you can also zoom into their psychology, and that becomes really intimate, right? So I think the reason Cambridge Analytica blew up so big in 2016 was that suddenly people understood it's not just data, not just individual data points; they can predict whether I might be impulsive, or they can predict whether I might be neurotic. And so even though political campaigns and companies had been using data for many years before, and oftentimes were celebrated for helping understand consumers, helping understand voters and talking to them about the things that matter, I think suddenly people realized: wait a minute, they can take all of this data and translate it into something that I think is really intimate and that I might not necessarily want to reveal to outsiders who I have no connection with.
That might be like a bit of a red flag. And exactly for the reason that you mentioned.
It's not just that we're poking around in your private life; we can also use it to, in a way, control your choices, or at least nudge them in a certain direction. Psychological targeting, that's the term that I use for this: we predict your psychology from data, and then we use it to influence your behavior.
It's not a wonder weapon, right? We're not going to completely flip your identity just by understanding that you're extroverted and showing you a few more extroverted ads, for example. But still, most choices that we make in life don't require an identity change, right? It's okay, should I go on this vacation or this vacation? Now I can probably shift you to the one that I want you to go on.
Or if you're an undecided voter and you don't know if you want to go out on election day and cast your vote, well, maybe then understanding your motivations and dreams and fears actually just gets you across that line to go and cast your vote. So for me, that's the crux: that kind of controlling behavior doesn't require this deep identity change. It just means making some options more likely than others.
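To make that two-step loop concrete, here is a minimal sketch in Python of the kind of pipeline being described: predict a trait from digital footprints, then match the message to the prediction. Everything in it is hypothetical; the feature names, coefficients, and message copy are invented for illustration, not drawn from Sandra's actual models.

```python
# Toy sketch of psychological targeting: (1) predict a trait from
# digital-footprint features, (2) pick the message variant matched to
# the prediction. All feature names, coefficients, and thresholds are
# hypothetical, for illustration only.
import math
from dataclasses import dataclass

@dataclass
class Footprint:
    party_page_likes: int    # hypothetical social-media signal
    nightlife_checkins: int  # hypothetical location signal
    daily_calls: float       # hypothetical phone-usage signal

def predict_extraversion(fp: Footprint) -> float:
    """Toy logistic model: map footprint features to P(extraverted)."""
    z = 0.8 * fp.party_page_likes + 0.5 * fp.nightlife_checkins + 0.3 * fp.daily_calls - 4.0
    return 1 / (1 + math.exp(-z))

def pick_message(fp: Footprint) -> str:
    """Step two: tailor the pitch to the predicted trait."""
    if predict_extraversion(fp) > 0.5:
        return "Stand out from the crowd."   # extraverted framing
    return "Quality you can enjoy quietly."  # introverted framing

print(pick_message(Footprint(party_page_likes=4, nightlife_checkins=3, daily_calls=6.0)))
```

The point of the sketch is exactly what Sandra says above: no identity change is needed, just a small, trait-matched nudge at the moment of choice.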
Carl's Jr.'s new Snack Stash was made for munchie madness. Mix and match any three sides.
Just $5.99. Get onion rings, waffle fries, and jalapeno popper bites.
Natural cut fries, fried zucchini, and why not another fried zucchini? Get any three sides in your snack stash. Just $5.99.
Only at Carl's Jr. My Rewards members get a Snack Stash free with any new Triple Burger purchase in the app.
Munch responsibly. Only for My Rewards members for a limited time at participating restaurants.
See app for terms. Are you still quoting 30-year-old movies? Have you said cool beans in the past 90 days? Do you think Discover isn't widely accepted? If this sounds like you, you're stuck in the past.
Discover is accepted at 99% of places that take credit cards nationwide. And every time you make a purchase with your card, you automatically earn cash back.
Welcome to the now. It pays to discover.
Learn more at discover.com slash credit card. Based on the February 2024 Nilson Report. I don't know if you knew this.
Something I've heard you talk about in the past is self-determination theory. And I have had Richard Ryan on this podcast, who's one of the inventors of it.
But for those who aren't familiar, it consists of three different things, autonomy, mastery, and connectedness. And oftentimes I think that our autonomy and how we relate to each other go hand in hand.
Meaning I feel if you don't feel like you matter as an individual, it's hard to feel like other people matter, or it's hard to feel like you matter to other people, or that you can help other people matter. So how do these algorithms impact that sense of autonomy that we have? Something that Angela Duckworth would call self-control.
Yeah. It's such an interesting question, because I do think that there's a difference now coming back to the old ways of the village and the new ways of the village.
Because back in the day in the village, there was essentially this deal: you give up part of your autonomy, right? You buy into this idea that we're a collective community, you give up some of the agency, you give up some of the ability to just do whatever you want. That created a lot of sense of connectedness, right? Because suddenly we're in it together. And suddenly it's not just my decisions and my choices, but it's the collective decisions of the community.
So I think back in the day, there was almost this back and forth, a negotiation of: how much am I willing to give in exchange for how much I'm getting? And I think that trade-off, that negotiation, has been broken a little bit in the digital world. Because right now, the way that it typically works, it's not a mutual conversation that we're having.
So in the village, I knew something about my neighbors, I could connect with them, I could also influence them, and vice versa. But now it's essentially big companies, or it doesn't even have to be big companies, but anyone who can collect data about us, which is, as you said, a lot easier than it should be.
It's a one-way street, right? So someone is collecting data, trying to passively influence us. That means they're in a way taking our autonomy without our control or consent.
But it also means that by doing so, they potentially disconnect us from the people around us. Because now, again, we're not seeing the same thing anymore.
So I think back in the village, this kind of trade-off between autonomy and connectedness was actually something that we benefited from. I think in the digital world, it's like the same infrastructure, the same system is in a way taking both, right? It's taking our autonomy.
And at the same time, because we don't see what other people see and because we're kept in our little echo chambers, it also takes away some of that connection and meaning.
I've recently been reading Malcolm Gladwell's newest book, Revenge of the Tipping Point.
And when I think of this concept of a tipping point, it's this critical juncture.
He describes it as getting to around 33%, where a series of small actions or behaviors leads to a significant and often irreversible shift. And when I think about this shift, and especially what these algorithms and digital technologies are doing to us, we exceeded this tipping point a ways back. And so this tipping point has influenced psychological targeting, our digital footprints, and again, what I call this disease of disconnect. It provides a lens to examine when and how, at the societal level and the individual level, technology reaches a profound transformation.
I want to take this tipping point in this example of going from your small village to the digital village.
I think this tipping point is really shaping how we view each other, how we view our emotions. And what I wanted to understand from you is, now that we have gone so far in one direction, what are some of the next steps that we can take to either reclaim control or harness this technology for good? It's a great question, because I think so far, the way that I've spoken about it probably sounded relatively negative overall, but I do think there's incredible opportunities, right? We've talked about this understanding in the village of, well, they could give me the best advice ever because they knew who I was.
One of my favorite examples in this context is mental health, right? Mental health is still one of these areas where it's incredibly difficult, especially because we're now a lot more disconnected. In the village, if someone was struggling, the chances that someone in the community picked up on it were actually much, much higher. Let's say, I live in New York right now; nobody would notice. I know my neighbors a little bit, but they certainly wouldn't necessarily figure out that I might be having a hard time emotionally.
So here, this is something that technology can actually help us with. If technology can understand, for example, from your smartphone sensing data, that you're not leaving the house as much anymore.
There's much less physical activity. You're not making and taking as many calls anymore.
Maybe it's nothing, right? Maybe you're just on vacation and you're having a great time. But it could be an early warning sign that maybe something is off, right? So your behavior starts to deviate from your typical routine.
Maybe now is a good time to use the powers that we have to, first of all, help you realize that something might be going on, and then also give you the resources that you need to get better before you even enter this valley of depression. So I think there's a lot of incredible opportunities for technology to amplify some of the positive sides that we experience in the village, of using the understanding of who you are to help you accomplish the things that you want. Savings is another example. It's a behavior that's extremely difficult for the human brain, because you have to give up something in the here and now for maybe a potential benefit in the future. Now, again, back in the village, there was probably someone who could try to help you by motivating you in a certain way.
And the same is true for technology. So if I can understand your preferences and your, again, motivations and needs, yeah, I can use it to get you to spend more, but I can also use it to get you to save more, for example.
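The early-warning idea Sandra describes, noticing when behavior deviates from a person's own routine, can be illustrated with a toy sketch like the one below. The signal (daily step counts) and the two-standard-deviation cutoff are assumptions for the example, not a clinical screening rule.

```python
# Toy version of the early-warning idea: compare recent daily behavior
# (e.g., step counts or calls per day) against the person's own baseline
# and flag days that fall far below the usual routine.
from statistics import mean, stdev

def routine_deviation_flags(baseline: list[float], recent: list[float], z_cut: float = 2.0) -> list[bool]:
    """Flag recent days sitting more than z_cut standard deviations below baseline."""
    mu, sigma = mean(baseline), stdev(baseline)
    return [(mu - day) / sigma > z_cut for day in recent]

# Four typical weeks of step counts, then a week of withdrawal.
baseline = [8000, 9500, 7200, 8800, 9100, 7600, 8300] * 4
last_week = [8100, 3000, 2500, 1800, 2200, 1500, 1200]
print(routine_deviation_flags(baseline, last_week))
# [False, True, True, True, True, True, True] -- a sustained break from
# routine, which might be a vacation or might warrant a gentle check-in.
```

As Sandra notes, the same inference can cut both ways: the identical signal could trigger a supportive nudge or an exploitative one, which is why who controls the model matters.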
And then the last one, which I think ties very nicely into this topic of connection that I've been thinking about a lot: could we also use technology to accomplish the opposite of keeping you in your echo chamber and having us become increasingly disconnected, as a way to actually experience the world from different viewpoints, right? The example that I always give is, if I wanted to understand what the reality of, let's say, a 50-year-old Republican farmer in Ohio looks like, it's almost impossible for me to do right now, because I only have one view of the world, right, one perspective, and that's my own. It's based on my experiences, everything that I've seen, everything that I've experienced, and it would be really hard for me to understand what that reality looks like. I would have to go there.
I'd have to talk to a lot of people, maybe shadow someone and see what their day looks like, and ideally get a sense of how they process things.
Now with technology, we could have something I call a perspective exchange, or an echo chamber swap, where the big companies know exactly what that reality looks like. So Google knows what that person sees when they search for something on Google, and Facebook knows what their newsfeed looks like.
What do they talk about? Who are they connected with? So there is a way in which those tech companies could actually give us access to the reality of someone else, so that instead of being isolated and kept in our own echo chambers, we could use it as a tool to expand our view of the world and become connected to people who look very different from us, have totally different life experiences, and whom we would otherwise never get to see. And I'm talking about it in the context of the U.S.
right now. You could imagine I could step into the shoes of someone who lives in Thailand, someone who lives in Chile, and see, okay, what are the topics that they actually care about? What do their daily routines look like? What are some of the things that they might be worried about? What are some of the things that they dream of? And I think if we talk about connection right now, because we're so focused on personalization and optimizing for profits, yeah, we're stuck in our little echo chambers, but the same technology has the potential to actually broaden our view of the world in a way that we've never seen before. Yeah, as you were just describing that, it really made me think of the late Emile Bruneau.
I'm not sure if you know who Emile was, but he was a professor at Wharton, and he was really trying to look at this whole factor of dehumanization and how so much of the conflicts that we're in worldwide are because we fail to see the other side as who they are.
So it got me thinking about a way you could use this. Just take a look at Russians versus Ukrainians: if you could see who the other side was and see them as a whole person, could that change the whole perspective, seeing people as having more in common rather than as enemies? What are your thoughts on that? It's a topic that I've been thinking about so much, because it's not trivial.
So the risk that you always have is, let's say I get to see what the newsfeed or the Google searches of someone who has totally different political views than I have look like; I might actually become so appalled that I dig in my heels even deeper, right? So there's always the challenge, when you expose someone to a different worldview, that you just elicit reactance. But I actually
think that's something that we could also work on with technology. Because as you said, what we want
is to get a holistic understanding of the person. So I remember like back, I think it was in the context of the 2016 election or 2020, I can't remember.
But the Wall Street Journal had this red feed, blue feed website, where they just showed side by side what the same news story, on the same topic, let's pick immigration or gun laws, looks like from a more Republican versus a more Democratic stance.
And I think, first of all, not that many people were using it. And I think what it somewhat led to was that people were like, just, oh my God, look at how crazy the other side is.
And the reason for why I think people were not as receptive is exactly as you mentioned. This is like a super abstract, dehumanized way of showing it.
So here's a group of people and this is what they see. Tells you nothing about the people behind it.
Why is it that a Republican, for example, is more worried about immigration? Maybe they just have something that's happening in their life that makes them think that way. And so what I had in mind when I was thinking about this echo chamber swap or perspective exchange is you really want to see the entire life of someone.
And maybe we could even complement it with some kind of AI assistant, right, that tries to help you figure out, as you walk in someone else's shoes, how this might all come together. Here's why they might have a certain view on immigration.
And what we know, for example, from research on, it's called moral reframing, which is essentially, if I can talk about an issue with your own moral lens of what is right or wrong in the world, I can actually get you to be a lot more open to an argument. So there's like these five moral dimensions of fairness, care, loyalty, authority, and purity.
And we know that everybody has their own profile, but there's a bit of a distinction: Republicans, for example, or more conservatives, care more about authority, purity, and loyalty. So those are the driving forces of how they make decisions about what's right or wrong.
And liberals are more focused on care and fairness. Now, typically, when we make arguments as humans, we just make an argument from our own perspective, right, because that's what we know.
But you can imagine if I can try and say, look, here's someone who's more conservative, and maybe I can craft an argument around immigration that's focused on loyalty more than care and fairness, because they might not care about this as much. That's one way in which you can actually get them engaged in the conversation.
And I think the same principle could apply to these echo chamber swaps, right? If I can, first of all, see what the entire life of the person looks like, not just what they see or the news that they read, and I can also have someone help me, in this case probably an AI, understand, okay, here's where they're coming from, and here's maybe one way you could think about this from your own point of view, from your own moral compass. I think that's a way in which we could hopefully avoid some of the reactance and really get people to humanize the people behind these digital footprints.
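As a rough illustration of how moral reframing could be operationalized, here is a minimal sketch that picks a message variant based on a person's strongest moral foundation. The foundation profile and the message texts are invented for the example; real applications would need validated measures.

```python
# Hedged sketch of "moral reframing": estimate which moral foundation a
# person weights most, then phrase the same policy argument in that
# vocabulary. Profiles and message texts are invented for illustration.

FRAMES = {
    "loyalty":   "Welcoming newcomers who commit to this country strengthens the community we're loyal to.",
    "authority": "A well-run immigration system means rules that are respected and enforced.",
    "purity":    "An orderly process keeps the system clean and uncorrupted.",
    "care":      "Immigration policy should protect vulnerable families from harm.",
    "fairness":  "Everyone deserves a fair shot and an equal process.",
}

def reframe(profile: dict[str, float]) -> str:
    """Pick the message variant matching the person's strongest foundation."""
    dominant = max(profile, key=profile.get)
    return FRAMES[dominant]

# A hypothetical conservative-leaning profile (higher loyalty/authority/purity):
profile = {"care": 0.4, "fairness": 0.5, "loyalty": 0.9, "authority": 0.8, "purity": 0.7}
print(reframe(profile))
```

The design point mirrors Sandra's argument: the content of the position doesn't change, only the moral vocabulary it's expressed in, which is what opens the listener to engaging at all.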
I want to explore that a little bit more. So what you just were referring to is that psychological targeting and data collection raise profound moral questions.
Through one lens, they can empower individuals and improve lives.
Through another, they can manipulate and exploit vulnerabilities, especially at a macro level.
How do you think we decide as a society where to draw the ethical line, and who should hold the moral responsibility for these decisions? Should it be corporations, governments, individuals? Where does this responsibility lie? It's a great question. Probably from everything that we've talked about so far, you can already tell that my point of view is that it should probably not just be left to companies, right?
Because the incentives are just terrible.
So I teach in a business school and it's all about what are companies incentivized by.
And right now, the incentive is: personalize as much as you can, show the content that's most engaging, right? Because it's all about attention. So if those are the market incentives, companies are going to follow them, because it's really difficult to move away from that and say, we're just going to do the ethical thing when everybody else is not doing that.
And I even hear that from friends within those companies, right? I have a lot of friends in companies like Apple, Meta, Google, and so on. And I think they oftentimes complain and say, we would love to have strict regulation that allows us to do the ethical thing, but also make sure that everybody else is doing that.
So I think that just leaving it to companies in the current incentive structure is really difficult. Now, the second part, which I was for a long time very much in favor of, and still am, but in a slightly different way, is: well, why don't we just give control to users and consumers?
So if you look to Europe, there's the General Data Protection Regulation, which is one of the strictest data protection regulations that we have, similar to what California is doing. And its foundation is transparency and control.
So if we want to manage that landscape and we want to have people decide for themselves, right? We're not all the same. You might be willing to give up your data for a certain service, but I might not.
So why don't we just explain to people what's happening and then let them decide? And this solution has a lot of intuitive appeal, right? Again, we're not all the same. We're empowering consumers to make their own choices.
But the problem with these approaches, in the absence of, I think, better regulation, some technological solutions that I can talk about in a second, and changes in how we govern data, is that it's just an impossible task, right? If you think about it, if you wanted to manage your data properly across all of the products and services that you're using, that would be a full-time job. So first of all, it would mean that you have to read all of the terms and conditions constantly.
You have to be constantly in the loop on how your data could be used today and tomorrow. What are some of the new technologies? What are some of the inferences? How could this infringe on your autonomy and self-determination? So even if you knew everything about technology and could keep up with the speed at which it evolves, it would still be 24/7, right? Nobody has time to do that. I hope that people have better things to do than reading through all of the terms and conditions.
Share a meal with your family rather than doing that. So I think just pushing the responsibility to users is like, it's much more of a burden than it actually is a right.
However, I do think it's the foundation of something that we can use to make it better, right? Because at the end of the day, it would be great if you could have control, if you also have the mastery to exercise that control. And there's a couple of things that I think we can do there.
And they actually fall, some of it falls under regulation. Some of it, I think, falls under new forms of data governance.
So the most basic thing is: make it easier for people to do the right thing, right? If data tracking is the norm, is the default, then I would have to go through all of my terms and conditions, all of the permissions that I have, the settings for all of the apps that I'm using, all of the websites that I'm browsing, and constantly opt out. Nobody's going to do that. Again, nobody has the time and energy to do that.
If regulation said that the default is an opt-in, no data is being tracked unless you say so, that would change the entire system, right? Because now companies really have to convince me that they can create value by having my data and my data would be protected by default, then I can change it. So I think that is one thing.
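A minimal sketch of what that opt-in default could look like in code is below; the purpose names are hypothetical, and the point is simply that silence grants nothing.

```python
# Opt-in by default: every data purpose starts denied, and a company
# must win explicit consent per purpose. An empty ledger grants nothing.
from dataclasses import dataclass, field

@dataclass
class ConsentLedger:
    granted: set[str] = field(default_factory=set)

    def opt_in(self, purpose: str) -> None:
        self.granted.add(purpose)

    def allowed(self, purpose: str) -> bool:
        return purpose in self.granted  # silence means "no", not "yes"

ledger = ConsentLedger()
print(ledger.allowed("location_tracking"))  # False: protected by default
ledger.opt_in("personalized_recommendations")
print(ledger.allowed("personalized_recommendations"))  # True: value earned the opt-in
```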
But then the other thing: when you think of regulation, it's risk mitigation, right? Regulation tries to protect you from the most egregious abuses, but it's very much aimed at the average, and it doesn't help you maximize the utility that you can get from your data. There, I think what we need, again coming back to this notion of control, is a support system that allows you to exercise your control wisely.
And what I mean by that specifically is these forms of governing your data in the context of data co-ops and data trusts. So the idea is that instead of having to manage it all by yourself, you actually get together, and now we're coming back to this notion of community and connection.
You get together a group of people who are willing to share their data and who come together to manage it for their own benefit. So just to give you an example that I really like, healthcare is one of these, right? There's a lot of rare diseases, for example, that are poorly understood, because to understand them, you need data from a lot of people, all pooled in the same place, ideally as much as you can in terms of your genetics, your medical history, your lifestyle, your nutritional choices.
If we had all of this from everybody in the world in one place, we could understand disease much better. We could understand treatment much better because we can now see, okay, here's a treatment that works for person X, but not for person Y.
Why is that? What else could we do for them? But it's a pretty big ask to have all of this sit on a central server, or to hand it to pharma companies, right? But if you could have a community of people, let's say people suffering from MS, and this is actually a real example, a data co-op that exists in Switzerland, they get people suffering from MS to share all of their data. Now they can generate insights that are much, much better than anything you could do by yourself, right? If you're only one person, there's only so much you can learn from your genetic data.
And so they generate insights generally about the disease, but they also then use them to benefit you directly. The data co-op is called MIDATA.
If you're part of the data co-op and you share your, again, genetic data, healthcare data, and so on, you get personalized recommendations that are sent to your doctor in the hospital: here are treatments that might work particularly well for you, here's something you could do to help with your symptoms.
And that's a totally different way of giving control to people, right? Because it's really empowering. So it's not just saying you're in charge.
It's saying we're going to give you a support system of people who have the same interests and also expertise. So with data co-ops, the real benefit is that once you have many people with the same interests, you can actually hire experts and management who know what they're doing and can say, okay, this might be a great company to collaborate with, because they are really driving insights.
They're very much research focused. And here are some of the data structures that we can put in place to make sure that your data is being protected. And it's not just you anymore; it's you and an entire support system. So if we think about how we change that and make data work for us, I wouldn't leave it to companies. I think we need regulation, but it's not the only part, because it's slow and it just protects us from the basic abuses. I think what we need is these more community-style forms of data governance that rest on this notion of control. Sounds like what you're saying, and thank you for explaining all that, is that as individuals we don't have the power to change much.
But if we start showing up as a collective with more might, then we have more influence. And this leads me into something further on in your book, where you talk about solutions and say that the shift to privacy by design could also be a significant one.
To me, this whole privacy by design would fit with this model of the collective, because it would give you the leverage that you're going to need to get companies to fulfill their promises once they've acquired your data. Is that a good way to think about it? I think so. For me, privacy by design just means: make it easy for people to do the right thing, right? If, again, the default is set to relatively high standards of data protection, you're not just giving away your data every time you sign up for a service or a product.
That just means, first of all, that it leverages our laziness, right? Most of the time, we're not going to change the default, because we have other stuff to do. So our data is protected most of the time. But it also could mean that companies now actually have to step it up, right? They have to say: well, if I want to get your data, I better make sure that I show you how my product is much, much better when you're giving me access to your personal information than when you're not.
And you can think of YouTube, right? YouTube has this option where they're not using your history, what you've searched for before, what you've watched before. And you can actually see the value decrease: every time, you have to find something new, because it doesn't remember anything. But if your privacy is protected by design, now companies really have to live up to that expectation.
So collectively, you can actually ask for a lot more when it comes to the design of products and services. So Sandra, I've been recently re-watching that short documentary that Netflix did on Chernobyl.
And as I was reading your book, I came across this reference that you made to a Guardian article where journalist Cory Doctorow offered a different analogy: he compared personal data to nuclear waste, which got me thinking about Chernobyl. We should treat personal electronic data the same way that we treat radiation or plutonium, because it's dangerous.
I like how you frame this. It's long lasting.
And once it's leaked, you never get it back. Just like radioactive waste, our personal data can be deadly, is what you write, literally.
Why is it so important for people to think about it in that concrete of terms? It's funny because I think nuclear waste is a good analogy. You could also think about it as nuclear power, right? Because nuclear power has the ability to destroy, but also to actually create, give energy to humanity.
So I think there's actually those two sides. But one of the things that I like about this analogy, as you said, is that once data is out there, you're never going to get it back.
And I think people oftentimes forget that. I teach this class on the ethics of data, and there's always people who say, well, I don't care about my personal data being out there, because I have nothing to hide. And I, in some way, understand that sentiment, right? Because it feels like, first of all, an impossible task. There's not much you can do. And it might be true that people in that moment really don't worry too much about their data being out there. But it's a very privileged position to be in. If you don't have to worry about your data being out there, that just means that you're currently in a really good spot.
And it's certainly not true for everybody. Think of the fact that we can predict your sexual orientation just from your online digital footprints.
That still comes with a death penalty in many countries. Think of mental health, still stigmatized in many parts of the world.
And I think the fact that you don't worry about your data being out there here and now doesn't mean that couldn't change tomorrow. I think the Supreme Court decision to overturn Roe versus Wade made that somewhat painfully real for a lot of women in the US.
Because suddenly the fact that we can track your Google searches, we can track your GPS location, so we know potentially where you go. There's period tracking apps.
So there's all of these data traces that we never really had to worry about. Suddenly it was like, well, now I don't necessarily want someone else poking around in my private data because it could have like implications that are very detrimental.
So this is what I mean by it's like nuclear waste: it's long lasting, and you're never going to get it back. The way that I like to frame it is, essentially, data is permanent, but leadership isn't.
You just don't know what's going to happen tomorrow. And you mentioned the deadly piece.
So the point, or the example, that drives this home for me, and it gives me goosebumps every time, is actually the history of the country that I grew up in. If you think back to Germany in 1939, democracy essentially wasn't anymore.
And one of the things that we know is that a lot of the atrocities in the Jewish community within Europe depended on what data was available. So some of the European countries had religious affiliation as part of the census data.
So it was sitting there in city hall, and it was extremely easy for the Germans to come in, find the records, and go find the members of the Jewish community. And what we know, again, is that the rates of atrocities across European countries were very much associated with that information being available.
Now, fast forward to today: you don't need city hall anymore, right? You can just go into someone's online profile and, with really high levels of accuracy, predict their religious affiliation. Let's say, again, we suddenly discriminate against a different group of people; it's certainly going to be predictable from the data that is out there. So this notion of "I don't have anything to worry about today" might not be true tomorrow.
And for me, that's a really important point. So Sandra, thank you for going into that.
The last thing I wanted to talk to you about is the work of Esther Dyson. Do you happen to know who Esther is? No.
So Esther happens to live in New York, but she is very well known in the startup world, because she is one of the first investors in Square, Facebook, 23andMe, and the list goes on and on.
And I was just talking to her, and she is very concerned about the poor use of data and the implications it could have on society.
And what she is really working on right now and loves to talk about is systems change, how it's the systems, not just individual actions that shape outcomes, whether it's in health, which she's trying to do a lot of work in. You would actually love her work.
She's trying to go into small communities and reimagine how we could do healthcare differently, starting with a small village and then trying to expand it out. And so it's the idea of redesigning the village or, the way you think about it, redesigning the digital village.
So if you think about, and we've been talking about this, the systemic changes that are necessary to create a healthier system, one that really balances personal agency and algorithmic influence, what would some of those initial systemic changes need to be, through this lens of how a person feels they matter? I couldn't agree more. I think it's actually coming back to some of the things that we talked about earlier, right? Because most of the time when we think about data, about how people take control of their data, it's very much focused on the individual, right? Well, you have to manage it.
When I think about systemic changes, I think about something like privacy by design. That's what we talked about.
I think about something like, can we create these data co-ops that, again, create a system that supports you, where you can come together and it's not just you. And there are also technologies that I think will drive some of the systemic change, because right now you oftentimes have this trade-off between privacy and the value that comes with data, depending on the context, right? Some of it might just be, well, I need to find a movie on Netflix and I don't have my entire life to find the one that I want, or I have to buy headphones on Amazon and I can't spend the next 10 weeks trying to find the ones that are most relevant.
But it can also be something in a context like healthcare: yes, I do want the insights into disease, and I do want personalized recommendations for how to be healthier and happier, but I don't necessarily want my data out there with the big pharma companies, all in one place where it's at risk of security breaches and of abuse, whether that's manipulating us in the future or exploiting us for profit. So there is now technology that essentially tries to overcome this trade-off by asking, how do we protect your data but still give you the insights, the value, the convenience, and the service? It's called federated learning.
And the idea is essentially this: instead of you having to send your data to Netflix, for example, to train their models, or send your medical history and genetic data to a pharma company and just trust them to use it for a good purpose, you can have Netflix send its model to your phone. Instead of you sending all of your viewing data to Netflix, they send their model to you, and it learns locally from the movies that you've watched.
The model updates on your device, and then you just send the intelligence back. So instead of the original approach, where everyone sends their data to a central server, you never send your data at all.
So the data never leaves its safe harbor. What you exchange with the companies is essentially intelligence.
So you have Netflix ask questions of the data, and then you send back the intelligence. Or you can have a pharma company ask questions of your medical data, and then you get personalized advice based on the models that they create.
And so for me, this technology really is one that creates systemic change, because it eliminates this trade-off that we've had to grapple with all this time. Because again, if the trade-off is "I can get great service and convenience here and now" versus "maybe I can protect my data and my privacy in the future," most people will gravitate toward the immediate one, right? So I think that's another good example of where systemic change is necessary just to let us live up to the expectations that we have and become the best versions of ourselves.
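To make the mechanics concrete, here is a minimal sketch of federated averaging (FedAvg), the canonical federated-learning recipe Sandra is describing: the server ships the current model to each client, each client trains it locally on data that never leaves the device, and only the updated weights, the "intelligence," travel back to be averaged. The clients, data, and five-parameter logistic model below are hypothetical stand-ins, not Netflix's or any pharma company's actual system.

```python
# Hypothetical sketch of federated averaging: raw data stays on each device.
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def local_update(weights, X, y, lr=0.1, epochs=5):
    """One client's local training: X and y never leave the device;
    only the updated weights are returned to the server."""
    w = weights.copy()
    for _ in range(epochs):
        preds = sigmoid(X @ w)
        w -= lr * (X.T @ (preds - y)) / len(y)  # logistic-regression gradient step
    return w

# Three hypothetical clients (e.g., three phones), each holding private data.
true_w = np.array([1.5, -2.0, 0.5, 0.0, 1.0])
clients = []
for _ in range(3):
    X = rng.normal(size=(100, 5))
    y = (X @ true_w > 0).astype(float)  # labels each device holds locally
    clients.append((X, y))

# The server holds only the shared model, never the clients' raw data.
global_w = np.zeros(5)
for _ in range(20):
    # 1) Server sends the current model out; 2) each client trains locally.
    local_ws = [local_update(global_w, X, y) for X, y in clients]
    # 3) Server averages the returned weights, the exchanged "intelligence."
    global_w = np.mean(local_ws, axis=0)

print("Learned weights:", np.round(global_w, 2))
```

Real deployments typically add safeguards on top of this core loop, such as secure aggregation or differential-privacy noise, because even weight updates can leak information about the underlying data; the sketch shows only the data-stays-local idea.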
Yep. As you write, Netflix benefits, and all the other users benefit, because your personal data never leaves its safe harbor. You don't need to trust a third party to securely store your data and to use it only for the purposes intended.
I think that's a great way to end today's discussion. Sandra, congratulations again on your book.
And if a person wants to learn more about you and your research, where's the best place for them to go? Well, so the book has a website, mindmasters.ai. I also have a personal website, sandramatz.com.
And we recently, together with my husband, actually set up a center. It's called the Center for Advanced Technology and Human Performance.
I think it's human-performance.ai. But essentially, it's trying to integrate everything, see how we can empower people with technology.
Well, Sandra, thank you so much again for joining us here today. It was such an honor to have you.
Thank you so much. And that's a wrap on an enlightening conversation with Dr.
Sandra Matz. Her insights challenge us to rethink the balance between connection and autonomy in the digital world, how algorithms shape our thoughts and behaviors, and what it truly means to reclaim our sense of self in an era of constant data tracking.
As we close out today's episode, take a moment to reflect. How much of your daily decision-making is truly your own, and how much is being influenced by unseen forces? Are digital tools really helping us foster deeper connections, or are they pulling us further apart? What intentional steps can you take to reclaim your autonomy and cultivate more meaningful interactions in your life? The conversations we have on PassionStruck are meant to challenge perspectives and help you live with greater intention, whether it's in your personal growth, your relationships, or the way you navigate the ever-evolving digital landscape.
If today's episode resonated with you, please take a moment to leave a five-star rating and review. Your support fuels these impactful discussions and helps more people discover the show.
And if someone in your life is struggling with digital overwhelm or questioning their relationship with technology, share this episode with them. It might just change the way they see their world.
All resources discussed today, including Sandra's book, Mind Masters, are linked in the show notes at passionstruck.com. If you'd like to watch the full video version, head over to the John R.
Miles YouTube channel, and don't forget to subscribe and share with others who are passionate about learning and personal growth. I love bringing these insights into organizations and teams through my speaking engagements.
If today's episode sparked new thoughts on leadership, behavioral science, and digital resilience, visit johnrmiles.com slash speaking to learn more about how we can work together. Coming up next on PassionStruck, we continue our journey into the psychology of self-discovery and personal growth with a conversation that explores self-worth, purpose, and what it truly means to live in alignment with who you are.
In our next episode, I sit down with Natalie Namaste, a thought leader in personal transformation to explore how to break free from perfectionism and self-doubt, the key to embracing authenticity and rewriting limiting beliefs, and practical tools to cultivate inner confidence and a life filled with meaning. This is a conversation that will inspire you to let go of what's holding you back and step boldly into the life you were meant to live.
You won't want to miss it. Conflict arises deep within the solar plexus chakra, so connected with the stomach, there's a fiery energy there, it's the element of fire.
One can hold opinions and really believe in their opinion and it's their identity, it's who they are. One comes into contact with someone else who holds a different opinion.
It creates such a trigger. They feel like it's poking at them and who they are and what they believe.
And you can physically feel it. Thank you for being part of the PassionStruck community.
Your dedication to intentional living and making an impact inspires me every day. And remember, the fee for the show is simple.
If you found value in today's episode, share it with someone who needs to hear it. Most importantly,
apply what you've learned so that you can live what you listen. Until next time, live life passion struck.
Introducing the new Volvo XC90. Google built in for when you choose the road more exciting, and innovative technology and advanced safety features for all your precious cargo. The new Volvo XC90, designed for life.
Visit volvocars.com slash us to learn more. If you love a Carl's Jr.
Western Bacon Cheeseburger, if you're obsessed with onion rings and barbecue sauce, next time tell them to triple it.
If you need that El Diablo heat, heat, heat, and more meat, meat, meat, triple it.
If you're gaga for house-made guacamole, bacon, and spicy Santa Fe sauce, you already know it.
Introducing the new Triple Burgers, only at Carl's Jr.
Get a one-time free Triple Burger when you download the app and join My Rewards. Minimum purchase required. New members only, within 14 days.