Addicted to Scrolling? 3 Small Changes to STOP Feeling Drained After Scrolling Social Media
What time of day do you scroll the most?
Have you tried setting limits on your screen time?
Today, Jay dives into one of the defining questions of our digital age: is the algorithm shaping who we become, or are we the ones quietly teaching it how to shape us? He reveals how every click, pause, and late-night scroll acts as a subtle signal, tiny instructions that train the system, which then turns around and begins to train us. Before we even realize it, our insecurities become fuel, our curiosity becomes comparison, and outrage becomes entertainment.
But Jay also reminds us that we’re not powerless, our agency hasn’t disappeared; it’s just buried beneath layers of habit. With calm, practical guidance, he shares how we can take our feed back into our own hands, break the doom-scroll cycle, and actually reprogram the digital environment influencing our minds. Whether it’s choosing who you follow more intentionally, setting healthy boundaries in the morning, sharing more consciously, or reconnecting with real-world anchors, Jay shows that we’re not just participants, we’re contributors to how the system works. And when we change how we show up, everything around us begins to shift as well.
In this episode, you'll learn:
How to Retrain Your Algorithm in Minutes
How to Recognize When the Algorithm Is Steering You
How to Build a Healthier, Calmer Feed
How to Use Social Media Without Losing Yourself
How to Strengthen Your Digital Self-Control
You weren’t meant to be overwhelmed by noise or pulled into constant comparison. You were built to create a life rooted in values, peace, and purpose. So take a breath, make one mindful choice at a time, and let it guide the next.
With Love and Gratitude,
Jay Shetty
Join over 750,000 people to receive my most transformative wisdom directly in your inbox every single week with my free newsletter. Subscribe here.
What We Discuss:
00:00 Intro
00:31 Even the Algorithm Has a Glitch
03:04 4 Subtle Ways the Algorithm Shapes You
07:59 How Your Clicks Create the Pattern
09:45 What a Social Network Looks Like Without All the Noise
13:08 Doom-Scrolling Can Give You Anxiety!
14:47 Solution #1: Bring Back Chronological Feeds
15:10 Solution #2: Take a Moment Before Hitting Share
16:06 Solution #3: Demand Algorithmic Transparency
16:29 Why Emotional Mastery and Critical Thinking Matter
19:11 5 Simple Ways to Reset Your For You Page
See omnystudio.com/listener for privacy information.
Press play and read along
Transcript
Speaker 1 This is an iHeart podcast
Speaker 1 guaranteed human.
Speaker 1 This episode of On Purpose is brought to you by Chase Sapphire Reserve. I believe that travel is one of the greatest gifts that we've ever been given.
Speaker 1 And Chase Sapphire Reserve has been my gateway to the world's most captivating destinations.
Speaker 1 When I use my Chase Sapphire Reserve card, I get eight times the points on all the purchases I make through Chase Travel.
Speaker 1 and even access to one-of-a-kind experiences, experiences like music festivals and sporting events.
Speaker 1 And that's not even mentioning how the card gets me into the Sapphire Lounge by the club at select airports nationwide. Travel is more rewarding with Chase Sapphire Reserve.
Speaker 1 Trust me, discover more at chase.com/sapphirereserve. Cards issued by JPMorgan Chase Bank, N.A., member FDIC, subject to credit approval, terms apply.
Speaker 2 Dealing with hidradenitis suppurativa, HS, is tough, but you're not alone. Before I started Cosentyx, I looked at the website and saw it had many years of clinical research.
Speaker 2 That made me feel confident.
Speaker 3
Cosentyx (secukinumab) is prescribed for adults with moderate to severe hidradenitis suppurativa, HS. Don't use if allergic to Cosentyx.
Get checked for TB before starting.
Speaker 3 Increased risk of infections and lowered ability to fight them may occur, like TB or other serious bacterial, fungal, or viral infections. Some were fatal.
Speaker 3 Tell your doctor if you have an infection or symptoms, like fevers, sweats, chills, muscle aches, or cough, had a vaccine or planned to, or if IBD symptoms develop or worsen.
Speaker 3 Serious allergic reactions and severe eczema-like skin reactions may occur. Learn more at 1-844-COSENTYX or COSENTYX.com.
Speaker 1 Don't wait.
Speaker 2 Ask your dermatologist about COSENTYX.
Speaker 4 We all take good care of the things that matter: our homes, our pets, our cars. Are you doing the same for your brain?
Speaker 4 Acting early to protect brain health may help reduce the risk of dementia from conditions like Alzheimer's disease.
Speaker 4 Studies have found that up to 45% of dementia cases may be prevented or delayed by managing risk factors you can change. Make brain health a priority.
Speaker 4 Ask your doctor about your risk factors and for a cognitive assessment. Learn more at brainhealthmatters.com.
Speaker 1 Is our destiny coded in the algorithm? If you feel addicted to social media, this video is for you. If you feel glued to whatever's on your feed and can't stop doom-scrolling, this video is for you. And if you're worried about how social media is rewiring your brain, this video is for you. Don't skip it. The number one health and wellness podcast. Jay Shetty. Jay Shetty. The one, the only, Jay Shetty.
Speaker 1 I wanted to start today by saying one thing.
Speaker 1 The algorithm isn't as smart as we think it is. But the deeper I went into my research, the more I realized something unsettling.
Speaker 1
It's stronger than me, stronger than you, stronger than all of us because it knows our weaknesses. But here's what I also found.
Even the strongest system has a glitch.
Speaker 1 The algorithm doesn't just know us, it depends on us. And if we learn how it feeds, we can decide whether to starve it or steer it.
Speaker 1 When you Google the words, will I ever, the first thing that comes up is, will I ever find love?
Speaker 1 The second is, will I ever be enough?
Speaker 1 And the third is, "will.i.am net worth." We go from love to worth to money really quickly. But this search for love, worth, and belonging is what the algorithm exploits, but not in the way you think.
Speaker 1
Picture this. It's midnight.
Think of a girl named Amelia. She lies in bed, phone in hand, and posts a photo, nothing dramatic, just hoping someone notices.
Speaker 1
The likes trickle in, her friends comment, she taps on another girl's profile. Prettier, thinner, more followers.
She lingers, she clicks, she scrolls. The algorithm pays attention.
Speaker 1
The next night, her feed feels different. More flawless faces, more filters, more diets, more lives that look nothing like hers.
Curiosity turns into comparison.
Speaker 1 Comparison turns into obsession and soon every scroll feels like it's whispering the same three words.
Speaker 1 You're not enough.
Speaker 1
Until one night, she doesn't see herself anymore. She only sees the mirror the algorithm is holding up to her.
This isn't just Amelia's story.
Speaker 1 56% of girls feel they can't live up to the beauty standards they see on social media. 90% of girls follow at least one social media account that makes them feel less beautiful.
Speaker 1 But here's the real question.
Speaker 1 Did the algorithm build that mirror? Or did she? Was it coded in Silicon Valley? Or coded in her own clicks? Let's look at the algorithm first. What do algorithms actually do?
Speaker 1 Number one, they watch every pause, every click, every like, every share, even how long you hover over a video or comment. TikTok tracks watch time down to the second.
Speaker 1 If you re-watch a clip, it's a super strong signal. Number two, they predict.
Speaker 1 Using your history and the behaviors of millions of people like you, algorithms predict, what are you most likely to engage with next?
Speaker 1 If people who watch fitness videos also tend to watch diet hacks, you'll probably get diet hacks. Number three, they amplify.
Speaker 1 The posts that get more engagement, especially emotional engagement, are pushed to more people.
Speaker 1
Number four, they adapt. Every click retrains the system.
Your feed tomorrow is shaped by what you do today. YouTube's recommendation engine is called a reinforcement system.
Speaker 1
It's literally designed to learn from your actions in real time. The most accurate model is a cycle.
First of all, we click what feels good, familiar, or emotionally hot.
Speaker 1 Two, the algorithm learns and serves us more of that to keep us there. Number three, we become more entrenched and less exposed to alternatives.
Speaker 1 And number four, outrage and division spread faster because anger is more contagious.
Speaker 1 In plain words, the algorithm isn't a mastermind. It's a machine that asks one question over and over again.
Speaker 1 What will keep you here the longest?
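To make that one-question machine concrete, here's a minimal toy sketch of the watch, predict, amplify, adapt loop described above. Everything in it is invented for illustration — the post types, the scores, the attention model — and it reproduces no real platform's formula; it only shows how a system that optimizes for time-on-feed locks onto whatever holds attention longest.

```python
# Toy sketch of the watch -> predict -> amplify -> adapt cycle.
# All values are hypothetical, for illustration only.

posts = ["calm", "fitness", "outrage", "comedy"]
weights = {p: 1.0 for p in posts}  # the system's model of what keeps you here

def simulate_user(post):
    # In this toy model, emotionally "hot" content holds attention longer.
    return 0.9 if post == "outrage" else 0.3

history = []
for step in range(20):
    # Predict/amplify: after trying each post type once,
    # always serve the one that has scored highest so far.
    post = posts[step] if step < len(posts) else max(weights, key=weights.get)
    watch_time = simulate_user(post)   # watch: every second is a signal
    weights[post] += watch_time        # adapt: each view retrains the model
    history.append(post)

print(history[-8:])  # after a brief sampling phase, it's "outrage" every time
```

Note what's missing: the loop never asks whether the content is true, kind, or good for you. One question, asked twenty times, is enough to build the cage.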
Speaker 1
It's like a maximum security prison. So how do we get trapped? First, the nudge.
Think Netflix autoplay, TikTok's infinite scroll: the design that says, don't think, don't choose, just keep watching.
Speaker 1 That's how you start a Korean baking show you didn't even know existed and three hours later you're crying over a documentary on penguins.
Speaker 1 A study found disabling autoplay led to a 17-minute shorter average session showing autoplay measurably extends watch time.
Speaker 1 It's not a choice disappearing. It's a choice so well hidden you don't realize you never made it.
Speaker 1
Second, the loop. Yale researchers found when people post moral outrage online, people reward them with likes and retweets.
That person now posts even more outrage the next time.
Speaker 1
It's not the algorithm, it's us, it's real people. As one researcher put it, we don't just consume outrage, we start producing it because outrage performs better than honesty.
And third, the push.
Speaker 1 Mozilla's YouTube Regrets project from 2020 found that volunteers who started with neutral searches like fitness or politics reported being steered toward extremist, conspiratorial, or misogynistic content.
Speaker 1 71% of the videos people regretted watching were never searched for. They were recommended.
Speaker 1 In a 2024 study from UCL and the University of Kent using algorithmic model accounts, TikTok showed four times more misogynistic content on the For You page within just five days of casual scrolling.
Speaker 1
What does this do to men and women? Women get more insecure about their appearance. Men get more exposed to misogynistic content.
Women experience more anxiety and self-doubt.
Speaker 1
Men become more lonely and disconnected. Women compare their lives to others and feel they're falling behind.
Men compare their status to others and feel like they're being left behind.
Speaker 1 Both end up in the same place on social media: isolated, exhausted, and shaped by the same machine. The algorithm will do anything to keep us glued.
Speaker 1 There is a huge incentive issue for the algorithm because in one study where they chose not to show toxic posts, users spent approximately 9% less time daily, experienced fewer ad impressions, and generated fewer ad clicks.
Speaker 1
The algorithm's goal is not to make us polarized. It's not to make us happy.
It's to make us addicted and glued to our screens.
Speaker 1 It is showing you what people like you are engaging with, assuming you will stay as well.
Speaker 1
We talked about what the algorithm does. Let's look at what role we play.
Our clicks build the cage. False news stories are 70% more likely to be retweeted than true stories are.
Speaker 1 It also takes true stories about six times as long to reach 1500 people as it does for false stories to reach the same number of people.
Speaker 1 Algorithms don't see truth or lies, they only see clicks from people like us.
Speaker 1 Want to make a real difference this giving season? This December, On Purpose is part of Pods Fight Poverty, podcasts teaming up to lift three villages in Rwanda out of extreme poverty.
Speaker 1 We're doing it through GiveDirectly, which sends cash straight to families so they can choose what they need most. Donate at givedirectly.org/onpurpose.
Speaker 1 First-time gifts are matched, doubling your impact. Our goal is $1 million by year's end, enough to lift 700 families out of poverty. Join us at givedirectly.org/onpurpose.
Speaker 1 Number two, false news spreads six times faster than true news because shocking content sparks more clicks and shares from us, so the algorithm promotes it further. The content must already have emotional potency.
Speaker 1 An algorithm won't manufacture depth or resonance from nothing. It can't make it go viral.
Speaker 1 Number three, for major media outlets, each additional negative affect word in a post is associated with a 5-8% increase in shares and retweets from us.
Speaker 1 And number four, Facebook studies showed that even when given the opportunity, users clicked links confirming their bias far more often than opposing ones.
Speaker 1
Liberals chose cross-cutting news 21% of the time, conservatives 30% of the time. Here's the twist.
The algorithm doesn't pick sides. We do.
It just learns our choice and builds a fortress around it.
Speaker 1 The danger isn't that we have no choice. It's that we don't notice when our choices are being shaped for us.
Speaker 1 So let's do a thought experiment. Why don't we create a social media platform without these incentives? One that doesn't play these games with us?
Speaker 1
They already tried that. And what I'm about to share with you shocked me the most.
A new study out of the University of Amsterdam tested this by creating a stripped-down social network.
Speaker 1 No ads, no recommendation algorithms, no invisible hand pushing content.
Speaker 1 Researchers released 500 AI chatbots onto the platform, each powered by OpenAI, and gave them distinct political and social identities. Then they let them loose.
Speaker 1 Across five separate experiments, amounting to 10,000 interactions, the bots began to behave exactly like us.
Speaker 1 They followed those who thought like them, they reposted the loudest, most extreme voices, they gravitated into echo chambers, not because an algorithm pushed them there, but because that's where they chose to go.
Speaker 1 It also found that users who posted the most partisan content tended to get the most followers and reposts.
Speaker 1 Researchers tried interventions, dampening virality, hiding follower counts, even boosting opposing views. But nothing cracked the cycle.
Speaker 1 The most they managed was a 6% reduction in partisan engagement.
Speaker 1 In some cases, when they started hiding user bios, the divide actually grew sharper and the most extreme posts gained even more traction.
Speaker 1
The implication is chilling. Maybe it isn't just the algorithms that warp us.
Maybe social media itself is wired against our better nature. Think about it like a funhouse mirror.
Speaker 1
It doesn't simply reflect who we are. It stretches our fears, it magnifies our biases and turns our flaws into a spectacle.
As humans, we can live consciously or unconsciously.
Speaker 1
We can choose our stronger selves or our weaker selves. When we choose our weaker self, humans are not just curious.
We're programmed to measure ourselves against others.
Speaker 1
Comparison is our oldest survival instinct. Envy is the emotional fuel.
The algorithm didn't invent it, but it does exploit it.
Speaker 1
When we're tired, overwhelmed, and exhausted, humans are not ruled by curiosity. We're ruled by comparison.
And envy is the price of admission. The algorithm didn't create envy.
Speaker 1
It just turned it into an economy. Now, why do we do this? The first is negativity bias.
Evolution tuned us to notice threats more than opportunities. Missing a berry was fine.
Speaker 1
Missing a snake was fatal. Number two, outrage is social currency.
Expressing outrage signals loyalty to your group. It tells others, I'm one of us.
Speaker 1
And in polarized contexts, this isn't just emotion, it's identity signaling. Clicking rage is clicking belonging.
Number three, cognitive efficiency. Negative content is often simpler.
Speaker 1
This is bad, they're wrong, we're threatened. The brain prefers cognitive ease over nuance.
Complex, balanced content demands more effort. Negativity feels immediate, digestible, and actionable.
Speaker 1 So what do we do about this? Doom scrolling increases cortisol, anxiety, and learned helplessness. In that state, people feel like they have no agency, which can reinforce the sense of doom.
Speaker 1 So, we have an incentive issue for the platforms because they're just trying to keep us glued, and we have a lack of mental resilience for us.
Speaker 1 Put those both together, that's what we're experiencing right now. So, what do we do about the incentive issue? People often ask me if I think AI will ever have a soul.
Speaker 1 And my response is, I don't know if AI will ever have a soul. I just hope the people building AI have a soul.
Speaker 1 The people who created these algorithms will lose millions or billions if they adjusted the algorithm. Will they do that? Will they recognize or think they have a responsibility?
Speaker 1 It's a really interesting thing to think about
Speaker 1 because it's almost like we're making something that is becoming us. It's almost like Frankenstein, that idea that whatever system we build has a part of us in it.
Speaker 1
If you build a company, it has a part of you in it. There's an energetic exchange as well.
So what does that feel like when you're building a platform that millions and billions of people use?
Speaker 1 The truth is we can't afford to just diagnose the problem. And I get intrigued by that sometimes when people just want to diagnose the problem, but we need to find solutions.
Speaker 1 And here are three changes social media companies could try.
Speaker 1 The first is platforms should offer chronological feeds by default, not buried in settings, and give users transparent control to toggle between chronological and algorithmic.
Speaker 1 Facebook's own studies show chronological feeds reduce political polarization and misinformation exposure, though engagement does drop.
Speaker 1 The second thing they can do is actually probably my favorite: add friction before sharing. For example, read before retweet prompts, share limits, cooling off periods on viral posts.
Speaker 1 Imagine you couldn't share something until you had read it in full. Imagine you couldn't share something until you'd watched that video in full.
Speaker 1 Support for this podcast is brought to you by Walden University.
Speaker 1 If you're listening right now and feeling that pull to grow in your career or make a bigger impact, Walden is designed exactly for that. For over 50 years, they've helped working adults get the W with the knowledge, confidence, and real skills to create meaningful change. What makes Walden stand out is how flexible it is.
With Walden's Tempo Learning, you're in control.
Speaker 1 No set weekly deadlines, no rigid schedules, just a pace that actually fits your life. And everything you learn is practical.
Speaker 1 You're working through real-world scenarios that prepare you to make a positive difference in your community and beyond.
Speaker 1 You're also guided by faculty who've lived the work themselves, scholars and practitioners with real experience.
Speaker 1
This is the kind of opportunity that reminds you it's never too late to go after what you want. If you've been waiting for the right moment, this is it.
Head to waldenu.edu and take that first step.
Speaker 1 Walden University, set a course for change. Certified to operate by SCHEV.
Speaker 6 For the big and the small, the short and the tall, the pacifists, the brave, the optimists and the pessimists, those who value what's within, those who are far away, the introverts and the extroverts, those who think and those who do, those who showed us the way.
Speaker 5 Coca-Cola, for everyone.
Speaker 6 Buy a Coca-Cola at a store near you.
Speaker 1 This episode of On Purpose is brought to you by Chase Sapphire Reserve. I believe that travel is one of the greatest gifts that we've ever been given.
Speaker 1 And Chase Sapphire Reserve has been my gateway to the world's most captivating destinations. Every time I travel, I find a part of myself I didn't know was missing.
Speaker 1
I remember being in this small town, completely unplugged, and for the first time in a while, I felt still. Travel does that.
It grounds you, expands you, and connects you to something deeper.
Speaker 1 That's why I'm always looking for experiences that go beyond the typical.
Speaker 1 Chase Sapphire Reserve makes traveling a breeze, earning eight times points on all purchases through Chase Travel and granting access to Sapphire Lounge by the club at select airports nationwide.
Speaker 1 No matter my destination, travel is more rewarding with Chase Sapphire Reserve. Discover more with Chase Sapphire Reserve at chase.com/sapphirereserve.
Speaker 1 Cards issued by JP Morgan Chase Bank, NA, member FDIC, subject to credit approval, terms apply.
Speaker 1 Twitter's 2020 read-before-retweet experiment led to a 40% increase in people opening articles before sharing. WhatsApp's forwarding limits dramatically slowed misinformation in India.
Speaker 1 This could actually make a difference because not only are we misinforming others, we're underinformed ourselves.
Speaker 1 If you're retweeting something just based on the headline and have no idea what's inside of it, we're now propelling ideas that we don't fully grasp and understand.
Speaker 1 And number three, require algorithmic transparency and independent audits. Companies must publish how recommendation systems prioritize content and allow external researchers to study the impacts.
Speaker 1 The EU's Digital Services Act is already moving this way, requiring large platforms to open their algorithms to scrutiny. Now, what do we do about the human nature issue?
Speaker 1 I want to share with you one of my favorite stories. A student once asked the Buddha, what do you gain from meditation?
Speaker 1 He said nothing. The student asked, then why do you meditate if you gain nothing?
Speaker 1
The Buddha replied, I don't meditate because of what I gain. I meditate because of what I lose.
I lose anger, I lose envy, I lose ego. If the algorithm is made of us,
Speaker 1
then changing it doesn't start with code. It starts with character.
We have to remember that we are wired for generosity, but educated for greed.
Speaker 1 When will we finally start teaching emotional mastery in schools? How long before we start teaching critical thinking at an early age? Maybe the real test isn't to build a happier network.
Speaker 1 It's to build happier users.
Speaker 1
We built a machine to know us and it became us. When we started on purpose, there were only three things that went viral.
Cats and dogs. Sorry to put them in the same group.
Speaker 1
Babies and people taking their clothes off. I had the innocent intention, the naive vision of making wisdom go viral.
Today, we do over 500 million views across platforms every month.
Speaker 1 Not playing into rage bait, not trying to make people angry. What does it show me? It shows me that people will choose healthier options if they're available, if they're presented to them in a digestible way.
Speaker 1 People will choose a salad if they know why it's better for them and if it's available and has a great dressing.
Speaker 1
It's our role to not play into the fear and find ways to make love more attractive and accessible. It's so easy to sell fear.
It's so easy to sell negativity. It's so easy to sell gossip.
Speaker 1 But the truth is, why sell the things that sell people short?
Speaker 1 Why not provide them with alternatives that are healthy, strengthening, empowering, that give them the tools to make a difference in their life? Here's the good news.
Speaker 1 Algorithms do not fully decide your fate. They're predictive, not deterministic.
Speaker 1 They rely on your past clicks, but you can override them by searching, subscribing to diverse sources, and consciously engaging with content outside of your bubble.
Speaker 1
So I want you to take a look at a new account I started and its For You page. The For You page is pretty simple.
It's beautiful imagery. It's introducing me to some scenery.
Speaker 1 And as I scroll down, you start to see more of what the average person would see.
Speaker 1 The For You page, as you go deeper, shows you everything from political podcasts to people working out to influencer content. Now I'm going to show you how easy it is to change your For You page.
Speaker 1 Because this page is so visual, I'm going to do it through finding quotes. And also, you know, I love quotes.
Speaker 1
So I'm going to go follow some quotes. I'm going to like some quotes.
Gonna like another quote. I'm going to hover over it for a while.
Speaker 1 This is really important to actually hover over the quote, to actually read it, to actually be present with it.
Speaker 1 And now I'm even going to share a quote with a friend who's now going to think they have an issue because I just shared some wisdom with them. When I refresh, check out my For You page.
Speaker 1
It's pretty much all quotes. Through three or four simple steps, I transformed my For You page.
This is almost a cleansing, filtering process that I recommend you do. It's simple.
Speaker 1 I want you to follow five people you wouldn't usually follow. Agency isn't eliminated, it's eroded by habit.
Speaker 1 People who intentionally curate their feeds, limit usage, or diversify inputs show significantly less polarization.
Speaker 1 The second thing I want you to do is hover over and comment on five pieces of content you want to see more of. Your offline life still matters.
Speaker 1 Real books, real conversations and communities can counteract the digital echo chamber.
Speaker 1 And number three, I want you to share five pieces of content you wouldn't usually and see how that changes your algorithm. Number four, don't look at your phone first thing in the morning.
Speaker 1
It's like letting 100 strangers walk into your bedroom before you've brushed your teeth or washed your face. You would never do that in real life.
Don't do it online. And five, be present with joy.
Speaker 1 Celebrate your friends' wins and accomplishments. Stop overreacting to negativity and underreacting to joy.
Speaker 1 We remember the bad times more than the good times because when we lose, we cry for a month and when we win, we celebrate for a night. Here's what I want you to remember.
Speaker 1 When you like something, you're telling the algorithm, show me more of this.
Speaker 1 When you hover over something, you're saying to the algorithm, I pay attention when you show me this. When you comment on something, you're saying, this is really important to me.
Speaker 1 And when you share it off the platform, you're saying, fill my feed with this.
Speaker 1
You're co-creating your algorithm. You're actually coding it.
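That hierarchy of signals — like, hover, comment, share — could be sketched like this. The relative weights below are hypothetical and invented purely for illustration, since no platform publishes its real scoring; the point is the ordering, not the numbers.

```python
# Hypothetical signal weights (invented for illustration): stronger
# actions send the system a stronger "show me more of this" message.
SIGNAL_WEIGHTS = {
    "view": 1,     # you saw it
    "like": 2,     # "show me more of this"
    "hover": 3,    # "I pay attention when you show me this"
    "comment": 5,  # "this is really important to me"
    "share": 8,    # "fill my feed with this"
}

def topic_score(actions):
    """Total training signal you've sent the system about one topic."""
    return sum(SIGNAL_WEIGHTS[a] for a in actions)

# Deliberate engagement with quotes vs. passive doom-scrolling:
quotes = topic_score(["view", "like", "like", "hover", "comment", "share"])
doomscroll = topic_score(["view", "view", "view"])
print(quotes, doomscroll)  # 21 3
```

A few deliberate actions on content you want outweigh a long stretch of passive scrolling, which is why the five-step reset above works in minutes rather than months.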
One of my favorite thoughts comes from F. Scott Fitzgerald.
Speaker 1 He said, The test of a first-rate intelligence is the ability to hold two opposed ideas in the mind at the same time and still retain the ability to function.
Speaker 1 One should, for example, he said, be able to see that things are hopeless and yet be determined to make them otherwise.
Speaker 1 That second part is so needed right now. That's what our stories need.
Speaker 1
Accepting that things are tough, things are really hard, and at the same time reminding each other that you can make a change. You can transform your life.
You can take accountability.
Speaker 1
You can take action. You do have agency.
Reminding the world that extraordinary things have always been achieved by a group of ordinary people. I'll leave you with this.
Imagine you walk into a party.
Speaker 1
At first, it looks fun. People laughing, music playing, stories being told.
But then you notice something strange. Everywhere you turn, someone's doing better than you.
Speaker 1
Someone richer, someone prettier, someone with more friends, more followers, more success. You walk into another room, and this one feels worse.
The room is full of arguments.
Speaker 1
Everyone's shouting, no one's listening. And the louder and angrier someone is, the bigger the crowd around them.
That's when it hits you. You never chose to come to this room.
Speaker 1
You were invited by the algorithm. That's the cruel genius of social media.
It doesn't force us into comparison. It discovers we're already drawn to it.
It doesn't create division.
Speaker 1
It learns that anger holds our gaze longer than joy. The algorithm didn't create outrage.
It turned outrage into entertainment. And here's the question only you can answer.
Speaker 1 When you pick up your phone tonight, are you walking back into that same party,
Speaker 1
or will you finally leave? Thank you for listening.
I hope you've subscribed. Share this episode with someone who needs to hear it.
Speaker 1 And remember, I'm forever in your corner and I'm always rooting for you.
Speaker 1 If you love this episode, you will also love my interview with Charles Duhigg on how to hack your brain, change any habit effortlessly, and the secret to making better decisions.
Speaker 7 Look, am I hesitating on this because I'm scared of making the choice, because I'm scared of doing the work, or am I sitting with this because it just doesn't feel right yet?
Speaker 8 This is an ad by BetterHelp.
Speaker 8
We've all had that epic rideshare experience. Halfway through, you're best friends, and they know your aspirations to go find yourself in Portugal.
It's human. We're all looking for someone to listen.
Speaker 8
But not everyone is equipped to help. With over a decade of experience, BetterHelp matches you with the right therapist.
See why they have a 4.9 average rating from 1.7 million client session reviews.
Speaker 8 Visit betterhelp.com for 10% off your first month.
Speaker 4 This is an iHeart podcast.
Speaker 5 Guaranteed human.