Inside Apple: Sonic Accessibility

If you want to know where the future of sound is headed, accessibility is a great place to look. And for decades, Apple has been leading the charge in accessible technology. In this episode, the Apple team breaks down the philosophy and craft behind their most impactful accessibility features. Along the way, we reveal how these innovations have transformed the way we interact with our devices, and could even lead to a revolution in hearing health. Featuring Sarah Herrlinger, Deidre Caldbeck, Ron Huang, and Eric Treski.

Enter the “Sound Off” Story Contest at 20k.org/soundoff. Submissions close on May 7th, 2025.
Get in touch with Apple’s accessibility team by writing accessibility@apple.com.
Vote for Twenty Thousand Hertz in the Webby Awards by April 17th, 2025.
Explore the all-new Defacto Sound website, and click the Contact Form to get in touch.
If you know what this week's mystery sound is, tell us at mystery.20k.org.
Follow Dallas on Instagram, TikTok, YouTube and LinkedIn.
Join our community on Reddit and follow us on Facebook.
Follow You’ll Hear It, the #1 jazz podcast, on Apple Podcasts, Spotify or YouTube.
Sign up for a one-dollar-per-month trial at shopify.com/20k.
Cut your current cloud bill in half with OCI at oracle.com/20k.

Episode transcript, music, and credits can be found here: www.20k.org/episodes/sonic-accessibility


Transcript

Here's something embarrassing, but true, about me.

I wear a plain black t-shirt every single day.

For me, it's just one less thing to think about.

Then recently, a friend was telling me about merino wool.

It's a high-quality fabric that's naturally antimicrobial.

Now, my wife is a longtime fan of Quince.

It's a company that sells durable, stylish clothing, as well as jewelry and home goods for very reasonable prices.

So I ordered a few things from Quince, including a merino wool black t-shirt.

And you know what?

It's fantastic.

It's incredibly soft, and even after a long, active day of wearing it, it doesn't feel grimy.

I think I may have finally found that one black t-shirt to rule them all.

Keep it classic and cool this fall with long-lasting staples from Quince.

Go to quince.com/20k for free shipping on your order and 365-day returns.

That's quince.com/20k.

For free shipping and 365-day returns, visit quince.com/20k.

You're listening to 20,000 Hertz.

I'm Dallas Taylor.

I've always been fascinated with accessibility when it comes to sound.

Because so many incredible innovations in the world of audio began as accessibility efforts.

Take, for instance, voice commands and text-to-speech.

They were originally designed for people with visual or motor impairments, but today we take for granted that we can talk to our devices and they can talk back to us.

Closed captions were created for people with hearing impairments, yet regardless of your hearing, many of us use captions all the time.

Even audiobooks were originally created by blindness advocacy groups way back in the 1930s.

And now, audiobooks are a mainstream, multi-billion dollar industry.

The point is, if you want to know where the world of sound is headed, accessibility is a great place to look.

I firmly believe that many of the greatest future achievements in sound will come from accessibility efforts now.

Accessibility is something that's incredibly important to us.

It is a part of the process in everything that we build.

That's Sarah Herrlinger.

I'm the Senior Director of Global Accessibility Policy and Initiatives at Apple.

I met with Sarah at Apple Park in Cupertino, California.

My team's job is to ensure that every way that Apple presents itself to the world, whether that be through our products, our services, our stores, our workplace, events, you name it, that we are living our core value of accessibility as a basic human right.

To embody this idea of accessibility as a basic human right, Apple depends on the people who use these features in their day-to-day lives.

It starts with adherence to the disability community mantra of nothing about us without us.

And you don't build for a community, you build with them.

And the first step of that for us is the hiring of people with lived experience on our teams to help drive the development of our different types of accessibility features.

We look at accessibility as kind of falling into five main pillars, which is vision, hearing, physical motor, cognitive, and speech.

And we build features to support each one of those areas.

For decades now, Apple has been continually refining its approach to those five pillars.

It's also not something that's new to us.

Our first Office of Disability actually started in 1985.

That was just one year after Steve Jobs first introduced the Macintosh computer.

And during that announcement, it was actually an accessibility feature that stole the show.

But today,

for the first time ever,

I'd like to let Macintosh speak for itself.

Hello, I am Macintosh.

It sure is great to get out of that bag.

It was called MacinTalk, and it was an early text-to-speech engine, otherwise known as TTS.

These enable programs to read text aloud.

They're helpful for the visually impaired and people with learning or cognitive disabilities.

Basically, anyone who might have trouble reading.

So it is with considerable pride that I introduce a man who's been like a father to me, Steve Jobs.

MacinTalk made the Macintosh look irresistibly cool, like something out of 2001: A Space Odyssey.

The 9000 series is the most reliable computer ever made.

Starting in 1987, many of the Mac's accessibility features were bundled under a label called Easy Access.

This included things like Sticky Keys, which made it easier to use keyboard shortcuts, and Mouse Keys, which let you control your cursor with your keyboard.

But maybe the most important was a third-party program made by a company called Berkeley Systems.

It was a screen reader called Outspoken.

Now, screen readers don't just read text.

Instead, they take in everything on screen and turn all of that visual information into described audio.

Here's an example of someone using a screen reader on a school's website.

Navigation region.

Heading level two, link.

Services for children.

List of four items.

As they move around the screen with their keyboard arrows, the screen reader explains whatever element is highlighted.

In this case, a set of links.

Link

People who use screen readers often get used to listening at incredible speeds, some up to around a thousand words per minute.

Here's a demo on YouTube.

Like this.

So.

How's that?

Today, screen readers are really common, but up until the mid-aughts, the good ones were all super expensive.

The leading screen reader for Windows PCs was called JAWS, which could cost over $1,000.

Internet Explorer, Space, Eder, 3G Search Engine, Cybertext.

But in 2005, Apple changed the game by rolling out VoiceOver, their first built-in screen reader.

Isn't it nice to have a computer that will talk to you?

It took a couple of years to work out the kinks, but by 2007, Apple had developed a serious competitor to JAWS.

That same year, VoiceOver got a new, more natural-sounding TTS voice named Alex.

In particular, people noticed how he breathed.

This is Alex.

I'm programmed with over 150 different breath sounds.

You can still find me in the system settings.

VoiceOver was a new addition to a broader set of accessibility features that were now called Universal Access.

To represent universal access, Apple designed a symbol of a blue Vitruvian man.

Basically, a little blue person with their arms and legs outstretched.

At Apple, they call him Vito.

And now, it's come to symbolize accessibility all around the world.

Universal Access marked the beginning of a new era.

Accessibility would no longer be led by third-party companies who sold their software as expensive upgrades.

Instead, Apple itself would take the lead and bake these features right into their products.

Now, part of it may have been a business decision, because the more accessible a device is, the more marketable it is to schools, libraries, and other public institutions.

But regardless of the financials, people like Steve Jobs and Tim Cook just thought it was the right thing to do.

Here's Tim Cook with interviewer James Rath.

If you think back to how Apple was founded, and it's still the case today, we make tools for people to do incredible things and change the world with them.

And that's everybody.

I've never, ever, in the 20 years of being at Apple, ever looked at a, what's our return on investment here?

It wouldn't be Apple without doing this.

I mean, it's a part of our values that we will not compromise on.

Universal Access was a huge step forward for accessibility on computers.

And yet, in the mid-aughts, most cell phones still weren't very accessible.

Many people with disabilities were limited to the most basic functions of a phone, making calls and sometimes texting.

And like with JAWS on Windows, these often required expensive add-ons.

But all of that changed in 2009 with the iPhone 3GS.

When we designed it, we actually had to rethink all of the ways that one interacts with the device to make it a safe environment for someone in the blind community so that a single touch wouldn't make them do something they didn't intend to do.

Up to that point, touch screens were much less friendly to the visually impaired than old-fashioned tactile buttons.

But once VoiceOver came to the iPhone, it transformed that experience, just like it did with the Mac.

So for example, on an iPhone, it will do everything from read your text to you to tell you how many bars of cell coverage you have or what time it is.

You can just move your finger on top of all of the visuals on the screen, the icons, the words, and have it read back to you.

82 degrees Fahrenheit

Along with VoiceOver, Apple also introduced an early version of what's now called Voice Control.

If you have a disability that makes fine motor movements difficult, the ability to control your phone with your voice is vital.

Now, keep in mind, Siri still hadn't come out yet, but in this 2009 YouTube demo of Voice Control, we can hear something that sounds a lot like it.

You just touch and hold down the home button for three seconds and it'll pop up.

Help using iPhone Voice Control.

You can tell iPhone to call contacts.

Play playlists.

Play songs by Collective Soul.

Playing Songs by Collective Soul.

Then, when Siri was officially introduced in 2011, it opened up even more of the iPhone to voice commands.

Here's Apple's Scott Forstall showing off what Siri could do.

Do I need a raincoat today?

It sure looks like rain today.

For many users, Siri was mostly a cool, futuristic new feature.

But for people with limited mobility or vision, it was a game changer.

You can send and receive text messages.

You can create notes.

You can search the web.

You stick something in the oven and you're going to bake it and need to take it out in 30 minutes.

Just take your phone and ask Siri to set a timer for 30 minutes and you're done.

We were the first company to make a consumer touchscreen accessible to someone in the blind community.

It's not just VoiceOver, it's Zoom, it's inverting colors, it's dynamic text, it's all these different things that are there so that whatever your personal unique need is, you can set up your device to work for you.

And right away, these features really caught people's attention.

Around 2011, there was an outpouring of gratitude from the disabled community, especially users with visual impairments.

For many of them, they jumped from barely having access to cell phones to having almost full access to the world's leading smartphone.

At a 2011 event in Los Angeles, Stevie Wonder personally thanked Steve Jobs.

And I want you all to give a hand for someone who this company took the challenge in making his technology accessible to everyone.

Steve Jobs.

Because there's nothing on the iPhone or the iPad

that you can do that I can't do.

By this point, Apple had been releasing built-in accessibility features for over 25 years.

But this was only the beginning.

Today, Apple is using AI, augmented reality, and all sorts of technology to make their devices more accessible and useful than ever before.

These innovations are already changing how millions of people experience the sounds around them.

And they have the potential to revolutionize the world of hearing health.

That's coming up after the break.

Here at 20,000 Hertz, we've tested lots of platforms for recording remote interviews.

And with most of them, there's some kind of gotcha that makes us very wary of actually using it to record a guest.

But with Riverside, we get reliable, studio-quality recordings every time.

The way Riverside works is that each person records audio and video locally on their device.

Throughout the interview, the files automatically upload to Riverside.

That way, you can get started right away, and you never have to worry about losing everything if you don't click a specific button.

But speaking of buttons, if the person you're interviewing has a bad mic or is in a loud environment, just click the Magic Audio button.

It's an AI-powered audio enhancer and equalizer that removes background noise and gives you a rich, pristine sound.

Once you're in post-production, Riverside has a built-in AI-integrated editor that will save you a ton of time.

We've found it super handy for things like removing filler words, cleaning up audio, and fixing eye contact.

To sign up, visit Riverside.com.

Before you check out, click the I Have a Coupon button and use promo code 20K to get 20% off your Riverside subscription.

That's Riverside.com and promo code 20K.

Congratulations to Jillian McMaster for getting last episode's mystery sound right.

That's someone playing all of the different sizes of boomwhackers.

These are colorful plastic percussion tubes that produce a musical note when you hit them with your hand or against another surface.

The smaller the size, the higher the note.

Boomwhackers were first released in 1995 and are now made by a company called Rhythm Band Instruments.

I've had boomwhackers here at Taylor HQ ever since my first was born.

Now, here's this episode's mystery sound.

If you know that sound, tell us at the web address mystery.20k.org.

Anyone who guesses it right will be entered to win a super soft 20,000 Hertz t-shirt.

Finally, a reminder that 20,000 Hertz exists because of the work we do at my sound design company, Defacto Sound.

So, if you know somebody who works in video and their projects could use some Defacto sonic magic, then send them over to defactosound.com to hear what we do.

One of the most important decisions you'll make in any business is who you hire next, because the right person doesn't just check boxes.

They bring clarity, energy, and momentum to everything you're building.

But finding that person, that's where things get tricky.

Fortunately, there's Indeed.

Indeed takes the entire messy hiring process and distills it into a simple, streamlined platform.

You can post a job, get matched with candidates, schedule interviews, and manage the entire pipeline all in one place.

And with their sponsored jobs option, your post jumps to the top of the page for relevant candidates so you can reach the right people faster.

There's no need to wait any longer.

Speed up your hiring right now with Indeed.

20,000 Hertz listeners will get a $75 job credit to get your jobs more visibility at Indeed.com/hertz.

Just go to Indeed.com/hertz right now and support our show by saying you heard about Indeed on this podcast.

Indeed.com/hertz.

Terms and conditions apply.

Hiring, Indeed is all you need.

In business, there's an old saying, better, faster, cheaper.

You can only pick two.

But what if you didn't have to pick?

Some of the most innovative companies in AI and beyond are proving that it is possible to have all three, thanks to Oracle Cloud Infrastructure, or OCI.

OCI is the next generation of cloud.

It's built for serious performance across infrastructure, databases, app development, and especially AI.

You can run massive workloads in a high-performance environment and spend less while doing it.

When it comes to speed, OCI block storage gives you more operations per second.

As for the price, OCI costs up to 50% less for computing, 70% less for storage, and 80% less for networking.

And when it comes to performance, OCI delivers lower latency and higher bandwidth than other clouds, time after time.

This is the cloud built for AI and all of your most challenging workloads.

Try OCI free right now with zero commitment.

Head to oracle.com/20k.

That's oracle.com/20k.

Apple's accessibility features go back to the 1980s, but they really accelerated in the aughts with Universal Access and the iPhone 3GS.

Then, when the Apple Watch came out in 2015, it included a suite of features that started to blur the line between accessibility and health.

For instance, there's the Noise app, which constantly monitors the decibel levels around you.

If it detects unsafe levels of sound, you'll get this alert.
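To make that idea concrete, here's a minimal sketch in Swift of how a loudness monitor like this could work. Everything here is an assumption for illustration: the type name, the 90-decibel threshold, and the calibration offset are all invented, and Apple hasn't published how the Noise app actually measures exposure.

```swift
import Foundation

// A minimal sketch of how a sound-level monitor might flag unsafe loudness.
// Illustrative only: the threshold and the simple RMS-to-decibel math are
// assumptions, not Apple's actual implementation.
struct NoiseMonitor {
    /// Level above which sustained exposure is treated as unsafe.
    let unsafeThresholdDB: Double = 90.0

    /// Convert a buffer of microphone samples (-1.0...1.0) to a decibel level.
    /// Real devices calibrate against a reference to get a true dB SPL figure;
    /// here a fixed offset stands in for that calibration.
    func level(of samples: [Double], calibrationOffsetDB: Double = 100.0) -> Double {
        guard !samples.isEmpty else { return -.infinity }
        let meanSquare = samples.reduce(0) { $0 + $1 * $1 } / Double(samples.count)
        let rms = meanSquare.squareRoot()
        // 20 * log10(rms) gives level relative to full scale (dBFS).
        return 20 * log10(max(rms, 1e-10)) + calibrationOffsetDB
    }

    /// True when the measured level calls for an alert.
    func shouldAlert(samples: [Double]) -> Bool {
        level(of: samples) >= unsafeThresholdDB
    }
}

// Usage: a loud synthetic signal trips the alert; a quiet one doesn't.
let monitor = NoiseMonitor()
let loud = (0..<4800).map { _ in Double.random(in: -0.9...0.9) }
let quiet = (0..<4800).map { _ in Double.random(in: -0.01...0.01) }
print(monitor.shouldAlert(samples: loud))   // true
print(monitor.shouldAlert(samples: quiet))  // false
```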

We build our health features with the objective of making an impact on people's lives.

That's Deidre Caldbeck.

I am the Senior Director of Product Marketing for Apple Watch and Health.

Deidre says that the impacts of these features can sometimes be surprising, even to them.

When we first introduced the Noise app, we heard from a father who said it's really changed how his autistic son experiences his life at school, because he didn't know how loud he was speaking, and it sort of turned some people off that he communicated with.

And so the Noise app was helping give him insight in real time into how loud he was speaking, and he could kind of bring the level of his voice down.

And that was not the way the feature was designed.

But those are the stories where we know, okay, we really want to invest more in hearing.

We really want to invest more in all areas of health.

One issue they've become especially focused on is hearing loss.

A lot of people don't know they have mild to moderate hearing loss.

About a billion people around the world suffer from mild to moderate hearing loss, but 80% of those go undiagnosed.

The average person who needs hearing assistance generally doesn't get it for, at times, up to a decade after they should have.

That's Sarah Herrlinger again.

Sarah says that even after someone seeks treatment for hearing loss, they can find themselves in a world that's not very friendly to the hearing impaired.

For example, for a long time, hearing aids couldn't connect to a cell phone without installing a copper wire called a telecoil.

And the experience was not optimal.

We were getting emails from customers saying,

you know, I know it's just telecoil, but I still can't find a way to make it really work well.

And so I've stopped talking to my grandkids, you know, and people just got very insulated because they couldn't use the device as well.

Isolation is a common symptom of hearing loss, especially when it goes untreated.

After all, it's hard to be social when you can barely hear or understand a conversation.

The same goes for watching a movie, going to a live event, or any social setting where you use your ears.

And what we looked at was how Bluetooth was really a great solution, but it wasn't something that was currently an option within the hearing aid world.

So Sarah's team worked with the major hearing aid companies to bring Bluetooth to their products.

We actually wrote the first Bluetooth protocol for hearing and did it specifically for hearing aids, and built in a bunch of features like Live Listen that were specifically for hearing aid users, and launched it at the end of 2013, and it really revolutionized the hearing aid market.

Live Listen uses the iPhone's microphone as a directional mic and sends that audio directly into your hearing aids.

So let's say you go to a loud restaurant with a friend.

What do you think you're going to order?

Your friend can hold your iPhone up to their mouth and their words will be beamed into your Bluetooth hearing aids.

What do you think you're going to order?

And just like the Noise app, users have found creative new ways of using Live Listen.

At one point, Sarah heard from someone whose mother had been losing her hearing, which made it hard to watch movies together like they used to.

It was always something he loved to do, and he went home for the holidays that year.

And so, what this guy did was he took her iPhone, turned on Live Listen, and put it next to the speaker of the TV.

And then they were able to start watching old Christmas movies together.

I'm dreaming of a white Christmas...

I think that's been the most rewarding part of this process, to hear how much these products and features mean to people.

Live Listen isn't just for hearing aids.

It also works with AirPods and Beats.

And like many accessibility features, it can be a useful life hack even if your hearing is perfect.

For instance, you can turn up the TV for yourself without bothering anyone else, or listen to it from the next room over.

In recent years, Apple has been using AI and other innovations to push these features even further.

For instance, there's Live Captions, which generates live subtitles of any speech, whether it's a podcast playing on your phone or someone talking to you in person.

There's also Sound Recognition, another potentially life-changing feature for the hearing impaired.

The iPhone and the watch are able to listen for environmental sounds around you, everything from a doorbell, a fire alarm, a dog barking, a baby crying, water running, and present you with a visual alert that says there is a sound behind you.

It may be your water running.
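If you're curious how an app could do something similar, here's a rough sketch using Apple's public SoundAnalysis framework, which includes a built-in classifier trained on hundreds of everyday sounds. To be clear, this is the developer-facing analog, not how the built-in Sound Recognition feature is implemented, and the class name and 0.8 confidence cutoff are my own choices.

```swift
import AVFoundation
import SoundAnalysis

// A sketch of live sound classification with Apple's built-in classifier.
// Requires microphone permission; error handling is kept minimal.
final class SoundListener: NSObject, SNResultsObserving {
    private let engine = AVAudioEngine()
    private var analyzer: SNAudioStreamAnalyzer?

    func start() throws {
        let format = engine.inputNode.outputFormat(forBus: 0)
        let analyzer = SNAudioStreamAnalyzer(format: format)
        // .version1 is Apple's built-in classifier, covering everyday
        // sounds like doorbells, alarms, barking, and running water.
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self)
        self.analyzer = analyzer

        // Feed microphone buffers into the analyzer as they arrive.
        engine.inputNode.installTap(onBus: 0, bufferSize: 8192, format: format) { buffer, time in
            analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
        }
        try engine.start()
    }

    // Called each time the classifier produces a result for a window of audio.
    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.confidence > 0.8 else { return }
        // In a real app, this is where you'd surface a visual alert.
        print("Heard: \(top.identifier)")
    }
}
```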

Now, you don't have to be deaf to find a use for Sound Recognition.

For instance, you can use it to get alerted about your doorbell ringing when you have headphones on.

But for people in the deaf community, it can be really impactful.

I remember when it first launched just seeing people

who talked even specifically about that element of a baby crying and that incredibly human moment of having the realization your child is crying and to be able to go and pick them up and comfort them.

And I think that one really more than maybe a dog barking or the doorbell ringing, it really just brings that human connection.

Over in the Magnifier app, there are similar features for the visually impaired called People Detection and Door Detection.

These allow people to hold up their phone's camera and have it alert them about the presence of doors and other people.

Open door six feet away, turn handle or knob, swing.

Two doors detected.

Door five feet away, turn handle or knob, swing.

More recently, Apple released something called Scenes.

Now, instead of just detecting doors and people, the iPhone can describe many more details about your surroundings.

It's like a screen reader for the real world.

A group of people sitting in chairs in front of a desk with a laptop and a lamp.

A person standing next to a glass door.

A room with a couch, a table, and other items.

Door Detection and Scenes are great examples of augmented reality, a catch-all term for when computer-generated information gets overlaid on top of the real world.

But there's another Apple device that's really pushing the envelope in this area and making augmented reality common for all users, and that's the AirPods.

It does so many more things in your life than just media.

That's Ron Huang, Apple's Vice President of Sensing and Connectivity.

Our users tend to put them in and leave them in for a much longer time, and so that's why we build things like Adaptive Audio.

Adaptive Audio came out in 2023, and it combines two older AirPods features into one.

The first is active noise cancellation, which cancels out sound waves as they enter your ear.

The second is transparency mode, which is basically the opposite of noise cancellation. It's for when you want to be aware of your surroundings.

So Adaptive Audio dynamically blends active noise cancellation and transparency mode based on the environment you're in.

So for example, if you walk into a louder restaurant, we automatically ramp up the amount of noise cancellation we do to lessen the noise.

And when you walk out of that restaurant, we then lessen that active noise cancellation so you get more of the transparency effect directly.

Same thing goes if a truck drives by, right?

Truck gets closer to you, active noise cancellation level raises.

And when it drives away, we fall back down to transparency.

It's not just a mode switch.

It's literally a dynamic, gradual shift between the two modes.
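Here's a toy model in Swift of what that gradual shift could look like, assuming a blend factor derived from the ambient level and smoothed over time. The thresholds and smoothing constant are invented; Apple hasn't published Adaptive Audio's actual tuning.

```swift
import Foundation

// A toy model of the idea behind Adaptive Audio: rather than a hard switch
// between transparency and noise cancellation, glide toward a blend target
// derived from the ambient level.
struct AdaptiveBlend {
    /// 0.0 = full transparency, 1.0 = full active noise cancellation.
    private(set) var blend: Double = 0.0

    /// Ambient levels (dB) below `quiet` mean full transparency,
    /// above `loud` mean full cancellation, with a ramp in between.
    let quiet: Double = 55
    let loud: Double = 85

    mutating func update(ambientDB: Double, smoothing: Double = 0.1) {
        let target = min(max((ambientDB - quiet) / (loud - quiet), 0), 1)
        // Exponential smoothing produces a gradual shift, not a mode flip.
        blend += smoothing * (target - blend)
    }
}

// Usage: walking into a loud restaurant ramps cancellation up over time.
var state = AdaptiveBlend()
for _ in 0..<30 { state.update(ambientDB: 82) }
print(String(format: "Blend after 30 loud readings: %.2f", state.blend))
```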

And the end effect is really special because what we hear from our customers over and over again is when they finally take the AirPods out,

they have this OMG moment, which is like, I had no idea that the streets were so loud.

Features like adaptive audio elevate the AirPods from classic earbuds into the world of sonic augmented reality.

Essentially, it's changing our sensory input to make the outside world friendlier to our ears.

When you're in a noisy environment, the world can feel like a bad mix, where some of the instruments are super overpowering.

But now, we finally have some control over that mix.

But just because you have AirPods in, it doesn't mean you have to be closed off to social interaction.

Two features that address this are Conversation Boost and Conversation Awareness.

So let's say you're on the train for your morning commute and you're jamming out to some tunes.

Then you decide to ask someone a question.

And so what Conversation Awareness does automatically is that we real-time detect when you start speaking.

Microphones detect speech sounds, obviously, but there's potentially a lot of people around you also talking.

And as we combine that with the accelerometer, we know that it's from your jaw.

It is you speaking and therefore it is your intent to speak.

And that's when we apply the ducking or the pausing of the audio to help you talk.

Excuse me, do you know what the next step is?

Once you start talking, the AirPods know that there's probably a reply coming that you want to hear.

So we beamform the mics toward the conversation in front of you and actually use machine learning techniques to amplify the speech sound, but not the rest of the noise, so you can have a much better conversation.

Should be Franklin.

I'm actually getting off there.

Got it.

Thanks.

The system even has a way of knowing when the conversation is over.

To do so, it tracks your conversation partner using beamforming microphones and motion sensors.

And when I'm done, as I walk away, we combine the fact that we detect you're walking away to also see that, oh, you're likely ending that conversation and therefore resuming the audio back to you.
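Pulling those pieces together, here's a sketch of that decision logic as a tiny state machine. The sensor-fusion rule, speech at the mics confirmed by vibration from the wearer's jaw, is paraphrased from the interview, but the type and everything else here is invented for illustration.

```swift
// A sketch of the Conversation Awareness flow described above.
enum PlaybackState { case playing, ducked }

struct ConversationAwareness {
    private(set) var state: PlaybackState = .playing

    mutating func update(micDetectsSpeech: Bool,
                         jawVibration: Bool,
                         wearerWalkingAway: Bool) {
        switch state {
        case .playing:
            // Duck only when the mics hear speech AND the accelerometer
            // confirms it's coming from the wearer's own jaw.
            if micDetectsSpeech && jawVibration {
                state = .ducked
            }
        case .ducked:
            // Walking away is treated as the end of the conversation,
            // so playback resumes.
            if wearerWalkingAway {
                state = .playing
            }
        }
    }
}

// Usage: a bystander talking doesn't duck the audio; the wearer speaking does.
var airpods = ConversationAwareness()
airpods.update(micDetectsSpeech: true, jawVibration: false, wearerWalkingAway: false)
print(airpods.state) // playing
airpods.update(micDetectsSpeech: true, jawVibration: true, wearerWalkingAway: false)
print(airpods.state) // ducked
airpods.update(micDetectsSpeech: false, jawVibration: false, wearerWalkingAway: true)
print(airpods.state) // playing
```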

Now, for these features, all of the processing is happening inside the AirPods themselves.

But of course, the processing power of an iPhone is much greater.

And in some cases, it makes sense to utilize that.

That's what Apple did with their Voice Isolation feature, which blocks ambient noise to make your voice clearer when you call or FaceTime someone.

We realized, especially with something like voice and all this machine learning capability, that we have so much more compute power on the phone that we can take advantage of.

That's Eric Treski, who directs product marketing for AirPods.

So to remove that background noise, we actually now send a raw signal of your voice directly from AirPods down to the phone.

The phone does all the processing.

And then of course that just goes out to the person on the other end.

When I'm on a business call or FaceTiming with a loved one, I don't want my voice to sound like a noisy, garbled mess.

And these new algorithms go a long way towards improving that.

And that's in windy conditions, that's in loud environments.

So it's an incredible capability that we now have.

Today, Apple is leaning into hearing health more than they ever have in the past.

So the three pillars with our new hearing health features are protection, awareness, and assistance.

On the awareness side, there are those loud noise warnings, which even apply to the things you're choosing to listen to.

We wanted to make sure they had some awareness around how loud they might be listening to their favorite music or their favorite podcast and giving them the ability to automatically reduce those loud sounds so that they're always listening to their media at a safe listening level.

On the protection side, they've added automatic hearing protection across all three of the AirPods noise control modes.

This can reduce the environmental sound hitting your ears by up to 30 decibels.

So maybe you are in a windy city, or it's people who tend to be in situations where they don't even realize how loud their environment noise is, like a subway, or if they have a profession where there's a lot of loud sounds around, construction, et cetera. Hearing protection will actually ensure that background noise is suppressed so that your hearing is protected over a long period of time.

We talk about really what AirPods can do in people's lives beyond just music listening, and we think about that exhaustively.

So I really like the story of not only are we notifying you and able to protect your hearing from what you're listening to from a media perspective, but we're also now protecting your hearing from an environmental sound perspective.

So you're sort of covered both ways.

But for me, the most exciting part of this is what Apple is doing with the third pillar, hearing assistance.

In a hearing study we conducted, we learned that about 75% of the people diagnosed with hearing loss were not using any sort of assistance.

Well, that seems like an area where we can really make an impact.

So Apple developed a new clinically validated hearing test and built it right into the iPhone and AirPods Pro.

This feature brings together engineers, clinicians, audiologists, and designers to build what is really a first-of-its-kind hearing aid.

So you can use your iPhone, you can take about a five-minute test in the comfort of your own home, you will get a personalized hearing profile as a result, and this feature will seamlessly transform your AirPods Pro 2 into a hearing aid.

It kind of adjusts based on your personally specified needs so that you can hear the world around you much better.

It's hard to overstate how big of a deal this is.

Millions of people buy AirPods, and statistically, many of them have hearing loss and don't even know it.

But by taking this test and potentially using their AirPods as hearing aids, they can avoid some of the downstream effects of hearing loss, like social isolation and cognitive decline.

In other words, it can give people who might never have gone to the doctor life-changing information.

The thing that makes us really excited about this feature is that these AirPods that they rely on in their everyday life, they can now use to really help impact their lives in a way I don't think they would have imagined.

It also has the potential of reducing the stigma because you're seeing people wear AirPods and that's the same set of headphones that you use to listen to your favorite podcast and you can use to have that hearing assistance that you might need.

I have a list of people that I plan to have try this.

And if they're listening, they know who they are.

For years now, I've been talking about how hearable technology was eventually going to combine headphones, earplugs, hearing aids, virtual assistants, and more into one earbud-like device that we can theoretically leave in all day.

This is the kind of technology that I'm most passionate about because it goes so far beyond just convenience or entertainment.

It's the stuff that literally changes people's lives and helps people connect with each other through sound.

For me, when I started to work on Apple Watch and then soon after Health, to be able to hear some of these stories we've been sharing today, I just felt very fortunate that this was my actual job.

This is my profession that I get to work with these brilliant people that come up with these features that anyone can use and anyone has the potential of having their lives changed.

Now, designing for accessibility comes with a lot of challenges, but when you approach those challenges with empathy and creativity, the result is often a better product for everyone.

We're all unique in the world, and accessibility features may be life hacks to one person, and they may be necessities to another, but we're always just trying to make sure that we have features that work for everyone.

Here's Tim Cook again.

And so, that basic thought of democratizing things so everyone can create whatever they would like to create or solve whatever problem they would like to solve.

That's what we're about.

That's why we're here.

For everything we build, it's really just about giving people opportunity.

I'd like to think that whoever finds the cure for cancer probably isn't going to look like what Hollywood has told us they should look like.

And so if we can build a feature that unlocks someone's capability to express themselves or to learn something or to do whatever it might be that gets us one step closer to that, I'm all in.

20,000 Hertz is produced out of the sound design studios of Defacto Sound.

Hear more at defactosound.com.

This episode was written and produced by Nicholas Harder and Casey Emmerling.

With help from Grace East.

It was sound designed and mixed by Jesus Serteg and Brandon Pratt.

Thanks to our guests, Sarah Herrlinger, Deidre Caldbeck, Ron Huang, and Eric Treski.

And thanks to everyone at Apple who invited me in and made this episode possible.

Now, accessibility is something that Apple is very open and responsive about.

We are very open to feedback from customers and have a lot of ways for people to be able to communicate directly with our team.

If you want to help drive these efforts forward, you can email the accessibility team directly at accessibility@apple.com.

I'm Dallas Taylor.

Thanks for listening.