
Apple’s People Detection, accessibility, and the voice-first enterprise

Opinion | Dec 07, 2020 | 4 mins
Apple, Artificial Intelligence, Mobile

Accessibility tools are critical to the development of voice-first computing and Apple is ahead of the curve.

[Image: iPhone 12 Pro Max cameras. Credit: Jason Cross/IDG]

To understand the future of voice-first computing, you should look at accessibility on every platform, and that’s particularly true of Apple’s.

Computers are getting out of the way

Apple’s investments in accessibility are almost as old as the company itself; it opened its first Office of Disability in 1985.

Apple has long been ahead of the curve in making its software accessible, which it sees as a human right, and that work has won a great deal of recognition from key advocacy groups worldwide.

Fundamental to this work is an effort to build alternative user interfaces: the GUI, Multi-Touch and, of course, Apple’s next big user interface innovation, VoiceOver.

Apple’s work with VoiceOver has been both iterative and revolutionary. It first appeared on the iPhone in 2009, took a big step forward in 2019 with Voice Control, and now has a new feature that’s even more deeply transformative: People Detection on the iPhone 12 Pro and Pro Max.

What is People Detection?

People Detection is a new accessibility feature that’s currently available only on high-end iPhones. It combines Apple’s accessibility technologies with on-device artificial intelligence running on the Neural Engine, People Occlusion in ARKit, VoiceOver, and the iPhone’s LiDAR scanner to identify people in the camera’s view and tell you how far away they are.

In use, you hold your iPhone in front of you and the device announces when it sees a person, describes them (“a child,” for example), and tells you how far away they are.

As you approach someone, People Detection keeps updating the distance between you, and it recognizes whether you are walking directly toward another person or at an angle away from them.
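For developers curious about the plumbing, here is a minimal sketch, not Apple’s implementation, of how ARKit’s People Occlusion machinery can drive a feature like this: enabling person segmentation with depth gives each camera frame a person mask plus an estimated per-pixel depth map in meters (fed by the LiDAR scanner on supported iPhones). The class name and console output are illustrative.

```swift
import ARKit

// Sketch only: report the distance to the nearest detected person
// using ARKit's person segmentation and estimated-depth buffers.
final class PersonDistanceEstimator: NSObject, ARSessionDelegate {
    private let session = ARSession()

    func start() {
        guard ARWorldTrackingConfiguration.supportsFrameSemantics(.personSegmentationWithDepth) else {
            print("Person segmentation with depth isn't supported on this device.")
            return
        }
        let config = ARWorldTrackingConfiguration()
        config.frameSemantics = .personSegmentationWithDepth
        session.delegate = self
        session.run(config)
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        // segmentationBuffer marks which pixels belong to a person;
        // estimatedDepthData gives those pixels a distance in meters.
        guard let mask = frame.segmentationBuffer,
              let depth = frame.estimatedDepthData,
              let meters = nearestPersonDistance(mask: mask, depth: depth) else { return }
        print(String(format: "Person detected %.1f m away", meters))
    }

    private func nearestPersonDistance(mask: CVPixelBuffer, depth: CVPixelBuffer) -> Float? {
        CVPixelBufferLockBaseAddress(mask, .readOnly)
        CVPixelBufferLockBaseAddress(depth, .readOnly)
        defer {
            CVPixelBufferUnlockBaseAddress(mask, .readOnly)
            CVPixelBufferUnlockBaseAddress(depth, .readOnly)
        }
        let width = CVPixelBufferGetWidth(mask), height = CVPixelBufferGetHeight(mask)
        guard width == CVPixelBufferGetWidth(depth), height == CVPixelBufferGetHeight(depth),
              let maskBase = CVPixelBufferGetBaseAddress(mask),
              let depthBase = CVPixelBufferGetBaseAddress(depth) else { return nil }
        let maskRowBytes = CVPixelBufferGetBytesPerRow(mask)
        let depthRowBytes = CVPixelBufferGetBytesPerRow(depth)
        var nearest = Float.greatestFiniteMagnitude
        for y in 0..<height {
            let maskRow = maskBase.advanced(by: y * maskRowBytes).assumingMemoryBound(to: UInt8.self)
            let depthRow = depthBase.advanced(by: y * depthRowBytes).assumingMemoryBound(to: Float32.self)
            for x in 0..<width where maskRow[x] == ARFrame.SegmentationClass.person.rawValue {
                nearest = min(nearest, depthRow[x])
            }
        }
        return nearest < .greatestFiniteMagnitude ? nearest : nil
    }
}
```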

How to enable People Detection

Available in iOS 14.2 or later, People Detection is enabled in Settings > Accessibility, where you must turn on both Magnifier and VoiceOver. Once that’s done, just open the Magnifier app and tap the People icon; the app will use your camera to identify any people it sees in view, describe them, and tell you how far away they are.

For social distancing, People Detection also lets you set a distance threshold, so you’ll be warned by two audio tones if someone gets too close. It shares this information verbally, using haptics, or visually on the display, and it can use all three alert forms at once or in whatever combination works for you (you may want haptics only, for example). If it detects multiple people, it tells you so and gives you the distance of the closest person.
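To illustrate the alert side, here is a hedged sketch of how any app could mirror that behavior, pairing a spoken VoiceOver announcement with a haptic pulse once someone crosses a distance threshold. The 2-meter default and the type name are placeholder choices, not Apple’s.

```swift
import UIKit

// Sketch only: speak the distance via VoiceOver and add a haptic
// pulse when a person crosses a configurable threshold.
struct ProximityAlerter {
    var thresholdMeters: Float = 2.0  // placeholder "social distance" value
    private let haptic = UIImpactFeedbackGenerator(style: .heavy)

    func report(distanceMeters: Float) {
        // VoiceOver reads this announcement aloud for the user.
        let message = String(format: "Person %.1f meters away", distanceMeters)
        UIAccessibility.post(notification: .announcement, argument: message)
        if distanceMeters < thresholdMeters {
            haptic.impactOccurred()  // haptic cue that someone is too close
        }
    }
}
```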

AirPods users also get Spatial Audio alerts: if a person is to your left, the alert sounds in your left ear, and if they’re to your right, it sounds in your right.
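That directional trick isn’t exclusive to Apple’s app: AVFoundation exposes 3D audio positioning to any developer. This speculative sketch places a tone to the listener’s left or right with AVAudioEnvironmentNode; the class name and the tone buffer passed in are assumptions.

```swift
import AVFoundation

// Sketch only: position a cue sound to the listener's left or right.
final class DirectionalBeeper {
    private let engine = AVAudioEngine()
    private let environment = AVAudioEnvironmentNode()
    private let player = AVAudioPlayerNode()

    init() throws {
        engine.attach(environment)
        engine.attach(player)
        // Mono sources are required for 3D positioning.
        let mono = AVAudioFormat(standardFormatWithSampleRate: 44_100, channels: 1)
        engine.connect(player, to: environment, format: mono)
        engine.connect(environment, to: engine.mainMixerNode, format: nil)
        try engine.start()
    }

    /// Positive x places the cue to the right; negative x to the left.
    func beep(_ tone: AVAudioPCMBuffer, atX x: Float) {
        player.position = AVAudio3DPoint(x: x, y: 0, z: -1)
        player.scheduleBuffer(tone, at: nil, options: [], completionHandler: nil)
        player.play()
    }
}
```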

A BBC Click report has some excellent footage showing the feature in action.

What about identifying other things?

Apple’s People Detection technology can be seen as a proof of concept. Apple doesn’t want the story to end there and has introduced APIs that let developers use the same technology to build detection apps for other tasks. Think about how solutions of this kind might help visually impaired users find bus stops, safe road crossings, or stairs, for example.
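Apple hasn’t published People Detection’s internals, but the building blocks a developer would reach for are already public. As one speculative example, Vision’s built-in image classifier can label what the camera sees; a real detection app would pair labels like these with depth data, as People Detection does. The function name and confidence cutoff are illustrative.

```swift
import Vision

// Sketch only: label the contents of a camera frame with Vision's
// built-in classifier, keeping only reasonably confident labels.
func describeScene(in image: CGImage) throws -> [String] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    let observations = request.results as? [VNClassificationObservation] ?? []
    return observations
        .filter { $0.confidence > 0.5 }  // illustrative cutoff
        .map { $0.identifier }
}
```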

Why this matters

Accessibility should matter to every enterprise simply because it’s the right thing to do. But the idea that a smartphone can use on-device AI to deliver real-time information about the world around you is quite profound.

It opens up so many opportunities.

  • Enterprises might add layers of useful intelligence to their consumer-focused apps – a paint manufacturer could create a color recognition app that works with this system to identify wall colors and wallpaper, for example (see the sketch after this list).
  • In business, it becomes possible to introduce augmented navigation tools to direct field service engineers in new locations, or to empower visually impaired staff in so many ways.
  • The feature is also a huge step toward the development of voice-first user interfaces.
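To make the paint example above concrete, here is a hypothetical sketch: sample a color from the camera and match it against a catalog. The palette, names, and type are invented stand-ins for a manufacturer’s real data.

```swift
import UIKit

// Sketch only: match a sampled color against a placeholder paint catalog.
struct PaintMatcher {
    // Stand-in palette; a real app would ship the manufacturer's catalog.
    let catalog: [(name: String, color: UIColor)] = [
        ("Warm White", UIColor(red: 0.96, green: 0.94, blue: 0.89, alpha: 1)),
        ("Sage Green", UIColor(red: 0.62, green: 0.70, blue: 0.58, alpha: 1)),
    ]

    /// Returns the name of the catalog entry closest to the sampled color.
    func closestPaint(to sample: UIColor) -> String? {
        var r: CGFloat = 0, g: CGFloat = 0, b: CGFloat = 0, a: CGFloat = 0
        guard sample.getRed(&r, green: &g, blue: &b, alpha: &a) else { return nil }
        return catalog.min { squaredDistance($0.color, r, g, b) < squaredDistance($1.color, r, g, b) }?.name
    }

    private func squaredDistance(_ color: UIColor, _ r: CGFloat, _ g: CGFloat, _ b: CGFloat) -> CGFloat {
        var cr: CGFloat = 0, cg: CGFloat = 0, cb: CGFloat = 0, ca: CGFloat = 0
        color.getRed(&cr, green: &cg, blue: &cb, alpha: &ca)
        return (cr - r) * (cr - r) + (cg - g) * (cg - g) + (cb - b) * (cb - b)
    }
}
```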

Ultimately, enterprise app developers should think about the technologies behind People Detection, as they demonstrate the huge importance of accessibility and voice control across Apple’s platform development plans, particularly around AR spectacles.

 Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
