Apple’s iOS 9 Supercharges Search and Siri With AI

Much as Google is doing with Android, Apple is using AI and enhanced search to make iOS 9 smarter outside of your apps.

Today at WWDC in San Francisco, Apple senior VP of software engineering Craig Federighi demoed what's new in OS X and iOS 9---and it sounded like an echo of Google I/O, where two weeks ago we learned how Google Now was getting major upgrades. Likewise, iOS 9 wants to break down the walls between the core OS and third-party apps through search, voice commands, and AI.

Beyond responding to your taps and swipes, all the apps on your phone will become a source of data for the core OS---and the endgame is to make search smarter across your Apple devices. One way Apple will do this is by unchaining a lot of data from inside your apps and delivering that information to Siri and Spotlight.

The Siri enhancements are both aesthetic (it now looks more like the version of Siri on the Apple Watch) and functional. The voice assistant can search your camera roll by date and location, and you can set contextual reminders based on what you're doing on the phone. It will also have deep integration with Apple Music, letting you request specific songs or most-popular playlists ("play the most popular song from 1997"). Apple also claims the iOS 9 version of Siri is zippier and returns more accurate results.

The new additions to Spotlight search for OS X and iOS 9 are more significant. On the desktop, Spotlight now handles natural-language queries, meaning a search for "folder with photos from WWDC last year" will be understood. (This applies to Mail too---if anyone uses that. So you could search for "emails I ignored last week" and get results.) On mobile, searching for a keyword in Spotlight will now surface results from within apps. On stage, Federighi demoed the new feature by searching for a recipe, and Spotlight delivered a deep link to that recipe inside the Yummly app. (The new iOS 9 search tool also supports unit conversions.)
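That handoff works because the tapped result carries an identifier back into the app, which then routes the user to the matching screen. Here's a rough sketch of what receiving such a deep link could look like in an app delegate; the recipe routing is a placeholder, since how Yummly actually handles it isn't public.

```swift
import UIKit
import CoreSpotlight

class AppDelegate: UIResponder, UIApplicationDelegate {

    // Called when the user taps a Spotlight result that this app indexed.
    func application(_ application: UIApplication,
                     continue userActivity: NSUserActivity,
                     restorationHandler: @escaping ([UIUserActivityRestoring]?) -> Void) -> Bool {
        if userActivity.activityType == CSSearchableItemActionType,
           let recipeID = userActivity.userInfo?[CSSearchableItemActivityIdentifier] as? String {
            // Navigate to the matching content; showRecipe(_:) stands in
            // for whatever routing the app actually uses.
            showRecipe(recipeID)
            return true
        }
        return false
    }

    private func showRecipe(_ identifier: String) {
        // Placeholder: push the recipe screen for this identifier.
        print("Open recipe \(identifier)")
    }
}
```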

And to further all these efforts, Apple also announced that it will open up the Spotlight search API to developers. That will let more third-party apps support this kind of deep-linking from iOS 9 search, meaning Spotlight has the potential to become much more than it is today.
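For developers, making an item show up in Spotlight comes down to describing it and handing it to the system's index. Here's a minimal sketch of how an app might do that with the CoreSpotlight framework in the iOS 9 SDK; the recipe fields, function name, and identifiers are invented for illustration.

```swift
import CoreSpotlight
import MobileCoreServices

// Hypothetical example: index a recipe so it appears in Spotlight results
// and can deep-link back into the app when tapped.
func indexRecipe(title: String, summary: String, recipeID: String) {
    // Attributes describe what Spotlight shows in its result list.
    let attributes = CSSearchableItemAttributeSet(itemContentType: kUTTypeText as String)
    attributes.title = title
    attributes.contentDescription = summary

    // The unique identifier is what the app gets back when a user
    // taps the result, so it can route to the right screen.
    let item = CSSearchableItem(uniqueIdentifier: recipeID,
                                domainIdentifier: "recipes",
                                attributeSet: attributes)

    CSSearchableIndex.default().indexSearchableItems([item]) { error in
        if let error = error {
            print("Spotlight indexing failed: \(error.localizedDescription)")
        }
    }
}
```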

Essentially, these features continue a cross-platform trend for both iOS and Android. Apps are just as important as they've ever been, and the path to making them smarter, better integrated, and more capable is to make our search---in whatever form it takes---work with them. Search should get us into our apps more often, or help us pull information out of them more easily.

The next generation of smartphone operating systems is also trying to reduce the time you spend in apps. Google Now unlocks timely, customized information from within apps and displays it in a dashboard on Android. With its new Proactive assistant and deeper search functionality, Apple is also heading down the path to the post-app era. Using a combination of AI, data mined from within apps, and your own information, our phones will just know what we're looking for. The idea is to get us what we want when we want it---and Apple seems bent on doing exactly that.