Wednesday, August 24, 2016

Apple’s Machine Learning

Steven Levy (via Zac Hall):

Machine learning, my briefers say, is now found all over Apple’s products and services. Apple uses deep learning to detect fraud on the Apple store, to extend battery life between charges on all your devices, and to help it identify the most useful feedback from thousands of reports from its beta testers. Machine learning helps Apple choose news stories for you. It determines whether Apple Watch users are exercising or simply perambulating. It recognizes faces and locations in your photos. It figures out whether you would be better off leaving a weak Wi-Fi signal and switching to the cell network. It even knows what good filmmaking is, enabling Apple to quickly compile your snapshots and videos into a mini-movie at a touch of a button.
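The watch example in that list is at heart a classification problem over motion-sensor data. Purely as an illustrative sketch, and not Apple's implementation (the variance feature and the 0.25 threshold are invented for the example), a crude version might separate a workout from a stroll by how much the accelerometer signal swings over a window:

```swift
import Foundation

enum Activity { case exercising, walking }

/// Illustrative only: classify a window of accelerometer magnitudes
/// (in g) by their variance. The 0.25 g² threshold is an invented
/// placeholder, not a tuned value from any real model.
func classify(magnitudes: [Double]) -> Activity {
    guard !magnitudes.isEmpty else { return .walking }
    let mean = magnitudes.reduce(0, +) / Double(magnitudes.count)
    let variance = magnitudes
        .map { ($0 - mean) * ($0 - mean) }
        .reduce(0, +) / Double(magnitudes.count)
    // Vigorous exercise produces far larger swings in acceleration
    // than a steady walk, so high variance suggests a workout.
    return variance > 0.25 ? .exercising : .walking
}
```

A real model would learn such boundaries from labeled data rather than hard-coding them, which is precisely the shift the article is describing.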

[…]

How big is this brain, the dynamic cache that enables machine learning on the iPhone? Somewhat to my surprise when I asked Apple, it provided the information: about 200 megabytes, depending on how much personal information is stored (it’s always deleting older data). This includes information about app usage, interactions with other people, neural net processing, a speech modeler, and “natural language event modeling.” It also has data used for the neural nets that power object recognition, face recognition, and scene classification.
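The phrase "it's always deleting older data" suggests a size-budgeted store that evicts its oldest entries first. Here is a minimal sketch of that idea; the type, the eviction policy, and the 200 MB budget are assumptions for illustration, not Apple's actual structure:

```swift
import Foundation

/// Illustrative sketch of a byte-budgeted cache that drops its oldest
/// entries to stay under a limit, in the spirit of the ~200 MB
/// on-device store Levy describes. Not Apple's implementation.
struct PersonalDataCache {
    private var entries: [(date: Date, payload: Data)] = []
    let budget: Int  // e.g. 200 * 1024 * 1024 bytes

    private var totalBytes: Int {
        entries.reduce(0) { $0 + $1.payload.count }
    }

    mutating func add(_ payload: Data, at date: Date = Date()) {
        entries.append((date, payload))
        entries.sort { $0.date < $1.date }
        // Evict the oldest entries until we fit the budget again,
        // so the cache size varies with how much is stored recently.
        while totalBytes > budget, !entries.isEmpty {
            entries.removeFirst()
        }
    }
}
```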

And, according to Apple, it’s all done so your preferences, predilections, and peregrinations are private.

[…]

Acero began his career in speech recognition at Apple in the early ’90s, and then spent many years at Microsoft Research. “I loved doing that and published many papers,” he says. “But when Siri came out I said this is a chance to make these deep neural networks all a reality, not something that a hundred people read about, but used by millions.” In other words, he was just the type of scientist Apple was seeking — prioritizing product over publishing.

[…]

“It’s a source of a lot of internal debate,” says Federighi. “We are used to delivering a very well-thought-out, curated experience where we control all the dimensions of how the system is going to interact with the user. When you start training a system based on large data sets of human behavior, [the results that emerge] aren’t necessarily what an Apple designer specified. They are what emerged from the data.”

Update (2016-08-31): Steven Levy (via Nick Heer):

The company’s machine learning talent is shared throughout the entire company, available to product teams who are encouraged to tap it to solve problems and invent features on individual products. “We don’t have a single centralized organization that’s the Temple of ML in Apple,” says Craig Federighi. “We try to keep it close to teams that need to apply it to deliver the right user experience.”

Jordan Kahn:

Following its acquisition of machine learning platform Turi earlier this month, Apple is now growing the team that will serve as the company’s new machine learning division focusing on integrating the tech into new and existing products, 9to5Mac has learned.

Update (2016-09-06): Dr. Drang:

This is what’s most frustrating about Siri and why I find myself yelling at her so often. It has nothing to do with big data or compromised privacy. The problem I posed was ideal for Apple’s everything-on-the-phone strategy. It didn’t even require changing apps. And yet Siri interpreted “get directions to the nearest gas station” without any regard to the data she had in her hands just seconds earlier. For some reason, when I asked for directions, only my position was used in developing the answer. That I was in a car and traveling south—essential information for giving good directions—was ignored.
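Drang's complaint amounts to Siri ranking candidates by distance alone when the phone already knew his heading and speed. A hedged sketch of the behavior he is asking for, using real CoreLocation properties (CLLocation's course and speed) but an invented filtering rule (the 90° cone and 5 m/s cutoff are placeholders for the example):

```swift
import CoreLocation
import Foundation

/// Bearing in degrees from `from` to `to`, clockwise from true north,
/// via the standard great-circle initial-bearing formula.
func bearing(from: CLLocationCoordinate2D, to: CLLocationCoordinate2D) -> Double {
    let lat1 = from.latitude * .pi / 180, lat2 = to.latitude * .pi / 180
    let dLon = (to.longitude - from.longitude) * .pi / 180
    let y = sin(dLon) * cos(lat2)
    let x = cos(lat1) * sin(lat2) - sin(lat1) * cos(lat2) * cos(dLon)
    return (atan2(y, x) * 180 / .pi + 360).truncatingRemainder(dividingBy: 360)
}

/// Illustrative only: keep candidates roughly ahead of a moving user.
/// CLLocation.course is -1 when the heading is unknown, so we fall
/// back to the unfiltered list when stationary or heading-less.
func candidatesAhead(_ candidates: [CLLocationCoordinate2D],
                     of location: CLLocation) -> [CLLocationCoordinate2D] {
    guard location.course >= 0, location.speed > 5 else { return candidates }
    return candidates.filter { dest in
        var delta = abs(bearing(from: location.coordinate, to: dest) - location.course)
        if delta > 180 { delta = 360 - delta }
        return delta <= 90  // within a quarter-turn of the direction of travel
    }
}
```

The point is not this particular heuristic but that the course and speed were already in hand: a gas station a mile behind a southbound driver should not outrank one two miles ahead.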
