Wednesday, August 24, 2016

Apple’s Machine Learning

Steven Levy (via Zac Hall):

Machine learning, my briefers say, is now found all over Apple’s products and services. Apple uses deep learning to detect fraud on the Apple store, to extend battery life between charges on all your devices, and to help it identify the most useful feedback from thousands of reports from its beta testers. Machine learning helps Apple choose news stories for you. It determines whether Apple Watch users are exercising or simply perambulating. It recognizes faces and locations in your photos. It figures out whether you would be better off leaving a weak Wi-Fi signal and switching to the cell network. It even knows what good filmmaking is, enabling Apple to quickly compile your snapshots and videos into a mini-movie at a touch of a button.


How big is this brain, the dynamic cache that enables machine learning on the iPhone? Somewhat to my surprise when I asked Apple, it provided the information: about 200 megabytes, depending on how much personal information is stored (it’s always deleting older data). This includes information about app usage, interactions with other people, neural net processing, a speech modeler, and “natural language event modeling.” It also has data used for the neural nets that power object recognition, face recognition, and scene classification.

And, according to Apple, it’s all done so your preferences, predilections, and peregrinations are private.


Acero began his career in speech recognition at Apple in the early ’90s, and then spent many years at Microsoft Research. “I loved doing that and published many papers,” he says. “But when Siri came out I said this is a chance to make these deep neural networks all a reality, not something that a hundred people read about, but used by millions.” In other words, he was just the type of scientist Apple was seeking — prioritizing product over publishing.


“It’s a source of a lot of internal debate,” says Federighi. “We are used to delivering a very well-thought-out, curated experience where we control all the dimensions of how the system is going to interact with the user. When you start training a system based on large data sets of human behavior, [the results that emerge] aren’t necessarily what an Apple designer specified. They are what emerged from the data.”

Update (2016-08-31): Steven Levy (via Nick Heer):

The company’s machine learning talent is shared throughout the entire company, available to product teams who are encouraged to tap it to solve problems and invent features on individual products. “We don’t have a single centralized organization that’s the Temple of ML in Apple,” says Craig Federighi. “We try to keep it close to teams that need to apply it to deliver the right user experience.”

Jordan Kahn:

Following its acquisition of machine learning platform Turi earlier this month, Apple is now growing the team that will serve as the company’s new machine learning division focusing on integrating the tech into new and existing products, 9to5Mac has learned.

Update (2016-09-06): Dr. Drang:

This is what’s most frustrating about Siri and why I find myself yelling at her so often. It has nothing to do with big data or compromised privacy. The problem I posed was ideal for Apple’s everything-on-the-phone strategy. It didn’t even require changing apps. And yet Siri interpreted “get directions to the nearest gas station” without any regard to the data she had in her hands just seconds earlier. For some reason, when I asked for directions, only my position was used in developing the answer. That I was in a car and traveling south—essential information for giving good directions—was ignored.

Pinterest Acquires Instapaper

Instapaper (Hacker News, Slashdot):

Today, we’re excited to announce that Instapaper is joining Pinterest. In the three years since betaworks acquired Instapaper from Marco Arment, we’ve completely rewritten our backend, overhauled our mobile and web clients, improved parsing and search, and introduced tons of great features like highlights, text-to-speech, and speed reading to the product.


For you, the Instapaper end user and customer, nothing changes. The Instapaper team will be moving from betaworks in New York City to Pinterest’s headquarters in San Francisco, and we’ll continue to make Instapaper a great place to save and read articles.

Benjamin Mayo:

Hidden at the bottom of this announcement is a ‘sunsetting’ of Instaparser, a paid API endpoint for developers to take advantage of Instapaper’s intelligent article parsing. The service is shutting down in November. It launched in April and is now being shuttered in the same year it debuted. This is pretty crappy, especially given that Instaparser was a paid service charging hundreds of dollars per month for an API key.

Brian Donohue:

We will be using the signals from Instapaper to power some news-based discovery within Pinterest; however, those signals will be used in aggregate, in a manner similar to how we use them for the Instapaper Daily and Instapaper Weekly offerings.

Nick Heer:

I’m worried about this. I’m a long-time Instapaper user and customer, and its features — particularly highlights and notes — are essential to my reading and research habits.

Update (2016-08-24): Here’s Arment’s post on selling Instapaper to Betaworks (via McCloud).

Steve Jobs on Graphics Performance

John Gruber:

We were talking about scrolling performance, and how the iPhone 4 had to draw 4x the pixels to get 2x the resolution, and still do it smoothly. This, at a time when Android scrolling performance was just awful. I asked him how Apple could be so far ahead. He said “John, nobody else gives a shit.”

The other interesting tidbit from that conversation was that I said something to the effect of “You guys have been working on graphics performance ever since 2001” or something like that, alluding to Mac OS X 10.0.

He immediately jumped back at me with “No, we’ve been killing ourselves over graphics since 1989,” alluding to NeXTstep 1.0.