Wednesday, November 1, 2017

Photos Machine Learning and Trusting Apple

Nick Heer:

This realization went viral; Christine Teigen posted about it, too. And, arguably, rightfully so — if you found out that your phone was, somehow, making it easier for you to search semi-nude photos, you might find that creepy, and you’d probably want to warn a lot of people about that.

[…]

There’s something else, too, that’s bothering me about this: I wonder if most people — and, let’s face it, “people” is too broad a term; “women” is much more accurate — want to search for photos of bras in their image library. That is, even if this capability and the privacy protections in place had been effectively communicated, is this something that users want catalogued?

I don’t know how many women are on Apple’s machine learning teams specifically, but just 23% of its technical employees are women. Judging by Twitter users’ incredulity, this seems like something many women may not actually want, and I wonder whether a higher percentage of women in technical roles would have led to the object recognition categories being filtered more carefully.

One issue is that most people probably don’t understand that Apple is not looking at their photos (though clearly it could). Apple does try to communicate things like this, and I’m sure it would like to do so better, but it’s not clear how.
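For context, here’s a minimal sketch of what on-device image classification looks like using Apple’s public Vision framework. This is only an illustration of the principle; Photos’ own categorization is a private pipeline and doesn’t necessarily use this API. The point is that the photo is analyzed locally and only the resulting labels get indexed, so nothing has to leave the device. The 0.5 confidence cutoff is arbitrary.

    import Foundation
    import Vision

    // Sketch: classify an image entirely on-device with Vision.
    // No image data is sent anywhere; the request runs locally.
    func labels(forImageAt url: URL) throws -> [String] {
        let request = VNClassifyImageRequest()
        let handler = VNImageRequestHandler(url: url, options: [:])
        try handler.perform([request])

        let observations = request.results as? [VNClassificationObservation] ?? []
        // Keep only reasonably confident labels (threshold chosen arbitrarily).
        return observations
            .filter { $0.confidence > 0.5 }
            .map { $0.identifier }
    }

A search feature built this way only ever consults the locally stored labels, which is presumably why Apple can say it isn’t “looking at” anyone’s photos, even though the capability to do so clearly exists server-side.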

The other issue is that I’m sure there are many groups of people who don’t want certain things cataloged, and for many of those cases there are other groups who would benefit from that type of searching. Is it possible to make everyone happy? I can’t imagine Apple adding detailed preferences for something like this. My guess is that it tries to pick an intersection of restrictions that’s suitable for the mass-market, and if you have more specialized needs you’ll have to find another photos ecosystem.


I'm a little bit confused by this discussion. Clearly, there are situations where it is useful to be able to search for these terms. And this searches your private collection on your private phone, so if you don't want to search for these terms, you can just not search for them. If you don't want your phone to find nude pictures, you can also not have nude pictures on your phone; that's an option, too.

Is there something I'm missing? Is this just an American thing, where everything vaguely related to sex is inherently suspicious?

@Lukas I don’t really understand it, either. People don’t seem to have a problem with Google letting you search for these things. It will even auto-complete “brassiere”. So I guess it must be related to this being your private data.

Lukas,
Yes, Americans are all twisted up in knots over sex and violence. The former is bad, the latter is good. I don't get it either. Said as an American born abroad but one who lived most of his childhood and all of his adult life in the States.

