Monday, August 22, 2022

Google Account Deleted Due to CSAM False Positive

Kashmir Hill (Hacker News):

With help from the photos, the doctor diagnosed the issue and prescribed antibiotics, which quickly cleared it up. But the episode left Mark with a much larger problem, one that would cost him more than a decade of contacts, emails and photos, and make him the target of a police investigation. Mark, who asked to be identified only by his first name for fear of potential reputational harm, had been caught in an algorithmic net designed to snare people exchanging child sexual abuse material.

[…]

Two days after taking the photos of his son, Mark’s phone made a blooping notification noise: His account had been disabled because of “harmful content” that was “a severe violation of Google’s policies and might be illegal.”

[…]

A few days after Mark filed the appeal, Google responded that it would not reinstate the account, with no further explanation.

[…]

CyberTipline staff members add any new abusive images to the hashed database that is shared with technology companies for scanning purposes. When Mark’s wife learned this, she deleted the photos Mark had taken of their son from her iPhone, for fear Apple might flag her account.

The police determined that no crime had occurred, but Google permanently deleted his account anyway. Apparently, the police now have the only copy of his data.

I don’t really want to use iCloud Photo Library, but I have it enabled now because Image Capture doesn’t work wirelessly, and recent versions have been buggy. I guess the proper way to take photos for a doctor would be to temporarily turn off iCloud Photo Library or to use a third-party camera app that doesn’t save to the camera roll. But I bet nearly every iPhone user has some photos—be they medical, sexual, or of documents—that they would like to mark as private (not just hidden). They should still be backed up but protected with an extra password or something. I don’t know how to prevent this from being abused to store actual CSAM, though.
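
For what it's worth, a third-party camera app can keep such photos out of the camera roll (and thus out of iCloud Photo Library) by writing them into its own sandbox. Here's a minimal sketch in Swift, assuming you already have a UIImage from AVFoundation or UIImagePickerController; the helper name is mine, and the data-protection option stands in for the "extra password" idea:

```swift
import UIKit

// Minimal sketch (hypothetical helper): store a captured photo in the
// app's Documents directory, which iCloud Photo Library does not sync,
// instead of saving it to the Photos camera roll.
func savePrivately(_ image: UIImage, named name: String) throws -> URL {
    guard let data = image.jpegData(compressionQuality: 0.9) else {
        throw CocoaError(.fileWriteUnknown)
    }
    let docs = try FileManager.default.url(for: .documentDirectory,
                                           in: .userDomainMask,
                                           appropriateFor: nil,
                                           create: true)
    let url = docs.appendingPathComponent(name)
    // .completeFileProtection keeps the file encrypted whenever the
    // device is locked; never calling UIImageWriteToSavedPhotosAlbum
    // or PHPhotoLibrary keeps it out of Photos entirely.
    try data.write(to: url, options: [.atomic, .completeFileProtection])
    return url
}
```

The tradeoff, of course, is that such photos are no longer backed up unless the app syncs them somewhere itself.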

Meek Geek:

If you are accused by Google of doing something they don’t like and have your account blocked, there is no easy way to get human support on the other side to review the issue.

Kyle Howells:

One of the things I’ve been doing the last few years is trying to slowly remove Google as a single point of failure in my life.

Spreading out my online life over more companies so no one company can ruin my life at the flick of a switch.

There’s no real way to remove Apple if you use an iPhone.

Update (2022-08-26): John Gruber:

To my knowledge, no innocent person has been falsely flagged and investigated like Mark using the NCMEC fingerprint database. It could happen. But I don’t think it has. It seems uncommon for an innocent person like Mark to be flagged and investigated by the second method, but as Hill reports, we have no way of knowing how many like Mark there are who’ve been wrongly flagged, because for obvious reasons they’re unlikely to go public with their stories.

[…]

“Avoid uploading to the cloud” is difficult advice for most people to follow. Just about everyone uses their phone as their camera, and most phones from the last decade or so — iPhones and Android alike — upload photos to the cloud automatically. When on Wi-Fi — like almost everyone is at home — the uploads to the cloud are often nearly instantaneous.

[…]

The on-device vs. on-server debate is legitimate and worth having. But I think it ought to be far less controversial than Google’s already-in-place system of trying to identify CSAM that isn’t in the NCMEC known database.
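
To make the first of those two methods concrete, here's a minimal sketch of fingerprint-database matching in Swift. It's illustrative only: real systems use perceptual fingerprints such as PhotoDNA or NeuralHash, which survive resizing and re-encoding, whereas SHA-256 is just a convenient stand-in here, and the fingerprint set is an empty placeholder:

```swift
import Foundation
import CryptoKit

// Hypothetical stand-in for the database of fingerprints of known
// abusive images that NCMEC shares with providers (empty placeholder).
let knownFingerprints: Set<String> = []

// Sketch of the first method: an upload matches only if its fingerprint
// is already in the database, so a brand-new photo of one's own child
// cannot trigger it. It was the second method — classifying
// never-before-seen images — that flagged Mark.
func matchesKnownDatabase(_ imageData: Data) -> Bool {
    let digest = SHA256.hash(data: imageData)
    let fingerprint = digest.map { String(format: "%02x", $0) }.joined()
    return knownFingerprints.contains(fingerprint)
}
```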

See also: Dithering, The Talk Show.

Update (2022-10-10): See also: Hacker News, Ben Thompson, Nick Heer.

Is an account deletion linked to potential child pornography such a frequent occurrence that Google can't handle it correctly? They take pictures of every street in the world, but they can't even rely on the police, who did all the work and cleared the person. The system is NOT working as intended.

It's not quite as dramatic, but here is something somewhat similar that happened to a family member, and it is also maddening. His phone was stolen, and we reported it to the police. They then put the IMEI number into some kind of official blacklist, which means all mobile providers in France (and maybe Europe?) will disable communications when they detect the phone on their network. That sounds great! Later, we recovered the phone. More great news, right? Except it's been impossible so far to get the phone unlisted, despite incessant phone calls to support, trips to the police and to an Apple Store, letters to the mobile provider with the police report for the recovery, etc. (it's been a few months!!). So the phone is basically unusable as a phone... The worst part is that the mobile provider knows that a blacklisted phone is active on their network; they even have the name of the person trying to use it (my family member!), but apparently they don't care, as long as we pay the bill every month. I almost wish the police would show up and arrest us every time we try to use the phone, because at least that would mean some part of this IMEI blacklisting scheme is useful...

Well, that's terrible. I have pictures going back to 1995 in Google Photos. I just took old hard drives and uploaded pretty much all the image files on them to Google Photos. Sorting them by hand was all but impossible, but Google Photos did a pretty good job of recognizing what the photos contained, which made it possible to go back and find old pictures based on who or what was in them.

Problem is, this includes all kinds of things: scans of X-rays from when I had broken bones, images from browser caches that were accidentally swept up, attachments from emails people sent me, photos of old girlfriends, screenshots from video games, and so on. It never occurred to me to worry about it too much, but now I wonder exactly how bad an idea it was to just dump everything on there.

Probably a very bad idea.
