Monday, January 2, 2023

Google Changes Appeals Process for Suspected Child Abuse Images

Kashmir Hill (Mastodon, Hacker News):

Google refused to reconsider the decision in August, saying her YouTube account contained harmful content that might be illegal. It took her weeks to discover what had happened: Her 9-year-old eventually confessed that he had used an old smartphone of hers to upload a YouTube Short of himself dancing around naked.

[…]

Google has billions of users. Last year, it disabled more than 270,000 accounts for violating its rules against child sexual abuse material. In the first half of this year, it disabled more than it did in all of 2021.

[…]

It took four months for the mother in Colorado, who asked that her name not be used to protect her son’s privacy, to get her account back. Google reinstated it after The Times brought the case to the company’s attention. […] Google did not tell the woman that the account was active again.

[…]

Jason Scott, a digital archivist who wrote a memorably profane blog post in 2009 warning people not to trust the cloud, said companies should be legally obligated to give users their data, even when an account was closed for rule violations.

It remains to be seen how well the new process works.

1 Comment

While I totally agree with the sentiment of "companies should be legally obligated to give users their data, even when an account was closed for rule violations," the problem is that doing so would be illegal if CSAM is involved: the laws around CSAM are so strict that it's legally tenuous even to have your employees look at suspected CSAM to verify whether it is or isn't. How do companies deal with that? I have no idea.
