Friday, November 12, 2021

Revised Messages Communication Safety Feature in iOS 15.2

Juli Clover:

Communication Safety is a Family Sharing feature that can be enabled by parents, and it is opt-in rather than activated by default. When turned on, the Messages app is able to detect nudity in images that are sent or received by children. If a child receives or attempts to send a photo with nudity, the image will be blurred and the child will be warned about the content, told it’s okay not to view the photo, and offered resources to contact someone they trust for help.

When Communication Safety was first announced, Apple said that parents of children under the age of 13 had the option to receive a notification if the child viewed a nude image in Messages, but after receiving feedback, Apple has removed this feature. Apple now says that no notifications are sent to parents.

2 Comments

“... the Messages app is able to detect nudity in images...”

How does it do that? How is this similar to or different from CSAM detection? Is there a secret list of already-known pictures containing “nudity” instead of CSAM?

There are literally zero details about how they actually detect these images, and I’m very concerned that everyone who raised red flags about the whole CSAM rollout is just accepting this update without questioning how it is actually going to work. How can nobody see this is just a stepping stone to the full proposed CSAM rollout?! If they use the same underlying detection engine, we’ve already lost the game. This whole “opt-in” thing is just a smokescreen.

@Kansam My understanding is that it’s totally different from the CSAM stuff and uses on-device machine learning without transmitting anything.
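Apple has not published the implementation of either system, but the distinction the commenter draws can be sketched conceptually: CSAM detection compares a fingerprint of each image against a fixed list of hashes of already-known images, whereas a Communication Safety-style approach scores each image with an on-device classifier and matches nothing against a list. The sketch below is purely illustrative; every function in it is hypothetical (plain SHA-256 stands in for Apple's NeuralHash perceptual hash, and a trivial fake score stands in for a trained ML model).

```python
import hashlib

# --- Approach 1: CSAM-style detection (hypothetical sketch) ---
# Match an image's fingerprint against a list of hashes of known images.
# Apple's real system used a perceptual hash (NeuralHash) so that
# near-duplicates still match; exact SHA-256 stands in here.
KNOWN_HASHES = {hashlib.sha256(b"known-bad-image").hexdigest()}

def matches_known_list(image_bytes: bytes) -> bool:
    """Flag only images whose hash appears on the known list."""
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

# --- Approach 2: Communication Safety-style detection (hypothetical) ---
# An on-device classifier scores each image independently; there is no
# list lookup, and nothing needs to leave the device.
def dummy_model_score(image_bytes: bytes) -> float:
    """Placeholder for a trained neural network's nudity score."""
    return 0.9 if b"nudity-marker" in image_bytes else 0.1

def classifier_flags_image(image_bytes: bytes, threshold: float = 0.5) -> bool:
    """Flag any image the local model scores above the threshold."""
    return dummy_model_score(image_bytes) > threshold
```

The key difference the sketch shows: the first approach can only flag images someone has already put on a list, while the second can flag a never-before-seen image, which is why it needs no central database and can run entirely on-device.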

