Proposed EU Chat Control
Patrick Breyer (via Hacker News):
The highly controversial indiscriminate child sexual abuse regulation (so-called chat control) could still be endorsed by EU governments after all, as France could give up its previous veto. This is reported by Euractiv and confirmed by internal documents. France considers the new “upload moderation” proposal a viable option in principle.
[…]
[Users] of apps and services with chat functions are to be asked whether they accept the indiscriminate and error-prone scanning and possibly reporting of their privately shared images, photos and videos. Previously unknown images and videos are also to be scrutinised using “artificial intelligence” technology. If a user refuses the scanning, they would be blocked from sending or receiving images, photos, videos and links (Article 10). End-to-end encrypted services such as WhatsApp or Signal would have to implement the automated searches “prior to transmission” of a message (so-called client-side scanning, Article 10a).
[…]
Probably as a concession to France, the chats of employees of security authorities and the military are also to be exempted from chat control.
This is kind of like what Apple was planning to do with iMessage, using AI rather than just checking for known images, but:
- Images not specifically approved cannot be sent, whereas Apple would inform the recipient and let them decide.
- It sounds like images flagged by the AI, though not sent to the recipient, are sent to the authorities.
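To make the mechanics concrete, here is a minimal sketch of the flow Articles 10 and 10a describe, with hypothetical names throughout (nothing here reflects any shipping client): the scan runs on the device before encryption, refusing it disables media entirely, and a flagged match can be reported without ever reaching the recipient.

```swift
import Foundation

// Hypothetical sketch of the proposed Article 10/10a flow.
// ScanVerdict, MediaScanner, and ChatClient are illustrative names,
// not any real API.

enum ScanVerdict {
    case clean
    case flagged
}

protocol MediaScanner {
    // Stands in for the hash matching and "AI" classifiers the
    // proposal would require.
    func scan(_ media: Data) -> ScanVerdict
}

struct ChatClient {
    // nil models a user who declined "upload moderation".
    let scanner: MediaScanner?

    func send(_ media: Data, encryptAndTransmit: (Data) -> Void) {
        guard let scanner else {
            // Article 10: refusing the scan blocks sending images,
            // photos, videos, and links entirely.
            print("Media sending disabled.")
            return
        }
        switch scanner.scan(media) {
        case .clean:
            // Encryption only ever sees media that passed the scan.
            encryptAndTransmit(media)
        case .flagged:
            // Article 10a: the match happens client-side, "prior to
            // transmission"; the recipient never sees the message,
            // and a report can be filed instead.
            print("Media withheld; report generated.")
        }
    }
}
```

The point of the sketch is the ordering: the scan sits upstream of the encryption step, so “end-to-end” no longer describes the whole path.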
Signal strongly opposes this proposal.
Let there be no doubt: we will leave the EU market rather than undermine our privacy guarantees.
This proposal, if passed and enforced against us, would require us to make this choice.
It’s surveillance wine in safety bottles.
Previously:
- Criticism of Signal
- Google Changes Appeals Process for Suspected Child Abuse Images
- Apple Abandons CSAM Scanning
- U.K. Proposal to Weaken Messaging Security
- Google Account Deleted Due to CSAM False Positive
- Google Drive Flags .DS_Store Files for Copyright Infringement
- Google Drive Incorrectly Flags File for Copyright Infringement
- Apple Delays Child Safety Features
- Scanning iCloud Photos for Child Sexual Abuse
- ProtonMail Opposes EU Golden Key
Update (2024-06-18): Alexander Martin:
Meredith Whittaker — president of the Signal Foundation, which operates the end-to-end encrypted (E2EE) messaging app of the same name — criticized on Monday the latest European Union proposals for requiring messaging services to check if users were sharing child abuse material.
Her complaint follows the publication of an internal document from the European Council — the EU body that sets the bloc’s political direction — revealing its position as of the end of May on a proposed regulation to “prevent and combat child sexual abuse.”
The European Council has taken a proposal to force mandatory scanning of all photos and videos sent through private messengers (including encrypted messengers like Signal) and has rebranded it as “upload moderation.” The implication is that it’s voluntary when it’s not.
If you choose not to submit your deeply private personal photos to be scanned for criminal activity, you won’t be allowed to send images or videos at all. It’s coercion into a mass surveillance regime, with some branding.
And if your reaction is “oh well at least it’s just images and not the private text messages themselves,” understand that this is a temporary climbdown from the original proposal that required AI scanning of text messages as well. This proposal is an obvious stepping stone.
It’s not clear how this can be done safely for encrypted messengers, or if it can be done at all. None of the people behind this proposal have any idea. Their plan appears to be: get the law in place and then it won’t really matter.
Update (2024-06-20): Patrick Breyer (via Hacker News):
This is what the current proposal actually entails[…]
Meredith Whittaker (via Hacker News):
Official statement: the new EU chat controls proposal for mass scanning is the same old surveillance with new branding.
Whether you call it a backdoor, a front door, or “upload moderation” it undermines encryption & creates significant vulnerabilities.
“The fact that the EU interior ministers want to exempt police officers, soldiers, intelligence officers and even themselves from chat control scanning proves that they know exactly just how unreliable and dangerous the snooping algorithms are that they want to unleash on us citizens.”
People think ChatControl is about specific crimes. No, that’s not what’s at stake. What’s being made is an architectural decision about how private messaging systems work: if it passes, these systems will by law be wired for mass surveillance. This can be used for any purpose.
Once you build an architecture where every interpersonal private message goes through a scanning filter, one that reports to the police, the only question in the future is: what software updates do you push to that filter?
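That architectural point is easy to sketch in code. Here is a minimal illustration, with hypothetical names and SHA-256 standing in for the perceptual hashes and classifiers real proposals contemplate: the reporting pipeline is permanent, and what it scans for is just whatever the latest rule update says.

```swift
import Foundation
import CryptoKit

// Illustrative only; FilterRules and ScanningPipeline are invented names.

struct FilterRules {
    // Today: digests of known CSAM. Tomorrow: whatever the next
    // update ships.
    let blockedDigests: Set<String>
}

final class ScanningPipeline {
    private var rules: FilterRules

    init(rules: FilterRules) { self.rules = rules }

    // The surveillance plumbing stays; only the criteria change,
    // via an ordinary software update.
    func applyUpdate(_ newRules: FilterRules) { rules = newRules }

    func send(_ message: Data, transmit: (Data) -> Void) {
        // SHA-256 is a stand-in; deployed systems would use perceptual
        // hashing or ML classifiers rather than exact digests.
        let digest = SHA256.hash(data: message)
            .map { String(format: "%02x", $0) }
            .joined()
        if rules.blockedDigests.contains(digest) {
            report(message) // goes to the authorities, not the recipient
        } else {
            transmit(message)
        }
    }

    private func report(_ message: Data) {
        // Placeholder for the mandated reporting channel.
    }
}
```

Nothing in the pipeline constrains what applyUpdate ships next, which is exactly the software-update question raised above.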
Alex Ivanovs (via Hacker News):
The EU Council has decided to withdraw the vote on the contentious Chat Control plan proposed by Belgium, which currently holds the Council’s rotating presidency.
Update (2024-06-24): Nick Heer:
That is a truncated history of this piece of legislation: regulators want platform operators to detect and report CSAM; platforms and experts say that will conflict with security and privacy promises, even if media is scanned prior to encryption. This proposal may be specific to the E.U., but you can find similar plans to curtail or invalidate end-to-end encryption around the world[…]
Update (2024-09-17): Sebastiaan de With:
With Thierry Breton’s resignation, the European Commission loses one of the fiercest proponents of ChatControl (its longtime effort to weaken or eliminate secure online communications) and of regulatory overreach generally.
As a European I am genuinely relieved to see him gone, though I still fear there are a lot of powerful forces trying to push it through.
Update (2024-09-19): Nick Heer:
This [new proposal] is a similar effort to the one postponed earlier this year. The proposal (PDF) has several changes, but it still appears to poke holes in end-to-end encryption and to require providers to detect possible known CSAM before it is sent. A noble effort, absolutely, but also one which fundamentally upsets the privacy of one-on-one communications to restrict its abuse by a few.