Friday, July 7, 2023

French Bill to Allow Police to Commandeer Phones

Tosin Ajuwon (via Hacker News, 2, 3):

A bill that would allow police in France to spy on suspects by remotely activating the cameras, microphones, and GPS of their phones has been passed.

The bill allows the geolocation of crime suspects, covering not only phones but also other devices such as laptops, cars, and connected devices. Those devices could likewise be remotely activated to record sound and images of people suspected of terror offences, as well as delinquency and organised crime.

I hope that Tim Cook will have a statement about whether this is possible with Apple devices. Has Apple been asked to assist, or has it been done via exploits? Edward Snowden has mentioned stuff like this before, but I don’t recall seeing specifics about which devices were affected.

Google and Meta have proactively announced that they will block links to Canadian news sources over a link tax. Would Apple go to bat for privacy?



The proposed law is terrible, and our civil liberties groups are fighting it. But it's about making it legal for the cops to hack people's phones, which is currently illegal in France. I wouldn't expect anything from Apple about this, as that sort of thing is accepted in the US.

Surely it will be done using exploits?

Old Unix Geek

@Kristoffer: yes, think Vault 7 (the trove of CIA exploits that Assange is most likely in prison for releasing).

Data collection like this is already possible (see Snowden's NSA revelations). But it can't be used in a court of law. Hence the whole "parallel construction" thing.

This will make such recordings directly usable in court. So your conversation with a friend can be held against you.

It's worse than 1984 or the USSR. In 1984, the "TV" would listen to you, but you could go somewhere else. Similarly, in the USSR, your apartment might be bugged, but you could go for a walk with a friend in a park. Here, you are listened to in any public space where other people's devices might hear you. Using location data, it should be possible to record multiple audio streams from nearby devices, then use independent component analysis to extract each conversation. Thus what you say will be heard, even if neither you nor your friend is carrying a phone. The only actual security will be to go talk to your friend in a wilderness far from other people.
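A minimal sketch of the blind-source-separation idea, using FastICA from scikit-learn on synthetic signals. The two-phone mixing setup is invented for illustration; real eavesdropping would also require time alignment, reverberation handling, and far more noise robustness.

```python
# Sketch: several nearby devices each record a different mixture of the same
# conversations; independent component analysis (ICA) recovers the sources.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
t = np.linspace(0, 8, 8000)

# Two "conversations" (stand-in waveforms for two speakers).
source_a = np.sin(2 * np.pi * 1.0 * t)
source_b = np.sign(np.sin(2 * np.pi * 0.3 * t))
sources = np.c_[source_a, source_b]
sources += 0.05 * rng.normal(size=sources.shape)  # ambient noise

# Two nearby phones, each hearing a different mix depending on distance.
mixing = np.array([[1.0, 0.5],
                   [0.4, 1.0]])
recordings = sources @ mixing.T  # what the compromised microphones capture

# ICA separates the recordings back into independent components.
ica = FastICA(n_components=2, random_state=0)
recovered = ica.fit_transform(recordings)
print(recovered.shape)  # (8000, 2): one column per recovered conversation
```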

Old Unix Geek

@Michael Tsai: Vault 7 targeted iPhones and OS X too.

Old Unix Geek

If Apple gave a shit, they would add a switch to the outside of their devices that hard-disconnected the cameras, microphones, and location services, so that even if the phone were hacked, it still would not be able to access those features.

@Unix: Agree with the physical disconnect switch. I wish Silent Mode disconnected all sensors, too. There should be a hard switch on all MacBooks as well.

Since Apple’s privacy stance is largely marketing, I suspect we won’t get our privacy switches until Apple is forced by regulation, as with the upcoming EU rule:

> EU: Smartphones Must Have User-Replaceable Batteries by 2027

Watch: “Privacy” Apple will make a battery swap require tooling.

Ghost Quartz

> I wish Silent Mode disconnected all sensors, too.

Err, but this would make Silent Mode basically useless for people who leave it on most or all the time. I’d need to remember to toggle the Silent Mode switch every time I receive a call, want to take a photo, unlock with Face ID, etc.

Since this is meant to be a physical disconnect, you can’t simply let users opt out via a software preference.

I didn't think about how other people's phones can be used to eavesdrop.

No more secrets

Old Unix Geek

@Ghost Quartz: you could have a switch that software can turn off, but not on (e.g. an electromagnet that flips the switch to the off position).

The phone software could remind you to turn it on manually. Then once the call is over, the microphone would be turned off again. People would get used to that pretty quickly.
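
A toy sketch of that asymmetric design, assuming a hypothetical kill-switch driver: the only software-reachable operation is "off", and turning the sensor back on requires the physical switch.

```python
# Sketch: software may only ever disable the sensor; there is deliberately
# no code path that can re-enable it. (Hypothetical class for illustration.)
class SensorKillSwitch:
    def __init__(self) -> None:
        self._connected = False  # ships disconnected

    def software_disable(self) -> None:
        """The OS (e.g. via an electromagnet) may force the switch off."""
        self._connected = False

    def user_flip_on(self) -> None:
        """Only a physical action by the user reconnects the sensor."""
        self._connected = True

    @property
    def connected(self) -> bool:
        return self._connected


switch = SensorKillSwitch()
switch.user_flip_on()      # user enables the mic to take a call
switch.software_disable()  # once the call ends, software flips it back off
print(switch.connected)    # False
```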

> I didn't think about how other people's phones can be used to eavesdrop.

It follows the weakest-link principle.

I want to surface security requirements to my recipients and see theirs. How many people use encrypted messaging, then gate it behind a weak 6-digit unlock or Face ID? How many include your conversation in their cloud backups? We should have a “privacy label” for every recipient that gives me an idea of how they could compromise our conversation, or force them to meet my requirements before starting a conversation.

Think of conversations rated red/yellow/green according to security best practices. A red box tells weak-link users they need better security practices, or socially nudges them the way green-text Android users get nudged. Or, at the very least, in a group message it informs everyone to “treat this like a publicly broadcast conversation”.
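
A rough sketch of how such a weakest-link label might be computed; the factor names, thresholds, and ratings here are invented for illustration, not drawn from any real messaging app.

```python
# Sketch: score a few security factors per recipient, then rate the whole
# conversation by its weakest participant.
from dataclasses import dataclass

@dataclass
class RecipientPosture:
    name: str
    end_to_end_encrypted: bool
    strong_unlock: bool            # e.g. long passcode rather than 6 digits
    cloud_backup_of_messages: bool

def rate(p: RecipientPosture) -> str:
    if not p.end_to_end_encrypted:
        return "red"
    if p.cloud_backup_of_messages or not p.strong_unlock:
        return "yellow"
    return "green"

def conversation_label(participants: list[RecipientPosture]) -> str:
    """The conversation is only as private as its weakest link."""
    order = {"red": 0, "yellow": 1, "green": 2}
    return min((rate(p) for p in participants), key=order.get)

group = [
    RecipientPosture("Alice", True, True, False),
    RecipientPosture("Bob", True, False, True),  # weak unlock + cloud backup
]
print(conversation_label(group))  # "yellow"
```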

The harder stuff is behavioral.

How many power off their phones before surrendering them in security lines?
How many use their phones in view of public cameras?
How many connect to public chargers at airports, etc.?
Etc.

All this mass surveillance stuff is a serious threat, but that’s the realm of security agencies and has nothing to do with this law. It doesn’t say you can hack people near the suspect, does it? So, let’s focus on the actual problems of this law and not delve into broad (and vague) government overreach concerns.

What I think is fundamentally wrong is for any government to hoard vulnerabilities, because that puts its citizens at risk, even more so if the hacking it does adds more vulnerabilities (as any software will). So that’s pretty much the main thing wrong here.

Spying on suspects and recording private conversations on an individual basis, after a judge signs off on it, is in itself pretty standard and has been done for decades. That part is not novel and not pushing the line.

(Not that the hacking part is really novel. E.g. neighboring Germany introduced similar police powers a few years ago.)
