Privacy of Windows Copilot+ Recall
Kevin Beaumont (via Stephen Hackett):
Microsoft told media outlets a hacker cannot exfiltrate Copilot+ Recall activity remotely.
Reality: how do you think hackers will exfiltrate this plain text database of everything the user has ever viewed on their PC? Very easily, I have it automated.
[…]
Microsoft are going to deliberately set cybersecurity back a decade & endanger customers by empowering low-level criminals.
Every few seconds, screenshots are taken. These are automatically OCR’d by Azure AI, running on your device, and written into an SQLite database in the user’s folder.
This database file has a record of everything you’ve ever viewed on your PC in plain text.
[…]
In fact, you don’t even need to be an admin to read the database — more on that in a later blog.
[…]
Recall enables threat actors to automate scraping everything you’ve ever looked at within seconds.
While testing this with an off-the-shelf infostealer, I used Microsoft Defender for Endpoint — which detected the infostealer — but by the time the automated remediation kicked in (which took over ten minutes), my Recall data was already long gone.
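To make concrete how little the attack requires, here is a minimal sketch of reading the database, assuming the ukg.db location under AppData that early research into the preview builds reported; the path, and whatever schema the rows use, are assumptions that could change before release. The point is that no schema knowledge and no admin rights are needed to dump the text:

```python
import sqlite3
from pathlib import Path

# Assumed location from early research on preview builds: a SQLite file
# named ukg.db inside the user's own AppData folder (no admin required).
# next() raises StopIteration if no database is present.
db_path = next(Path.home().glob(
    "AppData/Local/CoreAIPlatform.00/UKP/*/ukg.db"))

# Open read-only and dump a sample of every table -- enough to show that
# the OCR'd text sits on disk in the clear.
conn = sqlite3.connect(f"file:{db_path.as_posix()}?mode=ro", uri=True)
tables = [row[0] for row in conn.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
for table in tables:
    print(f"-- {table}")
    for row in conn.execute(f'SELECT * FROM "{table}" LIMIT 5'):
        print(row)
```

Exfiltrating the whole thing is then a single file copy of one SQLite database, which is why a remediation loop measured in minutes loses the race.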
This is the out of box experience for Windows 11’s new Recall feature on Copilot+ PCs. It’s enabled by default during setup and you can’t disable it directly here. There is an option to tick “open Settings after setup completes so I can manage my Recall preferences” instead.
The fact that this feature is basically on by default and requires numerous steps to disable is going to create a lot of problems for people, especially those who click through every privacy/permission screen and fundamentally don’t know how their computer actually operates. I’ve lost count of the instances where I’ve had to help people find something and they had no idea where anything lives in their file system (they mostly work out of the Desktop or Downloads folders). How are they going to even grapple with this?
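For anyone who does want it off without spelunking through Settings, the policy knob Microsoft documents for this is DisableAIDataAnalysis. A sketch of setting it for the current user follows; the registry path comes from the published group-policy documentation, but treat it as an assumption that may change in shipping builds:

```python
import winreg

# "Turn off saving snapshots" policy: DisableAIDataAnalysis = 1 under the
# WindowsAI policy key. Writing it under HKCU affects only the current
# user and does not require administrator rights.
key = winreg.CreateKeyEx(
    winreg.HKEY_CURRENT_USER,
    r"Software\Policies\Microsoft\Windows\WindowsAI",
    0, winreg.KEY_SET_VALUE)
winreg.SetValueEx(key, "DisableAIDataAnalysis", 0, winreg.REG_DWORD, 1)
winreg.CloseKey(key)
```

Which rather proves the point: the users described above are never going to find this.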
Update (2024-06-04): Zac Bowden (via Hacker News, MacRumors):
Microsoft has done the bare minimum to protect this data. It’s stored in a system directory that requires administrator and system-level rights to access and edit. However, these protections are easily bypassed, and an attacker could easily write a bit of software to ignore those permissions if they wanted.
[…]
With that said, I find the outrage about this discovery to be somewhat overblown. All your files are unencrypted when you’re using your PC, yet most people aren’t constantly concerned about malware potentially scraping their personal documents, pictures, downloads, videos, and synced cloud folders.
However, Recall would give it access to information that was deleted or that was shown on screen but never otherwise saved to disk.
Windows Recall won’t be deployed in the enterprise.
Remember how much effort is put into archiving and deleting email to reduce legal discovery risks?
Update (2024-06-05): Kevin Beaumont:
If you want to know how Microsoft have got themselves into this giant mess with Recall, here’s what the documentation says between the lines: you, the customer, are a simpleton who doesn’t want to be an AI genius yet. Have a caveman mode.
Alternative view: Microsoft put their CEO in front of world’s media to launch a product customers largely don’t want, attached to their biggest brand, Windows, attached to new brand, Copilot, and didn’t handle security, privacy and AI safety properly while under massive scrutiny.
Charlie Stross (via Hacker News):
Use a password manager like 1Password? Sorry, your 1Password passwords are probably visible via Recall, now.
Now, “unencrypted” is relative; the database is stored on a filesystem which should be encrypted using Microsoft’s BitLocker. But anyone with credentials for your Microsoft account can decrypt it and poke around. Indeed, anyone with access to your PC, unlocked, has your entire world at their fingertips.
But this is an utter privacy shit-show. Victims of domestic abuse are at risk of their abuser trawling their PC for any signs that they’re looking for help. Anyone who’s fallen for a scam that gave criminals access to their PC is also completely at risk.
[…]
Microsoft “got serious” about security earlier this decade, around the time Steve Ballmer stepped down as CEO, and managed to recover somewhat from having a reputation for taking a slapdash approach to its users’ data. But they’ve been going backwards since 2020, with dick moves like disabling auto-save to local files in Microsoft Word (your autosave data only autosaves to OneDrive), slurping all incoming email for accounts accessed via Microsoft Outlook into Microsoft’s own cloud for AI training purposes (ask the Department of Justice how they feel about Microsoft potentially having access to the correspondence for all their investigations in progress), and now this.
I’m not saying that it’s not possible to secure Windows Recall data stores from malware and other users.
I’m just saying that the features to secure it don’t exist on Windows.
See also: Andrew Cunningham.
Update (2024-06-07): Thomas Claburn (via Hacker News):
Asked to explore the data privacy issues arising from Microsoft Recall, the Windows maker’s poorly received self-surveillance tool, Jaime Teevan, chief scientist and technical fellow at Microsoft Research, brushed aside concerns.
Mark Hurst (via Hacker News):
Whatever blowback Microsoft faces if and when users are hacked because of Recall, there’s no chance the feature gets killed.
[…]
“Linux on the Desktop.” The free, open-source operating system of Linux is not owned by any company (Big Tech or otherwise), doesn’t contain any opaque surveillance code, and enjoys a worldwide community of developers who actually want to make the software better – not, as in Microsoft’s case, worse.
As a warning about how Recall could be abused by criminal hackers, Alex Hagenah, a cybersecurity researcher, has released a demo tool that is capable of automatically extracting and displaying everything Recall records on a laptop.
If anybody is wondering if you can enable Recall on a machine remotely without Copilot+ hardware support - yep.
I’ve also found a way to disable the tray icon.
Andy Greenberg (via Hacker News):
On Friday, Microsoft announced that it would be making multiple dramatic changes to its rollout of its Recall feature, making it an opt-in feature in the Copilot+ compatible versions of Windows where it had previously been turned on by default, and introducing new security measures designed to better keep data encrypted and require authentication to access Recall's stored data.
This is their updated screen. It forces an absolute choice with happy language “Yes, save” as the choice in the default “continue/next” position - most likely to be selected by users who don’t read the screen or don’t have a fully informed context to decide.
As opposed to a more honest opt-in, which would be a separate radio choice to Enable/Disable the feature, with Continue/Next being its own action.
It’s better than Apple’s opt-outs that say “Later” and don’t even look like buttons.
Update (2024-06-12): Zac Bowden (via Kevin Beaumont):
Microsoft has the Windows Insider Program, yet to maintain secrecy, it chose not to test this feature openly. I can’t think of a single feature that would have benefitted from public testing more than Windows Recall. This is the kind of feature that needs to be built in the open so that users can learn to trust you with it.
Had it been tested openly, these security concerns would have definitely been pointed out well ahead of general availability, and likely fixed before mass hysteria could ensue. Of course, the true reason Windows Recall wasn’t tested openly was because the company wanted to make it exclusive to new Copilot+ PCs, and you can’t really do that if you’re testing the feature on existing PCs where it works quite well.
Microsoft also wanted to keep Windows Recall a secret so it could have a big reveal on May 20. Except, it wasn’t really much of a big reveal. Many of us in the tech press already knew it was coming, even without being briefed on the feature ahead of time.
Update (2024-06-18): Reuters (via Hacker News):
Microsoft will not roll out “Recall”, an AI-powered feature that tracks computer usage, with its new computers next week and will instead preview it with a smaller group later, the tech giant said on Thursday, amid concerns of privacy risks.
This is confusing and vague to me, which I believe is exactly the intent. It focuses on security, reiterates that security is their top priority (and we know that this is untrue). What were the security problems? They don’t even allude to the existence or detection of any specific security problems.
It sounds to me like they’re figuring out a new marketing approach, or they’re softening the blow by “listening to users” and then rolling out more slowly, when outrage has died down and people will just accept it.
What I really want to see is proper journalism around “how / why did this make it so far before Microsoft ‘realized’ how insecure and terrible an idea it is”.
Joz’s answer [at The Talk Show] to Microsoft’s “Recall” failure is hilarious.
Update (2024-08-22): Andrew Cunningham (Hacker News):
Microsoft will begin sending a revised version of its controversial Recall feature to Windows Insider PCs beginning in October, according to an update published today to the company's original blog post about the Recall controversy. The company didn't elaborate further on specific changes it's making to Recall beyond what it already announced in June.
Update (2024-09-18): Cecily Mauran (via Hacker News):
It turns out Windows 11 users won’t be able to uninstall Microsoft’s controversial “Recall” feature after all.
[…]
But now, in a statement to The Verge, Microsoft clarified that the uninstall option was just a bug.
Update (2024-09-30): Tom Warren (tweet):
A Recall uninstall option initially appeared on Copilot Plus PCs earlier this month, and Microsoft said at the time that it was a bug. It turns out that you will indeed be able to fully uninstall Recall. “If you choose to uninstall this, we remove the bits from your machine,” says Weston. That includes the AI models that Microsoft is using to power Recall.
[…]
The encryption in Recall is now bound to the Trusted Platform Module (TPM) that Microsoft requires for Windows 11, so the keys are stored in the TPM and the only way to get access is to authenticate through Windows Hello. The only time Recall data is even passed to the UI is when the user wants to use the feature and authenticates via their face, fingerprint, or PIN.
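Recall’s TPM sealing isn’t something third-party code can reproduce directly, but Windows’ DPAPI has long offered the same general shape: ciphertext on disk, key material held by the OS and usable only in the right user context. The sketch below uses DPAPI purely as a stand-in for the idea; it is not what Recall does, since DPAPI keys are not TPM-bound and don’t require a Windows Hello prompt:

```python
import ctypes
from ctypes import wintypes

crypt32 = ctypes.windll.crypt32
kernel32 = ctypes.windll.kernel32

class DATA_BLOB(ctypes.Structure):
    _fields_ = [("cbData", wintypes.DWORD),
                ("pbData", ctypes.c_void_p)]

def dpapi_protect(plaintext: bytes) -> bytes:
    """Encrypt bytes so only the current Windows user can decrypt them.

    DPAPI analogue of OS-held keys; unlike Recall's scheme, the key is
    not sealed in the TPM and no Windows Hello prompt is involved.
    """
    buf = ctypes.create_string_buffer(plaintext, len(plaintext))
    blob_in = DATA_BLOB(len(plaintext), ctypes.cast(buf, ctypes.c_void_p))
    blob_out = DATA_BLOB()
    if not crypt32.CryptProtectData(ctypes.byref(blob_in), None, None,
                                    None, None, 0, ctypes.byref(blob_out)):
        raise ctypes.WinError()
    try:
        return ctypes.string_at(blob_out.pbData, blob_out.cbData)
    finally:
        kernel32.LocalFree(blob_out.pbData)

sealed = dpapi_protect(b"recall snapshot text")
print(f"{len(sealed)} bytes of ciphertext")
```

CryptUnprotectData reverses this, but only for the same user on the same machine. Malware running as that user could still call it, which is presumably why Recall additionally gates key release behind Windows Hello authentication.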
See also: Microsoft.
Comments:
This is batshit crazy!
And I thought the dark patterns employed to prevent the user from creating a local account during setup, or the telemetry that cannot be turned off, or a million other paper cuts were bad, but this is next level insane.
I feel like all of this is a distraction - Microsoft just shifting discussion to the security aspect.
Microsoft will make the database more secure and so people will stop talking about the actual underlying issue (read: the fact that this "feature" exists at all).
I think the concerns here are largely overblown for most people. If somebody can access all files on your PC, they can also install a keylogger and record your screen anyway, so it won't make a huge difference if they also see all of the text you saw earlier.
But the implementation, just dumping everything into a database with zero encryption, and then making it incredibly difficult to turn it off, is truly mind-bogglingly stupid.
Really silly decision to not make this opt-in. But so many people are completely blinded by the promise of AI that they think this must be great.
@Plume I disagree — keyloggers and screen recorders may require higher privileges than the user currently has. But most importantly, rewinding is by itself a massive feature for attackers. For example, recovering any deleted emails or TOTP QR codes, up to 3 months after the fact?
Side note: so "Azure AI" runs locally? What an awful mess of branding. Though I suspect this is intentional — if they had a brand that strictly meant "local", things would get muddled and people would be unhappy once they decided to move stuff to the cloud for whatever reason, be it better capabilities or data collection. So why not make names confusing and meaningless from the start?
@Daniel I thought they said that they were blending local and cloud and deciding at runtime what to run where? “We believe the richest AI experiences will only be possible when the cloud and device work together in concert.” But the Recall feature is apparently fully local.
Yeah, I know that for other uses they're going hybrid, which is a sensible decision. But for Recall in particular, if they were truly committed to long-term remaining fully local, it'd make sense IMO to never ever touch a brand like Azure.
So the gist of the technical side behind all this seems pretty clear, given I was able to guess it without even reading past the headline. It's fascinating technology I have largely no interest in, alas.
However, some thoughts: one, why isn’t this opt-in? Two, why not encrypt the database? Three, is there a way to exempt some activities from scrutiny? Or perhaps even only explicitly invoke Recall for those things you want “recorded”?