Monday, July 8, 2024

Signal for Mac’s “Encrypted” Database

Signal:

Storing messages outside of your active Signal device is not supported.

Messages are only stored locally.

An iTunes or iCloud backup does not contain any of your Signal message history.

This makes it private on iOS because other apps can’t access the message database. But the same design doesn’t work so well with the Mac version.

Mysk:

This is the folder structure of Signal’s local data on macOS. The encrypted database and encryption key are stored next to each other. The folder is accessible to any app running on the Mac.

Why didn’t they store the encryption key in the keychain?

Mysk:

The encryption key used to encrypt the local DB that contains all the secrets and chat history is stored in plain text in a location accessible by any app, process or script started by the Mac user.
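To see how little this protects, here is a minimal sketch of the reported layout. The `config.json` and `sql/db.sqlite` names are the commonly reported locations inside Signal's profile folder (assumptions, simulated here in a temp directory rather than the real `~/Library/Application Support` path):

```python
import json
import tempfile
from pathlib import Path

# Mimic the reported layout: the database key sits in config.json,
# right next to the encrypted database, all readable by the user.
profile = Path(tempfile.mkdtemp()) / "Signal"
(profile / "sql").mkdir(parents=True)
(profile / "config.json").write_text(json.dumps({"key": "deadbeef" * 8}))
(profile / "sql" / "db.sqlite").write_bytes(b"encrypted-db-placeholder")

# Any process running as the same user can recover the key in one line:
key = json.loads((profile / "config.json").read_text())["key"]
print(key)
```

No privileges, no exploit: the key is just a JSON field next to the data it protects.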

It’s very tempting to use Signal’s desktop app. This is particularly useful for activists who can be more productive using a desktop than a mobile phone. Signal doesn’t make it clear that linking a desktop app can render Signal’s “gold standard” for encryption useless.

This seems like a much bigger deal than last week’s ChatGPT story.

Mysk:

I wrote a simple Python script that copies the directory of Signal’s local storage to another location (to mimic a malicious script or app)

[…]

Messages were either delivered to the Mac or to the VM. The iPhone received all messages. All of the three sessions were live and valid. Signal didn’t warn me of the existence of the third session [that I cloned]. Moreover, Signal on the iPhone still shows one linked device. This is particularly dangerous because any malicious script can do the same to seize a session.
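A script of the kind Mysk describes needs nothing beyond the standard library. A minimal sketch (the directory layout is a stand-in created in a temp folder, not Signal's actual paths):

```python
import shutil
import tempfile
from pathlib import Path

def clone_profile(src: Path, dest: Path) -> Path:
    """Copy an app's local-storage directory wholesale, as a malicious
    script could do with Signal's profile (key, database, and all)."""
    return Path(shutil.copytree(src, dest / src.name))

# Demo against a throwaway directory standing in for the Signal profile.
tmp = Path(tempfile.mkdtemp())
src = tmp / "Signal"
(src / "sql").mkdir(parents=True)
(src / "config.json").write_text('{"key": "0123"}')
copy = clone_profile(src, tmp / "exfil")
print(sorted(p.name for p in copy.rglob("*")))
```

Restoring the copied folder on another machine is all it takes to resurrect the session, since the key travels with the data.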

Saagar Jha:

I think a lot of people have recently learned something that horrifies them. I do not fault them for that in the slightest. I just also want them to share my terror of this being standard best practice in the industry.

Previously:

Update (2024-07-09): Lawrence Abrams:

A mistake in the process used by the Signal Desktop application to encrypt locally stored messages leaves them wide open to an attacker.

He wrote this in 2018, and there are forum posts even older than that referencing the issue. Curiously, a Signal developer offered the explanation that, even though they were using an encrypted extension to SQLite and had configured it to encrypt the database with a password, it was never their intention to protect the database with encryption:

The database key was never intended to be a secret. At-rest encryption is not something that Signal Desktop is currently trying to provide or has ever claimed to provide. Full-disk encryption can be enabled at the OS level on most desktop platforms.

I don’t understand what the reason was, then. And full-disk encryption is a solution to a different problem; it does not protect the data from other processes on the system.

Matt Henderson:

This is shocking for anyone considering Signal the gold standard in security.

Update (2024-07-15): Lawrence Abrams:

The response was unusual after Whittaker’s constant retweets about the security and privacy implications of Microsoft’s Windows Recall and how data could be stolen by local attackers or malware.

[…]

In April, an independent developer, Tom Plant, created a request to merge code that uses Electron’s SafeStorage API to further secure Signal’s data store from offline attacks.

[…]

While the solution would provide additional security for all Signal desktop users, the request lay dormant until last week’s X drama. Two days ago, a Signal developer finally replied that they implemented support for Electron’s safeStorage, which would be available soon in an upcoming Beta version.

Ben Lovejoy:

Using Keychain on Mac fully secures the encryption key, while the Windows solution could still potentially be compromised by some malware, but will be significantly safer than now.


It is worrying, but I guess that will not change. Back in 2020 Scott Nonnenberg, the developer at Signal working on Signal Desktop, wrote in a GitHub issue (https://github.com/signalapp/Signal-Desktop/issues/4042) that "at-rest encryption is not something that Signal Desktop is currently trying to provide or has ever claimed to provide. Full-disk encryption can be enabled at the OS level on most desktop platforms."

I worry that elevating things like this into security scandals will just result in more programs encrypting all local data with keys outside of user access. More lock-in, less end-user control.

@palm0x If they’re not trying to provide that, why did they bother with the faux solution?

@vintner I worry about that, too, but I’m not sure what the answer is.

@Michael Tsai What do you mean, "faux solution"? Full-disk encryption enabled at the OS level?
If so, I would say that they coherently did not bother at all; they simply state that it is not Signal's job to encrypt messages on disk, and they delegate that task to the OS.
Personally, I get their point, but in any case it would be better (especially for some sensitive use cases) if they implemented an additional security layer, or at least notified users (e.g., after first installation) about the possible threats mentioned above.

@palm0x Faux solution in that they bothered to encrypt the database at the file level (which requires a special build of SQLite, too) but stored the key in plaintext right next to it.

@Michael Tsai Oh, I see now; well, back in 2018 Joshua Lund, a member of Signal staff, wrote in a post related to this problem on the Signal forum (https://community.signalusers.org/t/vulnerabilities/4548/7) that "SQLCipher ships with a good default set of plugins/pragmas, and its performance is excellent because it is built on top of the widely used SQLite database layer”.
So, as far as I understand, the purpose of using SQLCipher was not to encrypt the database, but rather to benefit from its better features (in other words, the fact that the database is encrypted is just a side effect of using SQLCipher).

@palm0x That doesn’t make a lot of sense to me. From what I can see (and I looked into using it in my app), almost all of the extra functionality of SQLCipher is for encryption. It’s basically a wrapper over SQLite’s paging system that encrypts each page. It does have some extra performance measurement tools, which might be what he’s referring to? But you can also use SQLCipher without enabling encryption, in which case you end up with a standard SQLite database. So if they didn’t want encryption, but did want the other stuff, why enable encryption and end up with a non-standard file that’s harder to work with?
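For illustration: SQLCipher keys the database via `PRAGMA key`, while stock SQLite silently ignores pragmas it doesn't recognize, so this sketch runs identically on either build (the key and schema here are made up):

```python
import os
import sqlite3
import tempfile

db_path = os.path.join(tempfile.mkdtemp(), "chat.db")
conn = sqlite3.connect(db_path)

# On an SQLCipher build, this pragma derives the page-encryption key.
# Stock SQLite treats it as an unknown pragma and does nothing, so the
# same code here simply produces a standard, unencrypted database.
conn.execute("PRAGMA key = 'correct horse battery staple'")

conn.execute("CREATE TABLE messages (body TEXT)")
conn.execute("INSERT INTO messages VALUES ('hello')")
conn.commit()
rows = conn.execute("SELECT body FROM messages").fetchall()
print(rows)
```

Which is the point: with SQLCipher, encryption is a deliberate opt-in via that one pragma, so enabling it and then leaving the key in plaintext is hard to explain as an accident.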

@Michael Tsai It does not make a lot of sense to me either, but unfortunately it is the only official answer that I found.
Another hypothesis, which I read from a user somewhere on the Signal forum, is that encrypting the database was probably a small step toward implementing the option to lock the application on desktop too, a feature the community has requested for a very long time (https://community.signalusers.org/t/lock-the-desktop-app-with-a-password/1383)…

@Michael Tsai Update: it looks like the developers at Signal have changed their mind about the issue, and it will soon be fixed: https://github.com/signalapp/Signal-Desktop/pull/6849#issuecomment-2218845070

I am not moved. Frankly, I don't understand the need to protect data on disk beyond what the OS gives you with FDE, because once a bad actor has compromised your machine it's lights out anyway. And Signal's choice not to export recoverable data in a backup is a clear usability disaster regardless.

Sometimes it seems clear to me that security people live sad and paranoid lives. OTOH maybe that's precisely the kind of outlook you need for building software like this—though I'll never be convinced. I don't believe in firewalls either, so what do I know?

@Sebby Do you also think that an encrypted keychain and password manager make no sense if you have FDE?

@Michael Tsai Honestly, yeah. The encryption is mostly there to benefit portability to or through untrusted systems. Of course, if we're beginning with first principles, designing a desktop OS with gatekeeping on the keychain and other sensitive sandboxed/shoeboxed APIs does make sense. But let's keep things in perspective here. Once malicious code is running on your machine, there's a full spectrum of possible attacks. Like, say, prompting you for a password in a convincing fashion, if it can't grab it from the keyboard directly.

Old Unix Geek

@Sebby: taking your laptop through security at the airport where they image it. Even if you are compelled to give them your laptop password, they might not ask for your signal password.

@OUG Sure, that's fair. I guess I'm just saying that I would not personally mind if the local encryption in my chat app were optional. If the client has to prompt for a password so the db is encrypted just like a password manager, that could give you the required security at a cost to convenience.

@Sebby Like you, I do not think that this issue is as big a concern to “regular users” as some people on social media have made it out to be, e.g. advising to “uninstall Signal Desktop, because it is not secure”. From my point of view, these kinds of statements should be avoided, because non-experts will obviously panic and switch to less secure alternatives.
That being said, Signal is not only marketed to and used by “regular users”; it is often recommended to people who have a high threat model and whose lives could depend on the security and privacy of their messenger app. People trust Signal and indirectly trust the people who run it.
So, the really worrying thing is that even after this security issue (I would not call it a bug, rather a design flaw) was disclosed, the Signal team downplayed its importance. Again, it is not really a vulnerability, rather the lack of an advanced security measure; not necessary for the vast majority of users, but essential for some (especially those who do not have advanced computer skills).
As I already wrote above, I think the developers at Signal should have either added an additional security layer years ago or warned people that they were not protecting their messages at rest in the desktop app.

I think at-rest encryption is perfectly valid, as long as the key is protected too and under my own control. I encrypt my SSH private key as well, and I don't see the problem with that. You can unlock it for a session or let it prompt for a password at every use. It works pretty well, and I've never been prompted for the password without knowing why, so I'm not sure how malware would trick me in this case. Randomly prompt me to log in to a server that I'm not actively trying to log in to? Same with the Signal database: you'd only be prompted for that password when you wanted to access the database directly, which is hardly ever, so it seems harder to spoof. Double-click a malicious calculator app and a Signal database password prompt appears?

Look, if they exfiltrate my data out of my system, then they can hack on it for a while and try to get access, but at least they'll have to put effort into it. As weird as it sounds, part of security is assuming things might get compromised and mitigating access to the rest of the system/network accordingly.

@Sebby @Nathan The default should just be to store the password in the keychain. This would protect the data from other processes without affecting convenience.
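On macOS this doesn't even require a native API; the bundled `security` tool manages generic-password items from the command line. A sketch of the two commands involved (the service and account names here are made up for illustration, not Signal's):

```python
# Hypothetical keychain item names -- placeholders, not what Signal uses.
SERVICE = "org.whispersystems.signal-desktop"
ACCOUNT = "db-key"

def store_cmd(secret: str) -> list[str]:
    # `security add-generic-password -U` creates or updates a keychain item.
    return ["security", "add-generic-password",
            "-s", SERVICE, "-a", ACCOUNT, "-w", secret, "-U"]

def read_cmd() -> list[str]:
    # With `-w`, `security find-generic-password` prints only the password.
    return ["security", "find-generic-password",
            "-s", SERVICE, "-a", ACCOUNT, "-w"]

# On a Mac these would be run with, e.g., subprocess.run(cmd, check=True);
# here we just print the commands that would replace the plaintext key file.
print(" ".join(store_cmd("<db-key>")))
print(" ".join(read_cmd()))
```

The keychain then enforces which apps may read the item without a prompt, which is exactly the per-process protection that full-disk encryption doesn't give you.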

Old Unix Geek

Let's consider a common use case for Signal where the encryption matters: a journalist contacted by a source who wants to stay anonymous.

The source wants protection, and might make the effort to ensure his machine is properly encrypted since his life is on the line.

The journalist is usually an arts major, doesn't understand computers, and just wants to publish a juicy story. Since journalists are honeypots for sources, his computer is probably already rooted by an intelligence agency.

If the journalist's computer is not properly encrypted, or if his password is in the keychain for convenience, the next time he goes through a border, his device can be imaged, and the source can be identified by the intelligence services.

Far fetched?

Glenn Greenwald was contacted by Snowden but fobbed him off because installing encryption software was too hard. It took Laura Poitras to kick Glenn into action.

David Miranda, Glenn Greenwald's husband, was "lawfully detained" at Heathrow, and his devices imaged under section 7 of the 2000 Terrorism act. The (encrypted) files he held were imaged.

The actual person who made all the WikiLeaks files public, and should have gone to prison, is David Leigh. He published the password to the encrypted copy of the unredacted cables in his book. He claimed he thought the password would be “temporary”, something mathematically impossible for static data. Assange had previously asked people to keep copies of that encrypted file so that it could not be destroyed. Only after the cat was out of the bag did WikiLeaks publish the documents themselves. So the US case against Assange, “who killed our sources”, originated in the behaviour of a mathematics-impaired journalist.

And I think it was on Joe Rogan's show that Tucker Carlson claimed his phone is always hacked, and that he just replaces it every so often because the malware runs the battery down. He also claimed the NSA got hold of his Signal texts about going to Moscow to interview Putin.

The unfortunate truth is that incompetence kills, and “convenience” essentially caters to the incompetent. Therefore, if it matters enough to encrypt, convenience should be the last thing on one's mind. Getting it right should be. But that's tough. Even governments get it wrong, e.g. buying encryption tools from the supposedly “independent Swiss” Crypto AG, which was thoroughly US-infiltrated.

@Nathan_RETRO You're not vulnerable to social engineering. Good, so that means your data won't be exfiltrated, assuming you have done everything else right, including not running malicious software. And that is very likely to be the case unless you are being directly targeted, in which case you've really got much bigger problems.

@Michael Tsai Agreed, except that the option to put your password in the Keychain for auto-unlock should be a choice for security-sensitive users to make, otherwise the added security is pointless when the attacker has your login password. Just ask the user once when the app first launches, and make it a preference available to change the decision later if desired.

Agreed with @palm0x. To be clear, and without getting further sidetracked by arguments about the futility (or not) of "defence in depth" (except to say that my view of this principle is rather cynical), I am not suggesting that there is no value in reducing risk of compromise, especially for people who simply don't understand the risk. The default should absolutely be precautionary. Using protected APIs to conveniently safeguard encryption keys is obviously a good idea if you don't trust processes on the local machine not to access sensitive data. But I *do* believe that it's an overstated risk for anyone who is already clueful about host-based security, which includes basically any careful user of a system who keeps their software updated and is choosy about the software they run, and the response to this issue is just silly. It was silly when Google were caught storing your passwords in plaintext too, IMO. Encryption-at-rest is a feature available to *all the software on your machine* if you just turn it on in your OS; anything else is a value-add for the completely paranoid. If you are genuinely concerned about malware on your machine, the right response here is to assume the absolute worst and stop trusting it until you can be sure that it is vetted, which usually means nuking it from orbit and starting from scratch. JMO, as ever.

@Michael Tsai
I agree that storing the password in the macOS Keychain is fine, but as a user option, of course. However, on macOS that does mean I'd likely disable iCloud Keychain just to be safe. I use the command-line keychain tool on Linux for the same reason you state: the key is now protected by a dedicated process -- https://www.funtoo.org/Funtoo:Keychain

@Sebby
I'm sure that I am vulnerable to other phishing attacks; I just wasn't sure what would prompt a user to enter a seldom-used password for a random unrelated task. I get spoofing, say, my bank, and then inputting the password into a malicious website. Also, if the macOS Keychain is used, then I see the problem being that any process prompting for the user password could be the malicious one that grants access to the database. Bigger attack vector there.

>I agree that storing password in MacOS Keychain is fine, but as a user option of course.

I see little benefit in making this an option. It would be unusual for a Mac app to behave that way, too. It should always store credentials in the Keychain.

The keychain already lets the user control whether certain apps require a prompt to unlock.

@Sören
I meant from the standpoint of not wanting to sync all your passwords to the cloud. If you use the macOS Keychain for this password and enable iCloud Keychain, doesn't it sync the password you are trying to keep out of the cloud onto Apple's servers? That's fine if you really want it to work that way, but I would prefer not to.

@Michael Tsai
Yes, I believe that is true about prompting to unlock, which is how I use Funtoo keychain as well: do I want this open for the session, or prompt each time? However, I do not know whether you can tell iCloud Keychain to sync only certain passwords and not others. I do not want the Signal DB password synced to a cloud I emphatically do not control.

@NATHAN_RETRO Would have to check but IIRC there is a way for apps to tell Keychain never to sync an item to the cloud.

Concerning spoofing, basically anyone who can prompt you for your login password can have the content of your OS Keychain, so whether or not they get your Signal password or your OS password is basically immaterial to the threat. Having Keychain protect the key protects against the one special case where some process (malware) reads your unprotected DB. I am definitely not saying it couldn't happen, or that the added protection isn't worthwhile. Just that you've got big problems if someone tricks you into giving them your password. Ever thus.

@Sören Why not an option? I agree it's unusual *not* to use the Keychain, but that doesn't make it mandatory either. Do you put your password manager's unlock password into your Keychain? I don't, and I suggest that it would be a pretty clear violation if it did that without asking.

Mmm, looks like some posts appeared afterwards, sorry I missed them!

@OUG Agreed and your (very real) journalist should definitely not put their Signal password in the Keychain, with or without FDE enabled. They should unlock Signal every time they log in. That's the only security level that would be good enough in that case, IMO. Signal will need to add that feature.

@Michael Tsai You're right, however the UI for controlling which apps are permanently allowed access is *very* unintuitive, so Signal should probably explain that to users.
