this post was submitted on 29 Jan 2025
108 points (97.4% liked)


Greetings!

A friend of mine wants to be more secure and private in light of recent events in the USA.

They originally told me they were going to use Telegram, so I explained how Telegram is considered compromised and why Signal is far more secure to use.

But they want more detailed explanations than what I provided verbally. Please help me explain things better to them! ✨

I am going to forward this thread to them, so they can see all your responses! And if you can, please cite!

Thank you! ✨

[–] dessalines@lemmy.ml 17 points 1 month ago* (last edited 1 month ago) (7 children)

I can't speak about Telegram, but Signal is absolutely not secure to use. It's a US-based service (that must adhere to NSLs), and it requires phone numbers (meaning your real identity in the US).

Matrix, XMPP, or SimpleX are all decentralized, and don't require US hosting.

[–] 9tr6gyp3@lemmy.world 20 points 1 month ago (2 children)

This entire article is guessing at hypothetical backdoors. It's like saying that AES is backdoored because the US government chose it as the de facto standard for symmetric encryption.

There is no proof that Signal has done anything nefarious at all.

[–] juli@lemmy.world 3 points 1 month ago (2 children)

> This entire article is guessing at hypothetical backdoors. It's like saying that AES is backdoored because the US government chose it as the de facto standard for symmetric encryption.
>
> There is no proof that Signal has done anything nefarious at all.

As an outsider: isn't that the same as the news coverage of Chinese/Russian backdoors, which everyone believes without any proof?

Why is a US company being a US honeypot a big surprise, and its government recommending it not a big red flag, but it is when China recommends WeChat? Can't we be critical and suspicious of both authoritarian countries?

Do you have access to Signal's servers to verify your claims, by any chance? Afaik their servers run a modified codebase, and third-party apps cannot use them. So how can you claim to know anything about what goes on behind closed doors? Genuinely curious.

[–] 9tr6gyp3@lemmy.world 3 points 1 month ago

Being critical is good, and we should always hold them accountable for our security. We can look to third party audits for help with that.

https://community.signalusers.org/t/overview-of-third-party-security-audits/13243

[–] patatahooligan@lemmy.world 3 points 1 month ago

> Do you have access to Signal servers to verify your claims by any chance?

That's not how it works. The Signal protocol is designed so that the server can't have access to your message contents if the client encrypts them properly. You're supposed to assume the server might be compromised at any time. The parts you actually need to verify for safe communication are:

  • the code running on your device
  • the public key of your intended recipient
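To make that concrete, here is a toy sketch (illustrative only, not real cryptography; the actual Signal protocol uses vetted primitives like AES, HMAC, and X25519) of why a relay server that only ever sees ciphertext cannot read message contents:

```python
import hashlib
import secrets

def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Toy keystream: SHA-256 in counter mode. For illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce + ct

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct = blob[:16], blob[16:]
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))

def server_relay(blob: bytes) -> bytes:
    # The server just forwards opaque bytes; without `key` it sees only ciphertext.
    return blob

key = secrets.token_bytes(32)   # agreed out-of-band (this is the "public key" step)
blob = server_relay(encrypt(key, b"hello"))
assert decrypt(key, blob) == b"hello"
```

The point of the sketch: if the key exchange and the client code are sound, nothing the relay does with `blob` recovers the plaintext, which is why trust shifts to the client and the key exchange rather than the server.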
[–] dessalines@lemmy.ml 2 points 1 month ago (1 children)

There was also no proof that a ton of US companies were spying on their users, until the global surveillance disclosures. Crypto AG ran a honeypot that spied on communications between world leaders for over 40 years until it got exposed.

[–] 9tr6gyp3@lemmy.world 4 points 1 month ago

Right, but Signal has been audited by various security firms throughout its lifetime, and each time they generally report back that this messenger has its encryption locked down properly.

[–] flux@lemmy.world 17 points 1 month ago (1 children)

So if I understand it, Signal has your phone number but only logs the sign-up date and last-activity date. So yes, they can say this person has Signal and last used it on date X. Other than that, no information.

Matrix doesn't require a phone number, but it has no standard on logging activity, so it's up to the server admin what they log; they could retain IP addresses, which rooms users are talking in, etc., and E2EE is not required.

I think both have different approaches; I'm just trying to understand. On one hand you have a centralized system with a standard to minimize logs, on the other a decentralized system that must be configured to use E2EE and to remove logs.

[–] dessalines@lemmy.ml 1 points 1 month ago* (last edited 1 month ago)

They have your phone number (meaning your full identity, and even current address), and since it's the primary identifier, they have message timestamps and social graphs.

It's impossible to verify what code their server is running. Or that they delete their logs; because they say they do? You should never rely on someone saying "just trust us". Truly secure systems have much harder verifiability tests to pass.

[–] SnotFlickerman@lemmy.blahaj.zone 11 points 1 month ago* (last edited 1 month ago) (1 children)

SimpleX is taking a lot of venture capital money which makes it just slightly suspect, imho. Those guys usually want a return of some kind on their investment. I simply don't trust the motives of technocrats like Jack Dorsey.

The Matrix Foundation, on the other hand, seems a lot more democratic in governance and stewardship of the protocol.

[–] HotCoffee@lemm.ee 5 points 1 month ago

Good projects require money. And SimpleX is still way better than Signal and Telegram, so imo it's worth supporting and using

[–] doomsdayrs@lemmy.ml 11 points 1 month ago* (last edited 1 month ago) (6 children)

Thank you for your post!

I want you to know your effort and knowledge is appreciated, this will help future readers make better decisions.✨

But the situation stands that my friend and their friends are not as technologically literate as we are, and I would rather have them on something easy and secure than on nothing secure at all, especially given my experience getting communities to use the decentralized platforms you mentioned.

[–] Valmond@lemmy.world 8 points 1 month ago (1 children)

As you say yourself (cryptographic nerd here):

> Signal’s E2EE protocol means that, most likely, message content between persons is secure.

It's a shame there are no free servers. Is the server software not open source, only the Signal app itself?

[–] dessalines@lemmy.ml 2 points 1 month ago (2 children)

The server is supposedly open source, but they angered the open-source community a few years back by going a whole year without posting any code updates. Either way that's not reliable, because Signal isn't self-hostable, so you have no idea what code the server is running. Never rely on someone saying "just trust us."

[–] hedgehog@ttrpg.network 1 points 1 month ago

> It's impossible to verify what code their server is running.

Signal has posted multiple times about their use of SGX Secure Enclaves and how you can use Remote Attestation techniques to verify a subset of the code that’s running on their server, which directly contradicts your claim. (It doesn’t contradict the claim that you cannot verify all the code their server is running, though.) Have you looked into that? What issues did you find with it?

I posted a comment here going into more detail about it, but I haven’t personally confirmed myself that it’s feasible.

[–] Valmond@lemmy.world 0 points 1 month ago (1 children)

I have read that it is self-hostable (though I haven't dug into it), but since it's not a federated service, it's no better than the other alternatives out there.

Also read that the keys are stored locally but also somehow stored in the cloud (??), which makes it all completely worthless if it is true.

That said, the three-letter agencies can probably get into any Android/Apple phone if they want to; I'm not forgetting the oh-so-convenient "bug" Heartbleed...

[–] hedgehog@ttrpg.network 1 points 1 month ago (1 children)

> Also read that the keys are stored locally but also somehow stored in the cloud (??),

Which keys? Are they always stored or are they only stored under certain conditions? Are they encrypted as well? End to end encrypted?

> which makes it all completely worthless if it is true.

It doesn’t, because what you described above could be fine or could have huge security ramifications. As it is, my guess is that you’re talking about how Signal supports secure value recovery. In that case:

  1. The key is used to encrypt your contacts, profile name, group avatars, social graph, etc., but not your messages.
  2. Your key is only uploaded to the cloud if you have a recovery PIN or passphrase.
  3. Your key is encrypted using your PIN or passphrase with techniques (key-stretching, storage in server secure enclaves) that make it more difficult to brute-force.

The main criticism of this is that you can't opt out of it without opting out of Registration Lock, that it necessarily uses the same PIN or passphrase, and that, particularly because it isn't clear that your PIN/passphrase is used for encryption, users are less likely to choose secure passphrases here.

But even without the extra steps that we can't 100% confirm, like the use of the secure enclave on servers and so on, this is e2ee, can be opted out of by the user, can't be used to recover past messages, and can't be used to decrypt future messages.
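For reference, the key-stretching idea mentioned above can be sketched with a standard-library KDF. This is a simplified stand-in, not Signal's actual SVR construction, which additionally involves server-side secure enclaves and its own parameters:

```python
import hashlib
import os

def stretch(passphrase: str, salt: bytes, iterations: int = 210_000) -> bytes:
    # Derive a 32-byte encryption key from a low-entropy secret.
    # The iteration count makes every brute-force guess cost real CPU time.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, iterations)

salt = os.urandom(16)
key = stretch("correct horse battery staple", salt)

assert len(key) == 32
# Same inputs give the same key; a different PIN/passphrase gives a different one.
assert stretch("correct horse battery staple", salt) == key
assert stretch("1234", salt) != key
```

Key stretching raises the cost per guess by a constant factor; it is most effective when the underlying secret is a long passphrase rather than a short PIN.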

[–] Valmond@lemmy.world 1 points 1 month ago (4 children)

Nice try FBI.

Well, if my pin is four numbers, that'll make it so hard to crack. /s

If you can't show hard evidence that everything is offline locally, no keys stored in the cloud, then it's just not secure.

BTW, "keys" in the context of encryption means the keys used to encrypt and decrypt; it wouldn't be very useful to encrypt them, because then you'd have another set of keys to deal with.
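The sarcasm about four-digit PINs has a real point behind it: key stretching multiplies the cost per guess, but it cannot add entropy the PIN never had. A quick sketch (the PIN and iteration count here are made up, and the count is kept tiny so the demo runs fast; real deployments use far more):

```python
import hashlib
import os

def stretch(pin: str, salt: bytes, iterations: int = 200) -> bytes:
    # Toy key-stretching; real systems use hundreds of thousands of iterations.
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, iterations)

salt = os.urandom(16)
target = stretch("4821", salt)  # hypothetical leaked, stretched 4-digit PIN

# All 10,000 four-digit PINs are trivially enumerable.
cracked = next(f"{n:04d}" for n in range(10_000)
               if stretch(f"{n:04d}", salt) == target)
assert cracked == "4821"
```

With only 10,000 candidates, even a deliberately slow KDF is exhausted quickly; this is why server-side rate limiting (e.g. in an enclave) or a long passphrase is needed on top of stretching.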

[–] TheHobbyist@lemmy.zip 4 points 1 month ago (1 children)

> and requires phone numbers (meaning your real identity in the US).

This gets shared a lot as a major concern for all services requiring a phone number. It is definitely true that, by definition, a phone number is linked to a person's identity, but in the case of Signal, no other information can be derived from it.

When the US government requests data for a phone number from Signal, as they occasionally do, the only information Signal provides is whether that number has a Signal account, when it was last registered, and when it last signed in. How is that truly problematic? For all other services which require a phone number, there is much more information at stake, which is where it is truly problematic: social graph, text messages, media, locations, devices, etc. None of that is accessible by Signal.

So literally the only thing Signal can say is whether the person has an account; that's about it. What's the big deal? Clearly the US government already has your phone number, because they need it to make the request to Signal, but they gain absolutely no other information.

[–] Aria@lemmygrad.ml 2 points 1 month ago (1 children)

Your data is routed through Signal servers to establish connections. Signal absolutely can and does provide social graphs, message frequency, message times, and message sizes. There's also nothing stopping them from pushing a snooping build to one user when that user is targeted by the NSA; the targeted user would need to check all updates against verified hashes. And if they're on iOS, that's not even an option, since the official iOS build hash already doesn't match the repo.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (2 children)

> Signal absolutely can and does provide social graphs, message frequency, message times, message size.

Do you have anything to back this up?

[–] Aria@lemmygrad.ml 2 points 1 month ago (1 children)

Your link lists all the things they don't share. The only reasonable reading is that anything not explicitly mentioned is shared. It's information they have, and they're legally required to share what they have, as also mentioned in the documents underneath their comment in your link.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (2 children)

If you open the latest instance, from August 2024, you will find a California government request for a number of phone numbers.

The second paragraph of that very page says:

> Once again, Signal doesn’t have access to your messages; your calls; your chat list; your files and attachments; your stories; your groups; your contacts; your stickers; your profile name or avatar; your reactions; or even the animated GIFs you search for – and it’s impossible to turn over any data that we never had access to in the first place.

They respond to the request with the following information:

  1. The responsive information that Signal possessed was:

a. REDACTED: Most Recent Registration: 2023-01-31 T19:42:10 UTC; Most Recent Login: 2023-01-31 T00:00:00 UTC.

b. REDACTED: Most Recent Registration: 2022-06-01 T16:30:01 UTC; Most Recent Login: 2022-12-12 T00:00:00 UTC.

c. REDACTED: Most Recent Registration: 2021-12-02 T03:42:09 UTC; Most Recent Login: 2022-12-28 T00:00:00 UTC.

The redacted values are the phone numbers.

That is the full extent of their reply; no other information is provided in response to the government's request.

[–] Aria@lemmygrad.ml 2 points 1 month ago

We can't verify that. They have a vested interest in lying, and they are occasionally barred from disclosing government requests. However, using this as evidence, as I suggested in my previous comment, we can make informed guesses as to what data they can share.

They can't share the content of messages or calls: this is believable and assumed. But they don't mention anything surrounding a message, such as whom it was sent to (and it is they who receive and forward the messages), when, or how big it was. They say they don't have access to your contact book: also very likely true. But that isn't the same as not being able to provide a social graph, since they know everyone you've spoken to, even if they don't know what you've saved about those people on your device. They also don't mention anything about connection data they might collect that isn't directly relevant to providing the service, like device info.

Think about the feasibility of interacting with the feds in the manner they imply. No extra communication to explain that they can't provide info they don't have? Even though they feel the need to communicate that to their customers? Of course this isn't the extent of the communication, or they'd be in jail. But they're comfortable spinning narratives. Consider that their whole business depends on how they react to these requests. Isn't it likely that their account of how they handled it is half-truths?

[–] dessalines@lemmy.ml 1 points 1 month ago (1 children)

California does not issue NSLs; the US federal government does. And those come with gag orders that mean you will go to federal prison if you tell anyone you've been asked to spy on your users.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (1 children)

Are you implying that Signal is withholding information from the Californian government, and only providing the full extent of their data to the federal government?

This comes back to the earlier point that there is no proof Signal even has more data than they have shared.

[–] dessalines@lemmy.ml 1 points 1 month ago (1 children)

If you don't know what an NSL is, then you definitely shouldn't be speaking about privacy.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (2 children)

It's unfortunate that you react like this. I don't claim to be an expert; never have. I've only been asking for evidence, but all we get are assumptions, and they all seem to stem from the allegation that the CIA has indirectly funded Signal (I'm neither disputing nor validating it).

The concern is valid, and the Snowden leaks have caused a lot of distrust in many companies, but that distrust is founded in the leaks. So far there is no evidence that Signal is part of any of it. And given the continued endorsement by security experts, I'm inclined to trust them.

[–] Aria@lemmygrad.ml 2 points 1 month ago

I think Dessalines' most recent comment is fair, even if it's harsh. You need to understand the nature of a "national security letter" to have the context. The vast majority of (USA) government requests are NSLs, because they require the least red tape. When you receive one, it's illegal to disclose that you have received it, and illegal not to comply. It requires you to share all metadata you have, and they routinely ask for more.

Here's an article that details the CIA connection https://www.kitklarenberg.com/p/signal-facing-collapse-after-cia

The concern doesn't stem from the CIA funding. It's inherent to all services operating in or hosted in the USA: they should be assumed compromised by default, since the laws of that country require them to be. Therefore, any app you trust has to be completely unable to spy on you. Signal understands this and uses it in their marketing. But it isn't true; they've made decisions that allow them to spy on you, and they ask that you trust them not to. Matrix, XMPP, and SimpleX cannot spy on you by design. (It's possible those apps were made wrong and therefore allow spying, but that's a different argument.)

[–] hedgehog@ttrpg.network 2 points 1 month ago

> The concern is valid, and it has caused a lot of distrust in many companies due to the Snowden leaks, but that distrust is founded in the leaks.

Snowden explicitly endorsed Signal, too - and as far as I know he’s never walked that endorsement back.

[–] dessalines@lemmy.ml 2 points 1 month ago (1 children)

They have to. They can't route your messages otherwise.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (2 children)

They have to know who the message needs to go to, granted. But they don't have to know who the message comes from; hence the sealed sender technique. The recipient verifies the message via the keys that were exchanged if they have communicated with that correspondent before; otherwise it is a new message request.

So I don't see how they can build social graphs if they don't know who the sender of each message is; they can only plot recipients, which is not enough.

[–] dessalines@lemmy.ml 2 points 1 month ago (1 children)

> But they don't have to know who the message comes from, hence why the sealed sender technique works.

Anyone who's worked with centralized databases can tell you that even if they did add something like that, with message timestamps it'd be trivial to find the real sender of a message. You have no proof that they even use it, because the server is centralized and closed source. Again, if their response is "just trust us", then it's not secure.

[–] TheHobbyist@lemmy.zip 1 points 1 month ago (1 children)

From what I understand, sealed sender is implemented on the client side, and that's what's in the GitHub repo.

[–] Aria@lemmygrad.ml 1 points 1 month ago (1 children)

How does that work? I wasn't able to find this. Can you find documentation or code that explains how the client can obscure where a message came from?

[–] hedgehog@ttrpg.network 2 points 1 month ago (1 children)

https://signal.org/blog/sealed-sender/ explains the feature.

https://github.com/signalapp/Signal-Android/issues/13842 has some links into the code base showing where sealed sender is implemented.

[–] Aria@lemmygrad.ml 0 points 1 month ago (1 children)

Okay. But this method doesn't address the fact that the service doesn't need the message to include the sender in order to know who the sender is: the sender('s unique device) can be appended to the message by the server, with 100% accuracy, after it's received. Even if we trust them on the parts that require trust, the setup as described by the blog does nothing to prevent social graphs from being derived, since the sender is identified at the start of every conversation.

If we trust them not to store any logs (unverifiable), then this method means they can't precisely know how long a conversation was or how many messages were exchanged. But you can still know precisely when and how many messages both participants received; there's just a chance they're talking to multiple people. Though if we're trusting them not to store logs (unverifiable), there shouldn't be any data to cross-reference to begin with. So if we can't trust them, why are we trusting them not to take note of the sender?

The upside is that if the message is leaked to a third-party, there's less info in it now. I'm ignoring the Github link, not because I don't appreciate you finding it, but because I take the blog-post to be the mission statement for the code, and the blog doesn't promise a system that comprehensively hides the sender's identity. I trust their code to do what is described.
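The metadata concern here is easy to illustrate: even if messages carry no sender ID, a relay that timestamps traffic can match deliveries to the nearest preceding send. A toy sketch with made-up log entries:

```python
# Hypothetical relay logs: (seconds, client) for inbound sends and outbound
# deliveries. Matching each delivery to the nearest preceding send recovers
# sender -> recipient edges even when messages themselves name no sender.
sends = [(10.00, "alice"), (10.05, "carol"), (42.00, "alice")]
deliveries = [(10.02, "bob"), (10.07, "dave"), (42.01, "bob")]

edges = set()
for t_out, recipient in deliveries:
    # Pick the latest send that happened at or before this delivery.
    sender = max((s for s in sends if s[0] <= t_out), key=lambda s: s[0])[1]
    edges.add((sender, recipient))

assert edges == {("alice", "bob"), ("carol", "dave")}
```

On a busy server the matching is noisier than this, but repeated conversations make the statistical correlation strong; mixing delays or cover traffic are the usual countermeasures.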

[–] hedgehog@ttrpg.network 1 points 1 month ago (5 children)

> The sender ('s unique device) can with 100% accuracy be appended to the message by the server after it's received.

How?

If I share an IP with 100 million other Signal users and I send a sealed sender message, how does Signal distinguish between me and the other 100 million users? My sender certificate is encrypted and only able to be decrypted by the recipient.

If I’m the only user with my IP address, then sure, Signal could identify me. I can use a VPN or similar technology if I’m concerned about this, of course. Signal doesn’t consider obscuring IPs to be in scope for their mission - there was a recent Cloudflare vulnerability that impacted Signal where they mentioned this. From https://www.404media.co/cloudflare-issue-can-leak-chat-app-users-broad-location/

> 404 Media asked daniel to demonstrate the issue by learning the location of multiple Signal users with their consent. In one case, daniel sent a user an image. Soon after, daniel sent a link to a Google Maps page showing the city the user was likely in.
>
> 404 Media first asked Signal for comment in early December. The organization did not provide a statement in time for publication, but daniel shared their response to his bug report.
>
> “What you're describing (observing cache hits and misses) is a generic property of how Content Distribution Networks function. Signal's use of CDNs is neither unique nor alarming, and also doesn't impact Signal's end-to-end encryption. CDNs are utilized by every popular application and website on the internet, and they are essential for high-performance and reliability while serving a global audience,” Signal’s security team wrote.
>
> “There is already a large body of existing work that explores this topic in detail, but if someone needs to completely obscure their network location (especially at a level as coarse and imprecise as the example that appears in your video) a VPN is absolutely necessary. That functionality falls outside of Signal's scope. Signal protects the privacy of your messages and calls, but it has never attempted to fully replicate the set of network-layer anonymity features that projects like Wireguard, Tor, and other open-source VPN software can provide,” it added.

I saw a post about this recently on Lemmy (and Reddit), so there’s probably more discussion there.

> since the sender is identified at the start of every conversation.

What do you mean when you say “conversation” here? Do you mean when you first access a user’s profile key, which is required to send a sealed sender message to them if they haven’t enabled “Allow From Anyone” in their settings? If so, then yes, the sender’s identity when requesting the contact would necessarily be exposed. If the recipient has that option enabled, that’s not necessarily true, but I don’t know for sure.

Even if we trust Signal, with Sealed Sender, without any sort of random delay in message delivery, a nation-state level adversary could observe inbound and outbound network activity and derive high confidence information about who’s contacting whom.

All of that said, my understanding is that contact discovery is a bigger vulnerability than Sealed Sender if we don’t trust Signal’s servers. Here’s the blog post from 2017 where Moxie describe their approach. (See also this blog post where they talk about improvements to “Oblivious RAM,” though it doesn’t have more information on SGX.) He basically said “This solution isn’t great if you don’t trust that the servers are running verified code.”

> This method of contact discovery isn’t ideal because of these shortcomings, but at the very least the Signal service’s design does not depend on knowledge of a user’s social graph in order to function. This has meant that if you trust the Signal service to be running the published server source code, then the Signal service has no durable knowledge of a user’s social graph if it is hacked or subpoenaed.

He then continued on to describe their use of SGX and remote attestation over a network, which was touched on in the Sealed Sender post. Specifically:

> Modern Intel chips support a feature called Software Guard Extensions (SGX). SGX allows applications to provision a “secure enclave” that is isolated from the host operating system and kernel, similar to technologies like ARM’s TrustZone. SGX enclaves also support a feature called remote attestation. Remote attestation provides a cryptographic guarantee of the code that is running in a remote enclave over a network.

Later in that blog post, Moxie says “The enclave code builds reproducibly, so anyone can verify that the published source code corresponds to the MRENCLAVE value of the remote enclave.” But how do we actually perform this remote attestation? And is it as secure and reliable as Signal attests?
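Conceptually, the verification reduces to a hash comparison: build the enclave reproducibly, measure it, and check the measurement against the value the attestation quote reports. A toy sketch (the byte strings are made up, and real SGX measurements cover the enclave's load-time state, not just file bytes, with the quote signed by Intel-rooted keys):

```python
import hashlib

def measure(enclave_binary: bytes) -> str:
    # Stand-in for MRENCLAVE: a digest of the enclave build artifact.
    return hashlib.sha256(enclave_binary).hexdigest()

# Hypothetical bytes from a reproducible build of the published source.
reproducible_build = b"\x7fELF...enclave-bytes"
attested_mrenclave = measure(reproducible_build)  # value reported in the quote

# Matching measurement: the remote enclave runs the code we built.
assert measure(reproducible_build) == attested_mrenclave
# Any tampering changes the measurement.
assert measure(reproducible_build + b"backdoor") != attested_mrenclave
```

The hard parts in practice are producing a bit-for-bit reproducible build and validating the quote's signature chain, which is where the tooling questions below come in.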

In the docs for the “auditee” application, the Examples page provides some additional information and describes how to use their tool to verify the MRENCLAVE value. Note that they also say that the tool is a work in progress and shouldn’t be trusted. The Intel SGX documentation likely has information as well, but most of the links that I found were dead, so I didn’t investigate further.

A blog post titled Enhancing trust for SGX enclaves raised some concerns with SGX’s current implementation, specifically mentioning Signal’s usage, and suggested (and implemented) some improvements.

I haven’t personally verified the MRENCLAVE values for any of Signal’s services and I’m not aware of anyone who has (successfully, at least), but I also haven’t seen any security experts stating that the technology is unsound or doesn’t actually do what’s claimed.

Finally, I recommend you check out https://community.signalusers.org/t/overview-of-third-party-security-audits/13243 - some of the issues noted there involve the social graph and at least one involves Sealed Sender specifically (though the link is dead; I didn’t check to see if the Internet Archive has a backup).

[–] cypherpunks@lemmy.ml 1 points 1 month ago

> They have to know who the message needs to go to, granted. But they don’t have to know who the message comes from; hence the sealed sender technique. The recipient verifies the message via the keys that were exchanged if they have communicated with that correspondent before; otherwise it is a new message request.
>
> So I don’t see how they can build social graphs if they don’t know who the sender of each message is; they can only plot recipients, which is not enough.

  1. You need to identify yourself to receive your messages, you send and receive messages from the same IP address, and there are typically few if any other Signal users sharing that IP address. So the cryptography of "sealed sender" is just for show: the metadata privacy remains dependent on Signal keeping their promise not to correlate your receiving identity with the identities of the people you're sending to. If you assume they'll keep that promise, then the sealed sender cryptography provides no benefit; if they don't keep it, sealed sender doesn't really help. They outsource the keeping of their promises to Amazon, btw (a major intelligence contractor).

  2. Just in case sealed sender was actually making it inconvenient for the server to know who is talking to whom... Signal silently falls back to "unsealed sender" messages if the server returns 401 when trying to send "sealed sender" messages, which the server actually does sometimes. As the current lead dev of Signal-for-Android explains: "Sealed sender is not a guarantee, but rather a best-effort sort of thing", so "I don't think notifying the user of a unsealed send fallback is necessary".

Given the above, don't you think that the fact they've gone to the trouble of building sealed sender at all (which causes many people to espouse the belief you just did: that their cryptographic design renders them incapable of learning the social graph, not to mention which edges in the graph are most active, and when) puts them rather squarely in "doth protest too much" territory? 🤔

[–] logging_strict@lemmy.ml 1 points 1 month ago

You are right but

we like doing the wrong thing over and over again, and being surprised, each and every time, when it turns out to be wrong; never picking up on the simple repeating pattern.

1111111111111 what's the next number ... errrr Signal! That's it you got it. Good job.

Embrace the idiocracy!

This is why Telegram is awesome.

Eventually you will come around and realize how hopeless humanity is and embrace that it is well beyond hope.

And then you will have a larger network and enjoy each and every one of them.