submitted 11 months ago by DannyMac@lemmy.world to c/technology@lemmy.world
[-] KLISHDFSDF@lemmy.ml 231 points 11 months ago* (last edited 11 months ago)

Tangentially related, if you use iMessage, I'd recommend you switch to Signal.

Text below is from a Hacker News comment:


Gonna repeat myself since iMessage hasn't improved one bit after four years. I also added some edits, since both attacks and Signal have improved.

iMessage has several problems:

  1. iMessage uses RSA instead of Diffie-Hellman. This means there is no forward secrecy. If the endpoint is compromised at any point, it allows the adversary who has

a) been collecting messages in transit from the backbone, or

b) in cases where clients talk to the server over a forward-secret connection, been collecting messages from the IM server

to retroactively decrypt all messages encrypted with the corresponding RSA private key. With iMessage the RSA key lasts practically forever, so one key can decrypt years' worth of communication.

I've often heard people say "you're wrong, iMessage uses a unique per-message key and AES, which is unbreakable!" Both of these are true, but the unique AES key is delivered right next to the message, encrypted with the public RSA key. It's like transporting a safe where the key to that safe sits in a glass box strapped to the side of the safe.
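
To make the analogy concrete, here's a toy Python sketch (my own illustration using the `cryptography` package, not Apple's actual code) of the pattern: each message gets a fresh AES key, but that key travels right next to the ciphertext, wrapped with one long-lived RSA key, so whoever obtains that single private key later can open every recorded message.

```python
# Toy sketch (NOT Apple's code): per-message AES key wrapped with one long-lived RSA key.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

# Bob's long-lived RSA key pair; in iMessage this effectively lives forever.
bob_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_public = bob_private.public_key()

def send(plaintext: bytes):
    """Fresh AES key per message, but the wrapped key travels right next to the ciphertext."""
    aes_key = AESGCM.generate_key(bit_length=128)
    nonce = os.urandom(12)
    ciphertext = AESGCM(aes_key).encrypt(nonce, plaintext, None)
    wrapped_key = bob_public.encrypt(aes_key, oaep)       # the "key in the glass box"
    return wrapped_key, nonce, ciphertext

# An adversary passively records traffic for years...
recorded = [send(b"message %d" % i) for i in range(3)]

# ...then compromises Bob's single RSA private key: every old message opens up.
for wrapped_key, nonce, ciphertext in recorded:
    aes_key = bob_private.decrypt(wrapped_key, oaep)
    print(AESGCM(aes_key).decrypt(nonce, ciphertext, None))
```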

  2. The RSA key strength is only 1280 bits. This is dangerously close to what has been publicly broken. On Feb 28, 2020, Boudot et al. factored an 829-bit key.

To compare these key sizes, we use https://www.keylength.com/en/2/

A 1280-bit RSA key has ~79 bits of symmetric security; an 829-bit RSA key has ~68 bits. So compared to what has publicly been broken, the iMessage RSA key is only 11 bits, or 2048 times, stronger.
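
That "2048 times" figure is just the gap in symmetric-equivalent bits, taking the keylength.com estimates above at face value:

```python
# The arithmetic behind the comparison, assuming keylength.com's estimates are right:
broken_bits = 68      # approx. symmetric security of the publicly factored 829-bit modulus
imessage_bits = 79    # approx. symmetric security of a 1280-bit modulus
print(imessage_bits - broken_bits)            # 11 bits
print(2 ** (imessage_bits - broken_bits))     # 2048, i.e. "only 2048 times stronger"
```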

The same site estimates that in an optimistic scenario, intelligence agencies can only factor about 1507-bit RSA keys in 2024. The conservative (security-conscious) estimate assumes they can break 1708-bit RSA keys at the moment.

(Side note: even the optimistic scenario is very close to the 1536-bit DH keys the OTR plugin uses; you might want to switch to the OMEMO/Signal protocol ASAP.)

Per e.g. keylength.com, no recommendation suggests using anything less than 2048 bits for RSA or classical Diffie-Hellman. iMessage is badly, badly outdated in this respect.

  3. iMessage uses digital signatures instead of MACs. This means that each sender of a message generates irrefutable proof that they, and only they, could have authored the message. The standard practice since 2004, when OTR was released, has been to use Message Authentication Codes (MACs) that provide deniability by using a symmetric secret shared over Diffie-Hellman.

This means that Alice, who talks to Bob, can be sure received messages came from Bob, because she knows it wasn't her. But it also means she can't show a message from Bob to a third party and prove Bob wrote it, because she also holds the symmetric key that, in addition to verifying the message, could have been used to create the authentication tag. So Bob can deny he wrote the message.

Now, this most likely does not mean anything in court, but that is no reason not to use best practices, always.
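
Here's a toy, standard-library Python sketch of that deniability property (my own illustration, not any real protocol):

```python
# Toy sketch of MAC deniability: Alice and Bob share one symmetric key, so a valid tag
# only proves "someone who knew the key wrote this", and that someone includes Alice.
import hmac, hashlib, os

shared_key = os.urandom(32)   # in a real protocol, derived from a Diffie-Hellman exchange

def tag(message: bytes) -> bytes:
    return hmac.new(shared_key, message, hashlib.sha256).digest()

msg = b"meet at noon"
bobs_tag = tag(msg)                               # Bob authenticates his message
print(hmac.compare_digest(bobs_tag, tag(msg)))    # Alice verifies it: True

# Alice could have produced the identical tag herself, so a transcript proves nothing
# to a third party. A signature made with Bob's private key would be exactly such proof.
```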

  4. The digital signature algorithm is ECDSA, based on the NIST P-256 curve, which according to https://safecurves.cr.yp.to/ is not cryptographically safe. Most notably, it is not fully rigid, but manipulable: "the coefficients of the curve have been generated by hashing the unexplained seed c49d3608 86e70493 6a6678e1 139d26b7 819f7e90".

  5. iMessage is proprietary: you can't be sure it doesn't contain a backdoor that allows retrieval of messages or private keys via some secret control packet from an Apple server.

  6. iMessage allows an undetectable man-in-the-middle attack. Even if we assume there is no backdoor that allows private key / plaintext retrieval from the endpoint, it's impossible to ensure the communication is secure. Yes, the private key never leaves the device, but if you encrypt the message with the wrong public key (one that you, by definition, need to receive over the Internet), you might be encrypting messages to the wrong party.

You can NOT verify this by e.g. sitting on a park bench with your buddy and seeing that they receive the message seemingly immediately. It's not like the attack requires some NSA agent to hear their eavesdropping phone 1 beep and, once they have read the message, type it into eavesdropping phone 2, which then forwards the message to the recipient. The attack can be trivially automated, and it is instantaneous.

So with iMessage the problem is, Apple chooses the public key for you. It sends it to your device and says: "Hey Alice, this is Bob's public key. If you send a message encrypted with this public key, only Bob can read it. Pinky promise!"

Proper messaging applications use what are called public key fingerprints, which allow you to verify off-band that the messages your phone outputs are end-to-end encrypted with the correct public key, i.e. the one that matches the private key of your buddy's device.
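
A minimal sketch of the fingerprint idea (Signal's safety numbers build on it, though the real format differs):

```python
# Sketch of off-band fingerprint verification; the key bytes here are placeholders.
import hashlib

def fingerprint(public_key_bytes: bytes) -> str:
    # Shortened digest so two humans can compare it in person or over a call.
    return hashlib.sha256(public_key_bytes).hexdigest()[:32]

bobs_real_key = b"...Bob's actual public key bytes..."
key_from_server = b"...whatever key the server handed Alice..."

# If anyone in between substituted a key, the fingerprints differ and the MITM is caught.
# iMessage exposes no such check to its users.
print(fingerprint(bobs_real_key) == fingerprint(key_from_server))
```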

  7. iMessage allows undetectable key insertion attacks.

EDIT: This has actually seen some improvements made a month ago! Please see the discussion in the replies.

When your buddy buys a new iDevice like a laptop, they can use iMessage on that device. You won't get a notification about this. What happens in the background is that your buddy's new device generates an RSA key pair and sends the public part to Apple's key management server. Apple will then forward the public key to your device, and when you send a message to that buddy, your device will first encrypt the message with the AES key, and it will then encrypt the AES key with the public RSA key of each of your buddy's devices. The encrypted message and the encrypted AES keys are then passed to Apple's message server, where they sit until the buddy fetches new messages for some device.

Like I said, you will never get a notification like "Hey Alice, looks like Bob has a brand new cool laptop, I'm adding the iMessage public keys for it so they can read iMessages you send them from that device too".

This means that a government that issues a FISA court national security request (a stronger form of NSL), or any attacker who hacks the iMessage key management server, or any attacker who breaks the TLS connection between you and the key management server, can send your device a packet that contains the attacker's RSA public key and claim that it belongs to some iDevice Bob has.

You could possibly detect this by asking Bob how many iDevices they have, and by stripping TLS from iMessage and seeing how many encrypted AES keys are being output. But it's also possible Apple can remove keys from your device to keep iMessage snappy, and they can very possibly replace keys on your device too. Even if they can't do that, they can wait until your buddy buys a new iDevice and only then perform the man-in-the-middle attack against that key.
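
Here's a rough Python sketch of that multi-device fan-out (again my own toy code with the `cryptography` package; the device names and the inserted key are invented for illustration):

```python
# Toy sketch (NOT Apple's code): the sender wraps the per-message AES key for every
# public key the key server lists for "Bob". If one extra key is slipped into that
# list, its owner silently gets a readable copy, and nothing in the UI says so.
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)

bob_phone = rsa.generate_private_key(public_exponent=65537, key_size=2048)
bob_laptop = rsa.generate_private_key(public_exponent=65537, key_size=2048)
attacker = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# What the key server claims are "Bob's devices"; the third entry was inserted.
bobs_devices = [bob_phone.public_key(), bob_laptop.public_key(), attacker.public_key()]

aes_key = AESGCM.generate_key(bit_length=128)
nonce = os.urandom(12)
ciphertext = AESGCM(aes_key).encrypt(nonce, b"see you saturday", None)
wrapped_keys = [device_key.encrypt(aes_key, oaep) for device_key in bobs_devices]

# The attacker's "device" unwraps its copy like any legitimate device would.
stolen_key = attacker.decrypt(wrapped_keys[2], oaep)
print(AESGCM(stolen_key).decrypt(nonce, ciphertext, None))
```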

To sum it up, like Matthew Green said[1]: "Fundamentally the mantra of iMessage is “keep it simple, stupid”. It’s not really designed to be an encryption system as much as it is a text message system that happens to include encryption."

Apple has great security design in many parts of its ecosystem. However, iMessage is EXTREMELY bad design, and should not be used under any circumstances that require verifiable privacy.

In comparison, Signal

  • Uses Diffie-Hellman + Kyber, not RSA

  • Uses Curve25519, which is a safe curve with ~128 bits of symmetric security, not 79 bits like iMessage

  • Uses Kyber key exchange for post quantum security

  • Uses MACs instead of digital signatures

  • Is not just free and open source software, but has reproducible builds so you can be sure your binary matches the source code

  • Features public key fingerprints (called safety numbers) that allow verification that there is no MITM attack taking place

  • Does not allow key insertion attacks under any circumstances: You always get a notification that the encryption key changed. If you've verified the safety numbers and marked the safety numbers "verified", you won't even be able to accidentally use the inserted key without manually approving the new keys.

So do yourself a favor and switch to Signal ASAP.

[1] https://blog.cryptographyengineering.com/2015/09/09/lets-tal...

[-] BearOfaTime@lemm.ee 63 points 11 months ago

Wow.

I think it would help to summarize the major issue with iMessage and have it at the top.

The RSA-encrypted AES key traveling right next to the message content is so face-palmingly bad that you really don't need to read any further, and the rest is just more evidence of issues.

Well done. I had no idea. Saving your summary, because it's so staggering. Wish I could upvote you a hundred times. This is a huge issue.

[-] Socsa@sh.itjust.works 11 points 11 months ago

We literally know that the FBI at one point was unable to break into an iPhone, and then a few days later was able to break into it. Apple clearly let them in the back door after negotiating the condition that they could deny and act all upset about it.

And then they launched a whole privacy-focused marketing campaign immediately afterwards. It's all laughably transparent, yet you still have moronic pop-security YouTubers repeating that bullshit that Apple is a secure platform.

[-] GekkoState 10 points 11 months ago

Um no, the FBI used software developed by an Israel-based company to hack into it. This is well documented. Israel has been creating and selling iPhone hacking software to nation states for years. They also sold it to the Saudis, who used it to track and kill the American resident Jamal Khashoggi.

[-] HelloHotel@lemmy.world 2 points 11 months ago* (last edited 11 months ago)

You're right, I don't think those Israeli companies got a backdoor from Apple. A "magic packet" backdoor is too hard to hide in the code and would tank their trust FAST. However, they do encrypt the system files to prevent reverse engineering. iPhones still have enough bad practices (see the iMessage post above), some of them oddly specific, to make a software developer cry in the corner. Incompetence, UX tunnel vision, or intentional flaws? Honestly, I don't know the answer.

[-] farcaller@fstab.sh 8 points 11 months ago

In iOS 13 or later and iPadOS 13.1 or later, devices may use an Elliptic Curve Integrated Encryption Scheme (ECIES) encryption instead of RSA encryption

(from Apple's docs).

If you’re curious about it all, I'd suggest studying some notes from the protocol researchers instead of taking to the pitchforks immediately. Here's one good post on the topic.

[-] rostby@lemmy.fmhy.net 7 points 11 months ago

No way, governments spying on their own people? I could never believe such an act would be tolerated.

[-] foofiepie@lemmy.world 4 points 11 months ago

What about WhatsApp? Is that secure?

[-] zergtoshi@lemmy.world 20 points 11 months ago

Security rarely comes in absolutes. WhatsApp doesn't appear to be open source. That alone makes it, security-wise, a worse choice than Signal.

[-] TK420@lemmy.world 7 points 11 months ago

Anything made by Facebook is the worse alternative.

[-] zergtoshi@lemmy.world 4 points 11 months ago

That's another viewing angle coming to the same result :)
But even before Facebook purchased WhatsApp, it would have been a worse choice than Signal.
Nowadays WhatsApp is a very bad choice in a lot of ways, except for the network effect, which is its only real strength.
Speaking of the network effect: back when Signal was TextSecure, I could message a total of 2 (two!) contacts with it, and the UX was far from awesome.
Signal has come a long way. The UX is great and I can message a lot of my contacts on Signal now.

[-] TK420@lemmy.world 4 points 11 months ago

The homies are all on Signal now, everyone else is on iMessage. It’s been a struggle, but we are getting there.

[-] crispy_kilt@feddit.de 8 points 11 months ago

Nobody knows, because it is closed source. It might be, but it might just as well not be. Best to use Signal.

[-] TK420@lemmy.world 6 points 11 months ago

lol is Facebook secure?

[-] RaoulDook@lemmy.world 3 points 11 months ago* (last edited 11 months ago)

Thanks for bringing that info here. I was already using Signal but I was concerned about their approach to notification security when I read this news this week.

Here's some info I found on the Signal subreddit (not verified, just comments):

First comment: "All that goes through the Google or Apple push notification systems is 'you've got a push notification.'

It's up to your Signal app to then wake up, contact Signal's servers, and see what the notification was. Message content and sender identity never pass through Google/Apple push infrastructure."

Second comment: "Signal does not use the Google notification system, is my understanding. For apps that do, Google only gets metadata, not the content of the message."

The second comment is not quite right: Signal does use the Google notification system if you install it from the Play Store. You can avoid that by installing the APK downloaded from the Signal site.

Metadata that is unencrypted could include things that identify who the message is to or from, and the timestamps of the messages. Seems like we can only be sure the content of messages is secure, but not the metadata.

[-] homesweethomeMrL@lemmy.world 65 points 11 months ago* (last edited 11 months ago)

The data is said to have been used to attempt to tie anonymous users of messaging apps to specific Apple or Google accounts.

So it's not about the notifications or even necessarily the data the app handles; just that there's an apple ID or google ID they're pinging to see who it is.

Today's lesson is: never use your Apple ID or (ugh) Google ID for anything important. If you can avoid using either for anything, great, but we all know we're not international super spies and sometimes you just want to play a card game or something. Still, if someone's unaware that smartphones are tracking devices, they should probably know that now.

I'm amazed that Apple was prohibited from saying anything until now.

[-] BearOfaTime@lemm.ee 27 points 11 months ago

Just because we're not James Bond today, doesn't mean we won't be a person of interest tomorrow.

That's what's so dangerous, especially for stuff that's just collected for no particular reason. Look at the man who was arrested for a crime simply because he biked through the area around the time of the crime and his Google location history showed up in a search.

[-] AVincentInSpace@pawb.social 15 points 11 months ago

Look at the man who texted photos of his son's genitalia to said son's doctor and got his entire Google account banned when his phone automatically synced them to Gdrive and the algorithm decided he was a pedophile

[-] mojo@lemm.ee 34 points 11 months ago

Anyone who thinks Apple is private is getting fucked balls deep by marketing they've taken at face value.

[-] noodlejetski@lemm.ee 34 points 11 months ago

Not sure how it works on iOS, but at least on Android, Signal has been taking some extra measures to avoid that. The message contents aren't delivered over GCM, just the ping that there's a new incoming message, which is then downloaded by Signal separately.

[-] BearOfaTime@lemm.ee 7 points 11 months ago* (last edited 11 months ago)

That's kind of how iMessage works: the Apple equivalent to GCM (Google Cloud Messaging) is called APNs (Apple Push Notification service), and it sends a notification to the phone, which then retrieves the message.

It'd be interesting to hear the perspective of the developers of Beeper Mini, since they just reverse-engineered iMessage.

https://jjtech.dev/reverse-engineering/imessage-explained/

[-] kpw@kbin.social 8 points 11 months ago

How do those governments have access to this data? Is it not TLS encrypted?

[-] prettybunnys@sh.itjust.works 17 points 11 months ago

The article states that Apple recommends not putting any sensitive data in the payloads, as well as encrypting the payloads.

This sounds a lot like a scenario where Apple is disclosing that a mechanism used for standard mobile communication is being surveilled by governments, not necessarily a scenario where something Apple or Google are doing is inherently surveillance.

Here it seems like the surveillance is occurring at the third parties who send the push notifications.

[-] LWD@lemm.ee 13 points 11 months ago* (last edited 11 months ago)
[-] BearOfaTime@lemm.ee 3 points 11 months ago

Right?

First they get location data because cell towers and people not caring.

Then they notice all these message notifications between these dozen people at this time, at this location, that happens to coincide with a protest.

Ding, fries are done!

[-] GenderNeutralBro@lemmy.sdf.org 12 points 11 months ago* (last edited 11 months ago)

Apple would be able (and perhaps required?) to provide the decrypted data. TLS is not end-to-end encryption; it's just server-to-client. It's useful to prevent MITM wiretapping but it is NOT useful to prevent server-side spying.

The article quotes Apple as saying they can update their transparency report now that this is public. Doesn't look like they have data for 2023 yet at https://www.apple.com/legal/transparency/

I'd think Apple could make push notification content end-to-end encrypted if they so desired, but I don't know how they could avoid having access to the vendor and user at minimum for the sake of validation and delivery.
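
For what it's worth, apps can already do most of this themselves. Here's a rough Python sketch of the pattern (all names and keys invented; Web Push's RFC 8291 standardizes a similar idea): the app's own server encrypts the notification body with a key only the app holds, so the push relay only ever sees ciphertext plus the routing metadata it needs for delivery.

```python
# Rough sketch (invented names/keys): the app server seals the notification body with a
# key shared only with the app, so Apple/Google see ciphertext plus routing info only.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os, json

app_key = AESGCM.generate_key(bit_length=256)   # provisioned when the app registers

def build_push(device_token: str, body: dict) -> dict:
    nonce = os.urandom(12)
    sealed = AESGCM(app_key).encrypt(nonce, json.dumps(body).encode(), None)
    # Only this dict would be handed to the push service: a mute wake-up plus routing info.
    return {"token": device_token, "nonce": nonce.hex(), "payload": sealed.hex()}

push = build_push("device-token-123", {"from": "Bob", "text": "lunch?"})

# On the device, the app (not the OS push stack) decrypts and renders the notification.
opened = AESGCM(app_key).decrypt(bytes.fromhex(push["nonce"]),
                                 bytes.fromhex(push["payload"]), None)
print(json.loads(opened))
```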

[-] ImTryingLemmy@lemmy.world 3 points 11 months ago

To turn that question around, what incentive do the corporations have to encrypt that data? Whole bunch easier to just not care.

[-] LainOfTheWired@lemy.lol 6 points 11 months ago

Good time to switch to an open source degoogled android ROM and set up your own push notification server.

Until people stop giving up their freedom to these companies by agreeing to legal documents they don't even read, it's only going to get worse.

[-] solarvector@lemmy.zip 55 points 11 months ago

I agree those are good things to do.

But... Blaming people who are being fucked over by forces generally outside their control is not really going to help their or our situation. Expecting or demanding "people" to just change is also not realistic. Even if they wanted to, time, effort, energy, knowledge, skills, and attention are all finite. This is just one important issue or source of exploitation among a sea of others.

load more comments (3 replies)
[-] iAmTheTot@kbin.social 30 points 11 months ago

Lol you're dreaming if you think even 0.1% of people will be interested in setting up their own server.

[-] Socsa@sh.itjust.works 3 points 11 months ago* (last edited 11 months ago)

They're also dreaming if they think doing these things doesn't just make them stand out, or that it provides them any real protection from state actors.

The number one rule of tradecraft is to blend in. I promise that you haven't thought of some way of using an always connected smartphone that the NSA hasn't considered. They are probably the ones making your degoogled ROMs.

This is hubris, plain and simple. If your goal is to hide from state actors then the best way of doing that is to be uninteresting statistical noise.

[-] LWD@lemm.ee 2 points 11 months ago* (last edited 11 months ago)
[-] deadcade@lemmy.deadca.de 8 points 11 months ago

Most "standard" messaging apps (that includes signal, telegram) use the "OS provided" push service. On Android, they use firebase cloud messaging, a component of google play services.

Degoogled Android means not having any notifications, unless the app supports UnifiedPush, runs in the background 24/7 (which drains battery), or runs in the background occasionally (which delays notifications).

If the app runs in the background occasionally, you can "burden" the people on the other side by being slow to respond. (A trivial sketch of that polling pattern follows.)
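
For anyone curious what "runs in the background occasionally" looks like, here's a trivial Python sketch of the poll loop (invented names, no real app's code): the longer the interval, the less battery it costs and the later your notifications arrive.

```python
# Toy sketch of the "occasional background poll" pattern: the client wakes up every few
# minutes, asks its own server for pending messages, and posts a local notification
# itself; no Google/Apple push infrastructure is involved.
import time, random

POLL_INTERVAL_SECONDS = 300   # longer interval = less battery, more notification delay

def fetch_pending_messages():
    # Placeholder for a real HTTPS call to the messaging server.
    return ["hello"] if random.random() < 0.3 else []

while True:
    for msg in fetch_pending_messages():
        print(f"local notification: {msg}")
    time.sleep(POLL_INTERVAL_SECONDS)
```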

[-] wreckedcarzz@lemmy.world 5 points 11 months ago* (last edited 11 months ago)

Eh, I use a few apps that have true FOSS forks and thus don't use GCM but the keep-alive method, and I didn't notice a difference in battery when I made the switch.

Also lol #3 isn't exactly a "burden", take the hint and go away people. Let me live in blissful solitude.

[-] registrert@lemmy.sambands.net 1 points 11 months ago

Pretty much my experience with pull-based notifications. I've even tested the same client on the same setup against both NTFY and client pull without seeing a noticeable difference in battery usage.

[-] Socsa@sh.itjust.works 1 points 11 months ago

It also means you will be on a very short list of people who use Unified Push.

[-] Brkdncr@sh.itjust.works 6 points 11 months ago

Sounds just like the idea that governments can retrieve metadata from phone calls without much hassle.

I’m not sure there is much you could do to get around this on iOS besides disabling push notifications in your app.

this post was submitted on 06 Dec 2023
549 points (98.8% liked)
