
Titled “Enhanced Visual Search,” this toggle permits iPhones to transmit photo data to Apple by default, raising concerns about user privacy and data-sharing practices.

top 6 comments
[-] Mikina@programming.dev 15 points 4 days ago

I mean, Apple is one of the companies that volunteered for the current optional version of ChatControl. They're already sending your messages and photos to the EU to be scanned for "illegal" content.

Maybe I'm thinking of another thing, but wasn't that scanning locally for hashes, versus uploading anything?

[-] Mikina@programming.dev 1 point 3 days ago

Tbh I'm not sure. I vaguely remember that hashes did play a role in how ChatControl works, but I think it wasn't looking just for 1:1 matches of known illegal content, but also for certain patterns. I remember reading that it had an awfully high false-positive rate, which someone then has to review manually. https://www.patrick-breyer.de/en/posts/chat-control/

According to the Swiss Federal Police, 80% of the reports they receive (usually based on the method of hashing) are criminally irrelevant. Similarly in Ireland only 20% of NCMEC reports received in 2020 were confirmed as actual “child abuse material”.

[-] TaviRider@reddthat.com 2 points 2 days ago

The 1:1 matching and the porn detection were separate capabilities.

Porn detection is called Communication Safety, and it only warns the user. If the device is set up in Screen Time as a child's device, someone has to enter the parent's Screen Time passcode to bypass the warning. That's it. It's entirely local to the device. The parent isn't notified or shown the image, and Apple never receives it. Because it uses an ML model, it can have false positives.
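To make the warn-only, on-device flow concrete, here is a minimal sketch. The `sensitive_score` function is a hypothetical stand-in for Apple's unpublished on-device ML model; the threshold and inputs are assumptions for illustration only.

```python
# Hypothetical stand-in for an on-device classifier. Apple's real
# Communication Safety model, its inputs, and its threshold are not public.
def sensitive_score(image_bytes: bytes) -> float:
    # Fake classifier: pretend images tagged "FLAGGED" score high.
    return 0.9 if image_bytes.startswith(b"FLAGGED") else 0.1

def should_warn(image_bytes: bytes, threshold: float = 0.5) -> bool:
    # Runs entirely locally: no image, score, or verdict leaves the device.
    # A false positive costs the user nothing more than a warning prompt
    # (or, on a child's device, a parent entering the Screen Time passcode).
    return sensitive_score(image_bytes) >= threshold

print(should_warn(b"FLAGGED-photo"))  # True -> show the local warning
print(should_warn(b"holiday-photo"))  # False -> display as normal
```

The point of the design is visible in the code: the decision function returns a boolean that only ever gates a local UI prompt, so a misclassification never results in data leaving the device.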

CSAM detection was exact 1:1 matching using a privacy-preserving hashing system. It prevented users from uploading known CSAM to iCloud, and that's it. Apple couldn't tell whether there was a match, or learn the hashes of the images being evaluated.
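The 1:1 matching idea can be sketched roughly as follows. This is a toy illustration, not Apple's actual NeuralHash / private-set-intersection protocol: the `average_hash` function and the blocklist contents are invented for the example, and the real system additionally hid the match result from both the device and Apple.

```python
# Toy perceptual hash: reduce an image (here, a flat list of grayscale
# pixel values) to a bit string, then require an exact 1:1 hit against
# a blocklist of known-bad hashes. Only exact hash matches count; there
# is no fuzzy "looks like" scoring at the matching step.
def average_hash(pixels: list[int]) -> str:
    """One bit per pixel: set if the pixel is above the image mean."""
    mean = sum(pixels) / len(pixels)
    return "".join("1" if p > mean else "0" for p in pixels)

# Hypothetical blocklist standing in for the database of known-image hashes.
blocklist = {average_hash([0, 0, 255, 255])}

def is_known_image(pixels: list[int]) -> bool:
    return average_hash(pixels) in blocklist

print(is_known_image([0, 0, 255, 255]))    # True: exact hash match
print(is_known_image([255, 255, 0, 0]))    # False: different hash
```

Note that a perceptual hash tolerates small pixel-level changes (slightly altered copies of a blocklisted image hash to the same bits), which is also why such systems can produce the false positives discussed above.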

Many people misunderstood and conflated the two capabilities, and often claimed without evidence that they did things that they were designed never to do. Apple abandoned the CSAM detection capability.

Screw all of that.

[-] FundMECFS@slrpnk.net 5 points 4 days ago

Me, whose phone is too old to update to the new iOS: "hahahha, finally an advantage to my low income"

this post was submitted on 06 Jan 2025
63 points (94.4% liked)

Cybersecurity


c/cybersecurity is a community centered on the cybersecurity and information security profession. You can come here to discuss news, post something interesting, or just chat with others.


founded 2 years ago