this post was submitted on 29 Aug 2025
309 points (98.1% liked)
Privacy
you are viewing a single comment's thread
They're not even checking for CSAM.
That would be near impossible given the tech. Even in a normal portrait it's hard to judge someone's age, let alone in photos with more complex perspectives and only some body parts visible.
What they are doing is using hashes of specific real pictures that the police know are commonly shared.
Theoretically it could catch some careless content-consuming offenders. The worst offenders, the ones who produce new material, are beyond its scope.
But also, obviously what Google gets is just the hash codes and not the actual pics. If the police gave Google a hash targeting pics of Vance's bald head or (trans-positive) memes, who would know?
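To make the point concrete, here's a minimal sketch of the hash-matching idea described above. The hash list and the `is_flagged` helper are hypothetical, and real systems (e.g. PhotoDNA) use perceptual hashes that survive resizing and re-encoding rather than plain SHA-256, but the principle is the same: the scanner only ever compares digests against a supplied list, so it can only catch byte-for-byte (or perceptually near-identical) copies of already-known images, never newly produced material.

```python
import hashlib

# Hypothetical digest list supplied by an authority. The scanner never
# needs the original images, only these opaque hash values -- which is
# also why nobody outside can audit what the list actually targets.
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Hex digest of the raw image bytes."""
    return hashlib.sha256(data).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    # Exact-match lookup: only identical copies of a listed image match.
    # Perceptual hashes widen this to visually similar copies, but a
    # genuinely new photo still matches nothing.
    return sha256_of(image_bytes) in KNOWN_HASHES

print(is_flagged(b"test"))        # True  -- bytes on the known list
print(is_flagged(b"new photo"))   # False -- unknown content is invisible
```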
There was news some time ago that a man was investigated for taking nude pictures of children; it later turned out he had been sending pictures of his child to a doctor for diagnosis. How did that happen?
I'll link the source if I find it.
Update:
NYTimes - https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
Paywall removed - https://removepaywalls.com/https://www.nytimes.com/2022/08/21/technology/google-surveillance-toddler-photo.html
They are literally too big to care.
I've heard about this too.
That must be some other system indeed.
They don't really provide much information from how the images were actually shared though.
Maybe there is a machine-learning model trained to detect specific features in an arbitrary photo, but I can't imagine it being accurate without frequent false positives.
Could it be that once you get a certain number of "plausible" hits, a Google employee has to review them manually, and they quickly judged it wrongly?
Though that technically implies your private medical picture is now seen, and possibly covertly copied, by a (rogue) employee.
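The review flow speculated about above could look something like this. Everything here is assumed, not documented Google behavior: the threshold value, the `triage` helper, and the file names are all made up. The point is only that any score-plus-threshold design necessarily routes false positives, including private medical photos, into a human review queue.

```python
# Assumed cutoff for illustration only -- not a known Google value.
REVIEW_THRESHOLD = 0.8

def triage(photos: list[tuple[str, float]]) -> list[str]:
    """Return the photo IDs that would land in a human reviewer's queue.

    Each tuple is (photo_id, classifier_score). Anything scoring at or
    above the threshold gets looked at by an employee -- which is
    exactly where a false positive exposes a private picture.
    """
    return [photo_id for photo_id, score in photos if score >= REVIEW_THRESHOLD]

queue = triage([
    ("vacation.jpg", 0.05),                    # ignored
    ("medical_photo_for_doctor.jpg", 0.91),    # false positive: reviewed anyway
])
print(queue)  # ['medical_photo_for_doctor.jpg']
```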
It’s been well documented.
False positives don’t matter, and there’s no human to talk to when it occurs.
They don't manually review. They just shut down your account and you can't contact them. See my other comment.
False positives are a thing. They do scan all your photos for CSAM. Poorly.
We know this because of the article during the pandemic when a dude sent a photo of his son's penis to a doctor (it had an infection and the doctor asked to see it). Dude lost access to his entire Google account. Lost everything. Emails, files, everything. It wasn't a hash match.