submitted 1 year ago* (last edited 1 year ago) by lwadmin@lemmy.world to c/lemmyworld@lemmy.world

Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: since we changed our registration policy, they simply post from other instances instead.

We keep working on a solution, we have a few things in the works but that won't help us now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it hadn't been his community, it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what's next very soon.

Edit 2: removed the bit about the moderator tools. That came out harsher than we meant it. It's been a long day, and having to deal with this kind of thing left some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.

[-] AeonFelis@lemmy.world 11 points 1 year ago

Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.

Most people are lazy and stupid, so maybe hash checking is enough to catch a huge portion (probably more than 50%, maybe even 80% or 90%?) of the CSAM posted by people who don't bother (or don't know how) to alter the file?

[-] dipshit@lemmy.world 1 points 1 year ago

A hash would change if even one bit of the file changed. That could happen through corruption, automated resizing by photo-processing tools (e.g., most sites resize photos that are too large), re-saving a lossy file over and over (adding more JPEG artifacts), etc. This is why there aren't many automated tools for this kind of detection. Sites that have tried detection based on skin tones in a photo have failed spectacularly.
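To illustrate that point, here is a minimal sketch (plain Python, with made-up bytes standing in for an image file): flipping a single bit yields a completely different SHA-256 digest, which is why exact-hash blocklists only catch unmodified re-uploads of a known file.

```python
import hashlib

# Made-up stand-in for an image file's raw bytes.
original = b"\x89PNG fake image data" + bytes(1024)

# Flip a single bit in a copy of the data.
altered = bytearray(original)
altered[100] ^= 0x01

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(altered)).hexdigest())
# The two digests share no resemblance, so a plain hash blocklist
# only matches exact, byte-for-byte copies of a known file.
```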

I’ve never heard of this FBI middleware. Does anyone have a link to it? I’d like to understand what tools are available to combat this, as I’ve been considering starting my own instance for some time now.

[-] TechnoBabble@lemm.ee 1 points 1 year ago

I'm almost positive they've been developing an image-recognition AI that will make slightly altering CSAM photos obsolete.

Here's hoping.
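A related, simpler idea that already exists is perceptual hashing (roughly what tools like PhotoDNA do). The sketch below is not that tool, just an illustrative "average hash" assuming the Pillow library is available and using hypothetical file names. Unlike a cryptographic hash, visually similar images produce hashes that differ in only a few bits, so small alterations don't defeat the match.

```python
from PIL import Image  # assumes the Pillow library is installed


def average_hash(path: str, size: int = 8) -> int:
    """Shrink to a size x size grayscale thumbnail and threshold each pixel on the mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits


def hamming_distance(a: int, b: int) -> int:
    # Number of differing bits; a small distance means "visually similar".
    return bin(a ^ b).count("1")


# Hypothetical usage: compare an upload against a known-bad hash.
# if hamming_distance(average_hash("upload.jpg"), known_bad_hash) <= 5:
#     flag_for_review()
```

Real systems use far more robust hashes (and, increasingly, ML classifiers), but the principle is the same: match on what the image looks like, not on its exact bytes.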
