
Some of you may have noticed a lot of people freaking out about CSAM: a bunch of communities closing, instances restricting registrations, turning off image uploads, or shutting down completely. It's a bit chaotic.

Fortunately, your admin has been fighting this fight for the past year, so I had already developed some tools to help me out. I repurposed one of them to cover Lemmy images.

Using this approach, I've now turned on automatic scanning of new uploads.

What this means for you is that occasionally you will upload an image for a post and it will stop working after a bit. C'est la vie; just upload something else. Changing the format or slightly altering the image won't help you.
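To give a feel for why small alterations don't fool this kind of scanning, here is a minimal, purely illustrative sketch of a perceptual "average hash". The post doesn't say which method the scanner actually uses; this toy example just shows that a hash built from an image's overall structure is stable under minor edits like a brightness tweak, so comparison by Hamming distance still matches.

```python
# Illustrative sketch only: an "average hash" sets one bit per pixel,
# 1 if the pixel is brighter than the image's mean brightness.
# Small edits leave most bits unchanged, so the hashes stay close.

def average_hash(pixels):
    """Compute a bit-per-pixel hash of a 2D grayscale image."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming(a, b):
    """Count differing bits between two hashes."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 8x8 grayscale "image" and a slightly brightened copy.
original = [[(x * y) % 256 for x in range(8)] for y in range(8)]
altered = [[min(255, p + 3) for p in row] for row in original]

h1, h2 = average_hash(original), average_hash(altered)
print(hamming(h1, h2), "bits differ out of", len(h1))  # prints: 0 bits differ out of 64
```

A uniform brightness shift moves every pixel and the mean by the same amount, so here the hash is bit-for-bit identical; real perceptual hashes and classifiers are similarly robust to re-encoding and minor edits.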

Also, you might sometimes see missing thumbnails on posts from other communities. Those were cached thumbnails hosted by us; the original images should still work in those cases.

Unfortunately, this sort of AI scanning is not perfect and, due to the nature of the beast, it will catch some false positives, though at an acceptable rate. I find that this is OK for a small social network site run as a hobby project.

Cool? Cool.

[-] lambalicious@lemmy.sdf.org 1 point 1 year ago

and due to the nature of the beast, it will catch more false positives but to an acceptable degree

  1. What is this "acceptable degree"? Where is it documented?
  2. What is the recourse for the uploader in case of a false positive? And no, I don't mean "upload something else"; I mean, what do you answer to "my legit content is being classified by a shared internet tool as CSAM, of all things"?
this post was submitted on 29 Aug 2023
