I dont like drama (lemmy.ohaa.xyz)
submitted 1 year ago by Oha@lemmy.ohaa.xyz to c/memes@lemmy.ml
[-] 30p87@feddit.de 2 points 1 year ago

How would one implement CSAM protection? You'd need actual ML to check for it, and I don't think there are trained models available. And then you'd have to find someone willing to train such a model, somehow. Also, running an ML model would be quite expensive in energy and hardware.

[-] NightAuthor@beehaw.org 2 points 1 year ago

There are models for detecting adult material; idk how well they'd work on CSAM though. Additionally, there exists a hash-identification system for known images. Idk if it's available to the public, but I know Apple has one.

Idk, but we gotta figure out something
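
The hash-identification approach mentioned above usually means *perceptual hashing*: known images are reduced to compact fingerprints that stay stable under resizing or recompression, and new uploads are compared against a database of those fingerprints. Real systems (e.g. Apple's NeuralHash or Microsoft's PhotoDNA, both assumed here only as named examples, not reproduced) use much more robust algorithms, but a minimal average-hash sketch shows the idea:

```python
# Illustrative average-hash (aHash) sketch -- NOT Apple's or Microsoft's
# actual algorithm. Assumes the image has already been downscaled to a
# small grid of grayscale values (0-255), e.g. 8x8.

def average_hash(pixels):
    """Return a bit list: 1 where a pixel is brighter than the mean."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return [1 if p > mean else 0 for p in flat]

def hamming_distance(h1, h2):
    """Number of differing bits between two hashes."""
    return sum(a != b for a, b in zip(h1, h2))

def is_match(h1, h2, threshold=5):
    """Treat images as the same known image if their hashes are close."""
    return hamming_distance(h1, h2) <= threshold

# A slightly re-encoded copy still hashes close to the original.
original = [[10, 200], [220, 30]]
recompressed = [[12, 198], [219, 28]]  # small pixel-level noise
print(is_match(average_hash(original), average_hash(recompressed)))
```

Because only hashes of *known* images are stored, the database holds no image content, and matching is a cheap bit comparison rather than an expensive ML inference. The trade-off is that perceptual hashes only catch previously catalogued material, not new images.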

this post was submitted on 01 Sep 2023
915 points (96.1% liked)
