AI rule (media.infosec.exchange)
[-] someguy7734206@sh.itjust.works 25 points 1 year ago

One thing I've started to think about for some reason is the problem of using AI to detect child porn. In order to create such a model, you need actual child porn to train it on, which raises a lot of ethical questions.

[-] breadcodes@lemm.ee 26 points 1 year ago

Cloudflare says they trained a model on non-cp first, then worked with the government to train it further on data that no human eyes ever see.

It's concerning there's just a cache of cp existing on a government server, but it is for identifying and tracking down victims and assailants, so the area could not be more grey. It is the greyest grey that exists. It is more grey than #808080.
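(For what it's worth, the "train on non-cp first, then fine-tune on the restricted data" step described above is essentially standard transfer learning. Below is a minimal sketch of that idea, assuming a PyTorch setup; the ResNet backbone, the two-class head, and the synthetic tensors standing in for the restricted data are all illustrative assumptions, not Cloudflare's actual pipeline.)

```python
# Hedged sketch of a two-stage setup: a backbone pretrained on ordinary
# imagery, with only a small classification head fine-tuned afterwards on
# restricted data that no reviewer looks at directly. Names and synthetic
# tensors below are illustrative only.
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset
from torchvision import models

# Stage 1: start from a backbone trained on ordinary images (ImageNet here)
# and freeze its feature extractor.
backbone = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in backbone.parameters():
    p.requires_grad = False

# Replace the final layer with a 2-class head (benign vs. flagged).
backbone.fc = nn.Linear(backbone.fc.in_features, 2)

# Stage 2: fine-tune only the head. Random tensors stand in for data that
# would never leave a controlled environment.
images = torch.randn(64, 3, 224, 224)
labels = torch.randint(0, 2, (64,))
loader = DataLoader(TensorDataset(images, labels), batch_size=16, shuffle=True)

optimizer = torch.optim.Adam(backbone.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

backbone.train()
for x, y in loader:
    optimizer.zero_grad()
    loss = loss_fn(backbone(x), y)
    loss.backward()
    optimizer.step()
```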

[-] tryptaminev@feddit.de 3 points 1 year ago

Well, many governments have had no issue taking over a cp website and hosting it for months afterwards, using it as a honeypot. They still hosted and distributed cp, possibly to thousands of unknown users who could redistribute it.
