FBI Arrests Man For Generating AI Child Sexual Abuse Imagery
(www.404media.co)
For some reason the US seems to hold a weird position on this one; I don't really understand it.
It's written to be illegal, but if you look at prosecutions, I think there have been only a handful of charged cases. The prominent ones also involve relevant prior offenses, or worse.
It's also interesting when you consider that there are almost certainly large image boards hosted in the US that host what could be construed as "cartoon CSAM", notably e621. I'd have to verify their hosting location, but I believe they're in the US, and so far I don't believe they've ever had any issues with it. I'm sure there are other good examples as well.
I suppose you could argue they're exempt under the publisher rules, but these sites generally don't moderate against these images, and I feel like this would be the rare exception where that wouldn't be applicable.
The law is fucking weird, dude. There's a massive disconnect between what we should be seeing and what we are seeing. I assume it's because the authorities who police this stuff almost exclusively go after real CSAM, on account of it being a literal offense, as opposed to drawn CSAM, which is a proxy offense.
It seems to me to be a lesser charge: a net that catches a larger population so they can then go fishing for bigger fish to make the prosecutor look good. Or, as I've heard from others, it's used to simplify prosecution. PedoAnon can't argue "it's a deepfake, not a real kid" to the SWAT team.
Ah, that could be a possibility as well: ensuring reasonable flexibility in prosecution so you can be sure of what you get.