[-] phoenixz@lemmy.ca 55 points 1 year ago

I'm very conflicted on this one.

Child porn is one of those things that won't go away if you prohibit it, like alcohol. It'll just go underground and cause harm to real children.

AI child pornography images, as disturbing as they might be, would serve a "need", if you will, while not actually harming children. Since child pornography doesn't appear to be one of those "try it and you'll get addicted" things, I'm genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

[-] clausetrophobic@sh.itjust.works 34 points 1 year ago

Normalisation in culture has effects on how people behave in the real world. Look at Japan's sexualisation of women and minors, and how they have huge problems with sexual assault. It's not about whether or not real children are getting hurt; it's about whether it's morally right or wrong. And as a society, we've decided that CP is very wrong as a moral concept.

[-] PhlubbaDubba@lemm.ee 24 points 1 year ago

Here's the thing though: being too paranoid about normalization also makes the problem worse, because the truth is that these are people with severe mental problems who, in most cases, likely want to seek professional help.

The problem is the subject is SO taboo that even a lot of mental health professionals will chase them off like rabid animals when the solution is developing an understanding that can lead to a clinical treatment plan for these cases.

Doing that will also help with the CSAM problem, since getting people out of the alleyways and into professional help will shrink the market significantly, both immediately and over time, reducing the amount of content that gets made and, as a result, the number of children victimized to make that content.

The key factor remains: we have to stop treating these people like inhuman monsters who deserve death and far worse whenever they're found. They're sick souls who need robust mental health care and thought-management strategies.

[-] JoBo@feddit.uk 3 points 1 year ago

None of that is an argument for normalisation via legalisation. Offenders and potential offenders should feel safe to seek help. Legalising AI-generated CSAM just makes it much less likely that they'll see the need to seek help. In much the same way that rapists assume all men are rapists, because most men don't make it clear that they're not.

[-] phoenixz@lemmy.ca 1 points 1 year ago

I'm sorry, should I make clear to every bank that I'm not a bank robber? Do I seriously have to tell every woman that I am not a rapist? That is a really bad argument. The vast, VAST majority of men are not rapists; saying it's men's fault because they don't apologize or clarify that they're not rapists is just... crazy

[-] JoBo@feddit.uk 1 points 1 year ago* (last edited 1 year ago)

Where did you get any of that from? Why does any of what I said somehow imply telling women anything at all?

Get a fucking grip.

[-] fubo@lemmy.world 18 points 1 year ago

On the other hand, producing porn is illegal in India and they have huge problems with sexual assault too.

[-] Shush@reddthat.com -4 points 1 year ago

Producing - sure. But consuming?

[-] MrSqueezles@lemm.ee 15 points 1 year ago

I heard an anonymous interview with someone who was sickened by their own attraction to children. Hearing that person speak changed my perspective. This person had already decided never to marry or have kids, and chose a career to that same end, with a low likelihood that kids would be around. Clearly, since the alternative was giving up on love and family forever, the attraction wasn't a choice. Child porn that wasn't made with children (comics, I guess) was what they used to fantasize, to keep from carrying through on those desires in real life.

I don't get it, why anyone would be attracted to kids. It's gross and hurtful and stupid. If people suffering from this problem have an outlet, though, maybe fewer kids will be hurt.

[-] pineapplelover@lemm.ee 12 points 1 year ago* (last edited 1 year ago)

I'm thinking it should still be illegal, but if they get charged for it, make it less severe than being charged with actual CP. This might naturally incentivize that industry to go for AI-generated images instead of trafficking. Also, I think that if they took an image of an actual child and used AI to do this stuff, it should be more severe than using a picture of a legal-aged person to make CP.

[-] kromem@lemmy.world 11 points 1 year ago

I'd go more in the direction of state sponsored generation and controlled access.

If you want legal unlimited access to AI-generated CSAM, you need to register with the state for it, and in so doing also close off access to positions that would put you in situations where you'd be more able to act on it (e.g. employment in schools, child hospitals, church youth leadership, etc.).

If doing that, and no children are harmed in the production of the AI-generated CSAM, then you have a license to view and possess (but not redistribute) the images registered with the system.

But if you don't have that license (i.e. didn't register as sexually interested in children) and possess them, or are found to be distributing them, then you face the full force of the law.

[-] ParsnipWitch@feddit.de 4 points 1 year ago

There are many things still unclear about whether or not this will increase harm.

We don't know how these images affect people and their behaviour. Many techbros online treat it like a fact that media does not influence behaviour and thought processes, but if you look at the research, this isn't clear-cut at all. And some research has been able to show that specific media does indeed influence people.

Additionally, and something rarely talked about, these images, stories, and videos can be used to groom children and teenagers, either to become victims and/or to become consumers themselves. This was a thing in the past, and I bet it is still happening with manga depicting loli hentai. Making these images legal will give groomers an even better tool.

[-] phoenixz@lemmy.ca 1 points 1 year ago

If Loli porn can turn people into pedophiles then I think humanity is having bigger issues

[-] UsernameIsTooLon@lemmy.world 3 points 1 year ago

It's an ethical dilemma, just an extremely controversial one. You really have to weigh whether or not we should keep some chaos if it means the betterment of society as we advance forward.

I don't think things should be as black and white as legal or not. I think the answer lies somewhere in between, something like decriminalizing drugs: mostly illegal, but it could benefit those who are genuinely seeking help. It would just take a lot of convincing for me to accept that an individual needs to seek out this material, or else they'd be a danger to those around them.

[-] UntouchedWagons@lemmy.ca 2 points 1 year ago

Isn't AI art based on pre-existing content that's been fed into the model?

[-] Skwerls@discuss.tchncs.de 20 points 1 year ago

Yes, but not in the way I think you're implying: it is not trained on CSAM images. It can put the pieces together to varying degrees of success. If you ask for a Martian hedgehog in a tuxedo riding a motorcycle, it can create something that looks like that without being trained on exactly that thing.

[-] LogicalDrivel@sopuli.xyz 9 points 1 year ago

Martian hedgehog in a tuxedo riding a motorcycle

Just to prove your point, I fed that into an AI (DreamShaper 8) with no other prompts or anything, and this was the first image it generated.
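
(For anyone curious what that looks like in practice, here is a minimal sketch using Hugging Face's diffusers library. The Hub model id, sampler settings, and filename below are assumptions for illustration, not a record of the exact setup used above.)

```python
# Minimal, hypothetical sketch of a single text-to-image run with the
# diffusers library. The model id and settings are assumptions for
# illustration only.
import torch
from diffusers import AutoPipelineForText2Image

# Load a DreamShaper 8 checkpoint (assumed Hub id) onto the GPU.
pipe = AutoPipelineForText2Image.from_pretrained(
    "Lykon/dreamshaper-8",
    torch_dtype=torch.float16,
).to("cuda")

# A single prompt, no negative prompt or other conditioning.
image = pipe(
    prompt="Martian hedgehog in a tuxedo riding a motorcycle",
    num_inference_steps=30,
    guidance_scale=7.5,
).images[0]

image.save("martian_hedgehog.png")
```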

[-] Skwerls@discuss.tchncs.de 4 points 1 year ago

Lol thanks. Not sure what's Martian about it, but it got the rest pretty well!

[-] davidgro@lemmy.world 5 points 1 year ago

The black and green colors match Marvin's head, but it's mostly missing red for his body

[-] jcg@halubilo.social 3 points 1 year ago

You can see the red on the hands on the motorcycle, but most of it would be covered by the tuxedo

[-] JoBo@feddit.uk -3 points 1 year ago

You can certainly argue that AI-generated CSAM does less harm but you can't argue from that to legalising it because it still does a bucketload of harm. Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

This argument is a non-starter and people really need to stop pushing it.

[-] NightAuthor@lemmy.world 5 points 1 year ago

Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

If any of that was objectively true, then yeah, I agree. Problem is, it looks like you just pulled that out of your ass.

[-] phoenixz@lemmy.ca 1 points 1 year ago

You're literally claiming a bunch of things as facts. Any sources to back that up?
