this post was submitted on 10 Jul 2024
171 points (100.0% liked)
Technology
…
What?
Have you not interacted with teenage boys?
I can hardly think of a better way to teach them that there are no consequences, and that they can keep doing this as long as they smirk and say they're sorry whenever they get caught.
Punishment or not, those charges are still scary. I think the probation and courses are a good addition.
I don’t think those are additions, I think those are the punishments for those charges, in full. I could be wrong but that’s how I read it.
Some of them are too young to receive real sentencing. It's important to remember that they're children, too.
Yeah, it's probably more important to make sure we don't have child porn generation machines available to anyone online.
Since anyone can download and train their own AI, that ship has probably sailed.
I think training your own image generator on existing child porn is probably beyond most high schoolers. I'd be happy if at least commercial options were held responsible for distributing generated CP, which is already illegal BTW.
I don't think the models are trained on CP. They're likely trained on widely-available porn.
This. If you ask an image generator for a bed in the shape of a pineapple, it probably has no pineapple-shaped beds in its training data but it has pineapples and beds and can mash the concepts together.
Technically, any model trained on LAION-5B before December 2023 was trained on CSAM.
But yeah, I expect any porn model trained on a sufficient diversity of adult actors could be used to make convincing CP even without having it in the training data. AI image generation is basically the digital equivalent of a chainsaw - a tool for a particular messy job that can really hurt people if used incorrectly. You wouldn't let a typical kid run around unattended with one, that's for sure.
I know I'm wading into the danger zone here, but let's also remember we're talking about teenagers. A (for example) 15 year old's body type will be closer to an 18 year old's than a 5 year old's, so the perfectly legal porn model would work just fine for that, uh, purpose.
Good luck with that
I mean, you can do a significant amount by making it illegal to offer on the open web, which might be the way to go. But making desirable things obtainable only by going outside the law carries its own long-term consequences.
I disagree. These children are minors, and their behavior, while abhorrent, reflects a fundamental lack of perspective and empathy.
I've been a teenage boy before and I did some bone-headed things. Maybe not this bad, but still, I agree with the judge in this instance that it would be inappropriate to impose permanent consequences on these kids before their life even gets started because they were stupid, horny, teenage boys.
Even if we assume that these kids don't all have well-meaning parents who will impose their own punishments, having a probation officer in high school is not going to help with popularity. And mandatory classes that force these boys to evaluate the situation from another perspective seem like a great add-on.
I know it doesn't feel like justice, but our goal as a society shouldn't be to dole out maximum punishment in every instance. The goal is to allow all of us to peacefully coexist and contribute to society - throwing children in a dark hole somewhere to be forgotten isn't going to help with that.
Having said all of the above, it feels like a good time to emphasize that we still don't have any good ideas for solving the core problem here, which is the malicious use of this technology that was dumped on society without any regard for the types of problems that it would create, and entirely without a plan to add guard rails. While I'm far from the only one considering this problem, it should be clear enough by now that dragging our feet on creating regulation isn't getting us any closer to a solution.
At a minimum it feels like we need to implement a mandatory class on the responsible use of technology, but the obvious question there is how to keep the material relevant. Maybe it's something that tech companies could be mandated to provide to all users under 18 - a brief, recurring training (could be a video, idc) and assessment that minors would have to complete quarterly to demonstrate that they understand their responsibilities.
Completely agree with 100% of this
I’m just saying that I think the answer lies somewhere between “take some classes and promise not to do it again” and “adult prison”. They caused significant harm to another human being, in a way so significant that we all agreed it should be illegal. Yes, I know that probably wasn’t the intent on their part. But this kind of “oh, I just got horny and kind of didn’t care / wasn’t focused on what the impact was” is not a thing you want to teach them has wiggle room, as long as they make sure to apologize about it after.
Community service? Home arrest? Juvenile detention for 21 days? Fuckin something? I’m not saying put them in the hole.
Same.
I would be surprised if anyone with the same history didn't do at least a few completely boneheaded things at some point in their youth.