this post was submitted on 10 Feb 2024
526 points (97.3% liked)
The Onion
A place to share and discuss stories from The Onion, Clickhole, and other satire.
The only people who believe they'd benefit from regulating deepfakes are high-profile figures and/or internet narcissists.
"Boohoo, someone made a video of Trump's hemorrhoids and Biden licking them." Everyone already knows you can easily fake a video without using "AI" for it; we have a whole fucking industry for it, pumping out hundreds of movies every Saturday. We already know you shouldn't believe everything you see.
It goes a bit beyond that nowadays. Deepfakes can be used to create false evidence, for example.
Deepfakes are already being used on an industrial scale for scams and conning people.
It's not a case of them needing regulating because they offend people's sensibilities; it's because they're actively being used to harm people.
How would more regulation help? What you're talking about is already illegal.
The same way cracking down on CP makes it harder for pedos to access.
Y'all are seriously looking creepy
Good one. You want to lock people up but people who believe in the first amendment are creepy. Nice spoof of moral panic populism.
Not everyone is an American idiot
True. Freedom of speech and of the press is a peculiarly American thing. In virtually all other countries... No, wait. That's the 2nd amendment. What were we talking about?
Good one. You want the freedom to create any porn you want regardless of who it hurts without any personal accountability.
This is a weird hill to die on but I've seen worse. Not really.
You don't have to be a Hugh Hefner to reject fascism.
Putting in safeguards to protect people from porn being made of them is fascism?
Like I said. Weird hill.
Yeah, fraud used to be such a fun pastime for the whole family. Now we need to regulate it. Technology ruins everything.
The past month or so I've started encountering quite a few deepfakes on dating sites. I honestly can't tell they're deepfakes just by looking; the only reason I've been able to tell is that they were very obviously Instagram model photos. I reverse image searched them to find where they were taken from and confirm my suspicion that the profile was using stolen photos, only to find that the original photos aren't quite the same. It'll be the exact same shot with the same body but a different face, with identifying tattoos removed, moles added, etc.
If they weren't obvious modelling shots that made me want to reverse image search them, I wouldn't have known at all. It makes me wonder how many deepfaked images I've encountered on dating sites already and just not known about because they've been fairly innocuous-looking photos...
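The altered-copy situation described above (same shot, same body, tweaked face or tattoos) is exactly what perceptual hashing is designed to catch: unlike a cryptographic hash, a perceptual hash of a lightly edited image stays close to the hash of the original. Below is a minimal, self-contained sketch of the "average hash" idea; real tools (reverse image search engines, or libraries like ImageHash) work on decoded images, but here the images are stood in for by plain 8x8 grids of brightness values so the example runs without any image library.

```python
def average_hash(pixels):
    """Compute a 64-bit average hash from an 8x8 grayscale grid.

    pixels: 8 rows of 8 brightness values (0-255), standing in for an
    image already downscaled to 8x8. Each bit is 1 if that pixel is
    brighter than the mean brightness of the whole grid.
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p > mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two 64-bit hashes."""
    return bin(a ^ b).count("1")

# Toy example: an "original" photo and a copy with a small region
# altered (standing in for a swapped face or removed tattoo); most of
# the frame is unchanged, so the hashes differ in only a few bits.
original = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
altered = [row[:] for row in original]
altered[0][0], altered[0][1] = 255, 255  # tweak two pixels

d = hamming(average_hash(original), average_hash(altered))
print(d)  # a small distance (well under ~10 bits) flags a near-duplicate
```

Two unrelated photos typically differ in around half of the 64 bits, so a small threshold on the Hamming distance separates "edited copy of the same shot" from "genuinely different image" — which is why the edited profile photos still surfaced in a reverse image search.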
That's already illegal.
In a courtroom sure. What about putting it on YouTube?
So you have no issues with me distributing deepfakes of you burning crosses across your neighborhood?
I'm not saying deepfakes should not be regulated.
I'm saying the examples are poor because scamming people is already illegal.
So you aren't actually saying anything at all. You're just being contrarian for the sake of it.
Not exactly. Arguments like "they should be regulated because they can be used for illegal stuff" are moot, since those uses are already regulated. I'm on the fence on the whole regulation thing, and I've yet to see any realistic examples of what regulation would look like.
Is it even logical to regulate ai images specifically, or should we lump it in together with any form of image manipulation?
Okay but can you tell the difference between legal real evidence and illegal false evidence?
The technology to create this type of false evidence is out there, and it's not going back into Pandora's box. The truth is that you can't trust a single video as 100% reliable evidence on its own.
Oh hi Ethan
Didn’t expect him to show up and defend his own article lol