submitted 1 year ago* (last edited 1 year ago) by Blaze@discuss.tchncs.de to c/fediverse@lemmy.ml

https://poptalk.scrubbles.tech/c/poptalkmeta

(The site is already down and my instance wasn't subscribed, so I can't link properly; hopefully the bot will help.)

Link from LW: https://lemmy.world/post/3979585

all 44 comments
[-] meldrik@lemmy.wtf 73 points 1 year ago

We need much better moderation tools. List of uploaded content, list of cached content, option to purge cached content.
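Nothing like this exists in Lemmy's admin tooling today, which is the point of the request above. As a purely hypothetical sketch of what "list cached content / purge cached content" could look like at the filesystem level, assuming a pictrs-style media directory and a known-bad hash list (both the layout and the hash-list approach are assumptions, not anything Lemmy ships):

```python
import hashlib
from pathlib import Path

def list_media(root):
    """Return (path, size, sha256) for every file under a media directory."""
    entries = []
    for p in sorted(Path(root).rglob("*")):
        if p.is_file():
            digest = hashlib.sha256(p.read_bytes()).hexdigest()
            entries.append((str(p), p.stat().st_size, digest))
    return entries

def purge_matching(root, bad_hashes):
    """Delete any cached file whose sha256 appears on a known-bad list."""
    removed = []
    for path, _size, digest in list_media(root):
        if digest in bad_hashes:
            Path(path).unlink()
            removed.append(path)
    return removed
```

A real tool would also need to remove the database rows referencing the purged files, which is exactly the part only the Lemmy/pictrs developers can provide cleanly.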

[-] gabe@literature.cafe 21 points 1 year ago

Hopefully this will create the push that forces the Lemmy devs' hand on that front.

[-] TheButtonJustSpins@infosec.pub 1 point 1 year ago

Seems like this kind of insight tool could be created by the community separately.

[-] Trekman10@sh.itjust.works 17 points 1 year ago

I keep hearing they're swamped and short on people, so if that's really the case I hope this spurs knowledgeable people to join and help add these things.

[-] CMahaff@lemmy.world 2 points 1 year ago

Absolutely this. There are issues with deletes not federating properly too, right?

That's a big part of the issue here, since even when .world cleans up the content, it has already been pushed out to every other instance and will remain there until all THOSE admins also purge it.

[-] argv_minus_one@beehaw.org 62 points 1 year ago

Fascist shutdown of public discourse, step by step:

  1. Find out where public discourse is.
  2. Post child porn on an obscure corner of it.
  3. Take screenshots of the posted porn.
  4. Send law enforcement to seize all the things.
[-] gabe@literature.cafe 35 points 1 year ago* (last edited 1 year ago)

They're a dumbass if they do this, because they'll get prosecuted themselves for posting it. There's a safe-harbor provision for server admins as long as they make a good-faith effort to report it when it's brought to their attention (at least in the US). As long as you're doing your due diligence as a web host, you should be fine.

[-] argv_minus_one@beehaw.org 6 points 1 year ago

I'm talking about when the government wants an excuse for shutting down public discourse. Obviously it isn't going to prosecute itself.

[-] Zorque@kbin.social 22 points 1 year ago

They have much less roundabout ways of shutting things down if they really want to.

[-] squiblet@kbin.social 11 points 1 year ago

imo it’s more likely that some agitator dickweed would do that than a government. Both are conceivable, though.

[-] Sorchist@kbin.social 2 points 1 year ago

Ah yes, it was all a false flag, instigated by the government to take away our freedoms, I get it

[-] KSPAtlas@sopuli.xyz 1 point 1 year ago

Also, screenshots of CSAM would count as CSAM, right?

[-] poVoq@slrpnk.net 33 points 1 year ago* (last edited 1 year ago)

IANAL, but you are still responsible for it if you host it on a cloud server. The difference is mainly that the authorities will complain first to your cloud host, which will likely take down the entire site, instead of to you directly, who could swiftly remove the offending material if asked to do so.

It is FUD to say the police will SWAT you if someone reports CSAM on your server. But you should have an easy way for the responsible agencies to contact you for take-down requests.

For people hosting in the EU, this pdf document explains the legal situation a bit and also where and how to report CSAM, should you come across it.

[-] dingleberry@discuss.tchncs.de 25 points 1 year ago* (last edited 1 year ago)

Federation lovers are finding out how much work goes on behind the curtains of any modern website that hosts user-generated content.

To all the people who like to talk smack to Jitsi for requiring Google login, to lemmy.world for banning piracy communities, or even to YT for honouring takedown requests: go ahead and try to host just 100 people on your site.

[-] lemann@lemmy.one 20 points 1 year ago

I was contemplating setting up an instance dedicated to micromobility, active transport and livable cities... but this is a BIG turn-off. I don't think it's worth spinning up a new instance until the new mod tools land, along with the ability to disable caching of federated thumbnails.

Some instances are using ML to scan their pictrs folder, IIRC. It would be good if it could also quarantine/delete/purge the corresponding posts from Lemmy.
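Treating the ML model itself as a black box, a scan-and-quarantine pass over a media folder might look something like this sketch. `is_flagged` is a hypothetical stand-in for whatever classifier an instance actually runs; none of these names come from a real tool, and moving files aside for human review (rather than deleting outright) is my assumption about how such a scanner would behave:

```python
import shutil
from pathlib import Path

def scan_and_quarantine(media_dir, quarantine_dir, is_flagged):
    """Run a classifier callback over every cached file and move
    anything it flags into a quarantine directory for human review."""
    media_dir = Path(media_dir)
    quarantine_dir = Path(quarantine_dir)
    quarantine_dir.mkdir(parents=True, exist_ok=True)
    quarantined = []
    for p in sorted(media_dir.rglob("*")):
        if p.is_file() and is_flagged(p):
            dest = quarantine_dir / p.name
            shutil.move(str(p), str(dest))
            quarantined.append(dest)
    return quarantined
```

The missing piece the comment asks for is the last step: mapping a quarantined file back to its Lemmy post so the post itself can be removed, which needs cooperation from the Lemmy database.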

[-] not_amm@lemmy.ml 6 points 1 year ago

Hey, that sounds very interesting. Hope you can do it in the future, I'll definitely join. :)

[-] gabe@literature.cafe 19 points 1 year ago
[-] Blaze@discuss.tchncs.de 11 points 1 year ago* (last edited 1 year ago)

I know, and I just saw this after the meme about people hosting instances taking risks for others...

[-] Sickos@hexbear.net 15 points 1 year ago* (last edited 1 year ago)

Aww, dang, scrubbles was gonna be one of my internet friends. I get it though, protecting yourself is important. Good luck, comrade.

[-] eric5949@lemmy.cloudaf.site 15 points 1 year ago

Ngl this might be it for my instance, not dealing with all this.

[-] Naomikho@monyet.cc 13 points 1 year ago

I had to clean up a child porn image once because some bastard registered on an instance I'm moderating and posted an anime child porn image to another instance using the account he created here. And I found a total of 5 copies of that image on the server...

[-] Pregnenolone@lemmy.world 9 points 1 year ago

Egregious CSAM aside, none of these websites, including sites like Reddit, can be 100% confident that the people depicted in posted images are over 18. For example, how could Lemmy or Reddit be confident that (insert gonewild poster here) is over the age of 18? As far as I'm aware, the only checks they do are to confirm that the poster is who they say they are, not that they are of an appropriate age to post.

[-] spiritedpause@sh.itjust.works 8 points 1 year ago

I get that the Lemmy devs are swamped with a lot of GitHub issues, but how is this not one of, if not THE, top priority for them right now? It's mind-blowing that instance admins can't disable the automatic caching of images from other remote instances.

If any shit-show instance that ends up hosting CSAM can cause an admin's instance to inadvertently cache/host that same content, why the fuck would anyone be motivated to host an instance and deal with the liability?

[-] Blaze@discuss.tchncs.de 8 points 1 year ago

This will probably shift priorities

[-] Sethayy@sh.itjust.works 5 points 1 year ago

Following the Unix philosophy, why aren't any of us doing anything?

[-] koper@feddit.nl 1 points 1 year ago

Because people are blowing this way out of proportion. Users uploading illegal content has always been part of hosting a platform, and lawmakers recognized this decades ago. Platform hosts legally cannot be held liable for their users' content unless they have actual knowledge of specific instances of illegal content. This is the case both in the US (Section 230 of the Communications Decency Act) and in the EU (Chapter II of the Digital Services Act, previously the eCommerce Directive).

[-] Rentlar@beehaw.org 6 points 1 year ago

Site is back up after being rehosted off-site!

[-] Blaze@discuss.tchncs.de 3 points 1 year ago

Indeed, that's great!

[-] sab@lemmy.world 2 points 1 year ago

You forgot to put "temporarily" in your headline.

[-] Blaze@discuss.tchncs.de 1 point 1 year ago

Happy to see they are back

this post was submitted on 27 Aug 2023
128 points (97.8% liked)

Fediverse

17698 readers

A community dedicated to fediverse news and discussion.

Fediverse is a portmanteau of "federation" and "universe".


founded 5 years ago