cross-posted from: https://jamie.moe/post/113630
There have been users spamming CSAM content in !lemmyshitpost@lemmy.world causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.
I deleted every image from the past 24 hours personally, using the following command:
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec shred -u {} \;
Note: Your local jurisdiction may impose a duty to report or other obligations. Check with these, but always prioritize ensuring that the content does not continue to be served.
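Before running a destructive command like the one above, it is worth previewing what will match. The sketch below demonstrates the same `find`/`shred` pattern against a throwaway scratch directory (the paths here are illustrative, not your real pictrs volume); note that `shred` alone only overwrites a file in place, while `-u` also unlinks it afterwards:

```shell
# Demo in a scratch directory; substitute your real pictrs path in practice.
tmp=$(mktemp -d)
touch "$tmp/recent.webp"

# Dry run: list files changed in the last 24 hours before destroying anything.
find "$tmp" -type f -ctime -1 -print

# shred overwrites the file contents; -u removes the file afterwards.
find "$tmp" -type f -ctime -1 -exec shred -u {} \;

ls "$tmp"        # the shredded file no longer exists
rm -rf "$tmp"
```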
Update
Apparently the Lemmy Shitpost community is shut down as of now.
Locking the thread. Information relevant to self-hosters has already been shared. Too many reports of off-topic comments to leave this open.
i’d love for a good tech journalist to look into how and why this is happening and do a full write-up on it. come on ars, verge, vice
How desperate to destroy Lemmy must you be to spam CSAM on communities and potentially get innocent people into trouble?
Self-hoster here, I'm nuking all of pictrs. People are sick. Luckily I did not see anything, but I was subscribed to the community.
- Did a shred on my entire pictrs volume (all images ever):
sudo find /srv/lemmy/example.com/volumes/pictrs -type f -exec shred -u {} \;
- Removed the pictrs config in lemmy.hjson
- Removed the pictrs container from docker-compose
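For anyone following the same steps, the end state of the compose file looks roughly like the fragment below. This is a sketch, not the canonical Lemmy deployment; service names, image tags, and paths are illustrative and will differ per setup:

```yaml
# docker-compose.yml after removing pict-rs (illustrative names and paths)
services:
  lemmy:
    image: dessalines/lemmy:0.18.4        # hypothetical tag
    volumes:
      - ./lemmy.hjson:/config/config.hjson
  # The pictrs service block is deleted entirely, i.e. there is no longer a:
  # pictrs:
  #   image: asonix/pictrs:0.4
  #   volumes:
  #     - ./volumes/pictrs:/mnt
```

The matching change in lemmy.hjson is deleting its pictrs block (the one pointing at the pict-rs URL), so Lemmy stops trying to proxy uploads to a service that no longer exists.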
Anything else I should do to protect my instance, besides shutting down completely?
Couldn’t this be stopped with automatic filtering of bad content? There are open source tools and libraries that do this already
That’s what we’re pushing the lemmy devs to do. Honestly, even if they want to use proprietary tools, in this instance I’m okay with it; I’ll happily go register an Azure account and plop an API key into the UI so it can start scanning. Lemmy should have the guardrails to prevent this from ever hitting our servers.
In the meantime, services like cloudflare will handle the recognizing and blocking access to images like that, but the problem still comes down to the federation of images. Most small hosters do not want the risk of hosting images from the whole of the internet, and it sounds like there is code in the works to disable that. Larger hosters who allow open registrations can do what they please and host what they please, but for us individual hosters we really need tools to block this.
Proprietary software isn’t necessary; there are plenty of open source projects that detect CSAM.
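To make the filtering idea above concrete, here is a toy hash-blocklist check showing the general shape of such a guardrail. Everything here (the blocklist file, the upload path) is made up for the demo, and real detection tooling uses perceptual hashes that survive re-encoding and cropping, whereas a plain SHA-256 match like this only catches byte-identical files:

```shell
# Toy sketch: reject an upload whose SHA-256 appears in a known-bad list.
# All paths are illustrative; a real system would query a hash-matching
# service with perceptual hashes, not compare exact checksums locally.
tmp=$(mktemp -d)
printf 'bad content' > "$tmp/upload.bin"

# Pretend the upload's hash is already on the blocklist.
sha256sum "$tmp/upload.bin" | awk '{print $1}' > "$tmp/blocklist.txt"

# On upload: hash the file and refuse it if the hash is listed.
hash=$(sha256sum "$tmp/upload.bin" | awk '{print $1}')
if grep -qx "$hash" "$tmp/blocklist.txt"; then
    echo "blocked"
fi
rm -rf "$tmp"
```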