cross-posted from: https://jamie.moe/post/113631
There have been users spamming CSAM content in !lemmyshitpost@lemmy.world causing it to federate to other instances. If your instance is subscribed to this community, you should take action to rectify it immediately. I recommend performing a hard delete via command line on the server.
I deleted every image from the past 24 hours personally, using the following command:
sudo find /srv/lemmy/example.com/volumes/pictrs/files -type f -ctime -1 -exec rm {} \;
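If you want to be more careful than a blind delete, a safer pattern is to dry-run the same `find` predicate with `-print` first, then delete only once the list looks right. This is a sketch demonstrated against a throwaway temp directory so it can be run safely; substitute your own pictrs files path (e.g. the `/srv/lemmy/<domain>/volumes/pictrs/files` path above) and add `sudo` for the real thing.

```shell
# Demo directory standing in for your pictrs files volume (assumption:
# replace with the real path and prefix the find commands with sudo).
DIR=$(mktemp -d)
touch "$DIR/a.webp" "$DIR/b.webp"

# 1. Dry run: list files whose status changed in the last 24 hours.
find "$DIR" -type f -ctime -1 -print

# 2. If the list looks right, delete them. find's built-in -delete is
#    equivalent to -exec rm {} \; but doesn't spawn a process per file.
find "$DIR" -type f -ctime -1 -delete

rm -r "$DIR"
```

Note that `-ctime -1` matches on inode change time, not upload time, so anything touched in the last day (including legitimate images) will match; that trade-off is usually acceptable in an emergency purge like this.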
Can I get an ELI5 rundown on CSAM? I’ve seen it mentioned several times in different communities.
To add to this: the reason more people now call it CSAM rather than “child porn” is that advocates for, and survivors of, sexual exploitation have been vocal about how “porn” implies a level of consent that cannot be present.
Sure, but I feel it also takes away the emotional impact of the term. “Child porn” gives me visceral repulsion; “CSAM” sounds like something I need a topical ointment to fix. Clinicizing a term tends to normalize it (shell shock -> PTSD, retard -> special needs), which can be good, but not in this case.