Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do: now that we have changed our registration policy, the attackers simply post from other instances instead.

We keep working on a solution; we have a few things in the works, but they won’t help us right now.

Thank you for your understanding and apologies to our users, moderators and admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators, and if it wasn’t his community it would have been another one. It is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed the bit about the moderator tools. That came out a bit harsher than we meant it. It’s been a long day, and having to deal with this kind of stuff made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn’t the first time we have felt helpless. Anyway, I hope we can announce something more positive soon.

469 points

Fuck these trolls

299 points

“Troll” is too mild a word for these people.

290 points

How about “pedophile”? I mean, they had to have the images to post them.

64 points

“Terrorist”. Having the images doesn’t mean they liked them, they used them to terrorize a whole community though.

53 points

Yeah, this isn’t just joking or shitposting. This is the kind of shit that gets people locked up in federal pound-you-in-the-ass prison for decades. The feds don’t care if you sought out the CSAM, because it still exists on your device regardless of intent.

The laws about possessing CSAM are written in a way that any plausible deniability is removed, specifically to prevent pedophiles from being able to go “oh lol a buddy sent that to me as a joke” and getting acquitted. The courts don’t care why you have CSAM on your server. All they care about is the fact that you do. And since you own the server, you own the CSAM and they’ll prosecute you for it.

15 points

Sounds like a digital form of SWATing.

9 points

And it’s not just the instance admins who would be at risk. Any time you view an image, your device makes a local copy of it, meaning every person who viewed the image, even accidentally, is at risk as well.

12 points

Yeah honestly report all of those accounts to law enforcement. It’s unlikely they’d be able to do much, I assume, but these people are literally distributing CSAM.

133 points

That’s not a troll; CSAM goes well beyond trolling. “Pedophile” would be a more accurate term for them.

26 points

Yeah. A troll might post something like a ton of oversized images of pig buttholes. Who the fuck even has access to CSAM to post? That’s something you only have on hand if you’re a predator already. Nor is it something you can shrug off like “lol I was only trolling”. It’s a crime that will send you to jail for years. It’s a major crime that gets entire police units dedicated to it. It’s a huuuuge deal and I cannot even fathom what kind of person would risk years in prison to sabotage an internet forum.

3 points

My thoughts exactly. If they were just spamming goatse or something, that would be one thing…

But this raises several questions, and they can only have grimdark answers.

4 points

Don’t forget they are doing this to harm others; they deserve the name “e-terrorist” or similar. They are still absolutely pedophiles. They’re bombing out a space, not trying to set up shop.

4 points

I would definitely agree that this would very likely count as cyber terrorism, and if it doesn’t, it should.

24 points

Simply having it to post makes you culpable. It’s way beyond trolling.

12 points

I’d say the proper word is ‘criminal.’

10 points

A person who is attracted to children is an evil and disgusting person. Being a pedophile isn’t just “liking something”; they are a monster.

51 points

Criminals.

36 points

Trolls? In most regions of the planet, I am fairly certain their actions would be considered criminal.

14 points

The Internet is essentially a small microbiome of beautiful flora and fauna that grew on top of a lake of sewage.

7 points

Yeah, back in the Limewire/Napster/etc days, it wasn’t unheard of for people to troll by relabeling CSAM as a popular movie or TV show. Oh, you wanted to download the latest Friends episode? Congrats, now you have CSAM because a troll uploaded it with the title “Friends S10E7.mov”

332 points

I would like to extend my sincerest apologies to all of the users here who liked Lemmy Shitposting. I feel like I let the situation grow too far out of control before getting help. Don’t worry, I am not quitting; I fully intend to stay around. The other two mods deserted the community, but I won’t. DM me if you wish to apply for mod.

Sincerest thanks to the admin team for dealing with this situation. I wish I had linked up with you all earlier.

209 points

@Striker@lemmy.world this is not your fault. You stepped up when we asked you to and actively reached out for help getting the community moderated. But even with extra moderators this cannot be stopped. Lemmy needs better moderation tools.

52 points

Hopefully the devs will take the lesson from this incident and put some better tools together.

50 points

There’s a Matrix room for building mod tools here; maybe we should bring up this issue there, just in case they aren’t already aware.

-28 points

Or we’ll finally accept that the core Lemmy devs aren’t capable of producing a functioning piece of software and fork it.

73 points

Please, please, please do not blame yourself for this. This is not your fault. You did what you were supposed to do as a mod and stepped up and asked for help when you needed to, lemmy just needs better tools. Please take care of yourself.

31 points

This isn’t your fault. Thank you for all you have done in regards to this situation thus far.

25 points

It’s not your fault, thank you for your job!

24 points

It’s not your fault; these people attacked, and we don’t have the proper moderation tools to defend ourselves yet. Hopefully that will change in the future. As it stands, you did the best that you could.

20 points

Definitely not your fault mate, you did what anyone would do, it’s a new community and shit happens

15 points

I love your community and I know it is hard for you to handle this but it isn’t your fault! I hope no one here blames you because it’s 100% the fault of these sick freaks posting CSAM.

14 points

Thanks for your work. The community was appreciated.

12 points

You don’t have to apologize for having done your job. You did everything right and we appreciate it a lot. I’ve spent the whole day trying to remove this shit from my own instance and understanding how purges, removals and pictrs work. I feel you, my man. The only ones at fault here are the sickos who shared that stuff, you keep holding on.

10 points

Thank you for your help. It is appreciated.

10 points

You didn’t do anything wrong, this isn’t your fault and we’re grateful for the effort. These monsters will be slain, and we will get our community back.

7 points

Really feel for you having to deal with this.

6 points

You do a great job. I’ve reported quite a few shit heads there and it gets handled well and quickly. You have no way of knowing if some roach is gonna die after getting squashed or if they are going to keep coming back

4 points

You’ve already had to take all that on, don’t add self-blame on top of it. This wasn’t your fault and no reasonable person would blame you. I really feel for what you and the admins have had to endure.

Don’t hesitate to reach out for support or to speak to a mental health professional if you’ve picked up trauma from the shit you’ve had to see. There’s no shame in getting help.

2 points

As so many others have said, there’s no need for an apology. Thank you for all of the work that you have been doing!

The fact that you are staying on as mod speaks to your character and commitment to the community.

251 points

Contact the FBI

193 points

This isn’t as crazy as it may sound either. I saw a similar situation, contacted them with the information I had, and the field agent was super nice/helpful and followed up multiple times with calls/updates.

95 points

This doesn’t sound crazy in the least. It sounds like exactly what should be done.

45 points

Yeah, what do people think the FBI is for? This isn’t crazy. They can get access to ISP logs, VPN provider logs, etc.

115 points

This is good advice; I suspect they’re outside of the FBI’s jurisdiction, but they could also be random idiots, in which case they’re random idiots who are about to become registered sex offenders.

18 points

I’m not saying anybody takes CSAM less seriously. But I wish the American government went after minor CSAM cases as hard as it goes after copyright/IP violations. It’s not like Mike Pompeo flew out to other countries to strong-arm them into new laws to prevent CSAM the way they have done with pirates who threatened Hollywood profits.

1 point

Yeah, there was even that case where a citizen and resident of Mexico was arrested and detained in the US for breaking US law, even though it technically didn’t apply to them since they were under Mexican sovereignty… Borders mean little to the US.

45 points

They might be, but I’d imagine most countries have laws on the books about this sort of stuff too.

19 points

And it’s something that the nations usually have no issues cooperating with.

The FBI has assisted in a lot of global raids related to CSAM.

28 points

The FBI has offices in a lot of other countries and works with local law enforcement.

https://www.fbi.gov/contact-us/international-offices

Can’t really hide from them unless you live in North Korea or Russia

4 points

Wait, is this like China having police offices in other countries?

I knew the US collects taxes on their citizens no matter where they live, but isn’t this kind of excessive? Wasn’t INTERPOL supposed to take care of international crime?

23 points

I have to wonder if Interpol could help with issues like this. I know there are agencies that work together globally to help protect missing and exploited children.

30 points

‘Criminal activity should be reported to your local or national police. INTERPOL does not carry out investigations or arrest people; this is the responsibility of national police.’

From their website.

20 points

The FBI reports it to Interpol, I believe; Interpol is more like an international warrant system built from treaties.

7 points

FBI would be great in this case tbh. They have the resources.

6 points

Perhaps most importantly, it establishes that the mods/admins/etc of the community are not complicit in dissemination of the material. If anyone (isp, cloud provider, law enforcement, etc) tries to shut them down for it, they can point to their active and prudent engagement of proper authorities.

5 points

More importantly, and germane to our conversation, the FBI has the contacts and motivation to work with their international partners wherever the data leads.

169 points

It is seriously sad and awful that people would go this far to derail a community, and it makes me concerned for other communities as well. Since they have succeeded in getting Shitpost closed, does this mean they will just move on to the next community? That being said, here is some very useful information on the subject and what can be done to help curb CSAM.

The National Center for Missing & Exploited Children (NCMEC) CyberTipline: You can report CSAM to the CyberTipline online or by calling 1-800-843-5678. Your report will be forwarded to a law enforcement agency for investigation.

The National Sexual Assault Hotline: If you or someone you know has been sexually assaulted, you can call the National Sexual Assault Hotline at 800-656-HOPE (4673) or chat online. The hotline is available 24/7 and provides free, confidential support.

The National Child Abuse Hotline: If you suspect child abuse, you can call the National Child Abuse Hotline at 800-4-A-CHILD (422-4453). The hotline is available 24/7 and provides free, confidential support.

Thorn: Thorn is a non-profit organization that works to fight child sexual abuse. They provide resources on how to prevent CSAM and how to report it.

Stop It Now!: Stop It Now! is an organization that works to prevent child sexual abuse. They provide resources on how to talk to children about sexual abuse and how to report it.

Childhelp USA: Childhelp USA is a non-profit organization that provides crisis intervention and prevention services to children and families. They have a 24/7 hotline at 1-800-422-4453.

Here are some tips to prevent CSAM:

Talk to your children about online safety and the dangers of CSAM.

Teach your children about the importance of keeping their personal information private.

Monitor your children’s online activity.

Be aware of the signs of CSAM, such as children being secretive or withdrawn, or having changes in their behavior.

Report any suspected CSAM to the authorities immediately.

28 points

So far I have not seen such disgusting material, but I’m saving this comment in case I ever need the information.

Are there any other numbers or sites people can contact in countries other than the USA?

7 points

It’s probably going to be country dependent

9 points

Of course, yes. But I’ve discovered that cell phones are even programmed to translate emergency numbers.

In the USA, our main emergency number is 911, but I found out (quite by accident) that dialing 08 brings you to emergency services.

https://en.wikipedia.org/wiki/List_of_emergency_telephone_numbers

149 points

Not that I’m familiar with Rust at all, but… perhaps we need to talk about this.

The only thing that could have prevented this is better moderation tools. And while a lot of the instance admins have been asking for this, it doesn’t seem to be on the developers’ roadmap for the time being. There are just two full-time developers on this project, and they seem to have other priorities. No offense to them, but it doesn’t inspire much faith for the future of Lemmy.

Let’s be productive. What exactly are the moderation features needed, and what would be easiest to implement into the Lemmy source code? Are you talking about a mass ban of users from specific instances? A ban on new accounts from certain instances? What moderation tool, exactly, is needed here?

116 points

Speculating:

Restricting posting from accounts that don’t meet some adjustable criteria, like account age, comment count, prior moderation actions, or average comment length (an upvote quota maybe not, because not all instances use votes); a rough sketch of such a gate follows below.

Automatic hash comparison of uploaded images against a database of registered illegal content.
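For illustration, a minimal sketch of what such a criteria gate could look like; the thresholds and field names are hypothetical, not anything Lemmy actually exposes:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Illustrative thresholds; real values would be instance-configurable.
MIN_ACCOUNT_AGE = timedelta(days=3)
MIN_COMMENT_COUNT = 10
MAX_PRIOR_REMOVALS = 0

@dataclass
class Account:
    created_at: datetime
    comment_count: int
    prior_removals: int  # posts/comments previously removed by moderators

def may_post_media(account: Account, now: datetime | None = None) -> bool:
    """Return True if the account clears the (hypothetical) posting criteria."""
    now = now or datetime.now(timezone.utc)
    return (
        now - account.created_at >= MIN_ACCOUNT_AGE
        and account.comment_count >= MIN_COMMENT_COUNT
        and account.prior_removals <= MAX_PRIOR_REMOVALS
    )

# A day-old account with almost no history would be blocked from uploading media.
fresh = Account(created_at=datetime.now(timezone.utc) - timedelta(days=1),
                comment_count=2, prior_removals=0)
print(may_post_media(fresh))  # False
```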

65 points

On various old-school forums, there’s a simple (and automated) system of trust that progresses from new users (who might be spam), where every new user might need a manual “approve post” before it shows up. (This existed on Reddit in some communities too.)

And then full powers are eventually granted to the user (or, in the case of Stack Overflow, automated access to the moderator queue).

11 points

What are the chances of a hash collision in this instance? I know accidental hash collisions are usually super rare, but with enough people it’d probably still happen every now and then, especially if the system is designed to detect images similar to the original illegal image (to catch any minor edits).

Is there a way to use multiple hashes from different sources to help reduce collisions? For an example, checking both the MD5 and SHA256 hashes instead of just one or the other, and then it only gets flagged if both match within a certain degree.
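As a rough sketch of the “require more than one hash to match” idea, with hypothetical blocklists; note that plain cryptographic hashes like these only catch byte-identical copies:

```python
import hashlib

# Hypothetical blocklists; in practice these hash sets would come from an
# organisation like NCMEC, not be kept in a local file like this.
KNOWN_MD5: set[str] = set()
KNOWN_SHA256: set[str] = set()

def flag_exact_match(data: bytes) -> bool:
    """Flag an upload only if BOTH digests appear in the blocklists.

    Changing a single pixel or re-encoding the image produces entirely
    different digests, so this only catches byte-identical files.
    """
    md5 = hashlib.md5(data).hexdigest()
    sha256 = hashlib.sha256(data).hexdigest()
    return md5 in KNOWN_MD5 and sha256 in KNOWN_SHA256
```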

26 points

Traditional hashes like MD5 and SHA256 are not locality-sensitive; they can’t be used to detect a match to a certain degree. Otherwise, yes, you are correct. Perceptual hashes can create false positives. Very unlikely, but yes, it is possible. This is not a problem with a perfect solution; extraordinary edge cases must be resolved on a case-by-case basis.

And yes, the simplest solutions must always be implemented first: tracking post reputation, a captcha before posting, waiting for an account to mature before it can post, etc. The problem is that right now the only defense we have access to is mods. Mods are people, usually with eyeballs. Eyeballs which will be poisoned by CSAM so the rest of us can post memes and funnies without issues. This is not fair to them. We must do all we can, and if all we can do includes perceptual hashing, we have a moral obligation to do so.
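To make “percentage of match” concrete, here is a toy difference-hash (dHash) sketch in Python using Pillow; real systems use far more robust fingerprints like PhotoDNA, and the 90% threshold below is made up purely for illustration:

```python
from PIL import Image  # pip install Pillow

def dhash(path: str, size: int = 8) -> int:
    """Toy perceptual hash: compare brightness of horizontally adjacent pixels."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    bits = 0
    for y in range(size):
        for x in range(size):
            bits = (bits << 1) | int(img.getpixel((x, y)) > img.getpixel((x + 1, y)))
    return bits

def similarity(hash_a: int, hash_b: int, size: int = 8) -> float:
    """Fraction of matching bits: 1.0 means identical fingerprints."""
    distance = bin(hash_a ^ hash_b).count("1")
    return 1.0 - distance / (size * size)

# Hypothetical policy: treat >= 90% bit agreement with a known hash as a match
# that gets the upload blocked and flagged for review.
# if similarity(dhash("upload.png"), known_hash) >= 0.9: ...
```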

5 points

I’m surprised this isn’t linked; there are services that do this for you.

And they are free.

https://blog.cloudflare.com/the-csam-scanning-tool/

3 points

I believe there are several readily available databases of hashes of CSAM material for exactly this kind of scanning. It looks like there are some open-source ones.

Some top results: https://github.com/topics/csam

This looks to be the top project: https://prostasia.org/project/csam-scanning-plugins/

2 points

Could they not just change one pixel to get another hash?

40 points

I guess it’d be a matter of incorporating something that hashes whatever is being uploaded, then checks that hash against a database of known CSAM. If it matches, stop the upload, ban the user, and report it to the closest officer of the law. Reddit uses PhotoDNA and CSAI-Match. This is not a simple task.
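As a very rough sketch of how that flow could hang together, assuming a hypothetical upload hook; the matching, banning, and reporting functions are stand-in stubs, since the real integrations (PhotoDNA, CSAI-Match, a CyberTipline report) don’t exist in Lemmy today:

```python
import hashlib

# Would be fed by an external service (PhotoDNA, CSAI-Match, ...); empty here.
KNOWN_HASHES: set[str] = set()

class UploadRejected(Exception):
    pass

def matches_known_database(image_bytes: bytes) -> bool:
    # Placeholder: exact SHA-256 lookup; a real system would use perceptual matching.
    return hashlib.sha256(image_bytes).hexdigest() in KNOWN_HASHES

def ban_user(user_id: int) -> None:
    print(f"user {user_id} banned")  # stand-in for the real admin action

def report_to_authorities(user_id: int) -> None:
    print(f"report filed for user {user_id}")  # stand-in for e.g. a CyberTipline report

def handle_upload(user_id: int, image_bytes: bytes) -> bytes:
    """Hypothetical upload hook: check first, only hand off for storage if clean."""
    if matches_known_database(image_bytes):
        ban_user(user_id)
        report_to_authorities(user_id)
        raise UploadRejected("matched known material; blocked and reported")
    return image_bytes  # would then go to pictrs for storage
```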

27 points

None of that really works anymore in the age of AI inpainting. Hashes and perceptual fingerprints worked well before, but the people doing this are specifically interested in causing destruction and chaos with this content; they don’t need it to be authentic to do that.

It’s a problem that requires AI on the defensive side, but even that is just going to be an eternal arms race. This problem cannot be solved with technology, only mitigated.

The ability to exchange hashes of content hit by moderation actions may offer a way out, but it will change the decentralized nature of everything - basically bringing us back to the early days of Usenet, the Usenet Death Penalty, etc.

48 points

Not true.

A simple CAPTCHA got rid of a huge set of idiotic script kiddies. CSAM, being what it is, could (and should) result in an immediate IP ban. So if you’re “dumb” enough to try to upload an image with a well-known CSAM hash, then you absolutely deserve the harshest immediate ban, automatically.


You’re pretty much like the story of the economist who refuses to believe that $20 exists on a sidewalk. “Oh, but if that $20 really existed on the sidewalk there, then it would have been arbitraged away already”. Well guess what? Human nature ain’t economic theory. Human nature ain’t cybersecurity.

Idiots will do dumb, easy attacks because they’re dumb and easy. We need to defend against the dumb-and-easy attacks, before spending more time working on the harder, rarer attacks.

11 points

Couldn’t one small change in the picture change the whole hash?

20 points

Good question. Yes. Artefacts from compression can also fuck it up. However, perceptual hash comparison returns a percentage of match; if the match is good enough, it is CSAM. Davai, ban. There is a bigger issue, however, for the developers of Lemmy, I assume. It is a philosophical pizdec: if we elect to use PhotoDNA and CSAI Match, Lemmy is now at the whims of Microsoft and Google respectively.

17 points

If they hash the file binary data, like CRC32 or SHA, yes. But there are other hash types out there, which are more like “fingerprints” of an image. Think of how Shazam or Sound Hound can recognize a song playing, despite the extra wind, static, etc that’s present. There are similar algorithms for images/videos.

No idea how difficult those are to implement, though.

4 points

One bit, in fact. Luckily there are other ways of comparing images without actually showing them to human eyes that allow you to calculate a percentage of similarity.

14 points

Reddit had automod which was highly configurable.

-7 points

Reddit automod is also a source for all the porn communities. Have you ever checked automod comment history?

Yeah, I have. Like 2/3 of automod comments are in porn communities.

https://www.reddit.com/r/shitprotips/comments/pkflpd/a_dump_of_random_subreddits_from_automoderators/

8 points

What? Reddit automod is not a source for porn. What you’re seeing is just the large quantity of content it reacts to in those communities.

It literally reads the config in your wiki and performs actions based on that. The porn communities using it are using it to moderate their subs. You can look at the post history: https://www.reddit.com/user/AutoModerator It is commenting on posts IN those communities as a reaction to triggers, but it isn’t posting porn (unless they put that in their config).

Not worth it if you don’t moderate on Reddit, but read the how-to docs for Reddit automod; it is an excellent tool for spam management, and the source from before Reddit acquired it (and made it shit) is open. https://www.reddit.com/wiki/automoderator/full-documentation

13 points

The best feature the current Lemmy devs could work on is making the process to onboard new devs smoother. We shouldn’t expect anything more than that for the near future.

I haven’t actually tried cloning and compiling, so if anyone has comments here they’re more than welcome.

11 points

I think having a means of viewing uploaded images as an admin would be helpful, as well as disabling external image caching. Like an “uploaded” gallery for admins to view that can potentially hook into PhotoDNA/CSAI-Match or whatever.

10 points

I think it would be an AI autoscan that flags some posts for mod approval before they show up to the public, and perhaps more fine-grained controls over how media is posted, for instance only allowing certain image hosting sites and no directly uploaded images.
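The “only allowing certain image hosting sites” part is simple to sketch; the allowlist below is purely an example of the mechanism, not a recommendation of those hosts:

```python
from urllib.parse import urlparse

# Example allowlist only; an instance would pick its own trusted hosts.
ALLOWED_IMAGE_HOSTS = {"imgur.com", "i.imgur.com", "files.catbox.moe"}

def is_allowed_image_url(url: str) -> bool:
    """Accept only http(s) links whose host is on the allowlist."""
    parsed = urlparse(url)
    return parsed.scheme in ("http", "https") and parsed.hostname in ALLOWED_IMAGE_HOSTS

print(is_allowed_image_url("https://i.imgur.com/example.png"))   # True
print(is_allowed_image_url("https://random-host.example/x.png"))  # False
```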

6 points

I was just discussing this under another post, and it turns out that the Germans have already developed a rule-based auto moderator that they use on their instance:

https://github.com/Dakkaron/SquareModBot

This could be adopted by lemmy.world by simply modifying the config file.

5 points

That statement is just outright wrong, though. They could easily use Cloudflare’s CSAM scanning and it never would have been a problem. A lot of people in these threads, including admins, have absolutely no idea what they’re talking about.

2 points

Cloudflare CSAM protection is not available outside of the US, unfortunately.

1 point

There are several other solutions including ones from Microsoft and Facebook.

3 points

Probably hashing and scanning any uploaded media against some of the known DBs of CSAM hashes.

IIRC that’s how Reddit/FB/Insta/etc. handle it.

1 point

The sad thing is that all we can usually do is make it harder for attackers, which is absolutely still worth doing, to be clear. But if an attacker wants to cause trouble badly enough, there are always ways around everything. E.g., image detection can be foiled with enough transformation, and account age limits can be gotten past by a patient attacker. Minimum karma can be botted (even easier than ever with AI), and Lemmy is especially easy to bot karma on because you can just spin up an instance with all the bots your heart desires. If posts have to be approved, attackers can even just hotlink to innocent images and then change the image after it’s approved.

Law enforcement can do a lot more than we can, by subpoenaing ISPs or VPNs. But law enforcement is slow and unreliable, so that’s also imperfect.

