Hello everyone,

We unfortunately have to close the !lemmyshitpost community for the time being. We have been fighting the CSAM (Child Sexual Abuse Material) posts all day, but there is nothing we can do, because since we changed our registration policy they will just post from another instance.

We keep working on a solution; we have a few things in the works, but that won't help us right now.

Thank you for your understanding, and apologies to our users, moderators, and the admins of other instances who had to deal with this.

Edit: @Striker@lemmy.world, the moderator of the affected community, made a post apologizing for what happened. But this could not have been stopped even with 10 moderators. And if it wasn't his community, it would have been another one. And it is clear this could happen on any instance.

But we will not give up. We are lucky to have a very dedicated team and we can hopefully make an announcement about what’s next very soon.

Edit 2: removed that bit about the moderator tools. That came out a bit harsher than we meant it. It's been a long day, and having to deal with this kind of stuff has made some of us a bit salty, to say the least. Remember, we also had to deal with people posting scat not too long ago, so this isn't the first time we've felt helpless. Anyway, I hope we can announce something more positive soon.

92 points

Imagine if you were the owner of a really large computer with CSAM in it. And there is in fact no good way to prevent creeps from putting more into it. And when police come to have a look at your CSAM, you are liable for legal bullshit. Now imagine you had dependents. You would also be well past the point of being respectful.

On that note, the captain db0 has raised an issue on the github repository of LemmyNet, requesting essentially the ability to add middleware that checks the nature of uploaded images (issue #3920 if anyone wants to check). Point being, the ball is squarely in their court now.

16 points

I think the FBI or an equivalent agency keeps a record of hashes of known CSAM, and middleware should be able to compare uploads against that. Hopefully, if a match is found, kill the post and forward all the info on to LE.
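
Something like this, roughly; all the names below are made up, and a real deployment would load a vetted hash list from NCMEC or similar instead of an empty set:

```python
import hashlib

# Placeholder: a real deployment would load a vetted hash list from NCMEC /
# law enforcement, not start with an empty set.
KNOWN_BAD_HASHES: set[str] = set()

def report_match(digest: str) -> None:
    # Placeholder for forwarding the match and uploader details to LE.
    print(f"known-hash match {digest}: blocking post and reporting")

def allow_upload(image_bytes: bytes) -> bool:
    """Return True if the upload may proceed, False if it matches a known hash."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    if digest in KNOWN_BAD_HASHES:
        report_match(digest)
        return False
    return True
```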

14 points

Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.

I think Apple was going to implement a similar system and deploy it to all iPhones/Macs in some iOS/macOS update. However, it was eventually 86'd due to privacy concerns from many people and the potential for abuse and/or false positives.

A system like this might work on a small scale as part of moderation tools, though. Not sure where you would get a constantly updated database of CSAM hashes.

11 points

Interesting. But aren’t hashes unique to a specific photo? Just a single change to the photo would inevitably change its hash.

Most people are lazy and stupid, so maybe hash checking is enough to catch a huge portion (probably more than 50%, maybe even 80% or 90%?) of the CSAM from posters who don't bother (or know how) to alter the image?

5 points

In my utopian world, the FBI has a team updating the DB.

The utopian algorithm would hash multiple subsets of the picture, so cropping or watermarking wouldn't break the test (assuming the 'crux' of the CSAM would most likely be unaltered?), and maybe handle simple image transformations (color, tint, gamma, etc.) with a formula.
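
Perceptual hashes (pHash and friends) already get partway there for minor edits like recompression, resizing, or small color tweaks, since near-duplicates land within a small Hamming distance of each other. A rough sketch, assuming the Python imagehash and Pillow packages, with made-up hash values and threshold:

```python
# Assumes the third-party imagehash + Pillow packages; the hash values and the
# distance threshold below are made up for illustration.
import imagehash
from PIL import Image

KNOWN_BAD_PHASHES = [imagehash.hex_to_hash("d1d1d1d1d1d1d1d1")]  # placeholder entry
MAX_HAMMING_DISTANCE = 8  # how many differing bits still count as "the same image"

def matches_known_image(path: str) -> bool:
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - known <= MAX_HAMMING_DISTANCE for known in KNOWN_BAD_PHASHES)
```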

1 point

IMO scanning images before they are posted to a forum is a completely different world from having your photo collection scanned, especially in context and scale.

8 points

You can already protect your instance using CloudFlare's CSAM protection, and sorry to say it, but I would not use db0's solution. It is more likely to get you in trouble than to help you out. I posted about it in their initial thread, but they are not warning people about the legal requirements that apply in many places, and their script can get you put in jail (yes, put in jail for deleting CSAM).

1 point

The developers of LemmyNet are being asked for the ability to define a subroutine through which uploaded images are preprocessed and then either denied or passed. There is no such feature right now. Even if they wanted to use CloudFlare CSAM protection, they couldn't. That's the entire problem. This preprocessing routine could use Microsoft PhotoDNA and Google CSAI, it could use a self-hosted alternative as db0 wants, or it could even be your own custom solution that doesn't destroy the CSAM but stores it on a computer you own and stops it from being posted.
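
To make that concrete, the hook being asked for is basically a pluggable "scan before store" step. A hand-wavy sketch in Python with made-up names (Lemmy itself is written in Rust and exposes no such hook today):

```python
from typing import Callable

# A scanner takes the raw image bytes and returns True if the image may be stored.
Scanner = Callable[[bytes], bool]

def build_upload_filter(scanners: list[Scanner]) -> Scanner:
    """Accept an upload only if every configured scanner accepts it."""
    def run(image_bytes: bytes) -> bool:
        return all(scanner(image_bytes) for scanner in scanners)
    return run

# An operator would plug in whatever backends they trust here (PhotoDNA,
# a self-hosted classifier, a plain hash check, ...). Toy scanner for demo:
upload_filter = build_upload_filter([lambda data: len(data) > 0])
print(upload_filter(b"\x89PNG..."))  # True: the toy scanner accepts non-empty data
```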

1 point

Even if they wanted to use CloudFlare CSAM protection, they couldn’t.

? CF's solution happens at the DNS level. It has absolutely nothing to do with Lemmy, and there's nothing the devs could do to change that.
