Hey folks!

I made a short post last night explaining why image uploads had been disabled. It was the middle of the night for me, so I did not have time to go into a lot of detail, but I’m writing this more detailed post now to clear up where we are and where we plan to go.

What’s the problem?

As shared by the lemmy.world team, over the past few days, some people have been spamming one of their communities with CSAM images. Lemmy has been attacked in various ways before, but this is clearly on a whole new level of depravity, as it’s first and foremost an attack on actual victims of child abuse, in addition to being an attack on the users and admins on Lemmy.

What’s the solution?

I am putting together a plan, both for the short term and for the longer term, to combat and prevent such content from ever reaching lemm.ee servers.

For the immediate future, I am taking the following steps:

1) Image uploads are completely disabled for all users

This is a drastic measure, and I am aware that it’s the opposite of what many of our users have been hoping, but at the moment, we simply don’t have the necessary tools to safely handle uploaded images.

2) All images which have federated in from other instances will be deleted from our servers, without any exception

At this point, we have millions of such images, and I am planning to just indiscriminately purge all of them. Posts from other instances will not be broken after the deletion; the deleted images will simply be loaded directly from those instances.

3) I will apply a small patch to the Lemmy backend running on lemm.ee to prevent images from other instances from being downloaded to our servers

Lemmy has always loaded some images directly from other servers, while saving other images locally to serve directly. I am eliminating the second option for the time being, forcing all images uploaded on external instances to always be loaded from those servers. This will somewhat increase the number of servers which users will fetch images from when opening lemm.ee, which certainly has downsides, but I believe this is preferable to opening up our servers to potentially illegal content.
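As a rough sketch of the idea (this is not the actual patch; the function name and hostname constant are made up for illustration), the decision boils down to a hostname check at ingestion time:

```python
from urllib.parse import urlparse

LOCAL_INSTANCE = "lemm.ee"  # assumed hostname of this instance

def should_cache_locally(image_url: str) -> bool:
    """With the patch applied, only images uploaded to our own instance
    are ever written to local storage; anything hosted elsewhere is
    always loaded by clients directly from its home instance."""
    host = urlparse(image_url).hostname or ""
    return host == LOCAL_INSTANCE
```

The trade-off mentioned above falls out naturally: clients now connect to every originating instance, but no remote bytes ever touch local disk.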

For the longer term, I have some further ideas:

4) Invite-based registrations

I believe that one of the best ways to effectively combat spam and malicious users is to implement an invite system on Lemmy. I have wanted to work on such a system ever since I first set up this instance, but real life and other things have been getting in the way, so I haven’t had a chance. However, with the current situation, I believe this feature is more important than ever, and I’m very hopeful I will be able to make time to work on it very soon.

My idea would be to grant our users a few invites, which would replenish every month if used. An invite will be required to sign up on lemm.ee after that point. The system will keep track of the invite hierarchy, and in extreme cases (such as spambot sign-ups), inviters may be held responsible for rule-breaking users they have invited.
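A minimal sketch of how such a system might look (all names and the monthly allowance here are hypothetical, not an actual design):

```python
from dataclasses import dataclass

MONTHLY_INVITES = 3  # assumed allowance per user

@dataclass
class User:
    name: str
    invited_by: "User | None" = None
    invites_left: int = MONTHLY_INVITES

    def replenish(self) -> None:
        """Run once a month: top the allowance back up."""
        self.invites_left = MONTHLY_INVITES

def sign_up(inviter: User, new_name: str) -> User:
    """Registration consumes one of the inviter's invites."""
    if inviter.invites_left <= 0:
        raise PermissionError("no invites left this month")
    inviter.invites_left -= 1
    return User(new_name, invited_by=inviter)

def invite_chain(user: User) -> list[str]:
    """Walk up the hierarchy, e.g. to trace who invited a spambot."""
    chain = []
    while user.invited_by is not None:
        user = user.invited_by
        chain.append(user.name)
    return chain
```

The `invite_chain` walk is what makes the accountability part possible: a wave of spambot sign-ups can be traced back to a common inviter.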

While this will certainly create a barrier to entry for signing up on lemm.ee, we are already one of the biggest instances, and I think at this point, such a barrier will do more good than harm.

5) Account requirements for specific activities

This is something that many admins and mods have been discussing for a while now, and I believe it would be an important feature for lemm.ee as well. Essentially, I would like to limit certain activities to users who meet specific requirements (maybe account age, number of comments, etc.). These activities might include things like image uploads, community creation, perhaps even private messages.

This could, in theory, deter the creation of new accounts made solely to break rules (or laws).
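To make the idea concrete, here is a toy sketch (the activities and thresholds are invented for illustration; real values would be tuned by admins):

```python
# Hypothetical per-activity requirements: account age and comment count.
REQUIREMENTS = {
    "upload_image":         {"min_age_days": 30, "min_comments": 20},
    "create_community":     {"min_age_days": 14, "min_comments": 10},
    "send_private_message": {"min_age_days": 1,  "min_comments": 0},
}

def may_perform(activity: str, account_age_days: int, comment_count: int) -> bool:
    """Allow the activity only once the account meets its thresholds."""
    req = REQUIREMENTS[activity]
    return (account_age_days >= req["min_age_days"]
            and comment_count >= req["min_comments"])
```

A freshly registered throwaway account fails the `upload_image` check, which is exactly the kind of account this measure targets.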

6) Automated ML based NSFW scanning for all uploaded images

I think it makes sense to apply automatic scanning to all images before we save them on our servers, and to reject any upload that is flagged as NSFW. While machine learning is not 100% accurate and will produce false positives, I believe this is a trade-off that we simply need to accept at this point. Not only will this help against any potential CSAM, it will also help us better enforce our “no pornography” rule.

This would potentially also allow us to resume caching images from other instances, which will improve both performance and privacy on lemm.ee.
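A sketch of what the upload gate could look like (the classifier itself is a stand-in: `nsfw_score` is a hypothetical callable, not a real library API, and the threshold is an assumed value):

```python
NSFW_THRESHOLD = 0.5  # assumed cutoff; lowering it trades more false positives for safety

def handle_upload(image_bytes: bytes, nsfw_score) -> bool:
    """Scan before storing: return True only if the upload is accepted.

    `nsfw_score` stands in for whatever ML classifier is chosen; it is
    expected to return a probability in [0, 1] that the image is NSFW.
    """
    if nsfw_score(image_bytes) >= NSFW_THRESHOLD:
        return False  # flagged: rejected, never written to disk
    return True       # passed: safe to store (and potentially cache)
```

The key property is that the scan happens before any bytes reach storage, which is what would also make cached remote images safe to re-enable.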


With all of the above in place, I believe we will be able to re-enable image uploads with a much higher degree of safety. Of course, most of these ideas come with some significant downsides, but please keep in mind that users posting CSAM present an existential threat to Lemmy (in addition to just being absolutely morally disgusting and actively harmful to the victims of the abuse). If the choice is between having a Lemmy instance with some restrictions, or not having a Lemmy instance at all, then I think the restrictions are the better option.

I also would appreciate your patience in this matter, as all of the long-term plans require additional development, and while this is currently a high-priority issue for all Lemmy admins, we are all still volunteers and do not have the freedom to dedicate huge amounts of time to working on new features.


As always, your feedback and thoughts are appreciated, so please feel free to leave a comment if you disagree with any of the plans or if you have any suggestions on how to improve them.

125 points

Personally, I say just leave hosting of images to dedicated sites for that purpose. Your efforts are better spent on how to render them. That being said, I used to be in charge of managing abuse on a site that averaged 20 million posts a month (seriously).

The way I essentially defeated these kinds of attacks was with an image scanning service. It scans for anything NSFW and blocks it. Sometimes things would make it through, but once an admin flagged it, we could use that to block the user’s IP and account. It’s not cheap, but the volume is also not huge yet for lemm.ee, so it might not be too bad.

60 points

This is my opinion also. Reddit turned to shit around the time they started self-hosting. Imgur only exists because people needed a place to host reddit images.

15 points

Is there a fediverse instance of Imgur?

36 points

No, but there’s nothing stopping you from using direct links from imgur, in traditional fashion.

It’s a little bit convoluted, though. You have to post the image, then hover over and select “Get share links”, and then pick the option for BB code (forums). This has the [img] tags at the start and finish, but importantly it has the direct link to the image file. If you use this on lemmy then it will load in the instance, rather than directing to imgur itself.

15 points

I’ve seen people link to uploads on Pixelfed, though this is probably not the intended use case.

1 point

Not yet but I wish there was. I use imgur quite a lot and I like the idea of a fediverse version. Especially with the direction they’ve gone lately.

18 points

Yeah genuinely we could all be hosting images for free or cheap on several image sites. Even NSFW images and videos! And it would save our instance admins a lot of headaches and probably some cost too.

8 points

Personally I say just leave hosting of images to dedicated sites for that purpose.

They aren’t profitable, so they’ll eventually go down. If no one is looking at their site, why keep it going just to serve other sites?

1 point

The same can be said about Lemmy.

2 points

But if the instance goes down, no one will care that the images in the posts are also gone.

82 points

You forgot getting the authorities involved when somebody does upload CSAM.

31 points

It’s a known tactic by trolls to upload cheese pizza and then notify the media/the authorities themselves, because context doesn’t matter when it comes to CSAM.

14 points

The lemmy.world team is getting some authorities involved already for this particular case. I am definitely in favor of notifying law enforcement or relevant organizations, and if anybody tries to use lemm.ee to spread such things, I will definitely be involving my local authorities as well.

11 points

getting the authorities involved

How do you imagine that playing out? This isn’t some paedophile ring trading openly, this is people using CSAM as an attack vector. Getting over-enthusiastic police involved is exactly their goal, and will likely do very little to help the victims in the CSAM itself.

Yes, authorities should be notified and the material provided to the relevant agencies for examination. However that isn’t truly the focus of what’s happening here. There is no immediate threat to children with this attack.

41 points

How do you imagine that playing out?

FBI: Whoa, that’s illegal

Admin: Ya

FBI: We’re going to look for this guy

Admin: alright

END ACT 1

12 points

This isn’t something the FBI have much involvement with. The FBI deal with matters across states.

This isn’t America, where you have a bunch of separate states unified under one American government. People haven’t been posting porn to lemm.ee. People have been posting porn to other instances, which has seeped through to lemm.ee.

Getting the Estonian law enforcement involved is like trying to get the Californian government involved in dealing with a problem from Texas. Estonian law enforcement have no jurisdiction over lemmy.world or any other instance, and giving them an opportunity is only going to lead to locking down lawful association and communication in favour of some vague “think of the children” rhetoric. And, like I say, it won’t do anything to curtail the production of CSAM as the purpose of this attack has little to do with the promotion of CSAM.

Frankly, it could easily be more like:

lemm.ee: We’ve got a problem with illegal content

Estonian law enforcement: Woah that’s illegal.

Estonian law enforcement: You’ve admitted to hosting illegal content. We’re going to confiscate all your stuff.

lemm.ee is shut down pending investigation.

Meanwhile, if lemm.ee continues its current course of action, yet someone notifies law enforcement:

Estonian law enforcement: Woah, we’ve got a report of something dodgy, that’s illegal.

lemm.ee: People tried to post illegal content elsewhere that could have come to our site, we blocked and deleted it to the best of our ability.

Estonian law enforcement: Fair enough, we’ll see what we can figure out.

It really matters how and when the problem is presented to law enforcement. If you report yourself, they’re much more likely to take action against you than if someone else reports you. It doesn’t do yourself any favours to present your transgressions to them, not unless you’re absolutely certain you’re squeaky clean.

At this stage and in these circumstances, corrective action is more important than reporting.

61 points

For step 6 - are you aware of the tooling the admin at dbzero has built to automate the scanning of images in Lemmy instances? It looks pretty promising.

20 points

Yep, I’ve already tested it and it’s one of the options I am considering implementing for lemm.ee as well.

5 points

It’s worth considering some commercially developed options as well: https://prostasia.org/blog/csam-filtering-options-compared/

The Cloudflare tool in particular is freely and widely available: https://blog.cloudflare.com/the-csam-scanning-tool/

I am no expert, but I’m quite skeptical of db0’s tool:

  • It repurposes a library designed for preventing the creation of synthetic CSAM with Stable Diffusion. This library is typically used in conjunction with prompt scanning and other inputs into the generation process. When run outside its normal context on non-AI images, it lacks all this input context, which I speculate reduces its effectiveness relative to the conditions under which it’s tested and developed.
  • AI techniques live and die by the quality of the dataset used to train them. There is not, and cannot be, an open-source test dataset of CSAM upon which to train such a tool. One can attempt workarounds, such as classifying features separately: detecting features related to youth (trained on dataset A of non-sexualized images that include children) alongside features related to sexuality (trained separately on dataset B of images containing only adult performers)… but the efficacy of open-source solutions is going to be hamstrung by the inability to train, test, and assess the effectiveness of the open tools. Developers of major commercial CSAM scanners are better able to partner with NCMEC and other groups fighting CSAM to assess the effectiveness of their tools.

My belief is that open tools are likely to be permanently hamstrung compared to those developed by big companies, and that the most effective solutions for Lemmy must integrate big-company tools (or government/nonprofit tools, if they exist).

PS: Really impressed by your response plan. I hope the lemmy.world admins are watching this post; I know you all communicate and collaborate. Disabling image uploads is, I think, a very effective temporary response until detection and response tooling can be improved.

1 point

You make some good points. This gave rise to a thought: it seems like law enforcement would have such a dataset, and it seems they should of course allow tools to be trained on it. Seems, but who knows? Might be worth finding out.

0 points

The neat thing is that it’s all much easier as lemm.ee doesn’t allow porn: The filter can just nuke nudity with extreme prejudice, adult or not.

15 points

It seems promising but also incomplete for US hosts, as our laws do not allow deletion of CSAM; rather, it must be preserved and reported to a central authority, and not deleted until they give the okay. Rofl.

I also wonder if this solution will use PHash or other hashing to filter out known and unaltered CSAM images (comparing hashes rather than the actual images).
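The hash-matching idea can be sketched like this (a toy illustration: SHA-256 stands in for a perceptual hash, and all names are made up; real perceptual hashes like PHash also match resized or re-encoded copies, which a cryptographic hash cannot):

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # would be seeded from an authority-provided list

def _digest(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash: SHA-256 only matches byte-identical
    # files, whereas PHash-style hashes survive resizing and re-encoding.
    return hashlib.sha256(image_bytes).hexdigest()

def add_known(image_bytes: bytes) -> None:
    KNOWN_HASHES.add(_digest(image_bytes))

def is_known_image(image_bytes: bytes) -> bool:
    """Check an upload against the blocklist without storing the image."""
    return _digest(image_bytes) in KNOWN_HASHES
```

The appeal of this approach is exactly what the comment notes: the host never needs to inspect or retain the image itself, only compare hashes.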

1 point

I didn’t quite know that, and yet it doesn’t surprise me either.

1 point

I blocked botart from their instance as some pretty disturbing stuff was added in the last few days.

53 points

IMO Lemmy shouldn’t have media uploading of any kind. Aside from the CSAM risk, it’s unsustainable, and I think one of the reasons Reddit went to shit is that it got into the whole image/video/gif hosting business.

Dozens of media hosts exist out there, and the mobile/web clients should focus instead on showing remote content better.

36 points

The flip side of the argument is that if you also host the media, you are not at risk of having broken links. I’ve seen a number of long-running forums whose post bodies contained external images that are now broken.

Of course an argument can be made that the only reason that those forums have lived for so long was due to not having costs associated with hosting media.

10 points

That’s no worse than a reddit link getting borked because it’s been cross-posted and someone managed to kill the original link with a DMCA notice.

7 points

I would say that is a different issue. A DMCA notice could go to whatever external host as well, so that doesn’t change.

My argument was about putting faith in external providers to stay alive to continue hosting media. You can also get in a situation where an external provider decides to do a mass delete like what Imgur did this past summer.

6 points

A post getting removed because someone threatened legal action is not the same as using an image host that goes under because no one visits their site to see the ads that pay for hosting, or because they arbitrarily purged their content or changed their link format like Imgur has. Unless Lemmy hosts its own images, it will be at risk of being purged, as has happened many times over.

2 points

I get that we don’t trust these third-party image hosting sites, but if it’s that or having local images that can potentially bring down instances, I’d say that’s a no-brainer of a compromise.

Upload sites like Imgur automatically handle image detection and take the load off smaller servers. It seems like a perfect solution for now.

2 points

There is a privacy and tracking concern with loading images from 3rd-party hosts vs lemm.ee hosting or re-hosting them.

50 points

Please please do not implement an invite system.

The success of a forum like this depends on people being able to join and express their thoughts freely. Reddit and Digg would never have gotten where they are if they had a closed system.

I almost didn’t join lemmy because the first two instances I heard about (lemmy.ml and beehaw) had closed registration. I think I applied and then forgot about it for 2 weeks. Thankfully I saw a post about lemmy on reddit yet again and finally found an open instance.

Don’t let the actions of a few scumbags ruin a good thing for everyone. You’ll be giving them exactly what they want.

42 points

I agree that users should be able to join Lemmy freely, but I think it makes a lot of sense to try and spread users out more between instances - this spreads out the responsibilities between more admins, spreads out the load between more servers and also reduces the chance of a single point of failure for the whole system.

It’s clear that there are seriously vile people out there who want to cause huge amounts of damage to Lemmy, and if we have unlimited growth in a few selected instances, then these people only have to target those specific instances for maximum damage.

In a perfect world, none of this would be necessary, but then again, in a perfect world, we wouldn’t need a decentralized platform in the first place.

12 points

Thanks for responding!

I agree that it’s best for the lemmyverse.net if there are many big instances too.

Unfortunately, the concept of the fediverse isn’t as easy to understand. The average newcomer (who mostly just wants to consume content and occasionally ask a question or two) starts off by interacting within their instance, and it takes some time to figure out cross-instance communication (there are still posts about this on the nostupidquestions-type communities). For such users, landing on a small instance means they’ll poke around the Local active posts, think that “this forum is dead”, and never return.

As on reddit, having a large userbase on the lemmyverse is important to keep the conversation interesting (see https://i.imgur.com/4tXHAO0.png). Reddit has provided lemmy with a huge shot at success by injecting a large number of users. But if I’m being honest, the conversation on the lemmyverse isn’t as diverse and engaging as it is on reddit yet. This isn’t self-sustaining yet. I can point to 2 pieces of evidence to support this:

  1. Using Voat as a (imperfect) proxy - I don’t know if there are official stats of Voat, but the best dataset I’ve seen for Voat (https://ojs.aaai.org/index.php/ICWSM/article/download/19382/19154/23395) has 16.2M comments in 2.3M submissions from 113k users. Voat was shut down for lack of funding, but even in its heyday it wasn’t exactly thriving - many people on Voat were united in their toxicity and it never really got going. Compare these numbers to the lemmyverse which has about 100k active users over the last 6 months. If the fediverse is to grow beyond “that niche forum for nerds”, this userbase isn’t enough.

  2. It’s already clear that the number of active users is decreasing - since mid-July, the number of monthly active users has dropped from 70k to 50k. This is expected (bunch of redditors who joined in June, poked around and said hi and left), but it means if the lemmyverse wants to have any chance of succeeding long term, you can’t alienate new users now.

The approach I’ve been advocating since the beginning of lemmy is:

  • if you see a user who’s interested in lemmy but isn’t really tech savvy, just point them to one of the biggest instances. Don’t explain what federation is, leave it as a feature to be discovered once they’re engaged.
  • if you see a user who’s interested in the concept of a fediverse and wants to know how it works, explain federation and send them to a smaller instance.

The way federation works now, it’s still disadvantageous to be on a smaller instance (discoverability of new communities is harder, syncing posts/comments isn’t always fast, and it’s hard to know which community is more active). Many of these can be fixed with changes to ActivityPub and the Lemmy protocol, but in the meantime, sending casual users to small instances means they’ll likely never return.

So to sum up, I think there should be an avenue for casual users to join the biggest instances, even as we encourage people to move to smaller ones (either targeting those who are more tech savvy, or those who have already been on Lemmy long enough to know how it works - I myself was on Lemmy.world and switched to this “smaller” instance).

Anyway, you’re the admins here and I have no say over what you eventually do. I’m just hoping you’ll consider the practical realities of user behavior - everyone wants what’s best for the fediverse in the long term.

1 point
Deleted by creator
1 point

discoverability of new communities is harder

https://github.com/Fmstrat/lcs

syncing posts/comments isn’t always fast

My experience is the opposite, but that may be instance dependent

it’s hard to know which community is more active

Active users stats are the same on every instance for communities

25 points

If I may: lemm.ee is now the second biggest instance. Redirecting people to register on local instances (feddit.country) or generalist ones (reddthat.com, lemmy.today, discuss.online, etc.) could be reasonable, to make those grow as well.

I agree that there should be a clear list of instances open for registration, but that probably needs to wait for the dust to settle a bit.

2 points

I posted a long reply above (direct link https://lemm.ee/comment/2929349).

11 points

While I understand your concerns, this instance has gotten a fair bit larger and will start to suffer the same issues that lemmy.world does if registrations aren’t curbed. It can’t grow infinitely. That just isn’t feasible for one server. Having closed registrations on lemm.ee doesn’t stop anyone from signing up on different instances. A solution might be to temporarily limit registration here in some way, and for the devs and instance admins to find a better way of helping new users choose an instance. The initial sign up process was confusing, and could be streamlined to make it easier for people to choose an instance. In the long term, enhancing the way federation works so users who do sign up on smaller/newer instances don’t need to be lemmy savvy to find content would also help alleviate that type of issue.

2 points

I posted a long reply above (direct link https://lemm.ee/comment/2929349).

1 point

I’m on the side of “no”: rampant growth without proper tools and whatnot is a recipe for something we don’t want to eat.

1 point

I get your point, but some folks aren’t that put off by it, assuming they can ask for an invite and it doesn’t take ten years. I had to work at it a bit over on Reddit, but I took my time and just wrote about the difficulties, and in a couple weeks, hey, I got an invite. I’d prefer a nicer community once I’m in over a quick and easy entry that sucks thereafter (or is just chaotic and unhappy periodically). It’s like your house: do you just let everyone in for fear of being lonely? Probably not. Probably, if you’re not an outlier, you’ve taken steps to make it a bit hard for anyone not invited to enter. And it makes your home a better place to be.


Meta (lemm.ee)

!meta@lemm.ee


This is a community for discussion about this particular Lemmy instance.

News and updates about lemm.ee will be posted here, so if that’s something that interests you, make sure to subscribe!


Rules:

  • Support requests belong in !support
  • Only posts about topics directly related to lemm.ee are allowed
  • If you don’t have anything constructive to add, then do not post/comment here. Low effort memes, trolling, etc is not allowed.
  • If you are from another instance, you may participate in discussions, but remain respectful. Realize that your comments will inevitably be associated with your instance by many lemm.ee users.

If you’re a Discord user, you can also join our Discord server: https://discord.gg/XM9nZwUn9K

Discord is only a back-up channel, !meta@lemm.ee will always be the main place for lemm.ee communications.


If you need help with anything, please post in !support instead.
