Several months ago Beehaw received a report about CSAM (i.e., Child Sexual Abuse Material). As an admin, I had to investigate in order to verify the report and take the next steps. This was the first time in my life that I had ever seen images such as these. Not to go into great detail, but the images were of a very young child performing sexual acts with an adult.

The explicit nature of these images, the gut-wrenching shock and horror, the disgust and helplessness were very overwhelming to me. Those images are burnt into my mind and I would love to get rid of them but I don’t know how or if it is possible. Maybe time will take them out of my mind.

In my strong opinion, Beehaw must seek a platform where NO ONE will ever have to see these types of images: a software platform that makes it nearly impossible for Beehaw to host CSAM in any way.

If the other admins want to give their opinions about this, then I am all ears.

I simply cannot move forward with the Beehaw project unless this is one of our top priorities when choosing where we are going to go.

52 points

I’m not sure that’s possible with images being allowed. If Google, Facebook, Instagram, and YouTube all struggle with it, I think it will be an issue anywhere images are allowed. Maybe there’s an opening for an AI to handle the task these days, but any dataset for something like that could obviously be incredibly problematic.

39 points

Yeah, the key problem here is that any open forum of any considerable popularity, since the dawn of the Internet, has had to deal with shit like CSAM. You don’t see it elsewhere because of moderators doing the very job OP does. It’s just that now, OP, you’re the one in that position. Some people can, and have decided to, deal with moderating the horrors. It may very well not be something you, OP, can do.

22 points

The thing is though, with traditional forums you get a LOT of controls for filtering out the kind of users who post such content. For instance, most forums won’t even let you post until you complete an interactive tutorial first (reading the rules and replying to a bot indicating you’ve understood them, etc.).

And then you can have various levels of restrictions, e.g., someone with fewer than 100 posts, or an account less than a month old, may not be able to post any links or images. Also, you can have a trust system on some forums, where a mod can mark your account as trusted or verified, granting you further rights. You can even make it so that manual moderator approval is required before image posting rights are granted. In this instance, a mod would review your posting history and ensure that your posts genuinely contributed to the community and that you’re unlikely to be a troll/karma-farmer account.

So, short of accounts getting compromised/hacked, it’s very difficult to have this sort of stuff happen on a traditional forum.

I used to be a mod on a couple of popular forums back in the day, and I even ran my own community for a few years (using Invision Power Board), and never once have I had to deal with such content.

The fact is, Lemmy is woefully inadequate in its current state to deal with such content, and there are definitely better options out there. My heart goes out to @Chris and the staff for having to deal with this stuff, and I really hope that this drives the Beehaw team to move away from Lemmy ASAP.

In the meantime, I reckon some drastic actions would need to be taken, such as disabling new user registrations and stopping all federation completely, until the new community is ready.

2 points

So this just got posted on lemmy.dbzer0. They’ve got an AI-based CSAM screen up and running, with promising initial results. The model was trained using CLIP, which, as far as I understand it, means they used written descriptions of what CSAM is or is not.
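For anyone curious what that means in practice, here is a minimal sketch of zero-shot CLIP classification, i.e. scoring an image against written descriptions rather than training on labelled images. This is only an illustration of the general technique, not db0’s actual scanner; the model name is the standard public CLIP checkpoint, and the label texts, file path, and threshold are placeholders I made up.

```python
# Minimal zero-shot CLIP sketch (illustration only; not the lemmy.dbzer0 tool).
# Requires: pip install torch transformers pillow
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model = CLIPModel.from_pretrained("openai/clip-vit-base-patch32")
processor = CLIPProcessor.from_pretrained("openai/clip-vit-base-patch32")

# Hypothetical label descriptions; a real screen would use carefully designed
# prompts and tuned thresholds, not these placeholders.
labels = [
    "a photo that is safe for a general audience",
    "a photo that should be escalated to a human moderator",
]

image = Image.open("upload.jpg")  # hypothetical path to the uploaded image
inputs = processor(text=labels, images=image, return_tensors="pt", padding=True)
outputs = model(**inputs)

# Similarity of the image to each description, normalised to probabilities.
probs = outputs.logits_per_image.softmax(dim=-1)
if probs[0, 1] > 0.9:  # arbitrary threshold for the sketch
    print("flag for review")
```

The appeal of this approach, as I understand it, is that the filter can be built from text prompts alone, so nobody has to assemble or handle a dataset of abusive images to train it.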

Could something like this work for Beehaw?

1 point

I’m sure the mods saw that, and it’s really more of a question for them tbh, but if it works for other Lemmy instances I’m not sure why it wouldn’t work here.

1 point

Wonder whether in theory one could use a dataset of… everything else, have the AI exclude what it does not recognise, then run the exclusions against a dataset to see whether or not they contain children. There could be an additional layer of running the exclusions against a dataset of regular sexual content.
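Spelled out, the decision logic of that cascade would look something like the sketch below. The three inputs stand in for the outputs of whatever hypothetical classifiers one actually trained; none of these names refer to an existing tool.

```python
def screen(recognised_as_ordinary: bool,
           appears_to_contain_children: bool,
           matches_regular_adult_content: bool) -> str:
    """Combine three hypothetical classifier outputs into one moderation decision."""
    # First layer: anything the "everything else" model recognises passes.
    if recognised_as_ordinary:
        return "allow"
    # Second layer: unrecognised images that appear to contain children are
    # escalated to a human, never auto-reported (see the false-report concern below).
    if appears_to_contain_children:
        return "escalate: human review required"
    # Optional third layer: unrecognised content that matches regular adult material.
    if matches_regular_adult_content:
        return "allow or hold, per community rules"
    return "hold: unrecognised content, human review"


# Example: unrecognised, no children detected, matches regular adult content.
print(screen(False, False, True))  # -> "allow or hold, per community rules"
```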

One issue is that the admin of any site would still want to report any CSAM to the authorities. That could be automated by an AI checker, but one would have to have a lot of faith that the AI was decently accurate and not generating many false reports. The workaround I described to avoid using datasets of abuse is unlikely to be particularly accurate - OK for the purposes of protecting admins, but it leaves them in an odd spot when it comes to banning a user, especially where a user’s livelihood could be impacted, or things like paid online courses. I guess specialist police departments probably would have to use highly relevant datasets, along with review by humans, but still - nobody wants to inadvertently clog up that system with false reports.

43 points

I’d be fine with not hosting images entirely. I don’t think people come to Beehaw primarily to look at pictures.

1 point

I’ve been thinking lately that I kind of miss things like IRC where you couldn’t really post pictures in chat. With things like Discord and Slack the off topic channels often devolve into people just sharing random memes they found funny at the time, and not really talking to each other. I’m sure there’s value in that too, but I think it can take up a lot of oxygen in the social space, so I’m not sure it’s always a win. Different formats encourage different ways of interacting with each other, I guess, and it’s interesting!

39 points

People keep talking about going to another platform. Personally, I think a better idea would be to develop Lemmy to deal with these issues. This must be a fediverse-wide problem, so some discussion with other admins and the developers is probably the way to go on many of these things. Moreover, you work with https://opencollective.com/; can they help? Beyond this, especially for CSAM, there must be large funding agencies where one could get a grant to put some real professional programming into this problem. Perhaps we could raise funds ourselves to help with this too.

So frankly, I would like to see Beehaw solve the issues with Lemmy rather than just move to some other platform that will have its own issues. The exception may be if the Beehaw people think that being a safe space creates too big a target, and that you have to leave the Threadiverse to be safe. That to me seems like letting the haters win; it is exactly what they want. My vote will always be to solve the Threadiverse’s issues rather than run away.

Just my feeling. There may be more short-term practical issues that take precedence, and frankly it is all up to you guys where you want to take this project.

11 points

The solution is to use an already existing software product that solves this, like CloudFlare’s CSAM Detection. I know people on the fediverse hate big companies, but they’ve solved this problem numerous times before. They’re the only ones allowed access to CSAM hashes; Lemmy devs and platforms will never get access to the hashes (for good reason).

3 points

They will still need to have a developer set this up, and presumably it should be added as an option to the main codebase. I thought I heard the Beehaw admins were not developers.

There are a number of other issues that are driving the admins to dump Lemmy. The same applies there.

1 point

Not sure what you mean. You do not need to be a developer to set up CloudFlare’s CSAM detection. You simply have to email the NCMEC, get an account, then check a box in CF, input some information about your NCMEC account, and then you’re good to go.

1 point

Wait… why is no access to CSAM hashes a good thing? Wouldn’t it make it easier to detect if the hashes were public?! I feel like I’m missing something here…

4 points

Giving access to CSAM hashes means anyone wanting to avoid detection simply has to check what they’re about to upload against the db. If it matches then they simply modify the image until it doesn’t. It’s literally guaranteed to make the problem worse, not better.
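To make that concrete, here is a toy illustration of matching an upload against a hash list. I’m using a plain SHA-256 digest purely for simplicity; real CSAM scanners use perceptual hashes that tolerate small edits, and the digest below is a made-up placeholder, but the argument for keeping the list secret is the same.

```python
# Toy hash-list check (illustration only; not how production CSAM scanners work).
import hashlib

# Hypothetical secret block list of hex digests.
blocked_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def is_blocked(image_bytes: bytes) -> bool:
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in blocked_hashes

# If blocked_hashes were public, an uploader could simply tweak a byte,
# re-hash, and retry until is_blocked() returns False, which is exactly
# the evasion loop described above.
```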

36 points

I just want to say, I am so so so sorry you had to see that.

I accidentally saw some CSAM in the 1990s and you are right, it is burnt into your mind. It’s the real limit case of “what has been seen cannot be unseen” - all I could do was learn to avoid accessing those memories.

If you can access counselling for this, that might be a good option. Vicarious trauma is a real phenomenon.

18 points

If you can access counselling for this, that might be a good option. Vicarious trauma is a real phenomenon.

Thank you for the advice. I’m not sure that I’ll need counseling but I’m open to it if need be. Time will tell.

7 points

Be sure to keep tabs on yourself; sometimes these things can really sneak up on you.

35 points

I’m sure those repugnant assholes do it “for the lulz” and if they want to mess with you they’ll do it anywhere.

There’s this study that says playing Tetris helps ease recently acquired trauma: https://www.ox.ac.uk/news/2017-03-28-tetris-used-prevent-post-traumatic-stress-symptoms

And the admin of his eponymous instance, db0, created an interesting script to get rid of CSAM without having to review it manually; take a look -> https://github.com/db0/lemmy-safety

11 points

Just tagging @admin in case they don’t see this ❤️

Edit: aaand I did it wrong 🙄 @admin@beehaw.org 👈 Better?

