Not a good look for Mastodon - what can be done to automate the removal of CSAM?
That’s an arbitrary decision to make, and it doesn’t really need to be debated
The study is pretty transparent about what “CSAM” is under their definition, and they even provide pictures; from a science communication point of view they’re in the clear
And their definition kind of sucks. They’re basically saying it’s anything that Google SafeSearch or PhotoDNA flags, or anything that matches certain hashtags.
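For what it’s worth, the detection side of that is mostly perceptual-hash matching against a list of known-bad hashes. PhotoDNA itself is proprietary, so here’s a minimal sketch of the same idea using the open `imagehash` library; the `known_hashes.txt` file and the distance threshold are hypothetical stand-ins, not anything from the study:

```python
from PIL import Image
import imagehash

# Hypothetical list of known-bad perceptual hashes, one hex string per line.
# Real deployments get these from hash-sharing programs, not a local file.
with open("known_hashes.txt") as f:
    known_hashes = {imagehash.hex_to_hash(line.strip()) for line in f if line.strip()}

def is_flagged(path: str, max_distance: int = 5) -> bool:
    """Flag an image whose perceptual hash is within max_distance bits
    (Hamming distance) of any known-bad hash."""
    h = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(h - known <= max_distance for known in known_hashes)
```

The point of the sketch is that the method only ever finds *previously catalogued* material, which is exactly why a definition built on PhotoDNA-style flags plus hashtag matches is as narrow as it is.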
That said, there are absolutely some terrible things on Mastodon, including grooming and trading. I’m interested to know what the numbers look like for lolicon and similar vs. actual CP, which would give me a much better understanding of how bad the problem is. As in, are the things included in the report outliers, or typical of their sample set?
I guess I’m looking for a bit more granularity in the report.