Not a good look for Mastodon - what can be done to automate the removal of CSAM?
https://stacks.stanford.edu/file/druid:vb515nd6874/20230724-fediverse-csam-report.pdf
I’d suggest that anyone who cares about the issue take the time to read the actual report, not just drama-oriented news articles about it.
So if I’m understanding right, based on their recommendations this will all be addressed by more moderation and QoL tools as we move further down the development roadmap?
What development roadmap? You’re not a product manager and this isn’t a Silicon Valley startup.
If I can try to summarize the main findings:
- Computer-generated (e.g., Stable Diffusion) child porn is not criminalized in Japan, and so many Japanese Mastodon servers don’t remove it
- Porn involving real children is removed, but not immediately, as it depends on instance admins to catch it, and they have other things to do. Also, when an account is banned, the Mastodon server software is not sending out a “delete” for all of their posted material (which would signal other instances to delete it)
Problem #2 can hopefully be improved with better tooling; a rough sketch of what that could look like is below. I don’t know what you do about problem #1, though.
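For what it’s worth, the missing “delete” federation is conceptually a small fix. Here’s a minimal Python sketch of the idea, with heavy assumptions: the helper names are made up (this is not Mastodon’s actual code), and real delivery would also need HTTP Signatures, retries, and shared-inbox deduplication, all omitted here. The Delete-wrapping-a-Tombstone shape itself is standard ActivityPub.

```python
import uuid
import requests

def build_delete_activity(actor_uri: str, status_uri: str) -> dict:
    """Build an ActivityPub Delete that tombstones one removed post."""
    return {
        "@context": "https://www.w3.org/ns/activitystreams",
        "id": f"{status_uri}#delete-{uuid.uuid4()}",
        "type": "Delete",
        "actor": actor_uri,
        "to": ["https://www.w3.org/ns/activitystreams#Public"],
        "object": {"id": status_uri, "type": "Tombstone"},
    }

def federate_deletes(actor_uri: str, status_uris: list[str], inboxes: list[str]) -> None:
    """On suspension, POST one Delete per removed status to every known
    remote inbox, so federated copies get tombstoned instead of lingering."""
    for status_uri in status_uris:
        activity = build_delete_activity(actor_uri, status_uri)
        for inbox in inboxes:
            requests.post(
                inbox,
                json=activity,
                headers={"Content-Type": "application/activity+json"},
                timeout=10,
            )
```

The report’s other tooling recommendation, as I read it, is hash-matching media at upload time (PhotoDNA or similar, against industry hash lists), so known material gets caught automatically instead of depending on an admin happening to see a report.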
One option would be to decide that the underlying point of removing real CSAM is to avoid victimizing real children; and that computer-generated images are no more relevant to this goal than Harry/Draco slash fiction is.
And are you able to offer any evidence to reassure us that simulated child pornography doesn’t increase the risk to real children, as pedophiles become desensitised to the content and escalate (you know, like what already routinely happens with regular pornography)?
Or are we just supposed to sacrifice children to your gut feeling?
> I don’t know what you do about problem #1, though.
Well, the simple answer is that it doesn’t have to be illegal to remove it.
The legal question is a lot harder, considering AI image generation has reached the point where its output is almost indistinguishable from reality.
In which case, admins should err on the side of caution and remove something that might be illegal.
I personally would prefer to have nothing remotely close to CSAM, but as long as children aren’t being harmed in any conceivable way, I don’t think it should be illegal to post art containing children. But communities should absolutely manage things however they think is best for their community.
In other words, I don’t think #1 is a problem at all; imo, things should only be illegal if there’s a clear victim.
> 4.1 Illustrated and Computer-Generated CSAM
Stopped reading.
Child abuse laws “exclude anime” for the same reason animal cruelty laws “exclude lettuce.” Drawings are not children.
Drawings are not real.
Half the goddamn point of saying CSAM instead of CP is to make clear that Bart Simpson doesn’t count. Bart Simpson is not real. It is fundamentally impossible to violate Bart Simpson’s rights, because he doesn’t fucking exist. There is nothing to protect him from. He cannot be harmed. He is imaginary.
This cannot be a controversial statement. Anyone who can’t distinguish fiction from real life has brain problems.
You can’t rape someone in MS Paint. Songs about murder don’t leave a body. If you write about robbing Fort Knox, the gold is still there. We’re not about to arrest Mads Mikkelsen for eating people. It did not happen. It was not real.
If you still want to get mad at people for jerking off to the wrong fantasies, that is an entirely different problem from photographs of child rape.
What does that even mean?
There’s nothing to “cover.” They’re talking about illustrations of bad things, alongside actual photographic evidence of actual bad things actually happening. Nothing can excuse that.
No shit they are also discussing actual CSAM alongside… drawings. That is the problem. That’s what they did wrong.
Oh, wait, the Japan bit in the other comment, now I get it. This conversation is about AI loli porn.
Pfft, of course, that’s why no one is saying the words they mean: the stance suddenly becomes much harder to take, since hatred of loli porn is not universal.
I mean, I think it’s disgusting, but I don’t think it should be illegal. I feel the same way about cigarettes, 2 girls 1 cup, and profane language. It’s absolutely not for me, but that shouldn’t make it illegal.
As long as there’s no victim, knock yourself out with whatever disgusting, weird stuff you’re into.
Okay, thanks for the clarification
Everyone except you still very much includes drawn & AI pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.
They are not saying it shouldn’t be defederated; they are saying reporting it to the authorities is pointless, and that treating it as CSAM is itself harmful.
If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion of anything.
You can be against both. Don’t ever pretend they’re the same.
Oh no, what you describe is definitely illegal here in Canada. CSAM includes depictions here. Child sex dolls are illegal. And it should be that way because that stuff is disgusting.
> CSAM includes depictions here.
Literally impossible.
Child rape cannot include drawings. You can’t sexually assault a fictional character. Not “you mustn’t.” You can’t.
If you think the problem with child rape amounts to “ew, gross,” fuck you. Your moral scale is broken if there’s not a vast gulf between those two bad things.