Not a good look for Mastodon - what can be done to automate the removal of CSAM?
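For what it’s worth, the standard automated approach is perceptual-hash matching against curated lists of known material (PhotoDNA-style; the real hash lists come from vetted programs like NCMEC’s and are not publicly distributable). A minimal sketch of just the matching step in Python, assuming a hypothetical local hash file `known_bad_phashes.txt` and a hypothetical threshold, using the open-source `imagehash` library:

```python
# Minimal sketch of hash-based media screening for an instance's uploads.
# Assumes known-bad perceptual hashes are available locally; in practice
# these come from vetted hash-sharing programs, not a local text file.
from PIL import Image
import imagehash

# Hypothetical file: hex digests of known-bad pHashes, one per line.
with open("known_bad_phashes.txt") as f:
    BAD_HASHES = [imagehash.hex_to_hash(line.strip()) for line in f]

HAMMING_THRESHOLD = 8  # hypothetical tolerance for near-duplicates

def should_quarantine(path: str) -> bool:
    """Return True if an uploaded image matches a known-bad hash."""
    candidate = imagehash.phash(Image.open(path))
    # Subtracting two ImageHash objects yields their Hamming distance.
    return any(candidate - bad <= HAMMING_THRESHOLD for bad in BAD_HASHES)
```

Perceptual hashes are used instead of cryptographic ones because they survive recompression, resizing, and minor edits; the threshold trades false positives against near-duplicate recall. Note this only catches *known* material, which is part of why AI-generated content complicates automated moderation.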
Okay, thanks for the clarification
Everyone except you still very much includes drawn and AI-generated pornographic depictions of children in the basket of problematic content that should be filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.
If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion on anything.
You can be against both. Don’t ever pretend they’re the same.
Hey, just because someone has a stupid take on one subject doesn’t mean they have a stupid take on all subjects. Attack the argument, not the person.
He invented the stupid take he’s fighting against. Nobody equated “ink on paper” with the actual rape of children.
The bar to cross to be filtered out of the federation isn’t rape. Lolicon is already above the threshold; it’s embarrassing that he doesn’t realize that.
Some confused arguments reveal confused people. Some terrible arguments reveal terrible people. For example: I don’t give two fucks what Nazis think. Life’s too short to wonder which subjects they’re not facile bastards about.
If someone’s motivation for making certain JPEGs hyper-illegal is “they’re icky”, they’ve lost the benefit of the doubt. Because of their decisions, I no longer grant them that courtesy.
Demanding pointless censorship earns my dislike.
Equating art with violence earns my distrust.
They are not saying it shouldn’t be defederated; they are saying reporting this to authorities is pointless and that treating it as CSAM is harmful.
Everybody understands there’s no real kid involved. I still don’t see an issue with reporting it to authorities, and all the definitions of CSAM make a point of including simulated and illustrated forms of child porn.
Definitions of CSAM definitely do not include illustrated and simulated forms. They have no victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Even so, it is not illegal, so reporting it to authorities is a waste of time, both for you and for the authorities who are trying to remove and prevent actual CSAM.
What’s the point of reporting it to authorities? It’s not illegal, nor should it be, because there’s no victim, so all reporting it does is take up valuable time that could be spent tracking down actual abuse.