Not a good look for Mastodon - what can be done to automate the removal of CSAM?

1 point

Okay, thanks for the clarification

Everyone except you still very much includes drawn and AI-generated pornographic depictions of children in the basket of problematic content that should get filtered out of federated instances, so thank you very much, but I’m not sure your point changed anything.

-1 points

If you don’t think images of actual child abuse, against actual children, are infinitely worse than some ink on paper, I don’t care about your opinion on anything.

You can be against both. Don’t ever pretend they’re the same.

-1 points

Hey, just because someone has a stupid take on one subject doesn’t mean they have a stupid take on all subjects. Attack the argument, not the person.

0 points

He invented the stupid take he’s fighting against. Nobody equated “ink on paper” with “actual rape against children”.

The bar to cross to be filtered out of the federation isn’t rape. Lolicon is already above the threshold; it’s embarrassing that he doesn’t realize that.

1 point

Some confused arguments reveal confused people. Some terrible arguments reveal terrible people. For example: I don’t give two fucks what Nazis think. Life’s too short to wonder which subjects they’re not facile bastards about.

If someone’s motivation for making certain JPEGs hyper-illegal is “they’re icky,” they’ve lost the benefit of the doubt. Because of their decisions, I no longer grant them that courtesy.

Demanding pointless censorship earns my dislike.

Equating art with violence earns my distrust.

1 point

Step up the reading comprehension, please.

1 point

I understand what you’re saying and I’m calling you a liar.

6 points

They are not saying it shouldn’t be defederated; they are saying that reporting this to the authorities is pointless, and that treating it as CSAM is harmful.

-1 points

Everybody understands there’s no real kid involved. I still don’t see an issue with reporting it to the authorities, and all the definitions of CSAM make a point of including simulated and illustrated forms of child porn.

https://en.m.wikipedia.org/wiki/Child_pornography

2 points

Definitions of CSAM definitely do not include illustrated and simulated forms. Those have no victim and therefore cannot be abuse. I agree that it should not be allowed on public platforms, which is why all instances hosting it should be defederated. Even so, it is not illegal, so reporting it to the authorities is a waste of time for you and for the authorities who are trying to remove and prevent actual CSAM.

2 points

What’s the point of reporting it to the authorities? It’s not illegal, nor should it be, because there’s no victim; all reporting it does is take up valuable time that could be spent tracking down actual abuse.

