Not only does it have problems with CSEM (the real-life stuff), but there are now bots spamming it, and Twitter has made reporting it a chore.

“Please provide more context”? WTF, it’s literally just CSEM plus a link I won’t click even if my life depends on it!

I’m quite sad, since a lot of the creators I follow are still only there, or on the boneless fediverse app BlueSky (which is worse in some ways), and I still need to keep my account around just to protect my user handle there and to look things up from time to time.

Once I’m home from work, I’m locking my account and putting up a farewell message for whoever might miss me.

I’m not saying the fediverse is perfect, far from it (especially certain segments of Lemmy), but it’s a way better experience than whatever Xitter (or Reddit, for that matter) tries to be. I even have more reach there, especially since the whole paid blue checkmark thing.

2 points

CSAM is supposed to be more explicit that the images are essentially crime scene photographs, and to emphasize that it is Abuse first and foremost and not merely pornography.

CP is a morally neutral term, or at least the component words themselves are. CSAM is not, and is explicitly negative.

2 points

Hmm… I mean I’m not challenging this explanation, but I’m just a little curious about this I suppose? So starting from when I was like 13-14, I regularly sent and received nudes of other people my age I met on gay forums n shit. Uk… Sexting n stuff. Now I know that this could’ve gone incredibly ugly had I been deanonymized n stuff. But I mean… I had fun at the time and am in contact (not that regular tho) with some of these guys (and I’m an adult now).

I had fun at the time and was not coerced into anything by anyone. I was just a horny teen with an outlet, and so were they. How’s this abuse? Like, who’s the abuser? I’m sure it wasn’t us, as no one coerced anyone into doing anything.

3 points

That’s, I guess, why CSEM is used: if the images are being shared around, exploitation has clearly occurred. I can see where you’re coming from, though.

What I will say is that there are some weird laws around it, and there have even been cases where kids have been convicted of producing child pornography… of themselves. It’s a bizarre situation. If anything, it seems like abuse of the court system at that point.

Luckily, a lot of places have been patching the holes in their laws.


Fediverse

!fediverse@lemmy.world
