-7 points

I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

So tired of seeing this point made. Allowing animated or AI-generated CSAM to exist openly and legally will not reduce violence against children. It will increase it. It will normalize it.

You seem to think people who are willing and capable of committing sexual violence against children are going to do it less when there's a robust market of legally accessible CSAM.

It won't. It will instead create predator pipelines. It will take people with mild sexual disorders and histories of their own sexual assaults as children and feed them CSAM. It will create more predators.

It will allow communities of pedophiles to exist openly, findable in Google searches and advertised on regular porn sites.

Also, the people who make AI-generated CSAM are not going to watermark it as AI generated.

They are going to make it as real as possible. It will be indistinguishable to the naked eye and thus allow actual CSAM to masquerade as AI generated.

I could go on. But I'm not an expert on any of this.

8 points

You completely ignored the "state-controlled generation and access" part of the argument. Experience with addictive drugs has shown us that tightly controlled access, oversight, and possibly treatment can be a much better solution than just making it illegal. The truth is that we just don't know if it would work the same with CSAM, but we do know that making it a taboo topic doesn't work.

-2 points

There’s no parallel here. Providing safe access to drugs reduces harm to the user and the harm done by the black-market drug trade. Normalising AI-generated CSAM might reduce the harm done to children during production of the material but it creates many more abusers.

The parallel would only hold if "state-controlled generation and access" to drugs meant an open shop handing out drugs to new users and creating new addicts. Which is pretty much how the opiate epidemic was created by drug companies, pharmacists, and doctors using their legitimate status for entirely illegitimate purposes.

0 points

there is no parallel here

Says who?

1 point

Normalising AI-generated CSAM might reduce the harm done to children during production of the material but it creates many more abusers.

The problem with your argument is that you assume a bunch of things that we just don't know, because we haven't tried it yet. The closest thing we do know is drugs, and for them controlled access has proven to work really well. So I think it's at least worth thinking about and doing limited real-world trials.

And I don’t think any sane person is suggesting to just legalize and normalize it. It would have to be a way for people to self-report and seek help, with conditions such as mandatory check-in/counseling and not being allowed to work with children.

1 point

You make a huge number of claims, all stated as fact. How do you know that any of it is true? I'm not trying to defend rapists and pedophiles, I'm trying to think rationally and pragmatically about how to solve or at least improve this problem. Your reaction to it seems to be more emotional than rational and factual.

1 point

I’m trying to think rationally and pragmatically

Ah yes, the rational thought process that leads you to think a government is capable of safely facilitating the production of CSAM?

They are unable to stop child poverty but totally capable of producing CSAM in a safe way…

Spare me your fact-finding mission.

I'm not an expert or a social worker, but I can tell you that drug addiction and pedophilia are not the same.

To consider these two the same, as the original commenter did, is disgusting, offensive, and ignorant.

There is no inherent victim in drug use. The same cannot be said of pedophilia and child sexual assault.

While there is always a spectrum of people participating in child victimization, the people who create CSAM and those who participate in its distribution are not addicts. They are predators.

I’m not trying to defend rapists and pedophiles

Well you are…
