86 points

This is tough. If it was just a sicko who generated the images for himself locally… that is the definition of a victimless crime, no? And it might actually dissuade him from seeking out real CSAM…

BUT, iirc he was actually distributing the material, and even contacted minors, so… yeah he definitely needed to be arrested.

But, I’m still torn on the first scenario…

12 points

What is the AI trained on?

53 points

Image-generating AI is capable of generating images that are not like anything that was in its training set.

0 points

In that case, probably the strongest argument is that if it were legal, many people would get off charges of real CSAM because the prosecutor couldn’t prove that it wasn’t AI-generated.

-26 points

AI can compose novel-looking things from components it has been trained on, but it can’t imagine new concepts. If CSAM is being generated, it’s because CSAM was included in the training set, which is strongly suspected, since we know a common training corpus contained it: https://cyber.fsi.stanford.edu/news/investigation-finds-ai-image-generation-models-trained-child-abuse

1 point

Very, very good point. Depending on the answer, I retract the “victimless” narrative.

1 point

It’s interesting you bring this up. Not long ago I was having basically this exact discussion with my brother. Barring your second point, I honestly don’t know how I feel.

On the one hand - if it’s strictly images for himself and it DOES dissuade seeking out real CSAM (I’m not convinced of this) then I don’t really see the issue.

On the other hand - I feel like it could be a gateway to something more (your second point). Kinda like a drug, right? You need a heavier and heavier hit to keep the same high. Seems like it wouldn’t be a stretch to go from AI generated imagery to actual CSAM.

But yeah, I don’t know. We live in an odd time for sure.

15 points

On the other hand - I feel like it could be a gateway to something more

You mean like marijuana and violent video games?

6 points

Except in the case of pornography, it’s an open question whether viewing it causes a net increase or decrease in sexual desire.
With legal pornography, it’s typically correlated with higher sexual desire. That tracks intuitively, since the existence of pornography doesn’t seem to coincide with any drop in people seeking romantic partners.

There’s little reason to believe it works the other way around for people attracted to children.
What’s unknown is whether that desire is enough to outweigh the legal consequences they’re aware of, or any social or ethical boundaries present.
Studies have been done, but finding people outside of the legal system who abuse children is exceptionally difficult, even before the ethical obligation to report them to the police would trash the study.
So the studies end up focusing either on people actively seeking treatment for unwanted impulses (less likely to show a correlation) or on people engaged with the legal system in some capacity (more likely to show a correlation).

-5 points

Holy strawman, Batman! Just because someone uses the term “gateway” doesn’t mean they think that games and weed are going to turn all people and frogs gay and violent.

13 points

First off, this is obviously a sticky topic. Every conversation is controversial and speculative.

Second, I don’t really see a lot of legitimacy to the “gateway” concept. The vast majority of people use some variety of drug (caffeine, alcohol, nicotine), and that doesn’t reliably predict “harder” drug use. Lots of people use marijuana, and that doesn’t reliably predict hard drug use either. Obviously, the people who use heroin and meth have probably used cocaine and ketamine, and weed before that, and alcohol/caffeine/nicotine before that, but that’s not really a “gateway” pipeline so much as passing through finer and finer filters. As far as I know, the concept has fallen pretty heavily out of favor with serious researchers.

In light of that perspective, I think you have to consider the goal. Is your goal to punish people, or to reduce the number and severity of victims? Mine is the latter. Personally, I think this sort of thing diverts many more low-level offenders toward low-effort outlets than it emboldens toward higher-severity ones. I think this is ultimately a mental-health problem, and zero-tolerance mandatory reporting (while well-meaning) does more harm than good.

I’d rather that those with these kinds of mental issues have 1. tools to take the edge off in victimless ways, and 2. safe spaces to discuss these inclinations without fear of incarceration. I think blockading those avenues yields a net increase in the number and severity of actual victims.

2 points

Thanks for being honest and well-meaning. Sorry you’re getting downvoted, we both said pretty much exactly the same thing! A difficult subject, but important to get right…

66 points

But, I’m still torn on the first scenario…

To me it comes down to a single question:

“Does exposure and availability to CSAM for pedophiles correlate with increased or decreased likelihood of harming a child?”

If there’s a reduction effect by providing an outlet for arousal that isn’t actually harming anyone - that sounds like a pretty big win.

If there’s a force multiplier effect where exposure and availability means it’s even more of an obsession and focus such that there’s increased likelihood to harm children, then society should make the AI generated version illegal too.

52 points

Hoooooly hell, good luck getting that study going. No ethical concerns there!

13 points

How they’ve done it in the past is by tracking the criminal history of people caught with CSAM, arrested for abuse, or some combination thereof, or by tracking the outcomes of people seeking therapy for pedophilia.

It’s not perfect due to the sample biases, and the results are quite inconsistent, even amongst similar populations.

15 points

I think the general consensus is that availability of CSAM is bad, because it desensitizes and makes harming actual children more likely. But I must admit I only remember reading about that and don’t have a scientific source.

19 points

I’m willing to bet it’ll differ from person to person, to complicate matters further.

7 points

I’m fine with it just being illegal, but realistically you could just ban its transmission and distribution, which covers the enforceable scenarios. You can police someone sending or posting that stuff; it’s probably next to impossible to police someone generating it at home.

3 points

Agreed. And props for making a point that isn’t palatable. The first one is complicated. Not many folk I talk to can set aside their revulsion and consider the situation logically. I wish we didn’t have to in the first place.

