2 points

My god there are way too many comments in here trying to normalize pedophilia. Disgusting. Pathetic.

These are people that need serious psychiatric care, not acceptance or to be included in the LGBTQ+ community. There is absolutely nothing to compare between them and any group within the LGBTQ+ community. Nothing.

Combatting CP is a hard enough task for the poor bastards that have to do it. There does not need to be AI produced images in the mix.

Lemmy, do better.

28 points

Not that I think they should be included in LGBTQ+, but as someone who is bisexual I feel they’re not as far from us as you seemingly believe. Why wouldn’t we compare them? Both are sexual attractions that deviate from the norm. A pedophile didn’t choose to be a pedophile any more than I chose to be bi.

Growing up in a conservative household and town was a miserable experience for me. I hated myself, didn’t want to accept it, and felt utterly alone. Now think about how much worse it must be to realize you’re attracted to children. You have zero allies, you have zero people you can talk to, and a lot of people hate you merely for existing and/or want you dead. From where I stand their experience echoes my experience being LGBTQ+ quite heavily. Except over my lifetime LGBTQ+ acceptance grew quite rapidly and my husband was the light at the end of the tunnel. But pedophiles will never get that, probably ever. I feel nothing but sympathy for their situation.

And what “serious psychiatric care” do you even think there is for it? Unless you also believe in gay conversion camps, we have nothing. We don’t even really know how sexuality actually works in the brain, we definitely aren’t anywhere close to being able to treat it.

-11 points

Why wouldn’t we compare them?

Really? What part of your sexuality, or mine, involves raping children? Nothing, right? One step back, what part of you being bisexual or my being trans involves harming anyone? That’s right, nothing.

I don’t have the answer of how to deal with those that are attracted to children. But to suggest psychiatric care for those who have serious pathology is akin to gay conversion camps is gross.

This is not some philosophical debate. Stop playing into the hands of bigots who are actively trying to paint LGBTQ+ folks, especially trans people at the moment, as “groomers” and “pedos”.

We are not associated or comparable with pedophiles in any way, shape or form—full stop.

-1 points

I’m with you. Lot of goofs in this thread. Fucking hell.

10 points

Most people in jail for abusing children are not pedophiles, but normal rapists, and kids unfortunately just happen to be easy targets. Even most pedophiles have morals: they know what they like is wrong and they wouldn’t want to hurt anyone. Just like most men aren’t rapists despite being turned on by women.

Just imagine being born as someone with these urges. What a shitty fucking hand you’ve been dealt and as if that’s not bad enough, people want to murder you just for coming out and asking for help.

-18 points

I’m sorry, they make it make sense by using disease. They can’t just say paedophiles are bad because they don’t want to believe in ‘bad’. It is a philosophy debate though; it’s evil versus sick. They’ll agree you’re not evil, but you’ll get lumped into sick.

18 points

Tf are you talking about, unless being gay involves raping men, being pedo also doesn’t involve raping children. Even as a cishet non-pedo you will often encounter situations where acting on some attraction you feel would be anywhere from morally questionable to straight up illegal, and most of us manage to deal with that just fine. Of course that’s going to be tougher for someone whose entire experience consists of that, rather than just part of it, but nothing about being pedo forces you to become a child-raping piece of shit.

Of course psychiatric care is important, but the point the other commenter was making is that it’s currently impossible to change anyone’s attraction, so it’s not a pathology that can be “cured” in this way. Any psychiatric care currently has to be aimed at helping people deal with being pedo without acting on it and also not developing any other psychological afflictions because of suppressing their attraction. Trying to “cure” the attraction itself would indeed be akin to gay conversion therapy: there’s no scientific evidence it works, and it’s going to do more harm than good.

8 points

Consensual non-consent folks be like

-6 points

Excuse me what? I’m pansexual and fucking what? I’m nothing like a kiddy diddler. I don’t revel in the agony inflicted onto a child. These people get off on violence and destroying people. These victims are never the same again. That’s why parents catching someone doing this to a child will kill the perpetrator and nobody would fault them. Pedos are criminally insane if anything.

24 points

And what “serious psychiatric care” do you even think there is for it? Unless you also believe in gay conversion camps, we have nothing. We don’t even really know how sexuality actually works in the brain, we definitely aren’t anywhere close to being able to treat it.

There are programmes that focus on how to deal with it in a societally acceptable way, mainly on how not to become a predator. That’s a pretty good start.

-11 points

Off the bat, I wholly disagree with the idea that this should have been legal. That filth, even if AI generated, should be illegal for a multitude of reasons, one of them being that it allows those… urges to be practiced. I’m not one for the slippery slope fallacy but in situations where it could escalate to real child abuse, there should be zero tolerance and indulgence. If it’s a mental illness, they don’t need to fulfill that urge.

That said, I think the people suggesting otherwise here are just looking at it from a perspective of numbers and nothing else, with little consideration of the significant downsides. The stance also ignores that offenders are likely in it for the taboo more than actual interest in kids— it sure seems like Epstein’s friends were mostly doing it because they could, and it was a new level of depravity to try. If you ignore all of these, AI generated filth could indeed reduce actual child abuse. That’s a good thing and theoretically comes with no additional suffering, right?

I see this as naivety. Rude to imply about others here, but better than CSAM apologism. It’s about the best I can think of, and I try to assume the best in people these days.

Also to make clear why I think the slippery slope is valid here, making some form of that awful “interest” legal dramatically lowers the bar of entry. And unlike violent films that are accused of increasing violence, that filth will never have wider societal acceptance, so a legal but taboo on-ramp is more likely to lead to illegal and taboo viewing, then perhaps onto the real thing. Society should never be willing to risk that by indulging in their mental illness.

37 points

I think pedophiles should be treated with compassion, as being a pedophile doesn’t make someone a sexual predator.

IMO the stigma against pedophiles worsens their mental state and could push them to become sexual predators. This is just a guess though.

However, I do think “treatment” of pedophilia with generated CP should only be tried after conducting proper research into its actual effectiveness (maybe with general sex offenders and regular porn). In the end I think the top priority should be to minimize the number of pedophiles who are also predators.

-25 points

Sex offenders aren’t allowed to watch porn because the evidence suggests it doesn’t treat the behavior, but encourages it.

14 points

Having a hard time finding the evidence you mention, got a citation? First few articles I saw were actually advising against blanket pornography bans.

-23 points

Thanks chatbot

-7 points

The stigma against racism and sexism, I guess, is also making people want to hurt those groups?

8 points

I believe racism and sexism are choices, while I think most pedophiles would prefer not to be pedophiles. If a pedophile doesn’t hurt anyone, why should people want to hurt him?

-23 points

The psychologists have tried to normalize everything, and sympathy for the devil is the greatest signal of one’s virtuous compassion. There is no evil anymore, all characters are grey; just ask the Game of Thrones fans about all the sadistic psychopaths in that story, none are truly bad.
One day soon someone will build a robot version of their own child to rape and abuse, and people will hail it as the perfect solution. And when that child finds out, they will be told to take a chill pill because there is no harm done.
Paedophiles are not sick, they are part of the natural variation of the species. Sometimes those variations are harmful and that needs to be addressed. If someone died and made me god, I would murder-suicide everyone who isn’t a card-carrying vegan pacifist. Yes, I’m a monster too. Failing that, I vote we name, shame and imprison the bad people. Yes, I believe in good and bad. No, I’m not religious.
The left will never get anywhere with this moral nihilism.

18 points

You sound like a Scientologist.

12 points

I’m trying to be better by not treating all pedophiles as child-abusers-in-waiting. Humans are capable of not acting on base immoral instincts.

2 points

Right, not everyone who wants to kill someone is a murderer.

104 points

Hear me out on this one:

If we take it as given that pedophilia is a disorder and ultimately a sickness, wouldn’t it be better that these people get their fix from AI-created media than from the real thing?

IMO there was no harm done to any kid in the creation of this and it would be better to give these people the fix they need or at least desperately desire in this way before they advance to more desperate and harmful measures.

-11 points

Except for the source CSAM required to get the model started, of course.

64 points

That is not required. Especially the larger models like DALL-E 3 can combine concepts even without being directly trained on them. The one they had in the showcase for DALL-E 2 was a chair shaped like an avocado. It knows what a chair is and it knows what an avocado is, so it can combine them. So it can know “this is what a naked human looks like” and “this is what a human child looks like” and could combine them without having ever seen CSAM.

-61 points

Did somebody audit the dataset?

Ya, I thought so…

1 point
Deleted by creator
-1 points

Jfc what’s with these pedo apologists. If someone were a cannibal, would it be totally fine to just give them human flesh removed from surgeries or dead people? Maybe let him pay people to eat them and drink their blood? AI images are trained on actual CP and CP anyway should not be normalized. If someone has ideation of violence then the last thing you do is feed those ideations. Would you think a suicidal person should watch simulated suicide? Why would watching simulated acts of depraved violence because you enjoy them somehow prevent you from committing that act yourself? If you enjoy something that much then you are thinking about doing it yourself.

4 points

Actually the analogy here would be to give “wannabe cannibals” synthetic meat/stuff that tastes like human meat/stuff

-34 points

Because eventually looking at images might not be enough

Edit: Do we want to be normalising this? It’s disturbing how there are people defending it.

45 points

By that logic almost everyone in Hollywood should be in prison for depicting violence, murder, rape etc in movies/shows etc. This argument was put to rest back in the '90s.

-2 points
Deleted by creator
51 points

Bro I’ve watched a lot of regular porn and never once have I gone out and thought “why yes I’d sure like to rape that person”

-17 points

But you will at least have an outlet if you get yourself a partner or hire an escort. There’s the prospect of sex in real life. You’re not forever limited to porn.

27 points

Slippery slope argument goes brrrrrr

-35 points

That’d be like giving an alcoholic a pint at the end of the week to reward the alcoholic behavior that they’d want out of.

That’d be like giving money to a gambling addict as they promise to ‘pay you back’ for the loan you’ve given them.

My point is, enabling people’s worst habits is always a bad idea.

And how can you guarantee for certain that after a while of this AI-generated CP crap, they eventually wouldn’t want the real thing down the road and therefore attempt crimes?

Your solution is just dumb altogether.

35 points

…Aren’t drug patches already a thing for more extreme drugs? I feel like you just gave bad examples when there’s actual examples that exist…

29 points

There is literally no data to back up your slippery slope argument.

2 points

You really like spamming that “slippery slope” term, don’t you? It’s like your ultimate go-to for feeling superior. Just wait until one of these days you use it in a context where it doesn’t fit and you look like a dumbass.

28 points

I’m not an expert on the psychology of pedophilia, but I don’t think it has anything to do with addiction. It seems to be a paraphilia/disorder.

https://en.wikipedia.org/wiki/Pedophilia

1 point

I don’t claim to be an expert either, but it’s kind of a no-brainer to see what addiction is and what it does to people. Really simple stuff.

1 point

No, it’s like flooding the rhino horn market with fake rhino horn. Literally.

-7 points
Deleted by creator
-18 points

Sex offenders aren’t allowed to watch porn at all in my state.

Because science suggests watching porn, and getting your fix as you put it, through porn, encourages the behavior.

Watching child porn teaches the mind to go to children to fulfill sexual urges. Mindfulness practice has been shown to be effective in curbing urges in all forms of addiction.

So, no. Just no to your whole post.

There’s effective treatment for addictions, whether sexual or otherwise, whether the addiction feeds on children or heroin. And we don’t need to see if fake child porn helps. Evidence already suggests it doesn’t, and we already have effective treatments that don’t put children at risk and that don’t encourage the behavior.

4 points

This isn’t about addiction, it’s about sexuality. And you can’t just curb your whole sexuality away. These people have a disorder that makes them sexually attracted to children. At this point there is no harm done yet. They just are doomed to live a very unfulfilling life, because the people with whom they want to engage in sexual practices can’t give their consent, which is morally and legally required, no question about that. And most of them don’t give in to these urges and seek the help they need.

But still, you can’t just meditate your whole sexuality away. I don’t want to assume, but I bet you also masturbate or pleasure yourself in one way or another, I know I do. And when I was young, fantasy was all I needed, but then I saw my first nude and watched my first porno and it progressed from there, and I’m sure fantasy won’t be enough for these people as well. So when they get to the stage where they want to consume media, I prefer it to be AI created images or some drawn hentai of a naked young girl or whatever, and not real abused children.

11 points

As mentioned on another one of your comments, I am having a hard time finding the science you reference.

17 points

Not judging/voting your comment, do you have the data at hand? Just out of interest.

Some input though: you are not making a distinction between offenders and non-offenders, and I doubt there is even good data on non-offenders to begin with.

-19 points

What do you think those AI models are trained on?

21 points
Deleted by creator
-10 points

So it wasn’t trained on pictures of astronauts and pictures of horses?

4 points

I think that astronaut has hooves for hands

16 points

Not child porn. AI produces images all the time of things that aren’t in its training set. That’s kind of the point of it.

-15 points

AI produces images all the time of things that aren’t in its training set.

AI models learn statistical connections from the data they’re provided. They’re going to see connections we can’t, but they’re not going to create things that are not connected to their training data. The closer the connection, the better the result.

From that it’s a pretty easy conclusion that CSAM material will be used to train such models, and since training requires lots of data, and new data to create different and better models…

-60 points

Just stop being attracted to kids you sicko

21 points

That’s like telling gay dudes to stop liking dick. It’s brain chemistry and neural circuits, you can’t exactly just snap your fingers and be rid of the problem. Humans are complex creatures.

-14 points

Pedophilia is not akin to being gay (and kindly fuck off with that tyvm). It’s akin to rape, or sexual sadism (and I mean real, violent sadism, not roleplay). It is a predatory inclination and @nxsfi is right - trying to frame it as an “orientation” does sound like MAP acceptance rhetoric.

-3 points

Obligatory “are you comparing pedophiles to LGBTQ?”

6 points

Stop being gay, stop being trans, stop being attracted to fatties, stop being attracted to small people.

-17 points

Stop trying to make LGBTP a thing

37 points

This is like telling someone to “stop liking rock music” or “stop enjoying ice cream.” People don’t decide what their preferences are, they just have them. If we can give pedophiles a way to release those urges without harming children that should be a good thing. Well not good, but positive in the relative sense at least.

-42 points

Sounds exactly like MAP acceptance rhetoric to me.

8 points

“Anyone who disagrees with me is a child rapist.” That’s the level of argumentation I expect from a child or a fascist.

-2 points

What’s more disgusting, nonces or those who say that it’s fascist to ban them?

4 points

Or politicians who want chat control

-14 points

I don’t believe it’s a sickness. Humans vary in innumerable ways, and defining natural variations as sickness is a social decision, not a medical one. If you look at the DSM you will find that social problems are sometimes given as a reason for defining something as illness. This is just the medicalisation of everything.
Even if you grant that it’s a sickness, how does it follow that the sickness should therefore be treated with AI? I see no argument or logic here. Do you think harm would be done if the paedophile knows the child? If the child finds out they are the object of rape fantasies? If you find you are married to a person who gets off on raping children? Your children?
Do you allow for disgust and horror at sadistic desires, or are we ‘not allowed to kink shame’?

32 points

That’s basically how I feel. I’d much rather these kinds of people jack it to drawings and AI-generated images if the alternative is that they’re going to go after real children.

-48 points

At some point the fake images won’t do it for them, and then they’d fix their attention on real kids. We don’t want to wait for that to happen.

It’s like using a drug, with your threshold increasing each time you use; there will be a time when your old limit will have no effect on your satisfaction level.

34 points
Deleted by creator
28 points

Source: dude just trust me

36 points

Is that proven or just bullshit speculation?

28 points

By your logic, does everyone who’s into bdsm have a sex dungeon in their bedroom?

Your comment reduces everyone to their base fetishes, as if that were the only thing exerting pressure on an individual to act, and I don’t believe that’s the case.

12 points

Do you know how much porn there is of the My Little Pony characters? Tons

Do you know how much of an epidemic there is of cartoon watchers going out and fucking ponies? Somewhere between null and zilch… Maybe one or two extreme cases, but that’s around the same number of people who watch superhero movies and try to jump off the roof in order to fly.

This is a slippery slope fallacy if I ever saw it.

Heck, if anything we’ve seen that restrictions on porn actually lead to increased instances of sexual assault, in the same way a crackdown on drugs just leads to more deaths from overdoses.

If letting some sicko have fake images of pretend children saves even one real child from being viciously exploited, I think it’s worth it.

It’s not ideal and yeah, it makes the skin of any sane person crawl… Ideally we should be out curing pedophiles of their sexual urges entirely, but since we don’t have a way to do that, why let perfect be the enemy of good? I mean, what other ideas do we have? Cause “To Catch A Predator” may have been good television, but even that had ethical concerns ending in lost lawsuits and suicides, and castrating everyone convicted isn’t exactly 8th Amendment friendly… and even then that prevents repeat offenses, not initial offenses. (Prevention > Cure)

Now all this aside, we do need to look at this on a case by case basis. If real children are being used to model for the AI or fake images are used as a form of blackmail (Think “Revenge Porn”, but way, way worse), then cuffs need to be slapped on people.

52 points

Some of the comments here are so stupid: “either they unpedophile themselves or we just kill them for their thoughts”

Ok so let me think this through. Sexual preferences in any form are pretty normal and they don’t go away. Actually, if you tend to ignore them they become stronger. Also, being a pedophile is not a crime currently; it’s the acting on it. So what happens right now is that people bottle it up, then it gets too much and they act on it in gruesome ways, because “if I go to prison I might as well make sure it was worth it”. Kids get hurt.

“But we could make thinking about it illegal!” No we can’t. Say that’s a law, what now? If you don’t like someone, they’re a “pedophile”. Yay, more false imprisonment. Also what happens to real pedophiles? Well, they start committing more acts, because there’s punishment even for restraint. And the truth is a lot of ppl have pedophilic tendencies. You will not catch all of them. Things will just get worse.

So why AI? Well as the commenter above me already said, if there’s no victim, there’s no problems. While that doesn’t make extortion legal (I mean obv. it’s a different law), this could make ppl with those urges have more restraint. We could even still limit it to specific sites and make it non-shareable. We’d have more control over it.

I know ppl still want the easy solution which evidently doesn’t work, but imo this is a perfect solution.

-1 points

Pedo isn’t a sexual preference any more than cannibalism is a dietary one…

2 points

You know what? Sure. Imagine I find ppl really tasty, especially hands. But I never chew on one. I just think about it. Literally the same thing. You should be rewarded for restraint of these urges. If I’d get punished for thinking about munching on a thumb, I’d at least take a hand with me to jail. I’m going there anyway.

19 points

I largely agree with what you’re saying and there definitely is no easy solution. I’ve never understood why drawings or sex dolls depicting underage people are illegal in some places, as they are victimless crimes.

The issue with aigen that differentiates it a bit from the above is the fidelity. You can tell a doll or an anime isn’t real, but a few years from now it’ll be difficult to spot aigen images. This isn’t unique to this scenario though; it’s going to wreak havoc on politics, scams, etc., but there is the potential that real CP comes out of hiding and is somewhat shielded by the aigen.

Of course this is just speculation; I hope it would go the other way around and everyone just jacks off at their computers and CP disappears completely. We need to stop focusing our attention on people with pedophilia, get them mental support, and focus on sex offenders who are actually hurting people.

3 points

I’m all for letting people have their dolls, drawings, and AI generated stuff, but yeah… it would become easy for offenders to say “Naw, I snatched that shit off of DALL-E.” and walk in court, so some kind of forensic tool that can tell AI Generated Images from Real Ones would have to be made…

Actually there’s a lot of reasons we’d want a tool like that that have nothing to do with hypothetical solutions to kiddie diddling.

Can you imagine how easy extortion would become if you could show an AI pictures of your neighbor next door killing some rando missing person in the area? But every new technology enables crime, until we find out what the proper safeguards are so I’m not too worried about it in the long-term.

1 point

There’s also a difference (not sure if clinically) between people who sexualize really young kids and someone who likes kids just under whatever age society has decided splits children from adults. In the USA, porn depicting the latter is fine as long as everyone is over the age of adulthood, even if they dress up to look younger.

I think in general people who refer to pedophilia are usually referring to the former and not the 30 year old dating a 17 year old or whatever. But the latter makes it a little weird. Images of fictional people don’t have ages. Can you charge anyone who has aigen porn with csam if the people depicted sorta look underage?

Ai generated content is gonna bring a lot of questions like these that we’re gonna have to grapple with as a society.

1 point

The first part of your comment is rather confusing to me, but the latter part I fully agree with. Deciding age based on appearance is a thing that will haunt us even more with AI until we find new solutions. But that is gonna be one of a list of big questions to be asked in conjunction with new AI laws.

10 points

I pretty much agree, while we should never treat Pedophilia as “Just another perfectly valid sexuality, let’s throw a parade, it’s nothing to be ashamed of” (Having the urge to prey on children is ABSOLUTELY something to be ashamed of even if you can’t control it.), we need to face facts… It isn’t someone waking up one day and saying “Wouldn’t it be funny if I took little Billy out back and filled him full of cock?”

It’s something going on in their head, something chemical, some misfiring of the neurons, just the way their endocrine system is built.

As much as I’d love to wave a magic wand over these people I reluctantly call people and cure them of their desires, we don’t have the power to do that. No amount of therapy in the world can change someone’s sexual tastes.

So in lieu of an ideal solution, finding ways to prevent pedophiles from seeking victims in the first place is the next best thing.

It’s not dissimilar to how when we set up centers for drug addicted people to get small doses of what they’re addicted to so that they can fight withdrawal symptoms, crimes and death rates go down. When you enact things like universal basic income and SNAP, people have less of a reason to rob banks and gas stations so we see less of them.

It’s not enough to punish people who do something wrong, we need to find out why they’re doing it and eliminate the underlying cause.

10 points
Deleted by creator
-43 points

While I don’t disagree with the initial premise, image AI requires training images.

I suppose technically you could use hyper-realistic CGI CSAM, and then it could potentially be a “victimless” crime. But the chances of an AI being trained solely on CGI are basically non-existent. Photorealistic CGI is tough and takes a lot of time and skill to create from scratch. There are people whose entire careers are built upon photorealism, and their services aren’t cheap. And you’d probably need a team of artists (not just one artist, because the AI will inevitably end up learning whatever their “style” is and nothing more,) who are both capable and willing to create said images. The chances of all of those pieces falling into place are damned near 0.

Maybe you could supplement the CGI with young-looking pornstar material? There are plenty of pornstars whose entire schtick is looking young. But they definitely don’t look like children because the proportions are obviously all wrong; Children have larger heads compared to their bodies, for example. That’s not something that an adult actress can emulate simply by being flat chested. So these supplemental images could just as easily end up polluting (for lack of a better word) your AI’s training, because it would just learn to spit out images of flat chested adult women.

34 points

Generative AI is perfectly capable of combining concepts. Teach it how to do photorealistic underage imagery and photorealistic porn, and it can combine them to make CSAM without ever being trained on actual CSAM.

-7 points

No, it’s not.

14 points

In the US we ignore mental illness, make excuses for it, and then patiently wait until something terrible happens.

4 points

And when it does, can’t do anything about it “While making sure this never happens again is a noble goal, let’s not politicize this tragedy.”

Or as they say over in Europe “Apparently the Americans say there’s no way to prevent that problem that literally doesn’t happen anywhere else in the world.”

permalink
report
parent
reply
-46 points
Removed by mod
permalink
report
parent
reply
30 points

My autism can also be cured by d*ing… my ADHD can be fixed forever by the same thing. They come with intrusive thoughts, do you also want the final penalty for people like me?

I’m not apologizing for people’s crimes or intentions to commit a crime at all, but your argument is completely bonkers if you want societies to just behave like that.

permalink
report
parent
reply
-34 points

pedophilia is a crime…

Read the original comment

permalink
report
parent
reply
27 points

Cool solution, to kill people who did not offend. You sound like a real humanist. Do you by any chance run a for-profit prison?

permalink
report
parent
reply
3 points

Wrongthink. You are no longer allowed to feel this way under penalty of satisfying our bloodlust. /s

permalink
report
parent
reply
78 points

You have a point, but in at least one of these cases the images used were of the girls around him, and he even tried extorting one of them. Issues like this should be handled on a case-by-case basis.

permalink
report
parent
reply
0 points
Deleted by creator
permalink
report
parent
reply
57 points

I’m very conflicted on this one.

Child porn is one of those things that won’t go away if you prohibit it, like alcohol. It’ll just go underground and cause harm to real children.

AI child pornography images, as disturbing as they might be, would serve a “need”, if you will, while not actually harming children. Since child pornography doesn’t appear to be one of those “try it and you’ll get addicted” things, I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

permalink
report
reply
-7 points

I’m genuinely wondering if this would actually reduce the harm caused to real children. If so, I think it should be legal.

So tired of seeing this point made. Allowing animated or AI-generated CSAM to exist openly and legally will not reduce violence against children. It will increase it. It will normalize it.

You seem to think people who are willing and capable of committing sexual violence against children are going to do it less when there’s a robust market of legally accessible CSAM.

It won’t. It will instead create predator pipelines. It will take people with mild sexual disorders and histories of their own sexual assaults as children and feed them CSAM. It will create more predators.

It will allow communities of pedophiles to exist openly, findable via Google searches and advertised on regular porn sites.

Also, the people who make AI-generated CSAM are not going to be watermarking it as AI generated.

They are going to make it as real as possible. It will be indistinguishable to the naked eye and will thus allow actual CSAM to masquerade as AI generated.

I could go on. But I’m not an expert on any of this.

permalink
report
parent
reply
8 points
*

You completely ignored the “state controlled generation and access” part of the argument. Experience with addictive drugs has shown us that tightly controlled access, oversight and possibly treatment can be a much better solution than just making it illegal. The truth is that we just don’t know if it would work the same with CSAM, but we do know that making it a taboo topic doesn’t work.

permalink
report
parent
reply
-2 points

There’s no parallel here. Providing safe access to drugs reduces harm to the user and the harm done by the black-market drug trade. Normalising AI-generated CSAM might reduce the harm done to children during production of the material but it creates many more abusers.

The parallel only works if the “state controlled generation and access” to drugs was an open shop handing out drugs to new users and creating new addicts. Which is pretty much how the opiate epidemic was created by drug companies, pharmacists and doctors using their legitimate status for entirely illegitimate purposes.

permalink
report
parent
reply
1 point

You make a huge amount of claims, all as fact. How do you know that any of it is true? I’m not trying to defend rapists and pedophiles, I’m trying to think rationally and pragmatically about how to solve or at least improve this problem. Your reaction to it seems to be more emotional than rational and factual.

permalink
report
parent
reply
1 point

I’m trying to think rationally and pragmatically

Ahh yes, the rational thought process which leads you to think a government is capable of safely facilitating the production of CSAM?

They are unable to stop child poverty but totally capable of producing CSAM in a safe way…

Spare me your fact-finding mission.

I’m not an expert or a social worker, but I can tell you that drug addiction and pedophilia are not the same.

To consider these two the same, as the original commenter did, is disgusting, offensive and ignorant.

There is no inherent victim with drug use. The same cannot be said of pedophilia and child sexual assault.

While there is always a spectrum of people participating in child victimization, the people who create CSAM and those who participate in its distribution are not addicts. They are predators.

I’m not trying to defend rapists and pedophiles

Well, you are…

permalink
report
parent
reply
35 points

Normalisation in culture has effects on how people behave in the real world. Look at Japan’s sexualization of women and minors, and how they have huge problems with sexual assault. It’s not about whether or not real children are getting hurt, it’s about whether it’s morally right or wrong. And as a society, we’ve decided that CP is very wrong as a moral concept.

permalink
report
parent
reply
24 points

Here’s the thing though: being too paranoid about normalization also makes the problem worse, because the truth is that these are people with severe mental problems, most of whom in all likelihood want to seek professional help.

The problem is that the subject is SO taboo that even a lot of mental health professionals will chase them off like rabid animals, when the solution is developing an understanding that can lead to a clinical treatment plan for these cases.

Doing that would also help the CSAM problem, since getting people out of the alleyways and into professional help would shrink the market significantly, both immediately and over time, reducing the amount of content that gets made and, as a result, the number of children victimized to make that content.

The key point remains: we have to stop treating these people like inhuman monsters who deserve death and worse whenever they’re found. They’re sick souls who need robust mental health care and thought-management strategies.

permalink
report
parent
reply
3 points

None of that is an argument for normalisation via legalisation. Offenders and potential offenders should feel safe to seek help. Legalising AI-generated CSAM just makes it much less likely that they’ll see the need to seek help. In much the same way that rapists assume all men are rapists, because most men don’t make it clear that they’re not.

permalink
report
parent
reply
18 points

On the other hand, producing porn is illegal in India and they have huge problems with sexual assault too.

permalink
report
parent
reply
-4 points

Producing - sure. But consuming?

permalink
report
parent
reply
-3 points

You can certainly argue that AI-generated CSAM does less harm but you can’t argue from that to legalising it because it still does a bucketload of harm. Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

This argument is a non-starter and people really need to stop pushing it.

permalink
report
parent
reply
5 points

Consumers of CSAM are very likely to harm real children and normalising CSAM makes that much more likely.

If any of that was objectively true, then yeah, I agree. Problem is, it looks like you just pulled that out of your ass.

permalink
report
parent
reply
1 point

You’re literally claiming a bunch of things as facts. Any sources to back that up?

permalink
report
parent
reply
1 point

Isn’t AI art based on pre-existing content that’s been fed into the model?

permalink
report
parent
reply
21 points

Yes, but not in the way I think you’re implying: it is not trained on CSAM images. It can put the pieces together to varying degrees of success. If you ask for a Martian hedgehog in a tuxedo riding a motorcycle, it can create something looking like that without being trained on exactly that thing.

permalink
report
parent
reply
10 points

Martian hedgehog in a tuxedo riding a motorcycle

Just to prove your point, I fed that into an AI (DreamShaper 8) with no other prompts or anything, and this was the first image it generated.

permalink
report
parent
reply
13 points

I’d go more in the direction of state sponsored generation and controlled access.

If you want legal unlimited access to AI generated CSM, you need to register with the state for it and in so doing also close off access to positions that would put you in situations where you’d be more able to act on it (i.e. employment in schools, child hospitals, church youth leadership, etc).

If doing that, and no children are harmed in the production of the AI generated CSM, then you have a license to view and possess (but not redistribute) the images registered with the system.

But if you don’t have that license (i.e. didn’t register as sexually interested in children) and possess them, or are found to be distributing them, then you face the full force of the law.

permalink
report
parent
reply
-7 points

I think this idea rests on the false premise that people both need and have a right to pornography.

Many adults go about their lives without accessing it/getting off on it. It’s not a human need like food or shelter. So government isn’t going to become a supplier. Parallels could be made, I suppose, with safe injecting rooms and methadone clinics etc - but that’s a medical/health service that protects both the individual and the community. I don’t think the same argument could be made for a government sponsored porn bank.

permalink
report
parent
reply
12 points

You don’t think there’s an argument to be made that motivating people sexually attracted to children to self-report that attraction to the state in order to be monitored and kept away from children would have a social good?

I guess I just don’t really see eye to eye with you on that then.

permalink
report
parent
reply
-7 points

There’s no conflict and no discussion, fuck these pieces of shit!

permalink
report
parent
reply
13 points
*

I’m thinking it should still be illegal, but if they get charged for it, make the charge less severe than one for actual CP. This might naturally incentivize that industry to go for AI-generated images instead of trafficking. Also, I think if they took an image of an actual child and used AI to do this stuff, it should be more severe than using a picture of a legal-aged person to make CP.

permalink
report
parent
reply
4 points

There are many things still unclear about whether or not this will increase harm.

We don’t know how these images affect people and their behaviour. Many techbros online treat it as a fact that media does not influence behaviour and thought processes, but if you look at the research this isn’t clear cut at all. And some research has been able to show that specific media does influence people.

Additionally, and something rarely talked about, these images, stories and videos can be used to groom children and teenagers, either to become victims and/or to become consumers themselves. This was a thing in the past and I bet it is still happening with manga depicting loli hentai. Making these images legal would give groomers an even better tool.

permalink
report
parent
reply
1 point

If Loli porn can turn people into pedophiles then I think humanity is having bigger issues

permalink
report
parent
reply
-6 points

Fuck them kids

permalink
report
parent
reply
15 points

I heard an anonymous interview with someone who was sickened by their own attraction to children. Hearing that person speak changed my perspective. This person had already decided never to marry or have kids and chose a career to that same end, low likelihood that kids would be around. Clearly, since the alternative was giving up on love and family forever, the attraction wasn’t a choice. Child porn that wasn’t made with children, comics I guess, was used to fantasize to prevent carrying through on those desires in real life.

I don’t get it, why anyone would be attracted to kids. It’s gross and hurtful and stupid. If people suffering from this problem have an outlet, though, maybe fewer kids will be hurt.

permalink
report
parent
reply
4 points

It’s an ethical dilemma, just an extremely controversial one. You really have to weigh whether we should tolerate something abhorrent if it means betterment for society as we advance forward.

I don’t think things should be as black and white as legal or not. I think the answer lies somewhere near something like decriminalizing drugs: mostly illegal, but with room to benefit those who are genuinely seeking help. It would just take a lot of convincing that an individual needs to seek out this material, or else they are a danger to those around them.

permalink
report
parent
reply
4 points

(Apologies if I use the wrong terminology here, I’m not an AI expert, just have a fact to share)

The really fucked part is that at least Google has scraped a whole lot of CSAM as well as things like ISIS execution vids, etc., and they have all this stuff stored and use it to do things like train the algorithms for AIs. They refuse to delete this material, as they claim that they just find the stuff and aren’t responsible for what it is.

Getting an AI image generator to produce CSAM means it knows what to show. So why is the individual in jail and not the tech bros?

permalink
report
reply
32 points
*

That’s a fundamental misunderstanding of how diffusion models work. These models extract concepts and can effortlessly combine them to new images.

If it learns woman + crown = queen

and queen - woman + man = king

it is able to combine any such concept together

As Stability AI has noted, any model that has the concept of “naked” and the concept of “child” in it can be used like this. They tried to remove “naked” from Stable Diffusion 2, and nobody used it.

Nobody trained these models on CSAM, and the problem is a dilemma in the same way a knife is a dilemma. We all know a malicious person can use a knife for murder, including of children. Yet society has decided that knives have sufficient other uses that we still allow their sale pretty much everywhere.
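The “king/queen” arithmetic above is the classic word-embedding illustration of how learned representations factor concepts apart and recombine them. Here’s a minimal sketch with hand-picked toy vectors (the dimensions and values are invented purely for illustration; real models learn hundreds of opaque dimensions):

```python
import numpy as np

# Toy embedding vectors with hand-picked, human-readable dimensions:
# [royalty, gender (+1 feminine / -1 masculine), humanness].
# Real learned embeddings are high-dimensional and not interpretable like this.
emb = {
    "king":  np.array([0.9, -1.0, 1.0]),
    "queen": np.array([0.9,  1.0, 1.0]),
    "man":   np.array([0.0, -1.0, 1.0]),
    "woman": np.array([0.0,  1.0, 1.0]),
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# "queen - woman + man" lands on "king": because the model has factored
# royalty and gender into independent directions, it can recombine them
# into a concept it was never explicitly shown.
combined = emb["queen"] - emb["woman"] + emb["man"]
best = max(emb, key=lambda w: cosine(emb[w], combined))
print(best)  # -> king
```

The same factor-and-recombine property is what lets a diffusion model render a “Martian hedgehog in a tuxedo” without ever having seen one, and it is why removing a single concept from a trained model is so hard: the concept is spread across directions that other concepts also use.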

permalink
report
parent
reply
-2 points
*

Editing this reply to say that I was in fact right and I did not have any fundamental misunderstanding of anything. And the dataset in question here is called LAION and contains around 6 billion images scraped from the web, including CSAM images.

Thanks for that. As I said, I’m not big into how AI works, so not surprised I got that wrong. The databases of everything that has come across the clear web are still there though and are available for use by people with access.

permalink
report
parent
reply
5 points

What are you referring to by “the database of everything that has come across the clear web”?

permalink
report
parent
reply
2 points

Here you go bud, no misunderstanding at all. The image generators are trained on CSAM, as I said.

https://www.independent.co.uk/news/ap-study-developers-thorn-canada-b2467386.html

permalink
report
parent
reply
5 points

“This can be used by pedophiles” is used as an argument to ban cryptography… I wonder if someone will apply that to generative AI.

permalink
report
parent
reply
1 point

Depends how profitable it is.

If it can replace workers no, if it threatens the big players like Disney yes.

permalink
report
parent
reply
2 points

Getting an AI image generator to produce CSAM means it knows what to show

Not necessarily. Part of AI is blending different concepts. An AI trained on images of regular children and nude adults should in principle be able to produce underage nudity. This is a side effect of the generality of the model.

permalink
report
parent
reply
-33 points

ITT - Lemmy supports the pedos

permalink
report
reply
-5 points
*

Yeah, I thought the whole MAP bullshit died out, but apparently it’s alive and well on Lemmy. It’s pretty sad.

permalink
report
parent
reply
-1 points

Yeah a lot of the comments, and votes in this thread are really gross.

permalink
report
parent
reply
-6 points
*

Comments in this thread make me laugh, especially the “it’s not pedophilia” parts. LOOOOOOOOOOOL

permalink
report
parent
reply
21 points

Can’t have any nuanced discussion here! Glad to see people such as yourself engaging in reductionism and shutting down thinking, because all interactions online have to be boiled down to five words TL;DR pithy sound bites.

Leave the shit on Twitter, we can do better here.

permalink
report
parent
reply
3 points

I actually typed out a more lengthy response to someone here already, read more responses/viewed the vote counts, and then wrote this top level comment pointing out how backwards this community’s views are. No one is directly supporting assaulting children, but as I wrote elsewhere: “why do we value the sexual gratification of pedos higher than the potential safety of children?”

permalink
report
parent
reply
0 points
*

Who the heck is proposing we value that? Everyone is saying we value the safety of real children which may entail keeping artificial CP legal.

Also it’s a victimless crime so punishments dealt out are criticized heavily, and for good reason.

permalink
report
parent
reply
15 points
*

You clearly have chosen not to understand the arguments people are making in this thread. Either that or you’re choosing to misrepresent them. Literally nobody is supporting sexual assault of children or anyone else. But hey, don’t let that stop you from gloating about how morally superior you are.

permalink
report
parent
reply
18 points

Isn’t the point that no one was harmed, and that you shouldn’t go to prison for stuff that harmed nobody? I mean, ew, it’s disgusting, but not worth jailing someone for…

permalink
report
parent
reply
