“Absolutely no way to prevent this”, says internet full of banners offering to “Undress your classmates now!”
“Tools are just tools, and there’s no sense in restricting access to undress_that_ap_chemistry_hottie.exe
because it wouldn’t prevent even a single case of abuse and would also destroy every legitimate use of any computer anywhere”, said user deepfake-yiff69@lemmy.dbzer0.com
It’s possible I just haven’t come across those types of comments you’re making fun of, but I usually just see people making the case that we don’t need new, possibly overreaching, legislation to handle these situations. They want to avoid a disingenuous “think of the children” kind of situation.
a youth court in the city of Badajoz said it had convicted the minors of 20 counts of creating child abuse images and 20 counts of offences against their victims’ moral integrity
I’m not familiar with their legal system but I would be willing to bet the crimes they’ve committed were already illegal under existing laws.
They are releasing stories like this to promote the new law that requires adults to log in to porn sites and to limit their use of them.
I read the headline and said oh come on. One paragraph in and that turned to what in the absolute fuck.
Are you surprised by teenage boys making fake nudes of girls in their school? I’m surprised by how few of these cases have made the news.
I don’t think there’s any way to put this cat back in the bag. We should probably work on teaching boys not to be horrible.
Teenagers are literally retarded. Like their reasoning centers are not developed and they physically cannot think. There’s no way to teach that.
No, they’re not fully developed, but they can distinguish between right and wrong (even older children can), and they can choose to do better.
Having been a teenage boy myself, I wouldn’t dream of trying.
But I knew it wasn’t OK to climb a tree with binoculars to try to catch a glimpse of the girl next door changing clothes, and I knew it wasn’t OK to touch people without their consent. I knew people who did things like that were peeping toms and rapists. I believed peeping toms and rapists would be socially ostracized and legally punished more harshly than they often are in reality.
Making and sharing deepfakes of real people without their consent belongs on the same spectrum.
Being horny is one thing; sharing this stuff is another. If whoever made the fakes had kept them to themselves, nobody would even have known. The headline is still ass and typical “AI” hysteria, though.
There are always two paths to take: take away all of humanity’s tools, or aggressively police the people who abuse them. No matter the tool (AI, computers, guns, cars, hydraulic presses), there will be somebody who abuses it, and for society to function properly we have to do something about that delinquent minority.
Guns do not belong in the list. Guns are weapons, not tools. Don’t bother posting some random edge case that accounts for approximately 0.000001% of use. This is a basic category error.
Governments should make rules banning and/or regulating weapons.
No matter the tool (AI, computers, guns, cars, hydraulic presses) there will be somebody who abuses it,
Did the Hydraulic Press Channel guy offend you somehow? I’m missing something here.
We could also do a better job of teaching people from childhood not to be assholes.
Why not also go after these software companies for allowing such images to be generated in the first place, i.e. for allowing AI-generated nudes to be produced from uploaded photos of real people?
This sounds great, but it’s one of those things that is infinitely easier to say than do. You’re essentially asking for one of two things: Manual human intervention for every single image uploaded, or “the perfect image recognition system.” And honestly, the first is fraught with its own issues, and the second does not exist.
Under Spanish law, minors under 14 cannot be charged.
What about the parents?