A.I. Is Making the Sexual Exploitation of Girls Even Worse
Parents, schools and our laws need to catch up to technology, fast.

20 points

AI is definitely making things worse. When I was at school there was no tool for creating a deepfake of girls; now boys sneak a pic and use an app to undress them. That then gets shared, girls find out and obviously become distressed. Without AI, boys would have to either sneak into toilets/changing rooms or physically remove girls’ clothes.

I’m sorry, but this is bullshit. You could “photoshop” someone’s face/head onto someone else’s body before “AI” was a thing. Here’s a tutorial that lets you do this within minutes, seconds if you know what you’re doing: https://www.photopea.com/tuts/swap-faces-online/

That’s an article about one company that provides an app for deepfakes. It’s a shell corp, so it’s not easy to shut it down or arrest people through the law, and hundreds of teenage girls have already been affected by others creating non-consensual nudes of them.

Also a very ignorant take. You can download Stable Diffusion for free and add a face swapper to that too. Generating decent-looking bodies might actually take you longer than just taking a photo of someone and using my previous editing method, though.

8 points

You could do everything before, that’s true, but you needed knowledge/time/effort, so the phenomenon was very limited. Now that it’s easy, the number of victims (if we can call them that) is huge. And that changes things. It’s always been wrong. Now it’s also a problem.

7 points

This is right. To do it before you had to be a bit smart and motivated. That’s a smaller cross section of people. Now any nasty fuck with an app on their phone can bully and harass their classmates.

2 points

I’m not sure you listened to what I said or even attempted it yourself. The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do. In both cases the result should just be ignored as far as personal feelings go, and reported as far as legal matters go, or reported to your teachers. You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because “AI” is such a hot topic at the moment. In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.

0 points

The time/effort here is very similar; both methods have their own quirks that make them better or worse than the other, but both are very fast and very easy to do.

You’re lying to yourself and you must know that, or you’re just making false assumptions. But let’s go through this step by step.

Now with a “nudify” app:

  • install a free app
  • snap a picture
  • click a button
  • you have a fake nude

Before:

  • snap a picture
  • go to a PC
  • buy Photoshop for $30/month (sure) or search for a pirated version, download a crack, install it and pray that it works
  • find a picture that fits with the person you’ve photographed
  • read a guide online
  • try to do it
  • you have (maybe) a bad fake nude

That’s my first point. Second:

the result should just be ignored as far as personal feelings go

Tell it to the girl who killed herself because everyone thought that her leaked “nudes” were actual nudes. People do not work how you think they do.

You don’t need special laws to file for harassment or even possible blackmail. This whole thing is just overblown fake hysteria and media panic because “AI” is such a hot topic at the moment

True, you probably don’t need new laws. But the emergence of generative AI warrants a public discussion about its consequences. There IS a lot of hype around AI, but generative AI is here and is having/will have a tangible impact. You can be an AI skeptic but also recognise that some things are actually happening.

In a few years this will all go away again because no one really cares that much, and real leaked nudes will possibly even be declared deepfakes to confuse people.

For this to happen, things will have to get WAY worse before they get better. And that means people will suffer and possibly kill themselves, as has already happened. Are we ready to let that happen?

Also, we’re talking only about fake nudes here, but if you consider that GenAI is going to spread throughout every aspect of our world, your point becomes even more absurd.

