54 points

People have been creating and posting realistic-looking fake celebrity nudes for quite literally decades now, but now they’re using AI and it’s suddenly a problem?

41 points

I’m not sure if you noticed, but people who write for a living have suddenly started writing quite a lot about how technology that can write and generate media is bad.

9 points

Which is so silly, because AI writing still needs a human editor. I write for a living, and there’s tons of work that involves using AI as a tool to increase productivity rather than to replace writers completely… like Photoshop didn’t put photographers out of business, it just changed the workflow.

12 points

I work in a clinical setting where some doctors are trying an AI program that generates their clinical notes from the casual conversation between them and the patient. It’s way off the mark for the quality we demand. It requires significant editing from the healthcare provider, and if the note needs to be at all robust, it quickly becomes more of a chore than modern voice transcription. Our review is not great so far.

34 points

It’s because a person can crank out a deep fake in 3 hours, and a crappy one in one. It never cropped up before because… well, let’s be real, it was a couple of weirdos doing it, and unless it bubbles up from the dark corners of the internet, you risk the Streisand effect by bringing attention to it.

AI can crank out 40 in a minute: 7,200 in three hours. That’s an entirely different beast. The sheer mass and volume ramps up the odds of any image bubbling up from the dark corners of the web and falling into the limelight, and now this problem that wasn’t big enough to merit thought is rearing its ugly head right in front of us.

You can generate unique pictures of Taylor Swift faster than even Taylor Swift can generate pictures of Taylor Swift. Within one hour of Taylor Swift being seen with a man (provided you have enough images of the man), you can create a dozen images of her on a date with him and attempt to sell them to paparazzi.

The problem is volume. Just like how email connected everyone and allowed the Nigerian prince scam to take off.
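The arithmetic above can be sketched out explicitly. Note that both rates (3 hours per hand-made fake, 40 AI images per minute) are the commenter’s rough estimates, not measured figures:

```python
# Rough throughput comparison, using the figures claimed in the comment above.
# These rates are the commenter's estimates, not measured numbers.
manual_hours_per_image = 3   # one hand-made deepfake takes ~3 hours
ai_images_per_minute = 40    # claimed AI generation rate

window_hours = 3

# In a 3-hour window: one manual fake vs. a flood of AI images.
manual_count = window_hours // manual_hours_per_image
ai_count = ai_images_per_minute * 60 * window_hours

print(manual_count)  # 1
print(ai_count)      # 7200
```

The point of the comment isn’t the exact numbers; it’s that roughly four orders of magnitude more output makes the “bubbling up” probability a near-certainty rather than a rare event.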

22 points

It’s also not just limited to Taylor Swift.

People don’t care now because people don’t think famous people should have any right to dignity, but their minds are gonna change really quickly when it’s their sister, mother or daughter.

7 points

I don’t think many have gone viral on social media before, or took less than 5 minutes to create. My uneducated guess is that previously this stuff stayed in some niche forum in the recesses of the internet.

3 points

My uneducated guess is that previously this stuff would be in some niche forum in the recesses of the internet

Not really. Back then, there were public Usenet groups specifically dedicated to fake porn of the [hot actress of the day].

4 points

I can see why people are upset, and I agree that distribution of these images can be an issue, but this has the same energy as “I am mad that a certain picture of me is on the internet; I demand that they take it down.” Sorry, that’s not going to happen any time soon.

2 points

You could be honest and acknowledge that there is a massive difference in time investment and skill required between the old way of creating fake porn of non-consenting people and the new way.

2 points

It’s now massively accessible and realistic. Yes, it’s a problem.

13 points

Google.com/images

Damn, look at that: been accessible for decades.

2 points

The tools are accessible. I wish this place wasn’t full of weirdo AI tech bros sometimes.

20 points

I was part of a Blursed AI group on Facebook that had been a lot of fun, but this week it suddenly shifted first to Taylor Swift porn, and then very quickly to alt-right MAGA shit. The comments on the Trump and MAGA-related images were very on board with it too. The change was so abrupt that I got the fuck out of there. It seemed orchestrated.

19 points
2 points

Hahaha, that’s glorious!

13 points

Platforms, including X, where the images were first shared

They also now have a dedicated platform at their disposal. Expect more trash to be posted first on X in the future.

1 point

Back in the day they had Usenet groups, rotten.com, and a long list of other sites to post fake porn and other questionable content.

Not that I’m saying it’s right, just that it’s nothing new.
