Someone used Shutterstock’s AI image generator to create them.

69 points

A note on the page warns, “Shutterstock does not review AI-generated content for compliance with Shutterstock’s content compliance standards,” adding that users must not generate imagery that is “false, misleading, deceptive, harmful, or violent.”

“Pls don’t be bad mmkay?”

“We’ve done all we possibly can.”

36 points

I’m curious what they call disturbing, but also don’t want to see in case they’re right.

7 points

The article doesn’t mention nudity but what they described is still pretty fucked.

15 points

I think it really depends on what “young girl” means in this context. The title says “children,” but the article never does, so I’m unsure whether this is another AI-boogeyman article or something else.

4 points

A “young girl” would be a “child,” and multiple young girls would be children. 🤨

34 points

This may be controversial, but I don’t care what kind of AI-generated images people create as long as it’s obvious they’re not reality. Where I worry is the creation of believable false narratives, from explicit deepfakes of real people to completely fictional newsworthy events.

2 points

I agree here. I’m not worried about imaginary things, except for their potential to pass as real and muddy the truth.

1 point

I’ve read that pedophiles are more likely to act on their urges if they have access to real images. I would guess this applies to AI-generated images too, even if they don’t look 100% real, but I could be wrong on that. Whatever stops them from abusing kids is what I’m for.

4 points

I want to say research on the subject has been inconclusive overall. I’d certainly update my view given convincing evidence that fictional images lead to abuse of real children.

Of course, none of that has anything to do with the non-explicit video linked elsewhere in this thread of an adult woman using the toilet.

1 point
Deleted by creator
3 points

It’s not really CSAM if there is no abuse happening, is it?

-2 points

Yes, because it perpetuates demand.

9 points

There’s some pretty weird stuff on there, like kids taking a bath and someone on the toilet: https://www.shutterstock.com/video/clip-26807341-woman-sitting-on-toilet-bathroom-young-girl

24 points
3 points

Nope nope nope. Not even risking it.

3 points

Called it
