There’s another round of CSAM attacks and it’s really disturbing to see those images. They weren’t taken down immediately, and there was even a disgusting shithead in the comments who thought it was funny?? the fuck

It’s gone now but it was up for like an hour?? This really ruined my day and now I’m figuring out how to download tetris. It’s really sickening.

14 points

It will also make it a battle of attrition, because now we’re not only using AI to block CSAM; trolls are using AI to generate it.

The issue is that these tools typically work by hashing the image (or a specific section of the image) and checking it against a database of known CSAM. That way you never actually need to view the file to compare it to the list. But with AI image generation, that list of known CSAM is essentially useless because trolls can just generate new images.
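The hash-and-compare approach described above can be sketched roughly like this. This is a minimal illustration using an exact cryptographic hash and a placeholder hash list; real scanning tools use perceptual hashes and proprietary databases, neither of which is shown here:

```python
import hashlib

# Hypothetical set of hashes of known bad images (placeholder value:
# this happens to be the SHA-256 digest of an empty file).
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_known_list(image_bytes: bytes) -> bool:
    """Hash the uploaded file and check it against the known list,
    without ever rendering or viewing the image itself."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in known_hashes
```

The moderation side only ever handles digests, never the images themselves, which is the whole appeal of the approach, and also why it can only catch content that is already in the database.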

6 points

Even without the issue of new AI-generated images, those hash-based scanning tools aren’t available to hobbyist projects like the typical Lemmy instance. If they were given to hobbyist projects, it would be really easy for an abuser to just tweak their image collection until it didn’t set off the filter.

7 points

You can use Cloudflare’s CSAM scanning tool completely for free. You can’t get access to the hashes themselves, which is what would allow the kind of tweaking you’re talking about.

5 points

Sure, for Lemmy instances that are Cloudflare customers. But I don’t think it can be integrated into the Lemmy code by default.

4 points

Bingo, that’s the issue. With an endless supply of fresh content, hash checking is dead.

4 points

On the other hand, if the people who want those images can satisfy their urges using AI fakes, that could mean less spreading of images of actual abuse. It might even mean less abuse happening.

However, because they’re terrible people, I have to suspect that’s not the case.

4 points

People who create the content are insane monsters, but a LOT of actual pedos (vs predators looking for a power play) are disgusted by their own preference. I know a ton of them already look to cartoons for stimulation, so I think AI content could draw more people away from actual material. Hopefully, if demand drops, there will be less creation of new real content as the potential profits fall relative to the risk.
