11 points

We need more decentralization: a federated image/GIF host with CSAM protections.

2 points

How would one implement CSAM protection? You'd need actual ML to check for it, and I don't think there are trained models available. And then you'd have to find someone willing to train such a model, somehow. Also, running an ML model would be quite expensive in energy and hardware.

2 points

There are models for detecting adult material; idk how well they'd work on CSAM, though. Additionally, there exists a hash-identification system for known images. Idk if it's available to the public, but I know Apple has one.

Idk, but we gotta figure out something.
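The hash-identification idea above can be sketched as a perceptual hash compared against a blocklist of known hashes. This is a minimal, dependency-free illustration using a simple "average hash"; production systems in this space (e.g. Microsoft's PhotoDNA, Apple's NeuralHash) use far more robust algorithms, and the real hash databases are access-restricted, not public. The 8x8 grayscale-grid input format and the `matches_blocklist` helper are assumptions made for this sketch, not any platform's actual API.

```python
def average_hash(pixels):
    """64-bit perceptual 'average hash' of an 8x8 grayscale grid.

    `pixels` is a list of 8 rows of 8 ints (0-255). Each bit is 1
    if that pixel is at or above the grid's mean brightness.
    (A real host would first decode and resize the upload to 8x8
    with an imaging library; that step is omitted here.)
    """
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    bits = 0
    for p in flat:
        bits = (bits << 1) | (1 if p >= mean else 0)
    return bits

def hamming(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_blocklist(pixels, blocklist, threshold=5):
    """True if the image's hash is within `threshold` bits of any known hash."""
    h = average_hash(pixels)
    return any(hamming(h, known) <= threshold for known in blocklist)
```

The Hamming-distance threshold is the key tuning knob: a nonzero threshold lets the match survive re-encoding and minor edits, at the cost of more false positives.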


Memes

!memes@lemmy.ml

Rules:

  1. Be civil and nice.
  2. Try not to excessively repost; as a rule of thumb, wait at least 2 months before reposting if you have to.

Community stats

  • 8.9K monthly active users
  • 12K posts
  • 264K comments