Not a good look for Mastodon - what can be done to automate the removal of CSAM?

9 points

Is this “child abuse material” by the Blahaj.zone admin’s definition, or actual child abuse material?

4 points

Or maybe it’s better to err on the side of caution when it comes to what is maybe one of the worst legal offences you can commit?

I’m tired of people harping on this decision when it’s a perfectly legitimate one from a legal standpoint. There’s a reason tons of places are very iffy about NSFW content.

-7 points

The article points out that the strength of the Fediverse is also its downside: federated moderation makes it challenging to consistently moderate CSAM.

We have seen it even here with the challenges of Lemmynsfw. In fact, they have taken the stance that CSAM-like images featuring of-age models made to look underage are fine, as long as there is some dodgy ‘age verification’.

The idea is that abusive instances would get defederated, but I think we are going to find that inadequate without some sort of centralized reporting escalation and AI auto-screening.
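For what it’s worth, the auto-screening part doesn’t have to start with AI at all. A minimal sketch of hash-list screening, the kind of thing a centralized escalation service could offer instances (everything here is hypothetical: real deployments use perceptual hashes such as PhotoDNA supplied by clearinghouses like NCMEC, not plain cryptographic hashes, so that re-encoded or slightly edited images still match):

```python
import hashlib

# Hypothetical shared block list of known-bad image hashes.
# A real service would distribute perceptual hashes, not SHA-256.
KNOWN_BAD_HASHES: set[str] = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def screen_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches the shared block list and
    should be rejected and escalated to a human moderator."""
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_BAD_HASHES
```

Exact-hash matching like this produces no false positives by construction; the trade-off is that it only catches already-known material, which is where the fuzzier (and more error-prone) AI classifiers come in.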

0 points

How do you plan to train the AI to recognise CP?

15 points

The problem with screening by AI is there are going to be false positives, and it’s going to be extremely challenging and frustrating to fight them.

Last month I got an automated letter for a speeding infraction: it was generated by a camera, the plate read by OCR, and the letter I received (from “Seat Pleasant, Maryland,” lol) was supposedly signed off by a human police officer, but the image was so blurry that the plate was practically unreadable. Which is what happened: the OCR got one of the letters wrong, and I got a speeding ticket from a town I’ve never been to, had never even heard of before I got that letter.

The letter was full of helpful ways to pay for and dispense with the ticket, but to challenge it I had to do it in writing; there was no email address anywhere in the letter. I had to go to their website and sift through dozens of pages to find one that had any chance of being able to do something about it, and I made a couple of false steps along the way. THEN, after calling them up and explaining the situation, they apologized and said they’d dismiss the charge–which they failed to do: I got another letter about it just TODAY saying a late fee had now been tacked on.

And this was mere OCR, which has been in use for multiple decades and is fairly stable now. This pleasant process is coming to anything involving AI as a judging mechanism.

8 points

Off topic, but a few years ago a town in Tennessee had their speed camera contractor screw up in this same way. Unfortunately for them, they tagged an elderly couple whose son was a very good attorney. He sued the town for enough in civil liability to bankrupt it and force it to disincorporate.

Speed cameras are all but completely illegal in TN now.

6 points

When I lived in Clarksville, they had intersection cameras to ticket anyone who ran a red light. A couple of problems with it:

  1. Drivers started slamming on their brakes, causing more accidents.
  2. The city outsourced the cameras, so it received only pennies on the dollar for every ticket.

I think they eventually removed them, but I can’t recall. I visited last September to take a class for work, and I didn’t see any cameras, so they might be gone.

7 points

> THEN, after calling them up and explaining the situation, they apologized and said they’d dismiss the charge–which they failed to do

That sounds about right. When I was in college I got a speeding ticket halfway between the college town and the city my parents lived in. I couldn’t afford the fine, being a poor college student, so I called the court and asked if an extension was possible. They told me absolutely, how long do you need, and I started saving up. Shortly before I had enough, I got a call from my mom: she had received a letter saying there was a bench warrant for my arrest over the fine.

78 points

According to corporate news everything outside of the corporate internet is pedophiles.

30 points

Well, terrorists became boring, and they still want the loony wing of the GOP’s clicks, so best to back off on Nazis and pro-Russians, leaving pedophiles as the safest bet.

3 points

Nazis no longer being the go-to target for a poisoning-the-well approach worries me on many different levels.

3 points

Agreed. I’m in my 40s, and I’ve never seen anywhere near the level of subsurface signaling and intentional complacency we’re experiencing now.

17 points

I’m not actually going to read all that, but I’m going to take a few guesses that I’m quite sure are going to be correct.

First, I don’t think Mastodon has a “massive child abuse material” problem at all. I think it has, at best, a “racy Japanese-style cartoon drawing” problem or, at worst, an “AI-generated smut meant to look underage” problem. I’m also quite sure there are monsters operating in the shadows, dogwhistling and hashtagging to each other to find like-minded people to set up private exchanges (or instances) for actual CSAM. This is no different from any other platform on the Internet, Mastodon or not. This is no different from the golden age of IRC. This is no different from Tor. This is no different from the USENET and BBS days. People use computers for nefarious shit.

All that having been said, I’m equally sure that this “research” claims that some algorithm has found “actual child porn” on Mastodon that has been verified by some “trusted third part(y|ies)” that may or may not be named. I’m also sure this “research” spends an inordinate amount of time pointing out the “shortcomings” of Mastodon (i.e. no built-in “features” that would allow corporations/governments to conduct what is essentially dragnet surveillance on traffic) and how this has to change “for the safety of the children.”

How right was I?

2 points

…If you read it, then you’d know if you’re right.

16 points

> The content in question is unfortunately something that has become very common in recent months: CSAM (child sexual abuse material), generally AI-generated.

AI is now apparently generating entire children, abusing them, and uploading video of it.

Or, they are counting “CSAM-like” images as CSAM.

13 points

Of course they’re counting “CSAM-like” in the stats, otherwise they wouldn’t have any stats at all. In any case, they don’t really care about child abuse at all. They care about a platform existing that they haven’t been able to wrap their slimy tentacles around yet.

3 points

I’m not going to read all that. You were probably pretty right.

9 points

Halfway there. The PDF lists drawn 2D/3D, AI/ML-generated 2D, and real-life CSAM. It does highlight the actual problem of young platforms with immature moderation tools not being able to deal with a sudden influx of objectionable content.

1 point

@corb3t

It quotes Thiel.

Ew

Ew
EW EW EW EW EW EW EW EW
