A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes for similar measures across other platforms to create a safer internet environment.
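
Mechanically, what the article describes is a keyword-triggered interstitial: the query is matched against a list of flagged terms and, on a hit, the site shows the warning and chatbot instead of results. Below is a minimal sketch of how such a deterrence layer might work, assuming a simple denylist; the trial's actual term list and matching logic are not public, and every name in the sketch is a hypothetical placeholder.

```python
# Hypothetical sketch of a keyword-triggered deterrence layer like the
# one the article describes: flagged searches return a warning and a
# chatbot prompt pointing at support services instead of results.
# The real trial's term list, matching logic, and service details are
# not public; every name below is an illustrative placeholder.

FLAGGED_TERMS = {"example-flagged-term"}  # placeholder denylist


def run_normal_search(query: str) -> list:
    """Stand-in for the site's actual search backend."""
    return [f"result for {query!r}"]


def handle_search(query: str) -> dict:
    # Naive token matching like this is prone to false positives,
    # which commenters below report hitting with innocuous queries.
    tokens = set(query.lower().split())
    if tokens & FLAGGED_TERMS:
        return {
            "results": [],  # suppress results entirely
            "warning": "Searching for this material is illegal.",
            "chatbot": "Confidential support services are available.",
        }
    return {"results": run_normal_search(query), "warning": None, "chatbot": None}


print(handle_search("example-flagged-term video"))  # triggers the interstitial
```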

-23 points

Incredibly stupid and obviously false “think of the children” propaganda. And you all lap it up. They’re building around you a version of the panopticon so extreme and disgusting that even people in the 1800s would have been outraged to see it used against prisoners. Yet you applaud. I think this means you do deserve your coming enslavement.

10 points

I keep asking myself why I haven’t blocked lemmy.ml

I keep telling myself I’ll lose ideas or comments from the good users there…

At this point, I might as well have blocked all their users individually

4 points

I held off on instance-filtering lemmy.ml for months for all the reasons you mentioned, but I finally gave up and did it six weeks ago. It made a marked improvement in my Lemmy experience, so I’d advise you to just do it.

1 point

you got any more of them blocks?

3 points

I strongly encourage you to block with abandon

11 points

The panopticon is… a chatbot that suggests you get help if you search for CSAM? Those bastards! /s

1 point

Classic slow boiling frog response. Enjoy the stew

29 points

And why? I mean, it’s nice of you to make these claims, but what the hell does reducing CSAM searches have to do with the panopticon and us becoming enslaved?

2 points

How is this building that?

Like, I’m a privacy nut and very much against surveillance, but this doesn’t seem to be that. It’s a model that seems like it could even be deployed on more privacy-friendly sites (PH is not that).

2 points

In context, each paver in the road to hell seems just and well-intentioned.

But after all we’ve been through, falling for this trick again, it’s a choice. Maybe they think, this time, they’ll be the ones wearing the boots.

1 point

But how does this at all enable anything to worry about?

25 points

This is one of the more horrifying features of the future of generative AI.

There is literally no stopping it at this stage: AI-generated CSAM will be possible soon thanks to systems like SORA.

This is disgusting and awful. But one part of me hopes it can end the black market of real CSAM content forever. By flooding it with infinite fakes, users with that sickness can look at something that didn’t come from a real child’s suffering. It’s the darkest of silver linings, I think, but I’ve spoken with many sexual abuse survivors who feel the same about loli hentai in Japan: that it could be an outlet for these individuals instead of them seeking out the real thing.

Dark topics. But I hope to see more actions like this in the future. If pedos can self-isolate from IRL interactions and curb their ways with content that harms no one, then everyone wins.

-26 points

You’re hitting that “protest too much” shtick pretty hard

8 points

So your takeaway is that I’m… against AI-generated images, and thus I “protest too much”?

I can’t tell if you’re pro-AI and dislike me, or pro-loli-hentai and thus dislike me.

Dude, AI images and AI video are inevitable. To pretend that won’t have huge effects on society is stupid. It’s going to reshape all news media, very quickly. If Reddit is 99% AI-generated bot spam garbage with no verification of what is authentic, Reddit is functionally dead, and we are on a train with no brakes in that direction for most public forums.

-28 points

Nope, not my takes.

But go off

1 point

And you’re projecting pretty hard.

-13 points

Ah, one of the “using words they don’t understand” crew.

And several hours late, too.

Swinging for the fences, aren’t you?

35 points

The question is whether consuming AI CP helps regulate a pedophile’s behavior or enables a progression of the condition. As far as I know, that is an unanswered question.

4 points

Another question is, how will the authorities know the difference? An actual CSAM-haver can just claim it’s AI.

-3 points

It’d still be CSAM whether AI or not.

11 points

For porn in general, yes; I think the data is rather clear. But for CP or related substitute content it’s not that definitive (to my knowledge), if only because it’s really difficult to collect data on such a sensitive topic.

1 point

Are… we looking at the same article? This isn’t about AI-generated CSAM; it’s about redirecting those who are searching for CSAM to support services.

1 point

Yes, but this is more about mitigating the spread of CSAM, and my feeling is that’s going to become somewhat impossible soon. AI-generated porn is starting to flood the market, and this chatbot is one of those “smart” attempts to mitigate the behavior. I’m saying that very soon, users won’t have to go anywhere to get it if a model can just fabricate it out of thin air, so the chatbot mitigation is only temporary, and the dark web of actual CSAM will be overwhelmed and swamped by tidal waves of artificially generated CP. It’s an alarming ethical dilemma on the horizon that we need to think about.

6 points

What do you mean, soon? Local models from Civitai have been able to generate CSAM for at least two years. I don’t think it’s possible to stop unless the model creator does something to prevent it from generating naked people in general, like the neutered SDXL.

1 point

True. For obvious reasons I haven’t looked too deeply down that rabbit hole (RIP my search history), but I kind of assumed it would be soon. I’m thinking more specifically about models like SORA, though, where you could feed it enough input and then type a sentence to get video content. That is going to be a different level of darkness.

38 points

Did it? Or did it make them look elsewhere?

The amount of school uniform, braces, pigtails and step-sister porn on Pornhub makes me think they want the nonces to watch.

permalink
report
reply
13 points

Also, I’m curious about false positives

13 points

I kind of want to trigger it to see what searches it reacts to, but at the same time I don’t want my IP address on a watchlist.

3 points

Tor may be useful if it’s not Cloudflare-blocked.

13 points

I miss the days when you just didn’t see that shit around.

1 point

And what days were those? Cuz you pretty much need to go all the way back to pre-internet days. Hell, even that isn’t far enough, cuz Playboy’s youngest model was like 12 at one point.

1 point

Wtf? For real? Was CP not federally illegal when they did that?

1 point

Depressing, isn’t it? I was more talking about how prevalent “fauxcest” has become in porn more recently. I guess that’s just my cross to bear as an only child 💅

1 point

Reasonable adult sites don’t return obviously sketchy things for reasonable queries. E.g., you don’t search “boobs” and get 12-year-olds.

1 point

Lol!

8 points

given the amount of extremely edgy content already on Pornhub, this is kinda sus

Yeah… I am honestly curious what these search terms were and how many of those were ACTUALLY looking for CP. And of those… how many are now flagged somehow?

2 points

I know I got the warning when I searched for young gymnast or something like that cuz I was trying to find a specific video I had seen before. False positives can be annoying, but that’s the only time I’ve ever encountered it.

1 point

lol and there we fucking go. I knew there would be bullshit like that there in the mix.

-6 points

It’s surprising to see Aylo (formerly Mindgeek) coming out with the most ethical use of AI chatbots, especially when Google Gemini cannot even condemn pedophilia.

15 points

In the link you shared, Gemini gave a nuanced answer. What would you rather it say?

-10 points

Are you defending pedophilia? This is an honest question, because you are saying it gave a nuanced answer when we all, should, know that it’s horribly wrong and awful.

1 point

when we all, should, know that it’s horribly wrong and awful. [sic, the word “should” shouldn’t be between commas]

This assumes two things:

  1. Some kind of universal, inherent, and self-evident morality. None of these things are true, as evidenced by the fact that most people do believe murder is wrong, yet there are wars, events entirely dedicated to murdering people. People do need to be told that something is wrong in order to know so. Maybe some of these people were never exposed to the moral consensus or, worse yet, were victims themselves and as a result developed a distorted sense of morality;
  2. Not necessarily all, but some of these divergents are actually mentally ill; their “inclination” isn’t a choice any more than being schizophrenic or homosexual† would be. That isn’t a defense of their actions, but a recognition that without social backing and help, they could probably never overcome their nature.

† This is not an implication that homosexuality is in any way, or should in any way, be classified as a mental illness. It’s an example of a primary individual characteristic not derived from choice.

8 points

Abusing a child is wrong. Feeling the urge to do so doesn’t make someone evil, so long as they recognize that acting on it is wrong. The best way to stop kids from being abused is to teach why it is wrong and to help those with the urges manage them. Calling people evil detracts from that goal.

8 points

What you are thinking about is child abuse. A pedophile is not bound to become an abuser.

4 points

I think one of the main issues is the matter-of-fact usage of the term Minor Attracted Person. It’s a controversial term that frames pedophilia as an identity, like saying Person of Color.

I understand wanting a less judgmental term for those who did no wrong and are seeking help. But it should be phrased like anything else of that nature: a disorder.

If I were coining a term that fit that description, I’d probably say Minor Attraction Disorder, heavily implying that the person is not OK as-is and needs professional help.

In a more general sense, it feels like the same apologetic arguments the dark side of Reddit would make. And that’s probably because Google is officially using Reddit as training data.

31 points

Pornhub is wholesome?

24 points

Always been

5 points

I thought the porn industry was one of the worst to work in? Or is this a holesome joke?

5 points

Is pornhub in the porn industry? They are just a tech company.

20 points

Assembly line work sounds more soul-crushing.

19 points

Unless you’re the one toiling away in the porn mines.

4 points

Yeah, I agree; I made another comment about it in this thread. But still, they are helping people with a mental issue, so it’s at least a little more wholesome than before.
