A trial program conducted by Pornhub in collaboration with UK-based child protection organizations aimed to deter users from searching for child sexual abuse material (CSAM) on its website. Whenever CSAM-related terms were searched, a warning message and a chatbot appeared, directing users to support services. The trial reported a significant reduction in CSAM searches and an increase in users seeking help. Despite limitations in the data and the difficulty of measuring deterrence, the chatbot showed promise in discouraging illegal behavior online. While the trial has ended, the chatbot and warnings remain active on Pornhub’s UK site, with hopes that similar measures will spread to other platforms and create a safer internet environment.

165 points

That kinda sounds reasonable. Especially if it can prevent someone going down that rabbit hole? Good job PH.

124 points

Imagine a porn site telling you to seek help because you’re a filthy pervert. That’s gotta push some people to get help, I’d think.

45 points

Imagine how dumb, in addition to deranged, these people would have to be to look for child porn on a basically legitimate website. Misleading headline too: it didn’t stop anything, it just told them “not here.”

19 points

We have culturally drawn a line in the sand where one side is legal and the other side of the line is illegal.

Of course the real world isn’t like that - there’s a range of material available and a lot of it is pretty close to being abusive material, while still being perfectly legal because it falls on the right side of someone’s date of birth.

It sounds like this initiative, Pornhub’s chatbot, successfully pushes people away from borderline content… I’m not sure I buy that… but if it’s directing some of those users to support services, then that’s a good thing. I worry, though, that some people might instead be pushed over to the dark web.

13 points

Yeah…I forgot that the UK classifies some activities between consenting adults as “abusive”, and it seems some people are now using that definition in the real world.

14 points

I mean, is it dumb?

Didn’t Pornhub face a massive lawsuit or something because of the amount of unmoderated child porn that uploaders hid in its bowels (in addition to videos of rape victims, revenge porn, etc.), to the point that they apparently only allow verified uploaders now and purged a huge swath of their videos?

-8 points

“I’m just asking questions”

12 points

Until a few years ago, when they finally stopped allowing unmoderated, user-uploaded content, they had a ton of very problematic videos. And they were roasted about it in public for years, including by many of the unconsenting, sometimes underage subjects of those videos, and they did nothing. Good that they finally acted, but they had trained users for years that it was a place to find that content.

-13 points

Yeah, I believe everything the government says through the media too.

-2 points

Pornhub also knowingly hosted child porn. The game Ready or Not put them on blast for it in a mission where you raid a company called “Mindjot” for distributing child porn.

18 points

“Filthy pervert” is downplaying it, but yeah, I definitely hope to see more of this.

11 points

IIRC Xhamster started doing this a few years ago, minus the AI chatbot.

8 points

Didn’t they just block certain search terms (which actually made the site somewhat difficult to use for legitimate/legal content)?

3 points

The ol’ Scunthorpe problem.

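For anyone unfamiliar, the Scunthorpe problem is what happens when a naive substring blocklist flags innocent words that merely contain a banned string. A minimal Python sketch of the failure and one common mitigation (word-boundary matching); the blocklist here is a placeholder, not any site’s real filter:

```python
# Sketch of the Scunthorpe problem: a naive substring filter flags
# innocent words, while a word-boundary filter does not.
import re

BLOCKLIST = ["cunt"]  # the substring hiding inside "Scunthorpe"

def naive_filter(text: str) -> bool:
    """Flag any text containing a blocked substring anywhere."""
    lowered = text.lower()
    return any(term in lowered for term in BLOCKLIST)

def boundary_filter(text: str) -> bool:
    """Flag only whole-word matches, avoiding the false positive."""
    lowered = text.lower()
    return any(re.search(rf"\b{re.escape(term)}\b", lowered)
               for term in BLOCKLIST)

print(naive_filter("Scunthorpe United"))     # True: false positive
print(boundary_filter("Scunthorpe United"))  # False: correctly allowed
```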
-2 points

Lol

88 points

Sounds like a good feature. Anything that stops people from doing that is great.

But I do have to wonder… were people really expecting to find that content on PornHub? That site certainly seems legit enough that I doubt they’d have that stuff on there. I’d imagine most actual content would be on the dark web and specialty groups, not on PH.

71 points

PH had a pretty big problem with CSAM a few years ago; they ended up wiping ~2/3 of their user-submitted content to try to fix it. (Note: they wiped all non-verified user-submitted videos; not all of it was CSAM.)

And I’m guessing they are trying to catch users who are trending towards questionable material. “College”✅ -> “Teen”⚠️ -> “Young Teen”⚠️⚠️⚠️ -> “CSAM”🚔, etc.

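If that guess is right, the detection could be as simple as accumulating a per-session risk score. A purely illustrative Python sketch; the terms, weights, and threshold are hypothetical, not Pornhub’s actual system:

```python
# Purely illustrative: a naive per-session risk score over search terms.
# Terms, weights, and threshold are made up for the example.

RISK_WEIGHTS = {"teen": 1, "young teen": 3}   # borderline-but-legal terms
BLOCKED_TERMS = {"csam"}                      # hard matches: warn immediately
WARN_THRESHOLD = 4                            # cumulative score that triggers the chatbot

def assess_session(searches):
    """Return 'intervene' or 'allow' for a sequence of searches."""
    score = 0
    for raw in searches:
        term = raw.lower().strip()
        if term in BLOCKED_TERMS:
            return "intervene"        # outright blocked search
        score += RISK_WEIGHTS.get(term, 0)
        if score >= WARN_THRESHOLD:
            return "intervene"        # escalating trend of borderline searches
    return "allow"

print(assess_session(["college", "teen", "young teen"]))  # -> intervene
```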
30 points

That explains why it’s all commercial stuff now… So I heard.

7 points

Sure sure, whatever you say Big Dick :D

18 points

Wow, that bad? I was aware they purged a lot of ‘amateur’ content over concerns regarding consent to upload/revenge porn, but I didn’t know it was that much.

33 points

Yeah, unverified user content had a lot of problems. Also piracy, gore, etc.

https://arstechnica.com/tech-policy/2020/12/pornhub-purges-all-unverified-user-uploads-in-wake-of-abuse-allegations/

“The purge appears to have hit almost 9 million of the 13.5 million videos on Pornhub as of Sunday, or nearly two-thirds of all the content hosted on the site.”

17 points

I think it’s an early prevention type of thing.

6 points

It had all sorts of illegal things before they purged all unverified uploads due to legal pressure.

2 points

“Were people really expecting to find that content on PornHub?”

Welcome to the internet 😂 where people constantly disappoint/surprise you (what word is that? Dissurprise? Disurprint?)

1 point

So… Pornhub has actually had problems with CSAM. It used to be much more of a YouTube-like platform where anyone could upload.

Even without that aspect, there are a looot of producers that don’t do their checks well, and a lot of underage actresses fall through the cracks.

69 points

The headline is slightly misleading. 2.8 million searches were halted, but according to the article they didn’t attempt to figure out how many of those searches came from the same users. So thankfully the number of secret pedophiles in the UK is probably much lower than the headline might suggest.

75 points

I suspect a lot of CSAM searches come from underage users themselves

37 points
Deleted by creator
21 points

Same thing for me when I was 13. I freaked the fuck out when I saw a Wikipedia article on the right. I thought I was going to jail the next day lmfao

32 points

I’d think it’s probably not a majority, but I do wonder what percentage it actually is. I do have distinct memories of being like 12 and trying to find porn of people my own age instead of “gross old people” and being confused why I couldn’t find anything. Kids are stupid lol, that’s why laws protecting them need to exist.

Also good god when I become a parent I am going to do proper network monitoring; in hindsight I should not have been left unattended on the internet at 12.

14 points

I was the same back then, and came across some stuff that was surprisingly easy to find. Only later did I realize how messed up it was.

I think monitoring is good, but there’s a fine line not to cross with your child’s privacy. If they suspect anything, they’ll figure out how to work around it, and you lose any insight.

6 points

It’s not about laws, it’s about sexual education. Sexual education is a topic that can’t be left to the parents and should be explained in school, so as to give the kids a complete knowledge base.

Most parents know about as much about sex as they know about medicines: they’ve had some, but that doesn’t give them a degree for teaching the subject.

2 points

Sorry, I know this is a serious subject and not a laughing matter, but that’s a funny situation. I guess I was a MILF hunter at that age, because even then I was perfectly happy to knock one out watching adult porn instead!

56 points

4.4 million sounds a bit excessive. Facebook marketplace intercepted my search for “unwanted gift” once and insisted I seek help. These things have a lot of false positives.

9 points

What is an “unwanted gift” though?

17 points

Probably just looking for deals on new stuff that people don’t care about having been gifted.

I could definitely see “unwanted gift” being a code word for trafficking :(

9 points

Not necessarily trafficking, but could be trafficking-adjacent.

There used to be “child rehoming” ‘services’ on Facebook and the like, for people who regretted adopting a kid and wanted to pass them to others. Here’s a fairly in-depth article on the whole affair. Unsurprisingly, it didn’t go well.

EDIT: In hindsight, “unwanted gift” could also be about people getting unexpectedly pregnant, and putting the resulting child up for adoption, but not wanting to go through legal means for one reason or another, which seems a more likely answer.

2 points

Lol, makes sense. Meta being really meta here, but if that’s needed… better too much than too little.

1 point

Do you really think human traffickers are listing people under secret codes on accounts obviously linked to their real identities, with their real faces? Remember the Ikea thing, where vendors who didn’t specify a price received an absurd default price for their goods (e.g. 9999.99), and people claimed that furniture listed at that price corresponded to kids being sold?

8 points

On Facebook marketplace just after Christmas? A potential bargain on unopened merch, of course!

-34 points

Found one of the guys who spoke to the bot
