Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.
Can always go to one of the many instances that defederated them. Not like there are account-wide upvote points to lose or anything. (genuine suggestion)
Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:
“We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”
That seems like really, really bad science. Or at least, really, really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.
I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.
Why do we need to know what happened before? A record of the past is just material radicals can use to radicalize others.
Understanding how things happened helps you avoid doing them again. When they say we don’t know what happened, they aren’t talking about individual videos; they’re talking about the algorithm. Google made a radicalizing algorithm and then stopped, but because we don’t know why the first algorithm was so radicalizing, it could happen all over again and we wouldn’t know until literal millions have already gone down the rabbit hole.
Because clickbait works, and the treadmill of more exotic and/or clickbaity titles is infinite as online creators try to establish a niche.
See Elsa from Frozen. When “mainstream” Elsa videos get saturated, it’s impossible to make your video stand out. So you make “Pregnant Elsa” videos instead, which are more clickbaity and get more eyeballs / ad revenue associated with your account. But then “Pregnant Elsa” videos get over-saturated, so you go +1 on the extremity scale to “Pregnant Elsa Gives Birth” videos. Etc. etc. etc.
Next thing you know, sex videos are being shown to children who have clicked on too many Elsa videos in the YouTube radicalization treadmill.
Cheap sexual content, cheap violent videos, and cheap “anger” videos are all the natural result of online content creators just trying to stand out. The more extreme you go, the less competition you have, so the more your audience connects with you. The audience has to get there by stepping along that treadmill, and recommendation algorithms automatically put people down these paths. (Probably unintended behavior back then, but these days it’s a well-known phenomenon.)
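To make the treadmill concrete, here’s a toy sketch. It’s purely illustrative, nothing like YouTube’s actual ranking code, and every title and click-through number is made up. It just shows what “rank purely by engagement, where the edgier variant clicks slightly better” does to a watch history over time:

```python
import random

# Hypothetical catalog: same niche, each rung slightly edgier and slightly more clickable.
CATALOG = [
    ("Elsa video",                 0.10),
    ("Pregnant Elsa video",        0.12),
    ("Pregnant Elsa gives birth",  0.15),
    ("...even edgier variant",     0.20),
]

def recommend(history):
    """Pick the highest-CTR candidate near what the user last clicked."""
    last = history[-1] if history else 0
    # Only the current rung and the next one up count as "relevant" -- the treadmill.
    candidates = range(last, min(last + 2, len(CATALOG)))
    return max(candidates, key=lambda i: CATALOG[i][1])

history = []
for _ in range(10):
    pick = recommend(history)
    title, ctr = CATALOG[pick]
    if random.random() < ctr * 5:  # exaggerated CTR so the drift shows in a few steps
        history.append(pick)

print("Watch history:", [CATALOG[i][0] for i in history])
# The history walks rung by rung toward the most extreme item in the niche,
# even though no single recommendation looks like a big jump.
```

Nobody has to intend the outcome; optimizing each individual click is enough to produce the drift.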
Understanding past problems and solutions is critical to solve the problems of the future.
Do you really not see the importance of establishing a cause and effect relationship to past events?
What a load of dog dung that article is. Justifying censorship, labelling everything that isn’t liked by some political “expert” as far-right extremism.
Censorship of online content is good, but simultaneously the censorship of sexually explicit books in elementary schools is evil.
Neat.
When an algorithm is involved, things change. These aren’t static websites that only get passed around by real people. This is some bizarre pseudo-intelligence that thinks that if you like WWII history and bratwurst, you’d also like neo-Nazi content. That’s not an exaggeration: one of my very left-leaning friends started getting neo-Nazi videos suggested to him, and I suspect it was for those reasons.
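For what it’s worth, here’s a toy item-to-item co-occurrence recommender (made-up session data and video names, nothing to do with YouTube’s actual system) showing how that kind of guilt-by-association can fall out of naive collaborative filtering:

```python
from collections import Counter
from itertools import combinations

# Hypothetical watch sessions (entirely made up).
sessions = [
    {"wwii_documentary", "bratwurst_recipe", "neo_nazi_channel"},
    {"wwii_documentary", "bratwurst_recipe", "neo_nazi_channel"},
    {"wwii_documentary", "tank_restoration"},
    {"bratwurst_recipe", "oktoberfest_vlog"},
]

# Count how often each pair of videos shows up in the same session.
co_watch = Counter()
for s in sessions:
    for a, b in combinations(sorted(s), 2):
        co_watch[(a, b)] += 1
        co_watch[(b, a)] += 1

def suggest(history, k=1):
    """Score unseen videos by total co-watch count with the user's history."""
    scores = Counter()
    for seen in history:
        for (a, b), n in co_watch.items():
            if a == seen and b not in history:
                scores[b] += n
    return [title for title, _ in scores.most_common(k)]

print(suggest({"wwii_documentary", "bratwurst_recipe"}))
# -> ['neo_nazi_channel'] on this toy data: pure guilt-by-co-occurrence,
#    with no model of why those users overlapped.
```

The system has no idea why the items co-occur; a small cluster of users who watched all three is enough to build the bridge.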
Also, YouTube isn’t a free speech platform. It’s an advertisement platform. The Fediverse is a free speech platform, although it’s free speech for the person paying the hosting bills.