Very, Very Few People Are Falling Down the YouTube Rabbit Hole | The site’s crackdown on radicalization seems to have worked. But the world will never know what was happening before that.

6 points

Requesting a paywall circumvention.

12 points

Done. Check my comment.

47 points

The article below:

Around the time of the 2016 election, YouTube became known as a home to the rising alt-right and to massively popular conspiracy theorists. The Google-owned site had more than 1 billion users and was playing host to charismatic personalities who had developed intimate relationships with their audiences, potentially making it a powerful vector for political influence. At the time, Alex Jones’s channel, Infowars, had more than 2 million subscribers. And YouTube’s recommendation algorithm, which accounted for the majority of what people watched on the platform, looked to be pulling people deeper and deeper into dangerous delusions.

The process of “falling down the rabbit hole” was memorably illustrated by personal accounts of people who had ended up on strange paths into the dark heart of the platform, where they were intrigued and then convinced by extremist rhetoric—an interest in critiques of feminism could lead to men’s rights and then white supremacy and then calls for violence. Most troubling is that a person who was not necessarily looking for extreme content could end up watching it because the algorithm noticed a whisper of something in their previous choices. It could exacerbate a person’s worst impulses and take them to a place they wouldn’t have chosen, but would have trouble getting out of.

Just how big a rabbit-hole problem YouTube had wasn’t quite clear, and the company denied it had one at all even as it was making changes to address the criticisms. In early 2019, YouTube announced tweaks to its recommendation system with the goal of dramatically reducing the promotion of “harmful misinformation” and “borderline content” (the kinds of videos that were almost extreme enough to remove, but not quite). At the same time, it also went on a demonetizing spree, blocking shared-ad-revenue programs for YouTube creators who disobeyed its policies about hate speech. Whatever else YouTube continued to allow on its site, the idea was that the rabbit hole would be filled in.

A new peer-reviewed study, published today in Science Advances, suggests that YouTube’s 2019 update worked. The research team was led by Brendan Nyhan, a government professor at Dartmouth who studies polarization in the context of the internet. Nyhan and his co-authors surveyed 1,181 people about their existing political attitudes and then used a custom browser extension to monitor all of their YouTube activity and recommendations for a period of several months at the end of 2020. The study found that extremist videos were watched by only 6 percent of participants. Of those people, the majority had deliberately subscribed to at least one extremist channel, meaning that they hadn’t been pushed there by the algorithm. Further, these people were often coming to extremist videos from external links instead of from within YouTube.

These viewing patterns showed no evidence of a rabbit-hole process as it’s typically imagined: Rather than naive users suddenly and unwittingly finding themselves funneled toward hateful content, “we see people with very high levels of gender and racial resentment seeking this content out,” Nyhan told me. That people are primarily viewing extremist content through subscriptions and external links is something “only [this team has] been able to capture, because of the method,” says Manoel Horta Ribeiro, a researcher at the Swiss Federal Institute of Technology Lausanne who wasn’t involved in the study. Whereas many previous studies of the YouTube rabbit hole have had to use bots to simulate the experience of navigating YouTube’s recommendations—by clicking mindlessly on the next suggested video over and over and over—this is the first that obtained such granular data on real, human behavior.

The study does have an unavoidable flaw: It cannot account for anything that happened on YouTube before the data were collected, in 2020. “It may be the case that the susceptible population was already radicalized during YouTube’s pre-2019 era,” as Nyhan and his co-authors explain in the paper. Extremist content does still exist on YouTube, after all, and some people do still watch it. So there’s a chicken-and-egg dilemma: Which came first, the extremist who watches videos on YouTube, or the YouTuber who encounters extremist content there?

Examining today’s YouTube to try to understand the YouTube of several years ago is, to deploy another metaphor, “a little bit ‘apples and oranges,’” Jonas Kaiser, a researcher at Harvard’s Berkman Klein Center for Internet and Society who wasn’t involved in the study, told me. Though he considers it a solid study, he said he also recognizes the difficulty of learning much about a platform’s past by looking at one sample of users from its present. This was also a significant issue with a collection of new studies about Facebook’s role in political polarization, which were published last month (Nyhan worked on one of them). Those studies demonstrated that, although echo chambers on Facebook do exist, they don’t have major effects on people’s political attitudes today. But they couldn’t demonstrate whether the echo chambers had already had those effects long before the study.

The new research is still important, in part because it proposes a specific, technical definition of rabbit hole. The term has been used in different ways in common speech and even in academic research. Nyhan’s team defined a “rabbit hole event” as one in which a person follows a recommendation to get to a more extreme type of video than they were previously watching. They can’t have been subscribing to the channel they end up on, or to similarly extreme channels, before the recommendation pushed them. This mechanism wasn’t common in their findings at all. They saw it act on only 1 percent of participants, accounting for only 0.002 percent of all views of extremist-channel videos.

This is great to know. But, again, it doesn’t mean that rabbit holes, as the team defined them, weren’t at one point a bigger problem. It’s just a good indication that they seem to be rare right now. Why did it take so long to go looking for the rabbit holes? “It’s a shame we didn’t catch them on both sides of the change,” Nyhan acknowledged. “That would have been ideal.” But it took time to build the browser extension (which is now open source, so it can be used by other researchers), and it also took time to come up with a whole bunch of money. Nyhan estimated that the study received about $100,000 in funding, but an additional National Science Foundation grant that went to a separate team that built the browser extension was huge—almost $500,000.

Nyhan was careful not to say that this paper represents a total exoneration of YouTube. The platform hasn’t stopped letting its subscription feature drive traffic to extremists. It also continues to allow users to publish extremist videos. And learning that only a tiny percentage of users stumble across extremist content isn’t the same as learning that no one does; a tiny percentage of a gargantuan user base still represents a large number of people.

This speaks to the broader problem with last month’s new Facebook research as well: Americans want to understand why the country is so dramatically polarized, and people have seen the huge changes in our technology use and information consumption in the years when that polarization became most obvious. But the web changes every day. Things that YouTube no longer wants to host could still find huge audiences, instead, on platforms such as Rumble; most young people now use TikTok, a platform that barely existed when we started talking about the effects of social media. As soon as we start to unravel one mystery about how the internet affects us, another one takes its place.

19 points

Another way to put that study’s weakness, in scientific terms, is that there’s no control group against which the studied group is being compared. There’s zero indication that the 2019 changes had any effect at all, without some data from before those changes.

5 points

Always love when people try to hold social sciences to the same standard as physical sciences

2 points
Deleted by creator
4 points

I’ve never heard of Rumble before. Apparently it’s a video platform owned by the company behind Truth Social, so it’s very popular with the far right.

1 point

The article below:

Honestly don’t mean this as an attack, but couldn’t people just click the link if they really wanted to read the article?

6 points

It is paywalled and someone explicitly requested it in the comments.

1 point

Fair enough. Thanks for sharing it (didn’t realize it was paywalled).

148 points

Bro, people were eating Tide Pods and we saw a resurgence of Nazism and white nationalism.

I think we at least know the effects of what was happening before.

68 points

Could we convince the Nazis to eat the tide pods?

25 points

…probably

12 points

They are a famously suggestible lot.

16 points

Just get 4chan to convince the imbeciles that it’s a white supremacist symbol like they did with the okay sign.

10 points

You got that turned around. 4chan convinced politicians/pundits the ok symbol was white supremacist. Honestly, it worked, but they should have picked the shocker. Would have actually been funny.


4chan is an early adopter of memes. Unpopular memes tend to go through 4chan and fizzle; popular ones go through 4chan and then get big. I don’t know the causal relationship.

The 👌 sign is still a white power movement sign, even if it’s used by youths and politicians trying to get down with the core.

I’d hazard a guess that right-wing supremacist groups are rife with imposter syndrome, with Grand Wizards and Three-Percenter lieutenants doubting their own validity more than eggs and questioning gays do. Heck, Donald Trump, former president of the United States, is like the god emperor of imposter syndrome.

11 points

If a bunch of “ironic” racists start using a symbol as a “joke” and one of them flashes it after murdering 50 people because of their religion, then it’s officially a hate symbol.

10 points

…something something… making your whites whiter… they’ll get the message they’re after

32 points

Apparently there weren’t really any people eating Tide Pods.

12 points

I’m pretty sure more people did it after it blew up in the ‘news’ than ever did it before that point.

18 points

Wait what? Maybe I’m misunderstanding, but this is what I got out of the article:

“We had anecdotes and preliminary evidence of a phenomenon. A robust scientific study showed no evidence of said phenomenon. Therefore, the phenomenon was previously real but has now stopped.”

That seems like really, really bad science. Or at least, really, really bad science reporting. Like, if anecdotes are all it takes, here’s one from just a few weeks ago.

I left some Andrew Tate-esque stuff running overnight by accident and ended up having to delete my watch history to get my homepage back to how it was before.

5 points

From the quoted bit, it sounds like there was credible science that found nothing. That doesn’t mean there is nothing, just that they found nothing.

-29 points

What a load of dog dung that article is. Justifying censorship, labelling everything that some political “expert” dislikes as far-right extremism.

24 points
Deleted by creator
-19 points

Censorship of online content is good, but simultaneously the censorship of sexually explicit books in elementary schools is evil.

Neat.

permalink
report
parent
reply
1 point
Deleted by creator
1 point

Solid counterargument.

9 points

When an algorithm is involved, things change. These aren’t static websites that only get passed around by real people. This is some bizarre pseudo-intelligence that thinks that if you like WWII history and bratwurst, you’d also like neo-Nazi content. That’s not an exaggeration: one of my very left-leaning friends started getting neo-Nazi videos suggested to him, and I suspect it was for those reasons.

Also, YouTube isn’t a free speech platform; it’s an advertisement platform. The Fediverse is a free speech platform, although it’s free speech for the person paying the hosting bills.

Posted in Technology (!technology@lemmy.world)