TLDR if you don’t wanna watch the whole thing: Benaminute (the YouTuber here) creates a fresh YouTube account and watches all recommended shorts without skipping. They repeat this five times, each time changing their location to a random US city.

Below is the number of shorts after which alt-right content was first recommended. Left-wing/liberal content was never recommended first.

  1. Houston: 88 shorts
  2. Chicago: 98 shorts
  3. Atlanta: 109 shorts
  4. NYC: 247 shorts
  5. San Francisco: never (Benaminute stopped after 250 shorts)

There was, however, a certain pattern to this. First, non-political shorts were recommended. After that came AI Jesus shorts (with either AI Jesus talking to you, or an AI narrator reading verses from the Bible). Then non-political shorts by alt-right personalities (Jordan Peterson, Joe Rogan, Ben Shapiro, etc.) started appearing. Finally, explicitly alt-right shorts were recommended.

What I personally found both disturbing and kinda hilarious was the case of Chicago. The non-political content in the beginning was a lot of Gen Alpha brainrot. Benaminute said this seemed to be the norm for Chicago, as they had observed the same thing in another similar experiment (which dealt with long-form content instead of shorts). After some shorts, there came one where AI Gru (the main character from Despicable Me) was telling you to vote for Trump. He went on about how voting for “Kamilia” would lose you “10000 rizz”, and how voting for Trump would get you “1 million rizz”.

In the end, Benaminute, along with Miniminuteman, proposes a hypothesis to explain this phenomenon: alt-right content might incite more emotion, and thus rank higher in the algorithm. They say the algorithm isn’t necessarily left-wing or right-wing, but that alt-right creators have better understood the methodology of how to capture and grow an audience.

201 points

I think the explanation might be even simpler - right wing content is the lowest common denominator, and mindlessly watching every recommended short drives you downward in quality.

65 points

I was gonna say this. There’s very little liberal or left-leaning media being made, and what there is is mostly made for a female or LGBTQ audience. Not saying that men can’t watch those, but there’s not a lot of “testosterone”-infused content with a liberal leaning (one of the reasons Trump won), so by sheer volume you’re bound to see more right-leaning content. Especially if you are a cisgender male.

Been considering creating content myself to at least stem the tide a little.

40 points

I think some of it is that liberal media is more artsy and creative, which is more difficult to just pump out. Creation is a lot more difficult than destruction.

15 points

Not necessarily. For example, a lot of “manosphere” guys have taken hold of philosophy, health, and fitness topics; a liberal influencer can give a liberal view on the same subjects. In philosophy, for example: explain how Nietzsche was not just saying that you can do whatever the fuck you want, or how Stoicism is actually a philosophy of tolerance, not of superiority, etc. There’s really a lot of space that can be covered.

9 points

Plus fact based videos require research, sourcing and editing.

Emotional fiction only takes as long to create as a daydream.

5 points

Creation is a lot more difficult than destruction.

Yup. A lesson that I fear we will be learning over and over and over in the coming years.

-13 points

Really? As someone who dislikes both mainstream extremes (I consider myself libertarian), I see a lot more left-leaning content than right-leaning content. I wouldn’t be surprised if >75% of the content I watch comes from a left-leaning creator, not because I seek it out, but because young people into tech tend to lean left, and I’m into tech.

15 points

I refuse to watch those shit shorts; I think your theory has legs. Unfortunately there doesn’t seem to be a way to turn them off.

14 points

I use YouTube revanced to disable them.

3 points

Greatest app ever.

1 point

Thanks!

2 points

FreeTube on PC and Revanced on phone

4 points

yeah i created a new youtube account in a container once and just watched all the popular/drama suggestions. that account turned into a shitstorm immediately

these days i curate my youtube accounts making liberal use of Not interested/Do not recommend channel/Editing my history and even test watching in a container before watching it on my curated account

this is just how “the algorithm” works. shovel more of what you watch in your face.

the fact that they’ll give you right-wing, conspiracy-fueled, populist trash right off the bat is the concern

3 points

Man, that seems like a lot of work just to preserve a shitty algorithm that clearly isn’t working for you… Just get a third-party app and watch without logging in.

1 point

oddly enough it seems to be working; if i don’t log in at all, youtube just offers up the usual dross


Isn’t the simpler explanation that YouTube has always promoted, and always will promote, the alt-right? Also, it’s no longer the alt-right; it’s just the right.

3 points

No, the explanation that involves conspiracy is not the simpler explanation.


Was it a conspiracy in 2016? Was it a conspiracy that Elon bought X to control the narrative? Was it a conspiracy that TikTok avoided shutdown by glazing Trump? Was it a conspiracy when Zuck changed the way Facebook did moderation and then showed up at Trump’s inauguration?

YouTube cucks are the worst.

73 points

I keep getting recommendations on YouTube for content like “this woke person got DESTROYED by logic”. Even though I click “Not interested”, and even “Don’t recommend channel”, I keep getting the same channel AND video recommendations. It’s pretty obvious bullshit.

22 points

Anything but the subscriptions page is absolute garbage on that site. Ideally, get an app that tracks your subs without needing an account: NewPipe, FreeTube, etc.

7 points

Are those available on PC/Linux? On my TV? 😭 I have them on my phone, but it feels like too much hassle on my main viewing devices.

3 points

I use FreeTube on Linux. I think it’s Chromium-based, so some people don’t like it, and it’s usually one of the bigger resource hogs when I have it open, but it’s worth it for the ad-free, subscriptions-only experience imo…

Though lately it hasn’t been behaving well with the vpn…

3 points

And if you don’t want to deal with those breaking (because Google is actively shooting them in the face), they DO provide RSS feeds for your creators.

Just add the channel URL to your feed reader (e.g. https://www.youtube.com/@LinusTechTips) and forget the YouTube web UI exists.
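For anyone who wants to script this rather than paste URLs into a reader: YouTube serves a plain Atom feed per channel at a stable URL. A minimal sketch (the channel ID below is a placeholder, and the sample feed stands in for a real fetch you’d do with urllib or your reader):

```python
# Sketch: subscribing to a YouTube channel via its Atom feed.
# The channel_id value used below is a made-up placeholder.
import xml.etree.ElementTree as ET

def channel_feed_url(channel_id: str) -> str:
    """Build the Atom feed URL YouTube exposes for a channel."""
    return f"https://www.youtube.com/feeds/videos.xml?channel_id={channel_id}"

def latest_titles(feed_xml: str) -> list[str]:
    """Extract video titles from a channel's Atom feed document."""
    ns = {"atom": "http://www.w3.org/2005/Atom"}
    root = ET.fromstring(feed_xml)
    return [entry.findtext("atom:title", namespaces=ns)
            for entry in root.findall("atom:entry", ns)]

# Tiny inline feed standing in for a real HTTP fetch:
sample = """<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Example Channel</title>
  <entry><title>Video one</title></entry>
  <entry><title>Video two</title></entry>
</feed>"""

print(channel_feed_url("UCxxxxxxxxxxxxxxxxxxxxxx"))
print(latest_titles(sample))  # prints ['Video one', 'Video two']
```

Most feed readers also accept the @handle channel URL directly and discover the feed themselves.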

19 points

You’d think a recommendation algorithm would take your preferences into account; that’s the whole justification for tracking your usage in the first place: recommending relevant content to you…

13 points
Deleted by creator
7 points

Even in a best-intentioned recommender system, trained on the content you watch to estimate what you’re interested in and recommend similar things, that would be the drift. You can’t really measure the emotions viewers feel unless they express them in some measurable way, so the system observes their behaviour and recommends similar content by whatever heuristic it has. If they keep clicking on rageposts, that’s all the system has to go on.

But at least the explicit indication “I don’t want to see this” should be heavily weighted in that calculation. Just straight-up ignoring it is an extra layer of awful.
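That dynamic can be shown with a toy sketch, nothing like YouTube’s actual system; the tags, weights, and item names are all illustrative. A behaviour-only recommender keeps surfacing whatever got watched, unless explicit “not interested” feedback carries a heavy negative weight:

```python
# Toy recommender: score candidates by tag overlap with watched videos,
# with explicit "not interested" feedback weighted strongly negative.
# All tags, weights, and IDs are invented for illustration.
from collections import defaultdict

NOT_INTERESTED_WEIGHT = -10.0  # explicit opt-out should dominate implicit clicks

def score_candidates(watch_history, not_interested, candidates):
    """Rank candidate videos by learned tag affinity, highest first."""
    tag_affinity = defaultdict(float)
    for video in watch_history:
        for tag in video["tags"]:
            tag_affinity[tag] += 1.0                    # implicit: you watched it
    for video in not_interested:
        for tag in video["tags"]:
            tag_affinity[tag] += NOT_INTERESTED_WEIGHT  # explicit: you opted out
    return sorted(candidates,
                  key=lambda v: sum(tag_affinity[t] for t in v["tags"]),
                  reverse=True)

watched = [{"tags": ["rage", "politics"]}, {"tags": ["rage", "drama"]}]
opted_out = [{"tags": ["rage"]}]
pool = [{"id": "calm-essay", "tags": ["essay"]},
        {"id": "ragebait", "tags": ["rage", "politics"]}]
ranked = score_candidates(watched, opted_out, pool)
print([v["id"] for v in ranked])  # prints ['calm-essay', 'ragebait']
```

With the opt-out weight removed, the same history would rank the ragebait first, which is the complaint in this thread.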

11 points

It is. But who said that you get to decide what’s relevant for you? Welcome, and learn to trust your algorithmic overlords.

7 points

Thanks, I hate it

2 points

The algorithms are always trying to poke you in the id.

10 points

YOU’D THINK THAT YES. [caps intended]

3 points

Wrong, the whole purpose of tracking your usage is to identify what kind of consumer you are so they can sell your views to advertisers. Recommendations are based on what category of consumer you’ve been identified as. Maintaining your viewership is secondary to the process of selling your views.

2 points

I said justification, not purpose. They claim they want to track usage to tailor your experience to you.

They don’t actually believe that, of course, but respecting your explicit expression of interest ought to be the minimum perfunctory concession to that pretense. By this we can see just how thin a pretense it is.

1 point

I feel like YouTube at least used to pretend that it was doing this.

I can’t say about recently, as I use a third-party client these days and don’t log in.

60 points

I hate the double standards

On a true crime video: “This PDF-file game-ended himself after he was caught SA-ing this individual… Sorry, YouTube forces me to talk like that or I might get demonetized.” Flagged for discussing suicide.

On PragerU: “The Transgender Agenda is full of rapists and freaks who will sexually assault your children, they are pedophiles who must be dealt with via final solution!” Completely fucking acceptable!

7 points

Be nice to Prager, he’s very fragile https://youtu.be/YpIQPv5Iq4Y

3 points

That statement about murdering a hitchhiker has no edits, but simply saying that slavery is bad took three edits. Says everything you need to know about Penis Prager.

1 point

I haven’t experienced it lately, but sometimes when I watch too many YTPs I start getting fascist shit in my recommendations.

52 points

Instagram is probably notably worse. I have a very established account that should be very anti that sort of thing, and it keeps serving up idiotic guru garbage.

TikTok is by far the best in this respect, at least until recent weeks.

17 points

A couple of years ago, I started two other Instagram accounts besides my personal one. I needed to organize and have more control over what content I see, and when. One was mostly for combat sports, other sports, and fitness. The second one was just food.

The first one, right off the bat, showed me girls with OnlyFans accounts on the discovery page. Then after a few days, they began showing me right-wing content and alpha-male garbage.

The second one, the food account, showed alternative holistic solutions: stuff like ten different accounts of people suggesting I consume raw milk. They started sending me a mix of people who only eat meat, and vegans.

It’s really wild what these companies show you to complete your profile.

14 points

I saw a TikTok video about how Instagram starts the redpill/incel stuff early for young people; then, when they become failures in life, it pushes the guru stuff for “guidance”.

The EU and even China have at least made an attempt to hold these companies accountable for the algorithm, but the US and Canadian governments just sat there and did nothing.

4 points

For now.

1 point
Deleted by creator
49 points

I realized a while back that social media is trying to radicalize everyone, and it might not even be entirely the fault of the oligarchs who control it.

The algorithm was written with one thing in mind: maximizing engagement time. The longer you stay on the page, the more ads you watch, the more money they make.

This is pervasive, and even if educated adults tune it out, there are always children, who get Mr. Beast and thousands of others trying to trick them into liking, subscribing, and following.

This is something governments should be looking at controlling. Propaganda created for the sole purpose of making money is still propaganda. At this point I think any site that uses an algorithm to personalize each user’s feed is compromised.

13 points

The problem is education. It’s a fool’s game to try to control human nature; the commodification of everything means you will always have commercials and propaganda.

What is within our means is to strengthen education in how to think critically and understand your environment. This is where we have failed, and I’ll argue there are people actively destroying this for their own gain.

Educated people are dangerous people.

It’s not 1984. It’s Brave New World. Aldous Huxley was right.

11 points

I think we need to do better than just say “get an education.”

There are educated people who still vote for Trump. Making it sound like liberalism is the result of going to college is part of why so many colleges are under attack.

From their perspective, I get it: many of the Trump voters didn’t go, so they hear that and just assume brainwashing.

We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information, and so on, not just the kind of “education” where you regurgitate talking points from teachers, the TV, or the radio as if they were matters of fact. The whole education system is pretty tuned around regurgitation, even at the college level. A lot of the culture of exploration surrounding college (outside the classroom) is probably where the liberal viewpoints actually come from, and we’d be ill-advised to assume the right can’t destroy that.

4 points

We need to find a way to teach people to sort out information, to put their immediate emotions on pause and search for information

This entire comment and @whoisearth@lemmy.ca’s comments are so powerful.

I think people have two modes of getting information: digging into a newspaper article to figure out what’s going on, and glancing at a lurid headline in the tabloid rack. Most people do both ends of the spectrum and a lot in between. Modern technology lends itself to serving tabloid-like content while we’re waiting in line for a minute. This is why TikTok is worried about being removed from the app store, even though installing the app yourself is easy (easier than signing up for a newspaper delivery subscription ever was). TikTok is more like a lurid tabloid that most people would not go two steps out of their way to find, but might read while waiting in a slow line. I’m hopeful that people will learn to manage the new technology and stop being influenced by tabloid entertainment.

1 point

I don’t think college education is the place we should be looking.

Critical thinking skills need to be taught at a much, much earlier stage.

7 points

This discussion existed before computers. Before that it was TV and before that it was radio. The core problem is ads. They ruined the internet, TV, radio, the press. Probably stone tablets somehow. Fuck ads.

2 points

sites feeding content that use an algorithm to personalize feeds for each user are all compromised.

Not arguing against this at all, because you’re completely correct, but this feels like a key example of governments being too slow (and perhaps too out of touch?) to properly regulate tech. People clearly like having an algorithm, but algorithms in their current form give tech companies a great excuse to throw their hands up and claim no foul play, because of how opaque they are. “It only shows you what you tell it you want to see!” is easy for them to say, but until consumers are given the right to know exactly how each one works, almost like nutrition facts on food packaging, we’ll never know whether they’re telling the truth. A tech company having near-unlimited control, with no oversight, over what millions of people look at day after day is clearly a major factor in what got us here in the first place.

Not that there’s much hope for new consumer protections under this US administration, but it’s something I’d been thinking about for a while.


Technology

!technology@lemmy.world
