She’s almost 70, spends all day watching QAnon-style videos (but in Spanish), and every day she’s anguished about something new. Last week she was asking us to start digging a nuclear shelter because Russia was about to drop a nuclear bomb on Ukraine. Before that she was begging us to install reinforced doors because the indigenous population was about to invade the cities and kill everyone with poisonous arrows. I have access to her YouTube account and I’m trying to unsubscribe from and report the videos, but the recommended videos keep feeding her more crazy shit.

208 points

At this point I would set up a new account for her - I’ve found Youtube’s algorithm to be very… persistent.

73 points

Unfortunately, it’s linked to the account she uses for her job.

129 points

You can make “brand accounts” on YouTube that are a completely different profile from the default account. She probably won’t notice if you make one and switch her to it.

You’ll probably want to spend some time using it for yourself secretly to curate the kind of non-radical content she’ll want to see, and also set an identical profile picture on it so she doesn’t notice. I would spend at least a week “breaking it in.”

But once you’ve done that, you can probably switch to the brand account without logging her out of her Google account.

105 points

I love how we now have to monitor the content watched by the generation that told us “Don’t believe everything you see on the internet,” like we would for children.

8 points

She’s going to seek this stuff out and the algorithm will keep feeding her. This isn’t just a YouTube problem, this is also a mom problem.

0 points

Delete her watch history, find and watch nice channels and videos on her other interests, and log in to the account on a spare browser on your own phone periodically to make sure there’s no repeat of what happened.

1 point
Deleted by creator
184 points

I’m a bit disturbed by how people’s beliefs are literally shaped by an algorithm. Now I’m scared to watch YouTube because I might be inadvertently watching propaganda.

101 points
Deleted by creator
34 points

It’s even worse than “a lot easier”. Ever since the advances in ML went public, with things like Midjourney and ChatGPT, I’ve realized that ML models are way, way better at doing their thing than I’d thought.

Midjourney’s purpose is to receive text and give back a picture, and it’s really good at that, even though the dataset wasn’t all that large. Same with ChatGPT.

Now, Meta has (EDIT: just speculation, but I’m 95% sure they do) a model that receives all the data they have about a user (which is A LOT) and returns which posts to show them and in what order, to maximize their time on Facebook. And it has been trained for years on a live dataset of 3 billion people interacting with the site daily. That’s a wet dream for any ML model. Imagine what it would be capable of even if it were only as good at its task as ChatGPT is at its own, with an incomparably better dataset and learning opportunities.

I’m really worried about the future in this regard, because it’s only a matter of time before someone with power decides that the model shouldn’t just keep people on the platform, but also make them vote for X. And there is nothing you can do to defend against it, other than never interacting with anything that has curated content, such as Google Search, YT or anything Meta, because even if you know there’s a model trying to manipulate you, the model knows there are a lot of people like that, and it’s already learning how to manipulate even them. After all, it has 3 billion test subjects.

That’s why I’m extremely focused on privacy and my data. Not that I have something to hide, but I take really great issue with someone using that data to train models like that.

7 points

Just to let you know, Meta has an open-source model, LLaMA, and it’s basically state of the art for the open-source community, though it still falls short of GPT-4.

The nice thing about the LLaMA branches (Vicuna and WizardLM) is that you can run them locally at roughly 80% of ChatGPT-3.5’s quality, so no one is tracking your searches/conversations.
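
If you want to try that yourself, here’s a minimal local-inference sketch using the llama-cpp-python bindings. The model path and prompt are placeholders; any LLaMA-derivative weights you’ve converted for llama.cpp should work.

```python
# Minimal sketch of local inference with llama-cpp-python
# (pip install llama-cpp-python). The model path below is a placeholder;
# point it at whatever Vicuna/WizardLM weights you have converted for
# llama.cpp. Nothing leaves your machine.
from llama_cpp import Llama

llm = Llama(
    model_path="./models/vicuna-7b.q4_0.bin",  # placeholder path to local weights
    n_ctx=2048,                                # context window size
)

result = llm(
    "Q: Why does running a model locally help privacy?\nA:",
    max_tokens=64,
    stop=["Q:"],  # stop before the model invents a new question
)
print(result["choices"][0]["text"].strip())
```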

53 points

My personal opinion is that it’s one of the first large cases of misalignment in ML models. I’m 90% certain that Google and other platforms have for years been using ML models that take a user’s history and the data they have about them as input, and output which videos to offer them, with the goal of maximizing the time they spend watching videos (or on Facebook, etc.).

And the models eventually found out that if you radicalize someone, isolate them into a conspiracy that makes them an outsider or a nutjob, and then provide a safe space and an echo chamber on the platform, be it Facebook or YouTube, they will eventually start spending most of their time there.

I think this subject was touched upon in The Social Dilemma, but given what is happening in the world and how conspiracies and disinformation seem to be getting more and more common and people more radicalized, I’m almost certain the algorithms are to blame.

16 points

If YouTube’s algorithm is optimizing for watch time, then the most optimal solution is to make people addicted to YouTube.

The scariest thing, I think, is that the way to optimize the reward is not to recommend a good video but to reprogram a human to watch as much as possible.
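
To make that concrete, here’s a purely illustrative toy in Python, not anyone’s real recommender: when the only reward signal is predicted watch time, the ranking is blind to what the video actually contains.

```python
# Toy illustration only: a greedy "recommender" whose sole objective is
# predicted watch time. All names and numbers are made up; the point is
# that a watch-time-only reward never sees content quality.
from dataclasses import dataclass

@dataclass
class Video:
    title: str
    predicted_minutes: float  # model's estimate of how long this user will watch

def recommend(candidates: list[Video]) -> Video:
    # Rank purely by expected watch time; the objective is indifferent to
    # whether the content is calming, accurate, or radicalizing.
    return max(candidates, key=lambda v: v.predicted_minutes)

candidates = [
    Video("Calm gardening tutorial", predicted_minutes=4.0),
    Video("Outrage-bait conspiracy rant", predicted_minutes=11.5),
]

print(recommend(candidates).title)  # prints the outrage-bait video
```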

7 points

I think making someone addicted to YouTube would be harder than simply, slowly radicalizing them into a shunned echo chamber around a conspiracy theory. If you try to make someone addicted to YouTube, they still have an alternative in the real world, friends and family to return to.

But if you radicalize them into something that makes them seem like a nutjob, you don’t have to compete with their surroundings - the only place where anyone understands them is on YouTube.

3 points

100% they’re using ML, and 100% it found a strategy they didn’t anticipate

The scariest part of it, though, is their willingness to continue using it despite the obvious consequences.

I think misalignment is not only likely to happen (for an eventual AGI), but likely to be embraced by the entities deploying it, because the consequences may not impact them. Misalignment is relative.

2 points

Fuck, this is dark and almost awesome, but not in a good way. I used to think the fascist funnel was something deliberate, but maybe these engagement algorithms have more to do with it than large shadow actors putting the funnels into place. Then there are the folks who will create any sort of content to game the algorithm, and you’ve got a perfect trifecta of radicalization.

6 points

Fascist movements and cult leaders long ago figured out the secret to engagement: keep people feeling threatened, play on their insecurities, blame others for all the problems in people’s lives, use fear and hatred to cut them off from people outside the movement, make them feel like they have found a bunch of new friends, etc. Machine learning systems for optimizing engagement are dealing with the same human psychology, so they discover the same tricks to maximize engagement. Naturally, this leads to YouTube recommendations directing users towards fascist and cult content.

23 points

You watch this one thing out of curiosity, morbid curiosity, or by accident, and at the slightest poke the goddamned mindless algorithm starts throwing this shit at you.

The algorithm is “weaponized” in favor of whoever screams the loudest, and I truly believe it started out of myopic incompetence/greed, not political malice. Which doesn’t make it any better, as people don’t know how to protect themselves from this bombardment, but the corporations like to pretend that ~~they~~ people can, so they wash their hands of it for as long as they are able.

Then on top of this, the algorithm has been further weaponized by even more malicious actors who have figured out how to game the system.
That’s how toxic meatheads like Infowars and Joe Rogan get a huge bullhorn that reaches millions. “Huh… DMT experiences… sounds interesting”, the format is entertaining… and before you know it, you’re listening to anti-vax and QAnon excrement, and your mind starts to normalize the most outlandish things.

EDIT: a word, for clarity

3 points

Whenever I end up watching something from a bad channel I always delete it from my watch history, in case that affects my front page too.

2 points

Huh, I tried that. Still got recommended incel videos for months after watching a moron “discuss” the Captain Marvel movie. Eventually I went through and clicked “Don’t recommend this” on everything that showed up on my front page; that helped.

2 points

I do that, too.

However, I’m convinced that YouTube still has a “suggest list” bound to IP addresses. Quite often I’ll have videos that other people in my household have watched suggested to me. Some of it can be explained by similar interests, but it happens suspiciously often.

19 points

My normal YT algorithm was OK, but Shorts tries to pull me to the alt-right.
I had to block many channels to get a sane Shorts algorithm.

“Do not recommend channel” really helps.

6 points

It really does help. I’ve been heavily policing my YouTube feed for years, and I can easily tell when they make big changes to the algorithm because it tries to force-feed me polarizing or lowest-common-denominator content. Shorts is incredibly quick to smother me in rage bait, and if you so much as linger on one of those videos too long, you’re getting a cascade of alt-right bullshit shortly after.

5 points

Using Piped/Invidious/NewPipe/insert your preferred alternative frontend or patched client here (YouTube’s legal threats are empty; these are still operational) helps even more, since it shows you only the content you have opted in to.

14 points

Reason and critical thinking are all the more important in this day and age; they’re just no longer taught in schools. Pick up a few simple key skills, like noticing fallacies or analogous reasoning, and you will find that your view on life is far more grounded and harder to shift.

14 points

I think it’s worth pointing out that “no longer” is not a fair assessment, since this is regularly an issue with older Americans.

I’m inclined to believe it was never taught in schools, and that it’s actually a subject teachers are increasingly likely to want to teach (i.e. if politics didn’t enter the classroom it would already be being taught, and it might be in some districts).

The older generations were given catered news their entire lives; only in the last few decades have they had to face a ton of potentially insidious information. The younger generations have had to grow up with it.

A good example is that old people regularly click malicious advertising, fall for scams, etc. They’re generally not good at applying critical thinking to a computer, whereas younger people (typically, though I hear this is regressing somewhat with smartphones) know about this stuff and are used to validating their information (or at least have a better “feel” for what’s fishy).

9 points

Just be aware that we can ALL be manipulated, the only difference is the method. Right now, most manipulation is on a large scale. This means they focus on what works best for the masses. Unfortunately, modern advances in AI mean that automating custom manipulation is getting a lot easier. That brings us back into the firing line.

I’m personally an Aspie with a scientific background. This makes me fairly immune to a lot of manipulation tactics in widespread use. My mind doesn’t react how they expect, so they don’t achieve the intended result. I do know, however, that my own pressure points are likely particularly vulnerable; I’ve not had practice resisting having them pressed.

A solid grounding gives you a good reference, but no more. As individuals, it is down to us to use that reference to resist undue manipulation.

2 points
Deleted by creator
3 points

Imagine if they taught critical media literacy in schools. Of course, it would only be critical media literacy with an American propaganda backdoor, but still.

1 point

Texas basically banned critical thinking skills in the school system

14 points

I mean, you probably are, especially if it’s explicitly political. All I can recommend is CONSTANT VIGILANCE!

13 points

YouTube’s entire business is propaganda: Ads.

12 points

What ad? Glances at uBlock Origin

2 points

Lately the number of ads on YouTube has increased by an order of magnitude. What they managed to accomplish was driving me away.

12 points

I watch a lot of history, science, philosophy, stand-up, jam bands and happy uplifting content… I am very much feeding my mind lots of goodness, and I love it…

7 points

Just this week I stumbled across a new YT channel that seemed to cover some really interesting science. I almost subscribed, but something seemed fishy. I went to the channel, saw the other videos, and immediately got the hell out. Conspiracies and propaganda lurk everywhere and no one is safe. Mind you, I’m about to get my bachelor’s degree next year, meaning I have received a proper scientific education. Yet I almost fell for it.

7 points

At this point, I just block any channel that I know is either bullshit or annoying af. Out of sight, out of mind.

4 points

Same. I have ads blocked and open YouTube directly to my subscribed channels only. I rarely open the home tab or check related videos because of the amount of clickbait and BS.

4 points

Ohh I just use BlockTube to block channels/ videos I don’t want to see.

5 points

I find it interesting how some people have such a vastly different experience with YouTube than I do. I watch a ton of videos there, literally hours every single day, and basically all my recommendations are about stuff I’m interested in. I even watch the occasional political video, gun video and police bodycam video, but it’s still not trying to force any radical stuff down my throat. Not even when I click that button which asks if I want to see content outside my typical feed.

6 points

My youtube is usually ok but the other day I googled an art exhibition on loan from the Tate Gallery, and now youtube is trying to show me Andrew Tate.

5 points

I watch a ton of videos there, literally hours every single day and basically all my recommendations are about stuff I’m interested in.

The algorithm’s goal is to get you addicted to Youtube. It has already succeeded. For the rest of us who watch one video a day, if at all, it employs more heavy-handed strategies.

2 points

That’s a good point. They don’t care what I watch. They just want me to watch something.

5 points

At one point I watched a few videos about Marvel films and what’s wrong with them. One was about how Captain Marvel wasn’t a good hero because she was basically invincible and all-powerful, etc. I started getting more and more suggestions about how bad the new strong female leads in modern films are. Then I started getting politically right-leaning content. It starts really innocuously, and it’s hard to tell it’s leading you a certain way until it gets further along. It really made me think about when I’m watching content from new channels. Obviously I’ve blocked/purged all channels like that and my experience is fine now.

2 points

The experience is different because it’s not one algorithm for everyone.

Demographics are targeted differently. If you actually get a real feed, it’s only because no one has yet paid YouTube to guide you towards their product.

It would be an interesting experiment to set up two identical devices and then create different Google profiles for each just to watch the algorithm take them in different directions.

5 points

I have to clear out my youtube recommendations about once a week… no matter how many times I take out or report all the right-wing garbage, you can bet everything that by the end of the week there will be a Jordan Peterson or PragerU video in there. How are people who aren’t savvy to the right-wing’s little “culture war” supposed to navigate this?

2 points

You should use an extension like BlockTube.

1 point

I probably should… but I have to admit that I kinda enjoy reporting them.

Thanks - I’ll certainly look into it.

2 points

I don’t understand how these people can endure enough ads to be lured in by QAnon. People of that generation generally don’t know about decent ad blockers.

2 points
Deleted by creator
143 points

The damage that corporate social media has inflicted on our social fabric and political discourse is beyond anything we could have imagined.

15 points

This is true, but one could say the same about talk radio or television.

44 points

Talk radio or television broadcasts the same stuff to everyone. It’s damaging, absolutely. But social media literally tailors the shit to be exactly what will force someone farther down the rabbit hole. It’s actively, aggressively damaging and sends people on a downward spiral way faster while preventing them from encountering diverse viewpoints.

16 points

I agree it’s worse, but I was just thinking about how there are regions where people play ONLY Fox on every public television, and if you turn on the radio it’s exclusively a right-wing propagandist ranting that Democrats are taking all your money to give it to black people on welfare.

3 points

And it sucks people back in like a breadcrumbing ex when it hasn’t seen you active recently.

18 points

Yes, I agree - there have always been malevolent forces at work within the media - but before Facebook started algorithmically whipping up old folks for clicks, cable TV news wasn’t quite as savage. The early days of hate-talk radio were really just Limbaugh ranting into the AM ether. Now it’s saturated. Social media isn’t the root cause of political hatred, but it gave it a bullhorn and a leg up to apparent legitimacy.

5 points

Social media is more extreme, but we can’t discount the damage Fox and people like Limbaugh or Michael Savage did.

10 points

Agreed, Ted Kaczynski was right about technology evidently.

10 points
Deleted by creator
4 points

He was pretty good at math too

1 point

Populism and racism are as old as societies. Ancient Greece already had them. Rome fell to them. Christianity was born out of them.

Funnily enough, people have always complained about how bad their society was because of some new thing - five thousand years ago already, probably even earlier.

Which is not to say we shouldn’t do anything about it. We definitely should. But common sense won’t save us, unfortunately.

114 points

In the Google account privacy settings you can delete the watch and search history. You can also delete a service such as YouTube from the account without deleting the account itself. This might help with starting afresh.

17 points

I was so weirded out when I found out that you can hear ALL of your “hey Google” recordings in these settings.

9 points

Yeah, anything you send Google/Amazon/Facebook they will keep. I have been moving away from them. Protonmail for email, etc.

97 points

Log in as her on your device. Delete the history, turn off ad personalisation, unsubscribe from and block the dodgy stuff, like and subscribe to healthier things, and, this is the important part, keep coming back regularly to tell YouTube you don’t like any suggested videos that are down the QAnon path and to remove dodgy watched videos from her history.

Also, subscribe and interact with things she’ll like - cute pets, crafts, knitting, whatever she’s likely to watch more of. You can’t just block and report, you’ve gotta retrain the algorithm.

18 points

Yeah, when you go to the feed, make sure to click the three dots on every recommended video and choose “Don’t show content like this”, and also “Block channel”, because chances are, if they uploaded one of these stupid videos, their whole channel is full of them.

8 points

Would it help to start liking/subscribing to videos that specifically debunk those kinds of conspiracy videos? Or, at the very least, demonstrate rational concepts and critical thinking?

13 points

Probably not. This is an almost-70-year-old who doesn’t seem to think rationally in the first place. She’s easily convinced by emotional misinformation.

Probably just best to occupy her with harmless entertainment.

4 points

We recommended her a YouTube channel about linguistics and she didn’t like it, because the PhD in linguistics was saying that it’s OK for language to change. Unfortunately, there comes a time when people just want to see whatever confirms their existing worldview, and anything that challenges it is taken as an offense.

