TikTok has to face a lawsuit from the mother of 10-year-old Nylah Anderson, who “unintentionally hanged herself” after watching videos of the so-called blackout challenge on her algorithmically curated For You Page (FYP). The “challenge,” according to the suit, encouraged viewers to “choke themselves until passing out.”

TikTok’s algorithmic recommendations on the FYP constitute the platform’s own speech, according to the Third Circuit Court of Appeals. That means it’s something TikTok can be held accountable for in court. Tech platforms are typically protected by a legal shield known as Section 230, which prevents them from being sued over their users’ posts, and a lower court had initially dismissed the suit on those grounds.

54 points

I am generally very skeptical of lawsuits making social media and other Internet companies liable for their users’ content, because that’s usually a route to censor whatever the government deems “harmful”, but I think this case actually makes perfect sense by attacking the algorithmic “curation” that they do. Imo social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

33 points

social media should go back to being a purely chronological feed, curated by the users themselves, and cut corporate influence out of the equation.

But then how would they make money if they can’t keep users doomscrolling forever to keep serving them ads? Won’t someone think of the shareholders?!

4 points

Unfortunately nobody can stop me from doomscrolling.


As if that would at all stop these dumbass challenges from being posted and copied? People have been hurting themselves by copying something they saw someone else doing since before the invention of the camera.

14 points

Yes, but that is not the entirety or even the majority of the problem with algorithmic feed curation by corporations. Reducing the visibility of those dumb challenges is one of many benefits.

5 points

No, it wouldn’t, but people would only see them if they were part of a preexisting community where such things are posted, or if they specifically looked for them.

On the Internet, censorship happens through sheer volume: there is far more information than our limited time and attention span can absorb, so going after recommendation algorithms will work.

31 points

I’m gonna take the side that TikTok is potentially liable on the algorithm argument, but these parents also failed their children. Teaching your kids to avoid replicating unsafe internet content should be just as fundamental as teaching them to look both ways before crossing the road.

9 points

“If your friend told you to jump off a bridge, would you?”

-Any decent parent

7 points

“Bridge jumping challenge”

-TikTok shitposter
8 points

As a society, we’re responsible for all our children. The point of child protection laws, and population protection in general, is to support and protect them, because oftentimes parents are incapable of doing so, or the social dynamics involved are ones most parents can’t really understand, follow, or teach about.

Yes, parents should teach and protect their children. But we should also create an environment where that is possible, and where children of less fortunate and of less able parents are not victims of their environment.

I don’t think demanding and requiring big social platforms to moderate and regulate at least to the degree where children are not regularly exposed to life-threatening trends is a bad idea.

That stuff can still be elsewhere if you want it. But social platforms have a social dynamic, more so than an informative one.

26 points

I remember reading that China’s version of TikTok promotes more stuff like science to kids. Everyone else gets degenerate stuff like stealing Kias, licking grocery store items, and now blackout challenges.

It would be interesting if it were public how the algorithm is tuned for China versus the rest of the world. Makes me wonder if it’s intentional: making society a worse place by inventively pushing certain trends on international versions of TikTok instead of filtering them out.

Stuff like Facebook and Twitter is insane too, so it’s all self-sabotage at this point, but TikTok seems to have become the trendsetter.

14 points

Makes me wonder if it’s intentional: making society a worse place by inventively pushing certain trends on international versions of TikTok instead of filtering them out.

Good lord, this is a massive reach. A much simpler explanation is that algorithmic garbage is profitable, and China’s government does not care about negative ramifications that occur outside China itself, so it does not regulate them.

China’s run by a terrible government, not an MCU villain.

4 points

Uhhh… I don’t think you got my point. That’s why I also included Facebook and Twitter at the end, as examples of domestic companies also willingly allowing harmful societal trends.

Money being the reason doesn’t absolve them or provide a convenient out that lets companies do whatever they want without consequence or criticism. I put them all in the camp of willingly selling out society for profit, and whether a country sees that as a win for itself doesn’t change that.

7 points

this is just how capitalism works - you have to appeal to your audience more than your competition, and guess which kind of content teenagers want to watch more. Hell, even adults want fun content as opposed to educational content.

they’re not willingly selling a worse society for profit, that’s just the only way to stay competitive.

any platform that pushes educational content in North America would just not get any customers and go bankrupt.

edit: there’s plenty of educational video platforms out there, like Khan Academy. Try and get your kids to scroll through that during their free time instead; I bet they won’t.

13 points

Yeah Douyin is pushing educational content and is very fast to censor harmful stuff. Still full of garbage and racism though, just the sanctioned kind against people the government doesn’t like.

16 points

Shit like this is why I intend to keep my (currently) 9yo as far away from social media as I can, for as long as I can. This fucking terrifies me, as it should any parent.

17 points

Educating your kid about the many possible pitfalls of social media is even more important. They will eventually experience it, and are likely already exposed to it to some degree through their friends’ devices. Don’t make the mistake of turning social media into some kind of forbidden fruit; instead, provide them with the tools to deal with it responsibly.

That said, I would still not allow this Chinese psy-ops tool on any device in my household. Other social media is already terrible enough, but TikTok seems to be engineered to cause nothing but damage.

7 points

I know some amazing parents who have super open communication and excellent teaching moments with their kids, and their kids still fell into the social media morass… because friends (and the teenage brain) are a heavy influence, even with a safe, supportive home.

2 points

This is why I think monitored access is a better idea than total withholding. Kids are going to end up on social media, either as they grow up and eventually become adults, or as a result of peers providing access and applying pressure. Best to let them on, but ensure they are safe, know how to be safe, and know why to be safe.

1 point

That’s a universal truth about parenting though and not limited to just social media.

6 points

My own belief is that all social media is a cancer, and to be avoided entirely. I’m able to do that for myself, but I’m also realistic about the chances of keeping my kids away from it. So, I focus my energy on trying to equip them with the mental skills to neutralise the toxic aspects of social media.

For my 9yo, that means teaching her to employ natural skepticism and critical thinking. I’m also trying to drum into her the understanding that social media is inherently untrustworthy and unreliable, and exists solely for the benefit of the corporations that run it.

That said, I’ve blocked TikTok on my home network, much to the older kids’ chagrin. They have to use mobile data if they want to access that shit on their phones.

7 points

My own belief is that all social media is a cancer, and to be avoided entirely. I’m able to do that for myself

You just posted this to a social media site…

10 points

ah shit, me and my friends used to do this, pre-social media. I remember one time at middle school recess, going out to the farthest corner of the playground with my friends, and we all did a thing where we took turns holding our breath while someone else squeezed our chest. I remember blacking out, hearing the Pokémon theme in pitch darkness, and then waking up on the ground.

I don’t think we did it more than once (at least I didn’t). But of course, the crucial difference was that I was with my dumbass friends, so at least there was someone to run for help if someone didn’t wake up.


Technology

!technology@beehaw.org
