Study: TikTok Uses Its Algorithms to Hurt Teens

Have you ever stopped to think about how YouTube, Facebook, or really every social media platform you use always seems to know exactly what to show you? I mean, if you’re on YouTube watching “Cats doing crazy things” videos, or enjoying a lecture about the Constitution, the minute you’re done, bingo, you’ve got more crazy pet vids and insightful commentaries to choose from. In fact, if anyone has ever made a video about James Madison and his cat, that would probably be ready for you to watch, too.

It’s akin to magic.

But it’s all built into the platform’s programming. It’s an algorithm. Each platform has a similar algorithmic setup that watches every move you make and quickly computes what you like and what to offer you next. They’re all slightly different, though: Each one gives certain signals a higher or lower priority.

So, Facebook (Meta) will put relationships and other Facebook connections at the top of the priority heap and feed you your Uncle Larry’s cat videos first. Twitter is more concerned with user interactions on your favorite topics or the people you follow most. And so on.
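To make that ranking idea concrete, here’s a purely illustrative sketch (not any platform’s actual code; the signal names and weights are invented) of how the same two posts can end up in a different order when two feeds weight the same signals differently:

```python
# A purely illustrative sketch, not any platform's real ranking code.
# Each "platform" scores the same posts with different weights on the
# same engagement signals, then sorts its feed by that score.

def score(post, weights):
    """Weighted sum of a post's engagement signals."""
    return sum(weights[signal] * value for signal, value in post["signals"].items())

# Invented weight profiles: one feed favors close connections,
# the other favors interaction with topics and people you follow.
FRIEND_FIRST = {"close_connection": 3.0, "topic_interest": 1.0, "recent_engagement": 1.5}
TOPIC_FIRST = {"close_connection": 1.0, "topic_interest": 3.0, "recent_engagement": 2.0}

posts = [
    {"title": "Uncle Larry's cat video",
     "signals": {"close_connection": 1.0, "topic_interest": 0.4, "recent_engagement": 0.6}},
    {"title": "Constitution lecture, part 2",
     "signals": {"close_connection": 0.0, "topic_interest": 0.9, "recent_engagement": 0.7}},
]

for name, weights in (("friend-first feed", FRIEND_FIRST), ("topic-first feed", TOPIC_FIRST)):
    ranked = sorted(posts, key=lambda p: score(p, weights), reverse=True)
    print(name, "->", [p["title"] for p in ranked])
```

Swap the weights and the feed reshuffles; that, in miniature, is why each platform’s recommendations feel a little different.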

Anyway, it’s an amazing set of formulas all designed to keep you coming back for more or to simply keep your backside glued right where it is as the hours slip by like seconds.

However, it’s beginning to become clear that some social media algorithms are much more problematic than others. TikTok, for instance, an app that sports more than 1 billion active users monthly, might be far less cool than its ever-growing pool of users think.

A new study from the Center for Countering Digital Hate (CCDH) found that TikTok’s algorithm can push suicide-related content at kids within as few as 2.6 minutes. The report also noted that eating disorder content was recommended to teens within “as few as 8 minutes.”

How did they come up with that? Well, the CCDH hired researchers to set up TikTok accounts posing as 13-year-old users (the minimum age for an account) who showed interest in content about body image and mental health. In fact, to run the algorithm through its paces, each researcher set up two accounts as a 13-year-old: one representing an average or “standard” teen and one representing what could be considered a “vulnerable” teen. The standard account was given a typical girl’s username. The vulnerable account’s username signaled a concern about body image and included the phrase “loseweight.”

After that, all of the accounts paused briefly on videos about body image and mental health and “liked” them. TikTok’s algorithm then quickly began serving potentially harmful videos to all of the accounts. But the researchers found that the “loseweight” accounts were served three times more harmful content overall, and 12 times more self-harm and suicide-specific videos, than the standard accounts.
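For a sense of how those “times more” comparisons are calculated, here’s a quick sketch with made-up tallies (the numbers below are hypothetical, chosen only to mirror the ratios the report describes):

```python
# Hypothetical tallies, NOT the CCDH's actual data: counts of flagged videos
# served to each paired account during a viewing session, and the ratio
# between them (how a "3x more" or "12x more" figure is computed).

flagged = {
    "standard":   {"harmful_overall": 10, "self_harm_or_suicide": 1},   # made-up numbers
    "loseweight": {"harmful_overall": 30, "self_harm_or_suicide": 12},  # made-up numbers
}

for category in ("harmful_overall", "self_harm_or_suicide"):
    ratio = flagged["loseweight"][category] / flagged["standard"][category]
    print(f"{category}: 'loseweight' account saw {ratio:.0f}x as many as the 'standard' account")
```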

Let me be clear: The pushed videos weren’t simply anti-suicide vids, but videos that promoted the idea, such as one labeled “Making everyone think your [sic] fine so that you can attempt in private.” And again, that kind of content was doled out just minutes after the user signed onto the app.

“TikTok is able to recognize user vulnerability and seeks to exploit it,” said Imran Ahmed, the CEO of CCDH, in a CBS News report. “It’s part of what makes TikTok’s algorithms so insidious; the app is constantly testing the psychology of our children and adapting to keep them online.”

CCDH’s study went even further than the 2.6-minute figure I quoted above, stating that its findings suggest TikTok pushes “potentially harmful content to users every 39 seconds.”

Let that sink in for a moment. That’s the kind of statistic that should give every user and teen’s guardian a bit of pause. And if it doesn’t, then maybe it’s time for us all to check our own internal algorithms.

Bob Hoose

After spending more than two decades touring, directing, writing and producing for Christian theater and radio (most recently for Adventures in Odyssey, which he still contributes to), Bob joined the Plugged In staff to help us focus more heavily on video games. He is also one of our primary movie reviewers.

3 Responses

  1. I wonder if this is intentional, or if suicidal users are seeking out this content and creating pathways that promote it to others who click on similar videos? But for TikTok not to do anything about this is disturbing.
    I don’t use TikTok, though, mostly because the short-video format doesn’t appeal to me and because I don’t have much internet time.

    1. It’s not intentional in the sense of “let’s inspire our users to kill themselves.” It’s a design flaw that’s inherent in social media platforms that are based on engagement. Algorithms detect your interests and feed you similar content that’s elicited strong reactions from other users. Provocative content draws the strongest reactions, so the algorithm steers you in that direction. You start out watching innocuous food videos and all of a sudden you’re seeing videos promoting eating disorders.

      The same flaw contributes to political polarization by steering users toward extreme political content. So many of our problems today are exacerbated by us passively letting technology choose what media we’re going to consume. We’re supposed to make those decisions for ourselves.
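The feedback loop described in that reply can be sketched as a toy simulation (illustrative only, not TikTok’s actual system; the categories and “reaction strengths” below are invented):

```python
# A toy simulation of an engagement-driven feedback loop. Illustrative only:
# the categories and average reaction strengths are invented assumptions,
# not measurements of any real platform.
import random

random.seed(0)

categories = ["everyday food videos", "provocative diet content"]
weights = {c: 1.0 for c in categories}  # both categories start out equally likely

# Assumption for this sketch: provocative content draws stronger reactions on average.
avg_reaction = {"everyday food videos": 0.4, "provocative diet content": 0.9}

for step in range(1, 31):
    # Serve a video in proportion to the current weights...
    served = random.choices(categories, weights=[weights[c] for c in categories])[0]
    # ...and let the reaction it draws nudge that category's weight upward.
    weights[served] += avg_reaction[served]
    if step % 10 == 0:
        total = sum(weights.values())
        shares = {c: round(weights[c] / total, 2) for c in categories}
        print(f"after {step} videos, share of the feed: {shares}")
```

Because the stronger-reacting category grows its weight faster each time it’s served, a small early tilt snowballs, which is the drift from innocuous clips toward extreme content that the reply describes.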