Of Algorithms, Eating Disorders and Social Media

You don’t have to do much research about screen time and social media before a particular word pops up: algorithm. It’s a vague, slightly menacing word if you’re not quite sure what it means. One could see it used in the context of a dystopian science fiction novel, for instance: “With every step as they ran, the teens could feel the algorithms drawing nearer, locked on to their scent, hungry to devour them.”

Dystopian drama aside, however, my fanciful sentence above actually isn’t far from the truth. An algorithm, in the most basic sense, is an automated computer code—a formula, if you will—that simply gives you more of the same kind of stuff that you’re looking for.

Sounds innocent enough, right? But if what you’re looking for—such as eating disorder videos on TikTok (which I’ll get to in a moment)—is self-destructive, the algorithm is often more than “happy” to serve up exactly that kind of content.
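The principle behind this kind of recommendation loop can be sketched in a few lines of Python. This is a toy illustration of "more of the same," not TikTok's actual system, which is far more complex and proprietary; the video topics and field names here are invented for the example:

```python
from collections import Counter

def recommend(watch_history, catalog, n=3):
    """Toy 'more of the same' recommender: rank videos by how often
    the user has already engaged with each video's topic."""
    # Count how often the user engaged with each topic.
    interest = Counter(video["topic"] for video in watch_history)
    # Rank the catalog so topics the user watched most come first.
    # (Counter returns 0 for topics the user never watched.)
    ranked = sorted(catalog, key=lambda v: interest[v["topic"]], reverse=True)
    return ranked[:n]

# Hypothetical data: a user who has watched mostly dieting videos.
history = [{"topic": "dieting"}, {"topic": "dieting"}, {"topic": "music"}]
catalog = [
    {"id": 1, "topic": "dieting"},
    {"id": 2, "topic": "music"},
    {"id": 3, "topic": "dieting"},
    {"id": 4, "topic": "sports"},
]

print([v["id"] for v in recommend(history, catalog)])  # → [1, 3, 2]
```

Even in this crude form, the feedback loop is visible: the more a user watches a topic, the more that topic dominates what gets served next.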

The Wall Street Journal recently chronicled this process with the video app TikTok, specifically as it relates to dieting and eating-disorder content. Journal contributors Tawnell D. Hobbs, Rob Barry and Yoree Koh created a dozen automated profiles for fictional 13-year-olds, then searched for videos related to dieting and weight loss. The authors describe what happened next:

“The popular video-sharing app’s algorithm served them tens of thousands of weight-loss videos within a few weeks of joining the platform. Some included tips about taking in less than 300 calories a day, several recommended consuming only water some days, another suggested taking laxatives after overeating. Other videos showed emaciated girls with protruding bones, a ‘corpse bride diet,’ an invitation to a private ‘Christmas-themed competition’ to lose as much weight as possible before the holiday and a shaming for those who give up on getting thin.”

Specifically, the authors say that their automated bot accounts received a whopping 32,000 videos of this type from October to December last year.

In other words, the algorithm went to work and did its job. Silently. Efficiently. Automatically. And the content it pushed toward these fictional young users included videos that could have negatively influenced someone struggling with an eating disorder. “The algorithm is just too freaking strong,” said Alyssa Moukheiber, a dietitian at Timberline Knolls, a treatment center outside of Chicago.

And lockdowns and school closures due to COVID-19 over the last two years have only intensified this problem. Fourteen-year-old Andi Duke told the Journal that she was spending as much as five hours a day interacting with this kind of content via TikTok: “The more I interacted with those types of videos, the more they started to show up. I wasn’t able to see how it was affecting me.”

TikTok responded to The Wall Street Journal’s investigative report by saying it’s attempting to adjust the algorithm to ensure that users don’t see too much of the same kind of content. The social media site has also said it’s seeking to give more control to users themselves to exclude certain kinds of content.

I’d love to believe that TikTok is serious about protecting young and vulnerable users. That said, color me skeptical. After all, social media sites are driven by traffic, and they’re designed to keep users engaged as long as possible. We shouldn’t be shocked to learn that TikTok’s incredibly efficient algorithm is serving its users more of what they want.

But expecting these services to fully police themselves is not unlike asking Dunkin’ Donuts or Starbucks to tell daily customers, “Sorry, you had coffee and donuts yesterday. You’re going to need to eat some vegetables today.” TikTok and the like may pay lip service to offering more features ostensibly protecting users. But that’s the diametric opposite of what their hunter-killer algorithms are designed to do.

So the responsibility for protecting our kids falls on us as parents. How can we effectively help our kids navigate a world in which the apps they and their friends love target them (and us, I might add) with the precision of a laser-guided bomb? Here are some concrete suggestions.

First, delay engagement with social media apps as long as possible.

The only foolproof approach to avoiding potentially destructive influences via TikTok, Instagram, Snapchat and the like is to avoid downloading them in the first place. The older adolescents are before their first exposure to these apps, the more time they have to grow and develop the skills they need to use social media in a healthy way.

Second, establish specific boundaries on where, when and how much your kids can engage with the social media you allow.

Different families will likely have different boundaries and rules, but some suggestions include keeping phones out of bedrooms (especially at night), setting specific time periods and limits for usage, and only using social media in shared spaces in your home.

Third, enable and use parental control features.

Different apps include different capacities in terms of limits that can be set and how much parents can monitor kids’ engagement. Familiarize yourself with the features that are available, and let your children know that their social media usage is subject to your oversight.

Fourth, model healthy social media use yourself.

As parents, it’s tempting to say, “Do as I say, not as I do.” But if our kids see us using social media compulsively or addictively, they’re more likely to pay attention to what we actually do than what we say.

Fifth, become students of your kids.

If you see behavioral shifts that involve changes to eating and sleeping habits, excessive irritability or relational withdrawal, or extreme changes in clothing or personal appearance, those may be signs that there are deeper issues and potential problems to explore.

Finally, focus on your relationship with your tweens and teens.

Limits and parental controls are important parts of the equation when it comes to guiding your kids through the social media minefield. But ultimately, your relationship with your adolescents is what creates trust and influence. Take—and make—time to listen and ask questions. Seek to understand their interests before criticizing choices that are frustrating to you.

Ultimately, building that foundation of trust, respect and rapport provides the best path to recognizing and addressing the potentially problematic influences virtually every teen will face eventually.

Adam R. Holz

After serving as an associate editor at NavPress’ Discipleship Journal and consulting editor for Current Thoughts and Trends, Adam now oversees the editing and publishing of Plugged In’s reviews as the site’s director. He and his wife, Jennifer, have three children. In their free time, the Holzes enjoy playing games, a variety of musical instruments, swimming and … watching movies.
