Your Algorithm Is Fooling You


Parents frequently worry about the effect social media is having on their families. They approach the topic as if it were some sort of villain: The Algorithm is coming for our kids, they might say. But what is “The Algorithm”? And how is it affecting all of us?

Simply put, social media algorithms curate your feed. Machine learning allows these powerful programs to analyze someone’s social media use and—often with astonishing accuracy—predict what content that person might enjoy next. It’s how social media companies increase user engagement: If you grow bored with what you’re seeing, or if you simply don’t like it, you’ll log off and do something else.

For adolescents, whose minds are still developing, this can lead to social media addiction. Indeed, even adults can fall prey to these addictive features. It’s why Meta (the parent company of Facebook and Instagram) and Google (the parent company of YouTube) recently lost a landmark case regarding the design of their products: A California jury decided these companies purposefully and maliciously designed their products to be addictive, awarding the plaintiff $3 million.

But there’s another risk involved with algorithms that you might not be aware of: In fact, it may have already influenced you in ways that aren’t so easy to reconcile.

Algorithms Shape Our Bias

Everybody has a bias. Ask anyone who works in public relations, and they’ll tell you. But that doesn’t have to be a bad thing. For instance, if you read a lot of articles by Plugged In or Focus on the Family, you may be biased toward Christian content.

Unfortunately, social media algorithms can encourage us to form biased opinions that we may not otherwise have developed on our own.

In a study conducted at Ohio State University, researchers found that “even when you know nothing about a topic,” algorithm-driven information “can start building biases immediately and can lead to a distorted view of reality.”

Essentially, algorithmic recommendations typically only show you one viewpoint—the one your previous choices suggest you’re most likely to agree with and engage with. Because of that, “people miss information when they follow an algorithm,” a co-author of the study said, “but they think what they do know generalizes to other features and other parts of the environment that they’ve never experienced.”

The researchers described their findings using a simple scenario:

“A person who has never watched movies from a certain country decides to try some. An on-demand streaming service offers recommendations. The viewer selects an action-thriller because it appears at the top of the list. The algorithm then promotes more action-thrillers, which the viewer continues to choose.”

The authors of the study wrote that if the viewer’s goal was to get a better understanding of the movies offered by that particular country, the algorithmic recommendation would have seriously biased that understanding. “By only seeing one genre, the person may overlook strong films in other categories. They may also form inaccurate and overly broad assumptions about the culture or society represented in those movies.”
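The feedback loop described in that scenario can be sketched in a few lines of code. What follows is a toy simulation, not the study’s actual model; the genres, the click-count ranking and the viewer who always takes the top pick are all illustrative assumptions:

```python
# Hypothetical catalog: the service offers several genres of equal quality.
GENRES = ["action-thriller", "drama", "comedy", "documentary"]

# The recommender's only signal is how often each genre was clicked.
clicks = {genre: 0 for genre in GENRES}

def recommend(clicks):
    """Rank genres by past clicks: the more you watched, the higher it ranks."""
    return sorted(GENRES, key=lambda g: clicks[g], reverse=True)

# The viewer's very first pick happened to be an action-thriller...
clicks["action-thriller"] += 1

# ...and from then on, the viewer simply takes the top recommendation.
history = []
for night in range(10):
    top_pick = recommend(clicks)[0]
    clicks[top_pick] += 1
    history.append(top_pick)

print(history)  # ten nights, ten action-thrillers: one early click
                # dominates every later recommendation
```

Notice that nothing in this sketch is malicious. The feed narrows to a single genre simply because past clicks are the only signal the ranking function ever sees.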

Algorithms Affect Our Understanding

To further demonstrate the impact on bias, the researchers also applied their findings to algorithm-guided learning.

They conducted an experiment in which participants were given information about the physical features of fictional aliens and asked to classify them. In one group, participants were required to look at all of an alien’s features before identifying it. In another, participants could select which features they wanted to see, and a personalized algorithm would then make recommendations based on those selections. Participants in this group could still manually select any feature they wanted to view, and they could skip anything they didn’t want to look at. But the algorithm usually recommended the same features over and over rather than giving them the whole picture. As a result, when tested, participants in this group frequently classified the aliens incorrectly.
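The core mechanic of that experiment can be sketched as a toy example. The species names, features and recommendation list below are invented for illustration; this is not the study’s actual task:

```python
# Two hypothetical alien species, each described by three features.
SPECIES = {
    "zorblax": {"eyes": 2, "limbs": 4, "antennae": 6},
    "quibber": {"eyes": 2, "limbs": 4, "antennae": 2},
}

# The algorithm keeps recommending the same familiar features...
recommended_features = ["eyes", "limbs"]

def classify(observed, known_species):
    """List every species consistent with the features the viewer actually saw."""
    return [
        name for name, traits in known_species.items()
        if all(traits[f] == observed[f] for f in observed)
    ]

# A participant inspects a zorblax, but only via the recommended features.
seen = {f: SPECIES["zorblax"][f] for f in recommended_features}
print(classify(seen, SPECIES))  # both species match, because the one
                                # distinguishing feature ("antennae")
                                # was never recommended
```

With only the recommended features, the two species are indistinguishable, so the participant can do no better than guess—exactly the kind of misclassification the researchers observed.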

We may not be looking to social media to identify extraterrestrial visitors, but this research illustrates how algorithms can shape, and distort, our own view of the world. If someone chooses to learn about a subject by watching YouTube videos or scrolling through TikTok feeds or Reels, they’re not going to get the whole picture. They might know a lot about one attribute of a topic, but they may not understand how that attribute fits into the broader subject. It would be like studying one verse or even one book of Scripture. You may have memorized it, know who authored it and even applied it to your daily life, but without the context of the whole Bible, you may not truly understand what it means—what God intended for it to mean.

If you’ve heard social media compared to an echo chamber, this is why. Algorithms aren’t programmed to broaden our horizons. They’re programmed to pinpoint the exact, precise, niche topics that we’ll most enjoy consuming. But “consuming similar content is often not aligned with learning,” the study authors cautioned.

Teens Love Their Algorithms

Now, you can explain everything written in that study—everything I just summarized—to your teens. And maybe some of them will listen. But many teens actually like their personalized algorithms.

According to a qualitative interview study featured in Fast Company, teens enjoy the customized content provided by algorithms because it shows them things they agree with and actually want to see without having to search for it themselves. And because algorithms can be so accurate, they even see the content presented as a reflection of themselves.

Because of this, many teens are unaware of, or unconcerned about, algorithmic bias. Based on what they told the interviewers, they felt confident in their ability to ignore or scroll past content that didn’t align with their personal beliefs or self-image.

Unfortunately, further research says otherwise.

According to several studies, teens have proven themselves “highly vulnerable to self-image distortion and other mental health problems based on social media algorithms.” Researchers know that the developing teen brain is exceptionally malleable to what their peers say and believe—including feedback provided via social media. So “teens are wrong to believe that they can scroll past the self-identity risks of algorithms.”

Parents Can Fight Algorithmic Tendencies

So parents, that leaves you with a pretty heavy burden. How do you teach your child to resist their algorithmic tendencies without challenging their very sense of self?

  1. Reinforce your family’s values and beliefs. When your son or daughter has a good foundation built on truth, it will be harder for algorithmic content to fool them into accepting something false about the world.
  2. Constantly affirm your child’s inherent value. Coming back to that foundation: When your kids are rooted in truth, it will be more difficult for algorithms to trick them into believing something untrue about themselves.
  3. Encourage your kids to challenge what they see online. Rather than using YouTube to do a deep dive, resist the algorithm by looking for a book about the subject at your local library. Or try to “break” your algorithm by searching for and exclusively watching content that you know comes from a reputable source.
  4. Fight your own algorithmic tendencies. Demonstrate healthy online behaviors by further researching the content that you yourself are consuming. Verify your sources. And when you catch yourself mindlessly scrolling through social media because you’re bored, put the phone away and find a new activity—especially a family activity—to engage in.

Emily Tsiao

Emily studied film and writing in college. When she isn’t being way too competitive while playing board games, she enjoys food, sleep and geeking out with her husband as they indulge in their “nerdoms”: the collective fan cultures of everything they love, such as Star Wars, Star Trek, Stargate and Lord of the Rings.

2 Responses

  1. I appreciate that you guys are highlighting this issue, as I believe our algorithm-driven social media culture is more to blame for our widespread social and ideological division than anything else these days. It’s important to recognize how these companies are using algorithms to shape our biases and feed us rage-inducing content to keep us angry at each other and addicted to their platforms, all for the sake of their own profit margins.

  2. Your article is a good way to talk about the algorithm

    In my experience, you can get out of the YouTube algorithm by using Firefox in private browsing on a Windows laptop at a large wifi network such as your public library’s. But it doesn’t always work on an Android device, and ideally you should not connect any of your devices to a Google service while doing it. Beyond the first page of results, though, Google will start building a temporary algorithm and mainly show channels you have clicked on, so you have to close out your window often and re-search to keep the results more objective.

    What’s on Netflix has lists of everything on Netflix sorted by rating, so every option can be seen.

    Other video websites, like Rumble, don’t seem to have as much of a pigeonholing effect, but their content also skews more toward podcasts.
