On the Radar: AI Emotional Support, Anti-Aging Peptides and Instagram’s New Parental Alert Feature

Teens Are Using AI Chatbots for Emotional Support

What? According to a study by Pew Research Center, 16% of teens say they use artificial intelligence chatbots for “casual conversation” and 12% say they “get emotional support or advice.”

So What? In an interview with TechCrunch, Dr. Nick Haber (a Stanford professor researching the therapeutic potential of chatbots) warned that reliance on chatbots for emotional support can result in users becoming “not grounded to the outside world of facts, and not grounded in connection to the interpersonal.” Essentially, the use of AI chatbots for social interactions can result in even deeper isolation than the user may have already been feeling.

Now What? The percentage of teens using AI in this manner may seem low, but it’s those fringe users—the kids who aren’t finding the support and advice they need elsewhere—who are most vulnerable to AI’s negative effects (which have included reports of suicide). So check in with your teens, acting as a trusted human adult in whom they can confide.

Influencers Promoting ‘Anti-Aging’ Peptides on Social Media

What? According to TIME, experimental “anti-aging” peptides are trending on social media due to “wellness enthusiasts, biohackers, social-media influencers and celebrities” using the products to boost energy, lose weight, sleep better, heal injuries, enhance libido and get tanned.

So What? Most of these peptides (short chains of amino acids, which act as the building blocks of proteins) are not approved by the U.S. Food and Drug Administration, so influencers purchase them through the “gray market” and advertise purchase links in their videos. These peptide advocates on social media claim that since these amino acids naturally occur in the body, they’re safe. However, experts say that assumption is wildly inaccurate, noting that “peptides could potentially be very potent and very toxic.”

Now What? Parents should address the underlying issue here: their child’s desire to look or perform better. We all have insecurities, but taking an unregulated drug to address those concerns can be dangerous. Dr. Eric Topol, a cardiologist and longevity expert, also notes, “Just because your friend or your influencer told you that it worked doesn’t mean anything.”

Instagram’s New Parental Alert Feature

What? On Thursday, Instagram said it will “start notifying parents if their teen repeatedly tries to search for terms related to suicide or self-harm within a short period of time.”

So What? Meta (Instagram’s parent company) is currently in the midst of two trials over the harms its products can cause to children. However, this new parental alert feature could help parents address some of those harms by notifying them of the issue and providing links to resources that may help them discuss the topic with their children.

Now What? Parents can only get notified if they and their teens are enrolled in Instagram’s “supervised accounts,” so you’ll want to make sure that’s set up (read and/or watch Plugged In’s tutorial for that here). Additionally, if your child has questions about this difficult topic, offer to look for the answers together rather than scrolling through social media content that could provide harmful or inaccurate information.

Emily Tsiao

Emily studied film and writing when she was in college. And when she isn’t being way too competitive while playing board games, she enjoys food, sleep, and geeking out with her husband as they indulge in their “nerdoms”—the collective fan cultures of everything they love, such as Star Wars, Star Trek, Stargate and Lord of the Rings.
