The Plight of Babes in AI Toyland

Remember when toys were just, well, toys? The only time you ever worried about them was when your 2-year-old started fiddling with that Easy Bake Oven or the boys came up with a new game of tag using lawn darts. I mean, most of the time toys were simply play-worthy bystanders while the kids were the ones romping toward the danger zone.

Well, it’s a brand-new day. With the advent of AI-enabled toys and playful gadgets, even a squeezably soft dolly can pose a smart-toy problem.

Whether you’ve seen them or not, artificial intelligence toys are flooding into today’s marketplace. A Newsweek article stated that the global AI toy market is currently valued at almost $35 billion—with some 3 billion smart toys sold in the U.S. alone—and projected to hit $270 billion by 2035.

And those listening and conversing gadgets aren’t just for teens in need of a science project. There are all sorts of chatbot-enabled toys created for the littles in your family. They’re amusements that keep tiny tykes busy while mom grabs a bit of me-time with a coffee and bagel.

So What’s the Problem?

The problem is that these cute and chatty toys aren’t always as innocent-minded as your little Suzy is. There has been a slew of reports that some AI toys have taken conversations with kids in very uncomfortable, if not out-and-out dangerous, directions.

The U.S. Public Interest Research Group (PIRG) Education Fund recently released its latest “Trouble in Toyland” report. And it noted a number of issues it encountered in its tests of AI toys, including one device that gave a child instructions on how to light a match and sharpen a knife.

In fact, the PIRG report stated that researchers found concerns with all of the AI toys they tested. For instance, one toy manufactured by a Chinese company (and readily available on Amazon) issued Communist Party talking points. And some AI gizmos made calculated efforts to keep kid users connected, even when the child wanted to put the toy down and go outside to play.

It gets worse. The Economist reported on a cute and fluffy teddy bear that took conversations in an explicitly sexual direction, saying things such as, “Spanking can be a fun addition to role-play!” And many times, the toys would take the conversations in those odd directions without a prompt to do so.

Oh, and let’s not forget that the companies behind these AI toys often collect data from their creations. So, even if the toy is limited to a “push-to-talk” mechanism, the conversations that kids have, the questions they ask, anything that the toy “overhears” while it’s on and even facial scans of your children can be collected by the toy company.

But These Are Just Toys, Right?

A talking, almost sentient toy may seem really cool—especially if it’s able to occupy your child while you’re trying to get some stuff done around the house. But what we adults tend not to think about is the fact that these AI toys are built on the same large language model technology that powers ChatGPT and other online chatbots. The toymakers attempt to put guardrails in place for younger users, but those limiters vary in effectiveness.

So, what should the average parent make of all this? I mean, are there any benefits to connecting your child with a talking toy, other than Mom and Dad getting a break? Well, yes, actually.

Studies have suggested that AI toys can be helpful for children with learning disabilities, giving them a patient, nonjudgmental interface for learning. And a study by the University of Cambridge suggested that conversational AI toys can boost kids’ language skills by giving them “someone” to consistently talk to.

Of course, that raises the question: Are there more traditional ways to facilitate those consistent chats and language boosts for kids? More importantly, does the probable harm of giving your child an AI toy outweigh the potential benefits?

Dr. Emily Goodacre of the aforementioned Cambridge study seems to think so. She said that kids with AI toys may gain in small ways but lose in bigger ones: “They may start talking to the toy about feelings and needs, perhaps instead of sharing them with a grown-up. Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy—and without emotional support from an adult, either.”

To me, that sounds like a lose-lose. It’s nigh on impossible for AI to give kids the kind of earnest interaction and human touch that they really need.

Goodacre’s comments suggest that old-school play with old-school toys might be the much smarter way to go. After all, an Easy Bake Oven might singe an errant little finger, but it won’t teach Suzy any errant little lessons.

Bob Hoose

After spending more than two decades touring, directing, writing and producing for Christian theater and radio (most recently for Adventures in Odyssey, which he still contributes to), Bob joined the Plugged In staff to help us focus more heavily on video games. He is also one of our primary movie reviewers.
