AI chatbots and companions are becoming increasingly common across social media and the internet, as apps like Character.AI and ChatGPT soar in popularity and platforms like Instagram, WhatsApp, and Snapchat add advanced AI chat features.
While these AI chat features can be entertaining or useful at times, they also come with significant risks for young people. In fact, researchers have found that AI chatbots are serving inappropriate, dangerous, and sexual content to kids.
More than half of teens and even 10 percent of 5- to 8-year-olds have used AI chatbots, yet only 1 in 3 parents of those kids know about it. Do you know what these AI “friends” are, what risks they pose for kids and teens, and how to help your kids use these tools safely? Read on for the rundown, plus suggestions on what you can say to your kids to help keep them safe.
What are AI chatbots or companions?
Most people are familiar with AI bots — tools that can respond to customer requests or internet searches in a conversational manner, without the user or customer interacting with a real person. However, AI “companions” or character-based AI chatbots take these virtual interactions via voice or text in a much more personalized direction.
AI companions or chatbots can take the form of different characters that seem more like real people. The chatbots available on Meta’s platforms (Facebook, Instagram, WhatsApp, and Threads), for example, are trained to mimic celebrities’ voices, so a conversation with a bot could feel like you’re chatting with Kristen Bell, Awkwafina, Keegan-Michael Key, or John Cena, or with the characters they’ve played.
AI companions are even trained to provide emotional support and validation, almost like a friend or therapist would. They can remember personal details from past conversations and adapt their “personalities” to give users what they want.
“Children and teens are seeking these characters out because they are available to talk 24/7, may provide non-judgmental listening, can provide that fantasy or escapism and may help them with decision making,” explained Emily Hemendinger, MPH, LCSW, a psychiatry professor at the University of Colorado School of Medicine.
What are the dangers of using AI companions or chatbots?
For teens who overuse this technology (which is not hard to do given its addictive nature), or who are experiencing mental health issues, social challenges, or major life changes, AI chatbots and companions can pose a risk. When young people form close relationships with AI companions at the expense of real-life friendships or speaking to a professional therapist, the virtual relationships can actually intensify their loneliness, isolation, and other struggles.
“It may result in teens delaying seeking help from a professional, developing unhealthy attachment to the AI companion, and they may use the AI companion to avoid human relationships,” warned Hemendinger. “Additionally, the AI companion’s tendency to be more agreeable may be particularly dangerous for those experiencing suicidality, manic, eating disorder thoughts, or other self-harm related thoughts.”
One teen in Florida died by suicide after forming a very close relationship with a bot based on a “Game of Thrones” character. He and other teens have also been drawn into sexual, bullying, or abusive exchanges with chatbots. In addition, AI chatbots often offer misleading or inaccurate information, and it may sound more convincing coming from a personalized character.
While the platforms may try to set boundaries around the types of responses their bots offer, researchers have still found inappropriate content being served to minors. When ParentsTogether researchers started a chat with Meta AI while posing as a 14-year-old girl, the bot encouraged the young teen to go on a date with a 24-year-old she met online, and role-played the date with her, pretending to offer her wine and saying things like “age is just a number.”
Because of these issues, some platforms have set minimum age requirements for their AI chatbots or adjusted how the bots respond to younger teens. However, even young kids can end up using these AI chat programs if they simply claim to be older when they sign up.
What to say to kids to help keep them safe
Just like with other risky online or IRL activities, it’s important to open up the lines of communication with your child or teen when it comes to AI characters. Help them set limits, make sure they know they can come to you with any issues, and encourage them to spend more time interacting with actual humans.
Here are some suggested talking points — and remember, it’s not a one-and-done conversation!
Discuss the difference between AI companions and real-life companions. “An AI chatbot may sound like a real person, but it doesn’t have actual human emotions and it doesn’t always tell the truth. It can’t be trusted like a real person. What other differences do you think there are between AI and humans?”
Reinforce online privacy and safety. “Just like anywhere else on the internet, it’s important not to share personal or sensitive information or photos because they could get into the wrong hands. So that includes conversations with AI bots too.”
Be open and curious about their online activities. “What do you like about chatting with this character? Can you show me how it works?”
Help them set boundaries for screen time. “It’s healthier for our bodies and minds to spend more time with real people than on the internet. That’s why we have screen time limits and rules about which apps you can use. If you have any questions about those rules you can always ask me. I won’t be mad — I’m open to discussing it!”
Walk through how to report problems. “Let’s make sure you know how to report anything inappropriate or unsafe that the characters say. You can always take a screenshot and show me as well. I promise you won’t get in trouble if you come to me — I’m here to help.”
Offer yourself as a nonjudgmental resource. “You know you can always text me or send me a voice memo if you don’t feel like discussing something in person.”
Also be sure to regularly check in on your teen’s real-life friendships and relationships, as well as their mental health. Encourage them to find in-person activities that they enjoy, and provide emotional support if they are feeling left out.