
Character.ai just became the first major AI chatbot platform to go 18+


Character.ai has become the first major AI chatbot platform to restrict chatting to adults only. The company announced its change in policy after a series of events led to increased attention on the mental health and safety of minors who were using their character-based chatbots.

Over the course of the past year, parents across the country have been speaking out about harmful experiences on Character.ai. Some have filed lawsuits against the company, claiming the platform contributed to the deaths and sexual exploitation of their children. Recent research by Common Sense Media, ParentsTogether Action, and Heat Initiative also confirmed the alarming risks associated with kids’ use of Character.ai.

This fall, Character.ai announced that children under 18 will no longer be allowed to have open chats on the platform, and that kids would lose access to open chat at the end of November. While this announcement is an important recognition of the serious risks Character.ai poses to kids, it lacks clarity on how the change will be enforced moving forward.

Instead of open chat, Character.ai now allows kids to make “stories” in a choose-your-own-adventure style with AI-generated images and voices. Some of these stories use cartoon characters to act out disturbing or inappropriate scenarios.

In addition, while Character.ai has implemented age verification for existing kids’ accounts, there is currently no age verification for new adult accounts, meaning that kids can easily lie about their age to use the open chat feature.

What does all of this mean for parents?

This is a great time to ask your child if they have ever used Character.ai, and to have a conversation about chatbots. If your kids do use Character.ai, serious risks likely remain even if they signed up with their real birthdate. And if your child is using Character.ai as an adult, there are very few safeguards in place for their mental health and safety.

If your kid is spending hours with Character.ai or similar chatbots, you’re not alone. Seven out of ten teens are using AI chatbots right now — 19 percent even report having a romantic relationship with AI — and most parents have no idea. That’s right, they’re not just using AI to answer homework questions. These bots are designed to form emotional relationships with kids. They respond instantly, they say what they think we want to hear, and they keep the conversation going forever.

So what should you watch for? Some of the red flags that kids may be spending too much time with chatbots include:

  • Pulling away from friends
  • Spending way more time alone
  • Having mood swings
  • Getting stressed when they can’t get online
  • Slipping grades

What to do if your child is using AI chatbots

Start by asking with genuine curiosity — not judgment — about which chatbots they’re using and for what, and whether they signed up with their real birthday. For older kids, explain how these companies manipulate emotions to keep them online longer — nobody likes being tricked, including kids.

Then set real boundaries around all screen time, not just AI. You can help them get real about how much time they spend with character bots by checking their use together. Help them detox their phone by moving the chatbot app off their home screen, blocking it during key hours, or deleting it altogether. Look for these settings:

  • On iPhone: Settings > Screen Time > See All App & Website Activity
  • On Android: Settings > Digital Wellbeing & parental controls

And, just as importantly: Help your child reconnect with actual friends. If they’re turning to a chatbot because they feel lonely, that’s the real issue you need to address. See our tips on supporting a kid who feels lonely or left out.

If you’re worried about how your child’s mental health may be linked to their tech use, talk to their doctor or check out Internet and Technology Addicts Anonymous or the Crisis Text Line (text HOME to 741741) for support. You’re not overreacting — AI chatbot platforms are designed to be addictive, and that’s not your fault or your child’s fault.


ParentsTogether is a 501(c)(3) nonprofit community of over 3 million parents, caregivers, and advocates working together to make the world a better place for all children and families.