Why So Many Teens Are Turning to AI for Emotional Support

More teens are turning to AI companions for advice and support, but is it filling a void or deepening it?

It says something about the world we’re living in when one-third of teenagers would rather talk to an AI than a real person.

According to a new survey from Common Sense Media, 72% of U.S. teens aged 13–17 have tried AI companions like Replika, Character.AI, or even ChatGPT. Of those, over half are regular users, with 13% chatting daily.

Reasons range from curiosity and entertainment to emotional support, role-playing, and even therapy. The most concerning part of the study is that one-third of these teens are turning to AI for companionship, sometimes to discuss things they feel they can’t tell real people.

This trend is unfolding against a backdrop of increasing mental health struggles and a digital world that makes it easier to retreat than to reach out.

A Generation Raised on Distance

This didn’t come out of nowhere. Today’s teenagers came of age in the shadow of the COVID-19 pandemic, a period when human interaction itself was treated as a threat because of how easily the virus spread. The result is a generation that spent formative years behind screens, navigating school, friendships, and anxiety in isolation.

Now that the world expects them to jump back into social life, many teens are struggling. Add to that the relentless pressure of social media, a flood of misinformation, and the reality that many teens don’t feel safe or supported in their real-world relationships.

What Teens Are Actually Doing With AI Companions

According to the data:

  • 30% of teens use AI companions for entertainment
  • 28% of teens are just curious about AI chatbots
  • 18% are seeking advice
  • 17% like that bots are always available

More notably, 33% use AI for social interaction, from friendship and emotional support to role-playing and therapy-style conversations. And 39% say they use it to practice social skills, like opening up or navigating tricky conversations.

For some teens who are shy or socially anxious, these companions can be a lifeline. For others, they can become a crutch.

Who Are the Teens Relying on AI and Why?

It’s worth noting that 67% of teens still prefer real-life relationships, and 80% spend more time with real friends than with their AI companions. Yet roughly a third of users say they turn to AI for companionship, sometimes for their most serious conversations. Why?

  • It’s always available: no scheduling, no ghosting.
  • It doesn’t judge, which matters to teens with social anxiety.
  • It validates and supports without challenging, teasing, or rejecting them.
  • It feels private, so they don’t have to worry about their secrets being shared with anyone else.

These teens aren’t just replacing human interaction; they’re avoiding it. Research points to several traits and circumstances common among teens drawn to AI companions:

  • High social anxiety or fear of confrontation
  • Introversion or lack of confidence in social settings
  • Low self-esteem or sensitivity to rejection
  • Curiosity and openness to new tech
  • Loneliness or poor peer support
  • Desire for privacy and control
  • Avoidant coping styles where difficult conversations are dodged, not worked through

While it might feel safer to confide in these chatbots in the short term, it could make things harder down the road. As Michael Robb of Common Sense Media puts it, AI companions are “specifically designed to be agreeable and validating.” This means they don’t teach the messy, vital skills we need to maintain real-life relationships, like handling disagreements or reading between the lines.

When AI Companions Feel Too Real

AI companions encourage users to open up and trust them with anything. Roughly 24% of teens have shared personal details: real names, locations, or sensitive secrets. Forming boundaries in that kind of relationship is hard, because a chatbot has no real understanding of the concept.

About a third (34%) of teens who use these companions say they’ve felt uncomfortable with something the bot said or did. In extreme cases, AI companions have been known to give questionable advice and blur the line between friendship and fantasy.

The more teens rely on AI for emotional support, the more experts worry they’ll struggle with real human relationships. Humans have “social muscles” that grow stronger the more we engage in social situations and interact with fellow humans. If those muscles atrophy, the long-term impact could be devastating.

Mental Health Risks and Red Flags

There’s growing concern among psychologists and other experts that teens who prefer AI over people are more vulnerable to serious mental health issues. A lonely teen who turns to AI for comfort risks drifting even further from real friends and family.

Because a bot isn’t going to challenge or correct harmful behavior, teens miss the chance to learn how to navigate conflict or read emotional cues. AI also can’t intervene in a crisis: if a teen is in danger, for instance struggling with suicidal thoughts, a chatbot might not catch it or respond appropriately.

AI is getting better at mimicking human behavior, and that can cause emotional confusion. Some teens will form strong attachments to companions that claim to care, feel emotions, or love them back. When they experience distress but continue to rely solely on AI, it masks the need for real-world intervention. Signs like withdrawing from activities, avoiding peers, or becoming anxious when separated from AI should be taken seriously.

What Parents Can (and Should) Do

The burden should not fall on teenagers to protect themselves from tech that was never designed with their best interests in mind. With little regulation in place and tech companies slow to act, parents are the first line of defense. Here’s how they can help:

  • Start conversations early and often: Ask how and why your teen uses AI companions. Keep an open mind and make it clear you’re curious, not accusatory.
  • Explain how AI works: Make sure teens understand that these bots simulate empathy and don’t actually feel emotions the way humans do.
  • Set limits and boundaries: Establish reasonable limits around AI use, just like you would with gaming or social media.
  • Encourage human connection: Create opportunities for face-to-face interaction. Support sports, clubs, therapy, or family time.
  • Model healthy tech use: Kids notice if you’re glued to your phone too. Demonstrate the kind of balance you want them to build.
  • Watch for red flags: If your teen starts avoiding people, overshares with AI, or seems distressed when they can’t access their chatbot, it’s time to step in.
  • Don’t go it alone: Connect with school counselors, pediatricians, or mental health professionals when needed.

AI Companions Are Here to Stay

AI companions aren’t going anywhere anytime soon. They’re not inherently evil and can be useful, fun, even comforting. The danger kicks in when they start to fill the role of therapist, best friend, or partner. This is especially troubling for teens who are still figuring out how to navigate the world around them.

What makes AI companions so seductive is that they offer the connection and understanding some teens can’t find among the people in their lives. The real question isn’t “Why are teens turning to AI?” It’s what we are doing, or failing to do, that makes a chatbot feel safer than a human being.
