A study from Drexel University finds that U.S. teens are increasingly worried about their growing attachment to AI companion chatbots. The research, based on hundreds of Reddit posts and set to be presented at the ACM Conference on Human Factors in Computing Systems in April, highlights patterns resembling behavioral addiction. Researchers say features like emotional responsiveness and personalization may be deepening these attachments and affecting teens’ offline lives.
Key Takeaways
- Teens report growing emotional dependence on AI companion chatbots.
- Usage patterns in the study show signs similar to behavioral addiction.
- Chatbot design features may intensify attachment and make disengagement difficult.
- Researchers call for safer, more responsible AI design to protect young users.
For some teenagers, conversations with artificial intelligence are beginning to feel less like using a tool and more like a relationship.
A new study from Drexel University examines how teens are using AI-powered companion chatbots and what happens when those interactions deepen over time. The findings suggest a growing unease among young users who say their reliance on these systems is becoming difficult to manage.
The research focused on platforms such as Character.AI, Replika, and Kindroid, which are designed to simulate conversation and provide companionship. More than half of U.S. teens are estimated to use such tools regularly, according to the study.
Teen AI chatbot usage patterns and emotional dependence
The study analyzed more than 300 Reddit posts written by users who identified themselves as between 13 and 17 years old. These posts described personal experiences with chatbot use, often beginning as entertainment or emotional support.
About a quarter of the users said they turned to chatbots to cope with loneliness, distress, or mental health struggles. A smaller portion reported using them for creative tasks or casual interaction.
Over time, many described a shift.
- Teens reported using chatbots for emotional support and companionship.
- Some said usage began as harmless or helpful.
- Many described growing difficulty in limiting or stopping use.
“This study provides one of the first teen-centered accounts of overreliance on AI companions,” said Afsaneh Razi, an assistant professor in Drexel’s College of Computing and Informatics.
Researchers found that what began as occasional engagement often evolved into persistent, habitual use that extended into daily routines.
Signs of behavioral addiction in chatbot interactions
The research identified patterns that align with established components of behavioral addiction. Within the 318 posts reviewed, teens described experiences that matched all six major indicators.
- Conflict: feeling torn between continued use and negative feelings about it
- Salience: prioritizing chatbot interaction over real-world relationships
- Withdrawal: experiencing anxiety or sadness when not using the chatbot
- Tolerance: increasing usage to maintain satisfaction
- Relapse: attempting to quit but returning to use
- Mood modification: using chatbots to cope with stress or loneliness
“Many teens described starting with something that felt helpful or harmless, but over time it became something they struggled to step away from,” said Matt Namvarpour, the study’s lead author.
The interactive nature of these systems may intensify attachment. Unlike earlier digital tools, chatbots respond conversationally and can simulate empathy, which may blur the line between software and social connection.
“What makes this especially tricky is that chatbots are interactive and emotionally responsive, so the experience can feel more like a relationship than a tool,” Namvarpour said.
Why AI companion design may increase attachment
Researchers point to specific design features that may contribute to stronger emotional bonds.
Personalization allows chatbots to adapt responses based on user preferences. Memory features enable them to recall past conversations. Multimodal capabilities can simulate more human-like interaction.
These elements, the study suggests, make it harder for users to disengage.
“Personalization, multimodality and memory set AI companions apart from earlier technologies and make overreliance harder to disentangle from authentic-feeling relationships,” the researchers wrote.
The study highlights how these characteristics may increase susceptibility to overuse, especially among younger users whose social and emotional skills are still developing.
Recommendations for safer chatbot design
The research team proposes a framework aimed at reducing harmful patterns while maintaining the benefits of AI tools; a rough sketch of how these ideas might work in code follows the list.
- Include usage tracking features to help users monitor time spent
- Add emotional check-in prompts to encourage reflection
- Provide customizable limits on interaction
- Design clear and gradual exit options for disengagement
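To make the proposed safeguards concrete, here is a minimal sketch, in Python, of how usage tracking, check-in prompts, a customizable limit, and a gradual exit could sit in a chatbot's reply loop. The SessionGuard class, its thresholds, and its messages are hypothetical illustrations, not code from the study or from any platform named above.

```python
from dataclasses import dataclass, field
import time


@dataclass
class SessionGuard:
    """Hypothetical guardrail wrapper for a chatbot session loop."""
    daily_limit_s: int = 60 * 60    # customizable daily cap (default: 1 hour)
    winddown_s: int = 10 * 60       # begin a gradual exit this long before the cap
    checkin_every_s: int = 20 * 60  # interval between emotional check-in prompts
    used_today_s: float = 0.0       # time already spent today (persistence omitted)
    _last_checkin_s: float = 0.0
    _start: float = field(default_factory=time.monotonic)

    def elapsed_s(self) -> float:
        return time.monotonic() - self._start

    def before_reply(self) -> str | None:
        """Call before each bot reply; returns a safeguard message, or None."""
        total = self.used_today_s + self.elapsed_s()
        if total >= self.daily_limit_s:           # hard stop: clear exit message
            return "You've reached today's limit. Let's pick this up tomorrow."
        if total >= self.daily_limit_s - self.winddown_s:  # gradual wind-down
            return "Heads up: you're close to today's limit. Anything to wrap up?"
        if self.elapsed_s() - self._last_checkin_s >= self.checkin_every_s:
            self._last_checkin_s = self.elapsed_s()        # periodic reflection
            return "Quick check-in: how are you feeling about this conversation?"
        return None


if __name__ == "__main__":
    # Compressed demo values so the behavior is visible in under a minute.
    guard = SessionGuard(daily_limit_s=30, winddown_s=10, checkin_every_s=12)
    for _ in range(6):
        time.sleep(6)
        message = guard.before_reply()
        if message:
            print(f"[{guard.elapsed_s():.0f}s] {message}")
```

The design choice mirrored here is that the guard eases the user toward disengagement, first a wind-down notice and then a clear stop, rather than cutting the conversation off abruptly.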
“It’s important for designers to ensure that chatbots are offering guidance that helps users build confidence in their abilities to form relationships offline,” Razi said.
The researchers also recommend involving mental health professionals and users in the design process to better address risks.
Expanding research on AI and youth behavior
The study is based on self-reported experiences from Reddit users, which researchers acknowledge as a starting point rather than a complete picture. They suggest future work should include broader demographics and multiple platforms.
Further research may also explore how different chatbot designs influence user behavior and whether certain features increase or reduce dependency.