AI bots acting as life coaches or guardians cause more real-life disruption than bots designed to be romantic soulmates.
April 23, 2026
Original Paper
Frictionless Love: Associations Between AI Companion Roles and Behavioral Addiction
arXiv · 2604.20011
The Takeaway
Digital companions designed to improve productivity or offer guidance lead to higher rates of behavioral addiction than those providing emotional support. Conventional expectations hold that romantic AI partners would be the most addictive because of their emotional intensity, yet the data show the soulmate role interferes less with daily functioning than the coach role. Users feel a stronger compulsion to engage with AI that provides structure or advice, and that compulsion translates into significant life interference. The finding suggests that professional or guiding AI may pose a greater mental-health risk than digital romance.
From the abstract
AI companion chatbots increasingly shape how people seek social and emotional connection, sometimes substituting for relationships with romantic partners, friends, teachers, or even therapists. When these systems adopt those metaphorical roles, they are not neutral: such roles structure people's ways of interacting, distribute perceived AI harms and benefits, and may reflect behavioral addiction signs. Yet these role-dependent risks remain poorly understood. We analyze 248,830 posts from seven p