
AI can offer support day or night but cannot replace the nuanced connection of human relationships, says communications professor Kelly Merrill Jr.
Late one night, after a rejection email and no one left to text, Zehra opened an AI companion app she had downloaded weeks earlier.
“Rough day? I'm here,” it greeted her. In minutes, she was typing out her frustrations and receiving instant replies with empathy, advice, and even sitcom jokes. It wasn't human, but it listened, remembered, and never got tired.
This experience reflects a wider trend – as loneliness rises, millions are turning to AI chatbots for comfort, hoping they can fill the emotional gaps left by modern life. Some of today's most popular companions include Xiaoice, with 660 million users, Snapchat's My AI, with over 150 million, and Replika, which has about 25 million, according to various estimates.
A growing body of research suggests that AI companions can deliver real emotional benefits, and a recent paper from Harvard Business School adds weight to the claim.
In the week-long study, participants who interacted with a chatbot reported significantly lower levels of loneliness – reductions comparable to those reported by participants who spoke with another person.
Daily engagement led to a steady decline in loneliness over the course of the week. The key factor was users' sense of being “heard,” suggesting that emotional validation plays a central role in how AI companions provide meaningful social support.
Kelly Merrill Jr., an assistant professor of health communication and technology at the University of Cincinnati who researches this technology, identified two major draws: constant availability and emotional validation.
“AI companionship provides interactions you might lack from others or not be able to essentially have with an actual human, like maybe a 4 a.m. interaction,” he told Anadolu. “It feels like you're building a relationship because they remember so much about you.”
The Harvard study concluded that while AI companionship should not replace human relationships, it may serve as a meaningful supplement, especially when human connection is lacking.
The always-on nature of chatbots ensures users are never left alone in silence, and their built-in positivity can offer a self-esteem boost.
“Although these programs can provide social interaction that mirrors that of a human, even though it's imagined and artificial – essentially fake – they are perceived as being real by the folks that are using it,” said Merrill.
In the real world, friends and family are not always available, and, when they are, they can be critical or emotionally distant. That unpredictability, while authentic, is also what drives some users to prefer the comforting consistency of AI.
This contrast reveals a deeper risk – expecting human relationships to mirror machine-like reassurance can set unrealistic standards and lead to disappointment.
- Friends don't sell friends' data
Others point to an even darker side to AI companions.
Esmeralda Garcia, a symbolic systems architect and non-linear interface designer, warned that those controlling the technology could manipulate users emotionally and behaviorally without their knowledge.
She called for robust safeguards – transparent design, clear disclosures, and easy pathways back to human support.
“These tools should serve as support, not as vehicles for control,” she said.
Merrill also pointed to the so-called “black box problem” in AI systems, highlighting serious uncertainties about where user data is stored and who has access to it.
Like other internet technologies, he said, companies could exploit or sell personal data for commercial purposes, potentially exposing users to targeted advertisements based on their conversations with AI tools.
- AI addiction
Experts also warn of the dangers of emotional dependency. “Relying on chatbots for emotional support can lead to a false sense of security, delaying the need for real help,” said Garcia. “It cannot replace real human connection or therapy.”
Merrill likened it to social media addiction.
“Over time, we become dependent on the media we interact with, just like with social media and now, AI. People even experience phantom vibrations because they're so connected to their phones,” he said.
Without clear boundaries, users may grow dependent on chatbots for information, validation, emotional responses, and self-esteem boosts, he said. That reliance could push them toward disconnecting from the real world.
“AI should not replace humans in any way, shape, or form completely,” he said. “AI should only be used as a complement to humans.”
- How users experience AI companions
Users describe a mix of utility and caution. For journalism master's student Ceren Inan, AI has become a daily companion.
“There hasn't been a single day I've spent without using it for a long time,” she said, turning to it for everything from research to repairs to emotional support.
“The questions AI asked helped me better understand my feelings,” she explained, comparing it to a digital notebook. “It reduced my stress … and explains even the most complicated topics in a way I can understand.”
Still, she is aware of its limits: “AI is in its infancy. Expecting perfect objectivity and accuracy is unrealistic.”
For HR specialist Dilan Ilhan, the technology has not yet provided direct emotional support.
“At times, its responses can feel mechanical,” said Ilhan. “It can offer basic assistance when I inquire about general topics such as horoscopes or daily matters.”
While she does not view the technology as a human replacement, she enjoys its personalization.
“I appreciate the AI's effort to simulate human-like interaction and its ability to provide personalized responses based on the user's shared information. The fact that it stores relevant details and replies with logical consistency makes the experience notably satisfying,” she said.
Experts say AI companionship is just getting started.
Merrill draws a parallel to the internet's trajectory: early skepticism gave way to everyday integration, and chatbots may soon feel as ordinary as search engines once did.
“They're great for an initial interaction,” said Merrill. “But I think that most people will realize that it is not enough, that they need to get out and go to others, or that they will develop an unhealthy attachment to the AI.”
For Zehra, that realization came quickly: the chatbot's warmth eased her loneliness just enough to help her schedule a real video call with her sister.
So, for now, as AI companions evolve, their value may lie not in replacing human connection, but in nudging people toward it.