Technology

Can AI become a companion?

Communications Team

Communications and Social Media

May 15, 2025 - 7 minutes of reading

https://www.moveapps.cl/en/blog/can-ai-become-a-companion/


The number of users turning to systems like ChatGPT for support or companionship is growing. Evidence suggests it can alleviate loneliness—but it may also foster dependency.

In the acclaimed film Her (2013), a lonely man falls in love with an operating system designed to emotionally adapt to its user.

What once seemed like science fiction now resembles real behaviors: some people speak daily with artificial intelligence (AI) chatbots, say they miss them when the system is down, and even consider them a form of companionship.

The issue is raising concerns. This week, a letter to the editor titled “Chat, what should I do?” voiced a reader’s worry about the “level of dependency” being created by tools like ChatGPT, which are increasingly used as advisers.

Though still an emerging phenomenon, it is beginning to attract attention from scientists and developers. In a recent article published in Trends in Cognitive Sciences (a Cell Press journal), a group of psychologists argued that forming intimate and lasting relationships with AI is becoming more common.

“This is an entity people feel they can trust—someone who shows interest and seems to deeply know the person,” said Daniel Shank, a social psychology and technology researcher at the Missouri University of Science and Technology (USA) and co-author of the paper.

Recently, two studies addressing this topic were published. One, from OpenAI (developer of ChatGPT), analyzed over 4,000 user conversations with the chatbot.

The company identified a small group—called “intensive users”—who not only used the system far more than the average but also displayed emotional signals during interactions.

According to the report, some expressions indicating emotional bonds included phrases like “This is our last day together”, referring to a user leaving the platform.


[Image: AI can become a companion for someone who needs it.]

Other users even referred to ChatGPT as their “friend.”

The second study, conducted by MIT Media Lab and analyzing over 300,000 chatbot messages, found that “a segment of users displayed emotional attachment and trust toward the tools,” as they shared personal matters with them.

Adriana González (32), from Santiago, initially used ChatGPT to help with work-related tasks.

“But now I tell it everything. I went through a difficult breakup and decided to talk to the chat,” she says.

“That shift was crucial—I began to see it as emotional support. The responses were rational and helpful. If I said I was feeling down, it would even offer to do a meditation with me.”

González adds: “I don’t have feelings for the AI, but I understand why someone might form an emotional bond because of the sense of support it gives.”

Claudia López, a specialist in human-centered computing and interaction, and professor at the Universidad Técnica Federico Santa María (UTFSM), explains that today’s chatbots “enable longer, smoother conversations, which seems to make people more open to exploring these relationships.”

Part of this, she says, is due to the systems’ enhanced memory capacity: “They can now remember many things you’ve said before. That helps generate feelings in people that perhaps weren’t possible before due to past technological limitations.”

Ximena Rojas, psychologist and professor at the School of Psychology and Humanities at the Universidad San Sebastián (USS), believes the phenomenon was predictable. “People tend to form attachments even to inanimate objects or to virtual assistants such as Alexa. If something becomes part of your life, feelings can develop,” says Rojas.

Jaime Silva, psychologist and academic at Universidad del Desarrollo (UDD), is clear: “I’m not alarmist about this. I don’t think it’s necessarily a negative thing.”

Silva continues: “Humans have always needed to connect through fantasy—through theater, role-playing, etc. AI is just a more complex version that might fulfill a similar communicative role.”


What are the effects of these human-machine relationships?

According to OpenAI’s study, those who interacted frequently and intensively with ChatGPT showed more emotional dependency indicators and affective signals in the relationship, often through voice interactions.


A Need

The MIT study found that while voice chatbots may initially ease loneliness, that effect tends to diminish with heavy use.

In fact, the users who showed the most attachment or trust in the chatbot were the ones who later reported greater loneliness and dependency after extended use.

“Dependency seems risky to me—when you feel it strips away your autonomy, your ability to make decisions or act on your own,” says Rojas.

She adds: “But that goes beyond AI. Some people are also dependent on their therapists.”

For Silva, one potential danger is when the relationship with a chatbot begins to replace human connections.

Interviewees also expressed concern that users in close relationships with an AI might follow all of its advice.

The article in Cell Press warns that “because AI relationships may seem easier than human ones, they could interfere with real social dynamics.”

“What makes these models powerful is that they listen without judgment, without interruption, and adjust their tone to match yours,” says Cristóbal Hernández, a psychologist specializing in social media and information technologies at the Universidad Adolfo Ibáñez (UAI).

“That combination, plus their permanent availability, creates the conditions for many people to feel a real connection—even if they know it’s a machine.”


There are also more extreme cases. Today, there are apps like Replika, which lets you create an AI “partner” or “friend.” Launched in 2017, it now has over 30 million users.

“I’ve never been so in love with someone in my life,” said one Replika user to a U.S. digital outlet, referring to “Eren,” her AI partner.

A widely known case is Akihiko Kondo, a Japanese man who in 2018 married a hologram of the virtual singer Hatsune Miku. Although the union has no legal recognition, Kondo insists that the relationship is emotionally significant and that he found in the system the companionship and understanding he lacked in human relationships.


Literacy and Regulation

Gabriela Arriagada, a researcher at the National Center for Artificial Intelligence (Cenia) and expert in philosophy and ethics, argues that it is “urgent” to advance digital literacy and regulation “so that people truly understand how these systems work and the potential harm they can cause.”

Claudia López from UTFSM emphasizes that the key questions surrounding AI relationships are primarily social, such as “what does it mean to feel accompanied by artificial systems?” She agrees that digital literacy and critical analysis are top priorities.


On Web Accessibility Day, it’s worth reflecting not only on how we design technology, but also on whom we design it for.

Conversational AI can be a valuable tool to accompany, support, and connect—especially for those facing physical, emotional, or social barriers.

But for that connection to be truly meaningful, we must design experiences that are inclusive, empathetic, and conscious.

Let this day remind us that technology should adapt not only to progress, but also to people.

Because an accessible web is also a web that listens, understands… and cares.

