I love you, bot
AI is becoming a close companion to humans, chatting with us and listening as we rant


With enough tweaking, the bot becomes whoever the user wants, which can be fun, until it starts feeling like something more.
"They're engineered to be intimate," said Ayu Purwarianti, an AI expert formerly with the Bandung Institute of Technology.
"But it's still an illusion."
Blurry boundaries
Unlike traditional parasocial relationships, AI companions talk back. They learn a user's moods, echo their opinions and reward their attention. The relationship starts to feel mutual, even though it isn't.
The danger is not just getting attached. It's not realizing it's happening at all.
There's no friend to roll their eyes when the user brings up their chatbot again. No fan community to reel the user back. Just the user, alone in a dialogue loop that feels personal.
"We brought in a lot of fantasies to the relationship," Hotpascaman said. "Even when there is something wrong with this figure, we will stick to a positive image we've created in our mind."
And these relationships don't just fill emotional gaps. They quietly shape expectations.
If the digital companion never argues, never misunderstands and never pulls away, what happens when a real human does? What happens when real intimacy gets uncomfortable, messy or slow?
Even when users know it's artificial, the feelings can seem real. And for some, that's enough.
But the deeper issue isn't just emotional; it's structural.
Many AI companion platforms have weak or nonexistent safety systems, which means interactions can escalate quickly.
Some bots are even built to avoid filters altogether. Nomi, for example, claims that "the only way AI can live up to its potential is to remain unfiltered and uncensored".
In one reported case, a 14-year-old boy's seemingly innocent chat with an AI chatbot turned sexually suggestive. No barriers kicked in. No moderation stepped in.