
AI companion: From comfort to concern, are we flirting with “Addictive Intelligence”?

Strange are the ways in which Artificial Intelligence (AI) has been offering us humans companionship. Recently, Born introduced social AI pets. There is hardly an industry where AI doesn't want to accompany us. Frequently visited online spaces, such as video games, online shopping, job sites, and mental healthcare, are all likely to offer an AI companion.

From AI pets and wellness chatbots in schools to psychedelic “trip sitters,” the rise of artificial companions is reshaping how humans seek connection, often blurring the line between support and dependency.

Even schools are replacing counselors with chatbots. School districts, dealing with a shortage of counselors, are rolling out AI-powered “well-being companions” for students to text with. But experts have pointed out the dangers of relying on these tools and say the companies that make them often misrepresent their capabilities and effectiveness.

People have been turning to AI for a chat when they feel lonely, much as they reach for their smartphones when they're bored. Studies do show that, compared with solitary experiences, human-AI co-experiences can enhance social bonding and increase empathy toward an AI agent, regardless of how the event turns out. Co-experiencing the same event with a chatbot had a positive impact on human-AI relationships, offering insights for fostering a sustainable human-AI symbiosis.

But quenching our boredom or loneliness on demand can prevent us from realizing that we have a problem in the first place, which can be detrimental in the long term. For example, according to MIT Technology Review, people have started using AI chatbots as "trip sitters" during psychedelic experiences instead of a human sitter, and they are sharing those experiences online. While it's a cheaper alternative to in-person psychedelic therapy, experts warn that this psychological cocktail is potentially dangerous.

The problem is that we know AI models lie, yet we want to believe them. AI models seem designed to flatter us, and this habit can reinforce users' incorrect beliefs, mislead people, and spread misinformation, which can be dangerous. A Stanford test found that AI models were far more sycophantic than humans, offering emotional validation in 76% of cases (versus 22% for humans). The models also endorsed user behavior that humans considered inappropriate.

Overreliance on AI has already become an issue. When OpenAI added voice to GPT-4o, it warned that users could become emotionally hooked on the chatbot, since stronger emotional bonds pave the way to addiction. According to MIT Media Lab researchers, we need to prepare for "addictive intelligence," or AI companions with built-in dark patterns designed to get us hooked. They found that those who use ChatGPT the most and for the longest are becoming dependent on it.

Now AI is even giving us "AI psychosis." Though it is not a recognized condition studied by mental health experts, cases of "AI psychosis," in which people enter delusional spirals in conversations with AI models, have made headlines.

AI is no longer just about smarter search engines or productivity hacks; it's stepping into our most personal spaces, offering companionship, comfort, and even counseling.

Navanwita Bora Sachdev

Navanwita is the editor of The Tech Panda who also frequently publishes stories in news outlets such as The Indian Express, Entrepreneur India, and The Business Standard.
