Teens turn to AI chatbots for emotional bonding; it's risky romance, warn psychologists

Time of India | 2 days ago
Hyderabad: What began as casual scrolling, streaming, or chatting has spiralled into something deeper—and far more complex. Across Telangana, mental health professionals are raising red flags over an emerging crisis: Young people are not just using the internet for entertainment—they are forming emotional and romantic bonds with AI chatbots and virtual personas.
In therapy rooms across Hyderabad and beyond, this new form of digital intimacy is surfacing with increasing frequency. The screen is no longer a boundary. For many, it's a refuge. And for some, a companion.
Take the case of a 12-year-old girl who developed a deep emotional connection with ChatGPT. She affectionately named the AI 'Chinna' (meaning little one in Telugu) and shaped it into a confidante. "She would vent everything to ChatGPT—issues with her parents, school, friendships," said Dr Nithin Kondapuram, senior consultant psychiatrist at Aster Prime Hospital.
"This is not isolated. On any given day, I see around 15 young patients with anxiety or depression, and five of them exhibit emotional attachment to AI tools," he said.
In another case, a 22-year-old man created a romantic fantasy world with an AI chatbot. He imagined the bot as his girlfriend, asked it for gifts, and had it play love songs tailored to their 'relationship'. "For him, the AI wasn't code—it was a silent partner who never judged. It gave him emotional security he couldn't find in real life," Dr Nithin said.
Such stories are becoming more common, and they aren't restricted to urban areas. Dr Gauthami Nagabhirava, a senior psychiatrist at Kamineni Hospitals, has seen similar patterns emerge in rural communities. "In one rural case, a 12-year-old girl bonded with an AI companion and began accessing inappropriate content online while her mother was away at work. Eventually, she started inviting male friends home without supervision," she said.
Another case involved a teenage girl who lashed out during therapy after creating an imaginary AI companion. "She accused her parents of stifling her freedom, suddenly declared herself bisexual, and expressed a strong desire to move abroad. Her identity was based purely on perception. She was too inexperienced to even understand what her orientation truly was," Dr Gauthami elaborated.
Dr C Virender, a psychologist, recounted the story of a 25-year-old woman who sought emotional guidance from an AI chatbot about a male colleague she admired. "She would describe his personality to the AI, ask what kind of woman he might like, or how she should dress to attract him," he said. Using these responses, she would alter her behaviour around him. "Eventually, the man accused her of stalking. She was devastated and began to spiral at work. She had become so reliant on the AI that real human interactions felt threatening," he recalled.
Mental health professionals agree that the roots of this digital dependence lie in loneliness, fear of judgment, low self-esteem, and the absence of healthy social interaction—all exacerbated by nuclear family structures and limited parental supervision.
"Young people escape into digital realms where they feel accepted and unchallenged," said Dr Nithin.
"Our job is to reintroduce them to the real world gently. We assign them small real-life tasks—like visiting a local shop or spending time in a metro station—to help rebuild their confidence."
But in some cases, efforts to curb digital addiction can backfire. According to Dr Gauthami, parents often make the mistake of sending affected children to highly regulated hostels with a strict ban on mobile usage. "This only worsens their condition and causes irreparable damage to already fragile minds," she warned.
The issue is compounded among students facing academic and career pressure. Dr Uma Shankar, a psychiatry professor at a govt medical college in Maheshwaram, said rural engineering students are particularly vulnerable. "They fail exams, don't get placed in companies, and feel like they're letting everyone down. That emotional burden drives them into digital addiction. It becomes an escape hatch," she explained.
The scale of the problem is reflected in national data. A Nimhans survey conducted in six major cities, including Hyderabad, flagged concerning patterns in digital behaviour. A separate fact sheet from the Centre for Economic and Social Studies found that nearly 19% of people aged 21–24 experience mental health issues by age 29—primarily anxiety and depression.
What's driving this growing trend isn't just the allure of the screen—but the emotional responsiveness of AI itself. With their friendly tone, prompt replies, and tireless attention, AI chatbots are becoming something more than tools. "As AI becomes more human-like, these emotional entanglements will only grow. It's no longer science fiction. It's already happening—quietly, in homes, classrooms, and clinics," experts warned.

Related Articles

Chatbots as Confidants: Why Gen Z is Dumping Therapists and Friends for AI Guidance

Time of India | 15 hours ago

We'd once rely on best friends at midnight, write frustrations in diaries, or end up on a therapist's couch after a grueling week. Now, many are typing "I'm feeling burnt out" into an AI chatbot - part digital therapist, part sage friend, and part mirror to their inner turmoil, reflecting it back with unsettling precision. And no, it's not a game. It's genuine, it's on the rise, and it's changing how the next generation navigates mental health.

Comfort in the Algorithm: Privacy without Judgment

The first hook? No furrowed brows. No snarky comments. No cringe-worthy silences. AI chatbots provide something deeply precious to this generation: anonymity without judgment. In an image-obsessed, optics-and-social-currency world, vulnerability - even with intimates - is perceived as unsafe. When a 25-year-old marketing executive vents about toxic leadership or a college student explores their sexual identity, they crave a listening ear - not critique, not gossip. Chatbots provide that clinical, unemotional empathy smothered in code - 24/7. For Gen Z, that is safer than the performative empathy too often felt in human interactions.

The Accessibility Paradox: Therapy in Your Pocket for Free

Let's get real. Therapy costs money, takes time, and, for too many in under-resourced geographies, is simply not an option. As much as the conversation around mental health is greater than ever before, real access to care is still scarce. AI bridges that divide with real-time feedback loops. Applications such as Replika, Woebot, and even ChatGPT are providing consumers with space to vent thoughts, monitor mood trends, or mimic cognitive behavior therapy (CBT) responses - all without having to log out of their online lives. Instant replies, and not a single scheduling hassle? That's a value proposition too enticing to resist for a generation that views mental health as synonymous with self-care.

Hyperconnected Yet Emotionally Starved

Although today's youth is more plugged in than ever, loneliness is at an all-time high. Scrolling isn't synonymous with bonding. DMs aren't synonymous with depth. And most interactions feel more transactional than meaningful. AI becomes a stand-in - not necessarily improved, but more reliable. It doesn't ghost you. It doesn't rage. It doesn't misread tone. You can tell a bot your age-old problems, and it will never say, "Can we talk later?" That dependability makes AI emotionally available, something many perceive as lacking in their actual relationships.

Workplace Stress Is Changing - So Are Its Solutions

Millennials and Gen Z are burning out quicker than their older counterparts, usually before 30. The relentless hustle, gig economy madness, toxic feedback loops, and remote work loneliness are giving rise to a new generation of workplace stress - one that traditional models can't address. AI becomes a sounding board when HR doesn't care and managers are unavailable. Whether it's role confusion, imposter syndrome, or dealing with office politics, chatbots are being deployed as strategic stress navigators. They're not fixing the issue, but they are assisting young professionals in regulating it.

Relationship Confusion Meets Instant Insight

From dating apps to situationships, the dating scene is confusing. Expectations are undefined, boundaries are fuzzy, and communication is spotty. In a world where ghosting has become the status quo and romantic nervousness abounds, many are looking to AI to interpret mixed signals, write emotionally intelligent messages, or work through emotional confusion. Why? Because the guidance is quick, impartial, and usually more emotionally intelligent than the individuals involved. For example, instead of texting a friend and getting, "Just move on, he's trash," a chatbot could guide you through the emotional process of grieving, or assist in expressing your emotions for a closure message. That sort of formal empathy is not common in peer-to-peer circles.

This generation isn't only tech savvy; they're emotionally branded by it. From pandemic lockdowns to online learning, screen-based engagement isn't an alternative - it's the default. While older generations might laugh at the notion of "talking to a robot," younger consumers do not find it strange. They've had online buddies in games, been brought up with Siri, and are accustomed to managed, screen-based support systems. Chatbots are merely the next iteration of that support.

Are Chatbots Replacing Human Connection?

Not exactly. But they're filling in for a dysfunctional support system. They're effective, timely, and unconditional - qualities many yearn for but can't get in the real world. And yet, they remain tools, not therapists. They have limitations. They can't hug you, call you out when you're sabotaging yourself, or follow emotional currents with human intuition. But in a world too busy or too disconnected to care, AI cares. And sometimes, that's enough. It's about evolution, not tradition - and a generation practical enough to reach out for help, even if it is written in Python.

Letters to The Editor — August 4, 2025

The Hindu | a day ago

Support, not therapy

A report, 'GenAI cannot qualify as therapy for mental health, says expert' (Chennai, July 21, 2025), highlights a crucial point — while Artificial Intelligence tools such as ChatGPT may feel supportive in moments of anxiety, they cannot replace professional therapy. These tools often mirror what we wish to hear, creating comfort but not real change. The fact is that true therapy challenges patterns, provides structured guidance and builds skills to cope with life — something that no algorithm can replicate. Instead of depending on Artificial Intelligence, we must remember that healing happens through people. Only a trained therapist can listen deeply, confront painful truths, and guide recovery with care and accountability. Let Artificial Intelligence be a temporary aid. The real work of mental health must stay firmly in human hands.

J.S. Safooraa Bharathi, Chennai

Jarring

That a crass film such as The Kerala Story received two national awards, for best direction and cinematography, is a reflection of our times. Institutions are becoming increasingly saffronised by the day. The jury, it appears, was more interested in rewarding the projection of a narrative dear to the political establishment than recognising artistic merit. In the process, many good movies were left by the wayside.

Manohar Alembath, Kannur, Kerala

Disturbing

It is deeply disturbing that over 90% of sewer worker deaths in India occurred without even basic safety gear — a statistic that reflects a harsh truth: we still fail to value the lives of those who clean our filth. They are not just sanitation workers. They are frontline soldiers of public health. While the government's NAMASTE scheme is a welcome step, real change demands more — mandatory training, proper safety equipment, mechanised cleaning methods, and strict accountability from municipal bodies. Most importantly, society must shed its apathy and recognise that those who clean our cities deserve not just protection, but respect and dignity. Safai karamcharis are not the lowest — they are the bravest.

Mohammad Asad, Mumbai

From friendship to love, AI chatbots are becoming much more than just tools for youth, warn mental health experts

Economic Times | 2 days ago

