Will AI replace your psychologist?

The Citizen | 17 hours ago

Ever since the Tamagotchi virtual pet was launched in the late 90s, humankind's relationship with machines and technology has steadily ramped up to where we are today. In a world where binary code controls almost every action and reaction, the way we communicate has changed. We either talk to one another through machines or cut out people completely and chinwag with chatbots. And it's everywhere.
Mental health support has joined the autotune queue. Generative artificial intelligence tools programmed for the therapeutic space deliver quick access, affordability and machine empathy on demand. Virtual assistants like Woebot and Wysa reach out their virtual hands of measurement and method. These platforms track moods, prompt reflective moments and dish out neatly packaged advice dug deep from within their code. Their appeal is obvious, said medical doctor and psychologist Dr Jonathan Redelinghuys. 'They're anonymous, instant and never overbooked.'
AI-based chatbots significantly reduced symptoms of depression
A review published in 2023 considered more than 7000 academic records, narrowed them down to 35 studies and came to interesting conclusions. It found that AI-based chatbots significantly reduced symptoms of depression and distress, especially when embedded into instant messaging apps. While results were promising for clinically diagnosed patients and elderly users who may teeter on the edge of mental wellness, the same review noted that the technology didn't significantly improve broader psychological well-being.
Relief, yes. Recovery, not so much, said Dr Redelinghuys. 'The usefulness of technology should not be confused with therapeutic depth,' he said. 'There's value in having something to turn to in moments of need, but that doesn't make it therapy. Therapy is relational. It's anchored in nuance and emotional feedback, which a machine just doesn't have.' Emotional intelligence is still a human trait, and while a computer or an app can pretend to understand, it does not and cannot process grief, shame or longing. 'It can't notice when someone's about to cry but doesn't. It won't pause, adjust tone or sit in silence when silence says more than words,' said Dr Redelinghuys.
AI can't notice when someone's about to cry
A review done by the University of California in 2019 explored how AI could predict and classify mental health issues using everything from electronic health records and brain imaging to smartphone data and social media activity. The findings showed strong predictive capabilities, but limitations in scale and applicability. Most of the underlying studies were small, and there is a risk of over-generalisation when mental health is, after all, unique to each individual.
Human therapists adapt on the go based on patient input, said Dr Redelinghuys. 'Humans pick up what's not being said, read body language and know when to sit back or take note. A machine can't go beyond what it was programmed to do. It can learn language, it can talk back, but it can't feel you. 'Therapy is a process that involves building a relationship with someone who gets to know you over time. Support isn't always about saying the right thing because it or you are hardwired to do so. Sometimes it's about sitting with someone in discomfort until they find their own way through.'
Healing is not plug-and-play
Remember, said Dr Redelinghuys, healing is not a plug-and-play process. The role of AI can be supportive and even provide a measure of comfort, he said. 'But it cannot replace humanness.'
Online, opinions vary on channels like Reddit. Some users report positive outcomes with chatbots, especially in managing day-to-day anxiety or spirals. Others use them for mood tracking, diary prompts and even crisis moments. But those dealing with trauma, identity confusion or challenging emotional issues often find AI support limited and, as one user called it, emotionally sterile. 'Uncoded or human therapists come with ethical standards, formal training and legal responsibilities. They are accountable,' said Dr Redelinghuys. 'Chatbots and their programmers are not held to answer. Confidentiality might be implied, but there are no professional boards or licensing bodies governing a chatbot's conduct. Data privacy is a real concern.'