UAE experts warn AI may feel like a 'real' therapist, delaying mental health help
For 27-year-old Sara (name changed upon request), ChatGPT started out as a useful resource for work: she used it for fact-checking, clarifying ideas, and getting quick help on the go. Before long, however, its role began to shift into something more.
'I began using it during emotionally tough situations at work, in the family and even in relationships,' she said. 'I love how it analyses everything like it reads my mind. It gives me another perspective, and I love the reassurance I get from it. It makes me feel like my feelings are valid.'
Over time, Sara started using ChatGPT to reflect on her habits and personality. 'It became like a coach, helping me understand myself better,' she said. 'I'd love to go to therapy one day even if it's just for self-awareness. However, therapy can be expensive and out of budget sometimes. It's really comforting to have something private, discreet, and available 24/7. Especially when I feel a panic attack coming on.'
Experts say the growing trend of young people turning to ChatGPT for mental health support is 'not surprising' but is deeply 'concerning' for several reasons. 'It's not surprising that more young people are turning to AI platforms like ChatGPT for emotional support,' said Dr Alexandre Machado, Clinical Neuropsychologist at Hakkini mental health clinic in Dubai. 'It's easy, anonymous, and always available, kind of like having a friend in your pocket.'
Concerns, hidden dangers
However, the real danger lies hidden, said Dr Waleed Alomar, specialist psychiatrist at Medcare Royal Speciality Hospital in Al Qusais. 'It's concerning that some chatbots are presenting themselves as therapists,' he said. 'While users might [initially] be aware that they are chatting with a bot, many, particularly young people, can easily get carried away and start to feel like they are speaking to a real person or even a licensed professional.'
He added that this is a problem because artificial intelligence cannot always recognise the line between everyday sadness and a serious mental health condition.
'Since chatbots lack the credentials to diagnose or treat serious mental health conditions, they cannot connect users with human care when a person genuinely needs a mental health expert's support,' he said. 'While a chatbot may offer a brief sense of relief, it might also delay people from pursuing the professional help they truly need, leaving them feeling even more isolated.'
His comments were echoed by Dr Alexandre, who pointed to real-world cases that show how dangerous the trend can be. 'For example, a man in Belgium ended his life after being influenced by a bot, and a young boy in the UK once tried to assassinate the queen based on AI advice,' he said. 'These cases show how dangerous it can be to rely on unregulated AI for emotional support.'
Benefits of 'instant' support
Despite the obvious concerns, the experts agree that there are some benefits to having AI as a mental health support tool. 'AI tools are accessible anytime, which they may find especially helpful during those late-night hours when emotions can feel overwhelming,' said Dr Waleed. 'For a generation that has grown up with on-demand services, having support available 'anytime' is a real breakthrough. Also, using AI for mental health support provides a sense of anonymity and a non-judgmental space.'
Dr Alexandre added that while these tools cannot replace a therapist, they can help in some situations. 'It's important to remember that AI can't adapt like a human can,' he said. 'Use it as a tool, but don't let it take over.'