RAK Hospital introduces 'Happy Gas' procedures for dental care

Khaleej Times, 23-06-2025
In a move to redefine patient comfort and ease dental anxiety, RAK Hospital has introduced 'Happy Gas' procedures — using nitrous oxide sedation — for the first time in the Northern Emirates. Commonly referred to as laughing gas, nitrous oxide is widely used in the Western world and is now available at RAK Hospital to help patients, especially adults with dental phobia, undergo procedures in a relaxed and pain-free state.
The service is led by Dr. Anurag Singh, Specialist Oral and Maxillofacial Surgeon. It is particularly suited to adult patients with dental phobia, enabling a calm, comfortable, and virtually stress-free experience during treatments ranging from fillings to surgical procedures.
'Dental anxiety is a very real and common issue,' said Dr. Anurag Singh. 'With Happy Gas, patients remain fully conscious yet relaxed and dissociated from fear. It makes procedures significantly more comfortable, especially for those who've had negative dental experiences in the past. We've already seen tremendous positive feedback from patients opting for this sedation.'
Widely used across Western countries, nitrous oxide sedation is a quick-acting and reversible method that allows patients to undergo treatments without the stress traditionally associated with dental visits. At RAK Hospital, the procedure is available on an outpatient basis. Patients can resume normal activities immediately post-treatment.
The Happy Gas session is priced competitively at Dh350.
Dr. Raza Siddiqui, Executive Director of RAK Hospital, added: 'We are proud to be the first in the Northern Emirates to introduce this globally trusted technique in our dental department. At RAK Hospital, we constantly strive to bring world-class innovations to enhance patient experience. Happy Gas is not just about comfort—it's about making dental care more approachable and fear-free for our diverse community.'
Although the service is currently available primarily for adults, there are plans to extend it to paediatric patients in due course. The hospital has already successfully treated multiple patients with this method and expects its popularity to grow rapidly.

Related Articles

Response Plus and Burjeel institute $1mn award for wellness in energy sector

Arabian Business

8 minutes ago

Response Plus Holding (RPM) and Burjeel Holdings will jointly recognise and reward excellence in physical and mental well-being across the global energy sector with a US$1 million Human Energy Health and Wellbeing Award, the first global award focused exclusively on the energy industry. The award will go to the organisation associated with the energy sector that introduces the most innovative, impactful, and measurable solution to enhance the physical and mental health and well-being of energy workers worldwide. The inaugural winner will be announced at ADIPEC in November 2025.

The awards are open to organisations of all sizes, with a particular focus on small- and medium-sized enterprises (SMEs) in the energy supply chain, where the financial incentive can deliver significant operational and cultural impact.

Submissions will be evaluated by an independent, international jury with expertise in health, wellness, and global leadership. Dr Richard Heron, former Vice President of Health and Chief Medical Officer at BP, will serve as Chair of the Jury. With over 15 years of experience leading global health strategies and serving on multiple international advisory boards, he brings deep insight into occupational health and well-being. Joining Dr Heron will be Vinay Menon, a renowned wellness coach known for his work with elite athletes, including Chelsea FC and the Belgium national team at the 2022 FIFA World Cup, and John Defterios, an award-winning journalist and economic analyst with over three decades of experience covering global energy and emerging markets.

The initiative addresses a growing awareness of the need to prioritise mental health, resilience, and holistic wellness as integral components of operational performance and employee engagement in the energy sector. Dr Rohil Raghavan, CEO of Response Plus Holding PJSC, said: 'The Human Energy Awards reflect RPM's commitment to placing people at the centre of progress. This initiative is about setting a new benchmark to recognise how the energy sector values and supports its workforce.'

John Sunil, CEO of Burjeel Holdings, added: 'Our partnership reflects a shared mission to prioritise the health of those driving the energy sector. This award highlights the vital role of preventive care and well-being in sustaining a resilient workforce.'

Gulf Pharmaceutical Industries sends 12 tonnes of vital medicine to people of Gaza

The National

11 minutes ago

Gulf Pharmaceutical Industries (Julphar) has sent a 12-tonne shipment of medical aid to hospitals in the Gaza Strip. The aid was sent in co-ordination with the Saqr bin Mohammed Al Qasimi Charity and Humanitarian Foundation, state news agency Wam reported. The shipment included essential medicines for conditions such as gastrointestinal disorders, antibiotics for bacterial and fungal infections, including skin conditions, as well as pain relievers and fever reducers.

The aid was delivered under the UAE's Operation Chivalrous Knight 3 project. Launched in 2023 by President Sheikh Mohamed, the drive has been carried out in collaboration with the Emirates Red Crescent and humanitarian and charitable institutions in the UAE. More than 55,000 tonnes of aid have been delivered to Gaza on more than 500 flights, six transport ships and 2,500 lorries. In a separate Birds of Goodness operation, more than 3,700 tonnes of humanitarian aid have been dropped by parachute into areas inaccessible over land.

No, ChatGPT can't be your mental health counsellor

Khaleej Times

an hour ago

As people across the world, especially in the UAE, turn increasingly to artificial intelligence (AI) as a convenient tool to navigate life, experts are becoming concerned that it is being used to work through some of their biggest emotional challenges in lieu of seeking a professional therapist.

Sreevidhya Kottarapat Srinivas, Clinical Psychologist at Medcare Camali Clinic, told wknd. that the growing dependence on ChatGPT and other AI tools for mental health guidance reflects a larger shift in the ways in which people are seeking help. 'It is like a 'quick fix', often providing immediate solutions. [Plus, it's] easily accessible and anonymous,' she said.

The trend is seen more in the younger generation, for whom AI is becoming the first point of contact to explore emotions or understand symptoms before reaching out to a professional. 'While this trend offers potential for early psychoeducation and de-stigmatising of mental health concerns, it should not be a substitute for qualified mental health professionals,' she explained.

Last year, a study by the Oliver Wyman Forum found that 36 per cent of Gen-Z and millennials would consider using AI for mental health support, while only 27 per cent of other generations would. The shift has been sparked by an uptick in mental health issues and awareness even as the stigma slowly lifts. Since the pandemic, there has been a 25-27 per cent rise in depression and anxiety, according to the World Health Organisation. And about half of the world's population is expected to experience a mental health disorder during their lifetime, according to researchers at Harvard Medical School and the University of Queensland.

Srinivas said concern arises when individuals begin to over-rely on AI responses, especially in complex or high-risk situations that require more hands-on solutions and the involvement of another human being with skills such as empathy, diagnostic clarity, and real-time crisis management. 'ChatGPT doesn't know the full context, and often emotions such as pain, trauma, and anger may not be well communicated over text. AI has its limitations in forming a therapeutic alliance, and offers limited accountability. So, while the advice may seem helpful on the face of it, it can undermine or miss the signs of underlying trauma, nuanced behaviour, or even reinforce cognitive distortions,' she said.

Dr Rebecca Steingiesser, a consultant clinical psychologist and clinical neuropsychologist based in Dubai, said the issue is becoming more prevalent. 'I'm hearing a lot about this now in my practice, with my clients using AI to help themselves organise their goals and make important life decisions, for example,' she said. 'It's obvious that AI is already beginning to reshape the landscape of therapy and mental health support. We are seeing the emergence of AI-powered tools offering psychoeducation, symptom checkers, journaling prompts, mood tracking, and even basic cognitive-behavioural interventions. These are what I would normally share with clients in sessions on paper forms,' she added.

She said that while these tools can be helpful adjuncts to therapy, particularly for monitoring progress between sessions or providing immediate, low-intensity in-the-moment support, they are not substitutes for the nuanced, relational, and highly individualised work that occurs in therapy.
'I've also seen individuals use it for exploring whether their experiences might be consistent with certain diagnoses, though that comes with serious risks, especially if they are making decisions about medications based on this information without consulting with their psychiatrists,' she added.

Devika Mankani, psychologist at The Hundred Wellness Centre Dubai, who has 25 years' experience, has seen the consequences in patients who relied on AI before turning to a professional. 'I've seen clients come into therapy after months of relying on AI tools. In one case, a woman believed she was in a 'toxic' marriage because ChatGPT repeatedly affirmed her frustrations without context or challenge. She later admitted what she needed was a safe space to explore her own patterns, not to be nudged toward an exit,' she said. 'In another case, a man with narcissistic traits used AI to validate his belief that others were always at fault, reinforcing his grandiosity and avoiding self-reflection.' She says that while the interaction may feel therapeutic at the time, it is not always rooted in clinical assessment, supervision, or accountability.

Srinivas explained that AI models are trained on generalised data and cannot replace clinical judgment. 'There is also a risk of emotional dependence on a system that cannot provide attuned human responsiveness, which is a key part of therapeutic healing,' she warned. She too has seen such cases first hand, with concerning consequences for those depending on the technology that has taken the world by storm. 'I've had clients mention they consulted ChatGPT about whether their partner had a narcissistic personality or whether they were struggling with attention deficit hyperactivity disorder (ADHD), often based on a list of traits and no proper assessment. In one case, a client who was a child completely withdrew from social interactions in the real world and would often communicate her thoughts and feelings through the app. When asked, she said: 'ChatGPT is my only friend.' This was a case of AI unintentionally validating a skewed narrative because of a lack of therapeutic insight.'

The stigma of seeking therapy also remains a deterrent, globally. According to a 2022 study in Springer Nature, 'Attitudes towards mental health problems in a sample of United Arab Emirates' residents', researchers said: 'Mental health issues are still stigmatised in the United Arab Emirates (UAE), possibly due to cultural reasons.' This attitude has of course undergone a change since then, with the UAE government making strides in de-stigmatising mental healthcare. Still, for some, it is easier to engage with AI than an actual person.

Dr Steingiesser says AI use is not only more common among younger adults and teens who are already comfortable with digital platforms and more open to experimenting with technology, but is also seen as less intimidating and less judgmental than a real-life therapist. 'That said, I'm also seeing an increase in busy professionals using AI for support in managing stress or burnout, particularly when access to therapy is delayed or limited due to long waitlists or challenges with a busy work schedule,' she added.

Context in using AI, she agrees, is key. Relying on an AI that lacks critical pieces of a complex human puzzle can be disastrous, especially for people making major life decisions such as ending relationships, changing careers, or self-diagnosing mental health conditions.
'It is also very clear that AI can't detect red flags for risk of harm, such as suicidal ideation, in the way a trained professional can: how a person presents in person, their energy, their demeanour. So many subtle indicators would never be picked up on,' she reasons, explaining why it is unsuitable for high-risk clients.

Reading between the lines

'Misdiagnosis, minimisation of distress, or reinforcement of harmful thinking patterns are very real concerns, and I always caution my clients against putting too much emphasis on it,' she explained.

Alarmingly, this month, during the first episode of OpenAI's official podcast, OpenAI CEO Sam Altman said he is surprised at how much people rely on ChatGPT: 'People have a very high degree of trust in ChatGPT, which is interesting, because AI hallucinates. It should be the tech that you don't trust that much.'

Srinivas said the issue sheds light on how challenging it is to access appropriate mental health services in many parts of the world. 'Turning to AI is not always a matter of preference; it might also be the only option for some individuals who find it difficult to access or afford mental health services,' she said. 'Policymakers need to prioritise community mental health funding, make insurance coverage available, and make mental health a part of primary care. If this gap in accessible and available services is not addressed, we are only increasing the risk that individuals have to resort to alternatives that are potentially harmful and inadequate.'

Mankani agrees: 'This trend is not going away. Our responsibility is to build a future where technology supports human flourishing, without replacing human care.'
