
Carvykti interactions: Alcohol, medications, and other factors
Related Articles


Daily Mail
13 minutes ago
I'm a relationships expert: these are the commonly missed signs that your female friends are TOXIC (and how to cut them off)
I have gone through more friendship break-ups than I care to admit and, controversially, I believe that makes me a better friend. It might even keep me younger too. A study last week revealed that toxic friendships cause premature biological ageing, comparable to that triggered by smoking. New York University found that social exchanges with so-called frenemies can cause chemical changes to DNA by keeping the body in a state of high stress.


The Guardian
an hour ago
Using Generative AI for therapy might feel like a lifeline – but there's danger in seeking certainty in a chatbot
Tran* sat across from me, phone in hand, scrolling. 'I just wanted to make sure I didn't say the wrong thing,' he explained, referring to a recent disagreement with his partner. 'So I asked ChatGPT what I should say.'

He read the chatbot-generated message aloud. It was articulate, logical and composed - almost too composed. It didn't sound like Tran, and it definitely didn't sound like someone in the middle of a complex, emotional conversation about the future of a long-term relationship. Nor did it mention any of Tran's own behaviours contributing to the relationship strain, which he and I had been discussing.

Like many others I've seen in therapy recently, Tran had turned to AI in a moment of crisis. Under immense pressure at work and facing uncertainty in his relationship, he'd downloaded ChatGPT on his phone 'just to try it out'. What began as a curiosity soon became a daily habit: asking questions, drafting texts, even seeking reassurance about his own feelings. The more Tran used it, the more he began to second-guess himself in social situations, turning to the model for guidance before responding to colleagues or loved ones. He felt strangely comforted, as if 'no one knew me better'. His partner, on the other hand, began to feel like she was talking to someone else entirely.

ChatGPT and other generative AI models present a tempting accessory, or even alternative, to traditional therapy. They're often free, available 24/7 and can offer customised, detailed responses in real time. When you're overwhelmed, sleepless and desperate to make sense of a messy situation, typing a few sentences into a chatbot and getting back what feels like sage advice can be very appealing. But as a psychologist, I'm growing increasingly concerned about what I'm seeing in the clinic: a silent shift in how people are processing distress, and a growing reliance on artificial intelligence in place of human connection and therapeutic support.

AI might feel like a lifeline when services are overstretched - and make no mistake, services are overstretched. Globally, in 2019 one in eight people were living with a mental illness, and we face a dire shortage of trained mental health professionals. In Australia, a growing mental health workforce shortage is limiting access to trained professionals. Clinician time is one of the scarcest resources in healthcare. It's understandable, even expected, that people are looking for alternatives.

Turning to a chatbot for emotional support isn't without risk, however, especially when the lines between advice, reassurance and emotional dependence become blurred. Many psychologists, myself included, now encourage clients to set boundaries around their use of ChatGPT and similar tools. Their seductive always-on availability and friendly tone can unintentionally reinforce unhelpful behaviours, especially for people with anxiety, OCD or trauma-related issues. Reassurance-seeking, for example, is a key feature of OCD, and ChatGPT, by design, provides reassurance in abundance. It never asks why you're asking again. It never challenges avoidance. It never says, 'let's sit with this feeling for a moment, and practise the skills we have been working on'.

Tran often reworded prompts until the model gave him an answer that 'felt right'. But this constant tailoring meant he wasn't just seeking clarity; he was outsourcing emotional processing. Instead of learning to tolerate distress or explore nuance, he sought AI-generated certainty. Over time, that made it harder for him to trust his own instincts.

Beyond the psychological concerns, there are real ethical issues. Information shared with ChatGPT isn't protected by the same confidentiality standards that bind Ahpra-registered professionals. Although OpenAI states that user data is not used to train its models unless permission is given, the sheer volume of fine print in user agreements often goes unread. Users may not realise how their inputs can be stored, analysed and potentially reused.

There's also the risk of harmful or false information. Large language models are autoregressive: they predict the next word based on previous patterns. This probabilistic process can lead to 'hallucinations': confident, polished answers that are completely untrue. AI also reflects the biases embedded in its training data. Research shows that generative models can perpetuate and even amplify gender, racial and disability-based stereotypes - not intentionally, but unavoidably. Human therapists, by contrast, bring clinical skills; we notice when a client's voice trembles, or when their silence might say more than words.

This isn't to say AI can't have a place. Like many technological advances before it, generative AI is here to stay. It may offer useful summaries, psycho-educational content or even support in regions where access to mental health professionals is severely limited. But it must be used carefully, and never as a replacement for relational, regulated care.

Tran wasn't wrong to seek help. His instinct to make sense of distress and to communicate more thoughtfully was sound. But leaning so heavily on AI meant his own skill development suffered. His partner began noticing a strange detachment in his messages. 'It just didn't sound like you,' she later told him. It turned out it wasn't. She also grew frustrated by the lack of accountability in his messages, which caused further friction and communication problems between them.

As Tran and I worked together in therapy, we explored what led him to seek certainty in a chatbot. We unpacked his fears of disappointing others, his discomfort with emotional conflict and his belief that perfect words might prevent pain. Over time, he began writing his own responses: sometimes messy, sometimes unsure, but authentically his.

Good therapy is relational. It thrives on imperfection, nuance and slow discovery. It involves pattern recognition, accountability and the kind of discomfort that leads to lasting change. A therapist doesn't just answer; they ask and they challenge. They hold space, offer reflection and walk with you, while also holding up an uncomfortable mirror.

For Tran, the shift wasn't just about limiting his use of ChatGPT; it was about reclaiming his own voice. In the end, he didn't need a perfect response. He needed to believe that he could navigate life's messiness with curiosity, courage and care - not perfect scripts.

*Name and identifying details changed to protect client confidentiality

Carly Dober is a psychologist living and working in Naarm/Melbourne

In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and at MensLine on 1300 789 978. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In the US, call or text Mental Health America at 988.


BBC News
3 hours ago
Government plans to bring more Gazan children to UK for treatment
Plans to evacuate more seriously ill or injured children from Gaza and bring them to the UK for medical treatment are being carried out "at pace", the government says. The BBC understands the aim is to bring the scheme into operation as quickly as possible. It is unclear how many children might be involved, but the Sunday Times reports the government is to allow up to 300 young people to enter the UK to receive free medical care.

Gazan children have already been brought privately to the UK for medical treatment through an initiative by Project Pure Hope, but the government has so far not evacuated any through its own scheme during the conflict.

A government spokesperson said on Sunday the plan was to "evacuate children from Gaza who require urgent medical care," adding: "we are working at pace to do so as quickly as possible."

More than 50,000 children have been killed or injured since the war in Gaza began in October 2023, according to the UN children's agency Unicef.

A Foreign Affairs Committee report published at the end of July said the government had "declined to support a medical evacuation of critically injured children to the UK, involving coordinating travel permits, medical visas and safe transport to the UK, where the children can receive specialised care unavailable to them in Gaza". Following that, Prime Minister Sir Keir Starmer said the UK was "urgently accelerating efforts" to evacuate children who needed critical medical assistance.

According to the Sunday Times, the government scheme will require a parent or guardian to accompany each child, and the Home Office will carry out biometric and security checks before they travel.

Project Pure Hope, a British organisation which includes volunteer medical professionals, has so far brought three children to the UK. The most recent, 15-year-old Majd al-Shagnobi, arrived in the UK last week. He required complex facial reconstructive surgery after an Israeli tank shell destroyed his jaw while he was trying to access aid in February 2024. He was the first Palestinian child to be flown to the UK for treatment for war injuries.

His evacuation was organised in conjunction with the US NGO Kinder Relief, which has helped other children from Gaza get medical treatment. His treatment, privately funded by Project Pure Hope, will begin at Great Ormond Street Hospital in London in the coming days, carried out by a medical team who will all work for free.

The organisation has been urging the government to establish a scheme similar to the one created for Ukrainian refugees, and welcomed the government's plan, saying it would be able to share its expertise from successful private evacuations. It said: "Our blueprint can help ensure the UK acts quickly and effectively, so that every child who needs urgent care has the best chance of survival and recovery."

In April, the group secured visas for two girls - 13-year-old Rama and five-year-old Ghena - to have privately funded operations in the UK for life-long medical conditions. They were brought to London after being evacuated to Egypt from Gaza. Ghena has had laser surgery to relieve the pressure in her left eye, which she was at risk of losing, and Rama has had exploratory surgery for a serious bowel condition. Their mothers say both girls are doing well.

Medics have been warning of shortages of vital food and medical supplies for weeks, after Israel began a months-long blockade of all aid and goods entering Gaza. The blockade has since been partially lifted, but humanitarian agencies have said more aid must be allowed into Gaza to prevent famine and malnutrition. The Hamas-run health ministry said 175 people, including 93 children, have died from malnutrition.

Israel denies it is deliberately blocking aid flowing into Gaza and accuses the UN and other aid agencies of failing to deliver it.

Since the start of the war, the UK has provided funds so that injured Gazans can be treated by hospitals in the region, and has also been working with Jordan to airdrop aid into the territory.

Sir Keir said last week that the UK would recognise a Palestinian state in September unless Israel took "substantive steps to end the appalling situation in Gaza" - a move Israeli Prime Minister Benjamin Netanyahu said "rewards Hamas's monstrous terrorism".

The Israel Defense Forces (IDF) launched a campaign in Gaza in response to the Hamas-led attack on southern Israel on 7 October 2023, in which about 1,200 people were killed and 251 others were taken hostage. More than 60,000 people have since been killed in Gaza, according to the Hamas-run health ministry.