Common medicines may not work for some people based on their DNA, experts find

The Sun · 4 days ago
A PILOT scheme has revealed a widespread genetic sensitivity to common medicines which could increase side effects or stop them working as they should.
The trial saw 2,200 adults undergo whole genome sequencing to analyse how their individual DNA responds to the likes of antibiotics and over-the-counter painkillers.
A staggering 99 per cent were found to carry a genetic variant that affects their sensitivity to certain medicines.
This could mean some drugs, including everyday over-the-counter painkillers, antibiotics and other prescription medications, won't work for some people because of their individual DNA.
The blood test, part of Bupa's My Genomic Health scheme, also looked at their genetic risk of developing 36 preventable diseases, including cancers, heart conditions and type 2 diabetes.
It found 91 per cent of participants were at risk of developing a disease with genetic and lifestyle risk factors, such as fatty liver disease, breast cancer and certain heart diseases.
Meanwhile, 73 per cent had multiple genetic variants that put them at raised risk of developing a condition that could be prevented or detected early, such as high cholesterol, skin cancer and type 2 diabetes, leading to better health outcomes.
And 49 per cent were found to be carriers of a genetic variant that could lead to a raised risk of certain conditions in future generations.
Following the successful pilot, Medication Check can now be purchased through Bupa, and will also be available to more than three million of its customers as part of its workplace health scheme.
A saliva test will establish which medications are most likely to be effective, which carry an increased risk of adverse side effects, and which won't work for them at all.
Dr Rebecca Rohrer, clinical innovation and genomics director for Bupa, said: 'We've long known that most medications only work for 30-50 per cent of the population.
'However, this pilot has highlighted just how significantly individual genomes impact the effectiveness of medications in treating conditions.
'With more than half of us regularly taking a prescription medication and an increasing number affected by a chronic condition, it's crucial that people are prescribed the right medicine from the start, tailored to their unique genetic makeup.
'In the longer term, genomics is key to early detection and even preventing some illnesses altogether.'
After completing the at-home medication check, patients will be offered a GP consultation with the healthcare provider to review any medications flagged in their genetic test results.
It comes as Bupa prepares to introduce two new products to its My Genomic Health suite later this year, which will help to prevent or detect illness earlier.
The DNA Health Check will give people early warning of an increased genetic risk of four conditions - breast cancer, prostate cancer, type 2 diabetes and cardiovascular disease.
The Advanced DNA Health Check, meanwhile, will combine insights on medication, disease risk, carrier status and traits, looking at the genetic risk of developing conditions such as heart disease, metabolic disease and 10 types of cancer.
Carlos Jaureguizar, CEO for Bupa Global, India & UK, said: 'Whole genomic sequencing is fundamentally changing our approach to healthcare, pivoting from treatment to prevention.
'It has the power to become a health passport that people can reference throughout their lives.
'We firmly believe genomics is the path to health innovation and prevention, reducing the nation's health burden and giving people personalised knowledge of their own genomic profile to live well for longer.'

Related Articles

Swinney welcomes bringing Gaza children to UK but 'regrets' it wasn't sooner

The Independent

15 minutes ago

First Minister John Swinney has criticised the UK Government for not acting sooner to bring children from Gaza to the UK for medical treatment.

Mr Swinney welcomed reported plans that up to 300 children could be flown from Gaza to be treated on the NHS. But he said he regretted the action did not come sooner.

The SNP leader said he had written to Prime Minister Sir Keir Starmer on July 9 urging such action to be taken.

He said: 'We have been consistently clear that the suffering being inflicted on the people of Gaza is beyond any justification.

'People in Gaza are being bombed and left to starve by Israel on a massive scale.

'I wrote to the Prime Minister on 9 July to request support from the UK Government in meeting the call from Unicef to provide medical care for children from Gaza.

'If the UK Government is prepared to evacuate Palestinians for medical treatment it would be entirely welcome.

'My only regret is the UK Government has taken this long to act.

'I urge the UK Government to do everything in its power to move swiftly so that lives can be saved. And Scotland will play our part.'

The evacuation plans are reportedly set to be announced within weeks.

A parent or guardian will accompany each child, as well as siblings if necessary, and the Home Office will carry out biometric and security checks before travel, the Sunday Times reported.

This will happen 'in parallel' with an initiative by Project Pure Hope, a group set up to bring sick and injured Gazan children to the UK privately for treatment.

More than 50,000 children are estimated to have been killed or injured in Gaza since October 2023, according to Unicef.

Sir Keir said last week that the UK was 'urgently accelerating' efforts to bring children over for treatment.

A UK Government spokesperson said: 'We are taking forward plans to evacuate more children from Gaza who require urgent medical care, including bringing them to the UK for specialist treatment where that is the best option for their care.

'We are working at pace to do so as quickly as possible, with further details to be set out in due course.'

Two in five Brits secretly suspect a friend is using weight-loss jabs… but are too scared to ask them

Daily Mail

16 minutes ago

Two in five Brits secretly suspect a friend is using weight loss treatments - but most are too scared to ask them about it, a new study reveals.

Research, conducted for Well Pharmacy, found four in 10 Brits believe a friend (30%) or a close friend (13%) has been using weight loss services to help manage their weight.

The findings suggest that all Brits can think of at least one person in their life who is currently using, or has in the past used, weight loss treatments to slim down.

However, the study also reveals three-quarters (75%) of Brits would never dream of asking a friend - or anyone they know - whether they'd had medical help to lighten the load.

The most common excuse for not inquiring is that it's 'none of their business' (49%), while over one in 10 (15%) worry they might offend and a similar number fear they might be wrong.

Some 1.5 million people in the UK use weight-loss treatments to manage their size [1], and that number continues to rise after GPs were given the green light to prescribe them last month [2].

According to the study, a quarter of Brits (24%) suspect a work colleague has used a weight loss service, while around one in 10 (13%) put a neighbour's change in appearance down to the drugs.

Gen X women are the age group most likely to believe a friend has used weight-loss treatments, but they are also the most afraid to find out. Eight in 10 (80%) women aged 44 to 59 admit they would prefer not to ask someone whether they had used treatments to shed the pounds.

The research comes as Well Pharmacy launches its in-store weight loss management service across all of its stores, offering one-to-one coaching with one of its expert pharmacists.

Well Pharmacy is one of the UK's largest pharmacy chains, with over 700 branches nationwide. Well provides services such as flu vaccinations, blood pressure checks and health advice, alongside prescription services.

The pharmacists will talk to patients about their weight loss journey and their objectives, as well as measuring their blood pressure and tracking their weight on a monthly basis.

Well Pharmacy's Mital Thakrar hopes that patients will see the benefit of the services offered.

He said: 'More and more people are turning to weight-loss treatments as a way to successfully manage their size.

'Here at Well Pharmacy, we believe it's important to consider the weight loss journey as a holistic one, where we can support the patient throughout their journey. Our face-to-face interaction ensures we can tailor the advice to the patient, and we can support them by measuring their blood pressure and BMI on a monthly basis in store. This way we can be confident that the treatments are safe and effective for patients and we can be on hand to answer any of their questions or concerns.

'During these one-to-one consultations, the patient can discuss all aspects of the treatment, whether that be increasing their dosage or stopping treatment.'

Each pharmacist across every Well store will ensure that patients are safely adhering to their weight loss programme.

Using Generative AI for therapy might feel like a lifeline – but there's danger in seeking certainty in a chatbot

The Guardian

an hour ago

Tran* sat across from me, phone in hand, scrolling. 'I just wanted to make sure I didn't say the wrong thing,' he explained, referring to a recent disagreement with his partner. 'So I asked ChatGPT what I should say.'

He read the chatbot-generated message aloud. It was articulate, logical and composed – almost too composed. It didn't sound like Tran. And it definitely didn't sound like someone in the middle of a complex, emotional conversation about the future of a long-term relationship. It also made no mention of Tran's own behaviours contributing to the relationship strain, which Tran and I had been discussing.

Like many others I've seen in therapy recently, Tran had turned to AI in a moment of crisis. Under immense pressure at work and facing uncertainty in his relationship, he'd downloaded ChatGPT on his phone 'just to try it out'. What began as a curiosity soon became a daily habit: asking questions, drafting texts, and even seeking reassurance about his own feelings. The more Tran used it, the more he began to second-guess himself in social situations, turning to the model for guidance before responding to colleagues or loved ones. He felt strangely comforted, like 'no one knew me better'. His partner, on the other hand, began to feel like she was talking to someone else entirely.

ChatGPT and other generative AI models present a tempting accessory, or even alternative, to traditional therapy. They're often free, available 24/7 and can offer customised, detailed responses in real time. When you're overwhelmed, sleepless and desperate to make sense of a messy situation, typing a few sentences into a chatbot and getting back what feels like sage advice can be very appealing.

But as a psychologist, I'm growing increasingly concerned about what I'm seeing in the clinic: a silent shift in how people are processing distress and a growing reliance on artificial intelligence in place of human connection and therapeutic support.

AI might feel like a lifeline when services are overstretched – and make no mistake, services are overstretched. Globally, in 2019 one in eight people were living with a mental illness, and we face a dire shortage of trained mental health professionals. In Australia, a growing mental health workforce shortage is limiting access to trained professionals. Clinician time is one of the scarcest resources in healthcare. It's understandable (even expected) that people are looking for alternatives.

Turning to a chatbot for emotional support isn't without risk, however, especially when the lines between advice, reassurance and emotional dependence become blurred. Many psychologists, myself included, now encourage clients to build boundaries around their use of ChatGPT and similar tools. Its seductive 'always-on' availability and friendly tone can unintentionally reinforce unhelpful behaviours, especially for people with anxiety, OCD or trauma-related issues. Reassurance-seeking, for example, is a key feature of OCD, and ChatGPT, by design, provides reassurance in abundance. It never asks why you're asking again. It never challenges avoidance. It never says, 'let's sit with this feeling for a moment, and practise the skills we have been working on'.

Tran often reworded prompts until the model gave him an answer that 'felt right'. But this constant tailoring meant he wasn't just seeking clarity; he was outsourcing emotional processing. Instead of learning to tolerate distress or explore nuance, he sought AI-generated certainty. Over time, that made it harder for him to trust his own instincts.

Beyond psychological concerns, there are real ethical issues. Information shared with ChatGPT isn't protected by the same confidentiality standards as Ahpra-registered professionals. Although OpenAI states that data from users is not used to train its models unless permission is given, the sheer volume of fine print in user agreements often goes unread. Users may not realise how their inputs can be stored, analysed and potentially reused.

There's also the risk of harmful or false information. These large language models are autoregressive: they predict the next word based on previous patterns. This probabilistic process can lead to 'hallucinations': confident, polished answers that are completely untrue. AI also reflects the biases embedded in its training data. Research shows that generative models can perpetuate and even amplify gender, racial and disability-based stereotypes – not intentionally, but unavoidably.

Human therapists also possess clinical skills; we notice when a client's voice trembles, or when their silence might say more than words.

This isn't to say AI can't have a place. Like many technological advancements before it, generative AI is here to stay. It may offer useful summaries, psycho-educational content or even support in regions where access to mental health professionals is severely limited. But it must be used carefully, and never as a replacement for relational, regulated care.

Tran wasn't wrong to seek help. His instincts to make sense of distress and to communicate more thoughtfully were logical. However, leaning so heavily on AI meant that his skill development suffered. His partner began noticing a strange detachment in his messages. 'It just didn't sound like you,' she later told him. As it turned out, it wasn't. She also became frustrated by the lack of accountability in his correspondence, and this caused more relational friction and communication issues between them.

As Tran and I worked together in therapy, we explored what led him to seek certainty in a chatbot. We unpacked his fears of disappointing others, his discomfort with emotional conflict and his belief that perfect words might prevent pain. Over time, he began writing his own responses, sometimes messy, sometimes unsure, but authentically his.

Good therapy is relational. It thrives on imperfection, nuance and slow discovery. It involves pattern recognition, accountability and the kind of discomfort that leads to lasting change. A therapist doesn't just answer; they ask and they challenge. They hold space, offer reflection and walk with you, while also holding up an uncomfortable mirror.

For Tran, the shift wasn't just about limiting his use of ChatGPT; it was about reclaiming his own voice. In the end, he didn't need a perfect response. He needed to believe that he could navigate life's messiness with curiosity, courage and care – not perfect scripts.

*Name and identifying details changed to protect client confidentiality

Carly Dober is a psychologist living and working in Naarm/Melbourne

In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and at MensLine on 1300 789 978. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In the US, call or text Mental Health America at 988 or chat
