
Medicare and Medicaid move toward covering Ozempic and other weight loss drugs
State Medicaid programs and Medicare Part D plans could soon be able to voluntarily cover these drugs for 'weight management,' The Washington Post reports, citing internal documents from the Centers for Medicare and Medicaid Services.
The proposed plan, which has not been finalized, would start in April 2026 for Medicaid and in January 2027 for Medicare.
Covering the drugs for weight loss would cost Medicare an estimated $35 billion from 2026 to 2034, the Post reports. However, Medicare is negotiating lower prices for Ozempic and Wegovy that would take effect in 2027.
Currently, the popular drugs can cost upwards of $1,200 a month.
These negotiations could save consumers money. Some experts have argued that Medicare prices can serve as a benchmark for private insurance companies and lead to savings, according to the health non-profit Kaiser Family Foundation. More insurers could also be pressured to provide coverage for these medications for weight loss if states opt into the proposed program.
Many GLP-1 medications, such as Ozempic, are intended to treat Type 2 diabetes, but in recent years, millions of Americans have turned to them for weight loss.
Medicare currently covers GLP-1 drugs primarily for Type 2 diabetes treatment, while some private insurance companies already cover the drugs for weight loss. A 2024 Kaiser Family Foundation survey found that more than half of adults said the cost made the drugs difficult to afford.
It's unclear how many states might opt in to the program. Thirteen state Medicaid programs have already chosen to cover GLP-1s for weight loss, the Post reports.
Novo Nordisk, the company that makes the GLP-1 drugs Ozempic and Wegovy, told the Post it believes 'comprehensive coverage through government and commercial insurance plans is critical to affordable health care and treatment options.'
The proposal comes after the Trump administration said in April that Medicaid and Medicare would not cover GLP-1 drugs for weight loss, the Post reports, ending an earlier plan under Joe Biden's administration to cover the drugs.
Members of the Trump administration may also be divided on the issue. Centers for Medicare and Medicaid Services Administrator Mehmet Oz has previously called the drugs a 'big help,' while Health and Human Services Secretary Robert F. Kennedy Jr. has raised concerns about their cost, the Post reports.
A Centers for Medicare and Medicaid Services spokesperson declined to comment on the proposal.
'All drug coverages undergo a cost-benefit review,' the spokesperson said in a statement. 'CMS does not comment on potential models or coverage.'
Related Articles


Daily Mail
10 minutes ago
Tourist's life-changing injuries after swimming in filthy hotel pool and contracting horrific infection
A woman endured a horrific infection and agonizing injuries after she swam in a hotel's indoor pool that her lawyers say wasn't treated with chlorine.

Alexis Williams, 23, was staying at the Residence Inn Downtown Ann Arbor Hotel in Michigan in June while visiting her grandmother, who was having a procedure at a nearby medical center. She decided to take a swim in the hotel's pool with her cousins, who soon became violently ill and started vomiting.

Williams scraped her knee while she was swimming and contracted a rare infection: MRSA, or methicillin-resistant Staphylococcus aureus. MRSA is a type of staph bacteria that is resistant to many common antibiotics, according to the Centers for Disease Control and Prevention.

Within hours of swimming in the pool, Williams was overcome with severe pain and couldn't even walk, she recalled to local news. 'It was outrageous,' she told local Fox affiliate Fox 2 Detroit. 'The pain was excruciating. I had to get poked a lot with a whole bunch of needles, and being prescribed medications I never thought I'd be prescribed to.'

Williams had three surgeries on her leg and remains on strong IV antibiotics, according to her lawyer, Ven Johnson. She now receives constant medication through intravenous therapy and needs a walker. Williams even feared that her leg would have to be amputated. 'I've gone through a lot of pain and suffering, and still currently am,' she told the Detroit Free Press. 'I'm very frightened, very nervous and just appalled by everything.'

Williams' lawyers obtained records from the Michigan Department of Environment, Great Lakes, and Energy that they say reveal the squalid condition of the hotel's pool. Her lawyers said inspections of the swimming pool on June 12, June 27, and July 8 showed no chlorine or bromine in the water. The civil complaint argues that 'the hotel knew that its swimming pool had a Standard Plate Count that exceeded 200 CFU/ml, which indicates a dangerous level of bacteria present in the swimming pool and poor disinfection.'

Williams' lawyers believe the hotel knew the pool lacked these chemicals and had improper pH levels, and they accused the hotel of disregarding public safety and creating an unsafe environment for guests. 'Alexis started developing this infection within several hours of coming into contact with this water,' Michael Freifeld, an attorney on Williams' legal team, told the Detroit Free Press. 'We have no doubt, given the records we have and the experts that we are going to hire, that the infection Alexis experienced, and is experiencing, was clearly connected to the pool.'

Johnson added that Williams still has a long road ahead of her, and doctors have said they may have to amputate if the infection isn't brought under control. 'For anybody, let alone a 23-year-old young person, it's a very scary, uncertain prognosis,' Johnson said.

The lawsuit is seeking $25,000 in damages. Daily Mail reached out to First Martin Corporation, which owns the Residence Inn, for comment on the accusations.


Daily Mail
7 hours ago
The unlikely group 'destined' to be struck by Alzheimer's disease as early as 40
Alzheimer's disease is largely seen as a disease of old age. The most common form of memory-robbing dementia, Alzheimer's affects nearly 7 million Americans, most of whom are over the age of 65.


The Guardian
8 hours ago
Using generative AI for therapy might feel like a lifeline – but there's danger in seeking certainty in a chatbot
Tran* sat across from me, phone in hand, scrolling. 'I just wanted to make sure I didn't say the wrong thing,' he explained, referring to a recent disagreement with his partner. 'So I asked ChatGPT what I should say.'

He read the chatbot-generated message aloud. It was articulate, logical and composed – almost too composed. It didn't sound like Tran. And it definitely didn't sound like someone in the middle of a complex, emotional conversation about the future of a long-term relationship. Nor did it mention any of Tran's own behaviours contributing to the relationship strain, which he and I had been discussing.

Like many others I've seen in therapy recently, Tran had turned to AI in a moment of crisis. Under immense pressure at work and facing uncertainty in his relationship, he'd downloaded ChatGPT on his phone 'just to try it out'. What began as a curiosity soon became a daily habit: asking questions, drafting texts, even seeking reassurance about his own feelings. The more Tran used it, the more he began to second-guess himself in social situations, turning to the model for guidance before responding to colleagues or loved ones. He felt strangely comforted, as if 'no one knew me better'. His partner, on the other hand, began to feel like she was talking to someone else entirely.

ChatGPT and other generative AI models present a tempting accessory, or even alternative, to traditional therapy. They're often free, available 24/7 and can offer customised, detailed responses in real time. When you're overwhelmed, sleepless and desperate to make sense of a messy situation, typing a few sentences into a chatbot and getting back what feels like sage advice can be very appealing. But as a psychologist, I'm growing increasingly concerned about what I'm seeing in the clinic: a silent shift in how people are processing distress, and a growing reliance on artificial intelligence in place of human connection and therapeutic support.

AI might feel like a lifeline when services are overstretched – and make no mistake, services are overstretched. Globally, in 2019 one in eight people were living with a mental illness, and we face a dire shortage of trained mental health professionals. In Australia, a growing mental health workforce shortage is affecting access to trained professionals. Clinician time is one of the scarcest resources in healthcare. It's understandable (even expected) that people are looking for alternatives.

Turning to a chatbot for emotional support isn't without risk, however, especially when the lines between advice, reassurance and emotional dependence become blurred. Many psychologists, myself included, now encourage clients to build boundaries around their use of ChatGPT and similar tools. Their seductive 'always-on' availability and friendly tone can unintentionally reinforce unhelpful behaviours, especially for people with anxiety, OCD or trauma-related issues. Reassurance-seeking, for example, is a key feature of OCD, and ChatGPT, by design, provides reassurance in abundance. It never asks why you're asking again. It never challenges avoidance. It never says, 'let's sit with this feeling for a moment, and practice the skills we have been working on'.

Tran often reworded prompts until the model gave him an answer that 'felt right'. But this constant tailoring meant he wasn't just seeking clarity; he was outsourcing emotional processing. Instead of learning to tolerate distress or explore nuance, he sought AI-generated certainty. Over time, that made it harder for him to trust his own instincts.

Beyond psychological concerns, there are real ethical issues. Information shared with ChatGPT isn't protected by the same confidentiality standards that bind Ahpra-registered professionals. Although OpenAI states that data from users is not used to train its models unless permission is given, the sheer volume of fine print in user agreements often goes unread. Users may not realise how their inputs can be stored, analysed and potentially reused.

There's also the risk of harmful or false information. Large language models are autoregressive: they predict the next word based on previous patterns. This probabilistic process can lead to 'hallucinations' – confident, polished answers that are completely untrue. AI also reflects the biases embedded in its training data. Research shows that generative models can perpetuate and even amplify gender, racial and disability-based stereotypes – not intentionally, but unavoidably. Human therapists, by contrast, bring clinical skills the models lack: we notice when a client's voice trembles, or when their silence might say more than words.

This isn't to say AI can't have a place. Like many technological advancements before it, generative AI is here to stay. It may offer useful summaries, psycho-educational content or even support in regions where access to mental health professionals is severely limited. But it must be used carefully, and never as a replacement for relational, regulated care.

Tran wasn't wrong to seek help. His instincts to make sense of distress and to communicate more thoughtfully were sound. But leaning so heavily on AI meant his own skill development suffered. His partner began noticing a strange detachment in his messages. 'It just didn't sound like you,' she later told him. It turned out it wasn't. She also grew frustrated by the lack of accountability in his messages to her, which caused further friction and communication problems between them.

As Tran and I worked together in therapy, we explored what led him to seek certainty in a chatbot. We unpacked his fears of disappointing others, his discomfort with emotional conflict and his belief that perfect words might prevent pain. Over time, he began writing his own responses: sometimes messy, sometimes unsure, but authentically his.

Good therapy is relational. It thrives on imperfection, nuance and slow discovery. It involves pattern recognition, accountability and the kind of discomfort that leads to lasting change. A therapist doesn't just answer; they ask, and they challenge. They hold space, offer reflection and walk with you, while also holding up an uncomfortable mirror.

For Tran, the shift wasn't just about limiting his use of ChatGPT; it was about reclaiming his own voice. In the end, he didn't need a perfect response. He needed to believe that he could navigate life's messiness with curiosity, courage and care – not perfect scripts.

*Name and identifying details changed to protect client confidentiality.

Carly Dober is a psychologist living and working in Naarm/Melbourne.

In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and MensLine on 1300 789 978. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In the US, call or text Mental Health America at 988 or chat