I'm an addiction doctor, and I can't get lifesaving meds for many of my patients

Yahoo · 20-02-2025
Opioid overdoses kill more than three Arizonans every day.
Luckily, we have an effective treatment: buprenorphine and methadone, collectively referred to as medications for opioid use disorder (MOUD). Not only do they treat the bone-crushing symptoms of opioid withdrawal, but they also keep people in addiction treatment and, most importantly, alive.
Public health experts consider these medications to be the gold standard treatment for opioid use disorder.
Unfortunately, it's easier to get the poison than the cure.
Cheap fentanyl can be found on the streets, but MOUD is far harder to come by. Few physicians prescribe these lifesaving medications, and much of rural Arizona has no access to a methadone clinic.
And many of the doctors who do prescribe MOUD are constrained by insurance companies' prior authorization requirements.
If a medication requires prior authorization, a doctor must first submit a request for the insurance company to approve — essentially arguing why they are recommending a particular treatment for their patient.
If approved, the insurer will cover the bill. If not, the patient is on the hook, even if their doctor is adamant about the care they prescribed.
Although prior authorizations claim to save money for the health system and protect resources, they can actually increase overall costs by delaying access to necessary care, contributing to preventable hospitalizations and wasting health care providers' time — something that could be better spent seeing patients.
On average, physicians and their staff spend 12 hours a week fighting the red tape of prior authorization denials, according to a recent American Medical Association survey.
Perhaps it is no surprise that only 4.1% of overdose survivors receive MOUD, with an average delay of 72 days. But for those who do, the results are striking: they are at least 52% less likely to die from a subsequent overdose.
Recognizing these challenges, the American Medical Association recommends removing all prior authorization requirements for buprenorphine. Similarly, the American Society of Addiction Medicine recommends that payers 'eliminate prior authorization requirements for all formulations of addiction medications.'
As an addiction physician at the University of Arizona, I have cared for many patients who need a long-acting, injectable form of buprenorphine. Instead of having pills that they forget to take or that get stolen, they could just come to the clinic once a month for an injection.
But because of prior authorization requirements around this FDA-approved formulation, many of my patients cannot access the medication they need.
Inevitably, I get a call — they are back in the hospital.
They couldn't afford their prescription. They relapsed onto fentanyl. And then they got sick again, whether it be from a recurrence of their soft tissue infection or an exacerbation of their heart failure.
This costly readmission could have been prevented if only I had been allowed to treat my patient according to my own medical judgment.
To fix this problem, Arizona lawmakers have introduced House Bill 2674 to ensure that prior authorization requirements within the state's Medicaid program do not limit doctors from prescribing life-saving medications to those who need them.
While opponents may point to the potential cost of implementing this policy, there is significant funding available to support this mandate; Arizona has almost $500 million of opioid settlement funds remaining.
Furthermore, this bill will not open a floodgate of prescribers. Across the country, similar regulatory changes designed to remove treatment barriers have not significantly increased the amount of MOUD prescribed. And given how few addiction specialists there are, few clinicians even know how to start patients on these medications.
But for the clinicians who are willing, let's remove the regulatory obstacles in their path. By treating addiction like the chronic disease it is, HB 2674 offers a unique opportunity to combat the opioid epidemic and make life better for all Arizonans.
Dr. Melody Glenn is an addiction and emergency physician at The University of Arizona. Her first book, 'Mother of Methadone,' is forthcoming from Beacon Press this July. Reach her at melodyglenn@arizona.edu.
This article originally appeared on Arizona Republic: Fentanyl is easy to find. Addiction medications? Not so much | Opinion

Related Articles

Missouri AG sues Planned Parenthood for allegedly lying about dangers of abortion drugs

The Hill · 24 minutes ago

Missouri is suing Planned Parenthood for allegedly lying to patients about the risks of the abortion medication mifepristone.

Missouri Attorney General Andrew Bailey (R) filed a lawsuit Wednesday in Jefferson City arguing that Planned Parenthood's claims that the abortion drug is safer than many other medications, including penicillin and Tylenol, are untrue and violate the state's consumer-protection law. Bailey claims that the nonprofit organization has lied about the safety of the drug to 'cut costs and boost revenue,' according to the lawsuit.

The complaint also requests a court order to stop Planned Parenthood from 'continuing to promote the falsehoods' in Missouri and for the organization to pay more than $1.8 million in civil penalties. The attorney general's office is also asking for the organization to be fined $1,000 in damages for every woman in the Show-Me State who has received abortion medication through one of its providers in the past five years. On top of this, it asks that the organization reimburse the state for Medicaid and other taxpayer-funded emergency care provided to people who suffered complications after taking mifepristone.

'We are going to hold these charlatans and death dealers accountable,' Bailey wrote in a post on social platform X about the suit.

The crux of the lawsuit's argument comes down to a disagreement over how many people suffer adverse health effects after taking mifepristone. The Food and Drug Administration's (FDA) warning label for the drug states that between 2.9 percent and 4.6 percent of people who have taken it along with misoprostol report visiting an emergency room afterward. Two drugs are typically needed for a medication abortion: mifepristone and misoprostol. Mifepristone stops the pregnancy from growing, while misoprostol induces cramping and bleeding to empty the uterus.

More than 100 scientific studies across decades have looked at the efficacy and safety of the pair, and all of them have found the drugs safe for use, according to an analysis from The New York Times.

Bailey's lawsuit claims that the FDA's label is inaccurate and that 'recent studies' suggest the complication rate is much higher. The lawsuit does not cite a specific study to back up its claim, and a spokesperson for the attorney general's office did not answer questions from The Hill about what data was used.

The lawsuit does echo findings outlined in a deeply flawed study published in April by the conservative think tank the Ethics and Public Policy Center (EPPC), which states that after analyzing more than 865,000 prescribed mifepristone abortions, it determined that nearly 11 percent of women experienced a 'serious adverse event.' That's nearly 22 times higher than what the FDA reports. Bailey's lawsuit references a 'dataset' of more than 850,000 mifepristone abortions that identified 'serious adverse events' in more than 10 percent of women who took the drug.

Medical researchers have criticized the EPPC study for its lack of transparency and for flaws in its methodology. One of the largest problems, health experts say, is its inclusion of emergency room visits among the 'serious adverse events' that can happen after taking the abortion pill. The EPPC study breaks down 'serious adverse events' into categories including hemorrhage, sepsis and emergency room visits, and it appears that emergency room visits were counted as adverse events even if health care workers determined the patient was healthy and released them without treatment. Some people might go to an emergency room after taking the abortion pill to confirm that they are no longer pregnant or to make sure the bleeding they are experiencing is normal, two principal research scientists at the Guttmacher Institute noted in an op-ed last month.
The lawsuit is the latest attack from conservative lawmakers on Planned Parenthood. Under the GOP's new tax and spending bill, the organization would lose its ability to receive Medicaid reimbursements for health services it provides for one year. The nonprofit sued the Trump administration over the provision and a federal judge granted the organization's request for a temporary injunction earlier this week.

GE HealthCare drives growth with investment in AI-enabled medical devices and tops FDA's list of AI authorizations for 4th year with 100

Yahoo · 2 hours ago

Increased R&D investments to integrate AI on devices across disease states are designed to boost productivity, efficiency, and diagnostic confidence for healthcare professionals, and to drive the company's long-term growth. The milestone advances GE HealthCare's goal of attaining more than 200 authorizations by 2028.

CHICAGO, July 23, 2025--(BUSINESS WIRE)--GE HealthCare (Nasdaq: GEHC) has topped a U.S. Food and Drug Administration (FDA) list of AI-enabled medical device authorizations for the fourth year in a row, with 100 listed authorizations to date in the U.S. This milestone reflects GE HealthCare's continued research and development (R&D) investment and focus on developing AI solutions to advance precision care by enhancing medical devices across the care journey.

Smart devices, software, and cloud-based solutions, which are central to GE HealthCare's precision care strategy, help enhance outcomes for patients, improve the daily work of care teams, and boost healthcare professional efficiency. These AI-enabled devices help solve customer challenges and are in high demand, which contributes to orders, revenue, and growth for the company. The momentum demonstrates GE HealthCare's progress toward achieving its goal of securing more than 200 authorizations.

"Our sustained leadership in AI-enabled medical devices reflects our commitment to research and development, which is powering the creation of next-generation solutions. These solutions are designed to address the toughest challenges our customers are facing, including care team shortages and burnout, rising costs, and inefficient workflows," said Dr. Taha Kass-Hout, GE HealthCare's Global Chief Science and Technology Officer. "As we continue to drive the industry forward, we remain committed to doing so in a responsible way, building in our Responsible AI principles at every stage of our product development, which include a focus on safety, validity, transparency, explainability, and fairness."

The FDA's webpage, Artificial Intelligence-Enabled Medical Devices, provides a list of device authorizations granted through 510(k) clearances, De Novo requests, or premarket approval (PMA). GE HealthCare's 100 authorizations to date demonstrate innovation across imaging modalities and care pathways, including oncology, cardiology, and neurology, helping to ease the burden of care and improve workflows for healthcare systems. Examples of GE HealthCare's AI solutions that are helping solve customer challenges and driving growth include:

AI-based Auto Positioning uses deep learning to automatically detect anatomical landmarks, which are used to determine the patient's orientation inside computed tomography (CT) and positron emission tomography (PET)/CT devices, including the Revolution Apex platform and Omni Legend. The solution helps reduce the actions required by technologists to a single-click operation, enabling faster patient positioning compared to traditional manual positioning.1

AIR™ Recon DL is a pioneering deep learning algorithm for image reconstruction that enables radiologists to achieve pin-sharp images more quickly. By combining magnetic resonance imaging (MRI) with deep learning, AIR™ Recon DL reduces artifacts, enhances image clarity, and shortens scan times by up to 50%.2 It is estimated that more than 50 million patients have been scanned with it since its launch in 2020.3

The LOGIQ™ Series ultrasound portfolio empowers clinicians to scan, diagnose, and treat a wide range of patients and conditions. With AI-powered automation, real-time workflow enhancements, and exceptional image quality, the LOGIQ Series is designed to facilitate faster, more efficient scanning and support diagnostic precision. Intelligent anatomy recognition enables dynamic image optimization as well as repeatable and reproducible automated measurements and results, providing elevated accuracy and greater diagnostic confidence.

Precision DL is deep learning-based image processing, available on the Omni Legend PET/CT system, that enhances image quality in PET/CT scans. It provides clinicians with the image quality performance benefits typically associated with hardware-based Time-of-Flight (ToF) reconstruction (including improved contrast-to-noise ratio, contrast recovery,4 and quantitative accuracy5) without compromising sensitivity, to aid in precise diagnoses, treatment planning, and monitoring.

Venue Family point-of-care ultrasound systems with AI-powered Caption Guidance™ software provide real-time, step-by-step guidance to help even new ultrasound users successfully capture cardiac views and diagnostic-quality images.

"We're accelerating the pace of innovation to meet the urgency of today's healthcare challenges. Reaching this milestone is also an important step along our journey of evolving from an imaging company to a healthcare solutions provider, enabling us to deliver holistic and integrated solutions that meet our customers' needs today and will help enable them to stay ahead in a rapidly evolving healthcare environment," said Kass-Hout.

GE HealthCare is pushing the boundaries of innovation by fostering new ways to use AI, cloud, and software to move the future of healthcare forward in a responsible way, in devices, across the care journey, and at the hospital system level. These projects and innovations run the gamut from early R&D to commercially available solutions, often the result of working closely with leading medical institutions, universities, and technology companies to bring in the best thinking from industry, technology, and academia. Regardless of a project's maturity, GE HealthCare combines deep healthcare expertise, a commitment to responsible innovation, and a pioneering spirit to help customers address pressing global challenges, from aging populations and chronic disease management to remote care and more.

For more information about GE HealthCare's AI-enabled medical device and enterprise software solutions, visit

About GE HealthCare Technologies Inc.

GE HealthCare is a trusted partner and leading global healthcare solutions provider, innovating medical technology, pharmaceutical diagnostics, and integrated, cloud-first AI-enabled solutions, services and data analytics. We aim to make hospitals and health systems more efficient, clinicians more effective, therapies more precise, and patients healthier and happier. Serving patients and providers for more than 125 years, GE HealthCare is advancing personalized, connected and compassionate care, while simplifying the patient's journey across care pathways. Together, our Imaging, Advanced Visualization Solutions, Patient Care Solutions and Pharmaceutical Diagnostics businesses help improve patient care from screening and diagnosis to therapy and monitoring. We are a $19.7 billion business with approximately 53,000 colleagues working to create a world where healthcare has no limits. GE HealthCare is proud to be among the 2025 Fortune World's Most Admired Companies™. Follow us on LinkedIn, X, Facebook, Instagram, and Insights for the latest news, or visit our website for more information.

1 "AI-based Auto Positioning," February 2021.
2 AIR™ Recon DL.
3 Calculated using IB data with an estimation of 20 scans per day, 5.5 days per week, from 4 weeks after delivery to April 2025.
4 Precision DL with Omni Legend 32 cm data improves Contrast Recovery (CR) by 11% on average and Contrast-to-Noise Ratio (CNR) by an average of 23% as compared to non-ToF reconstruction. CR and CNR demonstrated using clinical data with inserted lesions of known size, location, and contrast. Using data from Omni Legend 32 cm, CR and CNR were measured using High Precision DL and QCHD.
5 Precision DL with Omni Legend 32 cm improves feature quantitation accuracy by 14% as compared to Discovery MI with ToF reconstruction, at comparable noise level. Quantitation accuracy demonstrated using clinical data with inserted lesions of known size, location, and contrast (ground truth). Feature SUVmean from Omni Legend 32 cm with High Precision DL compared to SUVmean from Discovery MI 25 cm with QCFX.

View source version on

Contacts

GE HealthCare Media Contact
Sofia Mata-Leclerc, Head of Communications, Science and

FDA's New Drug Approval AI Is Generating Fake Studies: Report

Gizmodo · 2 hours ago

Robert F. Kennedy Jr., the Secretary of Health and Human Services, has made a big push to get agencies like the Food and Drug Administration to use generative artificial intelligence tools. In fact, Kennedy recently told Tucker Carlson that AI will soon be used to approve new drugs 'very, very quickly.' But a new report from CNN confirms all our worst fears. Elsa, the FDA's AI tool, is spitting out fake studies.

CNN spoke with six current and former employees at the FDA, three of whom have used Elsa for work that they described as helpful, like creating meeting notes and summaries. But three of those FDA employees told CNN that Elsa just makes up nonexistent studies, something commonly referred to in AI as 'hallucinating.' The AI will also misrepresent research, according to these employees.

'Anything that you don't have time to double-check is unreliable. It hallucinates confidently,' one unnamed FDA employee told CNN.

And that's the big problem with all AI chatbots. They need to be double-checked for accuracy, often creating even more work for the human behind the computer if they care about the quality of their output at all. People who insist that AI actually saves them time are often fooling themselves, with one recent study of programmers showing that tasks took 20% longer with AI, even among people who were convinced they were more efficient.

Kennedy's Make America Healthy Again (MAHA) commission issued a report back in May that was later found to be filled with citations for fake studies. An analysis from the nonprofit news outlet NOTUS found that at least seven studies cited didn't even exist, with many more misrepresenting what was actually said in a given study. We still don't know if the commission used Elsa to generate that report.

FDA Commissioner Marty Makary initially deployed Elsa across the agency on June 2, and an internal slide leaked to Gizmodo bragged that the system was 'cost-effective,' costing only $12,000 in its first week. Makary said that Elsa was 'ahead of schedule and under budget' when he first announced the AI rollout. But it seems like you get what you pay for. If you don't care about the accuracy of your work, Elsa sounds like a great tool for getting slop out the door faster, generating garbage studies that could potentially have real consequences for public health in the U.S.

CNN notes that if an FDA employee asks Elsa to generate a one-paragraph summary of a 20-page paper on a new drug, there's no simple way to know if that summary is accurate. And even if the summary is more or less accurate, what if there's something within that 20-page report that would be a big red flag for any human with expertise? The only way to know for sure whether something was missed, or whether the summary is accurate, is to actually read the report.

The FDA employees who spoke with CNN said they tested Elsa by asking basic questions, like how many drugs of a certain class have been approved for children. Elsa confidently gave wrong answers, and while it apparently apologized when corrected, a robot being 'sorry' doesn't really fix anything.

We still don't know the workflow being deployed when Kennedy says AI will allow the FDA to approve new drugs, but he testified in June to a House subcommittee that it's already being used to 'increase the speed of drug approvals.' The secretary, whose extremist anti-vaccine beliefs didn't keep him from becoming a public health leader, seems intent on injecting unproven technologies into mainstream science.

Kennedy also testified to Congress that he wants every American to be strapped with a wearable health device within the next four years. As it happens, President Trump's pick for Surgeon General, Casey Means, owns a wearables company called Levels that monitors glucose levels in people who aren't diabetic. There's absolutely no reason that people without diabetes need to constantly monitor their glucose levels, according to experts. Means, a close ally of Kennedy, has not yet been confirmed by the Senate.

The FDA didn't respond to questions emailed on Wednesday about what the agency is doing to address Elsa's fake study problem. Makary acknowledged to CNN that Elsa could 'potentially hallucinate,' but said that's 'no different' from other large language models and generative AI. And he's not wrong on that. The problem is that AI is not fit for purpose when it's consistently just making things up. But that won't stop folks from continuing to believe that AI is somehow magic.
