ChatGPT Flags Woman's Symptoms As Blood Cancer A Year Before Formal Diagnosis


News18
Last Updated: 25-04-2025
Marley Garnreiter, 27, from Paris, found early signs of Hodgkin lymphoma via ChatGPT months before her formal diagnosis. The case showcases AI's potential in healthcare.
While AI tools like ChatGPT are not designed to diagnose illnesses, there have been increasing instances where they have helped users identify early signs of serious health conditions.
A 27-year-old woman from Paris has shared her extraordinary experience of how ChatGPT identified symptoms of blood cancer almost a year before she received a formal diagnosis from medical professionals. The case has sparked renewed discussion around the growing potential of artificial intelligence in healthcare.
Marley Garnreiter began experiencing night sweats and skin irritation, symptoms she attributed to stress following the loss of her father to colon cancer. Despite multiple visits to doctors, all her test results returned as normal, and no serious health concerns were flagged at the time.
Turning To AI For Clarity
Still concerned, Garnreiter decided to input her symptoms into ChatGPT in search of alternative insights. The AI tool flagged the possibility of blood cancer, a warning she initially dismissed. Speaking to People.com, she admitted she didn't take the suggestion seriously, with friends also advising her not to rely on a machine for health guidance.
Symptoms Persist, Diagnosis Confirmed
Months later, Garnreiter began experiencing chronic fatigue and chest pain, prompting further medical investigation. A scan eventually revealed a large lump in her left lung, leading to a diagnosis of Hodgkin lymphoma, a rare type of blood cancer affecting white blood cells. She is now preparing to begin chemotherapy.
Reflecting on the experience, Garnreiter expressed her shock. "It's incredible that an AI tool picked up on something so important, well before the healthcare system did," she said.
While Hodgkin lymphoma is rare, early detection greatly improves treatment outcomes. Medical experts note that the five-year survival rate exceeds 80 percent when identified early. Common symptoms include itching, fatigue, night sweats, fever, and stomach discomfort—many of which Garnreiter had reported.
Although ChatGPT is no replacement for professional medical advice, her case adds to the growing conversation around AI's potential in symptom analysis, particularly when traditional methods fall short or diagnosis is delayed. Experts caution, however, that such tools should supplement, not substitute for, professional healthcare.
First Published: April 25, 2025, 15:29 IST



Related Articles

Back-to-school 2025: 5 smart strategies every US student should know before the first bell

Time of India

2 days ago



As August edges into September, millions of students across the United States begin the annual pilgrimage back to classrooms—some with excitement, others with anxiety, and many with a bit of both. While the back-to-school season varies from state to state—starting as early as late July in southern districts and extending into mid-September in parts of the Northeast—it marks a universal transition. It's a reset not just of routines, but of mindsets, habits, and ambitions. And in 2025, navigating this shift requires more than just fresh notebooks and sharpened pencils.

This year brings heightened academic expectations, the deepening integration of AI in education, a renewed national focus on student mental health, and the continued balancing act between screen time and face-to-face interaction. For students and parents alike, the return to school in 2025 is less about returning to 'normal' and more about adapting intelligently to the new demands of learning—and living—in a post-pandemic, hyperconnected world.

1. Rebuilding routines without the burnout

Summer's late nights and unscheduled days don't yield easily to the 6:30 a.m. alarm. For students—especially adolescents whose biological sleep clocks skew later—this transition can be jarring. Experts recommend beginning the reset at least a week in advance. Gradually shifting sleep schedules, limiting screen exposure after dusk, and introducing light physical activity in the morning can help recalibrate the body's rhythm.

But beyond sleep, daily structure matters. Creating realistic time blocks for studying, unwinding, and digital detoxes is critical. Students who map out their weeks early often find they're less overwhelmed once assignments and assessments pile up. Parents, too, benefit from establishing predictable after-school routines, balancing support with autonomy.

2. Tech is here to stay—use it smartly

The rise of generative AI tools like ChatGPT, Grammarly, and Notion AI has reshaped the way students learn, write, and even organize their time. While some schools have implemented restrictions or AI-use policies, others are encouraging responsible engagement. For high school and college students, using AI tools as learning companions—rather than shortcuts—can foster deeper understanding. Summarizing complex readings, generating study questions, or checking grammar can be productive uses. But over-reliance can dull original thinking. Parents and educators must help students differentiate between aid and avoidance.

And with most schoolwork now living online—from Canvas to Google Classroom—it's imperative to maintain digital hygiene. Calendar syncing, file backups, and note organization shouldn't be afterthoughts—they are essential academic skills in their own right.

3. The mental health conversation is no longer optional

One of the most significant shifts in American education over the past five years is the growing normalization of mental health dialogue. Yet a 2024 CDC report revealed that nearly 1 in 3 high school students still experience persistent feelings of sadness or hopelessness—a figure that remains alarmingly unchanged. Schools are responding with expanded counseling services, SEL (social-emotional learning) periods, and mindfulness initiatives. But access and quality still vary widely by zip code.

For students, learning to identify early signs of burnout, social anxiety, or depression is as vital as understanding algebra. Journaling, seeking peer support, or speaking to a trusted adult are no longer 'extra' practices—they are necessary tools for survival and success.

4. Navigating academic pressure and setting realistic goals

The pressure to perform—amplified by college admissions stress, competitive GPAs, and stacked extracurriculars—can overshadow the joy of learning. Back-to-school 2025 is an opportunity to recalibrate. Rather than fixating solely on grades or rankings, students should focus on progress over perfection. Micro-goals, such as improving a study habit or engaging more in class discussions, often lead to better outcomes than lofty, undefined resolutions. Parents play a key role here: emphasizing effort over outcomes, praising resilience instead of perfection, and allowing space for occasional failure can buffer students against academic anxiety.

5. Making space for connection, offline

After years of hybrid models, disrupted schedules, and digital-first friendships, many students are relearning how to engage in person. This social reset isn't easy for everyone. For some, the cafeteria is more intimidating than the classroom. Encouraging small, consistent social risks—joining a club, initiating group study sessions, or even just greeting seatmates—can help rebuild confidence. Schools that foster belonging through peer mentorship, inclusive events, and advisory periods often see better academic and emotional outcomes.

Not just back to school, forward with intention

Back-to-school 2025 isn't just a return to routine—it's a re-entry into an educational system undergoing quiet but seismic shifts. For students, this year is a chance to be intentional: about how they manage time, embrace tech, engage with peers, and protect their mental well-being. For parents, it's a moment to let go of outdated metrics of success and tune in to what their children truly need to thrive—not just in school, but in life. If approached with clarity, compassion, and a touch of courage, this school year can be more than just another grind. It can be a foundation for something meaningful, resilient, and real.

Is It Safe To Take Treatment Advice From ChatGPT? Top Doctor Says...

News18

2 days ago



Many turn to Google or ChatGPT for health advice, but is it safe? A leading doctor shares why self-diagnosing online may do more harm than good. Read these key insights.

Since the arrival of the internet, Google, and artificial intelligence, people have come to believe that every answer is just a click away. Whether it's a familiar issue or something completely new, users now instinctively turn to the web for information. In fact, 95 out of every 100 smartphone users reportedly search for disease names and symptoms online when experiencing health concerns. This habit doesn't stop at symptoms. Many individuals also attempt to interpret their ultrasound, X-ray, or MRI reports using AI tools such as ChatGPT, Grok, or Gemini. While this may seem helpful, health experts warn that it can do more harm than good.

AI May Create More Anxiety Than Answers

According to Dr GC Khilnani, former professor at AIIMS and head of Pulmonary, Critical Care and Sleep Medicine at PSRI Hospital in Delhi, the rise of self-diagnosis through AI tools is concerning. "What I am witnessing is a flood of information on platforms like Google and ChatGPT. Patients come to me anxious and panicked after reading online content about their symptoms or test results. I often spend extra time clarifying facts and correcting misinformation. In many cases, patients are so overwhelmed with data that they struggle to accept the real diagnosis," he explains.

Patients using AI platforms, he notes, read surface-level information within minutes, often misinterpreting or oversimplifying complex medical topics. In some cases, AI-generated responses may be outdated, misleading, or even incorrect, leading patients to draw dangerous conclusions.

Why Doctors Don't Always Have Instant Answers

Another misconception is that doctors should have an immediate answer to every health concern. However, as Dr Khilnani points out, medicine is not an exact science. Diagnoses often require detailed investigation and clinical judgement, which AI simply cannot replicate.

What Should Patients Do?

Dr Khilnani strongly advises that patients consult a medical professional directly rather than relying on platforms like Google, Gemini, or ChatGPT for diagnosis and treatment. "Searching for symptoms online only increases unnecessary anxiety," he says. "Patients often begin to fear serious illnesses they don't actually have. Instead of guessing, go to a qualified doctor, someone who can guide you with proper testing, interpretation, and care."

While AI and the internet are powerful tools for learning, they are not substitutes for professional medical advice. In matters of health, turning to a trained doctor remains the safest and most reliable choice.

Should you double-check your doctor with ChatGPT? Yes, you absolutely should

India Today

5 days ago



First, there was Google. Or rather, Doctor Google, as it is mockingly called by the men and women in white coats, the ones who come in an hour late to see their patients and who brush off every little query brusquely and sometimes with unwarranted irritation. Now there is a new foe in town, and it is only now that doctors are beginning to realise it. This is ChatGPT, or Gemini, or something like DeepSeek, the AI systems that are coherent and powerful enough to act like medical guides.

Doctors are, obviously, not happy about it. Just as they berate patients for trying to discuss what the ailing person finds after Googling symptoms, they are now fuming against the advice that ChatGPT can dish out. The problem is that no one likes to be double-checked. And Indian doctors, in particular, hate it. They want their word to be the gospel. Bhagwan ka roop, or something like that. But frustratingly for them, the capabilities of new AI systems are such that anyone can now re-check their doctor's prescription, or read diagnostic films and observations, using tools like ChatGPT. The question, however, is: should you do it? Absolutely yes. The benefits outweigh the harms.

Let me tell you a story. This is from around 15 years ago. A person whom I know well went to a doctor for an ear infection. This was a much-celebrated doctor, leading the ENT department in a hospital chain whose name starts with the letter F. The doctor charged the patient a princely sum and poked and probed the ear in question. After a few days of tests and consultations, a surgery, a rather complex one, was recommended. It was at this time, when the patient was submitting the consent forms for the surgery scheduled a few days later, that the doctor discovered some new information. He found that the patient was a journalist in a large media group. This new information, although not related to the patient's ear, quickly changed the tune the doctor was whistling. He became coy and cautious. He started having second thoughts about the surgery. So he recommended a second opinion, writing a reference for another senior doctor, the head of ENT at a hospital chain whose name starts with the letter A. The doctor at this new hospital carried out his own observations. The ear was probed and poked again, and within minutes he declared, 'No surgery needed. Absolutely no surgery needed.'

What happened? I have no way of confirming this, but I believe here is what happened. The doctor at hospital F was pushing for an unnecessary and complex surgery, one where the chances of something going wrong were minimal but not zero. However, once he realised that the patient was a journalist, he decided not to risk it and, to get out of the situation, relied on the doctor at hospital A.

This is a story I know, but I am sure almost everyone in this country has similar anecdotes. At one time or another, we have all had a feeling that this doctor or that was probably pushing for some procedure, some diagnostic test, or some advice that did not sit well with us. And in many unfortunate cases, people actually underwent some procedure or treatment that harmed them more than it helped. Medical negligence in India flies under the radar of 'doctor is bhagwan ka roop' and similar sentiments. Unlike in other countries, where medical negligence can have serious repercussions for doctors and hospitals, in India people in white coats get flexibility in almost everything they do. A lot of it is due to the reverence that society has for doctors, the savers of life. Some of it is also because, in India, we have far fewer doctors than are needed.

This is not to say that doctors in India are incompetent. In general, they are not, largely thanks to the scholastic nature of modern medicine and its procedures. Most of them also work crazy long hours, under conditions that are extremely frugal in terms of equipment and highly stressful in terms of workload.

And this is exactly why we should use ChatGPT to double-check our doctors in India. Because there is a huge supply-demand mismatch, it is safe to say that we have doctors in the country who are not up to the task, whether these are doctors with dodgy degrees or those who have little to no background in modern medicine, and yet they put Dr in front of their name and run clinics where they deal with the most complex cases. It is precisely because doctors are overworked in India that their patients should use AI to double-check their diagnostic opinions and suggested treatments. Doctors, irrespective of what we feel about them and how we revere them, are humans at the end of the day. They are prone to making the same mistakes that any human would make in a challenging work environment.

And finally, because many doctors in India — not all, but many — tend to overdo their treatment and diagnostic tests, we should double-check them with AI. Next time you get a CT scan, also show it to ChatGPT, and then discuss with your doctor if the AI is telling you something different. In the last one year, again and again, research has highlighted that AI is extremely good at diagnosis. Just earlier this month, a new study by a team at Microsoft found that their MAI-DxO, a specially-tuned AI system for medical diagnosis, outperformed human doctors. Compared to the 21 doctors who were part of the study and who were correct in only 20 per cent of cases, MAI-DxO was correct in 85 per cent of cases.

Yet none of this is to say that you should replace your doctor with ChatGPT. Absolutely not. Good doctors are indeed precious and their consultation is priceless. They will also be better with the subtleties of the human body than any AI system. But in the coming months and years, I have a feeling that doctors in India will launch a tirade against AI, similar to how they once fought Dr Google. They will shame and harangue their patients for using ChatGPT for a second opinion. When that happens, we should push back. Indian doctors are not used to questions; they don't like to explain; they don't want to be second-guessed or double-checked. And that is exactly why we should ask them questions, seek explanations, and double-check them, if needed, even with the help of ChatGPT.

(Javed Anwer is Technology Editor, India Today Group Digital. Latent Space is a weekly column on tech, world, and everything in between. The name comes from the science of AI and, to reflect it, Latent Space functions in the same way: by simplifying the world of tech and giving it a context.)

(Views expressed in this opinion piece are those of the author.)
