My husband has the same terminal cancer as Joe Biden... here are the warning signs we wish we hadn't ignored


Daily Mail, 20 May 2025
Joe Biden's sudden diagnosis of aggressive prostate cancer has shocked the world.
Now, other men blighted by late-stage disease, diagnosed only after the cancer had already spread, are stepping forward to share their stories.
When Eric White started experiencing testicular pain at 49, he blamed it on the hours he spent operating a forklift at work in a warehouse.
But when this was accompanied by extreme fatigue and weight loss, his wife Megan encouraged him to see a doctor.
The father-of-three from Alabama was initially diagnosed with epididymitis, an infection that causes swelling of a small tube at the back of the testicles that stores and transports sperm.
To treat the condition, Mr White was put on a three-week course of antibiotics and referred to a urologist, who then gave him another three-week course of the same drug.
But despite the medication, his condition deteriorated and his wife says he started experiencing 'flow problems' when he went to the bathroom.
She recalls: 'It was hard for him to start urinating [and] he was dribbling. When he went back to the urologist, they decided to check his prostate, and it was hard.'
This prompted doctors to order a prostate-specific antigen (PSA) test, which measures the level of a protein produced by the prostate gland.
At 49, he was slightly too young to be getting screened regularly (most doctors recommend screening only after age 55).
Mr White's results showed high levels of PSA and he underwent a biopsy of his prostate, which revealed he had cancer.
PSA levels are generally measured in nanograms per milliliter (ng/mL).
In general, a level above 4 ng/mL is considered abnormal and may result in a recommendation for further examination.
Mr White's reading was 19.1 ng/mL and the biopsy confirmed cancer had spread throughout his entire prostate, a small gland found only in men located just below the bladder and in front of the rectum.
It produces a fluid that contributes to semen.
In May 2024 he was diagnosed with prostate adenocarcinoma. Also known as glandular prostate cancer, it is the most common type of prostate cancer, accounting for over 95 percent of cases.
It originates in the gland cells of the prostate, which produce prostate fluid.
Over 300,000 American men are diagnosed with prostate cancer every year, and 35,000 die from the disease. Aside from skin cancer, it is the most common cancer in men.
It's also considered one of the most treatable if caught early - with a nearly 100 percent survival rate if detected while localized in the prostate.
The United States Preventive Services Task Force (USPSTF) recommends that men ages 55 to 69 undergo prostate cancer screening every two to three years. However, those with a family history or at greater risk, such as Black men, may need more frequent screening.
If cancer is found during the biopsy, MRI and CT scans may be used to determine if the disease has spread outside the prostate to other parts of the body.
The family was shocked to learn Mr White's cancer had spread to his neck and his femur, and that he had 'two spots on his liver'.
This meant he had stage 4 cancer, which has a five-year survival rate of around 30 percent.
Prostate cancer - also dubbed the 'silent killer' - is often diagnosed at later stages due to a combination of factors, including a misleading focus on urinary symptoms, a lack of awareness, and challenges in accessing or utilizing early screening methods.
Medical experts have declared it 'inconceivable' that former President Joe Biden 's 'aggressive' form of prostate cancer was not caught earlier by doctors.
The 82-year-old's office announced the devastating diagnosis on Sunday, saying the cancer had spread to his bones and his family were reviewing treatment options.
His cancer was given a Gleason score of 9 and a Grade Group of 5, a dire stage of the rapidly-spreading disease. The diagnosis came days after doctors found a 'small nodule' on his prostate.
After Mr White's cancer was diagnosed, an appropriate treatment plan was drawn up.
In a TikTok video, his wife says: '[After his diagnosis] he did 10 palliative radiation treatments to his neck and his femur to help with pain.
'His neck was really giving him a fit for a while. After that, his PSA was dropping and he was doing really good.
'The medication that they put him on was a testosterone blocker, and that caused him to lose a lot of muscle mass [and] made him really tired, but I mean, he was doing pretty good.'
Despite making good progress, around six months later, in October 2024, Mr White started experiencing extreme pain in his pelvis.
Mrs White said it was so bad he couldn't sleep and spent hours in the shower trying to soothe the pain.
She revealed: 'So the pain was so bad, he would practically live in the shower.
'He would take multiple showers a day. He didn't sleep at night. He slept in the shower.
'We went back to the oncologist and they put him on some strong pain medication, [but] it didn't touch the pain.
'So then when we went back again to the oncologist, they ordered a scan, and they found a tumor that was growing in his pelvis. The oncologist said that we need to get a biopsy ASAP.'
The Whites hope that by talking openly about their story, more men will get tested sooner rather than later if they feel something is wrong.
The Whites decided to go to a bigger cancer center in Georgia for the biopsy and Mr White was diagnosed with a rare and aggressive form of prostate cancer called a sarcomatoid carcinoma.
This form of the disease often develops in men who have had prior prostate cancer, and its prognosis is generally poor. Fewer than 100 cases have been reported in the medical literature, and average survival is typically around 10 months.
In her latest post, Mrs White reveals her husband had 18 more rounds of radiation to the mass in his pelvis and he is now undergoing chemotherapy.
Mr White is currently midway through his chemotherapy treatment.
Mrs White says: 'That's pretty much where we're at now. We just got to wait on these scans and pray to God that this chemo is working.
'So please, please, please keep him in your prayers.'

