
Lupita Nyong'o opens up about health diagnosis
Nyong'o discovered she had uterine fibroids, non-cancerous growths, in 2014, the same year she won her Oscar, and underwent surgery to remove 30 of them.
She highlighted that while some individuals are asymptomatic, others suffer debilitating symptoms such as heavy bleeding and pain, noting the condition's high prevalence, particularly among Black women.
Nyong'o criticized the normalization of female pain and called for increased societal discussion, early education, better screening, and comprehensive research into women's reproductive health.
She has joined Democratic congresswomen and senators to introduce bills aimed at expanding research funding, improving early detection, and raising public awareness of uterine fibroids, and has also partnered on a research grant.
Related Articles


The Guardian
an hour ago
World in $1.5tn 'plastics crisis' hitting health from infancy to old age, report warns
Plastics are a 'grave, growing and under-recognised danger' to human and planetary health, a new expert review has warned. The world is in a 'plastics crisis', it concluded, which is causing disease and death from infancy to old age and is responsible for at least $1.5tn (£1.1tn) a year in health-related damages.

The driver of the crisis is a huge acceleration of plastic production, which has increased by more than 200 times since 1950 and is set to almost triple again to more than a billion tonnes a year by 2060. While plastic has many important uses, the most rapid increase has been in the production of single-use plastics, such as drinks bottles and fast-food containers. As a result, plastic pollution has also soared, with 8bn tonnes now polluting the entire planet, the review said, from the top of Mount Everest to the deepest ocean trench. Less than 10% of plastic is recycled.

Plastics endangered people and the planet at every stage, the review said, from the extraction of the fossil fuels they were made from, to production, use and disposal. This results in air pollution, exposure to toxic chemicals and infiltration of the body with microplastics. Plastic pollution can even boost disease-carrying mosquitoes, as water captured in littered plastic provides good breeding sites.

The review, published in the leading medical journal the Lancet, was released before the sixth and probably final round of negotiations between countries to agree a legally binding global plastics treaty to tackle the crisis. The talks have been dogged by a deep disagreement between more than 100 countries that back a cap on plastic production and petrostates such as Saudi Arabia that oppose the proposal. The Guardian recently revealed how petrostates and plastic industry lobbyists are derailing the negotiations.
'We know a great deal about the range and severity of the health and environmental impacts of plastic pollution,' said Prof Philip Landrigan, a paediatrician and epidemiologist at Boston College in the US, and lead author of the new report. He said it was imperative the plastics treaty included measures to protect human and planetary health. 'The impacts fall most heavily on vulnerable populations, especially infants and children,' he said. 'They result in huge economic costs to society. It is incumbent on us to act in response.'

Petrostates and the plastics industry have argued the focus should be on recycling plastic, not cutting production. But, unlike paper, glass, steel and aluminium, chemically complex plastics cannot be readily recycled. The report said: 'It is now clear that the world cannot recycle its way out of the plastic pollution crisis.'

More than 98% of plastics are made from fossil oil, gas and coal. The energy-intensive production process drives the climate crisis by releasing the equivalent of 2bn tonnes of CO2 a year – more than the emissions of Russia, the world's fourth biggest polluter. Plastic production also produces air pollution, while more than half of unmanaged plastic waste is burned in the open air, further increasing dirty air, the report noted.

More than 16,000 chemicals are used in plastics, including fillers, dyes, flame retardants and stabilisers. Many plastic chemicals were linked to health effects at all stages of human life, the report said, but there was a lack of transparency about which chemicals were present in plastics.
The analysis found that foetuses, infants and young children were highly susceptible to the harms associated with plastics, with exposure associated with increased risks of miscarriage, premature birth and stillbirth, birth defects, impaired lung growth, childhood cancer and fertility problems later in life.

Plastic waste often breaks down into micro- and nano-plastics, which enter the human body via water, food and breathing. The particles have been found in blood, brains, breast milk, placentas, semen and bone marrow. Their impact on human health is largely unknown as yet, but they have been linked to strokes and heart attacks, and the researchers said a precautionary approach was needed.

Plastic is often seen as a cheap material, but the scientists argue it is expensive when the cost of health damage is included. One estimate of the health damage from just three plastic chemicals – PBDE, BPA and DEHP – in 38 countries was $1.5tn a year.

The new analysis is the start of a series of reports that will regularly track the impact of plastics. Margaret Spring, a senior lawyer and one of the report's co-authors, said: 'The reports will offer decision-makers around the world a robust and independent data source to inform the development of effective policies addressing plastic pollution at all levels.'


Daily Mail
3 hours ago
The unlikely group 'destined' to be struck by Alzheimer's disease as early as 40
Alzheimer's disease is largely seen as one of old age. The most common form of memory-robbing dementia, Alzheimer's affects nearly 7 million Americans, most of whom are over the age of 65.


The Guardian
4 hours ago
Using Generative AI for therapy might feel like a lifeline – but there's danger in seeking certainty in a chatbot
Tran* sat across from me, phone in hand, scrolling. 'I just wanted to make sure I didn't say the wrong thing,' he explained, referring to a recent disagreement with his partner. 'So I asked ChatGPT what I should say.'

He read the chatbot-generated message aloud. It was articulate, logical and composed – almost too composed. It didn't sound like Tran. And it definitely didn't sound like someone in the middle of a complex, emotional conversation about the future of a long-term relationship. Nor did it mention any of Tran's own behaviours that had contributed to the relationship strain, which he and I had been discussing.

Like many others I've seen in therapy recently, Tran had turned to AI in a moment of crisis. Under immense pressure at work and facing uncertainty in his relationship, he'd downloaded ChatGPT on his phone 'just to try it out'. What began as a curiosity soon became a daily habit: asking questions, drafting texts, even seeking reassurance about his own feelings. The more Tran used it, the more he began to second-guess himself in social situations, turning to the model for guidance before responding to colleagues or loved ones. He felt strangely comforted, as if 'no one knew me better'. His partner, on the other hand, began to feel like she was talking to someone else entirely.

ChatGPT and other generative AI models present a tempting accessory, or even alternative, to traditional therapy. They're often free, available 24/7 and can offer customised, detailed responses in real time. When you're overwhelmed, sleepless and desperate to make sense of a messy situation, typing a few sentences into a chatbot and getting back what feels like sage advice can be very appealing. But as a psychologist, I'm growing increasingly concerned about what I'm seeing in the clinic: a silent shift in how people are processing distress, and a growing reliance on artificial intelligence in place of human connection and therapeutic support.
AI might feel like a lifeline when services are overstretched – and make no mistake, services are overstretched. Globally, in 2019 one in eight people were living with a mental illness, and we face a dire shortage of trained mental health professionals. In Australia, a growing mental health workforce shortage is limiting access to trained professionals. Clinician time is one of the scarcest resources in healthcare. It's understandable (even expected) that people are looking for alternatives.

Turning to a chatbot for emotional support isn't without risk, however, especially when the lines between advice, reassurance and emotional dependence become blurred. Many psychologists, myself included, now encourage clients to build boundaries around their use of ChatGPT and similar tools. Their seductive 'always-on' availability and friendly tone can unintentionally reinforce unhelpful behaviours, especially for people with anxiety, OCD or trauma-related issues. Reassurance-seeking, for example, is a key feature of OCD, and ChatGPT, by design, provides reassurance in abundance. It never asks why you're asking again. It never challenges avoidance. It never says, 'let's sit with this feeling for a moment, and practise the skills we have been working on'.

Tran often reworded prompts until the model gave him an answer that 'felt right'. But this constant tailoring meant he wasn't just seeking clarity; he was outsourcing emotional processing. Instead of learning to tolerate distress or explore nuance, he sought AI-generated certainty. Over time, that made it harder for him to trust his own instincts.

Beyond psychological concerns, there are real ethical issues. Information shared with ChatGPT isn't protected by the same confidentiality standards that bind Ahpra-registered professionals. Although OpenAI states that data from users is not used to train its models unless permission is given, the sheer volume of fine print in user agreements often goes unread.
Users may not realise how their inputs can be stored, analysed and potentially reused. There's also the risk of harmful or false information. Large language models are autoregressive: they predict the next word based on previous patterns. This probabilistic process can lead to 'hallucinations': confident, polished answers that are completely untrue. AI also reflects the biases embedded in its training data. Research shows that generative models can perpetuate and even amplify gender, racial and disability-based stereotypes – not intentionally, but unavoidably. Human therapists, by contrast, possess clinical skills; we notice when a client's voice trembles, or when their silence might say more than words.

This isn't to say AI can't have a place. Like many technological advancements before it, generative AI is here to stay. It may offer useful summaries, psycho-educational content or even support in regions where access to mental health professionals is severely limited. But it must be used carefully, and never as a replacement for relational, regulated care.

Tran wasn't wrong to seek help. His instincts to make sense of distress and to communicate more thoughtfully were sound. However, leaning so heavily on AI meant that his own skill development suffered. His partner began noticing a strange detachment in his messages. 'It just didn't sound like you,' she later told him. It turned out it wasn't. She also became frustrated by the lack of accountability in his messages to her, which caused further relational friction and communication problems between them.

As Tran and I worked together in therapy, we explored what led him to seek certainty in a chatbot. We unpacked his fears of disappointing others, his discomfort with emotional conflict and his belief that perfect words might prevent pain. Over time, he began writing his own responses: sometimes messy, sometimes unsure, but authentically his.

Good therapy is relational.
It thrives on imperfection, nuance and slow discovery. It involves pattern recognition, accountability and the kind of discomfort that leads to lasting change. A therapist doesn't just answer; they ask and they challenge. They hold space, offer reflection and walk with you, while also holding up an uncomfortable mirror.

For Tran, the shift wasn't just about limiting his use of ChatGPT; it was about reclaiming his own voice. In the end, he didn't need a perfect response. He needed to believe that he could navigate life's messiness with curiosity, courage and care – not perfect scripts.

*Name and identifying details changed to protect client confidentiality

Carly Dober is a psychologist living and working in Naarm/Melbourne

In Australia, support is available at Beyond Blue on 1300 22 4636, Lifeline on 13 11 14, and at MensLine on 1300 789 978. In the UK, the charity Mind is available on 0300 123 3393 and Childline on 0800 1111. In the US, call or text Mental Health America at 988 or chat