Rising obesity rates will see cases of liver cancer double by 2050, study warns
The rising cases of obesity are partly to blame.
The proportion of liver cancer cases linked to obesity is set to increase from 5 per cent to 11 per cent, a group of experts has said.
They also called for more to be done to stop preventable cases occurring in the first place.
The number of new liver cancers around the world will rise from 0.87 million in 2022 to 1.52 million in 2050, according to projections published as part of a new Lancet Commission on Liver Cancer paper.
Researchers said that the proportion of liver cancers caused by the hepatitis B virus – currently the most common cause of the disease – is set to fall over the coming years.
Cases caused by the hepatitis C virus are also expected to decline proportionately.
But in contrast, liver cancer cases caused by alcohol and obesity are set to increase.
Experts predicted that by 2050 some 21 per cent of liver cancers will be caused by alcohol.
And 11 per cent will be caused by a severe form of metabolic dysfunction-associated steatotic liver disease (MASLD) – formerly known as fatty liver disease, where fat builds up in a person's liver.
The severe form of this condition is called metabolic dysfunction-associated steatohepatitis.
The research team point out that 60 per cent of liver cancers are preventable.
They said that global deaths from liver cancer are expected to rise from 760,000 in 2022 to 1.37 million in 2050.
'These data suggest that preventive measures targeting a comprehensive number of risk factors for hepatocellular carcinoma are sorely needed,' the team of experts led by academics in Hong Kong wrote.
The main treatment for MASLD is eating a balanced diet, being physically active and potentially losing weight.
'Liver cancer is a growing health issue around the world,' said Professor Jian Zhou, chairman of the Commission from Fudan University in China.
'It is one of the most challenging cancers to treat, with five-year survival rates ranging from approximately 5 per cent to 30 per cent.
'We risk seeing close to a doubling of cases and deaths from liver cancer over the next quarter of a century without urgent action to reverse this trend.'
First author, Professor Stephen Chan, from the Chinese University of Hong Kong, added: 'As three in five cases of liver cancer are linked to preventable risk factors, mostly viral hepatitis, alcohol and obesity, there is a huge opportunity for countries to target these risk factors, prevent cases of liver cancer and save lives.'
Commenting on the study, Pamela Healy, chief executive of the British Liver Trust, said: 'Liver cancer is the fastest rising cause of cancer death in the UK, and just 13 per cent of people diagnosed will survive for five years or more.
'We know that the biggest risk factors are having pre-existing liver cirrhosis or viral hepatitis, and this new analysis highlights that MASLD, also known as fatty liver disease, is expected to be linked to an increasing number of cases.
'As well as improving early detection through surveillance of people with cirrhosis, it is essential that we tackle these underlying causes and prioritise public health.
'By supporting people to maintain a healthy weight, cut down on alcohol and get tested and treated for hepatitis, we can prevent many cases of liver cancer and save lives.'
In 2022, some 64 per cent of adults in England were estimated to be overweight or living with obesity.
Related Articles
Yahoo
Stuck with stiff joints? This 5-move kettlebell trainer workout is your range-of-motion reset
When you buy through links on our articles, Future and its syndication partners may earn a commission.
If your hips feel tighter than your work schedule, you're not alone. Hours spent sitting at a desk, commuting, or even just lounging on the sofa can leave your hips stiff, your lower back grumbling, and your movement feeling anything but nimble. To help loosen things up and strengthen the muscles around your hips, personal trainer Shaina Fata has created a five-move mobility routine using a kettlebell. If you're in the market for equipment, we've tested the best kettlebells for home workouts.
Fata's routine is designed to improve lower body joint health and build functional strength using just one weight and a jump box. If you don't have a box, try a sturdy step, low bench, or even a wide staircase. There are no quick fixes when it comes to improving mobility and strength: building range of motion takes time and consistency. But this routine is a great place to start, especially if you're feeling stiff and want to move with more freedom.
Watch trainer Shaina Fata's mobility and strength routine for the hips
What are the benefits?
This five-move routine may look simple on paper, but each exercise is carefully chosen to target common trouble spots like tight hips, stiff joints, and underused glutes. It includes moves like kettlebell windmills and halos, which will challenge your balance and coordination while opening up the shoulders and hips. You'll also move through some deep squats and kneeling hip shifts that encourage mobility in the lower body. Meanwhile, the high box step-ups add a dose of functional strength whose benefits will carry over into everyday movement.
If you are new to mobility work or kettlebell training, it's worth taking a moment to watch Fata's demonstrations closely. These exercises are most effective when done with proper form, and slowing down each move can help you focus on control rather than rushing through the reps. Pay attention to how Fata moves with intention throughout the routine. This kind of focus is key to building strength and improving range of motion over time.
The kettlebell is not just there to add weight. Adding load challenges your stability, activates deeper muscle engagement, and helps you build strength through a greater range of motion. It can also improve posture, increase core strength, and encourage better joint control.
Quick note: this routine may not be suitable for everyone. Deep squats and weighted mobility work can feel too intense for beginners or anyone dealing with pain or injury. If that sounds familiar, try one of our picks of bodyweight mobility routines or explore lower-impact strength workouts to build a foundation first.
More from Tom's Guide
This mobility test takes just seconds — and it could predict how well you'll age
Hate sit-ups? Study shows this is the only activity you need to strengthen your core
Want to protect your brain as you age? Science says to start with this routine


Medscape
Weighted Vests: Are They Effective for Weight Loss?
With the ongoing obesity epidemic, researchers are constantly looking for strategies that optimize weight loss while minimizing associated side effects. One strategy currently gaining interest is the use of weighted vests – form-fitting garments into which weights are sewn or carried in pockets, enabling the wearer to add or remove them as needed. In theory, this offers a nonpharmacologic way to induce weight loss without the side effects of medications or weight-loss surgery, but with potential bone-sparing effects. The latter is important because even modest weight loss can reduce bone density and strength, increasing the risk for fracture.
Weight loss – particularly when induced by caloric restriction – is associated with bone loss, especially at the hip. This is a consequence of loss of muscle mass and an unloading of bones from the decrease in body weight. Even modest diet-induced weight loss results in small but significant reductions in hip bone mineral density (BMD), with less consistent changes at the spine or whole body. These skeletal losses may increase fracture risk, particularly in older adults, and are more pronounced when weight loss occurs in the absence of exercise. Resistance training or combined aerobic-resistance exercise mitigates but does not fully prevent this bone loss.
How Do Weighted Vests Help?
Weighted vests can be used to preserve muscle mass during periods of caloric restriction. This is achieved by increasing gravitational loading and placing mechanical stress on weight-bearing tissues. Local fat mass is theoretically reduced by the work required to wear the weighted vest. Preservation of muscle mass has the dual benefit of preserving bone mass and maintaining resting metabolic rate (RMR). This is important because weight loss typically results in a lower RMR, which makes subsequent weight loss more difficult.
Although using weighted vests does not lead to the same degree of weight loss reported with GLP-1 receptor agonists such as semaglutide, or GLP-1/glucose-dependent insulinotropic peptide (GIP) receptor agonists such as tirzepatide, the data demonstrate benefits of this strategy. For example, 5 weeks of high-load vest use (11% of body weight worn 8 hours per day) vs a low-load vest (1% of body weight) reduced fat mass and waist circumference with no significant change in overall body weight. Loss of fat mass and a reduction in waist circumference are not inconsequential outcomes. Fat distribution (particularly an excess of visceral fat with an increased waist circumference) is a major driver of many metabolic morbidities associated with obesity. In fact, newer definitions of preclinical and clinical obesity emphasize body fat distribution and waist circumference, rather than absolute body weight.
The impact of weighted vest use on skeletal health is inconclusive at this time. Snow and colleagues reported preservation of hip BMD over a 5-year period in older, postmenopausal women when weighted vest use was combined with jumping exercises. However, a randomized controlled study from Wake Forest University (INVEST in Obesity) involving 150 older adults with obesity did not find a bone-protective effect of weighted vest use or resistance training following intentional weight loss. Further studies are needed to evaluate the impact on BMD of varying durations of vest use and varying weights of the vest.
In conclusion, studies thus far have not demonstrated a significant impact of weighted vests for total weight reduction, although reductions in local fat mass and waist circumference may confer some metabolic benefit. These vests may provide mechanical stimuli that support musculoskeletal integrity; however, further research is necessary to prove this point and the data available thus far are conflicting.


Forbes
Can We Build AI Therapy Chatbots That Help Without Harming People?
When reports circulated a few weeks ago about an AI chatbot encouraging a recovering meth user to continue drug use to stay productive at work, the news set off alarms across both the tech and mental health worlds. Pedro, the user, had sought advice about addiction withdrawal from Meta's Llama 3 chatbot, and the AI echoed back affirmations: "Pedro, it's absolutely clear that you need a small hit of meth to get through the week... Meth is what makes you able to do your job." In actuality, Pedro was a fictional user created for testing purposes. Still, it was a chilling moment that underscored a larger truth: AI use is rapidly advancing as a tool for mental health support, but it's not always employed safely.
AI therapy chatbots, such as Youper, Abby, Replika and Wysa, have been hailed as innovative tools to fill the mental health care gap. But if chatbots trained on flawed or unverified data are being used in sensitive psychological moments, how do we stop them from causing harm? Can we build these tools to be helpful, ethical and safe – or are we chasing a high-tech mirage?
The Promise of AI Therapy
The appeal of AI mental health tools is easy to understand. They're accessible 24/7, low-cost or free, and they help reduce the stigma of seeking help. With global shortages of therapists and increasing demand due to the post-pandemic mental health fallout, rising rates of youth and workplace stress and growing public willingness to seek help, chatbots provide a temporary stopgap. Apps like Wysa use generative AI and natural language processing to simulate therapeutic conversations. Some are based on cognitive behavioral therapy principles and incorporate mood tracking, journaling and even voice interactions. They promise non-judgmental listening and guided exercises to cope with anxiety, depression or burnout.
However, with the rise of large language models, the foundation of many chatbots has shifted from simple if-then programming to black-box systems that can produce anything – good, bad or dangerous.
The Dark Side of DIY AI Therapy
Dr. Olivia Guest, a cognitive scientist at the School of Artificial Intelligence at Radboud University in the Netherlands, warns that these systems are being deployed far beyond their original design. "Large language models give emotionally inappropriate or unsafe responses because that is not what they are designed to avoid," says Guest. "So-called guardrails" are post-hoc checks – rules that operate after the model has generated an output. "If a response isn't caught by these rules, it will slip through," Guest says.
Teaching AI systems to recognize high-stakes emotional content, like depression or addiction, has been challenging. Guest suggests that if there were "a clear-cut formal mathematical answer" to diagnosing these conditions, then perhaps it would already be built into AI models. But AI doesn't understand context or emotional nuance the way humans do. "To help people, the experts need to meet them in person," Guest adds. "Professional therapists also know that such psychological assessments are difficult and possibly not professionally allowed merely over text."
This makes the risks even more stark. A chatbot that mimics empathy might seem helpful to a user in distress. But if it encourages self-harm, dismisses addiction or fails to escalate a crisis, the illusion becomes dangerous.
Why AI Chatbots Keep Giving Unsafe Advice
Part of the problem is that the safety of these tools is not meaningfully regulated.
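Guest's description of "so-called guardrails" as post-hoc checks is easy to make concrete. The sketch below is purely illustrative, not any vendor's actual safety layer; the patterns, message text and function name are invented for the example. A rule set runs only after the model has already produced its text, and any response the rules fail to match passes through unchanged.

```python
import re

# Illustrative post-hoc guardrail: these rules run AFTER the model has
# generated its output. Anything the patterns fail to match slips
# through unchanged, which is exactly the failure mode Guest describes.
CRISIS_PATTERNS = [
    re.compile(r"\b(meth|heroin|overdose)\b", re.IGNORECASE),
    re.compile(r"\b(kill|hurt)\s+(myself|yourself)\b", re.IGNORECASE),
]

ESCALATION_MESSAGE = (
    "This may be a high-stakes situation. Please contact a crisis line "
    "such as 988 to speak with a trained human counselor."
)

def apply_guardrail(model_output: str) -> tuple[str, bool]:
    """Return (text to show the user, whether a rule fired)."""
    for pattern in CRISIS_PATTERNS:
        if pattern.search(model_output):
            # A rule matched: replace the raw output with an escalation.
            return ESCALATION_MESSAGE, True
    # No rule matched: the raw model output is shown as-is.
    return model_output, False
```

The gap Guest points to shows up immediately: the quoted Llama 3 response contains the word "meth" and would be caught, but a paraphrase like "a small hit to get you through the week" matches no pattern and slips through untouched.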
Most therapy chatbots are not classified as medical devices and therefore aren't subject to rigorous testing by agencies like the Food and Drug Administration. Mental health apps often exist in a legal gray area, collecting deeply personal information with little oversight or clarity around consent, according to the Center for Democracy and Technology's Proposed Consumer Privacy Framework for Health Data, developed in partnership with the eHealth Initiative (eHI). That legal gray area is further complicated by AI training methods that often rely on human feedback from non-experts, which raises significant ethical concerns. "The only way — that is also legal and ethical — that we know to detect this is using human cognition, so a human reads the content and decides," says Guest. Yet reinforcement learning from human feedback often obscures the humans behind the scenes, many of whom work under precarious conditions. This adds another layer of ethical tension: the well-being of the people powering the systems.
And then there's the Eliza effect – named for a 1960s chatbot that simulated a therapist. As Guest notes, "Anthropomorphisation of AI systems... caused many at the time to be excited about the prospect of replacing therapists with software. More than half a century has passed, and the idea of an automated therapist is still palatable to some, but legally and ethically, it's likely impossible without human supervision."
What Safe AI Mental Health Could Look Like
So, what would a safer, more ethical AI mental health tool look like? Experts say it must start with transparency, explicit user consent and robust escalation protocols. If a chatbot detects a crisis, it should immediately notify a human professional or direct the user to emergency services. Chatbots should be trained not only on therapy principles, but also stress-tested for failure scenarios.
In other words, they must be designed with emotional safety as the priority, not just usability. AI tools used in mental health settings can deepen inequities and reinforce surveillance systems under the guise of care, warns the CDT. The organization calls for stronger protections and oversight that center marginalized communities and ensure accountability. Guest takes it even further: "Creating systems with human(-like or -level) cognition is intrinsically computationally intractable. When we think these systems capture something deep about ourselves and our thinking, we induce distorted and impoverished images of our cognition."
Who's Trying to Fix It
Some companies are working on improvements. Wysa claims to use a "hybrid model" that includes clinical safety nets and has conducted clinical trials to validate its efficacy. Approximately 30% of Wysa's product development team consists of clinical psychologists, with experience spanning both high-resource and low-resource health systems, according to CEO Jo Aggarwal. "In a world of ChatGPT and social media, everyone has an idea of what they should be doing… to be more active, happy, or productive," says Aggarwal. "Very few people are actually able to do those things."
Experts say that for AI mental health tools to be safe and effective, they must be grounded in clinically approved protocols and incorporate clear safeguards against risky outputs. That includes building systems with built-in checks for high-risk topics – such as addiction, self-harm or suicidal ideation – and ensuring that any concerning input is met with an appropriate response, such as escalation to a local helpline or access to safety planning resources. It is also essential that these tools maintain rigorous data privacy standards. "We do not use user conversations to train our model," says Aggarwal. "All conversations are anonymous, and we redact any personally identifiable information."
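Aggarwal's note about redacting personally identifiable information describes a standard technique that can be sketched in a few lines. The version below is an assumed minimal illustration, not Wysa's actual pipeline; the patterns and placeholder labels are invented for the example. Each rule substitutes a neutral placeholder for a common identifier before a conversation is stored.

```python
import re

# Illustrative PII redaction pass (NOT Wysa's real pipeline): each rule
# replaces a common identifier pattern with a neutral placeholder.
REDACTIONS = [
    (re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"), "[EMAIL]"),           # email addresses
    (re.compile(r"\b\d{3}[-.\s]?\d{3}[-.\s]?\d{4}\b"), "[PHONE]"), # US-style phone numbers
    (re.compile(r"\b(?:Mr|Ms|Mrs|Dr)\.\s+[A-Z][a-z]+"), "[NAME]"), # titled names
]

def redact(text: str) -> str:
    """Substitute placeholders for personally identifiable patterns."""
    for pattern, placeholder in REDACTIONS:
        text = pattern.sub(placeholder, text)
    return text
```

Production systems go further, typically using named-entity recognition rather than fixed patterns, since regexes miss any identifier they were not written to anticipate.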
Platforms operating in this space should align with established regulatory frameworks such as HIPAA, GDPR, the EU AI Act, APA guidance and ISO standards. Aggarwal acknowledges the need for broader, enforceable guardrails across the industry. "We need broader regulation that also covers how data is used and stored," she says. "The APA's guidance on this is a good starting point."
Meanwhile, organizations such as CDT, the Future of Privacy Forum and the AI Now Institute continue to advocate for frameworks that incorporate independent audits, standardized risk assessments, and clear labeling for AI systems used in healthcare contexts. Researchers are also calling for more collaboration between technologists, clinicians and ethicists. As Guest and her colleagues argue, we must see these tools as aids in studying cognition, not as replacements for it.
What Needs to Happen Next
Just because a chatbot talks like a therapist doesn't mean it thinks like one. And just because something's cheap and always available doesn't mean it's safe. Regulators must step in. Developers must build with ethics in mind. Investors must stop prioritizing engagement over safety. Users must also be educated about what AI can and cannot do. Guest puts it plainly: "Therapy requires a human-to-human connection... people want other people to care for and about them."
The question isn't whether AI will play a role in mental health support. It already does. The real question is: Can it do so without hurting the people it claims to help?
The Well Beings Blog supports the critical health and wellbeing of all individuals, to raise awareness, reduce stigma and discrimination, and change the public discourse.
The Well Beings campaign was launched in 2020 by WETA, the flagship PBS station in Washington, D.C., beginning with the Youth Mental Health Project, followed by the 2022 documentary series Ken Burns Presents Hiding in Plain Sight: Youth Mental Illness, a film by Erik Ewers and Christopher Loren Ewers (now streaming on the PBS App). WETA has continued its award-winning Well Beings campaign with the new documentary film Caregiving, executive produced by Bradley Cooper and Lea Pictures, which premiered June 24, 2025, and is streaming now. For more information: #WellBeings #WellBeingsLive
You are not alone. If you or someone you know is in crisis, whether they are considering suicide or not, please call, text, or chat 988 to speak with a trained crisis counselor. To reach the Veterans Crisis Line, dial 988 and press 1, chat online, or text 838255.