Persons with history of diabetes, high BP can be deceased donors: Expert group

Time of India · 6 days ago
NEW DELHI: Can organs, say a kidney or the liver, retrieved from a deceased donor with a history of diabetes or hypertension be used for transplantation?
In a first-of-its-kind factual assessment and report, the Indian Society of Organ Transplantation (ISOT) has said the benefits far outweigh the risks involved and, therefore, such donations should be considered, subject to the viability of the organ.
According to the ISOT, while Indian data are lacking, international registries report hypertension in 15-20% and diabetes in 2-8% of deceased donors. 'Evidence from the US Renal Data System (USRDS) and the United Network for Organ Sharing (UNOS) databases suggests minimally increased risk of primary non-function, acute rejection, or delayed graft function (DGF) and marginally lower graft survival from such donors, particularly in kidney transplantation,' says the ISOT statement, published in The Lancet Regional Health – Southeast Asia.
It has been co-authored by doctors from 32 top medical institutions in the country, including AIIMS Delhi, Safdarjung Hospital, Kokilaben Dhirubhai Ambani Hospital in Mumbai, Max Saket and Madras Medical College in Chennai, among others.
Over time, poorly controlled diabetes can damage the blood vessels in the kidneys that filter waste from the blood, leading to kidney damage and high blood pressure.
High blood pressure, in turn, can cause further kidney damage by raising the pressure in the kidneys' filtering system, according to the Mayo Clinic.
Dr Dinesh Khullar, a leading nephrologist and co-author of the ISOT statement, said they have suggested a screening criterion to decide whether kidneys donated by a diabetic deceased donor can be considered. 'Outright rejection is wrong. In my view, doctors should carry out individualized risk assessment of the donor organ and recipient profile to reach a conclusion,' he said.
Dr Shiv Sarin, director of the Institute of Liver and Biliary Sciences (ILBS), also said that organs from deceased donors with a history of diabetes, hypertension or, for that matter, cancer can be used on a case-to-case basis. 'A liver biopsy should be done to see the extent of fibrosis and fat in the liver from a diabetic donor, as one-third may be unfit. Similar caution is needed for hypertensive donors for kidney donation. An organ from a donor cured of cancer for more than two years should be acceptable,' Dr Sarin said.
More than two lakh Indians require transplantation annually. Not even 10% get it, because organs from deceased donors are scarce. That's why, earlier, preference was accorded to younger patients – those below 65 years of age – for receiving the organs.
Recently, the government did away with the age bar. Doctors say the demand for organs has, therefore, gone up further.
A living person can donate only for immediate blood relations (brother, sister, parents and children). He or she can donate a kidney (as one kidney is capable of maintaining the body's functions), a portion of the pancreas (as half of the pancreas is adequate for sustaining pancreatic functions) and part of the liver (as the few segments that are donated will regenerate after a period). A brain-dead person, on the other hand, can donate more than 20 organs and tissues, including the heart, lungs, liver, kidneys, intestines, pancreas, eyes, heart valves, skin, bone marrow, connective tissue, middle ear and blood vessels.

Related Articles

Health coach shares 5 things you must know 'before blindly loading on protein': 'You don't need 100g of protein'

Hindustan Times · 8 minutes ago

Among the many weight loss trends circulating online, one of the most popular is increasing protein intake in meals. Protein is known to keep you fuller for longer, reduce cravings, and support muscle building while burning fat. But does that mean we should load up on protein without caution?

Health and nutrition coach Nikita Bardia warned against it. On May 15, Nikita shared an Instagram post explaining the downsides of having too much protein. 'When I started consuming protein, I also started gaining weight. Things you must know before blindly loading on protein,' she wrote.

1. Protein can still make you gain weight if you're in a surplus. If your maintenance is 1,800 kcal and you're eating 2,100 kcal (even from clean, high-protein foods), you will gain fat. Calories still matter. Indian example: 150g paneer = 270 kcal; 1 scoop plant protein = 120 kcal; 2 tbsp peanut butter = 200 kcal. It adds up fast if you're not tracking your intake!

2. Not all protein is lean protein. Many Indian sources are protein-fat combos (paneer, peanuts, cheese, dals). You may think you're eating high protein, but you're also consuming a lot of hidden fat. Better vegetarian swaps: swap paneer for tofu, swap peanuts for roasted chana, and use Greek yogurt (unsweetened, low-fat).

3. Too much protein without strength training = stored energy. Protein supports muscle repair, but if you're not lifting or training, that extra protein becomes extra calories, often stored as fat, not muscle. Twist: protein doesn't go to muscle by default. You must give your body the signal (resistance training).

4. Your digestion and kidney function matter. If you're bloated, gassy, or feel heavy after high-protein meals, your gut might not be ready for that jump. Fix digestion before you double your protein: add jeera and ajwain water, have fermented foods, and chew mindfully.

5. You don't need 100g of protein overnight. Start small: 0.8g per kg of your ideal body weight. Then build it up with training and biofeedback. A back-of-the-envelope check of these numbers follows below.

Note to readers: This article is for informational purposes only and not a substitute for professional medical advice. Always seek the advice of your doctor with any questions about a medical condition.
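The coach's rule-of-thumb numbers are simple enough to sanity-check with arithmetic. Here is a minimal Python sketch of that back-of-the-envelope maths, using only figures quoted in the article (a 0.8 g/kg starting protein target, 1,800 kcal maintenance vs 2,100 kcal intake, and the three snack calorie counts); the helper names and the 60 kg example weight are illustrative assumptions, not part of the original post, and none of this is dietary advice.

```python
# Back-of-the-envelope checks of the rule-of-thumb numbers quoted above.
# The 0.8 g/kg starting target and the kcal figures come from the article;
# the 60 kg ideal body weight is an illustrative assumption.

def protein_target_g(ideal_weight_kg: float, g_per_kg: float = 0.8) -> float:
    """Suggested starting daily protein target, in grams."""
    return ideal_weight_kg * g_per_kg

def calorie_surplus(intake_kcal: float, maintenance_kcal: float) -> float:
    """Positive means a surplus (likely fat gain); negative, a deficit."""
    return intake_kcal - maintenance_kcal

print(protein_target_g(60))         # 48.0 g/day for a 60 kg ideal weight
print(calorie_surplus(2100, 1800))  # +300 kcal daily surplus, even if 'clean'

# The 'it adds up fast' point: three high-protein snacks before any meals.
snacks_kcal = 270 + 120 + 200       # paneer + plant-protein scoop + peanut butter
print(snacks_kcal)                  # 590 kcal
```

On the article's own figures, a steady 300 kcal daily surplus produces fat gain regardless of how much of it comes from protein, which is the coach's central point.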

As young Indians turn to AI 'therapists', how confidential is their data?

Scroll.in · 3 hours ago

This is the second of a two-part series. Read the first here.

Imagine a stranger getting hold of a mental health therapist's private notes – and then selling that information to deliver tailored advertisements to their clients. That's practically what many mental healthcare apps might be doing.

Young Indians are increasingly turning to apps and artificial intelligence-driven tools to address their mental health challenges – but have limited awareness about how these digital tools process user data. In January, the Centre for Internet and Society published a study of 45 mental health apps – 28 from India and 17 from abroad – and found that 80% gathered user health data that they used for advertising and shared with third-party service providers. An overwhelming number of these apps, 87%, shared the data with law enforcement and regulatory bodies.

The first article in this series reported that some of these apps are especially popular with young Indian users, who rely on them for quick and easy access to therapy and mental healthcare support. Users had also told Scroll that they turned to AI-driven technology, such as ChatGPT, to discuss their feelings and get advice, however limited this may be compared to interacting with a human therapist. But they were not especially worried about data misuse. Keshav*, 21, reflected a common sentiment among those Scroll interviewed: 'Who cares? My personal data is already out there.'

The functioning of Large Language Models, such as ChatGPT, is already under scrutiny. LLMs are 'trained' on vast amounts of data, either from the internet or provided by their trainers, to simulate human learning, problem solving and decision making. Sam Altman, CEO of OpenAI, which built ChatGPT, said on a podcast in July that though users talk about personal matters with the chatbot, there are no legal safeguards protecting that information. 'People use it – young people, especially, use it – as a therapist, a life coach; having these relationship problems and [asking] what should I do?' he asked. 'And right now, if you talk to a therapist or a lawyer or a doctor about those problems, there's legal privilege for it. There's doctor-patient confidentiality, there's legal confidentiality, whatever. And we haven't figured that out yet for when you talk to ChatGPT.'

He added: 'So if you go talk to ChatGPT about your most sensitive stuff and then there's like a lawsuit or whatever, we could be required to produce that, and I think that's very screwed up.'

Therapists and experts said the ease of access of AI-driven mental health tools should not sideline privacy concerns. Clinical psychologist Rhea Thimaiah, who works at Kaha Mind, a collective that provides mental health services, emphasised that confidentiality is an essential part of the process of therapy. 'The therapeutic relationship is built on trust and any compromise in data security can very possibly impact a client's sense of safety and willingness to engage,' she said. 'Clients have a right to know how their information is being stored, who has access, and what protections are in place.'

This is more than mere data – it is someone's memories, trauma and identity, Thimaiah said. 'If we're going to bring AI into this space, then privacy shouldn't be optional, it should be fundamental.'

Srishti Srivastava, founder of AI-driven mental health app Infiheal, said that her firm collects user data to train its AI bot, but users can access the app even without signing up and can also ask for their data to be deleted.
Dhruv Garg, a tech policy lawyer at the Indian Governance and Policy Project, said the risk lies not just in apps collecting data but in the potential downstream uses of that information. 'Even if it's not happening now, an AI platform in the future could start using your data to serve targeted ads or generate insights – commercial, political, or otherwise – based on your past queries,' said Garg. 'Current privacy protections, though adequate for now, may not be equipped to deal with each new future scenario.'

India's data protection law

For now, personal data processed by chatbots is governed by the Information Technology Act framework and the Sensitive Personal Data Rules, 2011. Section 5 of the sensitive data rules says that companies must obtain consent in writing before collecting or using sensitive information. According to the rules, information relating to health and mental health conditions is considered sensitive data. There are also specialised sectoral data protection rules that apply to regulated entities like hospitals.

The Digital Personal Data Protection Act, passed by Parliament in 2023, is expected to be notified soon. But it exempts publicly available personal data from its ambit if this information has been voluntarily disclosed by an individual. Given the black market of data intermediaries that publish large volumes of personal information, it is difficult to tell what personal data in the public domain has been made available 'voluntarily'.

The new data protection act does not have different regulatory standards for specific categories of personal data – financial, professional, or health-related, Garg said. This means that health data collected by AI tools in India will not be treated with special sensitivity under this framework. 'For instance, if you search for symptoms on Google or visit WebMD, Google isn't held to a higher standard of liability just because the content relates to health,' said Garg. WebMD provides health and medical information.

It might be different for AI tools explicitly designed for mental healthcare – unlike general-purpose models like ChatGPT. These, according to Garg, 'could be made subject to more specific sectoral regulations in the future'.

However, the very logic on which AI chatbots function – responding based on user data and inputs – could itself be a privacy risk. Nidhi Singh, a senior research analyst and programme manager at Carnegie India, an American think tank, said she has concerns about how tools like ChatGPT customise responses and remember user history – even though users may appreciate those functions.

Singh said India's new data protection law is quite clear that any data made publicly available by putting it on the internet is no longer considered personal data. 'It is unclear how this will apply to your conversations with ChatGPT,' she said. Without specific legal protections, there's no telling how an AI-driven tool will use the data it has gathered. According to Singh, without a specific rule designating conversations with generative AI as an exception, it is likely that a user's interactions with these AI systems won't be treated as personal data and consequently will not fall under the purview of the act.

Who takes legal responsibility?

Technology firms have tried hard to evade legal liability for harm. In Florida, a lawsuit by a mother has alleged that her 14-year-old son died by suicide after becoming deeply entangled in an 'emotionally and sexually abusive relationship' with a chatbot.
In case of misdiagnosis or harmful advice from an AI tool, legal responsibility is likely to be analysed in court, said Garg. 'The developers may argue that the model is general-purpose, trained on large datasets, and not supervised by a human in real-time,' said Garg. 'Some parallels may be drawn with search engines – if someone acts on bad advice from search results, the responsibility doesn't fall on the search engine, but on the user.'

Highlighting the urgent need for a conversation on sector-specific liability frameworks, Garg said that for now, the legal liability of AI developers will have to be assessed on a case-to-case basis. 'Courts may examine whether proper disclaimers and user agreements were in place,' he said.

In another case, Air Canada was ordered to pay compensation to a customer who was misled by its chatbot regarding bereavement fares. The airline had argued that the chatbot was a 'separate legal entity' and therefore responsible for its own actions.

Singh of Carnegie India said that transparency is important and that user consent should be meaningful. 'You don't need to explain the model's source code, but you do need to explain its limitations and what it aims to do,' she said. 'That way, people can genuinely understand it, even if they don't grasp every technical step.'

AI, meanwhile, is here for the long haul. Until India can expand its capacity to offer mental health services to everyone, Singh said, AI will inevitably fill that void. 'The use of AI will only increase as Indic language LLMs are being built, further expanding its potential to address the mental health therapy gap,' she said.

Simpler tests could slash biosimilar costs, widen patient access

The Hindu · 4 hours ago

Most of the drugs that we consume are called 'small molecule drugs'. Their chemical structure is reasonably simple. Disprin, for instance, has a molecular weight of around 180 daltons. There is another breed of drugs that are very large, complex molecules. For instance, the molecular weight of insulin is around 5,800 daltons and that of the monoclonal antibody Remicade about 150,000 daltons. (One dalton is equal to one-twelfth the mass of a carbon-12 atom.)

Small molecule drugs also tend to have fixed structures that do not change for the duration of their use. In contrast, the complex molecules, which we call biologics, are produced in biological systems, and slight variations in structure may therefore arise during their production. However, these variations may have no impact on the stability of the molecule, its efficacy, or its side effects.

When a company produces a small molecule drug for the first time, it seeks patent protection for that drug. That is, no competitor may make that drug for several years. It is only when the drug goes 'off patent' that competitors may make it. In the absence of competition, the originator company can price the drug very high. Once there is competition, the competitor companies produce generics, which are copies of the original drug. They don't undertake the research and development to make the drug and they may not spend as much on marketing and sales, so the costs of generics are also much lower. Most of the drugs that you and I take are generics and are priced very cheaply compared to the originator drug. A good example is Sovaldi, a drug used to treat hepatitis C: it originally cost $84,000 for a 12-week course in the US, but that dropped to $1,000 once Indian generic firms started making it. Largely, it is generic drugs that keep us in India alive and well.

Since biologics made by a generic firm will be produced by different biological systems, they may not be identical to those made by the originator company. Thus they are called biosimilars, not generics. For many years, a debate has raged over how much proof is required from a manufacturer to show that a given biosimilar will work as well as the original biologic drug. Whereas much simpler testing was required to show that a generic small molecule worked like the originator molecule, there are more elaborate and expensive tests for biosimilars.

Major drug regulators, such as those of the US, the UK, and Europe, have been working to determine how they can simplify the requirements for approving a biosimilar, in view of the availability of modern analytical techniques. For example, the UK has removed animal trials and the US has planned to replace them with more human-relevant methods (like using organoids). In India, this requirement has not yet been updated, although there is a proposal to waive animal studies on a case-by-case basis. Some have also argued that India should follow the practices of the UK and the US. The same holds for the more expensive clinical trials, which in the UK are currently required only in certain cases.

Biosimilars need to be made less expensively while ensuring efficacy and minimal adverse effects. The larger the number of affordable biosimilars, the more options we will have for our patients.

Gayatri Saberwal is a consultant at the Tata Institute for Genetics and Society.
