
‘Just felt like it was a waste of money’: How a fitness trainer mistook an ovarian cyst the size of a newborn baby for a common postpartum condition and delayed treatment for 7 years
'I just felt like it was a waste of money and so I just stopped going,' fitness trainer Megan Johnson told People in an exclusive interaction. As her stomach started expanding, Johnson was convinced it was diastasis recti, a condition in which the rectus abdominal muscles, or the 'six-pack' muscles, separate, often postpartum, creating a gap. 'My stomach was getting bigger and just wider,' she said, adding that she even used the 'two-finger test' she learned about online to self-assess. This involves checking how many fingers fit into the gap between the abdominal muscles during a slight crunch, with a gap of two or more fingers taken to suggest the condition.
But when she opened up on TikTok, her followers sounded the alarm. Many urged her to seek real help, which she finally did. After a trip to the ER (Emergency Room), tests confirmed she not only had diastasis recti but also a 'massive cystic mass extending from the left upper quadrant to the floor of the pelvis measuring up to 48 centimeters,' which is close to the size of a newborn baby.
She nicknamed her ovarian cyst 'Cysterella.' Reflecting on her hesitation, Johnson confessed, 'My own pride got in the way of me getting help because I would justify my symptoms — like nothing's wrong with me and I know best.'
According to the outlet, Johnson underwent a successful surgery on May 22, during which 27 pounds of fluid were drained from her body. However, in the process of removing the cyst, doctors had to take out one ovary and a fallopian tube, a risk she was aware of. As for her diastasis recti, her medical team hopes it will heal naturally after the surgery.
Dr R Uthra, MS (OBG), DCG, consultant in obstetrics and gynaecology at DHEE Hospitals, tells indianexpress.com, 'Medical avoidance is far more common than we realise, especially among younger adults. In Megan's case, the lack of insurance compounded the issue. Many people internalise the idea that unless something is 'urgent,' medical care can be delayed.'
There's also an emotional layer to it. Avoidance is sometimes a coping mechanism: people may fear a serious diagnosis and feel that not knowing is less distressing than confirming their worst suspicions.
Johnson was convinced she had diastasis recti based on online research and used the 'two-finger test' to self-diagnose. Dr Uthra mentions that the 'two-finger test' is a basic tool that may suggest the possibility of diastasis recti, but it cannot replace a proper medical evaluation. It involves measuring the gap between the abdominal muscles during a slight crunch, where a gap of two or more fingers may indicate the condition. 'Internal organ conditions, like ovarian cysts or tumours, often have overlapping symptoms and may not be visible or detectable through surface-level tests,' she says.
Social media can oversimplify or generalise medical issues, leading to misdiagnosis or delayed care. In reproductive health especially, symptoms can be subtle and progress silently.
Johnson shared, 'Feeling like I'm not able to help myself makes me feel unqualified to help others.' Dr Uthra notes that for professionals in the fitness or wellness industry, there is often an unspoken expectation to 'embody' health. 'When they face a medical issue that goes undiagnosed or unresolved, it can deeply impact their sense of competence and credibility.'
Moreover, she adds that the pressure to maintain a certain physical appearance or standard of health can discourage them from seeking help. It's essential to normalise the idea that health professionals, too, are human and vulnerable, and seeking help is a sign of strength, not failure.
DISCLAIMER: This article is based on information from the public domain and/or the experts we spoke to. Always consult your health practitioner before starting any routine.
