Teaching college student overcomes cancer battle to achieve her dream

The Sun – 6 days ago
A teaching college student has become an inspiration to many after graduating despite facing severe health challenges, proving that determination and resilience can overcome even the toughest of trials.
Nur Syahidatunnajwa Mohamad Naim, a student from the Institute of Teacher Education (IPG) Tuanku Bainun Campus in Bukit Mertajam, Penang, recently went viral on TikTok after a video of her convocation ceremony was shared by @shafiqalsukri.
What made her graduation truly remarkable was her battle with Hodgkin lymphoma, a cancer of the lymphatic system.
According to the announcer at the convocation, Syahidatunnajwa had undergone 22 sessions of chemotherapy, 15 sessions of radiotherapy, and was still undergoing medical treatment at the time of her graduation.
Despite these challenges, she successfully completed her degree in Islamic Studies, becoming a symbol of perseverance and courage for many across the nation.
The comments section on the viral TikTok video was flooded with well-wishes and words of admiration for Syahidatunnajwa.
Many called her a 'real fighter' who overcame not only physical pain but also emotional and mental struggles on her journey to academic success.
One user, @itsbloomienine, who identified herself as Syahidatunnajwa's friend, shared a heartfelt tribute.
'Najwa is such a kind person. She's gentle, and her character is truly admirable. Indeed, Allah tested her with cancer and the loss of her father. We're so proud of you, Syahid! Alhamdulillah, you finally graduated after going through so many storms!'
Syahidatunnajwa's inspiring story reminds netizens that no matter how big the challenges, they need not stop anyone from achieving their dreams.

Related Articles

Content moderators unite to tackle trauma from extreme content

New Straits Times – 4 days ago

Content moderators from the Philippines to Turkiye are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content for tech giants like Meta Platforms or TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.

"Previously, I could sleep for seven hours," said one Filipino content moderator who asked to remain anonymous to avoid problems with their employer. "Now, I only sleep for around four hours."

Workers are gagged by non-disclosure agreements with the tech platforms or the companies that do the outsourced work, meaning they cannot discuss details of the content they are seeing. But moderators gave videos of babies dying in Gaza and gruesome pictures from the Air India crash in June as examples.

Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of moderation. Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers' rights suits in Kenya and Ghana, and in 2020 the firm paid a US$52 million settlement to American content moderators suffering long-term mental health issues.

The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what they dub "a 21st-century hazardous job", similar to the work of emergency responders. Their first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, in their supply chains.

"They say we're the ones protecting the Internet, keeping kids safe online," the Filipino worker said. "But we are not protected enough."

Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content – and the mental toll is well-documented.

"I've had bad dreams because of the graphic content, and I'm smoking more, losing focus," said Berfin Sirin Tunc, a content moderator for TikTok in Turkiye employed via Canadian-based tech company Telus, which also does work for Meta. She said the first time she saw graphic content as part of her job, she had to leave the room and go home.

While some employers do provide psychological support, some workers say it is just for show, with advice to count numbers or do breathing exercises. Therapy is limited to either group sessions or a recommendation to switch off for a certain number of "wellness break" minutes. But taking them is another matter.

"If you don't go back to the computer, your team leader will ask where are you and (say) that the queue of videos is growing," said Tunc. "Bosses see us just as machines."

Moderators have seen an uptick in violent videos. However, Telus said in its emailed response that internal estimates show distressing material represents less than five per cent of the total content reviewed.

Adding to the pressure on moderators is the fear of losing their jobs as companies shift towards artificial intelligence-powered moderation. Meta, which invested billions and hired thousands of content moderators globally over the years to police extreme content, scrapped its US fact-checking programme in January, following the election of Donald Trump. In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus.

"I'm waiting for Telus to fire me," said Tunc, "because they fired my friends from our union." Fifteen workers in Turkiye are suing the company after being dismissed, they say, for organising a union and attending protests this year.

Moderators in low-income countries say the low wages, productivity pressure and inadequate mental health support can be remedied if companies sign up to the Global Alliance's eight protocols. These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union.

"With better conditions, we can do this better. If you feel like a human, you can work like a human," said Tunc.

The writer is from Reuters

'Bosses see us as machines': Content moderators for Big Tech unite to tackle mental trauma

The Star – 4 days ago

BRUSSELS: Content moderators from the Philippines to Turkey are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online.

The people tasked with removing harmful content for tech giants like Meta Platforms or TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts.

"Before I would sleep seven hours," said one Filipino content moderator who asked to remain anonymous to avoid problems with their employer. "Now I only sleep around four hours."

Workers are gagged by non-disclosure agreements with the tech platforms or the companies that do the outsourced work, meaning they cannot discuss exact details of the content they are seeing. But videos of people being burned alive by terrorists, babies dying in Gaza and gruesome pictures from the Air India crash in June were given as examples by moderators who spoke to the Thomson Reuters Foundation.

Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of moderation. Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers' rights lawsuits in Kenya and Ghana, and in 2020 the firm paid a US$52mil (RM219.54mil) settlement to American content moderators suffering long-term mental health issues.

The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what they dub "a 21st century hazardous job", similar to the work of emergency responders. Their first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, in their supply chains.

"They say we're the ones protecting the Internet, keeping kids safe online," the Filipino worker said. "But we are not protected enough."

Scrolling trauma

Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content – and the mental toll is well-documented.

"I've had bad dreams because of the graphic content, and I'm smoking more, losing focus," said Berfin Sirin Tunc, a content moderator for TikTok in Turkey employed via Canadian-based tech company Telus, which also does work for Meta. In a video call with the Thomson Reuters Foundation, she said the first time she saw graphic content as part of her job, she had to leave the room and go home.

While some employers do provide psychological support, some workers say it is just for show – with advice to count numbers or do breathing exercises. Therapy is limited to either group sessions or a recommendation to switch off for a certain number of "wellness break" minutes. But taking them is another matter.

"If you don't go back to the computer, your team leader will ask where are you and (say) that the queue of videos is growing," said Tunc. "Bosses see us just as machines."

In emailed statements to the Thomson Reuters Foundation, Telus and Meta said the well-being of their employees is a top priority and that employees should have access to 24/7 healthcare support.

Rising pressure

Moderators have seen an uptick in violent videos. A report by Meta for the first quarter of 2025 showed a rise in the sharing of violent content on Facebook, after the company changed its content moderation policies in a commitment to "free expression". However, Telus said in its emailed response that internal estimates show distressing material represents less than 5% of the total content reviewed.

Adding to the pressure on moderators is the fear of losing their jobs as companies shift towards artificial intelligence-powered moderation. Meta, which invested billions and hired thousands of content moderators globally over the years to police extreme content, scrapped its US fact-checking programme in January, following the election of Donald Trump. In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus. A Meta spokesperson said the company has moved the services that were being performed from Barcelona to other locations.

"I'm waiting for Telus to fire me," said Tunc, "because they fired my friends from our union." Fifteen workers in Turkey are suing the company after being dismissed, they say, for organising a union and attending protests this year.

A spokesperson for Telus said in an emailed response that the company "respects the rights of workers to organise". Telus said a May report by Turkey's Ministry of Labour found the contract terminations were based on performance and that it could not be concluded they were union-related. The Labour Ministry did not immediately respond to a request for comment.

Protection protocols

Moderators in low-income countries say the low wages, productivity pressure and inadequate mental health support can be remedied if companies sign up to the Global Alliance's eight protocols. These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union.

Telus said in its statement that it was already in compliance with the demands, and Meta said it conducts audits to check that companies are providing required on-site support. New European Union rules – such as the Digital Services Act, the AI Act and supply chain regulations which demand tech companies address risks to workers – should give stronger legal grounds to protect content moderators' rights, according to labour experts.

"Bad things are happening in the world. Someone has to do this job and protect social media," said Tunc. "With better conditions, we can do this better. If you feel like a human, you can work like a human." – Thomson Reuters Foundation

Dogs are being seen as children as pet ownership increases

The Star – 5 days ago

Affectionate, vulnerable and totally dependent on their humans, dogs share many characteristics with children. These similarities may explain why some people refer to their pets as their 'babies', suggests a Hungarian study. This increasingly common anthropomorphic trend nevertheless raises ethical questions.

In developed countries, the domesticated dog has become much more than a simple pet – it's often considered a family member in its own right. This is a relatively recent phenomenon, and one that has given rise to a forthright anthropomorphisation of our pets. In fact, it's no longer unusual for dogs adopted by Millennials or Gen Zs to have their own Instagram or TikTok accounts. The phenomenon is so widespread, particularly in Europe, East Asia and North America, that dogs are frequently referred to as 'fur babies'.

Nothing is too good when it comes to offering these pooches the very best, as can be seen in the proliferation of dog-related products and services over the last few decades, including luxury grooming salons, strollers, dog parks and doggie daycare. There are even restaurants for dogs!

Evident analogy

The analogy with a young child is sometimes evident, and owners don't shy away from it: some even refer to their dog as their 'child' or 'baby'. Whether it's a way to ease loneliness, the satisfaction of having a dependent being to protect and train, or the feeling of being useful or needed, there are many reasons why these canines are perceived as child substitutes.

A team of Hungarian researchers from the Department of Ethology at ELTE Eotvos Lorand University (Budapest) has studied this phenomenon in Western societies. Published in the journal European Psychologist, their study highlights several factors that may explain the phenomenon of likening dogs to children. One is that dogs' cognitive abilities and adaptability to human communication enable them to adopt a wide range of social behaviours, often comparable to those of pre-verbal children.

But their morphology could also play a role in this association. In particular, the researchers link the appeal of small dogs – especially brachycephalic breeds like French bulldogs and pugs – to infantile traits that can make them appear 'as helpless, harmless and innocent as small children.'

Seen as children... but not quite as humans

In light of these arguments, it's easy to understand why some people call themselves 'dog parents' rather than 'masters' or 'owners' of their pets. But there are nevertheless some key differences. 'Despite the high dependency and attachment of dogs to their caregivers, in the eyes of many, commitments coming with dog ownership remain less burdensome than child parenting,' explains Laura Gillet, a PhD student at the Department of Ethology and co-author of the study, quoted in a news release.

Another difference is that, since dogs don't live as long as humans, their owners generally assume they will outlive them – which is obviously not the case with a child. 'While some owners might see their dog as a child surrogate to spoil, others actively choose to have dogs and not children, bearing in mind that they have species-specific characteristics and needs,' the study authors write. They point out that, contrary to popular belief, only a small minority of dog owners actually treat their pets like human children. 'In most cases, dog parents choose dogs precisely because they are not like children, and they acknowledge their species-specific needs,' explains study co-author Eniko Kubinyi.

These researchers are not new to the subject. They have also carried out another study, published in the journal Scientific Reports, which involved over 700 dog owners and compared the relationships we have with our dogs to those we have with our loved ones. The results reveal that, in the eyes of their owners, dogs are more than just loyal companions: they combine the qualities of a child with those of a best friend. They inspire a sense of security and the need for protection, just as a child would, while at the same time offering a harmonious, conflict-free relationship, similar to a deep friendship. – AFP Relaxnews
