Malaysia's fight against CSAM: Why shared responsibility is key
KUALA LUMPUR (July 20): The internet has transformed how children learn, play, and connect, but it has also opened the door to new and deeply disturbing forms of harm.
Among the most serious of these is the growing circulation of Child Sexual Abuse Material (CSAM), which represents a profound violation of a child's safety, dignity, and rights.
In Malaysia and around the world, alarming spikes in online child sexual exploitation and abuse have made it clear that this is no longer a hidden problem. It has become a public crisis that requires urgent, collective action.
The Kempen Internet Selamat (KIS) plays a pivotal role in encouraging public involvement and legal reform in the fight against CSAM.
The campaign emphasises that online safety requires ordinary citizens to understand the signs, know how to report abuse, and demand accountability from platforms and policymakers.
Educating the public is a vital first step in dismantling the silence that allows online child exploitation to persist.
CSAM is a severe violation of a child's dignity and safety. In 2023 alone, there were 32 million reports of CSAM worldwide, with over 90 percent of the images self-generated, often through coercion, manipulation, or blackmail.
Alarmingly, cases involving very young children, even those between the ages of 3 and 6, have been on the rise.
According to the National Center for Missing & Exploited Children (NCMEC), Malaysia recorded 197,659 reports of suspected CSAM through its CyberTipline in 2024.
That same year, the Internet Watch Foundation Malaysia reported 8,600 actionable cases.
One of the reasons this crisis persists is the silence that surrounds it. Many adults hesitate to report CSAM when they encounter it, unsure whether they are allowed to, afraid of the stigma, or worried about making things worse for the child involved.
This silence from surrounding adults enables the harm to continue and further traumatises the victims.
As Sarawak Women for Women Society (SWWS) member Gill Raja aptly puts it, 'If we don't take appropriate action, we are complicit.'
Inaction allows CSAM to circulate, and that can lead to further exploitation, blackmail, and psychological trauma for the victim.
Reporting CSAM is the first step towards taking it down. Yet many people, including victims, fear that reporting could draw more attention or lead to further harm.
'This is why trusted, child-friendly channels to report are so important. Accessible, confidential reporting options must be widely known and easy to use,' Gill emphasises.
Gill also warns of the risks of turning a blind eye, 'The child could continue to be exploited and abused to create more material if those doing this remain in contact with them or have passed on their details to others.'
She reminds us that failing to act means becoming part of the problem, 'If we don't take appropriate action, we are complicit in harming them. We need to protect each other's children to make the internet a safe place.'
The good news is that everyone has a role to play in ending this. Shared responsibility is not just a slogan; it is the only viable solution.
Parents, teachers, corporations, social media platforms, government agencies, non-profits, and everyday citizens all have unique roles and tools they can mobilise to fight this crisis.
The internet may be vast and borderless, but so is society's capacity to protect children, if everyone acts together.
For individuals, reporting CSAM is a critical first step. Safe and confidential channels exist but remain underutilised due to lack of public awareness.
Malaysians can report abuse directly via the Childline Foundation portal, which connects to the Internet Watch Foundation's global takedown system. The Talian Kasih 15999 hotline and Cyber999 portal also offer accessible, sometimes anonymous, options.
As Gill highlights, 'We need more awareness and easy access so as soon as people see CSAM, they can easily see how to report. This requires a stronger response from social media platforms than we currently have.
'Every report is a crucial step in reducing stress on a child and shows that you care and are standing by them,' she says.
At the community level, adults and caretakers must normalise discussions around online safety. Children need to be taught, in age-appropriate and culturally relevant ways, how to protect themselves, recognise risks, and seek help.
Equally, the adults in their lives, including parents, teachers, and guardians, must have the knowledge and tools to respond appropriately when abuse is disclosed.
Gill notes that current efforts fall short.
'We need to reach all children in age, language, and culturally appropriate ways that effectively engage them, plus informing the adults in their lives too.'
She stresses the need for training not only for children, but also for adults, who must understand 'how they may inadvertently put their children at risk by sharing photos online or how young people are themselves being sucked into viewing and sometimes producing CSAM.'
Media and tech platforms also bear tremendous responsibility. Safety-by-design should no longer be optional. Platforms must be required to proactively screen, detect, and remove CSAM.
They must offer easy-to-use reporting tools that children and adults can find without difficulty. While some platforms are making progress, others have scaled back moderation just as AI-generated CSAM is on the rise.
As Gill observes, 'Some major platforms have recently cut back on their vetting processes just as we are seeing a surge of material being produced including using AI. This is unacceptable.'
Laws and policies must also evolve rapidly.
While Malaysia has the Sexual Offences Against Children Act 2017, there is no legal requirement for platforms or ISPs to take down or report CSAM promptly.
Nor are there age-verification or parental consent mechanisms for online access. These loopholes allow predators to exploit vulnerable users and make law enforcement's job more difficult.
That's why advocates are pushing for a harmonised legal framework that outlines the responsibilities of both public institutions and private companies.
A designated national lead agency with the resources and authority to coordinate efforts across sectors is essential.
This body could ensure consistent reporting mechanisms, facilitate international cooperation, and manage end-to-end victim support systems including helplines, counselling and legal assistance.
Access to psychological care, legal aid, and rehabilitation must be expanded to help the victims cope before trauma becomes permanent.
Services must be inclusive and sensitive to each child's age, gender, ability, and background. A single, toll-free, 24/7 national child helpline staffed by trained professionals could be a lifeline.
Prevention efforts should also include nationwide digital literacy campaigns that teach children and adults about healthy online behaviour, consent, and boundaries.
Ultimately, protecting children online is not the sole responsibility of parents, teachers, or the police. It is a collective duty.
'Today we are part of a huge global, internet 'village,'' says Gill. 'We have to give children a path back. Every time we educate, report, and intervene, we are part of the solution.'
This is not just about fighting abuse. It is about defending every child's right to grow up free from exploitation and fear. Every child deserves that chance, and every adult has the power to make it happen.