
Latest news with #NationalCentreforMissingandExploitedChildren

National meeting called to address AI child abuse

Sky News AU

13-07-2025


Experts and authorities on child exploitation material will convene for emergency meetings this week as the volume of AI-generated abuse material explodes. The National Children's Commissioner will meet fellow experts in Canberra on Thursday for the roundtable discussions.

'We are seeing AI generate entirely new types of child abuse material. This is a turning point,' international expert Jon Rouse said.

Figures from the US-based National Centre for Missing and Exploited Children show AI use has massively increased among predators. The centre reports a 1325 per cent increase in child sexual exploitation material reports involving generative AI, up from 4700 in 2023 to more than 67,000 in 2024. While based in the US, the centre works closely with law enforcement around the world.

The meeting in Canberra has been called to discuss responses to AI-generated child sexual abuse material, deepfakes, automated grooming and childlike AI personas.

'This roundtable represents a pivotal moment for child protection in Australia,' International Centre for Missing and Exploited Children Australia chief executive Colm Gannon said. 'AI is being weaponised to harm children, and Australia must act swiftly to prevent these technologies from outpacing our systems of protection.'

Former Australian of the Year Grace Tame will lend her expertise to the roundtable, as will representatives from the eSafety Commissioner, child protection organisation Bravehearts, and Childlight Australia.

'If we act now, Australia can set a global benchmark for ethical AI and child protection,' Mr Gannon said.

Originally published as 'Act now': National meeting to combat AI child abuse

‘Urgent' forum to combat AI child abuse

Perth Now

13-07-2025



‘Act now': National meeting to combat AI child abuse

West Australian

13-07-2025



B.C. man's ‘relatively modest' child porn collection doesn't merit jail time, judge rules

Vancouver Sun

11-06-2025


A B.C. man who admitted to police he was aroused by child pornography found in his home, but later denied sexual interest in children, will serve his sentence at home, partly due to the 'relatively modest size' of his stash — six images.

Such was the ruling from provincial court Judge Andrew Tam in Kelowna, who handed down a conditional sentence of two years less a day to Mark Keenan in a decision published this week.

'Although there is no strict mathematical relationship between the size of the collection and the length (or indeed type) of sentence, the size of a collection has often been held to be an aggravating factor,' Tam wrote. 'It stands to reason, then, that a modest collection, while not a mitigating factor, could nevertheless distinguish it from other cases.'

Keenan, 54, pleaded guilty to one count each of possession of child pornography and distribution of the same, offences for which he was first arrested in September 2018 after someone alerted the National Centre for Missing and Exploited Children (NCMEC) about his account. The RCMP obtained a search warrant and found the six images meeting the definition of child pornography on one of his devices. Forensic investigations also uncovered communication between Keenan and other users, mostly adults but including one who identified as a 15-year-old boy, in which he discussed sexual activities with children and the exchange of images.

Keenan insisted at his sentencing hearing in March that he had 'no sexual interest in children' and that he happened upon the images while searching for beach and sunset photos. Professing to be appalled at the sight of child pornography, Keenan, in both a statement to police upon his arrest and again at his hearing, maintained he found the images in September 2017 and was using them to run his own 'undercover sting' to catch pedophiles on Tumblr so he could get them banned.

Tam rejected the idea the offences were carried out 'for the public good' and noted Keenan, in his statement to police, provided 'the most damning evidence.' When asked by officers if he was aroused by the images, he replied, 'I mean, in a way, it's hard not to be.'

Other than that admission, Tam cited several reasons for dismissing Keenan's assertion on the balance of probabilities. He first pointed to the messages, in which Keenan offered 'extensive and detailed' insight into his sexual interests with children. The judge called the exchanges 'unnecessarily graphic' for someone only trying to snag a predator and questioned why they went on long after ample evidence had been acquired. 'His participation in these conversations seems too genuinely earnest for it to be a ruse,' Tam wrote.

That Keenan accepted one individual's claim to be a minor, and that he still chose to exchange images of their respective genitalia, also didn't bolster his 'sting operation' defence. 'Sending a photo of his own penis, and asking for a photo of the penis of someone who is ostensibly 15 do not serve that purpose,' the judge decided. Tam also couldn't accept that reasoning because Keenan kept the six original photos and carried out his operation for a year. Doing so, the judge said, resulted in what Keenan sought to snuff out — 'the proliferation of child pornography on the site.'

While the undercover operation certainly wasn't a 'mitigating factor,' Tam highlighted others that led him to deliver the conditional sentence as opposed to time behind bars. He said Keenan is a gainfully employed community member with a common-law partner of 12 years, from whom, the judge noted, 'he still enjoys tremendous support' and who described him as 'a genuine and loving person with integrity.' He has no criminal record, nor any history of mental health or substance issues, and was cooperative with police from the outset. He has also been living in the community for over six years without incident.

Keenan, who'll be in the DNA database for five years, will be on house arrest for the first 18 months, with the remainder spent on a 6 p.m. curfew. Twelve months of probation will follow.

Worrying trends, unclear data: India's CSAM challenge

Hindustan Times

14-05-2025


The recent CyberTipline data released by the National Centre for Missing and Exploited Children (NCMEC) makes for grim reading. In 2024, India accounted for the largest number of reports related to Child Sexual Abuse Material (CSAM) globally, with a staggering 2.3 million reports. This data is not just alarming—it is an emergency.

Yet these numbers reveal only half the story. What they obscure might be even more important. We still don't know how many of these reports pertain to unique instances of abuse or how many relate to the same content being circulated again and again. We don't know how many were generated by perpetrators and how many came from individuals who, in horror or ignorance, reshared the material. Crucially, we have no clear picture of how many reports are translated into timely interventions, legal action, or support for survivors.

This gap in understanding should concern all of us, because in the absence of clarity, effective policy and accountability are impossible. And as long as we continue to work in the dark, children will continue to suffer. The lack of transparency and disaggregated data is not just a technical issue—it is a moral failure. Every one of those 2.3 million reports from India is a potential instance of unspeakable harm. But without knowing the context, India's efforts to respond to this crisis are reactive at best.

Encouragingly, India has already taken important legal steps to address online child sexual exploitation. Under Section 67B of the Information Technology Act, the creation, transmission, and even viewing of CSAM is a criminal offence, punishable by imprisonment and fines. The Information Technology (Intermediary Guidelines and Digital Media Ethics Code) Rules, 2021 further require social media platforms and other intermediaries to remove CSAM swiftly upon knowledge or notification and to report such content to Indian law enforcement. Most recently, in 2024, the Supreme Court of India clarified that even the possession or viewing of CSAM is punishable under Indian law and recommended replacing the term child pornography with Child Sexual Exploitative and Abuse Material (CSEAM) to reflect the grave nature of the offence more accurately.

However, India urgently needs a national strategy—one that is research-driven, coordinated, and survivor-centred. At the heart of this strategy must be a few core principles:

  • First, we need platform transparency. Major tech companies must be compelled to share disaggregated data with regulators, researchers, and civil society. That means not just how many reports were filed, but the nature of the content, its origin, distribution patterns, and response timelines.
  • Second, we need robust legal frameworks. Our laws must evolve to reflect the complexity of the digital age. There should be clear legal distinctions between those creating or intentionally distributing CSAM and those who may unwittingly share such material out of shock or confusion. The law must be firm, but also fair.
  • Third, we need a massive public education campaign. Many people, especially younger users, do not know what to do when encountering harmful content online. Some try to flag it by posting screenshots. Others share it in outrage. We must teach people that the safest, most responsible action is to report the content immediately to platform moderators or relevant authorities, and never to redistribute it.
  • Fourth, and most critically, we need a national commitment to survivor support. Children who have experienced abuse, especially when that abuse is digitised and distributed, require specialised care. From trauma-informed counselling and medical support to legal aid and safe housing, survivors need pathways to recovery that are compassionate and sustained.
  • And finally, we need research. There is an urgent need for academic institutions and civil society to be empowered to study the scale, nature, and consequences of online child sexual exploitation in India. We must stop depending solely on foreign data sets. India must invest in its national data infrastructure while maintaining international collaboration.

The truth is, the numbers we are seeing may be the tip of the iceberg. And for each data point we miss, we risk failing a child. This is not just a criminal justice issue. It is a societal one. It is about the kind of digital environment we are willing to accept—and the kind of country we want our children to grow up in.

If India is serious about building a safe digital future, then child protection must be placed at the core of our internet governance strategy. That includes robust law enforcement, yes—but also education, prevention, corporate accountability, and above all, compassion. Even one child harmed is one too many. And 2.3 million reports are not just numbers—they are a wake-up call. The question is: Will we listen?

This article is authored by Ranjana Kumari, founder, Centre for Social Research, New Delhi.
