AI turbocharges child abuse as image-creation made easy


The Advertiser | 17-07-2025
Australia needs to re-examine how it tackles child sexual exploitation as experts warn rapid development in artificial intelligence is widening gaps exploited by perpetrators.
International Centre for Missing and Exploited Children chief executive Colm Gannon said the organisation had received a 1325 per cent spike in reports involving AI-generated child sexual abuse material in a year.
The centre received more than 67,000 reports on the matter in 2024.
Experts and government officials convened at Parliament House for a round table to address the increasing use of AI in the sexual exploitation of children.
Child safety advocates called for the explicit criminalisation of the use and possession of software designed to generate child sexual exploitation material.
"I have been involved in investigations where there is active trading and profiteering from using these models, it's a pay-as-you-use design that's happening within child sexual offender communities," Mr Gannon told reporters in Canberra on Thursday.
"There is no social-positive reason why people are going to be in possession of this software except to generate child sexual abuse material."
A 10-year government plan released in 2021 to address child protection needed to be updated to capture new technology as it didn't mention AI and associated harms, he said.
Child abuse survivor Grace Tame said there needed to be a broader review into tackling child sexual exploitation, as the royal commission into institutional child sexual abuse more than a decade ago had failed to examine key issues.
"It was very specifically focused on institutional child sexual abuse and the responses of institutions," the former Australian of the Year said.
"Incest accounts for the overwhelming majority of all child sexual abuse.
"A lot of this is taking place in the home, a lot of the online content that we're seeing is often filmed by parents and distributed by parents, and there's no institution involved in that."
Jon Rouse, who worked in law enforcement for nearly four decades and tackled online child exploitation material, called for authorities to be given greater resources and new tools to quickly identify victims and combat the crime.
"The tragedy about that is that if we don't find them quickly, they get buried in a landslide of new content," he said of child abuse content.
Mr Rouse also demanded risk assessments for new technology as social media algorithms pushed users toward disturbing and harmful content.
"The tragedy is we're at a point now where we're having to ban our kids from social media, because we can't rely on any sector of the industry to protect our kids, which is pretty sad," he said.
One social media app kept suggesting AI-generated content of scantily clad mothers with young children, he said, showing reporters a series of photos.
"They're not sexually explicit but they are telling you something about the people that created them," Mr Rouse said.
There also needed to be community-wide education on how to spot problem behaviours and precipitating actions from offenders, Ms Tame said.
"We've been talking about early childhood education - these kids are pre-verbal, so they're even more vulnerable," she said.
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028


Related Articles


Child abuse images a click away as experts warn predators are using AI

7NEWS

17-07-2025


Calls to criminalise possession and use of AI tools that create child abuse material

SBS Australia

17-07-2025


Child safety advocates have met at Parliament House to confront the problem of child safety in the age of artificial intelligence. From the rising prevalence of deepfake imagery to the availability of so-called nudify apps, the use of AI to sexually exploit children is growing exponentially, and there is concern Australian laws are falling behind.

The round table was convened by the International Centre for Missing and Exploited Children (ICMEC) Australia. Chief executive Colm Gannon says Australia's current child protection framework, introduced just three years ago, fails to address the threat of AI, and he is calling on the government to make it a priority: "(By) bringing it into the national framework that was brought around in 2021, a 10-year framework that doesn't mention AI. We're working to hopefully develop solutions for government to bring child safety in the age of AI at the forefront."

Earlier this year, the United Kingdom became the first country in the world to introduce AI sexual abuse offences to protect children from predators generating images with artificial intelligence. Gannon is leading calls for similar legislation in Australia. "What we need to do is look at legislation that's going to be comprehensive - comprehensive for protection, comprehensive for enforcement and comprehensive to actually be technology neutral."

Former Australian of the Year Grace Tame says online child abuse needs to be addressed at a society-wide level. "Perpetrators are not just grooming their individual victims, they're grooming their entire environments to create a regime of control in which abuse can operate in plain sight. And there are ways, through education aimed not just at children and the people who work with children but at the entire community, to identify these precipitating behaviours that underpin the contact offending framework - things like how offenders target victims, and which victims they are targeting specifically."

In 2023, intelligence company Graphika reported that the use of synthetic, non-consensual intimate imagery was becoming more widespread, moving from niche internet forums to an automated and scaled online business. It found that 34 such image providers received more than 24 million unique visitors to their websites, while links to these services increased on platforms including Reddit and X.

As part of Task Force Argos, former police officer Professor Jon Rouse pioneered Australia's first proactive operations against internet child sex offenders, and he has also chaired INTERPOL's Covert Internet Investigations Group. Now working with Childlight Australia, he says big tech providers such as Apple have a responsibility to build safety controls into their products: "Apple has got accountability here as well. They just put things on the app store, they get money every time somebody downloads it, but there's no regulation around this - you create an app, you put it on the app store. Who checks and balances the damage that's going to cause? No-one. The tragedy is we're at a point now that we have to ban our kids from social media, because we can't rely on any sector of the industry to protect our kids. Which is pretty sad."

Apple has not responded to a request for comment, but in 2021 it announced it had introduced new features designed to help keep young people safe, such as sending warnings when children receive, or attempt to send, images or videos containing nudity.

Although potentially dangerous, AI can also be used to detect grooming behaviour and child sexual abuse material, and Gannon says these opportunities need to be harnessed: "The other thing that we want to do is use law enforcement as a tool to help identify victims. There is technology out there that can assist in rapid and easy access to victim ID, and what's happening at the moment is law enforcement are not able to use that technology."
