Latest news with #CSAM

Yahoo
a day ago
- Yahoo
Pottsville man pleads guilty to CSAM charges
POTTSVILLE — A 51-year-old city man pleaded guilty Friday to possessing child sexual abuse material (CSAM). James J. Wagner, who faced more than 40 felony counts related to CSAM, entered a guilty plea on four of those charges in Schuylkill County Court. Wagner and his attorney, Edward M. Olexa of Hazleton, appeared in front of Judge William L. J. Burke for a status conference. Wagner pleaded guilty to one count of criminal use of a communication facility and three counts of child pornography, all third-degree felonies. Each charge carries a maximum penalty of seven years in prison and a $15,000 fine, Burke said. The Pennsylvania State Police Bureau of Criminal Intelligence charged Wagner on Sept. 23, 2024. The defendant faced 41 charges in total, including 20 felony counts of disseminating photo/film of child sex acts and 20 of child pornography. The commonwealth opted not to prosecute the remaining charges, per the plea agreement. Wagner has been free on bail since Sept. 26.


Sky News
2 days ago
- Sky News
Teachers given new guidance in dealing with AI-generated child sexual abuse material
Guidelines on how to deal with AI-generated child sexual abuse material (CSAM) have been issued to 38,000 teachers and staff across the UK. The guidelines are an attempt to help people working with children tackle the "highly disturbing" rise in AI-generated CSAM. They have been issued by the National Crime Agency (NCA) and the Internet Watch Foundation (IWF). The AI-generated content is illegal in the UK and is treated the same as any other sexual abuse imagery of children, even if the imagery isn't photorealistic. "The rise in AI-generated child sexual abuse imagery is highly disturbing and it is vital that every arm of society keeps up with the latest online threats," said safeguarding minister Jess Phillips. "AI-generated child sexual abuse is illegal and we know that sick predators' activities online often lead to them carrying out the most horrific abuse in person. "We will not allow technology to be weaponised against children and we will not hesitate to go further to protect our children online," she said. The guidelines suggest that if young people are using AI to create nude images from each other's pictures - known as nudifying - or creating AI-generated CSAM, they may not be aware that what they're doing is illegal. Nudifying is when a non-explicit picture of someone is edited to make them appear nude and is increasingly common in "sextortion" cases - when someone is blackmailed with explicit pictures. "Where an under-18 is creating AI-CSAM, they may think it is 'just a joke' or 'banter' or do so with the intention of blackmailing or harming another child," suggests the guidance. "They may or may not recognise the illegality or the serious, lasting impact their actions can have on the victim." Last year, the NCA surveyed teachers and found that over a quarter weren't aware AI-generated CSAM was illegal, and most weren't sure their students were aware either.
The IWF has seen an increasing amount of AI-generated CSAM as it scours the internet, processing 380% more reports of the abuse in 2024 than in 2023. "The creation and distribution of AI-manipulated and fake sexual imagery of a child can have a devastating impact on the victim," said Derek Ray-Hill, interim chief executive at the IWF. "It can be used to blackmail and extort young people. There can be no doubt that real harm is inflicted and the capacity to create this type of imagery quickly and easily, even via an app on a phone, is a real cause for concern." Multiple paedophiles have been sent to jail for using artificial intelligence to create child sexual abuse images in recent years. Last year, Hugh Nelson was sentenced to 18 years in jail for creating AI-generated CSAM that police officers were able to link back to real children. "Tackling child sexual abuse is a priority for the NCA and our policing partners, and we will continue to investigate and prosecute individuals who produce, possess, share or search for CSAM, including AI-generated CSAM," said Alex Murray, the NCA's director of threat leadership and policing lead for artificial intelligence. In February, the government announced that AI tools designed to generate child sex abuse material would be made illegal under "world-leading" legislation. In the meantime, however, campaigners called for guidance to be issued to teachers. Laura Bates, the author of a book on the spread of online misogyny, told MPs earlier this month that deepfake pornography "would be the next big sexual violence epidemic facing schools, and people don't even know it is going on." "It shouldn't be the case that a 12-year-old boy can easily and freely access tools to create these forms of content in the first place," she said.


New Indian Express
4 days ago
- New Indian Express
Mizoram Child Rights Commission demands stringent punishment against man for sexual abuse
AIZAWL: The Mizoram State Commission for Protection of Child Rights (MSCPCR) has demanded stringent punishment for a man who was arrested recently in a case involving the sexual assault of a minor and the possession and circulation of child sexual abuse material (CSAM). Speaking to reporters in Aizawl on Tuesday, MSCPCR chairperson Jimmy Laltlanmawia said that Lalrampana was arrested by the CBI on June 9 for allegedly sexually assaulting a minor and uploading a video of the act online. "MSCPCR expresses deep regret over the incident and demands stringent punishment against the accused," he said. Jimmy said that the POCSO Act strictly prohibits sexual assault against children as well as the possession or online sharing of videos or images of child sexual abuse. A violator can be sentenced to five years' imprisonment and fined up to Rs 10 lakh, rising to seven years' imprisonment and a Rs 10 lakh fine for a repeat offence, he said. Expressing regret over incidents of child sexual abuse in Mizoram, Jimmy said that the commission wants to ensure such incidents do not occur again in the state. He said that there are currently 194 under-trial cases under the POCSO Act in various district courts across the state.


Business Upturn
4 days ago
- Business Upturn
Accelerating Child Exploitation Investigations: Cellebrite Integrates Data from the National Center for Missing and Exploited Children (NCMEC)
TYSONS CORNER, Va., June 24, 2025 (GLOBE NEWSWIRE) — Cellebrite (NASDAQ: CLBT), a global leader in premier Digital Investigative solutions for the public and private sectors, today announced the expansion of its relationship with the National Center for Missing and Exploited Children (NCMEC) that will help speed up investigations involving crimes against children. NCMEC's CyberTipline hash value list is now integrated within Cellebrite's flagship digital forensics software, Cellebrite Inseyets, allowing public safety agencies to immediately pinpoint known child sexual abuse material (CSAM) files – speeding up time to evidence and justice for victims and survivors of abuse. The hash value list contains approximately 10 million files reported by electronic service providers to NCMEC, which have been confirmed to depict apparent CSAM. Instead of spending hours reviewing data to locate CSAM on suspected offenders' devices, this integration allows digital forensic examiners and investigators around the world to match CSAM files instantly. This provides investigators with the evidence needed to arrest and prosecute offenders, and in parallel, limits law enforcement's exposure to the material, which helps protect their mental health. 'This integration represents a critical leap forward in our efforts to protect children and hold offenders accountable,' said John Shehan, Senior Vice President, Exploited Children Division & International Engagement at NCMEC. 'We're proud to strengthen our nine-year partnership with Cellebrite in the fight to end online child exploitation.' 'Any tool that speeds up time to evidence is critical for our teams,' said Ben Morrison, the Washington Internet Crimes Against Children (ICAC) Task Force Commander. 'Digital evidence is the holy grail in ICAC investigations, and this integration means getting to more cases and protecting more kids.' New Hampshire ICAC Task Force Commander Eric Kinsman adds, 'We are very excited about this integration.
When a known CSAM match is made, it adds to the probable cause in an investigation, which greatly increases our chances to arrest an offender, ensuring they are no longer a danger in our community.' 'Our mission is in lock step with NCMEC, and it's an honor to partner with them and help the heroes working these cases on the front lines,' said David Gee, Cellebrite's chief marketing officer. 'This integration will be a game changer and will undoubtedly save and prevent our most vulnerable from the most heinous crimes.' This integration, available to Cellebrite Design Partners for early access now and generally available the week of June 30, 2025, is part of Cellebrite's 'Operation Find Them All' (OFTA) initiative. The landmark program is helping public safety agencies use technology to protect children, alongside strategic partners including NCMEC, The Exodus Road and Raven. Since launching in January 2024, OFTA has assisted in numerous investigations that have helped rescue hundreds of victims and resulted in the arrests of dozens of perpetrators. OFTA is playing an important, active, ongoing role in helping to further investigations where NCMEC is assisting public safety agencies in cases involving missing and endangered children.

References to Websites and Social Media Platforms
References to information included on, or accessible through, websites and social media platforms do not constitute incorporation by reference of the information contained at or available through such websites or social media platforms, and you should not consider such information to be part of this press release.

About Cellebrite
Cellebrite's (Nasdaq: CLBT) mission is to enable its global customers to protect and save lives by enhancing digital intelligence and accelerating justice in communities around the world. Cellebrite's AI-powered Case-to-Closure (C2C) platform enables customers to lawfully access, collect, analyze and share digital evidence in legally sanctioned investigations while preserving data privacy. Thousands of public safety organizations, intelligence agencies and businesses rely on the Company's cloud-ready digital forensic and investigative solutions to close cases faster and safeguard communities. To learn more, visit us at and find us on social media @Cellebrite.

About NCMEC
The National Center for Missing & Exploited Children is a private, non-profit 501(c)(3) corporation whose mission is to help find missing children, reduce child sexual exploitation, and prevent child victimization. NCMEC works with families, victims, private industry, law enforcement, and the public to assist with preventing child abductions, recovering missing children, and providing services to deter and combat child sexual exploitation.

Contacts:
Media: Victor Cooper, Sr. Director of Global Corporate Communications, [email protected], +1 404.510.2823
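At its core, the workflow the press release describes is hash-set matching: every file on a seized device is hashed, and each digest is looked up in a curated list of hashes of known material, so examiners get matches without having to view file contents. A minimal sketch of that general technique, assuming SHA-256 digests and a plain set lookup (the actual hash algorithms and list format NCMEC and Cellebrite use are not specified in the release):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Hash a file in chunks so large files are never fully loaded into memory."""
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def match_known_hashes(root: Path, known_hashes: set[str]) -> list[Path]:
    """Return every file under `root` whose digest appears in the known-hash set."""
    return [p for p in sorted(root.rglob("*"))
            if p.is_file() and sha256_of(p) in known_hashes]
```

The design point the article makes follows directly from this shape: the match is established by the digest alone, so investigators can flag known files without reviewing their contents, which is what limits examiners' exposure to the material.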


Express Tribune
5 days ago
- Express Tribune
Adult film actor Austin Wolf pleads guilty to enticing a 15-year-old into sexual activity
Justin Heath Smith, professionally known as adult film actor Austin Wolf, has pleaded guilty to the federal charge of enticing a minor for sex, according to court documents cited by the New York Post. The plea, entered on June 20, comes nearly a year after his arrest in New York on child sexual abuse material (CSAM) charges. The federal offense of 'enticement of a minor' carries a mandatory minimum sentence of 10 years in prison. Smith expressed remorse during his court appearance, admitting to having induced a 15-year-old to engage in a sex act. 'I knew it was wrong when I did it. I take full 100% responsibility for my actions and I am prepared for the consequences,' he said. As part of the plea deal, the original CSAM charges—possession and distribution, which carried potential sentences of up to 20 years—will be dropped. The agreement also includes Smith's forfeiture of more than 25 electronic devices connected to the investigation. Smith had been under investigation since his arrest in June 2024, when FBI agents executed a search warrant at his apartment and allegedly discovered an SD card containing around 200 CSAM videos. According to the Justice Department's complaint, Smith was accused of using the encrypted messaging app Telegram to send and receive explicit material involving minors, including images of children under 12 and videos of infants. He is currently being held without bail at New York City's Metropolitan Detention Center, a high-security facility known for housing high-profile detainees. His sentencing is scheduled for September 9. Smith's attorneys had recently filed to reschedule a hearing, confirming his intention to plead guilty to the charge carrying the mandatory minimum sentence.