
Latest news with #KidsOnlineSafetyAct

Epstein Scandal Is Just the Beginning of Government's Failures To Protect Kids

Newsweek

2 days ago

  • Politics
  • Newsweek


If people are really concerned about child sex trafficking, they should not treat it like a political tool. While it's always a good thing when child sexual exploitation receives the attention it deserves, the issue should be more than political fodder or a talking point for conspiracy theorists. It is, of course, important for the public to recognize and expose powerful people who exploit the vulnerable. That list is long—allegations have plagued Matt Gaetz, Bill Cosby, Robert Kraft, Linda McMahon, and Sean Combs, to name a few from recent headlines. The circumstances around Jeffrey Epstein's ability to obtain a secret plea deal while running an international child sex ring should absolutely be exposed.

That being said, the problem of child sexual abuse and exploitation merits sustained attention. If people really care about sexual exploitation and trafficking, their focus should be on how the powerful fail to put children first, both in politics and in society.

A major battleground in the fight to protect children is social media and AI regulation. Following reports that leading Big Tech companies knew about their products' negative effects on children—including increased risks of sexual exploitation—legislators sprang into action. At the same time, however, other leading congressional figures have disrupted those efforts, allowing the exploitation to continue unchecked.

Last year, for example, House Speaker Mike Johnson (R-La.) and Majority Leader Steve Scalise (R-La.) did Big Tech's bidding to sink the Kids Online Safety Act, preventing it from receiving a vote in the House after it passed the Senate 91-3. Conservative news outlets identified Meta's control over House leadership as key to killing the child protection legislation. This was the same congressional leadership that slipped a Big Tech-friendly, non-budget provision into the budget bill that would have repealed all state legislation trying to curb AI exploitation of children. Senator Ted Cruz (R-Tex.) tried to do the same in the Senate, but was fortunately defeated when Sens. Marsha Blackburn (R-Tenn.) and Maria Cantwell (D-Wash.) had the courage to amend it—their amendment passed 99-1. But now that courage is being rewarded by Big Tech efforts to sneak the regulation moratorium into other legislation.

Big Tech spent $51 million on lobbying last year alone, including to defeat legislation such as child online safety measures. Meta alone spent nearly $6 million last quarter lobbying against, among other things, child online safety provisions. The public should condemn these actions, and as we've already seen, when child protection legislation actually gets to a vote, it passes by overwhelming margins.

If those who are upset about Jeffrey Epstein really care about child sex trafficking, they should also be outraged at the Trump administration, which is currently undermining child sex trafficking protections.

The Department of State "shut down" the office primarily responsible for combating human trafficking—the Office to Monitor and Combat Trafficking in Persons—a move that one former U.S. ambassador-at-large to monitor and combat trafficking in persons describes as making it "impossible" to carry out what the law requires to address human trafficking. The administration also cut funding to a group that was tracking the nearly 20,000 Ukrainian children kidnapped from their families by Russian soldiers and brought to Russia, a well-known hub of human trafficking. Domestically, the Department of Justice cut more than $500 million in grants to victim services, including "direct funding for victims' services for survivors of human trafficking." It also gutted the Civil Rights Division, which houses the Human Trafficking Prosecution Unit.

If those following the Epstein scandal really care about the sexual assault of children, they should keep acting after the Epstein story fades to address the government's assault on the nearly 40 percent of trafficking victims who are children. The government—by carrying Big Tech's water in legislatures, by cutting offices that enforce trafficking laws, and by cutting services to trafficking victims—is not only abandoning victims but facilitating their exploitation. Activists who truly care about child exploitation must continue to pressure Congress and the executive branch to protect children as vigorously as they demand answers in the Epstein case.

Mary Graw Leary is a Professor of Law at the Catholic University of America focusing on criminal law and human trafficking, and a former state and federal prosecutor. The views expressed in this article are the writer's own.

Instagram blocks 135K accounts for preying on kids, adds safety features

New York Post

23-07-2025

  • Business
  • New York Post


Mark Zuckerberg's Instagram rolled out new safety features for teens and kids – admitting that it was forced to block nearly 135,000 accounts earlier this year for predatory behavior.

Meta – which has faced heat from federal and state officials over its failure to protect kids – said on Wednesday its safety teams blocked the nearly 135,000 accounts 'for leaving sexualized comments or requesting sexual images from adult-managed accounts featuring children under 13.'

'We also removed an additional 500,000 Facebook and Instagram accounts that were linked to those original accounts,' the company said in a blog post. 'We let people know that we'd removed an account that had interacted inappropriately with their content, encouraging them to be cautious and to block and report,' the post added.

Now, teen users will be given more information when unknown accounts send them direct messages – including details on when the account was created and tips on how to stay safe, the company said in the Wednesday blog post.

Teen accounts on Instagram also have an updated 'block and report' feature, allowing users to flag predatory accounts and block them in a single step, rather than having to do so separately. In June alone, teen users blocked accounts one million times and reported an additional one million after being shown a safety notice, according to Instagram.

The social media giant has scrambled to reassure the public that Facebook and Instagram are safe. As The Post reported, the problem recently surfaced during the FTC's trial seeking a forced spinoff of Instagram and WhatsApp. The feds presented internal documents showing how Meta officials had panicked in past years about 'groomers' targeting kids on Instagram.

Last year, Instagram began automatically placing users under age 18 into 'teen accounts' and blocking people who do not follow them from viewing their content or interacting with them. The company has also introduced features designed to automatically shield underage users from messages containing nude images.

Earlier this year, a bipartisan group of US senators reintroduced the Kids Online Safety Act, which would impose a legal 'duty of care' on Meta and other social media firms to protect underage users from harm. The legislation passed the Senate last year in an overwhelming 91-3 vote, but stalled in the House and was ultimately tabled.

Meta updates safety features for teens. More than 600,000 accounts linked to predatory behavior

CNBC

23-07-2025

  • Business
  • CNBC


Meta on Wednesday introduced new safety features for teen users, including enhanced direct messaging protections to prevent "exploitative content."

Teens will now see more information about who they're chatting with, like when the Instagram account was created and other safety tips, to help them spot potential scammers. Teens will also be able to block and report accounts in a single action. "In June alone, they blocked accounts 1 million times and reported another 1 million after seeing a Safety Notice," the company said in a release.

The update is part of a broader push by Meta to protect teens and children on its platforms, following mounting scrutiny from policymakers who accused the company of failing to shield young users from sexual exploitation.

Meta said it removed nearly 135,000 Instagram accounts earlier this year that were sexualizing children on the platform. The removed accounts were found to be leaving sexualized comments or requesting sexual images from adult-managed accounts featuring children. The takedown also included 500,000 Instagram and Facebook accounts that were linked to the original profiles.

Meta is now automatically placing teen and child-representing accounts into the strictest message and comment settings, which filter out offensive messages and limit contact from unknown accounts. Users have to be at least 13 to use Instagram, but adults can run accounts representing younger children as long as the account bio makes clear that an adult manages the account.

The platform was recently accused by several state attorneys general of implementing addictive features across its family of apps that have detrimental effects on children's mental health. Meta announced last week that it removed about 10 million profiles for impersonating large content producers through the first half of 2025, part of an effort by the company to combat "spammy content."

Congress has renewed efforts to regulate social media platforms with a focus on child safety. The Kids Online Safety Act was reintroduced to Congress in May after stalling in 2024. The measure would require social media platforms to have a "duty of care" to prevent their products from harming children. New Mexico sued Snapchat in September, alleging the app was creating an environment where "predators can easily target children through sextortion schemes."

TikTok loses bid to dismiss lawsuit alleging its 'addictive design' exploits kids

CNBC

11-07-2025

  • Business
  • CNBC


A New Hampshire judge this week rejected TikTok's attempt to dismiss a case accusing it of using manipulative design features aimed at children and teens.

"The Court's decision is an important step toward holding TikTok accountable for unlawful practices that put children at risk," Attorney General John Formella said in a statement Friday. Judge John Kissinger Jr. of New Hampshire Superior Court ruled on Tuesday that the allegations were valid and specific enough to proceed, writing that the claims are "based on the App's alleged defective and dangerous features" and not on the content in the app.

The state alleges that TikTok is intentionally designed to be addictive and aims to exploit its young user base. The suit accuses the platform of implementing "addictive design features" meant to keep children engaged longer, increasing their exposure to advertisements and prompting purchases through TikTok's e-commerce platform, TikTok Shop. CNBC has reached out to TikTok for comment.

The case is the latest example of attorneys general targeting design elements and safety policies of tech companies rather than the content on their platforms, which is created by other users. Meta was accused by several states of implementing addictive features across its family of apps that have detrimental effects on children's mental health. New Mexico filed a lawsuit against Snapchat in September, alleging the app was creating an environment where "predators can easily target children through sextortion schemes." In April, the New Jersey attorney general sued social-messaging platform Discord, accusing it of misleading consumers about child safety features.

Congress has attempted to take action on regulating social media platforms, but to no avail. The Kids Online Safety Act was reintroduced to Congress in May after stalling in 2024. The measure would require social media platforms to have a "duty of care" to prevent their products from harming children.

TikTok's latest legal hurdle comes as its future in the U.S. remains uncertain. In April 2024, President Joe Biden signed a law requiring ByteDance to divest TikTok or see the app banned in the U.S. The app was removed from Apple and Google app stores in January ahead of President Donald Trump's inauguration. Since taking office, Trump has postponed enforcement of the ban and continued to push back deadlines. In June, Trump granted ByteDance more time to sell off its U.S. TikTok operations, marking his third extension; the updated deadline is now set for September 17. Trump also said in June that a group of "very wealthy people" is ready to buy TikTok and told reporters that he would be having discussions with China about a potential deal.

TikTok is also building a new version of its app for U.S. users, The Information reported. The standalone app is expected to operate on a separate algorithm and data system.

Behind the tanked AI moratorium

Politico

03-07-2025

  • Politics
  • Politico


With help from Mohar Chatterjee

Programming note: We'll be off Friday but will be back in your inboxes on Monday.

WASHINGTON WATCH

At the center of Sen. Marsha Blackburn's (R-Tenn.) decision this week to withdraw support for an artificial intelligence measure she worked on with Sen. Ted Cruz (R-Texas) was her Kids Online Safety Act, according to four people familiar with the situation. The four people, who were granted anonymity to speak about private discussions, said Blackburn had hoped to limit the scope and span of a pause on state AI rules and in turn advance her legislation to a markup in the Senate Commerce Committee, which Cruz chairs.

Key context: Last year, Blackburn and her bill co-sponsor Sen. Richard Blumenthal (D-Conn.) nearly got the measure passed. The bill, which requires tech companies to ensure their platforms are designed for kids' safety, passed 91-3 in the Senate. Despite the support of former Surgeon General Vivek Murthy, 35 attorneys general and doctor groups, the bill never came to the floor for a vote in the House.

Since then, progress has been slow. Cruz hasn't brought up KOSA for a markup, even though his committee has advanced other online safety bills this year, including the Kids Off Social Media Act, which he co-sponsored with Sen. Brian Schatz (D-Hawaii), as well as the Children and Teens Online Privacy Protection Act.

In hopes of moving her bill, Blackburn worked out an amendment with Cruz on the AI moratorium, the four people said. The amendment would have allowed states to enforce laws passed in recent years to keep kids safe from online sexual predators, bullying, drug sales and other negative health impacts. But it still prevented states from putting an "undue or disproportionate burden" on AI systems.

What happened next: Midday Monday, a coalition of 130 organizations wrote to Senate leaders explaining that the exemption in Blackburn and Cruz's amendment wouldn't work because state laws protecting children and adults online are intended to burden AI platforms. Some 17 Republican governors also opposed the moratorium, including Tennessee Gov. Bill Lee.

Blackburn went to Cruz on Monday and tried to get the "undue or disproportionate burden" language removed from the bill, but Cruz wouldn't budge, one of the people said. By Monday night, she had amended her amendment to Cruz's legislation to include KOSA, tying the amendment's passage to a bill that would create a national framework for protecting kids, according to three of the people. The move was symbolic — the amendment was not likely to pass. But it signaled her priority.

She then teamed up with Sen. Maria Cantwell (D-Wash.) to introduce a new amendment striking the original AI moratorium from the megabill altogether. In the predawn hours on Tuesday, that amendment passed 99-1. (Sen. Thom Tillis (R-N.C.) was the lone nay vote.) Cruz, clearly frustrated, said the moratorium had Trump's support. Cruz's office didn't return a request for comment, and Blackburn's office declined to comment.

WELCOME TO FUTURE PULSE

This is where we explore the ideas and innovators shaping health care. We will be off tomorrow for the holiday — and will be back in your inboxes on Monday! Share any thoughts, news, tips and feedback with Danny Nguyen at dnguyen@ Carmen Paun at cpaun@ Ruth Reader at rreader@ or Erin Schumaker at eschumaker@ Want to share a tip securely? Message us on Signal: Dannyn516.70, CarmenP.82, RuthReader.02 or ErinSchumaker.01.
CHECKUP

A health care system serving 2.5 million patients across Massachusetts is turning to artificial intelligence to keep people out of the emergency room during heat waves. When extreme heat hits, emergency departments are packed with people experiencing dehydration, heat cramps or kidney or heart problems, reports POLITICO's Ariel Wittenberg.

"We'll see a 10 percent jump of people in the emergency department not just for heat illness, but also weakness or syncope or other conditions due to the heat," said Dr. Paul Biddinger, chief preparedness and continuity officer at Mass General Brigham, a nonprofit academic health system that's developing a new alert system to warn people about the dangers of heat waves.

In February, MGB was one of five applicants selected to join a sustainability accelerator operated by IBM. The program aims to help communities facing environmental and economic stress through the use of technology. IBM received more than 100 proposals on how to use AI to advance climate sustainability and resilience.

The idea is simple: use AI to comb through electronic health records to identify patients whose health conditions or medications might make them particularly vulnerable to heat. The program would warn those patients when a heat wave is approaching and instruct them on how to protect themselves so they don't end up in an emergency room. The tool would include security features to protect patient health information. Ideally, the combination of personalized information, real-time heat data and "actionable messages" will help empower patients to protect themselves. "We think patients will pay more attention if it is their doctor, their hospital saying, 'Hey, you're at risk and here's what to do,' than if they just see on the news that it will be hot tomorrow," Biddinger said.

Why it matters: Heat kills an estimated 2,300 people every year in the United States, more than any other type of extreme weather event, according to federal records, and results in the hospitalization of thousands of others. Those numbers are expected to increase as climate change turbocharges temperatures, with one estimate from the Center for American Progress calculating that emergency rooms could be inundated with an additional 235,000 visitors each summer. The same report estimates that health care costs related to extreme heat would amount to $1 billion annually.
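The newsletter describes the alert pipeline only at a high level, so the following Python sketch is purely illustrative of that idea: flag patients whose recorded conditions or medications suggest heat vulnerability, then message them when the forecast crosses a threshold. The record fields, risk lists, temperature threshold and the send_alert stand-in are all assumptions for illustration, not details of the MGB/IBM system.

```python
# Illustrative sketch only: a toy version of the heat-alert idea described above.
# The data model, risk criteria and alerting hook are assumptions, not MGB's or IBM's design.
from dataclasses import dataclass, field

# Conditions and medication classes commonly associated with heat sensitivity (assumed lists).
HEAT_RISK_CONDITIONS = {"chronic kidney disease", "heart failure", "copd"}
HEAT_RISK_MEDICATIONS = {"diuretic", "beta blocker", "anticholinergic"}
HEAT_WAVE_THRESHOLD_F = 95.0  # assumed trigger temperature

@dataclass
class PatientRecord:
    patient_id: str
    conditions: set[str] = field(default_factory=set)
    medications: set[str] = field(default_factory=set)
    contact: str = ""  # e.g. a phone number or patient-portal handle

def is_heat_vulnerable(p: PatientRecord) -> bool:
    """Flag patients whose conditions or medications suggest heat vulnerability."""
    return bool(p.conditions & HEAT_RISK_CONDITIONS or p.medications & HEAT_RISK_MEDICATIONS)

def send_alert(p: PatientRecord, forecast_high_f: float) -> None:
    """Stand-in for a real notification channel (SMS, patient portal, etc.)."""
    print(f"[alert] {p.patient_id}: forecast high {forecast_high_f:.0f}F - stay cool, hydrate, "
          "and contact your care team if you feel unwell.")

def run_heat_alerts(records: list[PatientRecord], forecast_high_f: float) -> int:
    """Send alerts to flagged patients when the forecast crosses the threshold."""
    if forecast_high_f < HEAT_WAVE_THRESHOLD_F:
        return 0
    flagged = [p for p in records if is_heat_vulnerable(p)]
    for p in flagged:
        send_alert(p, forecast_high_f)
    return len(flagged)

if __name__ == "__main__":
    demo = [
        PatientRecord("pt-001", conditions={"heart failure"}),
        PatientRecord("pt-002", medications={"diuretic"}),
        PatientRecord("pt-003"),  # no flagged risk factors
    ]
    print(run_heat_alerts(demo, forecast_high_f=98.0), "patients alerted")
```

A production system of the kind described would of course sit behind the health system's records platform and privacy controls; the sketch only shows the filter-then-notify shape of the idea.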
