
Latest news with #HeatInitiative

Apple's App Store is getting stricter

Phone Arena

4 days ago



Earlier this summer, Apple introduced iOS 26, which is already in public beta, and with it came a batch of new child safety features, including one that can freeze your FaceTime video and audio if someone starts undressing during a call. Now Apple is pushing even further: as a behind-the-scenes part of its new family protection tools, it is automatically changing the age ratings for all apps and games on the App Store.

"These updates are already live if you are running the beta versions of iOS 26, iPadOS 26, macOS Tahoe 26, tvOS 26, visionOS 26, or watchOS 26." – Apple, July 2025

Up until now, App Store age ratings were fairly coarse, limited to the 4+, 9+, 12+, and 17+ tiers. The new system adds 13+, 16+, and 18+ categories to better reflect what kind of content an app contains. Ratings will still vary by region depending on local content standards.

To get those new age labels right, Apple is also updating the questionnaire developers have to fill out when submitting an app. New required questions cover everything from in-app controls and capabilities to health-related content and violent themes. In short, Apple is trying to get a clearer picture of what each app actually does, and whether it is safe for younger users.

So why is this happening now? A few things may have pushed Apple in this direction.

One is criticism. Last year, a report from the Heat Initiative and ParentsTogether Action slammed both Apple and Google for not doing enough to keep kids safe in their app stores. Of nearly 800 apps reviewed, more than 200 were flagged for "concerning content or features", and many of them were still being marketed to children. Messaging apps like Messenger, for example, are still rated 4+, which some see as a problem.

[Screenshot by PhoneArena]

Another factor could be legal pressure. The App Store Accountability Act, introduced earlier this year, could eventually force companies like Apple and Google to verify users' ages before they can download apps. While it is not yet a nationwide rule, a few states, including Texas, have already passed versions of it, meaning stricter age checks could be just around the corner.

Bottom line: this new age rating system is not just a cosmetic change. It is Apple stepping up its game to create a safer App Store, and possibly staying ahead of new regulations.

Shareholders to Demand Action from Mark Zuckerberg and Meta on Child Safety

Yahoo

27-05-2025



Investors will vote on a child safety resolution at Meta's Annual General Meeting

MENLO PARK, Calif., May 27, 2025 /PRNewswire/ -- Tomorrow, Meta shareholders will vote on a resolution asking Meta to assess its child safety impacts and whether harm to children on its platforms has been reduced. The vote follows reports that the company's Instagram Teens feature "fails spectacularly on some key dimensions", including promoting sexual, racist, and drug- and alcohol-related content.

The resolution, filed by Proxy Impact on behalf of Dr. Lisette Cooper and co-filed by 18 institutional investors from North America and Europe, will be presented by child safety advocate Sarah Gardner, CEO of the Heat Initiative.

"Two weeks ago, I stood outside of Meta's office in NYC with bereaved parents whose children died as a result of sextortion, cyberbullying, and drug purchases on Meta's platforms and demanded stronger protections for kids," said Sarah Gardner, CEO of the Heat Initiative. "Meta's most recent 'solution' is a bandaid. They promised parents that Instagram Teens would protect their kids from harm. In reality, it still recommends sexual, racist, and violent content on their feeds. We are asking shareholders to hold Mark Zuckerberg and Meta accountable and demand greater transparency about why child safety is still lagging."

"Meta algorithms designed to maximize user engagement have helped build online abuser networks, normalize cyberbullying, enable the exponential growth of child sexual abuse materials, and flood young users with addictive content that damages their mental health," said Michael Passoff, CEO of Proxy Impact. "And now, a major child safety concern is Meta's doubling down on AI despite the unique threats it poses to young users. Just this year, the National Center for Missing and Exploited Children saw 67,000 reports of suspected child sexual exploitation involving generative AI, a 1,325% increase from 2023. Meta's continued failure to address these issues poses significant regulatory, legal, and reputational risk, in addition to endangering innumerable young lives."

The resolution asks the Meta Board of Directors to publish "a report that includes targets and quantitative metrics appropriate to assessing whether and how Meta has improved its performance globally regarding child safety impacts and actual harm reduction to children on its platforms." Additional information for shareholders has been filed with the SEC.

Meta has been under pressure for years over online child safety risks, including:

  • Attorneys general from 41 states and the District of Columbia filing lawsuits alleging that Meta Platforms has intentionally built programs with addictive features that harm young users.
  • 1 out of 8 kids under 16 reporting unwanted sexual advances on Instagram within the previous 7 days, according to Meta's internal research.
  • A leading psychologist resigning from her position on Meta's SSI expert panel on suicide prevention and self-harm, alleging that Meta is willfully neglecting harmful content, disregarding expert recommendations, and prioritizing financial gain.
  • As many as 100,000 children being sexually harassed daily on Meta platforms in 2021, with Meta taking no action until it was called for Senate testimony three years later.
  • Internal research leaked by Meta whistleblower Frances Haugen showing that the company is aware of many harms, including Instagram's toxic risks to teenage girls' mental health, among them thoughts of suicide and eating disorders.

Since 2019, Proxy Impact and Dr. Cooper have worked with members of the Interfaith Center on Corporate Responsibility, pension funds, foundations, and asset managers to empower investors to use their leverage to encourage Meta and other tech companies to strengthen child safety measures on social media.

Proxy Impact provides shareholder engagement and proxy voting services that promote sustainable and responsible business practices. For more information, visit

Heat Initiative works to hold the world's most valuable and powerful tech companies accountable for failing to protect kids from online child sexual exploitation. Heat Initiative sees a future where children's safety is at the forefront of any existing and future technological developments.

Contact: Sloane Perry, sloane@

SOURCE Heat Initiative
