
Latest news with #TaraHopkins

Meta backs EU digital majority age

RTÉ News

04-07-2025



Meta, the parent company of Facebook, Instagram and WhatsApp, has said it supports proposals for a common digital majority age across EU member states.

France, Spain and Greece recently proposed the idea of a "digital majority": the age below which children would be barred from connecting to social media platforms. Under the plans, there would be an EU-wide age of digital adulthood, below which minors would need parental consent to log onto social media. The countries are also proposing the integration of age verification and parental control systems for internet-connected devices.

Meta said it would support proposals requiring parents to approve their younger teens' access to digital services. "We believe this can be an effective solution to the industry-wide challenge of ensuring teens have safe, age-appropriate experiences online," Meta said in a newsroom post. The company added that any new provisions should apply broadly across the digital services teens use, not just to social media platforms but also to gaming, streaming, messaging and browsing.

Meta has also reiterated its call for age verification mechanisms at the app store or operating system level. "I think it makes much more sense that this is done at the ecosystem, app store, operating system level," said Tara Hopkins, Global Director of Public Policy at Instagram. "A signal can then be shared across multiple apps, and the decision gets made at that app store level as to whether or not you can download the app," Ms Hopkins said.

Meta said its support for an EU-wide digital majority age is not an endorsement of government-mandated social media bans, which the company claimed "take away parental authority and focus narrowly on one type of online service".

In May, Tánaiste Simon Harris said serious consideration should be given to banning under-16s from using social media, similar to a law passed last year in Australia.

Video-sharing platforms based in Ireland will face new regulatory obligations under Coimisiún na Meán's online safety code to verify users' ages before showing adult content from 21 July. The Commission said it will not mandate any specific technology, but the age verification systems must be robust and privacy-respecting and must not hold data for longer than necessary.

Instagram will restrict teens from going Live, as Teen Accounts expand to Facebook and Messenger

Yahoo

08-04-2025



Teens on Instagram won't be able to broadcast Live to their friends without getting parental permission first, as Meta amps up youth safety features for its Teen Accounts across all its platforms. In addition to stronger restrictions on going Live for users under the age of 16, the platform now requires teens to get parental consent to turn off the content moderation filters that blur images containing suspected nudity in direct messages, adding to a suite of safety features announced last year. And it's not just Instagram: the parent company will also begin rolling out Teen Accounts to Facebook and Messenger today (April 8). Parental supervision for Teen Accounts can be accessed in Meta's Family Center.

Teen Accounts have quickly become Meta's flagship youth product, said Tara Hopkins, global director of public policy at Instagram. "Everything our youth teams are building is being built under our Best Interests of the Child Framework. Then it goes through a multi-framework youth review, and finally it's looked at through Teen Accounts," Hopkins explained to Mashable. "We're going to be increasingly using Teen Accounts as an umbrella, moving all of our [youth safety] settings into it. Anything that parents are adjacent to, that we think parents are going to be worried about or have questions about, will be moved under Teen Accounts."

Parents can now supervise Teen Accounts on Facebook and Messenger. Credit: Meta

Parents and teens will be notified of settings changes. Credit: Meta

According to Meta, more than 54 million teens have been moved into a restricted Teen Account since the initial rollout, with 97 percent of users under the age of 16 keeping the platform's default security settings. Teens aged 13 to 15 face stronger restrictions, including needing parental permission to make any adjustments to their accounts, while users aged 16 and older have more flexibility to change their settings at will.

The company launched Teen Accounts for Instagram in September, part of an app-wide overhaul of its teen safety offerings that centralized security and content restrictions under one banner. Teen Accounts are automatically set to private, have limited messaging capabilities and come with built-in screen-time controls; Instagram also limits, but doesn't ban, ad targeting for teen users. New users are now placed into a Teen Account by default, while existing teen users are still in the process of being transferred over. Meta said finding and transitioning existing accounts remains difficult, and the company has previously stated it is developing in-house, AI-powered technology to help detect teen accounts that bypassed the automatic rollout or that have incorrect birthdays, in addition to its current age verification processes.

The effort, Hopkins explained, is part of a "more precautionary principle" the company has adopted in recent years, intended to "take off the pressure" from parents who have had to remain more vigilant in the past.

Stronger content restrictions have become a hot topic for Meta amid ongoing concerns about children being exposed to harmful or explicit content, and the company has spent years reconciling demands to curb widespread misinformation and harassment across its platforms. But while Meta cracks down on youth endangerment, it has reversed course on content moderation and safety more generally, including slashing its third-party fact-checking team, cutting DEI programs, and gutting its hateful conduct policy.
