
Video-sharing sites must enforce age verification or face huge fines

Extra.ie

21-07-2025


Robust age verification checks must be strictly enforced from today on video-sharing platforms based in Ireland. Video-sharing platforms headquartered in the EU are required to implement age verification measures to protect children from accessing harmful content, including pornography and violent material. These platforms must use age assurance techniques and provide parental controls.

Failure to comply can result in significant penalties: tech companies can now be fined up to €20 million, or 10% of their turnover, whichever is greater, for failing to meet the requirements of the new code.

The Online Safety Code, implemented by media regulator Coimisiún na Meán, mandates these measures. It applies to platforms such as Facebook, Instagram, YouTube and TikTok, which have their European headquarters in Ireland. The new code also requires platforms to have systems for users to report harmful content, and for the platform to act on those reports.

The age verification requirements are part of a broader effort to create a safer online environment for children and to address harmful content such as cyberbullying and the promotion of self-harm and eating disorders. These measures align with the EU Digital Services Act and the EU Terrorist Content Online Regulation.

Coimisiún na Meán says that a person simply ticking a box to say they are over 18 will no longer be sufficient.

Philip Arneill, head of education and innovation at CyberSafeKids, told RTÉ radio: 'What we are told is that this is the end of self-regulation by online service providers, social media platforms and the like.
'Some of the key things [from the code] are that it prohibits the uploading and sharing of restricted video content which may be harmful, particularly to younger people, such as [content about] eating disorders, self-harm and suicide.

'Platforms are also required to have robust parental controls that allow [parents] to restrict kids' access and, in addition, there will be clear and accessible ways to report violations.'

Mr Arneill added that trends and usage research shows more than 84% of younger users aged eight to 12 have one or more social media accounts, 'so it is up to these platforms [TikTok, Instagram etc] to figure this out and make sure they have age verification in place.

'Meta, for example, took a gross profit last year of $134 billion (€115m) and we are often told it is difficult,' he said. 'Kids aged ten or 11 are smart enough to use older ages when gaining access.

'It stretches the bounds of credibility to suggest that these companies don't have the ability, talent and resources to figure this out [new age limit rules], and to put the responsibility for it on other people, whether it's parents, educators, charities or whoever else.

'Responsibility has now shifted to them [the companies]. Finally, their hands are being forced. After 12 to 24 months, we would want this reviewed to see if it is working.'

Snapchat, which is used by many younger users, is not based here so will not be bound by the new regulations.
