Latest news with #illegalContent
Yahoo
a day ago
- Business
- Yahoo
Tech firms face demands to stop illegal content going viral
Tech platforms could be forced to prevent illegal content from going viral and limit the ability for people to send virtual gifts to or record a child's livestream, under more online safety measures proposed by Ofcom. The UK regulator published a consultation on Monday seeking views on further protections to keep citizens, particularly children, safer online. These could also include making some larger platforms assess whether they need to proactively detect terrorist material.

Oliver Griffiths, online safety group director at Ofcom, said its proposed measures seek to build on existing UK online safety rules but keep up with "constantly evolving" risks. "We're holding platforms to account and launching swift enforcement action where we have concerns," he said. "But technology and harms are constantly evolving, and we're always looking at how we can make life safer online."

The consultation highlighted three main areas in which Ofcom thinks more could be done:
- stopping illegal content going viral
- tackling harms at source
- giving further protections to children

The BBC has approached TikTok, livestreaming platform Twitch and Meta - which owns Instagram, Facebook and Threads - for comment.

Ofcom's proposals target a range of issues - from intimate image abuse to the danger of people witnessing physical harm on livestreams - and vary in what type or size of platform they could apply to. For example, proposals that providers have a mechanism to let users report a livestream if its content "depicts the risk of imminent physical harm" would apply to all user-to-user sites that allow a single user to livestream to many, where there may be a risk of showing illegal activity. Meanwhile, potential requirements for platforms to use proactive technology to detect content deemed harmful to children would only apply to the largest tech firms, which present higher risks of relevant harms.

"Further measures are always welcome but they will not address the systemic weaknesses in the Online Safety Act," said Ian Russell, chair of the Molly Rose Foundation - an organisation set up in memory of his 14-year-old daughter Molly Russell, who took her own life after viewing thousands of images promoting suicide and self-harm. He added that Ofcom showed a "lack of ambition" in its approach to regulation. "As long as the focus is on sticking plasters not comprehensive solutions, regulation will fail to keep up with current levels of harm and major new suicide and self-harm threats," Mr Russell said. "It's time for the prime minister to intervene and introduce a strengthened Online Safety Act that can tackle preventable harm head on by fully compelling companies to identify and fix all the risks posed by their platforms."

The consultation is open until 20 October 2025, and Ofcom hopes to get feedback from service providers, civil society, law enforcement and members of the public. It comes as tech platforms look to bring their services in line with the UK's sweeping online safety rules that Ofcom has been tasked with enforcing. Some have already taken steps to try to clamp down on features that experts have warned may expose children to grooming, such as livestreaming.
In 2022, TikTok raised its minimum age for going live on the platform from 16 to 18 - shortly after a BBC investigation found hundreds of accounts going live from Syrian refugee camps with children begging for donations. YouTube recently said it would raise the minimum age for users to livestream to 16, from 22 July.


The Verge
10-06-2025
- The Verge
Is 4chan doing enough to protect kids?
That's what UK regulator Ofcom is investigating under Online Safety Act rules, alongside complaints about 'the potential for illegal content and activity' on the platform. Anyone familiar with the controversial web forum could have this probe wrapped up by lunch, but let's see how long it takes them.


Bloomberg
23-05-2025
- Bloomberg
Vietnam Takes Steps to Block Telegram App Over Illegal Content
Vietnamese authorities have instructed local telecoms and internet service providers to block the Telegram messaging app in the country, for allegedly failing to prevent illegal content and anti-government activities carried out by its users. The majority of Telegram groups in Vietnam contain what the regulator called 'toxic' information, including groups accused of spreading anti-government content and of crimes such as fraud, selling user data and drug trafficking, according to a statement on the government's website citing public security ministry data.


The Sun
14-05-2025
- Business
- The Sun
Porn site faces being BLOCKED as watchdog launches probe into illegal content complaints
A PORN site is under investigation after complaints of suspected illegal material appearing on it. In serious cases, XXX platforms can now be blocked from the UK if owners are found to have broken tough new laws and fail to make drastic changes.

Regulator Ofcom says it is looking into whether the platform breached the UK's new online safety laws after bosses failed to respond to requests for key information. The watchdog was prompted into action after receiving complaints about potential illegal content and activity, including child sexual abuse material and extreme pornography.

Two investigations are being carried out into the site's owner, Kick Online Entertainment. Ofcom had asked the company to provide a risk assessment over the potential for illegal content to appear on the porn site. Due to Kick Online Entertainment's failure to respond, the regulator said it was now investigating whether the firm had failed to meet legal requirements to complete and keep a record of an illegal content risk assessment, as well as failing to respond to an information request.

"In light of this, we will also be considering whether the provider has put appropriate safety measures in place to protect its UK users from illegal content and activity and may launch an additional investigation into its compliance with this duty if appropriate," Ofcom said.

The Online Safety Act was passed in 2023 in a bid to make the internet safer, particularly for children. In March, the regulator kick-started a programme to check website operators are complying with their duties. Under the law, Ofcom can impose hefty fines of up to £18 million or 10 per cent of a company's worldwide revenue, whichever is greater. In serious cases, it can seek a court order requiring payment providers and advertisers to stop working with the platform. It can even get internet service providers to block access to the site in the UK.

"We will now gather and analyse evidence to determine whether a contravention has occurred," Ofcom said. "If our assessment indicates a compliance failure, we will issue a provisional notice of contravention to the provider, who can then make representations on our findings, before we make our final decision. We will provide regular updates as these investigations progress."

What is the Online Safety Act?
The Online Safety Act is a new set of duties that social media companies and search services have to comply with to operate in the UK. Media regulator Ofcom is responsible for keeping relevant tech companies in check against the law, and enforcement has been introduced in stages. Platforms must protect users from a range of content, including child sexual abuse, fraud and terrorism. And the sites must take steps to prevent children from accessing things like pornography or content that encourages suicide.

There are also offences that apply to individuals, including:
- encouraging or assisting serious self-harm
- cyberflashing
- sending false information intended to cause non-trivial harm
- threatening communications
- intimate image abuse
- epilepsy trolling