Tech secretary to slash red tape in bid to boost tech growth
Speaking at the TechUK conference on Monday, Technology Secretary Peter Kyle said that 'there is no route to long-term growth without innovation'.
He also announced plans for the first ever dedicated strategy for the digital and technology sector, which is centred on pro-innovation regulation.
Kyle highlighted the urgent need to remove unnecessary barriers that slow down new technology.
He cited an ongoing trial in London where medical drones are speeding up blood sample deliveries, a project that could be derailed by a single noise complaint.
Under the government's new approach, such regulatory obstacles will be removed to ensure tech reaches the market quickly and safely.
To lead this transformation, former science minister Lord David Willetts has been appointed as the first chair of the Regulatory Innovation Office (RIO), tasked with modernising rules to accelerate game-changing technology.
The government is also investing in cutting-edge technology, with Kyle announcing £12m for ten winners of Innovate UK's quantum missions pilot to advance quantum computing and networking.
Kyle also outlined how the government's Invest 2035 strategy will harness engineering biology, AI, semiconductors, cyber, quantum, and telecoms to build a stronger economy and improve lives across the UK.
He emphasised that the nation must be a stable partner for researchers and businesses, working alongside them to tackle the biggest challenges of the decade ahead.
Lord Willetts, now leading the RIO, said he was 'honoured to shape regulatory approaches that empower new technologies'.
The announcement comes as the government sets out plans to give regulators performance targets to drive innovation.

Related Articles
Yahoo · a day ago
Judge us by impact of new online safety measures for children, says regulator
Parents and children can expect to 'experience a different internet for the first time', according to the Technology Secretary as new safety measures came into effect.
Peter Kyle said he had 'high expectations' for the changes, as the head of the regulator in charge of enforcement against social media platforms which do not comply urged the public to 'judge us by the impact we secure'.
While some campaigners have welcomed the new protections – which include age checks to prevent children accessing pornography and other harmful content – others have branded them a 'sticking plaster'. Charities and other organisations working in children's safety have agreed the key will be ensuring the measures are enforced, urging Ofcom to 'show its teeth'.
The changes also require platforms to ensure algorithms do not work to harm children by, for example, pushing content on the likes of self-harm and eating disorders towards them. Actions which could be taken against firms which fail to adhere to the new codes include fines of up to £18 million or 10% of qualifying worldwide revenue, whichever is greater, and court orders potentially blocking access in the UK.
Mr Kyle has said a generation of children will not be allowed to grow up 'at the mercy of toxic algorithms' as he pledged the Government is laying the foundations for a safer, healthier, more humane online world and warned tech firms 'will be held to account' if they fail to act in line with the changes.
He told Sky News: 'I have very high expectations of the change that children will experience.
'And let me just say this to parents and children, you will experience a different internet really, for the first time in from today, moving forward than you've had in the past. And that is a big step forward.'
The measures, as part of the Online Safety Act and set to be enforced by regulator Ofcom, require online platforms to have age checks – using facial age estimation or credit card checks – if they host pornography or other harmful content such as self-harm, suicide or eating disorders.
Ofcom chief executive Dame Melanie Dawes said the regulator's research had shown half a million eight to 14-year-olds have come across pornography online in the last month alone. When it was put to her by the BBC that one of their staff members testing out the new measures had been able to sign up to a well-known porn site on Friday using just an email address, she said sites will be 'checking patterns of email use' behind the scenes to verify users are adults.
She told Radio 4's Today programme: 'We've shown that we've got teeth and that we're prepared to use them at Ofcom. And we have secured commitments across the porn industry and from the likes of X that no other country has secured. These things can work.
'Judge us by the impact we secure. And absolutely, please do tell us if you think there's something we need to know about that isn't working because the law is very clear now.'
She also said the Government is right to be considering limits on the amount of time children can spend on social media apps. Earlier this week, Mr Kyle said he wanted to tackle 'compulsive behaviour' and ministers are reportedly considering a two-hour limit, with curfews also under discussion.
Dame Melanie told LBC: 'I think the Government is right to be opening up this question. I think we're all a bit addicted to our phones, adults and children, obviously particularly a concern for young people. So, I think it's a good thing to be moving on to.'
Children's charities the NSPCC and Barnardo's are among those who have welcomed the new checks in place from Friday, as well as the Internet Watch Foundation (IWF). The IWF warned the 'safeguards put in place need to be robust and meaningful' and said there is 'still more to be done', as it urged tech platforms to build in safeguards rather than having them as 'an afterthought'.
The Molly Rose Foundation – set up by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – said there is a 'lack of ambition and accountability' in the measures, and accused the regulator of choosing to 'prioritise the business needs of big tech over children's safety'.
Andy Burrows, chief executive of the foundation, told Sky News: 'We've always had a very simple test for the Online Safety Act, will it stop further young people like Molly from dying because of the harmful design of social media platforms?
'And regrettably, we just don't think it passes that test. This is a sticking plaster, not the comprehensive solution that we really need.'
Ofcom said it has also launched a monitoring and impact programme focused on some of the platforms where children spend most time, including social media sites Facebook, Instagram and TikTok, gaming site Roblox and video clip website YouTube. The sites are among those which have been asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, scrutiny of the practical actions they are taking to keep children safe.
Yahoo · 2 days ago
New online safety measures to protect children from 'toxic algorithms'
A generation of children will no longer be 'at the mercy of toxic algorithms', the Technology Secretary has declared, as new online safety protections officially came into force.
Peter Kyle stated that the government was laying the foundations for a 'safer, healthier, more humane online world', issuing a stern warning to tech firms that they 'will be held to account' if they fail to adhere to the new measures.
The changes, enacted as part of the Online Safety Act and set to be enforced by regulator Ofcom, mandate that online platforms hosting pornography or other harmful content – such as material related to self-harm, suicide, or eating disorders – must implement robust age checks. These can include facial age estimation or credit card verification. Furthermore, platforms are now required to ensure their algorithms do not actively harm children by, for example, pushing such content towards them.
Companies found to be non-compliant face severe penalties, including fines of up to £18 million or 10 per cent of their qualifying worldwide revenue, whichever sum is greater. Court orders that could block access to these platforms in the UK are also a potential consequence.
Campaigners have underscored the critical need for strict enforcement, with the NSPCC urging Ofcom to 'show its teeth' if companies fail to make the necessary changes in line with the regulator's child protection codes. But the Molly Rose Foundation – set up by bereaved father Ian Russell after his 14-year-old daughter Molly took her own life having viewed harmful content on social media – said there is a 'lack of ambition and accountability' in the measures, and accused the regulator of choosing to 'prioritise the business needs of big tech over children's safety'.
Mr Kyle insisted the Government has 'drawn a line in the sand' and that the codes will bring real change. He said: 'This Government has taken one of the boldest steps anywhere in the world to reclaim the digital space for young people – to lay the foundations for a safer, healthier, more humane place online.
'We cannot – and will not – allow a generation of children to grow up at the mercy of toxic algorithms, pushed to see harmful content they would never be exposed to offline. This is not the internet we want for our children, nor the future we are willing to accept.'
He said the time for tech platforms 'to look the other way is over', calling on them to 'act now to protect our children, follow the law, and play their part in creating a better digital world'. He warned: 'And let me be clear: if they fail to do so, they will be held to account. I will not hesitate to go further and legislate to ensure that no child is left unprotected.'
Ofcom chief executive Dame Melanie Dawes has previously defended the reforms against criticism, insisting that tech firms are not being given much power over the new measures, which will apply across the UK. Dame Melanie said: 'Prioritising clicks and engagement over children's online safety will no longer be tolerated in the UK.
'Our message to tech firms is clear – comply with age checks and other protection measures set out in our codes, or face the consequences of enforcement action from Ofcom.'
The regulator said X, formerly Twitter, and others including Bluesky, Reddit and dating app Grindr are among those to have committed to age assurances, and described its safety codes as demanding that algorithms 'must be tamed and configured for children so that the most harmful material is blocked'.
It said it has launched a monitoring and impact programme focused on some of the platforms where children spend most time, including social media sites Facebook, Instagram and TikTok, gaming site Roblox and video clip website YouTube. The sites are among those which have been asked to submit, by August 7, a review of their efforts to assess risks to children and, by September 30, scrutiny of the practical actions they are taking to keep children safe.
Chris Sherwood, chief executive at the NSPCC, said: 'Children, and their parents, must not solely bear the responsibility of keeping themselves safe online. It's high time for tech companies to step up.' He said if enforcement is 'strong', the codes should offer a 'vital layer of protection' for children and young people when they go online, adding: 'If tech companies fail to comply, Ofcom must show its teeth and fully enforce the new codes'.
Echoing this, Barnardo's children's charity said the changes are 'an important stepping stone' but 'must be robustly enforced'. England's Children's Commissioner, Dame Rachel de Souza, said Friday 'marks a new era of change in how children can be protected online, with tech companies now needing to identify and tackle the risks to children on their platforms or face consequences', and said the measures must keep pace with emerging technology to remain effective in the future.
But Andy Burrows, chief executive of the Molly Rose Foundation, said: 'This should be a watershed moment for young people but instead we've been let down by a regulator that has chosen to prioritise the business needs of big tech over children's safety.' He said the 'lack of ambition and accountability will have been heard loud and clear in Silicon Valley'. He added: 'We now need a clear reset and leadership from the Prime Minister. That means nothing less than a new Online Safety Act that fixes this broken regime and firmly puts the balance back in favour of children.'
Earlier this week, Mr Kyle said children could face a limit on using social media apps to help them 'take control of their online lives'. He said he wanted to tackle 'compulsive behaviour' and ministers are reportedly considering a two-hour limit, with curfews also under discussion. The Cabinet minister said he would be making an announcement about his plans for under-16s 'in the near future'.