
TikTok fined £452m by EU authorities over data transfers to China

Related Articles

Finextra
01-07-2025
The Data (Use and Access) Act has passed - what will change and what's next?
This content is contributed or sourced from third parties but has been subject to Finextra editorial review.

After six years in the making, the Data (Use and Access) Act is finally here. The Act gained Royal Assent on 19 June 2025 and is set to have a significant impact, despite more controversial parts of the Bill being dropped before the law was passed. The Data Act amends the UK GDPR, the Privacy and Electronic Communications Regulations and the Data Protection Act 2018, introducing a modernised data framework, extending "smart data" schemes, and laying the groundwork for a legally backed digital identity infrastructure. Here is what this means for both regulated and unregulated firms.

Key highlights and impacts

Smart data and open finance: The Act expands "smart data" access beyond open banking to other sectors, including payments. The creation of a legal framework for smart data will enable real-time, automated data sharing based on a "recognised legitimate interest" (a new lawful basis under article 6 of the UK GDPR) - a crucial element for detecting fraud in fast-moving payment systems. I expect to see innovative new tools for risk-scoring, fraud detection, and onboarding, with the potential to benefit financial institutions and consumers. The Act also sets out consistent information standards for health and adult social care IT systems in England, enabling the creation of unified medical records accessible across all related services.

Stronger regulatory backbone: The Act introduces a significant increase in fines for breaches of the Privacy and Electronic Communications Regulations, from £500,000 to UK GDPR levels. Organisations could face fines of up to £17.5m or 4% of global annual turnover (whichever is higher) for the most serious infringements. Other changes include allowing cookies to be used without consent for web analytics and for installing automatic software updates, and extending the 'soft opt-in' for electronic marketing to charities.
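As a rough illustration of the uplifted fine regime described above (the higher of £17.5m or 4% of global annual turnover), the cap can be computed like this; the function name and the example turnover figures are hypothetical:

```python
def max_pecr_fine(global_turnover_gbp: float) -> float:
    """Maximum fine for the most serious infringements:
    the higher of £17.5m or 4% of global annual turnover.
    (Illustrative sketch; thresholds as stated in the article.)"""
    return max(17_500_000, 0.04 * global_turnover_gbp)

# A firm with £1bn global turnover: 4% (£40m) exceeds the £17.5m floor.
print(max_pecr_fine(1_000_000_000))  # 40000000.0
# A firm with £100m turnover: 4% is only £4m, so the £17.5m floor applies.
print(max_pecr_fine(100_000_000))    # 17500000
```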
I am also expecting standardised APIs and governance through smart data schemes, which should streamline cross-industry integration and accelerate fraud controls and compliance.

Automated decision-making: The Act limits the right, under article 22 of the UK GDPR, for a data subject not to be subject to solely automated decisions, including profiling, which have a legal or similarly significant effect on them. Under the new article 22A, a decision qualifies as 'based solely on automated processing' if there was 'no meaningful human involvement in the taking of the decision'. This could, for example, give companies the green light to use AI techniques on personal data scraped from the internet for pre-employment background checks, a prospect that faced some pushback during our debates in the Lords.

Digital identity (a potential game-changer): One of the most transformative aspects of the new Data Act is the introduction of a statutory foundation for a trusted digital identity ecosystem. This is a huge opportunity for improved onboarding experiences, reduced fraud, and lower AML compliance costs. It is also likely to foster new middleware opportunities, such as identity orchestration, secure data exchange, and advanced analytics platforms, transforming the financial ecosystem. While I have been pushing for this for many years for all its potential benefits and am delighted with this progress, I am also aware that there are some serious questions about implementation and the Government's role in digital ID.

Key questions remain

Government intervention in digital ID: On this last point the Government still has work to do on the details. There is a massive gap in understanding of how the Government's role will interact with the private sector in delivering digital ID, including over security and privacy liabilities.
There are currently over fifty digital ID providers certified against the rules and standards laid out in the UK digital identity and attributes trust framework - these include providers of digital identity wallets as well as orchestration service providers. The framework covers areas such as data privacy, cybersecurity, and fraud management. Its effective functioning is vital to protect everyone's security and privacy and to engender trust in the framework.

Later this year the Department for Science, Innovation and Technology will launch a government wallet containing government-issued 'verifiable credentials', such as a mobile driving licence. This raises questions over how the government wallet and other providers will interact, including who will be liable for ensuring protections when providers (whether wallet providers or orchestration service providers) share government-issued verifiable credentials from the government wallet.

Richard Oliphant, legal consultant, has painted a picture of the problems caused by the current lack of clarity, using the example of a mobile driving licence presented as proof of age when buying alcohol. Richard invites us to consider two scenarios. In the first, the holder of the government wallet uses a certified orchestration service provider to connect to a relying party and share their mobile driving licence to prove they are over 18 when buying a bottle of vodka. In the second, the holder of the government wallet asks the certified digital ID wallet provider to take their mobile driving licence and create a "derived credential". This derived credential is no longer a mobile driving licence, but it can be stored in a digital wallet and presented to the relying party as proof of age for buying the bottle of vodka. There are now two new digital methods for age verification. But who is liable for the accuracy and integrity of the mobile driving licence and the derived credential?
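The distinction between the two scenarios can be sketched as a simple data model. This is purely illustrative: all class names, fields and the crude age calculation below are hypothetical, not part of any real wallet specification - but it shows why the issuer of a derived credential differs from the issuer of the underlying licence, which is where the liability question bites:

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class MobileDrivingLicence:
    """Government-issued verifiable credential (hypothetical fields)."""
    holder: str
    date_of_birth: date
    issuer: str = "UK government"

@dataclass
class DerivedAgeCredential:
    """Derived credential created by a wallet provider. It asserts only
    'over 18' and names the wallet provider, not the government, as
    issuer - hence the question of how liability is apportioned."""
    holder: str
    over_18: bool
    issuer: str

def derive_age_credential(mdl: MobileDrivingLicence,
                          wallet_provider: str,
                          today: date) -> DerivedAgeCredential:
    # Crude age check for illustration only (ignores leap-day edge cases).
    age_years = (today - mdl.date_of_birth).days // 365
    return DerivedAgeCredential(holder=mdl.holder,
                                over_18=age_years >= 18,
                                issuer=wallet_provider)
```

In scenario one, the relying party sees the `MobileDrivingLicence` itself (government-issued); in scenario two, it sees only the `DerivedAgeCredential`, whose issuer is the wallet provider.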
In the case of a mobile driving licence shared with the relying party via an orchestration service provider, the Department for Science, Innovation and Technology has said publicly that the government is liable for the mobile driving licence and any government-issued verifiable credentials shared with the relying party (although there are questions about what precisely is meant by this). But if a digital ID wallet provider creates a derived credential from the mobile driving licence which is then shared with the relying party, how is liability apportioned between the UK government and the provider? The devil, as always, is in the details - and when it is a question of privacy, security, and identity, the details really matter.

The copyright question

Thanks to Elton John and Sir Paul McCartney, among others, the issue that generated the most column inches for this legislation was the Lords v Commons battle over AI and copyright. Baroness Kidron tabled various amendments designed to force transparency over training data to allow the enforcement of copyright legislation. AI has an insatiable appetite for data. LLMs and GenAI need a constant supply to train (and improve) output algorithms, and currently this is happening without recognition or compensation for copyright holders, such as the musicians and writers whose work may be used to train AI models. The government stood firm against the changes despite being defeated in votes in the Lords five times, and the Act has passed without Baroness Kidron's proposed amendments.

This is another issue, though, that is not going away. The government's consultation on AI and copyright ended in February. Among other options, it proposed giving copyright holders the right to opt out of their work being used to train AI. Once the government publishes its response to the copyright consultation, it will have to consider how to proceed.
Proposals may come in the form of a new copyright bill or an AI regulation bill, or even, as rumours have it, a joint Bill from DSIT. Watch this space.

The journey towards a fully digitised, data-driven economy is accelerating. While the benefits are clear, addressing the intricate questions of trust, security, and functionality in delivering digital identity, and addressing the legal and moral rights associated with training AI models, will be paramount to ensuring trusted and widespread success.


BreakingNews.ie
30-06-2025
Microsoft making billions from alleged unlawful processing of data for advertising
Microsoft Ireland Operations is being sued in the first-ever High Court representative action under new legislation over its alleged unlawful processing of personal data, which generates billions in advertising revenue. The Irish Council for Civil Liberties (ICCL) has brought the action claiming Microsoft operates an advertising business, through its "Xandr" platform, which allows it to sell advertising slots to individual advertisers in a real-time bidding system for a fee.

It is claimed that Microsoft's "search and news advertising" business generated some $10.2 billion (€8.7 billion) in the nine months ending March 2025. The ICCL is seeking declarations and orders against Microsoft Ireland Operations Ltd directing it to cease such data processing and/or adjust its processing to comply with GDPR and Irish law.

In May, the ICCL obtained High Court approval deeming its action admissible as a representative action under the 2023 Protection of the Collective Interests of Consumers Act. This is the first such case to be brought under that Act. On Monday, the case was admitted to the fast-track Commercial Court list on the application of James Doherty SC, for the ICCL.

Declan McGrath SC, for Microsoft, said he did not object to the entry of the matter into the list, but he wanted time to write to the ICCL to set out why Xandr should be the defendant rather than Microsoft. Mr McGrath also said his side had separately written to the ICCL about its sources of funding for the proceedings, and that information is expected to be provided within a week. Microsoft may have to bring an application in relation to that, he said.

Ms Justice Eileen Roberts said it was a rather unusual case, but one that not only has a commercial aspect but will benefit enormously from the case management jurisdiction of the Commercial Court.
She adjourned it for three weeks but did not make directions on how it is to proceed, to allow the parties to exchange correspondence in relation to funding and the correct defendant.

In an affidavit seeking entry of the case to the commercial list, Johnny Ryan, senior fellow with the ICCL and director of its privacy and data protection programme "Enforce", said the $10.2 billion from Microsoft's search and news advertising business was contained in the company's "10-Q" submission to the US Securities and Exchange Commission. He said further publicly available documents provided by the defendant, which Microsoft claims sufficiently set out the lawfulness of its processing activities, failed to satisfy the ICCL's concerns.

Mr Ryan said the claim that Microsoft is not the correct defendant is an attempt to delay the prosecution of the proceedings and to delay the vindication of the rights of Irish consumers. If another party is to be added or substituted as a defendant, Mr Ryan believes this can be done during the currency of the proceedings.

Finextra
25-06-2025
The importance of CX within a highly regulated environment: By Chris Brown
The advent of a new generation of customers is prompting organisations to rethink their existing operating models. As digital natives, Gen Z brings a whole new set of opportunities for brands to consider, but also a host of new challenges, particularly when it comes to offering the right channel of choice. However, for the financial sector in particular, organisations need to be cognisant of the range of customers that they now serve and consider how to meet their evolving expectations. Central to this is investing in the customer experience (CX).

The value of a great CX is well documented. Research tells us that organisations classed as 'leaders' in CX achieved more than double the revenue growth of those classed as 'laggards'. Added to which, further research shows that customer-obsessed organisations report 51 percent better customer retention than those where CX isn't a priority. Clearly, investing in a stellar customer experience has significant worth.

However, financial organisations must carefully balance the desire to deliver the experiences that customers crave with the necessity of remaining compliant. From PCI DSS 4.0.1 to the Consumer Duty and GDPR, the financial sector is highly regulated and under pressure to ensure that the information around the customers and organisations they serve remains safe, and that this information is used carefully to help identify any potentially vulnerable customers. So, what are the key tools that organisations can adopt to both meet the onerous demands of regulation and elevate the customer experience?

Introduce AI to gather insights

Within every interaction with a financial services provider, customers leave trails of information. This data provides vital insights into a customer and their experience, acting as a valuable tool for financial service providers looking to deliver an enhanced, compliant service. However, gathering this information at scale can be a challenge.
Using AI-powered text and speech analytics, financial organisations can rapidly comb through customer data to flag recurring themes, identify any potential customer vulnerability and highlight any service problems that need to be resolved - elevating the customer experience in the process. Additionally, that data can reveal any inconsistencies in how agents are treating customers and whether their conduct remains compliant. Regulations such as the Consumer Duty specifically state that financial services providers need to act in the best interests of the consumer, so the ability to ensure that agents are consistently handling customers with due care is essential. This is particularly important for financial organisations dealing with vulnerable customers.

Beyond insights, AI can also help contact centre agents work more efficiently. Most agents spend a large proportion of their time on after-call work, which grows with more complex calls. Using AI for summarisation removes the time-consuming element of manual note-taking, delivering significant time savings as well as removing ambiguity within the notes. By capturing better information on customers, organisations can deliver a better, more personalised experience the next time that customer interacts with them.

Voice biometrics for ID&V

Designed to prevent fraud and money laundering and to ensure compliance with a variety of regulations, the identity and verification (ID&V) process is the starting point for every interaction with a financial services provider. However, while necessary, it can add an additional layer of frustration for customers - requiring them to remember numerous security details and spend additional time waiting to be fully authenticated. Voice biometrics is an ideal alternative to this process, enabling customers to be automatically identified through the unique patterns in their voice.
This streamlines the entire ID&V process - offering customers robust protection and a smoother journey, all whilst ensuring compliance. Yet, in the quest to automate and simplify, it is important to remember the full spectrum of customers being served. While voice biometrics may be the ideal option for younger, more digitally savvy customers, it may not be the preferred option for everyone. Providing a choice of options is therefore essential.

Secure and automate the payment process

In the same way that the ID&V process is the starting point of every call, a large number of conversations with financial service providers also involve the processing of payments. PCI DSS 4.0.1 mandates that sensitive cardholder information is held securely. Given that all calls are recorded, this usually means that contact centre agents need to manually pause their call recordings whilst taking card information and then manually restart the recording when the process is complete. However, this is subject to human error - potentially putting a consumer's data at risk.

Automated pause-and-resume technology ensures that no sensitive data enters the call recording environment by pausing the recording automatically while card details are taken and resuming it once the process is complete. This also works in an omnichannel environment, using a secure payment link that enables customers to pay in a secure portal where agents cannot see the data being inputted. Given that customers want the financial organisations they entrust with their finances and their data to prioritise security, automating this part of the process, and removing the risk of human error, provides customers with the right assurances - and a smoother customer journey in the process.

Striking the right balance

In the world of finance, regulatory requirements will continue to evolve, adding increased pressure on organisations needing to comply.
However, so too will customer expectations - especially from digital-first generations like Gen Z. By modernising CX strategies with the right tools, financial service providers can strike the right balance between meeting complex compliance needs and delivering standout customer journeys. Those organisations that act swiftly and effectively will gain a clear edge – turning regulation into opportunity and CX into a competitive differentiator.
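The automated pause-and-resume approach described in the payments section above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's actual API: the class, method names and tokenisation step are invented, but the core idea - recording is suppressed for exactly the duration of card capture, so only a masked token ever reaches the recording environment - is what the article describes:

```python
class CallRecorder:
    """Minimal sketch of automated pause-and-resume call recording.
    Hypothetical API for illustration only."""

    def __init__(self):
        self.recording = True
        self.transcript = []

    def log(self, text: str):
        # Text reaches the recording environment only while recording is on.
        if self.recording:
            self.transcript.append(text)

    def capture_card(self, pan: str) -> str:
        self.recording = False           # auto-pause: no agent action needed
        try:
            token = f"tok_{pan[-4:]}"    # retain only a masked token
        finally:
            self.recording = True        # auto-resume, even on error
        return token

rec = CallRecorder()
rec.log("Agent: I'll take your payment now.")
token = rec.capture_card("4111111111111111")
rec.log(f"Agent: payment complete ({token}).")
```

Because the pause and resume are driven by the capture step itself rather than by the agent, the human-error window the article warns about (forgetting to pause, or to restart) is removed by construction.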