
Lloyds selects Moneyhub for transaction data enrichment
Moneyhub's technology will categorise all of the bank's transactions, including card transactions, direct debits, standing orders, transfers, and faster payments for both income and expenditure.
Ranil Boteju, group chief data and analytics officer at Lloyds Banking Group, says the initiative will help customers understand what they spend their money on and improve their personalised digital banking experience.
"Partnering with Moneyhub will allow us to rapidly deliver far richer and more valuable insights for our customers," he says. "By combining Moneyhub's advanced categorisation technology with our in-house GenAI expertise, we'll improve the time and accuracy of transaction classifications, unlocking new products and services for our customers and providing real-time insights so they can make more informed financial decisions."
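
Neither Lloyds nor Moneyhub has published implementation details, but the enrichment pattern Boteju describes is easy to illustrate. The Python sketch below is a minimal, hypothetical example of rule-based transaction categorisation with a fallback bucket; the Transaction fields, merchant patterns and category names are illustrative assumptions, not Moneyhub's API or taxonomy, and a production system would pair rules like these with a trained classification model.

import re
from dataclasses import dataclass

@dataclass
class Transaction:
    description: str   # raw statement text, e.g. "TESCO STORES 2314"
    amount: float      # negative = expenditure, positive = income
    method: str        # "card", "direct_debit", "standing_order", ...

# Simple pattern -> category rules (illustrative only).
RULES = [
    (re.compile(r"TESCO|SAINSBURY|ALDI", re.I), "Groceries"),
    (re.compile(r"TFL|TRAINLINE|UBER", re.I), "Transport"),
    (re.compile(r"NETFLIX|SPOTIFY", re.I), "Subscriptions"),
    (re.compile(r"SALARY|PAYROLL", re.I), "Income"),
]

def categorise(txn: Transaction) -> str:
    """Return the first matching category, or a fallback bucket."""
    for pattern, category in RULES:
        if pattern.search(txn.description):
            return category
    return "Other income" if txn.amount > 0 else "Other spending"

sample = [
    Transaction("TESCO STORES 2314", -42.17, "card"),
    Transaction("ACME LTD PAYROLL", 2100.00, "faster_payment"),
]
for t in sample:
    print(f"{t.description:<22} {t.amount:>9.2f} -> {categorise(t)}")

Run as written, the sketch tags the card purchase as "Groceries" and the salary credit as "Income"; the point is simply that every raw transaction ends up with a category a customer can act on.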
Lloyds acquired a minority stake in Moneyhub as part of a £35 million funding round in 2022.

Related Articles


Reuters
5 hours ago
UK's FCA proposes 9 billion to 18 billion pound redress scheme for motor finance claims
LONDON, Aug 3 (Reuters) - Britain's Financial Conduct Authority (FCA) on Sunday proposed a redress scheme for consumers with motor finance compensation claims following last week's Supreme Court ruling, estimating the cost at between 9 billion and 18 billion pounds ($12 billion and $24 billion).

Friday's court decision had calmed the industry's worst fears about the size of the bill it would face over improperly disclosed commissions on car loans - a sum analysts had estimated could run to tens of billions of pounds. However, after considering that ruling, which was largely seen as a win for the banks, the FCA still proposed an industry-wide redress scheme for certain types of compensation claims.

"At this stage, we think it is unlikely that the cost of any scheme, including administrative costs, would be materially lower than 9 billion pounds and it could be materially higher," the FCA said in a statement. It said the total cost was hard to estimate and cautioned that any estimates were indicative and susceptible to change, but it said those in the middle of the 9 billion to 18 billion pound range were "more plausible."

Some level of further compensation payout had still been expected by banks after Friday's ruling, placing investor focus on the FCA's decision over whether to launch a full redress scheme, what it might look like, and how much it would cost.

Lenders, including Lloyds Banking Group (LLOY.L), Close Brothers (CBRO.L), Barclays (BARC.L) and the UK arms of Santander and Bank of Ireland (BIRG.I), have already set aside nearly 2 billion pounds between them to cover potential motor finance compensation claims. The FCA said firms should now refresh estimates of their liabilities, increase provisions where necessary, and keep markets informed.

Prior to the Supreme Court ruling, which overturned a previous court decision, there were fears the cost of redress could rival that of the payment protection insurance mis-selling scandal, which cost lenders over 40 billion pounds between 2011 and 2019.

The proposed motor finance scheme would cover so-called discretionary commission arrangements - those where the broker could adjust the interest rate offered to a customer - if they had not been properly disclosed. The regulator said agreements dating back to 2007 should be considered and it would publish a consultation by early October, with an expectation that people start receiving compensation in 2026.

"Our consultation will cover how firms should assess whether the relationship between the lender and borrower was unfair for the purposes of our scheme," the statement said. "Any redress scheme must be fair to consumers who have lost out and ensure the integrity of the motor finance market, so it works well for future consumers."

The consultation will also look at how interest is calculated on compensation, with the FCA estimating that a simple annual rate of around 3% would be applicable. The regulator said it had not decided whether the scheme should require customers to opt in, or be automatically involved unless they opt out.

($1 = 0.7531 pounds)
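
The figures in the FCA's statement lend themselves to a quick sanity check. The short Python sketch below is a hypothetical worked example, not an FCA calculation: it converts the 9 billion to 18 billion pound range at the quoted $1 = 0.7531 exchange rate and applies the indicative simple annual rate of around 3% to an assumed individual overcharge; the claim amount and elapsed years are illustrative assumptions.

GBP_PER_USD = 0.7531  # exchange rate quoted by Reuters

def gbp_to_usd(gbp_billions: float) -> float:
    """Convert a sterling amount (in billions) to US dollars."""
    return gbp_billions / GBP_PER_USD

for total in (9, 18):
    print(f"{total} bn GBP is roughly {gbp_to_usd(total):.0f} bn USD")
# Prints roughly 12 and 24 bn USD, matching the article's conversion.

def simple_interest(principal: float, annual_rate: float, years: float) -> float:
    """Non-compounding interest, as the FCA's indicative ~3% simple rate suggests."""
    return principal * annual_rate * years

claim = 1_100.0      # assumed overcharge on a single agreement (illustrative)
years_elapsed = 10   # assumed, e.g. a 2015 agreement redressed in 2025
interest = simple_interest(claim, 0.03, years_elapsed)
print(f"Redress: {claim + interest:.2f} GBP, of which {interest:.2f} is interest")

On those assumptions the interest adds 330 pounds to the 1,100 pound overcharge; the actual amounts per claimant will depend on the scheme the FCA consults on in October.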


Daily Mail
a day ago
Nudifying apps are not 'a bit of fun' - they are seriously harmful and their existence is a scandal writes Children's Commissioner RACHEL DE SOUZA
I am horrified that children are growing up in a world where anyone can take a photo of them and digitally remove their clothes. They are growing up in a world where anyone can download the building blocks to develop an AI tool which can create naked photos of real people. It will soon be illegal to use these building blocks in this way, but they will remain for sale by some of the biggest technology companies, meaning they are still open to misuse.

Earlier this year I published research looking at the existence of these apps, which use Generative Artificial Intelligence (GenAI) to create fake sexually explicit images from user prompts. The report exposed the shocking underworld of deepfakes: it highlighted that nearly all deepfakes in circulation are pornographic in nature, and 99% of them feature girls or women – often because the apps are specifically trained to work on female bodies.

In the past four years as Children's Commissioner, I have heard from a million children about their lives, their aspirations and their worries. Of all the worrying trends in online activity children have spoken to me about – from seeing hardcore porn on X to cosmetics and vapes being advertised to them through TikTok – the evolution of 'nudifying' apps into tools that aid the abuse and exploitation of children is perhaps the most mind-boggling. As one 16-year-old girl asked me: 'Do you know what the purpose of deepfake is? Because I don't see any positives.'

Children, especially girls, are growing up fearing that a smartphone might at any point be used as a way of manipulating them. Girls tell me they're taking steps to keep themselves safe online in the same way we have come to expect in real life, like not walking home alone at night. For boys, the risks are different but equally harmful: studies have identified online communities of teenage boys sharing dangerous material as an emerging route to radicalisation and extremism.

The government is rightly taking some welcome steps to limit the dangers of AI. Through its Crime and Policing Bill, it will become illegal to possess, create or distribute AI tools designed to create child sexual abuse material. And the introduction of the Online Safety Act – and new regulations by Ofcom to protect children – marks a moment for optimism that real change is possible. But what children have told me, from their own experiences, is that we must go much further and faster.

The way AI apps are developed is shrouded in secrecy. There is no oversight, no testing of whether they can be used for illegal purposes, no consideration of the inadvertent risks to younger users. That must change. Nudifying apps should simply not be allowed to exist. It should not be possible for an app to generate a sexual image of a child, whether or not that was its designed intent.

The technology used by these tools to create sexually explicit images is complex. It is designed to distort reality, to fixate and fascinate the user – and it confronts children with concepts they cannot yet understand. I should not have to tell the government to bring in protections for children to stop these building blocks from being arranged in this way. Posts on LinkedIn have even appeared promoting the 'best' nudifying AI tools available.

I welcome the move to criminalise individuals for creating child sexual abuse image generators, but I urge the government to move the tools that would allow predators to create sexually explicit deepfake images out of reach altogether.
To do this, I have asked the government to require technology companies that provide open-source AI models – the building blocks of AI tools – to test their products for their capacity to be used for illegal and harmful activity. These are all things children have told me they want. They will help stop sexual imagery involving children becoming normalised. And they will make a significant contribution to meeting the government's admirable mission to halve violence against women and girls, who are almost exclusively the subjects of these sexual deepfakes.

Harms to children online are not inevitable. We cannot shrug our shoulders in defeat and claim it's impossible to remove the risks from evolving technology. We cannot dismiss this growing online threat as a 'classroom problem' – because evidence from my survey of school and college leaders shows that the vast majority already restrict phone use: 90% of secondary schools and 99.8% of primary schools. Yet, despite those restrictions, the same survey of around 19,000 school leaders found that online safety is among the most pressing issues facing children in their communities. For them, it is children's access to screens in the hours outside of school that worries them the most.

Education is only part of the solution. The challenge begins at home. We must not outsource parenting to our schools and teachers. As parents it can feel overwhelming to try and navigate the same technology as our children. How do we enforce boundaries on things that move too quickly for us to follow? But that's exactly what children have told me they want from their parents: limitations, rules and protection from falling down a rabbit hole of scrolling.

Two years ago, I brought together teenagers and young adults to ask, if they could turn back the clock, what advice they wished they had been given before owning a phone. Invariably those 16-21-year-olds agreed they had all been given a phone too young. They also told me they wished their parents had talked to them about the things they saw online – not just as a one-off, but regularly, openly, and without stigma.

Later this year I'll be repeating that piece of work to produce new guidance for parents – because they deserve to feel confident setting boundaries on phone use, even when it's far outside their comfort zone. I want them to feel empowered to make decisions for their own families, whether that's not allowing their child to have an internet-enabled phone too young, enforcing screen-time limits while at home, or insisting on keeping phones downstairs and out of bedrooms overnight.

Parents also deserve to be confident that the companies behind the technology on our children's screens are playing their part. Just last month, new regulations by Ofcom came into force through the Online Safety Act that mean tech companies must now identify and tackle the risks to children on their platforms – or face consequences. This is long overdue, because for too long tech developers have been allowed to turn a blind eye to the risks to young users on their platforms – even as children tell them what they are seeing.

If these regulations are to remain effective and fit for the future, they have to keep pace with emerging technology – nothing can be too hard to tackle. The government has the opportunity to bring in AI product testing against illegal and harmful activity in the AI Bill, which I urge it to introduce in the coming parliamentary session.
It will rightly make technology companies responsible for their tools being used for illegal purposes. We owe it to our children, and to the generations of children to come, to stop these harms in their tracks. Nudifying apps must never be accepted as just one more restriction on our children's freedom, or one more risk to their mental wellbeing. They have no place in a society that values the safety and sanctity of childhood and family life.


The Guardian
2 days ago
Car finance scandal: UK supreme court poised to give ruling on hidden commissions
The UK's highest court is poised to give its verdict on the £44bn car finance scandal, which could pave the way for millions of motorists to claim billions of pounds in compensation for mis-selling.

The supreme court judgment, which will be handed down at 5pm on Friday, will decide whether or not to uphold a finding by the court of appeal in October that hidden commissions paid to car dealers by lenders were unlawful. That ruling, based on test cases, said making such payments to brokers who arrange car loans without disclosing the sum and terms to borrowers was unlawful. The lenders involved in the case – FirstRand Bank and Close Brothers – appealed against that decision to the supreme court.

The vast majority of new cars, about 90%, as well as many secondhand cars, are bought using finance agreements. If the UK's most senior judges uphold the earlier decision in its entirety, those customers could be entitled to billions in compensation, leading to huge liabilities for lenders. Lloyds Banking Group, which is one of the most exposed to the scandal through its Black Horse division, has already put aside £1.2bn for potential compensation. The industry, led by the Finance & Leasing Association (FLA), which represents car lenders, argues it has done nothing wrong.

Thousands of vehicle buyers were already in line for payouts if they made their purchase before 2021 with a form of car finance banned that year by the Financial Conduct Authority (FCA), known as discretionary commission arrangements (DCAs). These meant some dealers received a bigger payment if they got car buyers to sign up to a higher rate on a loan. The FCA is likely to set up a central compensation scheme for drivers mis-sold loans through DCAs, and has said it will confirm whether it will do so within six weeks of the supreme court's judgment.

However, if the justices also uphold the appeal court's decision in its entirety, that could widen the potential pool of claimants significantly. If they side with the lenders, the scope of potential payouts to motorists will be much more limited.

In February, the supreme court rejected an unusual intervention by the government, which was concerned that huge compensation payouts could dramatically affect the car market. The Treasury has said that it wants to see a 'balanced judgment' that compensates consumers who were mis-sold loans while also allowing the car finance sector to continue to support motorists who need loans to buy vehicles.
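
For readers unfamiliar with how a discretionary commission arrangement raised a borrower's costs, the short Python sketch below works through a hypothetical flat-rate agreement; the loan size, term, rates and the assumption that the dealer's commission scaled with the rate are illustrative, not figures from the test cases or the judgment.

def flat_rate_interest(principal: float, annual_rate: float, years: float) -> float:
    """Total interest on a flat-rate car finance agreement."""
    return principal * annual_rate * years

principal = 15_000.0  # assumed amount financed
years = 4             # assumed term
base_rate = 0.055     # rate the lender would have accepted (assumed)
dca_rate = 0.075      # higher rate set by the dealer under a DCA (assumed)

base_interest = flat_rate_interest(principal, base_rate, years)
dca_interest = flat_rate_interest(principal, dca_rate, years)
extra_cost = dca_interest - base_interest  # the uplift a DCA claim targets

print(f"Interest at the base rate:  {base_interest:,.2f} GBP")
print(f"Interest at the DCA rate:   {dca_interest:,.2f} GBP")
print(f"Extra paid by the borrower: {extra_cost:,.2f} GBP")

On these assumed numbers the borrower pays £1,200 more in interest over the term; a difference of that kind is what any redress scheme would look to return to customers.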