Visa using AI to combat AI-driven fraud in New Zealand

Finextra | 07-07-2025
As scammers harness the power of artificial intelligence to target New Zealanders, Visa, a world leader in digital payments, is fighting back by using AI to combat AI.
With the launch of its latest Security Roadmap for New Zealand, Visa has set out a bold three-year strategy to protect consumers and businesses from rapidly evolving cyber threats.
Scams and payment fraud in New Zealand are accelerating rapidly in both scale and sophistication, with AI-powered scams and digital fraud rising sharply across the country. In 2024 alone, scam and card fraud losses reached NZ$194 million[1], with small and medium-sized businesses (SMBs) increasingly in the firing line. Unauthorised card fraud surged by 32% over the past 12 months[2], and online shopping scams have overtaken identity theft as the most reported scam type[3]. Despite this, 68% of New Zealanders chose not to report scam incidents[4], citing uncertainty around reporting channels.
'Visa used AI to stop more than NZ$273 million in fraud affecting New Zealanders in 2023 alone. And yet, as AI-enabled fraudsters evolve, we must move faster,' said Anthony Watson, Visa Country Manager for New Zealand and the Pacific Islands. 'That requires relentless innovation and continued investment in next-gen security tools and partnering across the ecosystem to stay ahead of criminals.'
The increasing use of artificial intelligence by criminals is helping them mimic legitimate consumer behaviour, bypass traditional security checks such as SMS passcodes, and manipulate human psychology with alarming precision. Social engineering tactics such as phishing, ransomware, billing scams and card-not-present fraud are now commonplace, with SMBs particularly vulnerable.
To combat this growing threat, Visa's Security Roadmap sets out the key areas for banks and financial institutions to invest in over the coming three years: preventing enumeration attacks; modernising authentication; adopting a data-driven, risk-based approach to risk management; strengthening resilience against AI-driven scams; enhancing cybersecurity across the ecosystem; and securing digital payments with advanced protocols.
Supporting New Zealand SMBs with fraud prevention tips
To help SMBs protect themselves and their customers, Visa has also launched the SMB Fraud Prevention Toolkit - a suite of practical tips and resources to support small business owners. The toolkit provides step-by-step guidance on identifying, preventing, and responding to threats such as phishing, ransomware, billing scams, card-not-present fraud, and enumeration attacks.
With a 95% increase in scam reports in 2023 and NZ$1.9 million lost to scams targeting businesses[5], the toolkit offers clear checklists, real-world case studies, employee training tips, and incident response plans. 'SMBs are the engine of New Zealand's economy and, increasingly, cyber criminals exploit the most vulnerable point in the payments ecosystem: humans,' said Watson. 'This toolkit gives business owners in New Zealand, commonly a target for cybercrime, the knowledge and confidence to take control of their security.'
The toolkit also promotes best practices including multi-factor authentication, employee awareness training, secure online transaction protocols, and real-time payment monitoring. It is designed to suit businesses of varying sizes and industries, and encourages a proactive approach to cybersecurity.
Funding innovation through sustainable infrastructure investment
Visa's Roadmap also highlights the importance of sustainable funding mechanisms, such as interchange, to maintain and improve fraud prevention capabilities. These mechanisms support the essential infrastructure behind secure payments, including AI systems, biometric authentication, 24/7 fraud monitoring, and tokenisation.
'Fraud prevention doesn't just happen - it's powered by sustained investment in technology,' said Watson. 'If we want to stay ahead of scammers, we need to ensure the ecosystem remains commercially viable for innovation to thrive. In markets where interchange fees have been significantly reduced, we've observed increased friction and higher fraud rates, leading to poorer customer experiences.'
Visa continues to work closely with banks, acquirers, merchants, and government agencies to implement its security roadmap across the country and ensure New Zealanders are protected in an increasingly digital economy.

Related Articles

High taxes, a recession: my fears for young job hunters in Scotland
Times | 25 minutes ago

I started employing my latest assistant in March this year and for reliability, productivity, speed and all-round knowledge, he's hard to beat. Unfailingly polite and endlessly resourceful, he's settled into my small in-house team of seven with ease. Everyone loves him. Although he is only five months old and his background is unknown, he's already indispensable. He is, of course, one of the new autonomous artificial intelligence agents, otherwise known as agentic AI.

This is one of the first publicly available AI agents capable of independent planning, decision-making and real-world task execution without requiring detailed human oversight. In beta mode and available by invitation only (codes were changing hands for $1,000 recently), it is a glimpse of a future that is awe-inspiring and terrifying in equal measure. For the time being, I'm ignoring the fact that I've had to hand over a lot of personal information to gain access (admittedly much of it already available online) and that very little is known about the Chinese start-up behind the technology. It is simply too valuable a tool and I'm already hooked. Agentic AI is turbocharging technical aspects of my business that other AI tools simply can't reach.

I'm an optimist about the advent of AI. Or I should say, I'm an optimist about humanity. Such tools can be, and are being, used for destructive purposes. But this is the best argument for not withdrawing from research. If the good guys slow down, they simply hand advantage to the bad actors. I understand the arguments against AI that end with humanity facing Armageddon. But mankind is perfectly capable of orchestrating its own destruction without the use of artificial intelligence. We just have to look at Gaza and Ukraine to be reminded of the depth of human depravity. Meanwhile AI is already saving lives.

All progress has provoked moral panic, from the coming of the railways to Elvis wiggling his hips. And while my new AI assistant sometimes leaves me feeling like an 18th-century peasant contemplating the wonders of the internal combustion engine, I know that it is actual intelligence combined with AI that gives us the breakthroughs and competitive edge we need. While the AI assistant can code, I still need to employ my full-stack developer to implement, evaluate and interpret the results.

But what is certainly true is that AI is contributing to an upcoming economic upheaval for which Scotland is wholly unprepared. A toxic combination of political decisions by the Labour government at Westminster and the SNP government in Scotland, a mental health crisis among millennials and Gen Zs and weak economic growth has the potential to tip the country into recession.

This month, the accountancy firm EY reported that Scotland's high income tax rates were seen as the main barrier to expansion in Scotland's financial services industry, which contributes about 10 per cent of the Scottish economy by value. All Scottish workers earning more than £30,318 pay more income tax than their English counterparts and the highest band is set at 48 per cent for Scotland compared with 45 per cent for the rest of the UK.

The job market is being squeezed from both ends. According to McKinsey & Co, the number of job vacancies online fell by 31 per cent in the three months to May, compared with the same period in 2022, the year that ChatGPT was launched. Research from KPMG and the Recruitment and Employment Confederation revealed that hiring fell in June at the fastest pace in almost two years.
Sluggish growth and higher interest rates have been blamed, but in entry-level occupations across all industries, including graduate traineeships and apprenticeships, jobs are disappearing at an alarming rate. The last apprentice I hired was unable to address an envelope and had no idea what a stamp was. She had an HNC in 'collective dance, specialising in hip-hop' and was about as prepared for the world of work as your average pigeon. She lasted three months. Somebody within the education system had let her down badly.

Young people will be most seriously affected by the storm that is coming. They are also the group facing the biggest mental health crisis. In Scotland more than one million adults report that anxiety interferes with daily life. Gen Z and young millennials lose up to 60 days of productivity per year due to mental health issues compared with 36 days for older colleagues. The number of Scots out of work because of sickness and disability is at its highest level in 20 years and the number claiming disability payments in Scotland is set to almost double by 2030.

Labour's plans under the Employment Rights Bill to remove the two-year qualifying period for key rights such as protection against unfair dismissal, parental leave and statutory sick pay mean that many SMEs will not risk hiring staff without experience or a track record. That's if the SMEs stay in business. Confidence is at a low ebb. One in five small businesses believe they will be forced out of business if conditions don't improve. According to the Federation of Small Businesses, 27 per cent of business owners believe their company will downsize, be sold or close in the next 12 months. For the first time in 15 years, pessimism has outweighed optimism. Even profitable SMEs wonder if the juice is still worth the squeeze.

The government is not protecting the jobs we do have. The closure of the Grangemouth refinery and the threat by bus manufacturer Alexander Dennis to move Scottish production to Scarborough could lead to 400 jobs lost in the Falkirk area. Add in jobs lost in the supply chain and the number rises to four figures. Both companies have foreign ownership, which rather dampens enthusiasm for the SNP government's boast that Scotland punches above its weight for inward investment.

The Grangemouth closure and a sharp fall in manufacturing output drove a 0.4 per cent GDP decline in the three months up to May. About 80 per cent of leisure and hospitality businesses believe the Scottish economy will decline this year. John Swinney has mentioned a possible Scottish recession, blaming US tariffs. Even without a recession, growth is weak and Scottish economic activity is fragile. Even boom sectors such as renewables are facing cuts. At least one of the country's largest employers has just cut nearly all its graduate jobs for the present cohort reaching the end of their two-year training stint.

Recent recessions have not brought the same level of job losses that the UK experienced in the 1990s and before. But that is set to change, and we are not prepared. This will affect a generation, already struggling post-pandemic, for most of their lives. The Scottish government has deliberately and negligently failed to promote the nation's economic wellbeing, prioritising instead an ideology that a majority of voters do not share. As Harold Macmillan pointed out, it is 'events, dear boy' that bring down governments. But it is policy decisions that cripple countries.

We must lead AI revolution or be damned, says Muslim leader
Telegraph | 2 hours ago

Muslims must take charge of artificial intelligence or 'be damned' as a marginalised community, the head of the Muslim Council of Britain (MCB) has said in a leaked video. Dr Wajid Akhter, the general secretary of the MCB, said Muslims and their children risked missing the AI revolution in the same way as they had been left behind in the computer and social media revolutions. He added that while Muslims had historically been at the forefront of civilisation and were credited with some of the greatest scientific advances, they had ended up as the butt of jokes in the modern world after failing to play a part in the latest technological revolutions.

'We already missed the industrial revolution. We missed the computer revolution. We missed the social media revolution. We will be damned and our children will damn us if we miss the AI revolution. We must take a lead,' said Dr Akhter. Speaking at the MCB's AI and the Muslim Community conference on July 19, he added: 'AI needs Islam, it needs Muslims to step up.'

Scientists 'made fun of' faith at computer launch

Dr Akhter recalled how at the launch of one of the world's earliest computers, the Mark II, US scientists brought out a prayer mat aligned towards Mecca. 'They were making fun of all religions because they felt that they had now achieved the age of reason and science and technology and we don't need that superstition any more,' he said. 'And so to show that they had achieved mastery over religion, they decided to make fun and they chose our faith.

'How did we go from a people who gave the world the most beautiful buildings, science, technology, medicine, arts to being a joke?

'I'll tell you one thing – the next time that the world is going through a revolution, the next time they go to flip that switch, they will also pull out a prayer mat and they will also line it towards the Qibla [the direction towards Mecca] and they will also pray, but this time, not to make fun of us, they will do so because they are us.'

Government eases stance on MCB

Dr Akhter also told his audience: 'We lost each other. And ever since we lost each other, we've been falling. We've been falling ever since. We are people now who are forced, we are forced by Allah to watch the genocide of our brothers and sisters in Gaza.

'This is a punishment for us if we know it. We are people who are forced to beg the ones who are doing the killing to stop it. We are people who are two billion strong but cannot even get one bottle of water into Gaza.'

Dr Akhter said Gaza had 'woken' Muslims up and showed they needed to unite. 'We will continue to fall until the day we realise that only when we are united will we be able to reverse this. Until the day we realise that we need to sacrifice for this unity,' he added.

British governments have maintained a policy of 'non-engagement' with the MCB since 2009 based on claims, disputed by the council, that some of its officials have previously made extremist comments. However, Angela Rayner, the Deputy Prime Minister, is drawing up a new official definition of Islamophobia, and last week it emerged the consultation has been thrown open to all groups including the MCB. Earlier this year, Sir Stephen Timms, a minister in the Department for Work and Pensions, was one of four Labour MPs to attend an MCB event.

Easily Install Any AI Model Locally on Your PC Using Open WebUI
Geeky Gadgets | 6 hours ago

Have you ever wondered how to harness the power of advanced AI models on your home or work Mac or PC without relying on external servers or cloud-based solutions? For many, the idea of running large language models (LLMs) locally has long been synonymous with complex setups, endless dependencies, and high-end hardware requirements. But what if we told you there's now a way to bypass all that hassle? Enter Docker Model Runner, an innovative tool that makes deploying LLMs on your local machine not only possible but surprisingly straightforward. Whether you're a seasoned developer or just starting to explore AI, this tool offers a privacy-first, GPU-free solution that's as practical as it is powerful.

In this step-by-step overview, World of AI shows you how to install and run any AI model locally using Docker Model Runner and Open WebUI. You'll discover how to skip the headaches of GPU configurations, use seamless Docker integration, and manage your models through an intuitive interface, all while keeping your data secure on your own machine. Along the way, we'll explore the unique benefits of this approach, from its developer-friendly design to its scalability for both personal projects and production environments. By the end, you'll see why WorldofAI calls this the easiest way to unlock the potential of local AI deployment. So, what does it take to bring innovative AI right to your desktop? Let's find out.

Docker Model Runner Overview

Why Choose Docker Model Runner for LLM Deployment?

Docker Model Runner is specifically designed to simplify the traditionally complex process of deploying LLMs locally. Unlike conventional methods that often require intricate GPU configurations or external dependencies, Docker Model Runner eliminates these challenges. Here are the key reasons it stands out:

• No GPU setup required: avoid the complexities of configuring CUDA or GPU drivers, making the tool accessible to a broader range of developers.
• Privacy-centric design: all models run entirely on your local machine, ensuring data security and privacy for sensitive applications.
• Seamless Docker integration: fully compatible with existing Docker workflows, supporting OpenAI API compatibility and OCI-based modular packaging for enhanced flexibility.

These features make Docker Model Runner an ideal choice for developers of all experience levels, offering a balance of simplicity, security, and scalability.

How to Access and Install Models

Docker Model Runner supports a wide array of pre-trained models available on popular repositories such as Docker Hub and Hugging Face. The installation process is designed to be straightforward and adaptable to various use cases:

• Search for the desired model on Docker Hub or Hugging Face to find the most suitable option for your project.
• Pull the selected model using Docker Desktop or terminal commands for quick and efficient installation.
• Use OCI-based packaging to customize and control the deployment process, tailoring it to your specific requirements.

This modular approach ensures flexibility, allowing developers to experiment with AI models or deploy them in production environments with ease; a minimal sketch of calling a pulled model follows below.
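Because Docker Model Runner exposes an OpenAI-compatible API, a pulled model can be called from any standard HTTP client. The Python sketch below is illustrative only: it assumes host-side TCP access has been enabled in Docker Desktop, that the endpoint is exposed at http://localhost:12434/engines/v1 (the port and path can vary between versions), and that a small example model named ai/smollm2 has already been pulled; adjust all three to match your setup.

```python
# Minimal sketch: chat completion against Docker Model Runner's OpenAI-compatible API.
# Assumptions to verify locally: host-side TCP access is enabled, the endpoint is
# http://localhost:12434/engines/v1, and a model such as "ai/smollm2" has been pulled.
import requests

BASE_URL = "http://localhost:12434/engines/v1"  # assumed default; check Docker Desktop settings
MODEL = "ai/smollm2"                            # hypothetical example model name

payload = {
    "model": MODEL,
    "messages": [
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "In one sentence, what does Docker Model Runner do?"},
    ],
}

# Send the request and print the model's reply from the OpenAI-style response body.
resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=120)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Because the endpoint follows the OpenAI API shape, the same request should also work through the official openai Python client by pointing its base_url at the local address.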
System Requirements and Compatibility

Docker Model Runner is designed to work seamlessly across major operating systems, including Windows, macOS, and Linux. Before beginning, ensure your system meets the following basic requirements:

• Docker Desktop: ensure Docker Desktop is installed and properly configured on your machine.
• Hardware specifications: verify that your system has sufficient RAM and storage capacity to handle the selected LLMs effectively.

These minimal prerequisites make Docker Model Runner accessible to a wide range of developers, regardless of their hardware setup, ensuring a smooth and efficient deployment process.

Enhancing Usability with Open WebUI

To further enhance the user experience, Docker Model Runner integrates with Open WebUI, a user-friendly interface designed for managing and interacting with models. Open WebUI offers several notable features that simplify the deployment and management process:

• Self-hosting capabilities: run the interface locally, giving you full control over your deployment environment.
• Built-in inference engines: execute models without requiring additional configurations, reducing setup time and complexity.
• Privacy-focused deployments: keep all data and computations on your local machine, ensuring maximum security for sensitive projects.

Configuring Open WebUI is straightforward, often requiring only a Docker Compose file to manage settings and workflows. This integration is particularly beneficial for developers who prioritize customization and ease of use in their AI projects.

Step-by-Step Guide to Deploying LLMs Locally

Getting started with Docker Model Runner is a simple process. Follow these steps to deploy large language models on your local machine:

1. Enable Docker Model Runner through the settings menu in Docker Desktop.
2. Search for and install your desired models using Docker Desktop or terminal commands.
3. Launch Open WebUI to interact with and manage your models efficiently.

This step-by-step approach minimizes setup time, allowing you to focus on using the capabilities of AI rather than troubleshooting technical issues; a short verification sketch appears at the end of this section.

Key Features and Benefits

Docker Model Runner offers a range of features that make it a standout solution for deploying LLMs locally. These features are designed to cater to both individual developers and teams working on large-scale projects:

• Integration with Docker workflows: developers familiar with Docker will find the learning curve minimal, as the tool integrates seamlessly with existing workflows.
• Flexible runtime pairing: choose from a variety of runtimes and inference engines to optimize performance for your specific use case.
• Scalability: suitable for both small-scale experiments and large-scale production environments, making it a versatile tool for various applications.
• Enhanced privacy: keep all data and computations local, ensuring security and compliance for sensitive projects.
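Before launching Open WebUI in step three, it can help to confirm that the local endpoint is reachable and that the models installed in step two are visible to it. The sketch below reuses the assumed endpoint from the earlier example and assumes the /models route follows the standard OpenAI list format; treat both as assumptions to check against your own configuration.

```python
# Minimal sketch: confirm the assumed local endpoint is up and list installed models
# before pointing Open WebUI (or any other OpenAI-compatible client) at it.
import sys
import requests

BASE_URL = "http://localhost:12434/engines/v1"  # assumed default; adjust to your setup

try:
    resp = requests.get(f"{BASE_URL}/models", timeout=10)
    resp.raise_for_status()
except requests.RequestException as exc:
    sys.exit(f"Model Runner endpoint not reachable at {BASE_URL}: {exc}")

# The OpenAI-style /models response lists each model under "data" with an "id" field.
model_ids = [m.get("id", "<unknown>") for m in resp.json().get("data", [])]
if not model_ids:
    print("Endpoint is up, but no models appear to be installed yet.")
else:
    print("Installed models:")
    for model_id in model_ids:
        print(f"  - {model_id}")
```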
Taken together, these advantages position Docker Model Runner as a powerful and practical tool for developers seeking efficient, private, and scalable AI deployment solutions.

Unlocking the Potential of Local AI Deployment

Docker Model Runner transforms the process of deploying and running large language models locally, making advanced AI capabilities more accessible and manageable. By integrating seamlessly with Docker Desktop and offering compatibility with Open WebUI, it provides a user-friendly, scalable, and secure solution for AI deployment. Whether you are working on a personal project or a production-level application, Docker Model Runner equips you with the tools to harness the power of LLMs effectively and efficiently.

Media Credit: WorldofAI
