Criminals are using AI to run more scams and steal more money, contributing to £629 million in fraud losses in the UK during the first half of the year.
UK Finance reported that fraud cases increased by 17% to over 2 million between January and June, marking one of the highest levels ever recorded in the UK.
Criminals in the UK use AI to create fake messages, emails, and videos that look real, letting them reach thousands of victims at once with very little effort. They copy the logos, voices, and faces of real companies and trusted public figures to defraud people, and it’s now genuinely hard to tell what’s real from what’s fake.
UK Finance stated that these fraudsters use AI tools to translate scam messages, craft convincing fake stories, and tailor their tricks to each target. The most common tactic is creating deepfake videos of popular celebrities promoting fake cryptocurrency or stock investment opportunities. Victims of these scams lost about £15,000 each on average, and total losses rose by 55% in the first half of the year alone, to nearly £100 million.
These AI-generated videos look so realistic that even people aware of such scams sometimes struggle to spot them. Scammers take their time building fake investment websites that appear legitimate, then send the links to victims once they show interest.
Victims can open accounts on these fake sites, watch their investments “grow” in some of them, and sometimes even withdraw small profits at first, a detail that makes the scam all the more convincing. By the time someone realizes the site is fake, they have already poured a large amount of money into their account, and that’s when everything freezes or disappears.
Criminals have learned that the quickest way to some people’s wallets is through their hearts. Using romance scams, they make their schemes more personal and emotionally damaging. With AI-generated profiles and chatbots that sound genuine and affectionate, they lure victims on dating apps and social media.
These fraudsters often spend weeks or even months chatting daily, sharing fake stories, and building emotional connections that don’t actually exist. Once trust is gained, they start asking for money — often claiming it’s for travel, medical expenses, or urgent emergencies.
Many victims begin by sending small amounts but end up transferring larger sums as the fake relationship deepens. According to UK Finance, romance fraud cases increased by 19%, with total losses rising 35% to approximately £20.5 million in the first half of the year.
Traditional fraud checks can no longer keep up with AI-powered crime, so banks have had to modernize their own systems and use artificial intelligence to fight fire with fire. These new systems learn how each customer normally spends their money and flag suspicious activity, such as a large transfer to an unknown new account.
UK Finance stated that this effort is already paying off: banks prevented £870 million worth of unauthorised fraud in the first six months of the year, a 20% improvement on the same period last year. In other words, for every £1 that criminals attempted to steal, banks blocked approximately 70 pence before it left the victim’s account.
Ruth Ray, Director of Fraud Policy at UK Finance, said banks are investing heavily in these AI systems because they detect and react to fraud faster than human teams can. The systems monitor unusual customer behavior, such as someone suddenly sending money overseas, purchasing expensive items they have never bought before, or responding to pressure from a stranger over the phone. When the AI detects something suspicious, it immediately alerts the bank’s security team or pauses the transaction until the customer confirms it is genuine.
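UK Finance doesn’t describe the banks’ actual models, but the behavior outlined above, learning a customer’s normal spending and pausing outliers, maps onto a simple per-customer anomaly check. Here is a minimal illustrative sketch in Python; everything in it (the `Transaction` and `FraudMonitor` names, the z-score test, the three-standard-deviation threshold) is a hypothetical stand-in, not any bank’s real system.

```python
from dataclasses import dataclass, field
from statistics import mean, stdev

@dataclass
class Transaction:
    customer_id: str
    payee: str
    amount: float

@dataclass
class CustomerProfile:
    amounts: list[float] = field(default_factory=list)  # past approved amounts
    known_payees: set[str] = field(default_factory=set)

class FraudMonitor:
    """Hypothetical per-customer baseline check, not any bank's real system."""

    MIN_HISTORY = 10  # don't judge anomalies until a baseline exists

    def __init__(self, z_threshold: float = 3.0):
        self.z_threshold = z_threshold
        self.profiles: dict[str, CustomerProfile] = {}

    def check(self, tx: Transaction) -> str:
        profile = self.profiles.setdefault(tx.customer_id, CustomerProfile())
        reasons = []

        if len(profile.amounts) >= self.MIN_HISTORY:
            mu, sigma = mean(profile.amounts), stdev(profile.amounts)
            # Amount far outside this customer's normal spending pattern.
            if sigma > 0 and (tx.amount - mu) / sigma > self.z_threshold:
                reasons.append("amount far above normal spending")
            # Large transfer to an account the customer has never paid before.
            if tx.payee not in profile.known_payees and tx.amount > 5 * mu:
                reasons.append("large transfer to a new payee")

        if reasons:
            # Pause rather than decline: the customer is asked to confirm.
            return f"PAUSED ({'; '.join(reasons)}): confirm with customer"

        profile.amounts.append(tx.amount)  # approved spending updates the baseline
        profile.known_payees.add(tx.payee)
        return "approved"

monitor = FraudMonitor()
for i in range(12):  # routine grocery spending builds the baseline
    monitor.check(Transaction("alice", "grocer", 40.0 + i))
print(monitor.check(Transaction("alice", "overseas-acct", 9500.0)))
```

Real systems would draw on far richer signals (device fingerprints, location, timing, behavioral patterns) and learned models rather than a fixed z-score, but the pause-and-confirm flow at the end mirrors the behavior UK Finance describes.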
A special police team funded by UK banks, the Dedicated Card and Payment Crime Unit (DCPCU), has also been investigating how criminals use “SMS blasters” to send thousands of fake messages to phones in busy areas, luring people into clicking fake links that harvest their bank or card details.