![5967001.jpg](https://krb.world/wp-content/uploads/2025/02/5967001-1024x538.jpg)
Victims have been lured in by convincing deepfakes (Image: GETTY)
You would be forgiven for assuming the forlorn figures handing over cash behind a seedy Oxford Street souvenir shop were just another example of London’s regular criminal economy – the cash perhaps the proceeds of drugs, prostitution or both.
In reality, the money being turned over was coming from innocent victims of the latest devious romance scam, in which vulnerable people are tricked into thinking they are in a relationship with an illusory celebrity.
In an age where technology is blurring the line between reality and illusion, these sophisticated deceptions use AI-generated video and audio to impersonate beloved celebrities with uncanny precision. The scammers control exactly what the fake celebrity says and does in video communications with victims, whether begging for funds from a hospital bed or recommending a fraudulent get-rich-quick scheme.
Worryingly, deepfake scams became the most reported scams of 2024, according to the Advertising Standards Authority (ASA).
The latest instance, exposed by London’s Proactive Economic Crime Team, led to two arrests last week after victims allegedly handed over £200,000, which they were told would be sent to the celebrity. While the investigation is ongoing, detectives revealed that one individual was allegedly duped into handing over an incredible £60,000.
A deepfake video featuring Martin Lewis and Elon Musk was used to promote a fake Bitcoin investment scheme (Image: PA)
The consequences of such crimes are severe – victims are financially crippled while their reputations and lives are ruined. Simon Horswell, a fraud specialist manager at identity security provider Entrust, says the rise of deepfake technology in fraud is alarming.
According to the company’s latest report, a deepfake attempt occurs every five minutes. Horswell says: “We’ve seen time and again, bad actors will use them to improve the perceived legitimacy of their scams.”
It is not just romance but fake financial investments too. In recent months, high-profile celebrities such as A-list star Brad Pitt, Money Saving Expert Martin Lewis, BBC host Naga Munchetty and actor Pierce Brosnan have become both the victims and the weapons in deepfake fraud, with criminals manipulating their images and voices to promote scams, or even foster personal relationships.
In one case, a deepfake video featuring Martin Lewis and billionaire Elon Musk was used to promote a fake Bitcoin investment scheme, swindling a self-employed tradesman out of £76,000. Deepfake technology, which relies on artificial intelligence to create hyper-realistic video and audio, has rapidly evolved.
Initially used for harmless fun – like altering movie scenes or creating fake celebrity videos – the technology has taken a darker turn. Con artists are using the software to impersonate public figures, including political leaders, actors and financial experts, to extract cash from unsuspecting individuals and businesses.
Damian Rourke, a partner in the fraud practice at global law firm Clyde & Co, says: “In its harmless form, it might put Nicolas Cage’s face into every film ever made, but in the hands of bad actors, it’s a tool for deception.”
BBC presenter Naga Munchetty recently spoke out about deepfake scammers exploiting her public image (Image: GETTY)
The development of deepfakes has outpaced efforts to regulate or monitor the technology. While early deepfakes were often easy to spot, today’s creations are so convincing that it is nearly impossible to tell whether a video or voice recording is real. The speed at which AI replication technology advances only makes it harder for ordinary people to protect themselves.
One of the most shocking cases recently involved a French interior designer, Anne, who was tricked into believing she was in a romantic relationship with Fight Club actor Brad Pitt. Over the course of a year and a half, scammers used a series of deepfakes and fake social media profiles to convince Anne that the Hollywood actor needed money for medical treatment. She was swindled out of €830,000 (£700,000). Her story made national headlines in France – and Anne has since become the target of online ridicule.
She told a popular French YouTube show that she was not “crazy”, adding: “I just got played, I admit it, and that’s why I came forward because I am not the only one. Every time I doubted him, he managed to dissipate my doubts.” Anne is now said to be living with a sympathetic friend. She said: “My whole life is a small room with some boxes. That’s all I have left.”
Simone Simms, an art gallery owner in Nottingham, also found herself at the mercy of deepfake fraud. She was led to believe Pierce Brosnan, the James Bond actor turned artist, had agreed to exhibit his paintings at her gallery and would attend a meet-and-greet. Scammers used AI to create convincing video calls from what appeared to be Brosnan’s Hawaiian mansion. She sold £20,000 worth of tickets. Her dream was shattered when the real Brosnan denied any involvement and issued a cease-and-desist. It ruined her reputation and cost her a total of £30,000, forcing the gallery’s closure.
Closer to home, BBC presenter Naga Munchetty recently faced a more invasive scam when faked nude images of her were used in paid advertisements on X (formerly Twitter) and Facebook. The images were crudely doctored to appear authentic. Another scam linked to Munchetty exploited her trusted public image to lure users into cryptocurrency schemes.
Furious about the violation, Munchetty spoke out, warning of how her name and reputation were being used to “hoodwink people out of money”.
Jessica Tye, regulatory project manager at the Advertising Standards Authority, says scammers are capitalising on the public’s interest in celebrity gossip and endorsements. “We see new stars being used in these scam ads all the time, depending on who is in the news or just who scammers have alighted on.”
But deepfake scams are a far cry from traditional fraud. In the past, scammers would spend weeks or months grooming victims through calls, emails or even in person.
Today, scammers can trick victims into sending money in minutes. Rourke says: “Deepfakes enable fraudsters to cut corners. They no longer need to spend months to get close to an individual or learn how to infiltrate an organisation, they simply fake videos and jump straight to the end.”
He adds: “These doubles are so convincing because they’re unbelievably accurate and the pace at which they become more accurate is only increasing. Importantly we are wired to trust what we see and hear, so deepfakes take advantage of that.”