
Almost 30% of UK adults targeted by AI voice cloning scams in the past year. Scammers only need three seconds of audio to clone your voice.

20 September 2024

28% of UK adults think they have been targeted by an AI voice cloning scam in the past year. Yet nearly half (46%) of UK adults do not know this type of scam even exists. Just 30% know what to look out for if they were to be targeted with a voice cloning scam.

Voice cloning scams – where fraudsters use AI technology to replicate the voice of a friend or family member – could be set to catch millions out, according to new research*.

The data, from Starling Bank, found that over a quarter (28%) of UK adults say they have been targeted by an AI voice cloning scam at least once in the past year. Yet, nearly half of UK adults (46%) have never even heard of such scams, let alone know how to protect themselves.

AI is giving fraudsters new ways to target people – they can now use voice cloning technology to replicate a person’s voice from as little as three seconds of audio, which can easily be captured from a video someone has uploaded online or to social media.

Scam artists can then identify that person’s family members and use the cloned voice to stage a phone call, voice message or voicemail to them, asking for money that is needed urgently. In the survey, nearly 1 in 10 (8%) say they would send whatever was requested in this situation, even if they thought the call seemed strange – potentially putting millions at risk.

Despite the prevalence of this attempted fraud tactic, just 30% say they would confidently know what to look out for if they were being targeted with a voice cloning scam.

Starling Bank has launched the Safe Phrases campaign, in support of the government’s Stop! Think Fraud campaign, encouraging the public to agree a ‘Safe Phrase’ with their close friends and family that no one else knows, so they can verify that they are really speaking to each other. Then, if they are contacted by someone purporting to be a friend or family member who does not know the phrase, they are immediately alerted to the fact that it is likely a scam.

To launch the campaign, Starling Bank has recruited leading actor, James Nesbitt, to have his own voice cloned by AI technology, demonstrating just how easy it is for anyone to be scammed.

Commenting on the campaign, Nesbitt said: “I think I have a pretty distinctive voice, and it’s core to my career. So to hear it cloned so accurately was a shock. You hear a lot about AI, but this experience has really opened my eyes (and ears) to how advanced the technology has become, and how easily it can be used for criminal activity if it falls into the wrong hands. I have children myself, and the thought of them being scammed in this way is really scary. I’ll definitely be setting up a Safe Phrase with my own family and friends.”

Anyone can fall for an AI voice clone – even James Nesbitt

With criminals utilising increasingly sophisticated methods to elicit money, financial fraud offences across England and Wales are on the rise. UK Finance found offences jumped by 46% last year, and the Starling research found the average UK adult has been targeted by a fraud scam five times in the past 12 months.

Lisa Grahame, Chief Information Security Officer at Starling Bank, commented: “People regularly post content online which has recordings of their voice, without ever imagining it’s making them more vulnerable to fraudsters. Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a Safe Phrase to thwart them. So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.”

When told what AI voice cloning scams entail, 79% of UK adults said they were concerned about being targeted – more so than by HMRC / High Court impersonation scams (75%), social media impersonation scams (76%), investment scams (70%) or safe account scams (73%).

____________

*Based on research conducted with Mortar Research between 21 and 23 August 2024 among a representative sample of 3,010 UK adults.
