A UK bank has warned that “millions” of individuals could be at risk of scams involving AI voice cloning.
Starling Bank, an online lender, noted that fraudsters can use AI to mimic a person’s voice from as little as three seconds of audio, taken for example from a video posted online. Scammers can then identify the person’s friends and family and use the cloned voice to make phone calls requesting money.
These scams have the potential to “catch millions out,” according to Starling Bank, which stated that hundreds have already been affected. A recent survey of over 3,000 adults, conducted by the bank in collaboration with Mortar Research, found that more than a quarter of respondents had encountered an AI voice-cloning scam in the past year.
Additionally, the survey revealed that 46% of respondents were unaware that such scams existed, and 8% indicated they would send money to a friend or family member, even if the call seemed unusual. “People frequently post content online with recordings of their voice, not realizing it increases their vulnerability to fraud,” said Lisa Grahame, chief information security officer at Starling Bank.
The bank recommends establishing a “safe phrase” with loved ones—an easy-to-remember, random phrase that differs from other passwords—to verify identity over the phone.
The bank advises against sharing the safe phrase via text, as the message could be intercepted by scammers. If it must be shared this way, the message should be deleted as soon as it has been read.
As AI technology becomes more adept at imitating human voices, concerns are rising about its potential misuse, such as aiding criminals in accessing bank accounts or spreading misinformation.
Earlier this year, OpenAI introduced its voice replication tool, Voice Engine, but opted not to release it to the public due to concerns over possible misuse of synthetic voices.