- AI voice-clone scams are on the rise, according to security experts
- Voice-enabled AI models can be used to imitate loved ones
- Experts recommend agreeing a safe phrase with friends and family
The next spam call you receive might not be a real person – and your ear won’t be able to tell the difference. Scammers are using voice-enabled AI models to automate their fraudulent schemes, tricking individuals by imitating real human callers, including family members.
Scam calls aren't new, but AI-powered ones are a dangerous new breed. They use generative AI to imitate not just authorities or celebrities, but friends and family.
The arrival of AI models trained on human voices has unlocked a new realm of risk when it comes to phone scams. These tools, such as OpenAI’s voice API, support real-time conversation between a human and the AI model. With a small amount of code, these models can be programmed to execute phone scams automatically, encouraging victims to disclose sensitive information.
So how can you stay safe? What makes the threat so problematic is not just how easily and cheaply it can be deployed, but how convincing AI voices have become.
OpenAI faced backlash for its Sky voice option earlier this year, which sounded spookily like Scarlett Johansson, while Sir David Attenborough has described himself as “profoundly disturbed” by an AI voice clone which was indistinguishable from his real speech.
Even tools designed to beat scammers demonstrate how blurred the lines have become. UK network O2 recently launched Daisy, an AI grandma designed to trap phone scammers in a time-wasting conversation they believe is with a real senior citizen. It’s a clever use of the technology, but also one that shows just how well AI can simulate human interactions.
Disturbingly, fraudsters can train AI voices based on very small audio samples. According to F-Secure, a cybersecurity firm, just a few seconds of audio is enough to simulate the voice of a loved one. This could easily be sourced from a video shared on social media.
How AI voice-cloning scams work
The basic concept of a voice-clone scam is similar to standard phone scams: cybercriminals impersonate someone to gain the victim’s trust, then create a sense of urgency which encourages them to disclose sensitive information or transfer money to the fraudster.
The difference with voice-clone scams is twofold. Firstly, the criminals can automate the process with code, allowing them to target more people, more quickly and for less money. Secondly, they are able to imitate not just authorities and celebrities, but people known directly to you.
All that’s required is an audio sample, which is usually taken from a video online. This is then analyzed by the AI model and imitated, allowing it to be used in deceptive interactions. One increasingly common technique is for the AI model to imitate a family member requesting money in an emergency.
The technology can also be used to simulate the voices of high-profile individuals to manipulate victims. Scammers recently used an AI voice clone of Queensland Premier Steven Miles to try to execute an investment con.
How to stay safe from AI voice scams
According to Starling Bank, a digital lender, 28% of UK adults say they have been targeted by AI voice-clone scams, yet only 30% are confident that they’d know how to recognize one. That’s why Starling launched its Safe Phrases campaign, which encourages friends and family to agree a secret phrase which they can use to confirm each other’s identity – and that's a wise tactic.
1. Agree a safe phrase with friends and family
2. Ask the caller to confirm some recent private information
3. Listen for uneven stresses on words or emotionless talk
4. Hang up and call the person back
5. Be wary of unusual requests, like requests for bank details
Even without a pre-agreed safe phrase, you can use a similar tactic if you’re ever in doubt as to the veracity of a caller’s identity. AI voice clones can imitate a person’s speech pattern, but they won’t necessarily have access to private information. Asking the caller to confirm something that only they would know, such as information shared in the last conversation you had, is one step closer to certainty.
Trust your ear as well. While AI voice clones are very convincing, they aren’t 100% accurate. Listen for tell-tale signs such as uneven stresses on certain words, emotionless expression or slurring.
Scammers have the ability to mask the number they’re calling from and may even appear to be calling from your friend’s number. If you’re ever in doubt, the safest thing you can do is hang up and call the person back on the usual number you have for them.
Voice-clone scams also rely on the same tactics as traditional phone scams. These tactics aim to apply emotional pressure and create a sense of urgency, to force you into taking an action you otherwise wouldn’t. Be alert to these and be wary of unusual requests, especially requests to transfer money.
The same red flags apply to callers claiming to be from your bank or another authority. It pays to be familiar with the procedures your bank uses when contacting you. Starling, for example, has a call status indicator in its app, which you can check at any time to see if the bank is genuinely calling you.
from TechRadar - All the latest technology news https://ift.tt/xnlGcJH