A recent study by the digital lender Starling Bank found that 28% of people have been targeted by an AI voice-cloning scam in the past year. Consumers are being warned that videos they post on social media could be exploited by scammers, who clone their voices with AI and then trick their family and friends out of cash, reports The Guardian.
Scammers can easily replicate a person's voice from audio in videos uploaded online. Using the cloned voice, they send voicemails to the victim's friends and family, urgently asking them to send money.
The study also found that nearly 28% of respondents were unaware that such scams exist, while 8% said they would send money if requested, even if a call from a loved one seemed strange.
Lisa Grahame, chief information security officer at Starling Bank, said people regularly post content online containing recordings of their voice without ever imagining that it could make them more vulnerable to fraudsters.
“Scammers only need three seconds of audio to clone your voice, but it would only take a few minutes with your family and friends to create a safe phrase to thwart them,” Grahame said. “So it’s more important than ever for people to be aware of these types of scams being perpetrated by fraudsters, and how to protect themselves and their loved ones from falling victim.”
The lender also suggested that people agree a safe phrase with close friends and family so they can verify whether a call is genuine.
The UK's cyber security agency has said that AI is making it harder to identify phishing messages, in which users are tricked into handing their passwords to fraudsters. Scammers have even managed to dupe large international businesses.
Hong Kong police began an investigation in February after an employee at an unnamed company claimed she had been duped into paying HK$200m of her firm’s money to fraudsters in a deepfake video conference call in which senior officers of the company were impersonated. The fraudsters are believed to have downloaded videos of the executives in advance and then used artificial intelligence to fake their likenesses in the video conference.