The rapid development of AI is proving threatening, especially voice cloning, which is often used for impersonation. One such case was shared by a Florida lawyer on X, who explained that scammers used AI to mimic his voice and nearly convinced his father to send money by claiming he had been in a car accident and arrested for a DUI.
Mr. Jay Shooster shared his harrowing experience, saying that his father received a call in which he heard what sounded like his son asking for $30,000. But Mr. Shooster clarified in his post, "But it wasn't me," adding that there was no accident. Calling it an AI scam, he also listed out how it could have happened.
Read Also: Scammers Clone Audios From Videos Uploaded On Social Media
Referring to his recent TV appearance, the lawyer pointed out that just 15 seconds of his voice from TV had been more than enough to make a decent AI clone.
Emphasising the need for awareness, Mr. Shooster also highlighted a horrible side effect of this cloning practice: people in real emergencies will have to prove their identities to their loved ones with passwords. Questioning such scenarios, he asked, "Can you imagine your parent doubting whether they're actually talking to you when you really need help?"
Today, my dad got a phone call no parent ever wants to get. He heard me tell him I was in a serious car accident, injured, and under arrest for a DUI and I needed $30,000 to be bailed out of jail.
But it wasn’t me. There was no accident. It was an AI scam.
— Jay Shooster (@JayShooster) September 28, 2024
This shocking encounter with AI cloning stirred the internet and raised concern among many users. Mr. Shooster also suggested what to do when receiving such calls: "Tell them you're going to hang up and call them back on their phone. Ideally you would have agreed to use some kind of password in advance, or you can ask them questions that only they would know the answer to and that would be very hard for someone to find online."
Read Also: Voice-Based Payments: The Future Of Money?
Another user shared a similar experience, stating: "My dad got a call from my oldest son. He didn't fall for the scam though because he called him 'grandpa' which isn't what my son calls him. I'm glad he kept his wits about him and was able to realize immediately it was a scam. I've talked to my parents repeatedly and I'm so glad they listened."
The growth of AI has also made it easier for scammers to clone voices and impersonate others. According to reports, the US lost $2.7 billion to imposter scams alone, and about 83% of Indian victims of AI voice scams reported losing money.