Deepfake Con: Kerala Senior Citizen Duped In India's First Case

In a mind-boggling incident in Kerala, India, a senior citizen fell victim to what may be the country's first deepfake con. Radhakrishnan P S, a 72-year-old retiree, received a distressing call from an apparent “old colleague” in dire need of hospital funds. What seemed like a genuine video call with his colleague turned out to be a sophisticated deepfake scam. Deceived by the doctored audio and video, Radhakrishnan immediately transferred INR 40,000. He later filed a complaint with the Kozhikode Cybercrime Police Station under Section 420 of the IPC and Sections 66(C) and 66(D) of the IT Act, 2000.

Deepfake technology uses algorithms to alter audio clips, images, or videos so that the original person is replaced with someone else in a way that appears authentic. Radhakrishnan's case may mark the first instance in India where deepfake technology has been used to defraud an individual. With the proliferation of AI tools, the fear is that deepfakes could become a significant element of future cybercrime.

The Unique Aspects of the Case:

The intriguing aspect of this case is the use of deepfake technology during a live video call with the victim. The suspect allegedly held their face very close to the camera, so that a face was visible but too hazy to be clearly made out. Deceived by a voice resembling his colleague's and the partial glimpse of the face, Radhakrishnan transferred the money.

Furthermore, the scammer displayed uncanny knowledge of mutual friends, shared family photos via WhatsApp, and used a fake voice that sounded eerily like the colleague's to gain Radhakrishnan's trust. Investigators have traced the transaction to a suspect named Kaushal Shah from Ahmedabad, Gujarat, who has been involved in multiple cases of financial fraud.

The Mechanics of Deepfake Technology:

Creating convincing deepfakes is a complex process that requires substantial research on the target. It typically involves collecting 30,000 to 200,000 images of the person, along with a large number of audio samples. The quality and quantity of the images significantly affect how realistic the deepfake looks. Scammers may use apps to animate photographs with facial expressions and movement, or employ basic face-swapping techniques.
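To make the face-swapping idea concrete, here is a minimal, illustrative sketch using OpenCV's bundled Haar cascade detector (it assumes the opencv-python package, and the file names are placeholders). It is nothing like the neural-network pipelines behind convincing deepfakes described above; it only demonstrates the basic "replace one face region with another" step.

```python
# Crude face-swap sketch: detect a face in each image, resize the source face,
# and blend it over the target face. Illustrative only; file names are placeholders.
import cv2
import numpy as np

def largest_face(image):
    """Return (x, y, w, h) of the largest detected face, or None."""
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    return max(faces, key=lambda box: box[2] * box[3])

def crude_face_swap(target_path, source_path, output_path):
    """Paste the face from source_path onto the face in target_path."""
    target = cv2.imread(target_path)
    source = cv2.imread(source_path)
    if target is None or source is None:
        raise FileNotFoundError("Could not read one of the input images")
    t_box, s_box = largest_face(target), largest_face(source)
    if t_box is None or s_box is None:
        raise ValueError("No face detected in one of the images")

    tx, ty, tw, th = t_box
    sx, sy, sw, sh = s_box
    source_face = cv2.resize(source[sy:sy + sh, sx:sx + sw], (tw, th))

    # seamlessClone blends the pasted region so its edges are less obvious.
    mask = 255 * np.ones(source_face.shape, source_face.dtype)
    center = (int(tx + tw // 2), int(ty + th // 2))
    output = cv2.seamlessClone(source_face, target, mask, center, cv2.NORMAL_CLONE)
    cv2.imwrite(output_path, output)

if __name__ == "__main__":
    crude_face_swap("target_frame.jpg", "source_photo.jpg", "swapped.jpg")
```

Even this toy approach suggests why a hazy, up-close face on a small phone screen, like the one Radhakrishnan saw, can be enough to fool a victim: the blur hides exactly the artefacts that would give a crude swap away.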

Generating realistic audio deepfakes also demands a large volume of audio data. Convincing audio is easier to produce in English; languages such as Hindi present challenges because of the extensive audio data required.
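As a rough, assumed illustration of what "a large volume of audio data" means in practice, the short utility below simply totals the hours of recorded speech in a folder of WAV clips. The clips directory name is a placeholder and no voice cloning is performed; the point is only that an attacker first has to amass hours of a target's voice.

```python
# Tally how many hours of .wav speech recordings sit under a directory.
# Purely a data-volume illustration; "clips" is a placeholder path.
import wave
from pathlib import Path

def total_speech_hours(directory: str) -> float:
    """Sum the durations of all .wav files under `directory`, in hours."""
    total_seconds = 0.0
    for path in Path(directory).rglob("*.wav"):
        with wave.open(str(path), "rb") as wav_file:
            total_seconds += wav_file.getnframes() / wav_file.getframerate()
    return total_seconds / 3600.0

if __name__ == "__main__":
    print(f"Collected speech: {total_speech_hours('clips'):.2f} hours")
```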

Global Concerns and Regulatory Measures:

Deepfake technology has raised significant global security concerns. Scammers have employed deepfakes in various crimes, including impersonation, identity theft, blackmail, extortion, political manipulation, evidence manipulation, and celebrity scams.

Governments worldwide are increasingly alarmed by the implications of deepfake technology. Regulatory measures are being implemented to address this emerging threat. However, the legal framework surrounding deepfakes remains a work in progress, often lacking clarity on what constitutes a criminal offense.

As deepfake technology advances, distinguishing between authentic content and deepfakes becomes increasingly difficult. The cloak of anonymity allows scammers to operate from anywhere in the world, leaving law enforcement chasing shadows. Combating deepfake scams requires vigilance, cautious online behavior, and limited sharing of personal data. Global governments must develop comprehensive regulatory frameworks to address the evolving threat of deepfake technology.

In a world where technology outpaces the law, the battle against deepfake scams presents a multifaceted challenge for law enforcement agencies. The urgency of addressing this issue is further underscored by upcoming elections and the potential for deepfake-driven political manipulation. The perpetrator in this case remains at large and efforts are underway to bring them to justice, but there may be countless others lurking in the shadows, ready to deploy the same deception tactics.