AI Voice Cloning Scam: Tamil Nadu Police Warn People Against Fraudsters Using Artificial Intelligence-Based Voice Cloning
Chennai, April 27: The Cyber Crime Wing of the Tamil Nadu Police on Saturday warned people to be cautious of Artificial Intelligence (AI)-based voice cloning, which is being used in impersonation scams to cheat innocent people. In an advisory, the ADGP of the Cyber Crime Wing, Tamil Nadu Police, Sanjay Kumar, asked people to be wary of unsolicited calls received on their mobile phones.
The top police officer said that cyber fraudsters are now employing voice cloning to mimic the voices of trusted individuals, such as family members, over phone calls. According to the advisory, the voices are mimicked using advanced AI technologies. The callers, the ADGP said, invoke the pretext of an emergency and create a sense of urgency or distress to deceive victims into quickly transferring money into the scamsters' accounts.
The scamster, according to him, poses as someone the victim knows and trusts, such as a family member or friend, and is likely to claim an urgent need for financial assistance due to a fabricated emergency or threat. The top police officer said that the scamster uses various tactics to evoke a sense of urgency and emotional distress in the victim, and may employ sobbing or pleading tones while claiming to be in a dire situation that requires immediate help.
"The scamster utilises sophisticated AI software to clone the voice of the person they are impersonating. The scamster use voice sample of the person from social media posts/videos or by just talking to the person over the phone using a wrong number tactic," he added. This technology allows them to mimic the voice as well as the intonation and emotional nuance of the victim's trusted contact convincingly, he said.
ADGP Kumar added that the scamsters use the AI-generated cloned voice to commit cybercrimes. He said that after gaining the victim's trust, the scamster requests an immediate transfer of money to help resolve the supposed crisis. According to the Cyber Crime Wing, the fraudsters ask for fast and convenient payment methods such as the Unified Payments Interface (UPI) to expedite the transaction.
The victim is likely to comply with the scamster's demand without verifying the authenticity of the caller or the legitimacy of the situation. In the advisory, ADGP Kumar urged people to always verify the identity of the person calling, especially if they request urgent financial assistance, and to contact the friend or relative through a known and verified number to confirm their identity before taking any action. He said, "Be wary of unexpected requests for money, especially if they involve urgent situations or emotional manipulation."