New York, June 12: As usual, any good technology can be turned into a menace in the wrong hands. This time, it is the use of artificial intelligence by scammers.
In a disturbing development, criminals are cheating victims by using AI-cloned voices of their loved ones. The trick is to convince the victim that a family member or friend is in trouble.
Scammers are extorting money by using AI-cloned voices to convince victims that a loved one has been kidnapped or hurt in an accident. A voice sample as short as a 20-second clip from social media can be enough for cloning.
One person heard what sounded like their grandson saying he had been hurt in a car accident. However, when they checked, the grandchild was safe at school. Another mother heard her daughter sobbing while a man demanded a ransom. However, the abduction was fake and the girl's voice was an AI clone.
While voice-cloning scams are relatively new, criminals are already using them to cheat people. These scammers are sophisticated and tech-literate, and they can operate from anywhere in the world. Such scam calls are currently on the rise in the US and many other parts of the world.
How to Protect Against the Latest Modus Operandi of Cybercriminals
If you receive a distress call from a loved one, it is advisable to double-check. Try to reach them directly or through another family member. Also, limit the personal information and media you share on public platforms.
Artificial intelligence has made it harder than ever to spot fakes and find correct information. AI-generated voices are now almost indistinguishable from human speech, and many freely available apps allow anyone to create an AI voice clone in a few seconds.
A recent report claims that one in four people has experienced an AI voice-cloning scam, and 70 per cent of people could not tell the difference between an AI-generated voice and a human one. Convincing videos can also be made using deepfake technology.
(The above story first appeared on LatestLY on Jun 12, 2023 01:26 PM IST. For more news and updates on politics, world, sports, entertainment and lifestyle, log on to our website latestly.com).