12/07/2023 / By Cassie B.
Cybersecurity experts have warned that countless people could lose their savings in the coming years as advancements in AI make it much easier for criminals to get their hands on people’s money.
Marc Maisch, a lawyer who helps individuals recoup savings lost to cybercrime, warned in a recent interview with the German newspaper Die Welt that a major crime wave is on the horizon, one in which clever cybercriminals could scam millions of people out of their money with help from AI tools.
He explained: “AI is a great tool, but of course cybercriminals also use it to create phishing emails or to have ChatGPT write malicious code that can be used immediately. And we haven’t even talked about deepfakes and voice phishing yet, i.e., recreating faces and voices.”
AI voice cloning scams are one concern that has been on the rise lately. New technology enables thieves to clone the voice of someone they find on social media and then place fake calls to that person’s friends or family, impersonating the victim and claiming to be in danger and in urgent need of money or sensitive information.
For example, they might find a clip of a teenager’s voice online, such as in one of their TikTok videos; with as little as three seconds of audio, they can clone it with AI and make that person’s voice say whatever they want, even adding emotions such as fear. They can then call the teen’s parents and claim to need money for some type of emergency, such as a car accident.
Another approach is deploying AI-generated code to steal people’s personal information or hack a large number of accounts at once. Love scams are also growing in popularity now that AI has made them more practical for thieves to carry out. In the past, a love scammer had to devote a great deal of time to a single potential victim, conversing with them, convincing them they were in love, and then trying to get them to send money. With AI chatbots, these scams can be automated and carried out at scale, targeting an endless number of potential victims at once.
AI can also improve the algorithms criminals use to guess users’ passwords, enabling them to analyze vast password datasets and quickly generate likely variations. Deepfakes, meanwhile, can be used to trick people into transferring money to other parties.
The technology is improving so rapidly that it won’t be long before scammers achieve unimaginable levels of sophistication.
“We are currently seeing the start of a revolutionary development in cybercrime. Next year, the year after that, this will have reached a whole new level,” Maisch, who is also a cybercrime expert, added.
He also warned that victims of these crimes have very little recourse given their global nature. In one case, a cybercriminal stole 800,000 euros from one of his clients, and the police did little about it because the perpetrators were in another country and deemed too difficult to catch, a problem common to cases like this. Many police departments currently lack the expertise to investigate these crimes effectively.
Even though AI is still relatively new and many people don’t fully understand how it works, the public is already very concerned about its potential. A recent Agency Forward survey carried out by Nationwide found that 82 percent of Americans are worried about criminals using AI to steal people’s identities. Unfortunately, as these attacks become more and more sophisticated thanks to the power of AI, there may not be much people can do to protect their money and identity from determined cybercriminals.