Technology keeps advancing, and it no longer stops at generating hyper-realistic images, such as those of Pope Francis wearing fashionable clothes or Donald Trump being arrested. Cases have now drawn the attention of the authorities: scammers are imitating the voice of a family member with artificial intelligence (AI) to commit fraud.
Phishing is taken to another level
What seems like a scene from a science fiction movie is now a reality. One of humanity's great nightmares is having someone impersonate your identity, something that was already happening with the multimedia content known as deepfakes.
However, the technique has been perfected since 2019, when The Wall Street Journal reported on a case: a group of cybercriminals used AI software to impersonate the voice of a CEO. The result? They got a British company to transfer about 220,000 euros.
Four years have passed since that incident, and cases have been increasing. It has reached the point that the Federal Trade Commission (FTC), the agency in charge of receiving fraud complaints in the United States, issued an alert about this method of deception.
How the new scams that imitate a relative's voice with AI operate
Artificial intelligence tools are increasingly sophisticated: an audio clip posted on Facebook, Instagram, YouTube, or TikTok may be all scammers need to steal your voice. About 30 seconds is enough. Just like Ursula in The Little Mermaid, criminals use that fragment to mimic a relative's voice with AI. What do they do with it? They can call your relatives pretending to be you and ask for money for an emergency.
“A scammer could use artificial intelligence to clone your loved one’s voice. All you need is a short audio of your family member’s voice that you might get from content posted on the internet and a voice cloning computer program. When the scammer calls you, it will sound like your loved one,” says the FTC.
According to a story from The Washington Post (TWP), Ruth Card, a 73-year-old woman, received a call from a scammer posing as her grandson Brandon. She says he sounded just like him. So when he told her he was in jail, without a wallet or cell phone, and needed her help to post bail, Ruth did her best to help him.
“It was definitely this feeling of… fear,” she said. “We have to help him right now.”
So she and her husband, Greg Grace, went to the bank and withdrew $2,207, the maximum she could get from the ATM, then went to another bank for more money. There, however, the manager took them into his office and explained that it might be a scam. At that moment they realized they had been deceived.
Greg told the outlet that “we were convinced we were talking to Brandon.”
More and more voice impersonation scams
“Technology is making it easier and cheaper for bad actors to impersonate voices, convincing people, often the elderly, that their loved ones are in danger,” explains TWP.
These scams became so common in the United States that they ranked second among fraud categories in 2022, with more than 36,000 reports of people being defrauded by someone posing as a friend or family member.
According to FTC data, these scams accounted for more than $11 million in losses.
“Artificial intelligence is no longer some crazy idea from a science fiction movie. We are living with it here and now,” emphasizes the FTC.
What can be done about calls that use AI to commit fraud
Experts say the authorities are not equipped to stop the growing scam: victims have few or no leads to the offender, and police have a hard time tracing calls.
On the other hand, there are still not enough legal precedents for authorities and courts to hold the companies that develop this type of technology responsible for its misuse. And that is only in the United States; the rest of the world is even further behind on the subject.
Even so, fraud can be identified and prevented. Keep in mind that these scams basically work the same way: a scammer pretends to be someone the victim trusts and frightens them into sending money. The problem now is that artificially generated voice technology makes the fraud much more convincing.
The FTC's advice is "don't trust the voice": call the person who is supposedly asking for help directly to check whether they really have an emergency. The most important sign that you may be dealing with a scammer is an immediate request for money, whether in cash, cryptocurrency, or even gift cards.
In the United States, you can report these scams on the FTC's portal.
Editorial Team: The editorial team of EMPRENDEDOR.com, which for more than 27 years has worked to promote entrepreneurship.