The advance of artificial intelligence has become a double-edged sword. While its positive potential is enormous, the negative side keeps growing: among the latter are scams that use cloned voices of family members.
CBS recently reported the case of the American Jennifer DeStefano, who received a desperate call from what sounded like her 15-year-old daughter, Brianna. It wasn’t her, but an AI clone of her voice.
“She says: ‘Mom, these bad men have me, help me, help me, help me,’” DeStefano recounted. “And this man gets very aggressive: ‘Listen, I have your daughter.’ And that’s when I went into panic mode.”
The caller demanded a payment of one million dollars. Fortunately, DeStefano realized in time, contacted her daughter, and avoided the scam.
“A voice is like a fingerprint. It’s that unique fingerprint that is being exploited and turned into a weapon. It has to stop,” DeStefano told CBS.
When this happens to older people, the scams often succeed.
The case of the grandparents and their grandson’s “emergency”
That was the case of Ruth Card and Greg Grace, 73 and 75 years old, who recounted what happened to The Washington Post. A voice similar to that of their grandson, Brandon, called them, saying he was in jail, without a wallet or cell phone, and needed money for bail.
“We definitely had this feeling of… fear… We have to help him right now,” Ruth told the American newspaper. The two went to their bank in Regina, Saskatchewan, to withdraw 3,000 Canadian dollars.
Since that was the bank’s withdrawal limit, they went to another branch to withdraw more money.
But a manager at that bank noticed the situation and took them to his office: another customer had received a similar call. The voice had been cloned with artificial intelligence, and the person on the phone was most likely not their grandson.
How to avoid being scammed by AI voice cloning?
Faced with this type of situation, the standing recommendation is to stay calm during the call and try to verify through other means whether it is really the relative calling.
We recently published in FayerWayer the recommendations of Niklas Myhr, professor at Chapman University, to avoid being scammed by voice cloning.
The first: agree on a secret word with loved ones, which serves as a verification key in high-pressure situations.
And the second: always be skeptical in the face of this type of call.
You can find the link to Myhr’s recommendations below.