A scammer faked a kidnapping with an AI-generated clone voice and demanded a ransom



Recent advances in AI have made it easy to reproduce a person's voice from recordings of that person. Jennifer DeStefano, who lives in Arizona, USA, received a phone call staging a fake kidnapping that used an AI clone of her daughter's voice.

'I've got your daughter': Mom warns of terrifying AI voice cloning scam that faked kidnapping

https://www.wkyt.com/2023/04/10/ive-got-your-daughter-mom-warns-terrifying-ai-voice-cloning-scam-that-faked-kidnapping/



AI clones teen girl's voice in $1M kidnapping scam

https://nypost.com/2023/04/12/ai-clones-teen-girls-voice-in-1m-kidnapping-scam/



When DeStefano received a call from an unfamiliar number, she initially intended to let it go to voicemail, but she was worried about the safety of her 15-year-old daughter, who was away skiing at the time, so she answered. When she picked up, she heard her daughter crying and yelling, 'Mama!'

When DeStefano asked what had happened, her daughter sobbed, 'I messed up.' DeStefano, already confused at hearing her daughter's voice, said she was then terrified to hear a man's voice order, 'Get back and lie down.'

On the phone, the man threatened, 'Listen here. I've got your daughter.' Behind him, DeStefano could hear her daughter's voice calling for help.

Initially, the man demanded a ransom of $1 million (about 130 million yen), but when DeStefano told him, 'I don't have that much money,' he lowered the demand to $50,000 (about 6.6 million yen).



Other mothers happened to be with DeStefano at the time; one called the police and another called DeStefano's husband. Just four minutes after the man's call began, her daughter was confirmed safe: she was in her room and apparently had no idea what had happened. DeStefano was indignant, saying, 'This is not a game.'

DeStefano recalled, 'The voice on the phone was completely my daughter's voice,' but the voice on the call is believed to have been an AI clone. Creating a clone voice used to require a large amount of source audio, but VALL-E, the speech synthesis AI Microsoft announced in January 2023, can synthesize a voice that reproduces a person's tone, including emotional inflection, from a sample of only 3 seconds.

Microsoft announces speech synthesis AI 'VALL-E' that can reproduce a person's voice from a sample of only 3 seconds - GIGAZINE



Dan Mayo, an FBI assistant special agent in charge, reports that scammers who use speech synthesis AI often find their targets on social media such as Twitter and Instagram. 'To avoid falling victim to such scams, it is important to keep your social media profile and posts private. When scammers find an account with a lot of personal information open to the public, they will thoroughly research that person.'

Mayo also recommends not answering calls from unfamiliar area codes or international numbers, and agreeing on a secret code known only to your family to use when asking for help. According to Mayo, some people, relieved that their family is safe, never report the scam, while others fall for it and pay the ransom.



'The FBI is thoroughly investigating these scammers and will definitely find them,' Mayo said.

In the United States, incidents in which AI-generated clone voices are used for impersonation ('it's me') scams are increasing rapidly, and losses in 2022 reportedly reached $11 million (about 1.5 billion yen) in the US alone.

Cases of AI clone voices being used for 'it's me' scams are increasing rapidly, with damages approaching 1.5 billion yen annually - GIGAZINE

in Note, Posted by log1r_ut