“It was completely her voice.” – AI Technology Used to Clone Voice of Teen in Kidnapping Scam

A recent kidnapping scam in Scottsdale, Arizona, has highlighted the increasing ease of creating believable cloned voices with artificial intelligence (AI) software. In this case, scammers used AI to clone the voice of a 15-year-old girl and extort a ransom from her mother.

Jennifer DeStefano, the mother of the targeted girl, received a call from an unfamiliar number while she was at her daughter’s dance studio. She almost let it go to voicemail, but picked up because her daughter was out of town skiing and she feared there might have been an accident. It was her daughter’s voice on the line, sobbing and admitting to making a mistake. Then a man’s voice took over the call and threatened to harm DeStefano’s daughter if she called the police or anyone else. The man demanded a $1 million ransom, which he eventually lowered to $50,000.

DeStefano was surrounded by other parents at the dance studio who caught on to the situation and called for help. Within minutes, they confirmed that her daughter was safe and had never been in any danger. The scammers had used AI software to clone her daughter’s voice, which was so convincing that DeStefano never doubted it was her daughter.

“It was completely her voice. It was her inflection. It was the way she would have cried,” DeStefano said in a video interview with KPHO.

AI-generated voices are becoming easier to create and use, even for non-experts, thanks to advances in the technology. It used to take extensive recordings to create a believable cloned voice; now, a few seconds of recorded speech is enough. Experts say that AI voice generation is no longer solely in the hands of Hollywood and computer programmers.

Keep in mind that a recent AI model from Microsoft needs only a three-second clip of a voice in order to clone it.

These cases highlight the importance of being aware of the potential for cloned voices to be used in scams. Experts recommend being cautious when receiving unexpected calls, especially if they demand sensitive information or money. It’s always best to verify the identity of the caller before giving out any information or transferring any funds.
