Jennifer DeStefano, a mother from Arizona, claims that scammers used AI technology to clone her daughter's voice in a $1 million kidnapping scheme. She received a call from someone claiming to have kidnapped her 15-year-old daughter, who was away on a skiing trip at the time. The caller used an AI simulation to imitate the daughter's voice and demanded a ransom of $1 million, later reduced to $50,000. DeStefano said she heard her daughter crying and begging for help in the background, which convinced her the call was genuine. Fortunately, she was able to confirm her daughter's location and safety after contacting the police.
The scammers replicated the daughter's voice using an AI simulation built from brief soundbites. The incident has raised concerns about the misuse of AI and the potential for scammers to manipulate people in new and sophisticated ways. Subbarao Kambhampati, a computer science professor and AI authority at Arizona State University, explained that AI simulations can come close to replicating someone's voice with as little as three seconds of audio.
To avoid such scams, Dan Mayo, an assistant special agent in the FBI's Phoenix office, advises people to be cautious with their personal information on social media, since scammers can use it to dig into their lives and craft a convincing story. Mayo recommends verifying a caller's identity by asking questions about loved ones that scammers wouldn't know.
This incident highlights the importance of being vigilant and aware of the potential for AI technology to be used in scams.