AI tools such as ChatGPT and the myriad voice changers on the market can be useful and even fun to play around with, but stories like this highlight the dangers of such technology. One Arizona mother was terrified into believing that her daughter had been kidnapped and held for ransom.
As reported by WKYT, Jennifer DeStefano was subjected to more than just a prank call. She received a call seemingly from her 15-year-old daughter, who was out of town on a ski trip.
“Mom, I messed up,” it began, followed by an unknown man saying, “Put your head back, lie down.”
In every parent’s worst nightmare, the man went on to claim that he was holding Jennifer’s daughter hostage, making several horrific threats that unless she paid him $1 million she would not see her daughter again. The whole time, her daughter’s voice could be heard “going, ‘Help me, Mom. Please help me. Help me,’ and bawling.”
Thankfully, after calling her husband, Jennifer was able to confirm her daughter was safe, sound, and completely unaware of what was going on. It had all been a horrible trick. The criminals had used AI to imitate her daughter’s voice exactly.
“It was completely her voice. It was her inflection. It was the way she would have cried,” she said. “I never doubted for one second it was her. That’s the freaky part that really got me to my core.”
This situation is horrible, but sadly not an isolated incident. Even high-profile figures have had words put into their mouths. Drake and the Weeknd were targeted when a fake song featuring AI-generated versions of their voices was released.
More unforgivably, seven-time Formula One World Champion Michael Schumacher, who suffered a tragic ski accident in late 2013, appeared to give an exclusive interview to a German magazine earlier this month, sharing heartbreaking details about his family life. What a scoop, until it was revealed it was all faked, using an AI trained to produce quotes that sounded like him. Schumacher’s family is now taking the magazine to court.
How can we avoid being faked by AI?
Unfortunately, we now live in a world where merely granting people access to our likeness and clips of our voice can be used against us. Public figures will likely struggle to escape such scams, but regular citizens can take a few measures to protect themselves.
Setting social media accounts to private is a great way to ensure that only the eyes and ears of people you trust see your posts. Videos on Instagram, Facebook, and the like can reveal not only what you look like but also how you sound, and scammers need only a few seconds of audio to spoof your voice.
If you find yourself talking online or on the phone to someone you are wary of, ask them personal questions that only the real friend or family member would know, and never send money to, or click on links from, untrusted sources.