A terrifying new scam uses artificial intelligence to dupe people by replicating a loved one's voice. That's exactly what happened to a San Jose mother who recently lost hundreds of dollars.
A few weeks ago, Livie Hernandez picked up the phone and heard her 20-year-old daughter screaming on the other end.
The call came from a New Jersey number. The man on the line identified himself as a drug dealer and claimed that Hernandez's daughter had witnessed a drug deal and that he had kidnapped her.
Then he demanded thousands of dollars. He threatened to take her daughter to Tijuana, warning that Hernandez would never see her again.
So she followed the man's instructions and rushed to a nearby money transfer service in San Jose, where she first sent $400 in the name of a woman in Tultepec, Mexico.
Hernandez told the man that was all the cash she had, but then she heard her daughter screaming for help again. She scraped together her remaining money and sent it to the same woman.
Hernandez never heard from them again.
Finally, she called her daughter's cell phone and realized that her daughter had been safe the whole time. The call had been a scam.
"I've talked to this money transfer service, and they tell me this isn't the first fraud they've seen here," said FBI Special Agent Gilberto Lujan. He added that creating these attacks no longer requires prior or deep technical experience: "We have given people that ability."
Lujan is a member of a cybersecurity squad in the FBI's San Francisco Division. He said crimes like the one targeting Hernandez are currently investigated as virtual extortion cases, which makes tracking down leads difficult.
In fact, Hernandez said a family friend who is a police officer told her as much. That's why she never formally reported the incident to San Jose police.
When NBC Bay Area tried the number that had called her, the line had already been disconnected.
Still, the FBI recommends reporting such incidents to local authorities.
Some Bay Area technology companies are also stepping in to help.
“Our models can actually detect whether content is generated by AI, even when the human eye cannot tell,” said Hive CEO Kevin Guo.
San Francisco-based Hive offers a free online tool that lets users check whether images and audio are real.
While that may not help in a moment of panic like the one Hernandez faced, Guo recommends trying to draw out a longer conversation with anyone claiming to be a loved one.
"So if you ask that person a specific question, can they actually stay in character and answer instantly in that voice?" Guo said. "Normally it's pre-recorded, right? Because it costs a lot of money to generate it."
Hernandez now hopes that no one else has to go through what she did, and that the people who deceived her will someday be punished.