Parents warned of disturbing kidnapping scheme using AI copies of children’s voices

Phone scams have been around for a while, but recent advances in artificial intelligence technology are making it easier for bad actors to convince people they’ve kidnapped their loved ones.

Scammers are using artificial intelligence to replicate the voices of people’s family members in fake kidnapping schemes, convincing victims to send money in exchange for the safety of their loved ones.

The scheme recently targeted two victims in a Washington state school district.

Highline Public Schools in Burien, Washington, issued a September 25 announcement warning community members that the two individuals were targeted by “scammers falsely claiming to have kidnapped a family member.”

“The scammers played an [AI-generated] audio recording of the family member, then demanded a ransom,” the school district wrote. “The Federal Bureau of Investigation (FBI) has observed a nationwide increase in these scams, with a particular focus on families who speak a language other than English.”

In June, Arizona mother Jennifer DeStefano testified before Congress about how scammers used artificial intelligence to trick her into believing her daughter had been kidnapped in a $1 million extortion plot. She began by explaining her decision to answer a call from an unknown number on a Friday afternoon.

“I answered the phone ‘Hello.’ On the other end was our daughter Briana sobbing and crying, saying, ‘Mommy,'” DeStefano wrote in her congressional testimony. “I didn’t think anything of it at first. … I casually asked her what happened while I had her on speakerphone, walking across the parking lot to meet her sister. Briana continued with, ‘Mom, I messed up,’ crying and sobbing.”

Arizona mother Jennifer DeStefano told Congress in June that fraudsters used artificial intelligence to fake her daughter’s abduction. US Senate

At the time, DeStefano had no idea that a bad actor had used AI to replicate her daughter’s voice on the other end of the line.

Then she heard a man’s voice on the other end yelling at her daughter, while Briana — or at least her voice — kept crying out that the bad guys had “got her.”

DeStefano says she heard her daughter, Briana, sobbing and crying, “Mommy,” but it was all AI. Instagram/briedestefano

“A threatening and vulgar man then took over the call. ‘Listen here, I’ve got your daughter. You tell anybody, you call the cops, I’ll pump her so full of drugs, I’ll have my way with her, drop her off in Mexico, and you will never see her again!’ All the while Briana was in the background desperately pleading, ‘Mom, help me!'” DeStefano wrote.

The men demanded a $1 million ransom in exchange for Briana’s safety, while DeStefano tried to contact friends, family and police to help her find her daughter. When DeStefano told the scammers she didn’t have $1 million, they demanded $50,000 in cash to be picked up in person.

While these negotiations were going on, DeStefano’s husband found Briana safely at home in her bed, unaware of the hoax involving her voice.

This is a text DeStefano sent during the fake kidnapping scam. Family handout

“How could she be safe with her father and yet be in the possession of kidnappers? It didn’t make sense. I needed to talk to Brie,” DeStefano wrote. “I couldn’t believe she was safe until I heard her voice say she was. I kept asking her if it was really her, if she was really sure, again, is this really Brie? Are you sure you are really sure?! My mind was spinning. I can’t remember how many times I needed reassurance, but when I finally understood the fact that she was safe, I was furious.”

DeStefano concluded her testimony by noting how AI is making it harder for people to trust what they see with their eyes and hear with their ears — especially in virtual settings or over the phone.

Briana wasn’t kidnapped at all. Her father found her at home in bed. Instagram/briedestefano

Bad actors have targeted victims across the United States and around the world. They are able to replicate a person’s voice through two main tactics: first, by collecting voice data when a person answers an unknown call from fraudsters, who then use AI to make that same voice say complete sentences; and second, by collecting voice data from public videos posted on social media.

That’s according to Beenu Arora, CEO and chairman of Cyble, a cybersecurity company that uses AI-driven solutions to stop bad actors.

DeStefano told lawmakers that AI has made it difficult for people to trust their eyes and ears. US Senate

“So you’re basically talking to them, assuming somebody’s trying to have a conversation, or … a telemarketing call, for example. The goal is to get the right data through your voice, which they can go further and emulate … through some other AI model,” Arora explained. “And that’s becoming much more prominent now.”

He added that “extracting audio from a video” on social media “isn’t that difficult.”

“The longer you talk, the more data they’re collecting,” Arora said of the scammers, “and also your voice modulations and your jargon and the way you talk. But … when you come to these ghost calls or idle calls, I in fact recommend you not talk too much.”

The National Institutes of Health (NIH) advises people to be wary of calls from unknown area codes or from numbers that are not the victim’s phone. Reuters

The National Institutes of Health (NIH) recommends that targets of this crime be wary of calls demanding ransom money that come from unknown area codes or from numbers that are not the victim’s phone. These bad actors also go to great lengths to keep victims on the phone so they don’t have a chance to contact the authorities, and they often demand that the ransom be sent through a wire transfer service.

The agency also recommends that people targeted by this crime try to slow the situation down, ask to speak directly with the victim, ask for a description of the victim, listen carefully to the victim’s voice, and try to contact the victim separately via phone call, text or direct message.

The NIH released a public service announcement saying that many NIH employees have fallen victim to this type of scam, which “usually begins with a phone call saying your family member is being held captive.” In the past, scammers pretended that a victim’s family member had been kidnapped, sometimes with the sounds of screaming in the background of a phone call or video message.

Recently, however, advances in AI have allowed fraudsters to replicate a loved one’s voice from social media videos, making the kidnapping plot seem more real. In other words, bad actors can use publicly available videos on the internet to copy the voice of a loved one and convince their targets that a family member or friend has been kidnapped.

Callers usually give the victim instructions on how to send money in exchange for the safe return of their loved one, but experts suggest that targets of this crime take a moment to stop and consider whether what they’re hearing on the other end of the line is real, even in a moment of panic.

“My humble advice … is that when you see [these kinds] of alarming messages, or someone is trying to push you [to] do things [with a sense of] urgency, it is always better to stop and think before you act,” Arora said. “I think as a society, it’s going to become much more challenging to identify the true versus the false because of the progress we’re seeing.”

Arora added that while some AI-centric tools are being used to proactively identify and combat AI fraud, the rapid pace of this developing technology means there is a long way to go for those working to prevent bad actors from taking advantage of unsuspecting victims. Anyone who believes they have fallen victim to this type of scam should contact their local police department.
