
Voice-Cloning AI: The New Frontier in Scamming and Identity Theft

As the festive season approaches, experts are issuing an urgent alert: the holiday-themed TikTok or Facebook videos you post could be exploited by scammers using AI voice-cloning technology to defraud people, including vulnerable family members like Grandma.

Even more concerning is the potential misuse of that friendly voicemail greeting you leave. Experts now suggest that instead of a personal message like “Hi, this is Sally; I can’t come to the phone right now,” it’s safer to rely on the standard, pre-recorded voicemail greeting that’s not personalized.

This isn’t the cheerful news we want to hear as we ring in 2025, but it’s a critical message we cannot afford to overlook.

AI Technology Can Mimic Our Voices

Fraudsters now have access to AI voice- and video-cloning tools, which experts believe will significantly amplify fraudulent activity in the coming years.

Scammers are keen to capture our voices and videos to convincingly impersonate us for financial gain. They might pose as someone close, like a grandchild needing help to escape a legal bind, or a supervisor requesting payment for an ambiguous bill, among other scenarios.

The FBI has raised alarms that AI tools represent an increasing risk to both consumers and businesses, as cybercriminals use AI for sophisticated phishing and social engineering schemes.

In early December, Michigan’s Attorney General Dana Nessel alerted citizens that the rapid evolution of AI technology is being exploited to create “deepfake audio and video scams that are so lifelike they can deceive even our closest acquaintances.”

While we aren’t currently hearing about numerous voice-impersonation fraud cases in our area, experts warn that we should brace for an imminent surge in these scams and take action to safeguard ourselves.

According to Greg Bohl, chief data officer for Transaction Network Services—which supports the telecommunications sector—criminals only require about three seconds of your voice to accurately reproduce it, capturing distinctive elements like pitch, tone, and speech speed using affordable AI tools.

Criminals often gather data available on social media or other sources to aid in voice replication.

“The longer the audio they can access, the more precise their imitation can be,” Bohl shared in a video conference.

He described a 30-second voicemail or social media clip as a “treasure trove for malicious actors.”

Many scammers are already adept at disguising their calls as coming from recognizable businesses or government agencies. Frequently, real names are even used to create the illusion of legitimacy.

However, this latest development in AI voice cloning will elevate scams to an unprecedented level, making it harder for consumers to identify fraudulent robocalls and texts.

The Federal Communications Commission cautions that AI could be employed to make it seem as though celebrities, elected officials, or even friends and family members are calling. The FCC is collaborating with state attorneys general to eradicate illegal AI-generated calls and messages.

Scammers Conduct Background Research to Sound Authentic

Scammers need to identify who among your friends and family would be likely to assist you in a crisis. Therefore, they do their research to find potential targets before orchestrating a fake emergency call asking for money.

Over the holidays, Bohl said, any interactions on social media meant to connect with loved ones could increase your vulnerability to fraud.

His two principal recommendations are:

No. 1: Switch to automated voicemail.

No. 2: Establish a family “safe word.”

With scammers using replicated voices, fraudulent calls will seem much more convincing. Consequently, it is essential to have a secure way to verify identities before taking any action during such calls.

Consider using questions that are not easily guessable, such as: What tricks can the dog perform in the morning? What was your most cherished childhood memory? What was your highest golf score? You want something that a scammer cannot quickly figure out or find online. (And if you don’t have a dog or play golf, a question like that could trip up a scammer all on its own.)

“We should anticipate a considerable uptick in AI-powered fraudulent activities in 2025,” stated Katalin Parti, an associate professor of sociology and a cybercrime specialist at Virginia Tech.

She noted that the combination of social media and generative AI will lead to more advanced and perilous scams.

As part of their fraud tactics, scammers can use robocalls to capture voice samples from potential victims. It’s advisable to avoid engaging with such calls, even just to say “hello.”

Parti offers additional advice: Avoid reaching out to any phone number received through pop-ups, texts, or emails. Do not engage with cold calls, even if the caller ID shows a local number. If you opt to answer a call from an unknown number, allow the caller to speak first.

According to Siwei Lyu, a computer science and engineering professor at the University at Buffalo and director of the UB Media Forensic Lab, AI voice cloning poses a serious risk, particularly for financial scams aimed at older adults, as well as for misleading information in political campaigns.

He expressed concern that AI-generated voices can be challenging to distinguish from real voices, especially when used over the phone and in messages designed to elicit emotional responses—like believing a family member is in distress.

“Take a moment to step back and verify if the call is actually genuine,” Lyu advised, suggesting paying attention to specific indicators that might reveal an AI-generated voice.

“Be alert for unusual features, such as an unusually quiet backdrop, an absence of emotional inflection in the voice, or silence where you would typically hear breathing,” he added.

Advancements in Technology Can Enhance the Credibility of Scam Calls

It’s essential to remember that technology is continuously evolving: many phishing emails and messages already appear more authentic thanks to AI advancements.

The traditional advice to look for poor grammar or spelling mistakes as a way to spot scam emails may become ineffective, as AI tools help foreign criminals craft polished messages targeting U.S. consumers and businesses.

Among other threats, the FBI has warned that cybercriminals might:

  • Create brief audio snippets mimicking the voice of a loved one to pose as a grandchild or relative in distress, such as claiming they were arrested or in an accident. When the voice resembles someone familiar, individuals may be more inclined to panic and comply with requests for money, like bail or ransom. Similarly, they may feel pressured to respond swiftly when their “boss” demands that they buy gift cards from Best Buy to settle an invoice. Always remain skeptical.
  • Utilize AI-generated audio clips to impersonate individuals and gain unauthorized access to bank accounts.
  • Deploy realistic videos in private communications to make their online persona appear to be a genuine person.

It is often hard to fathom how cybercriminals thousands of miles away could know what our voices sound like, but significant information is readily available beyond just a basic voicemail.

School events are being streamed online. Business seminars are available for viewing online. Often, our jobs require us to share information online to promote the brand.

“There is escalating worry that cybercriminals could infiltrate voicemail systems or even telecommunications providers to access voicemail messages left for doctors’ offices or financial advisers,” said Teresa Murray, the director of the Consumer Watchdog office for U.S. PIRG, a nonprofit advocacy group.

She emphasized that these risks have become more pressing following incidents like the massive data breach at National Public Data, a company that collects data for background checks, which was reported in August.

It’s indeed disheartening.

Murray highlighted that the spread of scams necessitates discussions among family members to ensure everyone is aware that computers can mimic the voices of people we know.

It’s also vital to discuss the unreliability of Caller ID in confirming that a legitimate government agency is calling.

Feel Free to Hang Up When Necessary

Michigan Attorney General Nessel issued a warning about possible holiday scams leveraging artificial intelligence, advising consumers to:

  • Establish a family “code word,” known only to its members, to verify identities during suspicious calls.
  • Be prepared to hang up; if something doesn’t feel right, just disconnect.
  • Verify someone’s identity by calling a number you recognize as legitimate.
  • Exercise caution when asked to transfer money. Scammers frequently demand payment via cryptocurrency, gift cards, or wire transfers, and once the money is sent, it is challenging to track or recover.