Technology

AI driving more sophisticated scams, tech scholars tell lawmakers 

Chair of the Senate Special Committee on Aging, Sen. Bob Casey, D-Pa., attends a committee hearing, Thursday, May 18, 2023, on Capitol Hill in Washington. (AP Photo/Jacquelyn Martin)

Tech scholars and scam victims told lawmakers during a Senate hearing Thursday morning that artificial intelligence (AI) is driving more sophisticated scams.

“AI amplifies the impact of scams, enhancing their believability and emotional appeal through personalization,” said Tahir Ekin, professor and director of the Center for Analytics and Data Science.

Lawmakers agreed, saying that although AI has many beneficial uses, its malicious uses pose a difficult challenge for regulators.

“The main takeaway is that AI can be used for that [scamming], and also a tool that can be used against it,” said Sen. Mike Braun (R-Ind.), ranking member of the Senate Special Committee on Aging. “That’s the conundrum, and we just have to figure it out.” 

Witness Gary Schildhorn testified about his experience being targeted by a scammer who used AI to mimic his son’s voice. The scammer, posing as an attorney, told Schildhorn his son had been driving drunk, failed a breathalyzer, caused a car accident and would need 10 percent of a $90,000 bail. Schildhorn was close to sending the scammer $9,000 when he called his daughter-in-law to ask about the situation, and she told him this was not the case.

“I said to Brett [his son] that there was no doubt in my mind that it was his voice on the phone—it was the exact cadence with which he speaks,” Schildhorn said. “I sat motionless in my car just trying to process these events. How did they get my son’s voice? The only conclusion I can come up with is that they used artificial intelligence, or AI, to clone his voice.” 

According to the Federal Trade Commission, older Americans are more likely to fall victim to scams than younger Americans.  

“It has long been an Aging Committee priority to protect older adults from fraud and from scams,” said committee Chair Sen. Bob Casey (D-Pa.), who also noted that AI makes scams harder to detect.

“While we are working to understand potential applications of AI, scammers have been integrating it into their schemes to make their ploys more life-like and convincing,” he said.  

Because regulation addressing the use of AI in scams remains limited, the witnesses suggested ways lawmakers could tackle the issue.

“My answer is that there needs to be some legislation that allows these people [anonymous scammers] to be identified,” Schildhorn said.   

Because no money was transferred, local law enforcement told Schildhorn they could not punish the scammer who targeted him and declined to open an investigation. Schildhorn then contacted his local newspaper, The Philadelphia Inquirer, to run a feature on his experience with the attempted scam.

When asked what advice he would give other scam victims, Schildhorn said, “I recommend two things: they publicize their story and suggest they contact their bank and recommend a policy where bank tellers are required to ask customers why they may be making an unusual cash withdrawal.”

Other witnesses called scamming “extortion” and said it should be treated as a crime, making scammers legally accountable for the distress they cause their victims.

“Extortion is a crime,” Steve Weisman, a scam expert and editor of scamicide.com, said. “Attempted extortion is a crime.”  

Experts also advised lawmakers to consider the balance between fostering technological innovation and protecting vulnerable Americans, who could be targeted by scammers using AI to make their scams almost undetectable.

“Striking a careful balance between fostering AI innovation and protecting vulnerable populations is paramount,” Ekin said.