FCC bans AI-generated voices in robocalls

The Federal Communications Commission (FCC) on Thursday banned the use of artificial intelligence (AI)-generated voices in robocalls, just weeks after a message imitating President Biden went out to New Hampshire residents ahead of the state’s primary election.

The agency unanimously adopted a ruling recognizing AI-generated voices as “artificial” under the Telephone Consumer Protection Act, which restricts telemarketing calls and the use of artificial or prerecorded voice messages.

The law gives the FCC the ability to fine robocallers and block calls from telephone carriers facilitating illegal robocalls, the agency noted in a press release. It also allows consumers and organizations to sue robocallers.

“It seems like something from the far-off future, but it is already here,” FCC Chairwoman Jessica Rosenworcel said in a statement. “Artificial Intelligence-generated voice cloning and image creating tools are now more accessible and more likely to be used for fraud.” 

“This technology can confuse us when we listen, view, and click, because it can trick us into thinking all kinds of fake stuff is legitimate,” she continued.
Rosenworcel pointed to several recent incidents, including the circulation of explicit AI-generated photos of pop superstar Taylor Swift last month and the AI version of actor Tom Hanks that promoted a dental plan online last fall.

She also appeared to reference the fake Biden call, citing “calls from candidates for political office that are designed to confuse us about where and when to vote.” The call that went out to Granite State residents last month urged them to save their vote for November’s election and not cast their ballot in the primary contest.

“No matter what celebrity or politician you favor, or what your relationship is with your kin when they call for help, it is possible we could all be on the receiving end of these faked calls,” the FCC chair added.