The Federal Communications Commission on Thursday declared the use of voice-cloning technology in robocalls to be illegal, giving states another tool to go after fraudsters behind the calls.
The ruling takes effect immediately and comes amid an increase in such calls due to technology that offers the ability to confuse people with recordings that mimic the voices of celebrities, political candidates and even close family members.
“Bad actors are using AI-generated voices in unsolicited robocalls to extort vulnerable family members, imitate celebrities and misinform voters,” FCC Chairwoman Jessica Rosenworcel said. “State attorneys general will now have new tools to crack down on these scams and ensure the public is protected from fraud and misinformation.”
The FCC’s action follows an incident ahead of New Hampshire’s presidential primary last month in which a phony robocall impersonating President Biden encouraged voters not to cast ballots in the contest. An estimated 5,000 to 25,000 of the calls were made.
New Hampshire Attorney General John Formella on Tuesday said the AI-generated recording made to sound like the president has been linked to two Texas companies, with a criminal probe underway.
The artificial intelligence-produced disinformation targeting voters prompted two U.S. senators, Minnesota Democrat Amy Klobuchar and Maine Republican Susan Collins, to press the U.S. Election Assistance Commission to take steps to combat such disinformation campaigns.
The New Hampshire robocall is only the latest flashpoint involving AI-generated images, video and audio propagated online in an already contentious 2024 campaign cycle.