By David Shepardson
WASHINGTON (Reuters) - The Federal Communications Commission on Thursday finalized a $6 million fine for a political consultant over fake robocalls that mimicked President Joe Biden's voice, urging New Hampshire voters not to vote in that state's Democratic primary.
In May, Steven Kramer, a Louisiana Democratic political consultant, was indicted in New Hampshire over calls that appeared to have Biden asking residents not to vote until November. Kramer had worked for Biden's primary challenger, Representative Dean Phillips, who denounced the calls.
In January, Kramer told media outlets he paid $500 to have the calls sent to voters to draw attention to the dangers of artificial intelligence in campaigns.
The FCC said the calls used an AI-generated deepfake audio recording meant to sound like Biden's voice.
FCC rules prohibit transmission of inaccurate caller ID information. The commission said Kramer will be required to pay the fine within 30 days or the matter will be referred to the Justice Department for collection.
Neither Kramer nor a spokesperson could immediately be reached for comment.
"It is now cheap and easy to use Artificial Intelligence to clone voices and flood us with fake sounds and images," FCC Chair Jessica Rosenworcel said. "By unlawfully appropriating the likeness of someone we know, this technology can illegally interfere with elections. We need to call it out when we see it and use every tool at our disposal to stop this fraud."
In August, Lingo Telecom agreed to pay a $1 million fine after the FCC said it transmitted the New Hampshire fake robocalls.
The FCC said that under the settlement, Lingo will implement a compliance plan requiring strict adherence to the commission's caller ID authentication rules.
The commission in July voted to propose requiring broadcast radio and television political advertisements to disclose whether content is generated by AI. That proposal is still pending.
(Reporting by David Shepardson; Editing by David Gregorio)