FCC bans robocalls using deepfake voice clones − but AI-generated disinformation still looms over elections
February 08, 2024
Joan Donovan, Assistant Professor of Journalism and Emerging Media Studies, Boston University
The Conversation
The Federal Communications Commission on Feb. 8, 2024, outlawed robocalls that use voices generated by artificial intelligence.
The 1991 Telephone Consumer Protection Act bans artificial voices in robocalls. The FCC’s Feb. 8 ruling declares that AI-generated voices, including clones of real people’s voices, are artificial and therefore banned by law.
The move follows on the heels of a robocall on Jan. 21, 2024, from what sounded like President Joe Biden. The message, in Biden’s voice, urged voters inclined to support Biden and the Democratic Party not to participate in New Hampshire’s Jan. 23 GOP primary election. The call falsely implied that a registered Democrat could vote in the Republican primary and that voting in the primary would make a voter ineligible to vote in the general election in November.
The FCC and the New Hampshire attorney general’s office are investigating the call. On Feb. 6, 2024, New Hampshire Attorney General John Formella identified two Texas companies, Life Corp. and Lingo Telecom, as the source and transmitter, respectively, of the call.
Injecting confusion
Robocalls in elections are nothing new, and they are not illegal; many are simply efforts to get out the vote. But they have also been used in voter suppression campaigns. What compounds the problem in this case is the use of AI to clone Biden’s voice.
In a media ecosystem full of noise, scrambled signals such as deepfake robocalls make it virtually impossible to tell facts from fakes.