
US political consultant indicted over AI-generated Biden robocalls

A Louisiana political consultant has been indicted over a fake robocall that imitated U.S. President Joe Biden and sought to dissuade people from voting for him in New Hampshire’s Democratic primary election, the New Hampshire Attorney General’s Office said on Thursday.

Steven Kramer, 54, faces 13 felony charges of voter suppression, along with misdemeanor charges of impersonating a candidate, after thousands of New Hampshire residents received a robocall message asking them not to vote until November.

A lawyer for Kramer could not immediately be identified.

Separately, the Federal Communications Commission on Thursday proposed a $6 million fine over the robocalls, which it said used an AI-generated deepfake of Biden’s cloned voice, noting that its rules prohibit the transmission of inaccurate caller ID information.

It also proposed fining Lingo Telecom $2 million for allegedly transmitting the robocalls.

There is growing concern in Washington that AI-generated content could mislead voters in the November presidential and congressional elections. Some senators want to pass legislation before November that would address AI threats to election integrity.

“New Hampshire remains committed to ensuring that our elections remain free from unlawful interference and our investigation into this matter remains ongoing,” Attorney General John Formella said.

Formella said he hopes the state and federal actions will “send a strong deterrent signal to anyone who might consider interfering with elections, whether through the use of artificial intelligence or otherwise.”

On Wednesday, FCC Chairwoman Jessica Rosenworcel proposed requiring disclosure of AI-generated content in political ads on radio and TV, covering both candidate and issue advertisements, but did not propose prohibiting any AI-generated content.

The FCC said the use of AI is expected to play a substantial role in 2024 political ads. The FCC singled out the potential for misleading “deep fakes,” which are “altered images, videos, or audio recordings that depict people doing or saying things that [they] did not actually do or say, or events that did not actually occur.”

Source: voanews.com