FCC to consider rules for AI-generated political ads on TV, radio, but it can't regulate streaming
The head of the Federal Communications Commission introduced a proposal Wednesday that would require political advertisers to disclose when they use AI-generated content in broadcast TV and radio ads.
The proposal, if adopted by the commission, would add a layer of transparency that many lawmakers and artificial intelligence experts have been calling for as rapidly advancing generative AI tools produce lifelike images, videos and audio clips that threaten to mislead voters in the upcoming U.S. election.
But the FCC, the nation's top telecommunications regulator, only has authority over TV, radio and some cable providers. Any new rules would not cover the explosive growth in advertising on digital and streaming platforms.
"As artificial intelligence tools become more accessible, the commission wants to make sure consumers are fully informed when the technology is used," FCC Chair Jessica Rosenworcel said in a statement Wednesday. "Today, I've shared with my colleagues a proposal that makes clear consumers have a right to know when AI tools are being used in the political ads they see, and I hope they swiftly act on this issue."
This is the second time this year that the commission has moved to address the growing use of artificial intelligence tools in political communications. Earlier, the FCC confirmed that AI voice-cloning tools in robocalls are banned under existing law. That decision followed an incident in New Hampshire's primary election in which automated calls used voice-cloning software to imitate President Joe Biden in order to dissuade voters from going to the polls.
If adopted, the proposal would ask broadcasters to verify with political advertisers whether their content was generated using AI tools — like text-to-image creators or voice-cloning software. The FCC has authority over political advertising on broadcast channels under the 2002 Bipartisan Campaign Reform Act.
But commissioners would still have to discuss several details, including whether broadcasters would have to disclose AI-generated content in an on-air message or only in the TV or radio station's political files, which are public. They also will be tasked with agreeing on a definition of AI-generated content, a challenge that has become fraught as retouching tools and other AI advancements become increasingly embedded in all kinds of creative software.
Rosenworcel hopes to have the regulations in place before the election.
Jonathan Uriarte, a spokesperson and policy adviser for Rosenworcel, said she is looking to define AI-generated content as that generated using computational technology or machine-based systems, "including, in particular, AI-generated voices that sound like human voices, and AI-generated actors that appear to be human actors." He said her draft definition will likely change through the regulatory process.
The proposal comes as political campaigns already have experimented heavily with generative AI, from building chatbots for their websites to creating videos and images using the technology.
Last year, for example, the Republican National Committee released an entirely AI-generated ad meant to show a dystopian future under another Biden administration. It employed fake but realistic images showing boarded-up storefronts, armored military patrols in the streets and waves of immigrants creating panic.
Political campaigns and bad actors also have weaponized highly realistic images, videos and audio content to scam, mislead and disenfranchise voters. In India's elections, recent AI-generated videos misrepresenting Bollywood stars as criticizing the prime minister exemplify a trend AI experts say is cropping up in democratic elections around the world.
Rob Weissman, president of the advocacy group Public Citizen, said he was glad to see the FCC "stepping up to proactively address threats from artificial intelligence and deepfakes, including especially to election integrity."
He urged the FCC to require on-air disclosure for the public's benefit and chided another agency, the Federal Election Commission, for its delays as it also considers whether to regulate AI-generated deepfakes in political ads.
Rep. Yvette Clarke, a Democrat from New York, said it's time for Congress to act on the spread of online misinformation, which the FCC doesn't have jurisdiction over. She has introduced legislation for disclosure requirements on AI-generated content in online ads.
As generative AI has become cheaper, more accessible and easier to use, multiple bipartisan groups of lawmakers have called for legislation to regulate the technology in politics. With just over five months until the November elections, they have yet to pass any bills.
A bipartisan bill introduced by Sen. Amy Klobuchar, a Democrat from Minnesota, and Sen. Lisa Murkowski, a Republican from Alaska, would require political ads to have a disclaimer if they are made or significantly altered using AI. It would require the Federal Election Commission to respond to violations.
Uriarte said Rosenworcel realizes the FCC's capacity to act on AI-related threats is limited but wants to do what she can ahead of the 2024 election.
"This proposal offers the maximum transparency standards that the commission can enforce under its jurisdiction," Uriarte said. "It is our hope that government agencies and lawmakers can build on this important first step in establishing a transparency standard on the use of AI in political advertising."