Facebook Clamps Down On Posts, Ads That Could Undermine U.S. Presidential Election

Facebook CEO Mark Zuckerberg testifies before the House Energy and Commerce Committee in April 2018 on Capitol Hill.
Chip Somodevilla / Getty Images

Updated at 10:55 a.m. ET

Facebook said it won't accept any new political ads in the week leading up to the presidential election, one of several policies that CEO Mark Zuckerberg said will help ensure a fair election in November. One such measure involves deleting posts that claim people will get COVID-19 if they vote.

"This election is not going to be business as usual," Zuckerberg said Thursday about the vote that is now two months away. "We all have a responsibility to protect our democracy."

The new policies are also Facebook's latest attempt to respond to long-held criticisms over its handling of political content. Critics say the company gives a free hand to those who use the platform to mislead voters, and they accuse Facebook of applying inconsistent standards to members of the public, political figures and advertisers.

Facebook is already working to help people register to vote and to clarify how the election will work during a pandemic, Zuckerberg said. And in light of the intense political antagonism in the U.S., he added that the social media company will act "to reduce the chances of violence and unrest."

The company recently joined Twitter in removing accounts that spread "false stories about racial justice, the Democratic presidential campaign of Joe Biden and Kamala Harris and President Trump's policies," as NPR's Bobby Allyn reported.

Those removed accounts were linked to Russian state actors, suggesting Russia is once again seeking to influence U.S. elections by spreading misinformation and amplifying divisions among Americans.

But there is a new wrinkle for the Nov. 3 vote, Zuckerberg said. He noted that while Facebook has removed dozens of international networks that spread bogus stories, "we're increasingly seeing attempts to undermine the legitimacy of our elections from within our own borders."

To address those concerns, the company and its CEO outlined several changes Thursday in how Facebook handles ads, posts and other content.

If a post aims to undermine the legitimacy of the election – for instance, by "claiming that lawful methods of voting will lead to fraud," Facebook said – the company will attach an informational label. A similar approach will cover any content whose goal is to delegitimize the election's outcome.

Other posts will be removed outright if they "claim that people will get COVID-19 if they take part in voting," the company said. If a post doesn't go quite that far but still seeks to use COVID-19 to suppress voter participation, Facebook will attach a link to verified information about the coronavirus.

Facebook would still allow advertisers to "adjust the targeting" for ads that were accepted before it institutes the one-week quiet period in late October.

The policy restricting new political or issue ads in the final week of the campaign season does not mean Facebook users won't see any political ads in that time. Instead, advertisers can continue running ads that have already been published and scrutinized.

"It's important that campaigns can run get out the vote campaigns, and I generally believe the best antidote to bad speech is more speech," Zuckerberg said, "but in the final days of an election there may not be enough time to contest new claims."

The company's plan also includes potential measures that could be taken to rein in candidates or campaign accounts on Facebook that claim victory before final election results are in. In those cases, Facebook said, it will tack on a label that sends readers to authoritative election results, from either Reuters or the National Election Pool.

The company will also limit the ability to forward content on its Messenger platform in hopes of "reducing the risk of misinformation and harmful content going viral," Zuckerberg said.

"We've already implemented this in WhatsApp during sensitive periods and have found it to be an effective method of preventing misinformation from spreading in many countries," he added.

Facebook has faced intense scrutiny over how it approaches political content and ads. Most notably, Russian operatives used the platform to spread disinformation ahead of the 2016 U.S. presidential election.

New criticisms emerged early this year, and on many fronts. When Facebook said in January that it would continue to allow political advertisers to target users, Federal Election Commissioner Ellen Weintraub responded by saying that Facebook's "weak plan suggests the company has no idea how seriously it is hurting democracy."

Critics also said the company fell short in its response to an inflammatory post by Trump in late May in which the president revived the phrase "when the looting starts, the shooting starts."

Twitter hid its version of Trump's post behind a warning label, but Facebook declined to take such action. In the weeks that followed, members of the public and Facebook employees complained about the company's handling of racist and hateful rhetoric, leading a number of large corporations to pause their advertising on the social media platform.

By late June, as NPR's Shannon Bond reported, Facebook had reversed its position somewhat, saying it would "put warning labels on posts that break its rules but are considered newsworthy."

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Bill Chappell is a writer and editor on the News Desk in the heart of NPR's newsroom in Washington, D.C.