
Facebook says it will step up efforts to stop coordinated campaigns that cause harm

The new policies directly build on Facebook's security work cracking down on coordinated inauthentic behavior. (Isriya Paireepairit / Flickr)

Facebook will ramp up efforts to curb coordinated campaigns by real users that are tied to real-world harms, such as promoting vaccine misinformation and organizing violence, the company said Thursday.

The new policy is an attempt to close a gap in the platform’s enforcement against real individuals who band together to repeatedly violate its standards. The plan is based on Facebook’s existing efforts to scrub the platform of fake accounts.

“From a security perspective, our goal is to borrow from the cybersecurity world and build an in-depth approach here, where we have multiple layers to catch violating activity that can cause harm to people on our platform,” Nathaniel Gleicher, Facebook’s head of security policy, said Thursday in a call with reporters.

Facebook will take a range of actions against accounts that violate the policy, from reducing the reach of their content to disabling the accounts entirely.


The new policies build on Facebook’s work cracking down on coordinated inauthentic behavior, such as the campaigns run by Russia’s Internet Research Agency in 2016. Facebook has since removed thousands of fake pages and inauthentic accounts tied to political operations from nations including Egypt and Iran, as well as hundreds of accounts linked to a conservative organization, Turning Point USA, ahead of the 2020 election. Often, fake accounts pose as legitimate media outlets.

The platform has long struggled to rein in coordinated campaigns by real users that spread content in violation of its rules. The most visible example was an apparent failure to detect the way real users relied on the site to coordinate “Stop the Steal” protests of the U.S. election results, contributing to misinformation in the lead-up to the riot at the Capitol on Jan. 6.

The company found that it had “little policy around coordinated authentic harm” and that it was wrestling with developing policies and tools to address such actions, according to an internal Facebook memo previously obtained by BuzzFeed. Facebook’s announcement Thursday bears out suggestions in the memo to take a more proactive approach against authentic coordinated behavior.

Under the new policy, Facebook removed roughly 150 accounts and pages associated with the Querdenken movement in Germany, a group that promotes the notion that government regulations around COVID-19 are a conspiracy to restrict citizens’ rights. The group has been linked to offline violence, one of the factors Facebook will weigh under its new policy.

The changes come as the company faces intense pressure from U.S. lawmakers and the White House to crack down on COVID-19 disinformation.


Gleicher emphasized that the company is taking a very narrow approach to the protocols for now, to distinguish between groups systematically causing harm and like-minded individuals authentically coming together to share similar beliefs on the platform.

The current system monitors for technical signals and behavior patterns that go beyond simply posting the same content. Gleicher declined to discuss those signals in further detail.

“There are some serious questions about where the lines should be drawn here,” said Gleicher. “There’s a broader societal conversation that needs to answer some of these questions.”

The new policy comes amid a series of investigations by the Wall Street Journal revealing Facebook’s failures to address “ill-effects” on the platform highlighted by internal research, including its uneven enforcement of high-profile accounts.

Written by Tonya Riley

Tonya Riley covers privacy, surveillance and cryptocurrency for CyberScoop News. She previously wrote the Cybersecurity 202 newsletter for The Washington Post and before that worked as a fellow at Mother Jones magazine. Her work has appeared in Wired, CNBC, Esquire and other outlets. She received a BA in history from Brown University. You can reach Tonya with sensitive tips on Signal at 202-643-0931. PR pitches to Signal will be ignored and should be sent via email.
