The number of posts that Facebook removed for violating its policies against hate speech in the second quarter of 2020 more than doubled from the previous quarter, the company said.
Facebook scrubbed 22.5 million pieces of hate speech — defined as violent or dehumanizing speech, statements of inferiority, slurs or calls for exclusion or segregation — from its platform, up from 9.6 million pieces of content in the first three months of the year. The uptick coincided with the removal of 14 networks that Facebook associated with “hate and/or white supremacist groups” such as the Ku Klux Klan, the Proud Boys and avowed neo-Nazi groups Atomwaffen and Blood & Honour.
The update comes as part of the firm’s regular community standards enforcement report. Facebook also removed 1.5 billion fake accounts during the same period.
The numbers come after civil rights attorneys Facebook hired to audit its content moderation program found what they described as a “deeply troubling” culture toward hateful content. Civil rights groups also have led an advertising boycott against Facebook over complaints that the social media firm has moved too slowly to combat misinformation and to remove posts from President Donald Trump that could result in voter suppression during the upcoming election.
The company also said in June that it removed advertisements from the president’s reelection campaign for violating the company policy against “organized hate.” The messaging included a large, downward-pointing red triangle that bore a striking resemblance to insignia the Nazis used to mark political prisoners incarcerated in concentration camps.
Facebook’s quarterly report Tuesday comes after an NBC News report revealed that Facebook groups dedicated to QAnon — an unfounded conspiracy theory that alleges critics of Donald Trump are involved with a network of pedophiles — include millions of users. Facebook has not disclosed the full scale of the QAnon community on its site, NBC reported. When pressed about the activity during a conference call with reporters Tuesday, a Facebook executive pointed to the company’s prior removal of 20 accounts involved in the movement.
Facebook also said on Tuesday it removed roughly 8.7 million pieces of content that it determined violated company policy on terrorism, up from some 6.3 million earlier in 2020, according to its community standards enforcement report.
The increases were “largely driven by improvements in our proactive detection technology to identify content that violates [Facebook] policy,” a company spokeswoman said.