Faced with continued ethnic violence in Myanmar, Facebook banned the country’s commander-in-chief, the military’s television network and dozens of pages and accounts followed by almost 12 million people, the company announced on Monday.
Earlier on the same day, a United Nations fact-finding mission in Myanmar called for an independent investigation of Facebook’s role in what the mission’s report describes as a genocide against the Rohingya ethnic minority, directed in large part by Gen. Min Aung Hlaing.
The U.N. investigators found that “Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet” and that the company has been “slow and ineffective” in response to the ongoing crisis.
Earlier this year, a U.N. investigator said Facebook’s primary role in directing hate and inciting violence against the Rohingya showed the platform had “turned into a beast.”
“The ethnic violence in Myanmar has been truly horrific,” a Facebook blog post published on Monday read. Admitting they “were too slow to act,” spokespeople for the social media giant insist they are now “making progress — with better technology to identify hate speech, improved reporting tools, and more people to review content.”
The company removed 18 Facebook accounts, one Instagram account and 52 Facebook Pages.
Facebook founder Mark Zuckerberg told the U.S. Senate in April that the company would hire dozens of Burmese speakers to handle hate speech from Myanmar. A recent Reuters review found a host of content on Facebook attacking Rohingya and other Muslims from Myanmar.
Monday’s U.N. report spoke at length about Facebook’s role and the need to know more:
The role of social media is significant. Facebook has been a useful instrument for those seeking to spread hate, in a context where for most users Facebook is the Internet.
Although improved in recent months, Facebook’s response has been slow and ineffective. The extent to which Facebook posts and messages have led to real-world discrimination
and violence must be independently and thoroughly examined. The Mission regrets that Facebook is unable to provide country-specific data about the spread of hate speech on its platform, which is imperative to assess the adequacy of its response.
The United Nations report concludes that “the gross human rights violations and abuses” committed in Myanmar “are shocking for their horrifying nature and ubiquity.”
Both the U.N. and Facebook agreed Monday that the company’s response to the platform’s role in the violence has been inadequate. Facebook has admitted this before, telling Reuters that the number of Burmese speakers the company is paying in its content monitoring operations is “not enough.”
Myanmar is just one front in an ongoing global disinformation emergency that Facebook is dealing with. In the U.S. earlier this year, Zuckerberg pledged to Congress to curb “fake news.” Last week, the company removed hundreds of accounts tied to influence campaigns directed at the United States.
While much of the activity was discovered by Facebook itself, cybersecurity company FireEye contributed intelligence and cooperated with Facebook in taking down suspected bad actors. A report from FireEye offered insight into the scale of the problem Facebook now confronts.
“The activity we have uncovered highlights that multiple actors continue to engage in and experiment with online, social media-driven influence operations as a means of shaping political discourse,” an assessment from FireEye read last week.