In its latest effort to root out disinformation on its platform, Facebook announced it is taking down 97 pages, groups, and accounts that originated in Russia, targeted Ukraine, and attempted to conceal who was behind them.
Facebook’s head of cybersecurity policy, Nathaniel Gleicher, emphasized that the company was taking them down because of their manipulative behavior, not because of the content they were posting.
The groups, accounts, and pages posted primarily about political topics of local concern in Ukraine, such as the conflict in Eastern Ukraine, the conflict in Syria, Russian politics, and European politics, Gleicher said in a blog post. The actors used fake accounts to disseminate information and also worked to redirect users to an external site that posted on similar topics.
In one case, as many as 34,000 users followed a page coordinating disinformation, and in another, as many as 86,000 accounts joined one of the groups.
Ben Nimmo, who has analyzed the pages that were taken down, told CyberScoop the operation does not appear to have been focused purely on promoting Kremlin propaganda. He said there are indications it was aimed at monetization.
“We ran the main website through VirusTotal but it showed up a piece of malware which can be used for hijacking your computer, which can then be mined for bitcoin,” said Nimmo, a Senior Fellow for Information Defense at the Atlantic Council’s Digital Forensic Research Lab. “Generally if you’re a propaganda outlet you’re pushing propaganda, if you’re a bitcoin miner you’re mining bitcoin.”
Some of the accounts that Facebook took down were linked to accounts removed in a previous takedown the company announced in March, which also included inauthentic behavior emanating from Iran, Macedonia, and Kosovo, according to Gleicher. There was “technical overlap” between that activity and the accounts removed this month, a Facebook official told CyberScoop.
Although the language and cultural clues all point to Russia, Nimmo said his team hasn’t found enough evidence to attribute this behavior to the Kremlin, let alone to any one actor.
“It’s a very important reminder that … even if something is coming from Russia there are quite a lot of reasons it might be coming from Russia,” Nimmo said. “There’s not enough evidence to attribute this to any particular actor … We certainly don’t have enough information to say.”
The behavior associated with this campaign was not as sophisticated as previous Russian state-linked interference, Nimmo said.
“It was a lot more primitive … The original troll farm we saw in 2017 had quite a lot of strong personality accounts,” he said. “It was all about engaging people and keeping them on the platform.”
Nimmo said some of the behavior looked like previous activity he has seen emanate from Iran.
“It was using social media to steer people towards this off-platform publication which is actually a pretty clunky way of doing it, it’s what we’ve seen the Iranians do.”