Twitter has taken action against a slew of state-linked influence operations run from Russia, Iran and Armenia in recent days, the company announced Tuesday.
One Russian influence operation, believed to be run by state actors, shared information that aligned with the Russian government’s goals and sought to undermine the North Atlantic Treaty Organization, according to a Twitter blog post. Another Russian campaign appears to have links with the government-run troll farm that interfered in the 2016 presidential election in the U.S. Twitter removed the accounts because they impersonated people they were not.
The takedown of these efforts in recent days is emblematic of a pernicious threat that social media companies face in trying to establish ground truth on their platforms. Twitter has been working for years to oust manipulative influence operations from its platform, and while it has seen some success, it continuously runs into repeat offenders who spread disinformation, including in the case of the Kremlin-linked troll farm known as the Internet Research Agency.
A scheme apparently emanating out of Iran appears to be linked with a broader campaign Twitter has already sought to address. Twitter targeted this particular operation in previous months for its efforts to influence the 2020 presidential election in the U.S. Those 130 accounts had attempted to “disrupt the public conversation” during the first presidential debate last year.
The Iranian influence operation comprised the bulk of the most recent takedown, with 328 accounts violating Twitter’s platform manipulation policies. In all, Twitter suspended 373 accounts across four different networks of state-linked influence operations.
Twitter also announced that it conducted a takedown of an influence operation that appeared aligned with the Armenian government’s goals. The accounts in question pretended to be political figures or news organizations located in the country.
The news comes as stakeholders concerned about the power of disinformation and misinformation, and about the repercussions of influence operations spilling over into the real world, ratchet up their efforts to tamp down on those effects. An analysis conducted by NewsGuard and PeakMetrics, published earlier this month, showed that a whopping 87% of the news links shared on the social media platform Parler around the Capitol insurrection were filled with misinformation.
Following the January attack on the seat of government, disinformation experts told CyberScoop that correcting course on disinformation in the nation will likely take significant work and investment from the federal government moving forward.
Twitter also shared data about the accounts discussed in its update Tuesday with researchers from Stanford University.