You don’t need to look far to find an example of disinformation having a profound effect on international politics.
Consider Ireland’s abortion referendum, where foreign anti-abortion organizations targeted Irish social media users with specific ads. Or, last month, when Facebook announced it detected Russian propagandists masquerading as a Georgian fashion site. Or, of course, the 2016 U.S. presidential election.
Incidents like these are why the Atlantic Council is organizing a series of events and listening sessions next month in Brussels, Madrid and Athens where security experts can advise international lawmakers on how to stifle influence efforts. The goal is to help world leaders recognize campaigns that magnify false narratives, and act more quickly to stop them, like those used in debates over Catalan independence, Brexit, Ireland’s abortion referendum and the 2016 election.
“Western Europe and European Union countries once thought this is just a problem affecting the U.S. and Balkan states,” said Geysha Gonzalez, deputy director of the think tank’s Eurasia Center. “But it’s not just the Kremlin. It’s far-left and far-right domestic actors trying to influence public opinion.”
Misinformation, propaganda, cyber-operations and other influence campaigns have become ingrained in international policy debates as outsiders exploit social media to advance their own narratives.
Facebook last month said it removed nearly 800 accounts broadcasting messages that originated with Iranian state media about Israeli-Palestinian relations and the Syrian war. Twitter also has said users are magnifying government communications from Venezuela and Bangladesh, while experts suggest the spread of fake news swayed last year’s referendum in Macedonia.
Representatives from Facebook, Twitter, Google, FireEye and renowned security experts like Clint Watts, who has testified before the U.S. Congress about Russian interference, will participate in “#DisInfo Week Europe,” Gonzalez said. Geoffrey Pyatt, the U.S. ambassador to Greece, will also be in attendance at the Athens event.
The event series is scheduled to occur just months before European Parliament elections begin May 23. The European Commission also has proposed fining social media companies up to 4 percent of their annual revenue if they fail to remove extremist content within an hour, according to the Guardian. That legislation could be enacted following Britain’s exit from the EU.
Gonzalez points to the U.K. as a recent example of an admirable response to misinformation. After Sergei Skripal, a former Russian intelligence officer who spied for Britain, and his daughter, Yulia, were poisoned in Salisbury, England, the Kremlin’s state media tried to create a smokescreen around the story to direct attention away from the suspects in the poisoning, two Russian nationals.
In April, Sputnik published an editorial questioning whether British authorities were “deliberately hiding” Yulia Skripal. The Russian Embassy’s official Twitter account suggested the Skripals were never treated for chemical poisoning.
Britain’s Foreign Office, meanwhile, sent its own tweets and published its own videos exposing conflicting narratives in Russian media and stating that the poisoning followed “a well-established pattern of Russian state aggression.”
“They were ahead of the story, transparent about what they were doing and they informed the public,” said Gonzalez. “They had a great strategic communication response in an environment when the truth could have become more subjective…And we need to hear about solutions or tools people may have used where they addressed the challenges like this.”