
After pro-Trump riot, experts urge US to tackle domestic disinformation

The reckoning is just beginning.
Dome of the U.S. Capitol by night. (Getty Images)

As Americans reckon with the ways that manufactured political narratives can influence public behavior following the riot at the U.S. Capitol, researchers who have spent years studying the issue warn that there’s no simple solution.

Disinformation campaigns on social media, sinking trust in journalism and a willingness among some lawmakers to spread conspiracies present a pernicious set of challenges for the federal government. While major technology firms have started to act against calls for violence, specialists say Congress, the intelligence community, the private sector and the incoming Biden administration must consider ways that Americans can improve media literacy before the problem hardens into a broader national security threat.

“This isn’t specifically about elections, or just the pandemic,” said Cindy Otis, vice president of analysis at Alethea Group, which tracks threats online. “Influence targets our economy. Domestic actors such as white supremacist groups use it. It can become a counterterrorism issue.”

Widespread belief in the debunked claim that the Democratic Party stole the election from President Donald Trump is the latest evidence that Americans largely don’t trust the media to report news fairly or accurately.

Some 60% of Americans lack confidence that journalists report news “fully, accurately, and fairly,” according to a Gallup poll conducted between August and September 2020. Fifty-eight percent of Republicans reported having no confidence in the news, a 10-point increase from 2019.

New roles in government

The numbers point to a difficult road ahead. Several lawmakers in recent weeks have urged the Biden administration to establish a coronavirus misinformation role. But the government should carve out more roles to properly address disinformation, including an intelligence community lead as well as a role focused on interagency coordination against attempts to undercut trust in the government, suggested Otis, a former CIA officer in the directorate of analysis.

“There needs to be an acknowledgement that disinformation affects topics across the board,” Otis said. “I think there needs to be a much stronger approach for the next administration.”

Such an idea could face significant roadblocks. Several Trump allies, including Rep. Matt Gaetz, R-Fla., one of the lawmakers who took part in the effort to overturn President-elect Joe Biden’s win, have spread conspiracy theories in recent days that “antifa” was part of the storming of the Capitol.

Gaetz’s role in spreading lies reveals just how mainstream conspiracy theories have already become, says Emerson Brooking, a resident fellow at the Atlantic Council’s Digital Forensic Research Lab.

“It will remain, to an extent, a voting bloc,” Brooking told reporters on a call Friday. “There will always be a temptation for politicians to court and feed this movement because they’re getting great political utility from doing so.”

The U.S. government hasn’t been entirely blind to the threat of information operations. The Department of Homeland Security’s cybersecurity arm, along with the FBI, has worked to dispel election-related misinformation. The Pentagon has also worked to disrupt foreign actors running influence campaigns targeting American politics.

But the insurrection at the Capitol exposed the gaps in current government programs when it comes to preventing conspiracy theories from spilling over into real-world harm, and to raising the alarm when they do.

Platforms

Twitter and Facebook have banned Donald Trump, while Amazon, Google and Apple have each taken action against Parler, where many pro-Trump insurrectionists planned the assault on the Capitol. But the fact that these digital blockades went up only after five people were killed, rather than after repeated warnings about how the president’s claims would be weaponized, hasn’t escaped notice.

“It shouldn’t have taken an actual, literal attempted coup and insurrection to get people to acknowledge that this content is not harmless, to acknowledge that coordinated campaigns of false information can radicalize people, can inspire real-world harm,” Otis said. 

Facebook said Monday that over the next several weeks it will block phrases that could incite violence, such as “stop the steal,” which can be interpreted as a call to arms.

Initial findings suggest that de-platforming can disrupt extremist networks. A 2017 study by researchers at the Georgia Institute of Technology found that Reddit’s 2015 removal of two hateful subreddits led to an 80% decrease in hate speech from accounts that had belonged to those communities.

Instead of downgrading harmful content when it’s politically convenient — and after violence has already been committed — experts are calling on social media companies to step up and take action when they know they can make a difference.

“At what point does the bottom line of a private company depend more on that company being accountable to a [democracy], as opposed to a revenue base? I don’t think we’ve reached a tipping point on that,” Graham Brookie, director and managing editor of the DFRLab, told CyberScoop.

Social media giants have been warned on multiple occasions of the kinds of harms that can result from violent speech on their platforms, and now that they’ve downgraded some content, they need to wrestle with their longer-term strategy to prioritize safety, says Otis. 

“My question is then ok, so how do you decide it’s safe to … turn the hate speech and incitement of violence back on?” Otis said.

Coordination issues

A coordinating body or forum designed to shore up the nation’s protections against domestic disinformation, made up of lawmakers, government officials and other stakeholders, could be useful in correcting course on disinformation in America, said Simin Kargar, a human rights lawyer and researcher at DFRLab.

“Efforts to combat disinformation need to be centralized through a coordinating agency with links to civil rights groups on one hand and the security establishment on the other,” Kargar said. “Conspiracy theories and domestic disinformation are by far the most prominent threat to liberal democracies.”

The U.S. government typically addresses tech policy, foreign policy, and domestic policy in silos, but that may not be appropriate in this case, Brookie says: “I would add a category to that: ‘shoring up democracy policy.’”

A forum like this on disinformation, akin to the Cyberspace Solarium Commission, which saw several of its cybersecurity proposals passed into law this winter, may need to come in the form of legislation, potentially a tall order when lawmakers are actively spreading misinformation about the pandemic and the election.

Addressing how the U.S. education system instructs students on media literacy will be key to any solution moving forward, Otis said.

“There needs to be a much more significant investment in education exactly on these issues, so integrating skills in critical thinking not just, for example, tacking on a digital literacy module that they do once a year but actually integrating it into all coursework,” Otis said.

One of the fundamental challenges is that many American lawmakers routinely struggle to understand the platforms through which much disinformation spreads.

“Lawmakers across the board do not understand the internet, do not understand information influence, do not understand information security in general,” Otis said, citing several congressional hearings with representatives of major platforms like Twitter and Facebook that went awry because of irrelevant questioning.

“There’s of course exceptions,” she said. “[But] it’s hard to have a lot of confidence that they have the skills and abilities they need to legislate appropriately.”


Written by Shannon Vavra

Shannon Vavra covers the NSA, Cyber Command, espionage, and cyber-operations for CyberScoop. She previously worked at Axios as a news reporter, covering breaking political news, foreign policy, and cybersecurity. She has appeared on live national television and radio to discuss her reporting, including on MSNBC, Fox News, Fox Business, CBS, Al Jazeera, NPR, WTOP, as well as on podcasts including Motherboard’s CYBER and The CyberWire’s Caveat. Shannon hails from Chicago and received her bachelor’s degree from Tufts University.
