Six current and former social media executives will testify before the Senate Homeland Security Committee Wednesday at a hearing that committee officials say will focus on how algorithms and targeted ads can amplify harmful content and threaten homeland security.
Several of the witnesses have become strong critics of the platforms, including a former Facebook vice president, Brian Boland, who said he left the company because he believed its products were further fueling racism in the aftermath of the George Floyd killing.
Wednesday’s hearing is the latest congressional attempt to better understand how social media exacerbates disinformation, a threat that Cybersecurity and Infrastructure Security Agency Director Jen Easterly recently called a surging and “incredibly difficult problem” plaguing her agency as it tries to protect the electoral system from cyber threats.
The White House also has been ratcheting up the pressure on social media giants in recent days. Last week, administration officials invited several prominent critics of social media to the White House for what it billed as a listening session. A White House “readout” of the event noted that content on tech platforms has led to “tragic acts of violence linked to toxic online cultures.”
Several participants in the White House meeting raised concerns about what the readout called “the rampant collection of vast troves of personal data by tech platforms,” alleging that this data collection augments misinformation and disinformation because social media platforms “maximize ‘user engagement’ for profit by using personal data to display content tailored to keep users’ attention — content that is often sensational, extreme, and polarizing.”
Also, on Tuesday, Twitter whistleblower Peiter “Mudge” Zatko testified before the Senate Judiciary Committee, telling lawmakers that Twitter is unable to track how its employees access internal data. Zatko said “a lack of fundamental tools and access controls” put the company at least a decade behind the rest of the industry and made it impossible for management to weed out spies.
A whistleblower complaint filed by Zatko in July included allegations of two incidents involving foreign spies. In one instance, Twitter knowingly allowed a non-engineering employee who was a state agent for India to retain access to internal dealings with the Indian government. In a second incident, the FBI alerted Twitter’s security team to the presence of a Chinese state agent in its ranks.
The White House released a list of “core principles for reform” last Thursday that notably included a call to remove special legal protections for large tech platforms. Currently, Section 230 of the Communications Decency Act gives social media companies special protections that “broadly shield them from liability even when they host or disseminate illegal, violent conduct or materials,” the White House readout said. “The President has long called for fundamental reforms to Section 230.”
The White House also called for increased transparency around platforms’ algorithms and content moderation decisions, saying that despite the central role social media plays in American life, tech platforms are “notoriously opaque” about their content moderation practices. As a result, the White House said, platforms are failing to provide sufficient transparency to allow the public and researchers to understand their content moderation decisions and “the very real dangers these decisions may pose.”
Boland is not the only former social media executive turned critic who will appear at the Wednesday hearing. He will be joined by Alex Roetter, former senior vice president of engineering at Twitter, who appeared in the documentary “The Social Dilemma” sounding an alarm about the platform.
Adam Conner, a former Facebook lobbyist who now works on technology policy at the Center for American Progress, said the hearing could reveal new details about the platforms’ business operations.
“This is a real opportunity to hear criticisms from people who can’t be dismissed because they were doing the work that drives the fundamental product,” Conner said. “There will be a really important focus on algorithmic amplification and recommendations in driving extremism, particularly that manifests in real-world violent events like January 6.”