Facing a congressional panel on Thursday, Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai brushed off their platforms' role in the January Capitol insurrection.
It marked the tech giants' first appearance before Congress since hundreds of people fueled by social media messages stormed the building.
"The responsibility here lies with the people who took the actions to break the law and do the insurrection," Mark Zuckerberg told the Democratic-led House Energy and Commerce Committee via videoconference. "And the people who spread that content," he added. "Including the president."
Twitter's Jack Dorsey was the only social media CEO testifying to admit his platform bears responsibility.
Pichai said his company "always feels a deep sense of responsibility." But in this case, he said, "I think we worked hard. This election effort was one of our most substantial efforts."
Congress members pushed the three executives on their platforms' algorithms and their role in spreading false and violent content. The powerful trio is under increased scrutiny as Congress considers revamping Section 230, the law that shields big tech companies such as Facebook, Twitter and Google from liability for content their users post, including misinformation.
Rep. Jan Schakowsky (D-IL) asked directly about a Reuters interview in which Facebook COO Sheryl Sandberg said planning ahead of the siege mostly took place on smaller platforms.
"Certainly there was content on our services," Zuckerberg replied. "From that perspective I think there's further work that we need to do to make our services and moderation more effective."
In his written testimony, Zuckerberg urged Congress to consider dialing back Section 230. His proposal would grant liability protections only to companies with systems in place to identify and remove unlawful content.
And Dorsey acknowledged that Twitter can "do more" to open up and explain the algorithms that shape the content users see.
But behind their calls for tighter regulation is an army of lobbyists working to keep the nation's most influential — and profitable — companies on top.
A recent report from the nonprofit Public Citizen found that Facebook and Amazon are the two biggest corporate lobbying spenders in the nation. And 94% of Congress members with authority over antitrust and privacy issues have taken money from a big tech lobbyist or PAC.
Some say the proposals would ultimately benefit the tech giants, while smaller companies may struggle to build the new systems and teams dedicated to overseeing dicey content. Whatever happens is likely to have profound implications for how users experience social media.
In response to concerns about how the platforms moderated harmful and false posts, the tech titans defended their strategies. Zuckerberg cited the tags added to some 150 million posts that misrepresented the 2020 presidential election. Pichai said YouTube removed 13,000 channels for promoting violence and extremism between October and December 2020.
"There's a lot of impressive numbers in there," Carmen Scurato, senior counsel at the advocacy group Free Press, said during a livestream event prior to the hearing. "They're grading their own homework."
"Don't get wowed by these statistics," Dr. Joan Donovan from the Shorenstein Center at Harvard said during the YouTube livestream with Scurato.
Coalitions like the Real Facebook Oversight Board want more than numerical evidence.
The group of advocacy organizations and nonprofits formed in late 2020 to tackle the slew of Spanish-language posts that violate Facebook's policies but aren't caught by its algorithms.
"What kind of investment is Facebook making on the different languages to make sure that we have more of an accuracy?" asked Rep. Tony Cárdenas, a Democrat representing a heavily Latino portion of the San Fernando Valley.
Zuckerberg pointed to Facebook's international fact-checking program, an initiative he said is "something we invest a lot in and it will be something we continue to invest more in."
But advocates remained skeptical.
"We are not convinced one bit by Zuckerberg's empty promises and roundabout answers meant to distract us from the truth," said Brenda Victoria Castillo, president and CEO of the National Hispanic Media Coalition.
Facebook is letting misinformation around COVID-19 vaccines and election fraud run rampant through posts in Spanish, a group of advocacy organizations and lawmakers said.
"Facebook continues to fail to effectively moderate Spanish-language misinformation and online hate targeting the Latino community," said Jessica Cobian, senior campaign manager on tech policy at the Center for American Progress.
The Center for American Progress along with Free Press and the National Hispanic Media Coalition are running the campaign "Ya Basta Facebook" or "Enough Already, Facebook" under a coalition they've named the Real Facebook Oversight Board.
In a press briefing Wednesday, the group referenced an April 2020 report from the advocacy group Avaaz that found Facebook flagged 70% of misleading or false English-language posts surrounding COVID-19 but only 30% of similar posts in Spanish.
"It's really hard to track what's happening and how prolific the problem is," said Jessica J. González, co-CEO of Free Press. "There's a lack of transparency from Facebook about what's getting taken down."
In an email to dot.LA, Facebook spokesperson Kevin McAlister said the company is "taking aggressive steps to fight misinformation in Spanish and dozens of other languages."
"A key part of getting accurate information out is working with communities, which is why we're providing free ads to health organizations to promote reliable information about COVID-19 vaccines," he added.
But Cobian pointed to a handful of posts she said Facebook has refused to take down even though they violate the company's own policies. One post in Spanish, from June 2020, shows photos of armed men with a caption mistranslated into English as "Stop proud to defend your country."
Had the caption been correctly translated, the algorithm would have flagged it.
"The actual translation in Spanish should read, 'Stand proud to defend your country,'" said Cobian. "The correct translation shows that the post violates their policy against dangerous individuals and organizations."
In other cases, Cobian said, Facebook flagged posts about voter fraud with a "False information" tag instead of removing them from the platform.
Part of the problem is a lack of representation at Facebook, advocates argue.
"Facebook is headquartered in California, where Latinos are 40% of the population," said Brenda Victoria Castillo, president and CEO of the National Hispanic Media Coalition. "Yet Facebook has little to no Latino representation on their board and C-Suite positions."
She said the coalition came about after her group, based in Los Angeles, sent a letter to Facebook in September 2020 outlining their concerns. She didn't hear back. Two months later, the new group formed and sent a second letter.
Meanwhile, during congressional hearings in November, Mark Zuckerberg was asked how he would prevent the spread of Spanish-language misinformation ahead of the Georgia runoff elections.
"This is something that we are already working on and worked on ahead of the general election," he replied. "We're certainly committed to focusing on this."
The group met with Zuckerberg in December and presented a PowerPoint of a dozen Facebook posts they found concerning. They still weren't satisfied with the company's response.
Among the group's demands is that Facebook create a C-suite position to oversee U.S. Spanish-language content moderation. Ya Basta Facebook is also calling on Facebook to "publicly explain the translation process of content moderation algorithms."
Congressman Tony Cárdenas, a Democrat who represents a heavily Latino portion of the San Fernando Valley, said he will ask Mark Zuckerberg about his plans for moderation during a House hearing next week with tech CEOs.
"This is not going to be the first time we have Mark Zuckerberg and others in front of the Energy and Commerce Committee," he said. "But every time we do have them there, it's unfortunate that they tend to give us rhetorical answers instead of giving some commitments."
On Monday, Facebook announced a new plan to help users learn more about the COVID-19 vaccines and where to get them.
The company is introducing labels containing "credible information" from the World Health Organization that will be tacked onto Facebook and Instagram posts that discuss the vaccine. The label is rolling out in six languages, including English and Spanish.
Despite a crackdown on social media content that calls for violence, posts about conspiracy theories continue to proliferate on both fringe alt-right sites and mainstream platforms like Facebook and Twitter.
A report from Advance Democracy, a non-partisan nonprofit that conducts public-interest research and investigations, found that four of the five most popular tweets about the inauguration between January 15 and 18 promoted conspiracies about COVID-19, the election or both.
"As these false claims spread unchecked, it provides the fuel for other potential violence across the nation," said Advance Democracy President Daniel J. Jones.
Snap, Twitter, Facebook and other sites took down President Donald Trump's accounts in the days after a violent mob stormed the U.S. Capitol. Twitter alone shut down 70,000 QAnon-related profiles on Jan. 11, and companies including TikTok and YouTube introduced new restrictions on content. Parler, the social network where many Trump backers gathered, was removed from the Apple App Store, the Google Play Store and Amazon Web Services, where the site was hosted. It has partially returned, reportedly with the help of a Russian-backed internet service provider.
But conspiracies related to QAnon are still building across mainstream platforms; several claim that President Trump will begin a second term this week following a string of arrests. And many of these conversations are also going on on fringe platforms like GreatAwakening.win, TheDonald.win and 8kun.
"Trump isn't going anywhere," states a top post on GreatAwakening.win, a sister site of TheDonald dedicated to QAnon.
Jones wants legislators and platforms to do more to stem the spread of disinformation. But the attack at the Capitol has highlighted the difficulty officials will have drawing a line between curbing speech that incites violence and muzzling political expression. While social media sites have grown more aggressive, disinformation is not something they can easily stamp out.
Far-right conspiracy theorist Jack Posobiec's Twitter account saw the most engagement among users posting about Wednesday's inauguration, Advance Democracy said.
"They are instituting thought vetting for the troops guarding inauguration to make sure they aren't conservative," a tweet on Jan. 17 read. "This isn't about national security. Understand where it's all going."
Advance Democracy's report finds that Posobiec's posts about the inauguration "are consistently conspiratorial in nature."
Skeptics of the content moderation crackdowns following Jan. 6 said the statements and company policies came too late. Conspiracy theorists had been gathering online for years, culminating in a siege planned in part on platforms that did little to moderate that activity. The episode also points to looming questions for these sites about the growth of misinformation and their role in enabling it.
Karen North, a USC Annenberg professor of digital and social media, said that under Section 230, private companies maintain a legal right to make their own decisions about censoring and content moderation. They might make those decisions because of political pressure or to maintain "the kind of community they want to foster and cultivate."
Talking or posting about conspiracy theories is not illegal, North said. But it's important to watch these conversations online.
"Social media often has the opportunity for the authorities at least to keep an eye on the discussion and make sure that it doesn't go astray," she said.
Meanwhile, social media platforms are preparing to take down content on Inauguration Day. Snap has created a committee to conduct "regular proactive sweeps" of its platform, and TikTok has updated its community guidelines ahead of Biden's swearing-in.