
Facebook's Zuckerberg, Google's Pichai Brush Off Their Platforms' Role in Capitol Attack
Facebook CEO Mark Zuckerberg and Google CEO Sundar Pichai brushed off their platforms' role in the January Capitol insurrection as they faced a congressional panel on Thursday.
It marked the tech giants' first appearance before Congress since hundreds of people fueled by social media messages stormed the building.
"The responsibility here lies with the people who took the actions to break the law and do the insurrection," Mark Zuckerberg told the Democratic-led House Energy and Commerce Committee via videoconference. "And the people who spread that content," he added. "Including the president."
Twitter's Jack Dorsey was the only social media CEO testifying to admit his platform bears responsibility.
Pichai said his company "always feels a deep sense of responsibility." But in this case, he said, "I think we worked hard. This election effort was one of our most substantial efforts."
Congress members pressed the three executives on their platforms' algorithms and their role in spreading false and violent content. The powerful trio is under increased scrutiny as Congress considers revamping Section 230, the law that shields companies such as Facebook, Twitter and Google from liability for content their users post, including misinformation.
Rep. Jan Schakowsky (D-IL) asked directly about a Reuters interview in which Facebook COO Sheryl Sandberg said planning ahead of the siege mostly took place on smaller platforms.
"Certainly there was content on our services," Zuckerberg replied. "From that perspective I think there's further work that we need to do to make our services and moderation more effective."
In his written testimony, Zuckerberg urged Congress to consider reworking Section 230. His proposal would grant liability protections only to companies with systems in place to identify and remove unlawful content.
And Dorsey acknowledged that Twitter can "do more" when it comes to building and exposing the platform's algorithms that impact the content users see.
But behind their calls for tighter regulation is an army of lobbyists working to keep the nation's most influential, and most profitable, companies on top.
A recent report from the nonprofit Public Citizen found that Facebook and Amazon are the two biggest corporate lobbying spenders in the nation. And 94% of Congress members with authority over antitrust and privacy issues have taken money from a big tech lobbyist or PAC.
Some say the proposals would ultimately benefit the tech giants. Small companies, meanwhile, may struggle to build new systems and teams dedicated to overseeing dicey content. Whatever happens is likely to have profound implications for how users experience social media.
In response to concerns about how the platforms moderated harmful and false posts, the tech titans defended their strategies. Zuckerberg cited the tags added to some 150 million posts that misrepresented the 2020 presidential election. Pichai said YouTube removed 13,000 channels for promoting violence and extremism between October and December 2020.
"There's a lot of impressive numbers in there," Carmen Scurato, president of the advocacy group Free Press, said during a livestream event prior to the hearing. "They're grading their own homework."
"Don't get wowed by these statistics," Dr. Joan Donovan from the Shorenstein Center at Harvard said during the YouTube livestream with Scurato.
Coalitions like the Real Facebook Oversight Board want more than numerical evidence.
The group of advocacy organizations and nonprofits formed in late 2020 to tackle the slew of Spanish-language posts that violate Facebook's policies but aren't caught by its algorithms.
"What kind of investment is Facebook making on the different languages to make sure that we have more of an accuracy?" asked Rep Tony Cárdenas, a Democrat representing a heavily Latino portion of the San Fernando Valley.
Zuckerberg pointed to Facebook's international fact-checking program, an initiative he said is "something we invest a lot in and it will be something we continue to invest more in."
But advocates cast doubt.
"We are not convinced one bit by Zuckerberg's empty promises and roundabout answers meant to distract us from the truth," said Brenda Victoria Castillo, president and CEO of the National Hispanic Media Coalition.
Watch: The Future of Content Moderation Online
As Big Tech cracks down on moderation after the Capitol attack and Wall Street braces for more fallout from social media's newfound influence on stock trading, legislators are eyeing changes to Section 230 of the Communications Decency Act of 1996. On Wednesday, February 10, dot.LA brought together legal perspectives and the views of a founder and venture capitalist on the ramifications of changing the way that social media and other internet companies deal with the content posted on their platforms.
Craft Ventures general partner and former PayPal COO David Sacks, a critic of Big Tech moderation, called for amending the law during dot.LA's Strategy Session on Wednesday. Tyler Newby and Andrew Klungness, both partners at law firm Fenwick, laid out the potential legal implications of changing the law.
Section 230 limits the liability of internet intermediaries, including social media companies, for the content users publish on their platforms.
"Mend it, don't end it," Sacks said.
Sacks said he's concerned about censorship in the wake of companies tightening moderation policies. Pointing to Robinhood's recent decision to temporarily restrict trading in certain stocks, including GameStop, he said discussions about Big Tech's role in censorship are now unfolding in nonpartisan settings.
"Who has the power to make these decisions?" he said. "What concerns me today is that Big Tech has all the power."
Social media sites including Twitter pulled down former President Trump's account after last month's attack on the U.S. Capitol. But critics have said that these sites didn't go far enough in stopping conversations that provoked the violence.
To provide some external standard, Sacks called for the "reestablishment of some First Amendment rights in this new digital public square," which is to say, on privately owned platforms.
Newby pointed to a series of recent bills aimed at reining in the power of tech companies. Changes to moderation laws could have sweeping impacts well beyond giants like Facebook and Twitter.
"It's going to have a huge stifling effect on innovation," said Klungness, referring to a possible drop in venture capital to new startups. "Some business models may be just simply too risky or may be impractical because they require real-time moderation of content."
And if companies are liable for how their users behave, Klungness said, some companies may never take the risk in launching these companies at all. "Some business models may be just simply too risky or may be impractical because they require real-time moderation of content," he said.
Watch the full discussion below.
Strategy Session: The Future of Content Moderation Online
David Sacks, Co-Founder and General Partner of Craft Ventures
David Sacks is co-founder and general partner at Craft. He has been a successful tech entrepreneur and investor for two decades, building and investing in some of the most iconic companies of the last 20 years. David has invested in over 20 unicorns, including Affirm, Airbnb, Bird, Eventbrite, Facebook, Houzz, Lyft, Opendoor, Palantir, Postmates, Reddit, Slack, SpaceX, Twitter and Uber.
In December 2014, Sacks made a major investment in Zenefits and became the company's COO. A year later, in the midst of a regulatory crisis, the Board asked David to step in as interim CEO of Zenefits. During his one year tenure, David negotiated resolutions with insurance regulators across the country, and revamped Zenefits' product line. By the time he left, regulators had praised David for "righting the ship", and PC Magazine hailed the new product as the best small business HR system.
David is well known in Silicon Valley for his product acumen. AngelList's Naval Ravikant has called David "the world's best product strategist." David likes to begin any meeting with a new startup by seeing a product demo.
Kelly O'Grady, Chief Correspondent & Host and Head of Video at dot.LA
Kelly O'Grady is dot.LA's chief host & correspondent. Kelly serves as dot.LA's on-air talent, and is responsible for designing and executing all video efforts. A former management consultant for McKinsey, and TV reporter for NESN, she also served on Disney's Corporate Strategy team, focusing on M&A and the company's direct-to-consumer streaming efforts. Kelly holds a bachelor's degree from Harvard College and an MBA from Harvard Business School. A Boston native, Kelly spent a year as Miss Massachusetts USA, and can be found supporting her beloved Patriots every Sunday come football season.
Tyler Newby, Partner at Fenwick
Tyler focuses his practice on privacy and data security litigation, counseling and investigations, as well as intellectual property and commercial disputes affecting high technology and consumer-facing companies. Tyler has an active practice in defending companies in consumer class actions, state attorney general investigations and federal regulatory agency investigations arising out of privacy and data security incidents. In addition to his litigation practice, Tyler regularly advises companies large and small on reducing their litigation risk on privacy, data security and secondary liability issues. Tyler frequently counsels companies on compliance issues relating to key federal regulations such as the Children's Online Privacy Protection Act (COPPA), the Fair Credit Reporting Act (FCRA), the Computer Fraud and Abuse Act (CFAA), the Gramm-Leach-Bliley Act (GLBA), the Electronic Communications Privacy Act (ECPA) and the Telephone Consumer Protection Act (TCPA).
In 2014, Tyler was named among the top privacy attorneys in the United States under the age of 40 by Law360. He currently serves as a Chair of the American Bar Association Litigation Section's Privacy & Data Security Committee, and was recently appointed to the ABA's Cybersecurity Legal Task Force. Tyler is a member of the International Association of Privacy Professionals, and has received the CIPP/US certification.
Andrew Klungness, Partner at Fenwick
Leveraging nearly two decades of business and legal experience, Andrew navigates clients—at all stages of their lifecycles—through the opportunities and risks presented by novel and complex transactions and business models.
Andrew is a co-chair of Fenwick's consumer technologies and retail and digital media and entertainment industry teams, as well as a principal member of its fintech group. He works with clients in a number of verticals, including ecommerce, consumer tech, fintech, enterprise software, blockchain, marketplaces, CPG, mobile, AI, social media, games and edtech, among others.
Andrew leads significant and complex strategic alliances, joint ventures and other collaboration and partnering arrangements, which are often driven by a combination of technological innovation, industry disruption and rights to content, brands or celebrity personas. He also structures and negotiates a wide range of agreements and transactions, including licensing, technology sourcing, manufacturing and supply, channel partnerships and marketing agreements. Additionally, Andrew counsels clients in various intellectual property, technology and contract issues in financing, M&A and other corporate transactions.
Sam Adams, Co-Founder and CEO of dot.LA
Sam Adams serves as chief executive of dot.LA. A former financial journalist for Bloomberg and Reuters, Adams moved to the business side of media as a strategy consultant at Activate, helping legacy companies develop new digital strategies. Adams holds a bachelor's degree from Harvard College and an MBA from the University of Southern California. A Santa Monica native, he can most often be found at Bay Cities deli with a Godmother sub or at McCabe's with a 12-string guitar. His favorite colors are Dodger blue and Lakers gold.
Even as Social Sites Crack Down, Inauguration Posts Are Full of Misinformation, Report Finds
Despite a crackdown on social media content that calls for violence, posts about conspiracy theories continue to proliferate on both fringe alt-right sites and mainstream platforms like Facebook and Twitter.
Advance Democracy, a nonpartisan nonprofit that conducts public-interest research and investigations, found that four of the five most popular tweets about the inauguration between January 15 and 18 promoted conspiracy theories about COVID-19 and/or the election.
"As these false claims spread unchecked, it provides the fuel for other potential violence across the nation," said Advance Democracy President Daniel J. Jones.
Snap, Twitter, Facebook and other sites took down President Donald Trump's accounts in the days after a violent mob stormed the U.S. Capitol. Twitter alone shut down 70,000 QAnon-related profiles on Jan. 11, and companies including TikTok and YouTube introduced new restrictions on content. Parler, the social network where many Trump backers gathered, was removed from the Apple App Store, the Google Play Store and Amazon Web Services, where the site was hosted. It has partially returned, reportedly with the help of a Russian-backed internet service provider.
But QAnon-related conspiracy theories are still building across mainstream platforms, with many posts claiming that President Trump will begin a second term this week following a string of arrests. Many of these conversations are also taking place on fringe platforms like GreatAwakening.win, TheDonald.win and 8kun.
"Trump isn't going anywhere," states a top post on GreatAwakening.win, a sister site of TheDonald dedicated to QAnon.
Jones wants legislators and platforms to do more to stem the spread of disinformation. But the attack on the Capitol has highlighted how difficult it will be for officials to draw a line between curbing speech that incites violence and muzzling political expression. While social media sites have become more aggressive, it's not something they can easily stamp out.
Far-right conspiracy theorist Jack Posobiec's Twitter account saw the most engagement among users posting about Wednesday's inauguration, Advance Democracy said.
"They are instituting thought vetting for the troops guarding inauguration to make sure they aren't conservative," a tweet on Jan. 17 read. "This isn't about national security. Understand where it's all going."
Advance Democracy's report finds that Posobiec's posts about the inauguration "are consistently conspiratorial in nature."
Skeptics of the content moderation crackdowns that followed Jan. 6 said the statements and company policies came too late. Conspiracy theorists had been gathering online for years, culminating in a siege planned in part on platforms that did little to moderate them. The episode also points to looming problems for these sites over the growth of misinformation and their role in allowing it.
Karen North, a USC Annenberg professor of digital and social media, said that private companies maintain a legal right to make their own decisions over censoring and content moderation through Section 230. They might make decisions because of political pressure or to maintain "the kind of community they want to foster and cultivate."
Talking or posting about conspiracy theories is not illegal, North said. But it's important to watch these conversations online.
"Social media often has the opportunity for the authorities at least to keep an eye on the discussion and make sure that it doesn't go astray," she said.
Meanwhile, social media platforms are preparing to take down content on Inauguration Day. Snap has created a committee to conduct "regular proactive sweeps" of its platform, and TikTok has updated its community guidelines around Biden's swearing-in.