New Lawsuit Takes a Unique Approach To Holding Social Media Companies Accountable


Social media companies are often accused of hosting harmful content, but it’s very hard to successfully sue them. A federal law known as Section 230 largely protects the platforms from legal responsibility for hate speech, slander and misinformation created by their users.

But a new lawsuit blaming TikTok for the deaths of two children is taking a different approach. Rather than accuse the company of failing to moderate content, the complaint claims TikTok is a dangerous and defective product.

The suit, filed last week in Los Angeles County Superior Court, takes aim at the video-sharing app’s recommendation algorithm, alleging that it served up videos depicting the deadly “Blackout Challenge,” in which people choke themselves to achieve a euphoric feeling. Two children—8-year-old Lalani Erika Walton and 9-year-old Arriani Jaileen Arroyo—died last year after allegedly attempting the challenge, the suit said.

“We believe that there is a fundamental flaw in the design of the algorithm that directs these children to this horrific thing,” Matthew Bergman, the lawyer for the children's families, told dot.LA. Bergman is the founding attorney for the Social Media Victims Law Center, a self-described legal resource for parents of children harmed by social media.

Section 230 has long been an obstacle for social media’s opponents. "You can't sue Facebook. You have no recourse,” U.S. Sen. Richard Blumenthal, a Democrat from Connecticut, said last year after Facebook whistleblower Frances Haugen detailed Instagram’s toxic effect on young girls. The federal law’s defenders contend that Section 230 is what allows websites like YouTube and Craigslist to host user-generated content. It would be infeasible for companies to block all the objectionable posts from their massive user bases, the argument goes.

The strategy of bypassing that debate altogether by focusing on apps’ designs and features has gained steam lately. In May, an appellate panel ruled that Santa Monica-based Snap can’t dodge a lawsuit alleging that a Snapchat speed filter—which superimposed users’ speeds on top of photos and videos—played a role in a deadly car crash at 113 mph. The judges said Section 230 didn’t apply to the case because the lawsuit did not seek to hold Snap liable as a publisher.

Similarly, California lawmakers are advancing a bill that would leave social media companies open to lawsuits alleging their apps have addicted children. Proponents of the bill take issue with product features such as likes, comments and push notifications that grab users’ attention, with the ultimate goal of showing them ads.

“A product liability claim is separate and distinct from suing a company for posting third party content or publishing third party content, which we know has been unfruitful in many ways, for many years, as a vehicle to hold these companies accountable,” Bergman said.

Representatives for Culver City-based TikTok did not return a request for comment. In a previous statement about another TikTok user’s death, a company spokesperson noted the “disturbing” blackout challenge predates TikTok, pointing to a 2008 warning from the Centers for Disease Control and Prevention about deadly choking games. The spokesperson claimed the challenge “has never been a TikTok trend.” The app currently doesn’t produce any search results for “blackout challenge” or a related hashtag.

It’s too early to tell whether product liability claims will be more successful against social media companies. “We're realistic here. This is a long fight,” Bergman said. In the meantime, his suit against TikTok takes pains to note what it is not about: the users posting the dangerous challenge videos.

“Plaintiffs are not alleging that TikTok is liable for what third parties said or did [on the platform],” the suit said, “but for what TikTok did or did not do.”
