A Lawsuit Blames ‘Defective’ TikTok Algorithm for Children’s Deaths

Social media companies are often accused of hosting harmful content, but it’s very hard to successfully sue them. A federal law known as Section 230 largely protects the platforms from legal responsibility for hate speech, slander and misinformation created by their users.

But a new lawsuit blaming TikTok for the deaths of two children is taking a different approach. Rather than accuse the company of failing to moderate content, the complaint claims TikTok is a dangerous and defective product.


The suit, filed last week in Los Angeles County Superior Court, takes aim at the video-sharing app’s recommendation algorithm, alleging that it served up videos depicting the deadly “Blackout Challenge,” in which people choke themselves to achieve a euphoric feeling. Two children—8-year-old Lalani Erika Walton and 9-year-old Arriani Jaileen Arroyo—died last year after allegedly trying the challenge, the suit said.

“We believe that there is a fundamental flaw in the design of the algorithm that directs these children to this horrific thing,” Matthew Bergman, the lawyer for the children's families, told dot.LA. Bergman is the founding attorney for the Social Media Victims Law Center, a self-described legal resource for parents of children harmed by social media.

Section 230 has long been an obstacle for social media’s opponents. "You can't sue Facebook. You have no recourse,” U.S. Sen. Richard Blumenthal, a Democrat from Connecticut, said last year after Facebook whistleblower Frances Haugen detailed Instagram’s toxic effect on young girls. The federal law’s defenders contend that Section 230 is what allows websites like YouTube and Craigslist to host user-generated content. It would be infeasible for companies to block all the objectionable posts from their massive user bases, the argument goes.

The strategy of bypassing that debate altogether by focusing on apps’ designs and features has gained steam lately. In May, an appellate panel ruled that Santa Monica-based Snap can’t dodge a lawsuit alleging that a Snapchat speed filter—which superimposed users’ speeds on top of photos and videos—played a role in a deadly car crash at 113 mph. The judges said Section 230 didn’t apply to the case because the lawsuit did not seek to hold Snap liable as a publisher.

Similarly, California lawmakers are advancing a bill that would leave social media companies open to lawsuits alleging their apps have addicted children. Proponents of the bill take issue with product features such as likes, comments and push notifications that grab users’ attention, with the ultimate goal of showing them ads.

“A product liability claim is separate and distinct from suing a company for posting third party content or publishing third party content, which we know has been unfruitful in many ways, for many years, as a vehicle to hold these companies accountable,” Bergman said.

Representatives for Culver City-based TikTok did not respond to a request for comment. In a previous statement about another TikTok user’s death, a company spokesperson noted that the “disturbing” blackout challenge predates TikTok, pointing to a 2008 warning from the Centers for Disease Control and Prevention about deadly choking games. The spokesperson claimed the challenge “has never been a TikTok trend.” The app currently doesn’t produce any search results for “blackout challenge” or a related hashtag.

It’s too early to tell whether product liability claims will be more successful against social media companies. “We're realistic here. This is a long fight,” Bergman said. In the meantime, his suit against TikTok takes pains to note what it is not about: the users posting the dangerous challenge videos.

“Plaintiffs are not alleging that TikTok is liable for what third parties said or did [on the platform],” the suit said, “but for what TikTok did or did not do.”
