Social media companies are often accused of hosting harmful content, but it’s very hard to successfully sue them. A federal law known as Section 230 largely protects the platforms from legal responsibility for hate speech, slander and misinformation created by their users.
But a new lawsuit blaming TikTok for the deaths of two children is taking a different approach. Rather than accuse the company of failing to moderate content, the complaint claims TikTok is a dangerous and defective product.
The suit, filed last week in Los Angeles County Superior Court, takes aim at the video-sharing app’s recommendation algorithm, alleging that it served up videos depicting the deadly “Blackout Challenge,” in which people choke themselves to achieve a euphoric feeling. Two children, 8-year-old Lalani Erika Walton and 9-year-old Arriani Jaileen Arroyo, died last year after allegedly attempting the challenge, the suit said.
“We believe that there is a fundamental flaw in the design of the algorithm that directs these children to this horrific thing,” Matthew Bergman, the lawyer for the children's families, told dot.LA. Bergman is the founding attorney for the Social Media Victims Law Center, a self-described legal resource for parents of children harmed by social media.
Section 230 has long been an obstacle for social media’s opponents. "You can't sue Facebook. You have no recourse,” U.S. Sen. Richard Blumenthal, a Democrat from Connecticut, said last year after Facebook whistleblower Frances Haugen detailed Instagram’s toxic effect on young girls. The federal law’s defenders contend that Section 230 is what allows websites like YouTube and Craigslist to host user-generated content. It would be infeasible for companies to block all the objectionable posts from their massive user bases, the argument goes.
The strategy of bypassing that debate altogether by focusing on apps’ designs and features has gained steam lately. In May, an appellate panel ruled that Santa Monica-based Snap can’t dodge a lawsuit alleging that a Snapchat speed filter—which superimposed users’ speeds on top of photos and videos—played a role in a deadly car crash at 113 mph. The judges said Section 230 didn’t apply to the case because the lawsuit did not seek to hold Snap liable as a publisher.
Similarly, California lawmakers are advancing a bill that would leave social media companies open to lawsuits alleging their apps have addicted children. Proponents of the bill take issue with product features such as likes, comments and push notifications that grab users’ attention, with the ultimate goal of showing them ads.
“A product liability claim is separate and distinct from suing a company for posting third party content or publishing third party content, which we know has been unfruitful in many ways, for many years, as a vehicle to hold these companies accountable,” Bergman said.
Representatives for Culver City-based TikTok did not return a request for comment. In a previous statement about another TikTok user’s death, a company spokesperson noted the “disturbing” blackout challenge predates TikTok, pointing to a 2008 warning from the Centers for Disease Control and Prevention about deadly choking games. The spokesperson claimed the challenge “has never been a TikTok trend.” The app currently doesn’t produce any search results for “blackout challenge” or a related hashtag.
It’s too early to tell whether product liability claims will be more successful against social media companies. “We're realistic here. This is a long fight,” Bergman said. In the meantime, his suit against TikTok takes pains to note what it is not about: the users posting the dangerous challenge videos.
“Plaintiffs are not alleging that TikTok is liable for what third parties said or did [on the platform],” the suit said, “but for what TikTok did or did not do.”